WO2004084039A2 - Methods and apparatus for implementing a remote application over a network - Google Patents

Methods and apparatus for implementing a remote application over a network

Info

Publication number
WO2004084039A2
WO2004084039A2 PCT/US2004/008278 US2004008278W
Authority
WO
WIPO (PCT)
Prior art keywords
display
scene
implementing
client
user interface
Prior art date
Application number
PCT/US2004/008278
Other languages
French (fr)
Other versions
WO2004084039A3 (en)
Inventor
Daniel Putterman
Brad Dietrich
John Doornbos
Pedro Freitas
Original Assignee
Mediabolic, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/391,116 (US7213228B2)
Application filed by Mediabolic, Inc.
Publication of WO2004084039A2
Publication of WO2004084039A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H04N21/43637Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications

Definitions

  • The present invention is directed toward the field of network software and devices. A remote application is an application that runs on a first computer but provides the functionality of the application on a second computer (e.g., implements a user interface on the second computer).
  • The Windows environment remotes applications such that thin client computers, or terminals, access a computer, such as a server, over a network to obtain the application's functionality. For example, a server may host a word processing program; the thin client computer or terminal communicates with the server to operate the word processing program, with the application program running on the server.
  • Applications implement a user interface using a user interface tool kit, sometimes referred to as a widget set, a rendering engine, and underlying hardware to display the user interface. The application provides parameters to the user interface tool kit based on specifics of the application. For example, some applications define buttons, toolbars and menus, and the user interface tool kit provides the layout of these elements; the user interface tool kit may specify placement of the buttons, toolbars and menus used in the application. This layout is sometimes referred to as a logical layout of the user interface.
  • The rendering engine, which receives the logical layout from the user interface tool kit, defines how to translate the logical layout to a physical representation for rendering on an output display. For example, if the remote computer display is a graphics display, the rendering engine may convert digital data to RGB data for storage in a frame buffer for rendering on the output display. The user interface hardware may include, for a graphics display, a frame buffer, graphics processor and raster scan display for rendering video.
  • Typically, to remote an application, the application and user interface tool kit reside on the remote computer, while the local computer (i.e., the computer providing the user interface) includes the rendering engine and underlying display hardware. The remote application specifies a logical layout supported by the local computer; for example, the remote application knows the overall resolution of the local display. The rendering engine at the local client translates the logical layout into display data, and the contents of the frame buffer are thereafter rendered on the local client's output display.
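As an illustrative sketch only (not part of the patent; the function, data layout and color values are invented), the conventional pipeline above — a tool kit emits a logical layout, and a rendering engine translates it into RGB data in a frame buffer — might look like:

```python
WHITE, BLACK = (255, 255, 255), (0, 0, 0)

def render_to_framebuffer(layout, width, height):
    """Rendering engine: translate a logical layout (button rectangles given
    in logical coordinates) into an RGB frame buffer for a graphics display."""
    fb = [[WHITE] * width for _ in range(height)]   # blank frame buffer
    for x, y, w, h in layout["buttons"]:
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                fb[row][col] = BLACK                # paint the button's pixels
    return fb
```

The point of the sketch is the division of labor: the layout speaks in logical elements, and only the rendering engine knows about pixels.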
  • In a Web application, the application logic is typically tuned to a specific display resolution, particularly if the web page contains graphic information. Often times, Web applications are not very smooth and interactive applications, although Web technology may be useful for remote user interfaces and remote display of information.
  • A user interface for an application program is implemented on a display client. The core application logic for the application program is executed on a remote computer. The remote computer transfers, to the display client, an identification of a scene; the scene defines an abstract layout for a screen display of the user interface. The user interface receives an input event; the input event is interpreted, and data is generated based on the interpretation of the input event. The display client interprets the scene and the data based on the display capabilities of the display client. Based on this interpretation, the display client generates display data, and the display data is rendered on an output device of the display client.
  • In one embodiment, the display client stores a plurality of pre-defined scenes, and instantiates the scene based on the pre-defined scenes and the scene descriptor. The display client iterates through the slots of the scene, where a slot is an element of the abstract layout. A widget includes a controller, a model and a view: the controller interprets the input event, the model generates data in response to the interpreted event, and the view renders the data. The controller, model, and view portions of a widget may be implemented at the display client or at the remote computer.
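The controller/model/view split described above can be sketched in a few lines of code (a minimal illustration; the class and method names are invented, not taken from the patent, and all three parts run here in one process):

```python
class MenuController:
    """Controller: interprets raw input events into logical actions."""
    def interpret(self, event):
        return {"down": "next", "up": "prev", "ok": "select"}.get(event, "noop")

class MenuModel:
    """Model: generates data in response to the interpreted event."""
    def __init__(self, items):
        self.items, self.index = items, 0
    def generate(self, action):
        if action == "next":
            self.index = min(self.index + 1, len(self.items) - 1)
        elif action == "prev":
            self.index = max(self.index - 1, 0)
        return self.items[self.index]

class TextView:
    """View: renders the model's data for a text-only display."""
    def render(self, data):
        return "> " + data

class Widget:
    """A widget aggregates a controller, a model, and a view for a slot."""
    def __init__(self, controller, model, view):
        self.controller, self.model, self.view = controller, model, view
    def handle(self, event):
        action = self.controller.interpret(event)   # interpret the input event
        data = self.model.generate(action)          # produce data for the event
        return self.view.render(data)               # render for this display
```

As the text notes, each of the three parts may instead be implemented at the display client or at the remote computer; the interfaces between them stay the same.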
  • Figure 1 illustrates a media space configured in accordance with one embodiment of the present invention.
  • Figure 2 illustrates one embodiment for integrating devices into a single media space.
  • Figure 3 is a block diagram illustrating one embodiment for implementing a remote application.
  • Figure 4 is a block diagram illustrating an example abstract scene layout.
  • Figure 5 is a block diagram illustrating one embodiment for implementing an abstract scene with widgets.
  • Figure 6 illustrates an example screen display generated for a graphics display.
  • Figure 7 illustrates an example screen display generated for a liquid crystal display (“LCD”).
  • Figure 8 is a block diagram illustrating a further embodiment for implementing a remote application.
  • Figure 9 is a block diagram illustrating another embodiment for implementing a remote application.
  • Figure 10 is a block diagram illustrating one embodiment for implementing widget-based controllers for the remote user interface of the present invention.
  • Figure 11 is a block diagram illustrating one embodiment for providing a model for a user interface.
  • Figure 12 is a block diagram illustrating another embodiment for a remote application.
  • Figure 13 is a flow diagram illustrating one embodiment for a method to remote an application.
  • Figure 14 is a flow diagram illustrating one embodiment for implementing a user interface from a remote application.
  • Figure 15 illustrates an example user interface for a television implemented on a client device.
  • Figure 16 illustrates one embodiment for rendering a user interface of an audio application on a client device.
  • Figure 17 illustrates an application for the remote display of media information from a media server to an A/V receiver.
  • A media convergence platform provides an efficient and easy way for one or more users to manage and play back media within a media space. A media space connotes one or more media storage devices coupled to one or more media players.
  • Figure 1 illustrates a media space configured in accordance with one embodiment of the present invention. The media space 100 includes "n" media storage devices 110.
  • The media storage devices 110 store any type of media. In one embodiment, the storage devices 110 store digital media, such as digital audio and digital video (e.g., DVD).
  • The media space 100 also includes "m" media players 120. The media players 120 are devices suitable for playing and/or viewing various types of media. For example, a media player may comprise a stereo system for playing music or a television for viewing video.
  • The media storage devices 110 are coupled to the media players 120. The media storage devices 110 and the media players 120 are shown in Figure 1 as separate devices to depict the separate functions of media storage and media playback; however, the media players may perform both the storage and playback functions. For example, a media player may comprise a DVD player that includes a hard drive. In other embodiments, the storage of media and the playback/viewing of media are performed by separate devices. For this embodiment, the media players 120 play back content stored on the media storage devices 110. For example, a video clip stored on media storage device "1" may be played on any of the media players 120.
  • The storage devices 110 and media players 120 are controlled by management component 130. The management component 130 permits users to aggregate media within the media space, and may be implemented across the devices of the media space.
  • The media space of Figure 1 shows a plurality of users 140 to depict that more than one user may play back or view media through different media players. The system supports playback of different media through multiple media players.
  • The management component 130 may also organize, control, and browse media available within the media space, and provides a distributed means to manage the media space.
  • Figure 2 illustrates one embodiment for integrating devices into a single media space. The system 200 shown in Figure 2 may be a home media system. The media space 200 includes at least one personal video recorder ("PVR")-media server 210; a media space may include many media servers. The PVR-media server 210 stores media for distribution throughout the media space 200. In one embodiment, the PVR-media server 210 stores system software to integrate the components of the media space, to distribute media through the media space, and to provide a user interface for the components of the media space.
  • The PVR-media server 210 may also include one or more tuners.
  • The PVR-media server 210 is coupled to different types of media players, including one or more televisions (e.g., television 250) and one or more media players (e.g., audio and video playback devices, such as playback device 240). For this embodiment, a media playback device may comprise A/V receivers, CD players, and digital music players.
  • The PVR-media server 210 executes software to perform a variety of functions within the media space; in one configuration, the PVR-media server 210 operates as a "thick client." A user accesses and controls the functions of the media convergence platform through a system user interface. The user interface utilizes one or more displays of the media space (e.g., television 250). In one embodiment, the user interface includes a plurality of screens, and a screen of the user interface includes one or more items for selection. The remote control 260 controls the user interface; a user interface permits the user, through use of a remote control, to perform a variety of functions within the media space.
  • The components of the media convergence platform are integrated through a network. The devices (e.g., PVR-media server 210, television 250, remote control 260, media player 240 and media manager 280) are coupled to network 225. Network 225 may comprise any type of network or combination of networks.
  • One or more thin video clients may be integrated into the media space. For example, a thin video client may be coupled to PVR-media server 210 to provide playback of digital media on television 250. The thin video client receives media from PVR-media server 210: PVR-media server 210 transmits a digital movie over network 225, and the thin video client processes the digital movie for display on television 250. In one embodiment, the thin video client processes the digital movie "on the fly" to provide NTSC or PAL formatted video for playback on a standard television.
  • The media convergence platform system also optionally integrates one or more thin audio clients. For example, a thin audio client may receive digital music (e.g., MP3 format) from PVR-media server 210 over network 225, and may process the digital music for playback on an audio system. In one embodiment, the thin audio client includes a small display (e.g., liquid crystal display "LCD") and buttons to command the system. The PVR-media server 210 transmits items and data for display on the thin audio client; for example, the thin audio client may display lists of tracks for playback on an audio system. The user selects items displayed on the screen using the buttons to command the system. For example, the thin audio client screen may display a list of albums available in the media space, and the user, through use of the buttons, may command the user interface to display a list of tracks for a selected album. Then, the user may select a track displayed on the screen for playback on the audio system.
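The album-to-track browse flow just described can be sketched as follows (a hypothetical illustration; the library contents and function names are invented, and the network transfer between server and thin audio client is elided):

```python
# Hypothetical media library held on the PVR-media server.
LIBRARY = {
    "Kind of Blue": ["So What", "Blue in Green"],
    "Giant Steps": ["Giant Steps", "Naima"],
}

def screen_albums(library):
    """First screen the server sends to the thin audio client: album list."""
    return sorted(library)

def screen_tracks(library, album):
    """After the user selects an album, the next screen: its track list."""
    return library[album]

def play(library, album, track_index):
    """After the user selects a track, return what the audio system plays."""
    return library[album][track_index]
```

Each step mirrors a button press on the thin audio client: browse albums, drill into one, then pick a track.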
  • The media manager 280 is an optional component for the media convergence platform system. The media manager 280 permits the user to organize and manage media, and may store media for integration into the media space (i.e., store media for distribution within the media space).
  • The media space may be extended to access media stored external to those devices coupled to the local network. In one embodiment, the media convergence platform system integrates content from external sources; for example, the PVR-media server 210 may access content external to the local network 225. The external content may include any type of media, such as digital music and video. The media convergence platform system may be coupled to external content 290 through a broadband connection, and the external content may be delivered to the system through this connection. Alternatively, the external content may be broadcasted; for example, the media server 210 may access external content 290 through a data casting service.
  • As used herein, a “remote application” connotes software, operating on a device other than a local device, used to implement functionality on the local device. The software system separates the user interface (“UI”) from the underlying application logic. To this end, the system defines user interface displays in terms of abstract scenes; for example, an abstract scene may define, for a particular display, a title at the top of the display, a menu of items in the center, and an icon at the bottom. In one embodiment, the software comprises pre-defined scenes, UI application logic, a scene manager, and UI rendering engines. For example, a user may select a first item displayed on the current UI display; in response, the application logic selects, if applicable, a new abstract scene and data for that scene. The application logic is implemented independent of the scene and the UI rendering.
  • The application logic selects a scene descriptor, to define an abstract layout, in response to user input. A scene manager, running on the local client, interprets the scene descriptors based on the capabilities of the display client. For example, if the display for a client is a liquid crystal display (“LCD”), the scene manager may convert logical elements to a list for display on the LCD display.
  • The UI rendering engine renders display data for the scene. The display elements include display resolution, font size for textual display, and the like. If the output device is a television, the UI rendering engine generates graphics data (i.e., RGB data) suitable for display of the scene on the television screen (e.g., proper resolution, font size, etc.). If the output display is a liquid crystal display (“LCD”), the UI rendering engine generates textual data for rendering on the LCD.
  • This provides a user interface implementation that separates the UI application logic from the UI rendering. With this separation, the application logic does not require any information regarding the capabilities of the output display; instead, the application logic operates on the abstract scenes. This separation permits a graphical designer of a user interface system to easily change the scenes of the user interface: the application logic receives the revised scene descriptors without requiring modification to the application logic.
  • The media convergence platform permits implementing user interfaces across different devices. In one embodiment, the application logic executes on a device remote from the device displaying the user interface, while the device displaying the user interface contains the UI rendering software. The data and scenes for a user interface (e.g., scene descriptors) exist on the remote device (e.g., server), and the interface between the remote device and the display client transfers scene descriptor information with data. This delineation of functions provides a flexible architecture: a remote device hosting the application logic does not require information regarding the display capabilities of the client. The architecture also permits implementing a thin client in a media convergence platform because the thin client need not run the application logic software.
  • Figure 3 is a block diagram illustrating one embodiment for implementing a remote application. For this embodiment, a remote application 310 includes scene descriptors 320 and application logic 330. The remote application 310 may comprise a media server with considerable processing capabilities. A client device 370 has a display 360 for displaying information to a user (e.g., a user interface). The rendering engine 355 receives, as input, data from scene manager 350, and generates, as output, display data. The display data is of a type suited to display 360; for example, if display 360 comprises a graphics display, then the display data includes information (e.g., RGB data) to render a graphical image on the display.
  • Figure 3 illustrates separating a UI rendering function, implemented on a client device, from application logic, implemented on a remote device 310. For example, a list of objects (e.g., musical albums) may be displayed on display 360.
  • In one embodiment, a scene descriptor (320) may define an abstract scene; for example, the scene descriptor may define a list of elements. The application logic 330 receives the scene descriptor, and populates the elements of the scene descriptor with data. The application logic 330 then transmits, through interface 340, the scene descriptor with data to the scene manager 350 on the client device. The scene manager 350 converts the scene elements with data to display elements, and the rendering engine 355 generates display data for display 360. For example, if display 360 is an LCD, rendering engine 355 generates a textual list of audio tracks; if display 360 is a graphics display, rendering engine 355 generates graphics data (e.g., RGB data).
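The Figure 3 pipeline can be sketched end to end (hypothetical function and slot names; the real system transfers the populated descriptor over a network through interface 340, which is elided here):

```python
def populate(scene_descriptor, data):
    """Application logic: fill the abstract scene's elements with data."""
    return {slot: data.get(slot, []) for slot in scene_descriptor["slots"]}

def scene_manager(populated, display):
    """Scene manager: convert scene elements with data into display elements,
    based on the display's capabilities (here, its number of text rows)."""
    return {slot: items[:display["rows"]] for slot, items in populated.items()}

def rendering_engine(display_elements):
    """Rendering engine: generate display data (plain text lines, as for an LCD)."""
    lines = []
    for slot in sorted(display_elements):
        lines.extend(display_elements[slot])
    return lines

descriptor = {"slots": ["1_title", "2_list"]}
data = {"1_title": ["Albums"], "2_list": ["Blue Train", "Kind of Blue", "Giant Steps"]}
lcd = {"rows": 2}
```

Note that only `scene_manager` and `rendering_engine` (the client side) know anything about the display; `populate` (the application logic) works purely on the abstract descriptor.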
  • The techniques use “abstract scenes”, defined by scene descriptors, to implement a user interface. In one embodiment, a scene descriptor, in its simplest form, may constitute a list (e.g., a list scene). In general, a scene descriptor defines an abstract layout for a screen display, and the slots of a scene provide the framework for an application to render specific information.
  • The system divides labor between the remote application and the local display computer. The remote application communicates the scene descriptor, in terms of logical coordinates, to the local display computer, and the local display computer translates the scene descriptor based on its underlying display capabilities. In other embodiments, the remote application may provide less information about a scene, thereby assigning more UI operations to the local client computer. For example, a scene descriptor may include one or more titles, a message, and a menu of items.
  • Figure 4 is a block diagram illustrating an example scene descriptor. As shown in Figure 4, the example scene includes a plurality of slots (i.e., A, B and C). Slot A, located on the top of the scene, may be used to display a major title (e.g., the title of the application). Slot B, located in the center of the scene, includes a plurality of subcomponents; these slots may be used by the application to display menu items. For this embodiment, each subcomponent (e.g., slot B1, slot B2 ... slot Bn) may represent a menu item. In one embodiment, the menu items comprise media items available in the media space. The remote application may use slot C to display an icon (e.g., a logo for the remote application software publisher).
  • The remote application constructs a list of elements for a scene descriptor, which includes data for display in the slots, and transfers the list of elements, in a network format, to the display client. In one embodiment, the remote application interrogates the scene (at the client) to determine the slots of the scene. The list elements may include a data model and abstract scene information.
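One plausible encoding of the Figure 4 list scene, and of the element list the remote application constructs for it, is sketched below (all keys and names are illustrative assumptions, not taken from the patent):

```python
# A scene descriptor in logical terms: slot A (title), slot B (a menu with
# subcomponents slot B1 ... slot Bn), and slot C (an icon).
LIST_SCENE = {
    "name": "list_scene",
    "slots": {
        "slot_a": {"role": "title"},
        "slot_b": {"role": "menu", "subslots": 4},
        "slot_c": {"role": "icon"},
    },
}

def build_elements(scene, title, menu_items, icon):
    """Remote application: construct the list of elements for a scene
    descriptor, holding the data for display in the slots."""
    n = scene["slots"]["slot_b"]["subslots"]
    return {"slot_a": title, "slot_b": menu_items[:n], "slot_c": icon}
```

The descriptor carries only logical structure; how each slot is drawn is left entirely to the display client.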
  • In one embodiment, widgets are used to implement the slots of an abstract scene. A widget corresponds to one or more slots on an abstract scene. In one embodiment, a widget comprises controller, model, and view subcomponents. A view is an interpretation of the abstract scene suitable for a specific display; for example, a first view of an abstract scene may be suitable for rendering on a graphical display, while a second view of an abstract scene may be suitable for rendering the abstract scene on an LCD display. The model provides the underlying data for slots of an abstract scene; for example, if a slot displays a list of items, the model for that slot may include the list of items. A controller provides the logic to interpret user interface events (i.e., user input to the user interface). For example, if a user selects an item on the display, the controller provides the logic to interpret the event and initiate, in response, the appropriate user interface function.
  • Figure 5 is a block diagram illustrating one embodiment for implementing an abstract scene with widgets. For this embodiment, a widget corresponds to each slot on the abstract scene: widget A is instantiated for slot A, widget B is instantiated for slot B, and widget C is instantiated for slot C. As shown in Figure 5, each widget (A, B and C) includes a controller, model and view. Widgets may be configured to render a slot and its subcomponents.
  • Figures 6 and 7 illustrate two different example screens supported by the same abstract scene. An example user interface display (e.g., screen) for a graphics display is shown in Figure 6. The underlying widget for screen 600 presents the text string, “Home Media Applications”, in a box. For this example, slot B (Figure 4) contains a plurality of elements, and the elements represent menu items (i.e., the home media applications available). The LCD display 700 (Figure 7) can’t display graphics, and therefore the graphics symbol is omitted; the text string “Home Media Applications” is displayed as text only.
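The two renderings of Figures 6 and 7 amount to two views of one abstract scene. A toy sketch (the output formats are invented for illustration; the actual screens in the figures differ):

```python
SCENE = {"title": "Home Media Applications",
         "menu": ["Music", "Photos", "Videos"]}

def graphics_view(scene):
    """View for a graphics display: title drawn in a box, menu items below."""
    bar = "+" + "-" * (len(scene["title"]) + 2) + "+"
    return [bar, "| " + scene["title"] + " |", bar] + \
           ["  * " + item for item in scene["menu"]]

def lcd_view(scene, rows=4):
    """View for a character LCD: no graphics, clipped to the display's rows."""
    return ([scene["title"]] + scene["menu"])[:rows]
```

Both views consume the same scene data; only the interpretation for the target display differs, which is exactly the role the view subcomponent plays.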
  • Figure 9 is a block diagram illustrating another embodiment for implementing a remote application. The application logic 910 implements the functionality of an application program, and display client 940 implements a user interface. For this embodiment, application logic 910 communicates with display client 940 in a manner to divide functionality between application logic 910 and display client 940. Display client 940 performs more functions than a client that only renders (i.e., display client 940 may be described as a “thicker” client). Display client 940 includes modules to implement a user interface for the application specific to the display capabilities of the display client. Specifically, display client 940 includes scene manager 945, scene 950, slots 955, and pre-defined scenes 365.
  • A widget, widget 960, includes model 930, view 965, and controller 970 components, implemented in both application logic 910 and display client 940. Specifically, display client 940 implements the controller 970 and view 965 portions of widget 960, while the model portion of widget 960 is implemented on application logic 910 (model 930).
  • Application logic 910 also includes scene descriptors 920. In operation, application logic 910 selects an abstract scene for the user interface. To this end, application logic 910 interrogates display client 940 to determine the scenes supported by display client 940 (i.e., scenes available in pre-defined scenes 365). The application logic 910 transmits a scene descriptor (one of scene descriptors 920) to display client 940, and the scene manager module 945 instantiates a scene for the user interface. The instantiated scene is depicted in Figure 9 as scene 950.
  • The scene 950 aggregates, through the slots 955, to widget 960. Specifically, input events, input from the user through the user interface, are processed by the controller 970, and the model to support the slots is provided from the model 930 across the network.
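A sketch of the Figure 9 division of labor, with the model on the application-logic side and the view and controller on the display client (all names are hypothetical, and the network hop to the model is reduced to a direct method call):

```python
class RemoteModel:
    """Model portion of the widget, living with the application logic."""
    def __init__(self, tracks):
        self._tracks = tracks
    def fetch(self, prefix):
        # In the real system this call would cross the network.
        return [t for t in self._tracks if t.startswith(prefix)]

class DisplayClient:
    """Thicker client: implements the controller and view portions locally."""
    def __init__(self, model):
        self.model = model                      # stand-in for the remote model
    def controller(self, event):
        # Interpret the input event into a request against the model.
        return event[len("key_"):].upper()
    def view(self, items):
        return " | ".join(items)
    def handle(self, event):
        prefix = self.controller(event)
        return self.view(self.model.fetch(prefix))
```

The client never holds the data itself; it only interprets events and renders whatever the remote model supplies.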
  • Figure 8 is a block diagram illustrating a further embodiment for implementing a remote application. For this embodiment, widget 860, supporting the slots for a scene, is implemented entirely on application logic 810; thus, display client 840 may be characterized as a “thin” client. Similar to the embodiment of Figure 9, application logic 810 interrogates display client 840 to determine the available scenes, and transmits, over a network, a scene descriptor to display client 840. Scene manager 845 instantiates a scene (e.g., scene 850), and the slots for the scene are populated through use of widget 860. Model 830 provides data to support the user interface slots, and view module 865 supports rendering of the slots.
  • The controller portion of a widget may reside locally on a client, or may be invoked across the network.
Figure 10 is a block diagram illustrating one embodiment for implementing widget-based controllers for the remote user interface of the present invention. For this example, local display device 1030 instantiates a widget, widgetA, to control and render a portion of the user interface. WidgetA consists of software that resides on both local display device 1030 (local controller 1040) and remote network device 1010 (remote controller 1025). Local controller 1040 processes certain events for widgetA. Typically, local controller 1040 may process events that do not require logic or data from the remote application. For example, an event may be generated when a user moves the cursor from one item on a list to another. In response, the user interface may highlight each item to indicate the placement of the user's cursor. A widget may use local controller 1040 to process such an event. Other events require processing by the remote application. In one embodiment, to accomplish this, the remote application implements remote controller 1025. Remote controller 1025 may not be a separate object, but may be part of procedural code within remote application 1020. As shown in Figure 10, widgetA uses a remote procedure call ("RPC") to invoke remote controller 1025. For example, widgetA may receive an event from a user to select an application displayed on the screen from a menu list of available applications. In response, remote controller 1025 may generate a top menu screen for the selected application. The new top menu screen may require a new scene. Data may be supplied to a local display device either locally or from across the network.
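The split between a local controller and a remote controller reached over RPC can be sketched as follows. The event names and the stand-in for the RPC mechanism are assumptions made for this example, not details from the patent.

```python
# Events the local controller can handle without the remote application
# (hypothetical set; e.g., cursor movement that only changes highlighting).
LOCAL_EVENTS = {"cursor_move"}

class Widget:
    def __init__(self, remote_call):
        self.remote_call = remote_call   # stand-in for the RPC mechanism
        self.highlighted = None

    def handle(self, event, payload):
        if event in LOCAL_EVENTS:
            # Local controller: highlight the item under the cursor.
            self.highlighted = payload
            return "handled locally"
        # Remote controller: e.g., selecting an application from a menu
        # requires application logic on the remote network device.
        return self.remote_call(event, payload)

def remote_controller(event, payload):
    # On the remote application, a selection may yield a new scene descriptor.
    if event == "select":
        return {"scene_descriptor": "top_menu", "selection": payload}

w = Widget(remote_controller)
w.handle("cursor_move", 2)               # stays on the display client
result = w.handle("select", "Music")     # crosses the network
```

The design point illustrated: frequent, cosmetic events never cross the network, while events that change application state do.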
Figure 11 is a block diagram illustrating one embodiment for providing a model for a user interface. For this embodiment, local display device 1130 operates in conjunction with remote application 1120, operating on remote network device 1110. Remote application 1120 includes data model object 1125. For this example, data model object 1125 provides an interface to a data store 1140. The data store 1140 may reside on the remote network device 1110, or it may reside anywhere accessible by the network (e.g., another network device or a service integrating the data store from an external source to the network). In operation, the controller interprets an event, and invokes the data model object in accordance with the interpretation of the event. For example, the controller may interpret an event that requests all available musical tracks within a specified genre. For this request, data model object 1125 may generate a query to database 1140 for all musical tracks classified in the specified genre. Data model object 1125 communicates the model (data) to local display device 1130 through interface 1150. In one embodiment, the model may comprise a text string. For example, a current UI screen may consist of a list of all high-level functions available to a user. A user may select a function, and in response, the system may display a list, which consists of text strings, of all sub-functions available for the selected function. In another embodiment, the data model may be provided as a handle to the user interface implementation.
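The genre-query example above can be condensed into a small sketch. The in-memory data store and track names are invented for illustration; a real embodiment would query a database or networked service.

```python
# Stand-in for data store 1140 (hypothetical contents).
DATA_STORE = [
    {"title": "Track A", "genre": "jazz"},
    {"title": "Track B", "genre": "rock"},
    {"title": "Track C", "genre": "jazz"},
]

class DataModelObject:
    """Interface between the controller's interpreted event and the data store."""
    def __init__(self, store):
        self.store = store

    def tracks_in_genre(self, genre):
        # Equivalent to a query for all tracks classified in the genre;
        # the model returned is a list of plain text strings, which the
        # display client can render in a slot however it sees fit.
        return [row["title"] for row in self.store if row["genre"] == genre]

model = DataModelObject(DATA_STORE).tracks_in_genre("jazz")
```

Returning bare text strings keeps the model display-agnostic: the same list can populate a graphics menu or a two-line LCD.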
Figure 12 is a block diagram illustrating another embodiment for a remote application. For this embodiment, the widgets for a local display device are highly distributed; one or more widgets may be implemented across multiple devices. As shown in Figure 12, a first widget, widget1, comprises view1 (1180), implemented on local display device 1175, while the model - controller (1157) implementation for widget1 resides on application server1 (1155). Widget2, which has a view component (1185) implemented on local display device 1175, implements its model - controller component (1162) on a separate device (i.e., a second application server). A third device, application server3 (1170), implements the model - controller component (1172) for a third widget. As shown in Figure 12, the view component (1190) for the third widget is implemented on local display device 1175. Thus, a single display (e.g., local display device 1175) may support three different remote applications. Since the view component of the three applications is integrated on a single display, the view of information for the remote applications may be tailored to the particular display device.
Figure 13 is a flow diagram illustrating one embodiment for a method to remote an application. To start, the remote application defines a set of widgets and selects a scene for the user interface. The remote application obtains a handle to the local display device; the handle provides a means for the remote application to communicate with the local display device. The remote application queries the local display device for a scene interface for the abstract scene. The scene interface defines the slots for the abstract scene. The remote application populates a data model for the scene, and the local display device instantiates software to locally implement the abstract scene and its widgets. A widget may incorporate all or portions of the controller, model, and view components. The remote application transfers the initial data model to the local display device, and the local display device renders the initial scene using the data model. In response to user input, a widget executes control logic based on the underlying event. The controller may reside on the local display device or on one or more remote devices. The local display may be implemented by one or more remote applications. For example, the local display device may include two widgets: the controller of a first widget may be implemented on the local display device, and the controller for the second widget may be implemented on a remote application. Similarly, the model may be supplied locally or from across the network. For example, the local display device may be implemented with four widgets; one remote application server may supply the model for two widgets, and another remote application, running on a second application server, may supply the model for the other two widgets. After processing an event, the widget renders a new scene with the data model supplied (block 1290, Figure 13).
Figure 14 is a flow diagram illustrating one embodiment for implementing a user interface from a remote application. In response to user input, a widget, corresponding to the slot on the abstract scene, interprets the user input and generates an event (block 1320, Figure 14). For example, if the user selects a menu item displayed in a slot, the widget that manages the corresponding slot generates an event to signify selection of the menu item. If the controller for the widget is not implemented locally, the widget remotes the controller for the event across the network (block 1340, Figure 14). The controller for a widget may reside on one or more computers remote from the local display device. If the widget interprets the event locally and the event does not require an external data model, then the widget supplies the data model (blocks 1360 and 1370, Figure 14). If the widget remotes the controller over the network to one or more remote devices, then one or more remote applications iterate through the active widgets on the remote scene to generate the data model. Thereafter, the scene is rendered at the local display device (block 1390, Figure 14).
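The Figure 14 flow can be condensed into a single hypothetical function; the widget fields, event shapes, and menu content below are invented for the sketch and are not from the patent.

```python
def process_input(widget, user_input):
    # Block 1320: the widget interprets the input and generates an event.
    event = widget["interpret"](user_input)
    if widget["controller_is_local"]:
        # Blocks 1360/1370: no external data model needed, so the
        # widget supplies the data model itself.
        model = widget["local_model"](event)
    else:
        # Block 1340: remote the controller across the network; remote
        # application(s) iterate through the active widgets to build the model.
        model = widget["remote_controller"](event)
    # Block 1390: the scene is rendered at the local display device.
    return {"scene": "rendered", "model": model}

# A menu widget whose controller happens to run locally (illustrative).
menu_widget = {
    "interpret": lambda key: {"type": "select", "item": key},
    "controller_is_local": True,
    "local_model": lambda ev: ["sub-function list for " + ev["item"]],
    "remote_controller": None,
}

result = process_input(menu_widget, "Music")
```

The same `process_input` path serves both branches of the flow diagram; only the `controller_is_local` flag decides whether the network is involved.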
The present invention also has application to configure a two-way universal remote controller. Through the techniques described herein, the remote controller may be configured, on the fly, to control any target device on the network. A user interface for the target device operates as a remote application on a host device. The remote controller, operating as a client device, renders the user interface. The remote controller, operating as the rendering client, then communicates with the target device to control the target device. The remote controller may comprise a graphical display to implement a graphical user interface. For example, the remote controller or rendering client may comprise a personal digital assistant ("PDA"), a tablet personal computer, a Java® phone, or any combination thereof. In one embodiment, the two-way remote controller implements a television user interface. For example, the remote application may implement an electronic programming guide ("EPG").
Figure 15 illustrates an example user interface for a television implemented on a client device. The television 1400 displays an electronic programming guide ("EPG"). Multiple channels are displayed in a first vertical column; for the EPG displayed, channels 500 - 506 are shown. The user may view, on the EPG, additional time slots for the channels (e.g., the time slots 4:30, 5:00 and 5:30). The EPG displays, beneath the time slot columns, the name of the program playing on each channel. Client device 1410 comprises a display 1420 to render the user interface. The client device display 1420 may also be sensitive to pressure (i.e., implemented as a touch screen). The client device display 1420, which renders the user interface, is smaller than the display of television 1400. Therefore, in rendering the user interface on the client device, the user interface is tailored to the smaller display. For example, the client device rendering of the user interface does not display metadata of a program. In this way, the television user interface is rendered on a client device with a small display. The concepts of the present invention may also be applied to generate a user interface on other types of client devices.
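The tailoring of one abstract scene to displays of different sizes can be sketched as follows. The slot names and the width threshold are assumptions invented for this example; the point is only that the client, knowing its own capabilities, decides which slots to render.

```python
# Slots of a hypothetical abstract EPG scene.
EPG_SCENE = ["channel_list", "time_slots", "program_names", "program_metadata"]

def render_for_display(scene_slots, display_width):
    # A small client display (e.g., a PDA) omits the metadata slot that a
    # full television rendering would show; the threshold is illustrative.
    if display_width < 640:
        return [s for s in scene_slots if s != "program_metadata"]
    return list(scene_slots)

tv_rendering = render_for_display(EPG_SCENE, 1280)     # full EPG
client_rendering = render_for_display(EPG_SCENE, 320)  # metadata dropped
```

Because the decision lives on the client, the remote application never needs to know whether it is driving a television or a handheld.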
The present invention also has application for rendering non-graphical user interfaces. For example, a user of home media network 200 (Figure 2) may operate playback device 240. The playback device 240 may not have a graphical user interface; instead, the playback device 240 may include only a simple display. Figure 16 illustrates one embodiment for rendering a user interface of an audio application on a client device. For this embodiment, the target device may comprise an audio-video receiver ("AVR"). Display 1510 displays the AVR input source (e.g., tuner) and additional information (e.g., the band and station currently tuned). The user interface on the AVR 1500 further includes buttons 1520 for controlling the AVR. The AVR 1500 may also have a remote control specific to the AVR. For the embodiment of Figure 16, a client device 1530 renders a user interface for the AVR. The client device is not manufactured specifically for the AVR. Instead, a remote application remotes the user interface of the AVR 1500 (i.e., the target device) to the client device 1530. The client device 1530 includes a graphical display 1540, also used as a user input device (i.e., the user touches the graphical display 1540 to interact with the user interface). For this AVR example, the client device 1530 renders, as part of the user interface, tuning control 1550 and volume control 1570, for control of the tuning and volume on AVR 1500, respectively. The remote application and client device may also be configured to change the user interface based on the target device. For example, the target device may comprise a compact disc ("CD") device, a digital video disc ("DVD") device, a digital music playback device, or any combination thereof.
The techniques of the present invention also have application to render a user interface of a media convergence platform on a client device. A media convergence platform may present different types of media within a single user interface. In one embodiment, the user interface is television based. For example, the user interface may display, on a television display, selectable items to represent a music application and a photo albums application. The user selects an item on the television display to invoke an application. The music application permits a user to browse and play music, and the photo albums application permits a user to view digital photos. In one embodiment, a two-way remote control device is configured to implement a user interface of the media convergence platform. For example, a host computer device, such as a media server (e.g., PVR-media server 210, Figure 2), may run the application software. In one embodiment, the media convergence platform user interface is television based. In another embodiment, the convergence platform user interface is implemented on a personal computer (e.g., implemented on personal computer 250, Figure 2). The application program may remote the user interface to a client device, such as a PDA. The user may then control target devices on the network through a rendition of the media convergence platform user interface. This allows the user the ability to use the user interface of the media convergence platform from any location on the network. For example, the media convergence platform may primarily use a television to display screens for the user interface, and a television remote control to navigate those screens. A user may desire to remote the user interface to a client device because the television is not in the same room as the user. Using the rendition of the user interface on the client device, the user may proceed to control a target device (e.g., an audio player in the room with the user). In other embodiments, the host computer device remotes a user interface for a specific target device to a client device. For example, the PVR-media server 210 may run a user interface for playback device 240, and remote the user interface for playback device 240 to remote control 260. Also, media manager 280 may remote a user interface for displaying photos to remote control 260. In general, an application, which implements a user interface for a remote device, may reside anywhere on the network to control any other device on the network.
Remote Display Applications:
The present invention also has application to render display information at a client device, wherein the display information is generated at a remote device (e.g., a media server). The client device displays the information on a display of the client device. In one embodiment, the client device renders display information about media. The client may request media or information about the media. For example, the client device may display information about media playing through the client playback device.
Figure 17 illustrates an application for the remote display of media information from a media server to an A/V receiver. For this embodiment, a media server operates a user interface. For example, the media server may operate a media convergence platform user interface for browsing and selecting, for playback, video, audio and photos. Figure 17 shows media server 1620 with output device 1600. Output device 1600 is currently operating an audio application, and displays an identification of the audio track currently playing. An A/V receiver 1610 is coupled to media server 1620. The A/V receiver also includes a display (e.g., an LCD). For this embodiment, software operating on media server 1620 remotes display information to A/V receiver 1610, and the A/V receiver displays information about the current music playing. In other embodiments, video information is displayed at a client device. For example, a DVD player may be configured as a playback device to play video from a source on the network (e.g., a media server). The media server may remote display information to the DVD player, such that a display on the DVD player may display the name of the DVD currently playing as well as additional information. In general, a client device may display information remoted from any host device on the network.

Abstract

A user interface is implemented on a client device remote from a host device. The host device operates an application program that implements a user interface, such as an electronic programming guide or a guide for a personal video recorder, that permits a user to control at least one target device. The host device transfers to the client device an identification of at least one scene. In general, a scene defines an abstract layout for at least one screen display of the user interface. The client device generates at least one screen display for the scene based on its interpretation of the scene. The client device then displays the screen as an implementation of the user interface. Thereafter, a user initiates, using the client device, an operation to control the target device. In response, the target device performs the operation. The host device may also display information at a client device. For example, the host device may transmit information about a media currently playing at the client device.

Description

METHODS AND APPARATUS FOR IMPLEMENTING A REMOTE APPLICATION OVER A
NETWORK
CROSS-REFERENCES TO RELATED APPLICATIONS
This application claims the benefit of U.S. Patent Application No. 10/391,116,
filed March 17, 2003, entitled "Methods and Apparatus For Implementing A Remote
Application Over A Network" and U.S. Patent Application No. 10/779,953, filed
February 14, 2004, entitled "Methods and Apparatus For Rendering User Interfaces and Display Information on Remote Client Devices."
BACKGROUND OF THE INVENTION
Field of the Invention:
The present invention is directed toward the field of network software and devices,
and more particularly towards rendering user interfaces and displays on devices remote
from a host device.
Art Background:
Prior art techniques exist to "remote" applications. In general, a remote
application is an application that runs on a first computer but provides the functionality of
the application to a second computer (e.g., implements a user interface) remote from the
first computer. Remote application techniques have been used in client - server
environments, wherein the application programs are stored on a server, and the client
computers access the server to obtain functionality from the applications. The X
Windows environment remotes applications such that thin client computers, or terminals, access a computer, such as a server, over a network to obtain the application's
functionality at the terminals. For example, a server may host a word processing
application. The thin client computer or terminal communicates with a server to operate the word processing program. The application program, running on the server,
implements the user interface at the local computer for the underlying application
program.
One issue that arises when implementing remote applications is that the remote applications require specific knowledge about the display characteristics of the client
computer or terminal. If the client-server environment has many client computers, then
the remote application must know the requirements of each client computer. This limits
the types of devices or computers that the remote application can support, or significantly
increases complexity of the server software to support the various types of devices.
Therefore, it is desirable to develop software that permits a remote application to operate
on a client computer or device without requiring the remote application to have any
knowledge of the client's configuration.
Typically, applications implement a user interface using a user interface tool kit,
sometimes referred to as a widget set, a rendering engine, and underlying hardware to
display the user interface. The application provides parameters to the user interface tool
kit based on specifics of the application. For example, some applications define buttons,
toolbars, menus, etc. for use with the application. The user interface tool kit provides
specific layout information for the application requirements. For example, the user interface tool kit may specify placement of the buttons, toolbars and menus used in the
application. This layout is sometimes referred to as a logical layout of the user interface.
The rendering engine, which receives the logical layout from the user interface tool kit, defines how to translate the logical layout to a physical representation for rendering on an
output display. For example, if the remote computer display is a graphics display, then
the rendering engine may convert digital data to RGB data for storage in a frame buffer for rendering on the output display. The user interface hardware may include, for a
graphics display, a frame buffer, graphics processor and raster scan display for rendering
pixel data.
Typically, to remote an application, the application and user interface tool kit
software are run on the remote computer (i.e., the computer running the remote
application). The local computer (i.e., the computer providing the user interface) includes
the rendering engine and display hardware. An interface between the remote computer
and local computer defines specific parameters for displaying information on the local
computer (e.g., screen resolution, graphics or textual display, color palettes supported,
etc.). Using this interface, the remote application specifies a logical layout supported by
the physical rendering of the local client. For example, the remote application may
specify, in a logical layout, the toolbar at the top of the output display. In order to ensure
that the toolbar is readable, the remote application knows the overall resolution of the
output display on the local client. The rendering engine at the local client translates the
data for the toolbar, and stores the graphics data in the frame buffer at the local client.
The contents of the frame buffer are thereafter rendered on the local client's output
display.
Internet technology uses markup languages and Web browsers to display Web
applications on local computer displays. Using Web browser technology, the application
running on the web server does not need to know the specifics of the client display
characteristics. However, the application logic is typically tuned to a specific display resolution, particularly if the web page contains graphic information. Often times Web
applications specify to the user at the local computer a viewing resolution for the Web site
because the Web application was designed for that specific resolution. Thus, Web
technology still requires the application to have specific knowledge of the display
characteristics of a computer that displays Web pages. In addition, user interfaces on the
Web are very disconnected, so as to require splitting the application logic between the
server and the client (e.g., Javascript); otherwise, the Web applications are not smooth, interactive applications. Although Web technology may be useful because
most users view the information from desktop or notebook computers with a pre-defined
resolution, the technology is not effective for use in systems that integrate devices with
different types of displays. Accordingly, it is desirable to develop a remote application
technology that permits using local client displays for remote applications regardless of the type of display at the local client. It is also desirable to develop applications to utilize
remote user interfaces and remote display of information.
SUMMARY OF THE INVENTION
A user interface for an application program is implemented on a display client.
The core application logic for the application program is executed on the remote
computer. The division of functionality between the application program, operating on
the remote computer, and the user interface, operating on the display client, does not
require the application program to possess information regarding the display capabilities
of the display client. The remote computer transfers, to the display client, an
identification of a scene for a user interface of the application program. The scene defines an abstract layout for a screen display of the user interface. The user interface receives
input from a user (e.g., the user selects a menu item from the user interface). The input event is interpreted, and data is generated based on the interpretation of the input event.
The display client interprets the scene and the data based on the display capabilities of the
display client. Based on this interpretation, the display client generates a display scene
and display data for the scene. The display data is rendered on an output device of the
display client.
In one embodiment, the display client stores a plurality of pre-defined scenes, and
receives a scene descriptor from the application logic. In response, the display client
instantiates the scene based on the pre-defined scenes and the scene descriptor. A scene
includes a plurality of slots. To render the display scene, the display client iterates
through the slots of the scene to populate the display data. In one embodiment, a slot is
implemented using a widget. A widget includes a controller, a model and a view. The
controller interprets the input event, the model generates data in response to the
interpretation of the input event, and the view renders the data on the display client. The
controller, model, and view portions of a widget may be implemented at the display client
or remote from the display client.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates a media space configured in accordance with one embodiment of the present invention.
Figure 2 illustrates one embodiment for integrating devices into a single media space.
Figure 3 is a block diagram illustrating one embodiment for implementing a remote application.
Figure 4 is a block diagram illustrating an example abstract scene layout.
Figure 5 is a block diagram illustrating one embodiment for implementing an abstract scene with widgets.
Figure 6 illustrates an example screen display generated for a graphics display.
Figure 7 illustrates an example screen display generated for a liquid crystal display ("LCD").
Figure 8 is a block diagram illustrating a further embodiment for implementing a remote application.
Figure 9 is a block diagram illustrating another embodiment for implementing a remote application.
Figure 10 is a block diagram illustrating one embodiment for implementing widget-based controllers for the remote user interface of the present invention.
Figure 11 is a block diagram illustrating one embodiment for providing a model for a user interface.
Figure 12 is a block diagram illustrating another embodiment for a remote application.
Figure 13 is a flow diagram illustrating one embodiment for a method to remote an application.
Figure 14 is a flow diagram illustrating one embodiment for implementing a user interface from a remote application.
Figure 15 illustrates an example user interface for a television implemented on a client device.
Figure 16 illustrates one embodiment for rendering a user interface of an audio application on a client device.
Figure 17 illustrates an application for the remote display of media information from a media server to an A/V receiver.
DETAILED DESCRIPTION
Media Convergence Platform:
A media convergence platform provides an efficient and easy way for one or more
users to manage and playback media within a "media space." As used herein, a "media
space" connotes one or more media storage devices coupled to one or more media players
for use by one or more users. The integration of media storage devices and media players
into a single media space permits distributed management and control of content available
within the media space.
Figure 1 illustrates a media space configured in accordance with one embodiment
of the present invention. As shown in Figure 1, the media space 100 includes "n" media
storage devices 110, where "n" is any integer value greater than or equal to one. The
media storage devices 110 store any type of media. In one embodiment, the media
storage devices 110 store digital media, such as digital audio, digital video (e.g., DVD,
MPEG, etc.), and digital images. The media space 100 also includes "m" media players
120, where "m" is any integer value greater than or equal to one. In general, the media
players 120 are devices suitable for playing and or viewing various types of media. For
example, a media player may comprise a stereo system for playing music or a television
for playing DVDs or viewing digital photos. As shown in Figure 1, the media storage devices 110 are coupled to the media players 120. The media storage devices 110 and the media players 120 are shown in
Figure 1 as separate devices to depict the separate functions of media storage and media
playback; however, the media players may perform both the storage and playback
functions. For example, a media player may comprise a DVD player that includes a hard
drive for the storage and playback of digital video. In other embodiments, the storage of
media and the playback/viewing of media are performed by separate devices. For this
embodiment, the media players 120 playback content stored on the media storage devices
110. For example, a video clip stored on media storage device "1" may be played on any
of the applicable "m" media players 120.
The storage devices 110 and media players 120 are controlled by management
component 130. In general, management component 130 permits users to aggregate,
organize, control (e.g., add, delete or modify), browse, and playback media available
within the media space 100. The management component 130 may be implemented
across multiple devices. The media space of Figure 1 shows a plurality of users 140 to depict that more than one user may playback/view media through different media players.
The system supports playback of different media through multiple media players (i.e., the
system provides multiple streams of media simultaneously). The users 140, through
management component 130, may also organize, control, and browse media available within the media space. The management component 130 provides a distributed means to
manage and control all media within the media space.
Figure 2 illustrates one embodiment for integrating devices into a single media
space. For example, system 200, shown in Figure 2, may be a home media system. For this embodiment, the media space 200 includes at least one personal video recorder
("PVR") -media server 210 (i.e., the media space may include many media servers). The
media server 210 stores media for distribution throughout the media space 200. In
addition, the media server 210 stores system software to integrate the components of the
media space, to distribute media through the media space, and to provide a user interface
for the components of the media space. The PVR-media server 210 may also include one
or more television tuners and software to record television signals on a storage medium.
As shown in Figure 2, the PVR-media server 210 is coupled to different types of
media players, including one or more televisions (e.g., television 250) and one or more
media players (e.g., audio and video playback devices), such as playback device 240. The
media playback device may comprise AVR receivers, CD players, digital music players
(e.g., MP3), DVD players, VCRs, etc. For this embodiment, the PVR-media server 210 is
also coupled to one or more media managers 280 and to external content provider(s) 290.
For this embodiment, the PVR-media server 210 executes software to perform a
variety of functions within the media space. Thus, in this configuration, the PVR-media
server 210 operates as a "thick client." A user accesses and controls the functions of the
media convergence platform through a system user interface. The user interface utilizes
the thick and thin clients, as well as some media players (e.g., television 250 & media playback device 240). In one embodiment, the user interface includes a plurality of
interactive screens displayed on media player output devices to permit a user to access the
functionality of the system. A screen of the user interface includes one or more items for
selection by a user. The user navigates through the user interface using a remote control
device (e.g., remote control 260). The user, through use of a remote control, controls the
display of screens in the user interface and selects items displayed on the screens. A user interface permits the user, through use of a remote control, to perform a variety of
functions pertaining to the media available in the media space.
The components of the media convergence platform are integrated through a
network. For example, in the embodiment of Figure 2, the devices (e.g., PVR-media server 210, television 250, remote control 260, media player 240 and media manager 280)
are integrated through network 225. Network 225 may comprise any type of network,
including wireless networks. For example, network 225 may comprise networks
implemented in accordance with standards, such as Ethernet 10/100 on Category 5,
HPNA, HomePlug, IEEE 802.11x, IEEE 1394, and USB 1.1/2.0.
For the embodiment of Figure 2, one or more thin video clients may be integrated
into the media space. For example, a thin video client may be coupled to PVR-media
server 210 to provide playback of digital media on television 250. A thin video client
does not store media. Instead, a thin video client receives media from PVR-media server
210, and processes the media for display or playback on a standard television. For
example, PVR-media server 210 transmits a digital movie over network 225, and the thin
video client processes the digital movie for display on television 250. In one
embodiment, the thin video client processes the digital movie "on the fly" to provide
NTSC or PAL formatted video for playback on a standard television. A thin video client
may be integrated into a television.
The media convergence platform system also optionally integrates one or more
thin audio clients into the media space. For example, a thin audio client may receive
digital music (e.g., MP3 format) from PVR-media server 210 over network 225, and may
process the digital music for playback on a standard audio system. In one embodiment, the thin audio client includes a small display (e.g., liquid crystal display "LCD") and
buttons for use as a user interface. The PVR-media server 210 transmits items and
identifiers for the items for display on the thin audio client. For example, the thin audio
client may display lists of tracks for playback on an audio system. The user selects items
displayed on the screen using the buttons to command the system. For example, the thin
audio client screen may display a list of albums available in the media space, and the user,
through use of the buttons, may command the user interface to display a list of tracks for a
selected album. Then, the user may select a track displayed on the screen for playback on
the audio system.
The media manager 280 is an optional component for the media convergence
platform system. In general, the media manager 280 permits the user to organize,
download, and edit media in the personal computer "PC" environment. The media
manager 280 may store media for integration into the media space (i.e., store media for
use by other components in the media space). In one embodiment, the media manager
280 permits the user to perform system functions on a PC that are less suitable for
implementation on a television based user interface.
The media space may be extended to access media stored external to those
components located in the same general physical proximity (e.g., a house). In one
embodiment, the media convergence platform system integrates content from external
sources into the media space. For example, as shown in Figure 2, the PVR-media server
210 may access content external to the local network 225. The external content may include any type of media, such as digital music and video. The media convergence
platform system may be coupled to external content 290 through a broadband connection
(i.e., high bandwidth communications link) to permit downloading of media rich content. The external content may be delivered to the media convergence platform system through
use of the Internet, or the external content may be delivered through use of private
distribution networks. In other embodiments, the external content may be broadcast.
For example, the media server 210 may access external content 290 through a data casting
service (i.e., data modulated and broadcast using RF, microwave, or satellite technology).
Remote Applications:
As used herein, a "remote application" connotes software, operating on a device
other than a local device, used to provide functionality on a local device. As described herein, the techniques of the present invention do not require the remote application to
possess pre-existing information about the characteristics of the local display device (e.g.,
display resolution, graphics capabilities, etc.).
In one embodiment, the software system separates the user interface ("UI")
application logic from the UI rendering. In one implementation, the system defines user
interface displays in terms of "abstract scenes." In general, an abstract scene is a layout
for a screen display, and it consists of logical entities or elements. For example, an
abstract scene may define, for a particular display, a title at the top of the display, a
message at the bottom of the display, and a list of elements in the middle of the display. The
scene itself does not define the particular data for the title, message and list. In one
implementation, the software comprises pre-defined scenes, UI application logic, a scene
manager, and a UI rendering engine. In general, pre-defined scenes describe an abstract
layout in terms of logical entities for a UI display. Typically, the application logic
determines the scene and provides data to populate the scene based on the logical flow of
the application. For example, a user may select a first item displayed on the current UI display. In response, the application logic selects, if applicable, a new abstract scene and
data to populate the new scene based on the user selection.
The application logic is implemented independent of the scene and the UI rendering. The application logic selects a scene descriptor, to define an abstract layout, in
terms of the abstract elements. The application logic then populates the logical elements
with data, and transfers the abstract layout (scene descriptors) with data to the display
client. A scene manager, running on the local client, interprets the scene descriptors
based on the display capabilities of the display client. For example, if the display for a
display client is only capable of displaying lists, then the scene manager translates the
scene with data to display only lists. This translation may result in deleting some
information from the scene to render the display. The scene manager may convert other logical elements to a list for display on the LCD display. The UI rendering engine renders
display data for the scene with display elements particular to the output display for the
display client. The display elements include display resolution, font size for textual
display, the ability to display graphics, etc. For example, if the output device is a
television screen, then the UI rendering engine generates graphics data (i.e., RGB data)
suitable for display of the scene on the television screen (e.g., proper resolution, font size,
etc.). If the output display is a liquid crystal display ("LCD"), the UI rendering engine
translates the scene logical entities to a format suitable for display on the LCD display.
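The scene-manager translation described above can be condensed into a short sketch. This is only an illustration: the function names (make_scene, render_lcd, render_graphics) and the dictionary-based scene layout are assumptions, not details from the present specification.

```python
def make_scene(title, message, items):
    # An abstract scene: logical entities only; no resolution, fonts, or colors.
    return {"title": title, "message": message, "list": items}

def render_lcd(scene, rows):
    # A list-only character display: the scene manager drops the message and
    # truncates the list to the number of available display rows.
    return [scene["title"]] + scene["list"][: rows - 1]

def render_graphics(scene):
    # A graphics display can present every logical entity of the scene.
    return {"title_box": scene["title"],
            "menu_boxes": list(scene["list"]),
            "status_bar": scene["message"]}
```

The same abstract scene thus yields a truncated text list on one device and a full graphical layout on another, without the application logic knowing which device is attached.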
A user interface implementation that separates the UI application logic from the
UI rendering has several advantages. First, the application logic does not require any information regarding the capabilities of the output display. Instead, the application logic
only views the UI display in terms of logical entities, and populates data for those logical
entities based on user input and logical flow of the user interface. Second, this separation permits a graphical designer of a user interface system to easily change the scenes of the
user interface. For example, if a graphical designer desires to change a scene in the user
interface, the graphical designer only changes the mapping from abstract to physical
layout of the scene. During runtime, the application logic receives the revised scene
descriptor, populates the revised scene descriptor with data via slots, and transmits the
scene descriptor with data to the local client. Software on the local client determines
those display elements to display the scene based on the device's display. Thus, a change
to the scene does not require a change to the display elements particular to each output display because the conversion from the scene to the display elements occurs locally.
In one embodiment, the media convergence platform permits implementing user
interface software remote from a device. In one implementation, the application logic is
executed on a device remote from the device displaying a user interface. The device
displaying the user interface contains the UI rendering software. For this implementation,
the data and scenes for a user interface (e.g., scene descriptors) exist on a remote device.
Using this implementation, the scene interface (interface between the scene descriptors
and the application logic) is remote from the device rendering the display. The remote
device (e.g., server) does not transfer large bitmaps across the network because only scene
descriptor information with data is transferred. This delineation of functions provides a
logical boundary between devices on a network that maximizes throughput over the
network. In addition, a remote device hosting the application logic does not require
information regarding display capabilities of each device on the home network. Thus, this
implementation pushes the UI rendering software to the device rendering the images,
while permitting the application logic to reside on other devices. This architecture
permits implementing a thin client in a media convergence platform because the thin client need not run the application logic software. In addition, the architecture permits
implementing a "thin application server" because the application server does not need to know about every possible rendering client type.
Figure 3 is a block diagram illustrating one embodiment for implementing a
remote application. For this example embodiment, a remote application 310 includes
scene descriptors 320 and application logic 330. For example, remote application 310 may comprise a media server with considerable processing capabilities, such as a
computer or set-top box. A client device, 370, has a display 360, for displaying
information to a user (e.g., displaying data to implement a user interface), a rendering
engine 355, and a scene manager 350. The rendering engine 355 receives, as input, a data
model from scene manager 350, and generates, as output, display data. The display data
is a type of data necessary to render an image on the display 360. For example, if the
display 360 comprises a graphics display, then display data includes information (e.g.,
RGB data) to render a graphical image on a display.
Figure 3 illustrates separating a UI rendering, implemented on a client device,
from application logic implemented on a remote device (310). In an example operation, a
list of objects (e.g., musical albums) may be displayed on display 360. In this example,
the user may select an album for playback. A scene descriptor (320) may define an
abstract layout for this application. For example, the scene descriptor may define a list of
audio track elements and control information. The application logic 330 receives the
scene descriptor. The application logic 330 populates the elements of the scene descriptor
with data particular to the selection. Thus, for this example, application logic 330
populates the list of audio track elements with the names of the audio tracks for the album selected by the user. The application logic 330 then transmits, through interface 340, the
scene data to the scene manager 350 on client 370. The scene manager 350 converts the scene elements with data to the display elements. The rendering engine 355 generates
data in a format suitable for display on display 360. For example, if display 360 is an
LCD display, then rendering engine 355 generates a textual list of audio tracks. In
another example, if display 360 is a graphics display, then rendering engine 355 generates
graphics data (e.g., RGB) for the list of audio tracks.
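The album-selection example above might be sketched as follows. The dictionary-based descriptor and function names are assumptions for illustration; the point is that only the small descriptor-plus-data structure, never rendered bitmaps, would cross the network.

```python
# Illustrative layout; the specification does not prescribe this data format.
SCENE_DESCRIPTOR = {"slots": ["title", "tracks"]}  # abstract layout only

def populate(descriptor, album, tracks):
    # Application logic (remote side): fill the logical slots with data for
    # the user's selection. Only this small structure crosses the network.
    return {"descriptor": descriptor,
            "data": {"title": album, "tracks": list(tracks)}}

def render_text(scene):
    # Rendering engine (client side): a character display gets a title line
    # followed by one numbered line per audio track.
    d = scene["data"]
    return [d["title"]] + ["%d. %s" % (i + 1, t)
                           for i, t in enumerate(d["tracks"])]
```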
In one embodiment, the techniques use "abstract scenes", defined by scene
descriptors, to implement a user interface. In one embodiment, each application
communicates in terms of at least one scene descriptor. A scene descriptor, in its simplest
form, may constitute a list (e.g., a list scene). In general, a scene descriptor defines a
plurality of slots and the relative locations of those slots for rendering the scene on an
output display. The slots of a scene provide the framework for an application to render
specific information on a display. However, an abstract scene defined by a scene
descriptor does not define specific content for a slot. The abstract scene is developed in
the application layout section on the remote computer (i.e., the computer operating the
remote application).
In one embodiment, the system divides labor between the remote application
computer and the local display computer through use of scene descriptors. Specifically,
the remote application communicates the scene descriptor, in terms of logical coordinates,
to the local display computer. The local display computer translates the scene descriptor
based on its underlying display capabilities. In other embodiments, the remote application
may define additional information about a scene, so as to shift more UI operations to the
remote application. In yet other embodiments, the remote application may provide less information about a scene, thereby assigning more UI operations to the local client computer.
As an example, a scene descriptor may include one or more titles, a message, and
a list of elements. Figure 4 is a block diagram illustrating an example scene descriptor.
As shown in Figure 4, the example scene includes a plurality of slots (i.e., A, B and C).
Slot_A, located at the top of the scene, may be used to display a major title (e.g., the title
of the application). Slot_B, located in the center of the scene, includes a plurality of
elements, 1-n, to display a list. For example, slot_B may be used by the application to
display menu items. For this example, each subcomponent (e.g., slot_B1, slot_B2 ... slot_Bn) may represent a menu item. In one application, the menu items comprise media items
(e.g., music, video, etc.) available in a media system. The number of menu items
displayed may be variable and dependent upon the display capabilities of the local
computer. The third slot shown in Figure 4, slot_C, is displayed in the lower left corner.
The remote application may use slot_C to display an icon (e.g., a logo for the remote application software publisher).
In one embodiment, the remote application constructs a list of elements for a scene
descriptor, which includes data for display in the slots, and transfers the list of elements in
a block defined by the interface (e.g., interface 340, Figure 3) to the local display device.
In one embodiment, the remote application interrogates the scene (at the client) to
determine the number of visible elements for display, and then retrieves the list items for
those visible elements. For example, the list elements may include a data model, abstract
interface model, raw string, etc. In one embodiment, "widgets," a software implementation, are used in the user interface. For this embodiment, an abstract scene is implemented with a collection of
widgets. A widget corresponds to one or more slots on an abstract scene. In one
implementation, a widget comprises a controller, model, and view subcomponents. A view is an interpretation of the abstract scene suitable for a specific display. For example,
a first view of an abstract scene may be suitable for rendering on a graphical display, and
a second view of an abstract scene may be suitable for rendering the abstract scene on an
LCD display. The model provides the underlying data for slots of an abstract scene. For
example, if a slot consists of a list of menu items, then the model for that slot may include
a list of text strings to display the menu items. Finally, a controller provides the logic to
interpret user interface events (i.e., user input to the user interface). For example, if a user
selects a menu item displayed on the user interface, an event is generated to indicate the
selection of the item. The controller provides the logic to interpret the event, and initiate,
if necessary, a new model and view.
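The controller/model/view split of a widget might be sketched as below. The class name, the event strings, and the text-based view are assumptions for illustration only.

```python
class ListWidget:
    """A minimal widget for one list slot; names are illustrative and not
    taken from the present specification."""

    def __init__(self, model):
        self.model = model      # model: the underlying data for the slot
        self.selected = 0       # view state: which item is highlighted

    def view(self):
        # View: a text interpretation of the slot (a graphics view would
        # instead emit boxes and RGB data for the same model).
        return [("> " if i == self.selected else "  ") + item
                for i, item in enumerate(self.model)]

    def controller(self, event):
        # Controller: interpret a UI event and update the widget state,
        # initiating a new view on the next render.
        if event == "down" and self.selected < len(self.model) - 1:
            self.selected += 1
        elif event == "up" and self.selected > 0:
            self.selected -= 1
```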
Figure 5 is a block diagram illustrating one embodiment for implementing an
abstract scene with widgets. The example abstract scene of Figure 4 is shown in Figure
5. A widget corresponds to each slot on the abstract scene. Specifically, widget_A is
instantiated for slot_A, widget_B is instantiated for slot_B, and widget_C is instantiated for
slot_C. Also, as shown in Figure 5, each widget (A, B and C) includes a controller, model
and view. Note that slot_B on the abstract interface includes a number of subcomponents.
Widget_B may be configured to render slot_B and its subcomponents.
Figures 6 and 7 illustrate two different example screens supported by the
techniques of the present invention. An example user interface display (e.g., screen) for a graphics display is shown in Figure 6. A second example screen for a liquid crystal
display ("LCD") is shown in Figure 7. The example screens utilize the example scene
descriptor of Figure 4. The text string "Home Media Applications" is populated in slot_A
(Figure 4) on screen 600 of Figure 6 and on screen 700 of Figure 7. However, the
underlying widget for screen 600 presents the text string, "Home Media Applications", in a box. For the LCD display 700, the text string "Home Media Applications" is displayed
on the first line of the display. Slot_B (Figure 4) contains a plurality of elements. For this
example, the elements represent menu items (i.e., home media applications available).
Each element (i.e., "Music Jukebox", "Photo Albums", "Video Clips", and "Internet
Content") is individually displayed in a graphics box on display 600. For display 700, the
menu items are displayed on individual lines of the LCD display. A third slot, slot_C, for
the scene descriptor (Figure 4) is displayed on screen 600 as a graphical symbol. The
LCD display 700 (Figure 7) cannot display graphics, and therefore the graphics symbol is
not displayed. The example user interface displays of Figures 6 and 7 illustrate two different screens generated for the same remote application.
Figure 9 is a block diagram illustrating another embodiment for implementing a
remote application. The application logic 910 implements the functionality of an
application program, and display client 940 implements a user interface. As shown in
Figure 9, to implement the application, application logic 910 communicates with display
client 940 in a manner to divide functionality between application logic 910 and display
client 940. For this embodiment, display client 940 performs more functions than a
purely "thin" client (i.e., display client 940 may be described as a "thicker" client). The
display client 940 includes modules to implement a user interface for the application
specific to the display capabilities of the display client. To this end, display client 940 includes scene manager 945, scene 950, slots 955, and pre-defined scenes 365. A widget,
960, includes model 930, view 965, and controller 970 components implemented in both
application logic 910 and display client 940. The dashed line around widget 960 indicates
that the widget is implemented across both application logic 910 and display client 940.
Specifically, display client 940 implements controller 970 and view 965 portions of
widget 960. The model portion of widget 960 is implemented on application logic 910
(i.e., model 930). As shown in Figure 9, application logic 910 also includes scene
descriptors 920.
In operation, application logic 910 selects an abstract scene for the user interface.
To this end, application logic 910 interrogates display client 940 to determine the scenes
supported by display client 940 (i.e., scenes available in pre-defined scenes 365). The
application logic 910 transmits a scene descriptor (one of scene descriptors 920) to
display client 940 to identify the abstract scene. Based on the scene descriptor, the scene manager module 945 instantiates a scene for the user interface. The instantiated scene is
depicted in Figure 9 as scene 950. The scene 950 aggregates through the slots 955 to
compose a user interface screen. The slots 955 of scene 950 are populated through use of
widget 960. Specifically, input events, input from the user through the user interface, are
processed by controller 970. The model to support the slots is provided from the model
930 in application logic 910. Finally, the view of each slot is supported by view module
965, implemented by the display client 940.
Figure 8 is a block diagram illustrating a further embodiment for implementing a
remote application. For this embodiment, widget 860, supporting the slots for a scene, is implemented entirely on application logic 810. Thus, for this embodiment, display client 840 may be characterized as a "thin" client. Similar to the embodiment of Figure 9,
application logic 810 interrogates display client 840 to determine the available scenes
(i.e., scenes available in predefined scenes 865). To implement the user interface on display client 840, application logic 810 transmits, over a network, a scene descriptor to
identify an abstract scene. Based on the scene descriptor, scene manager 845 instantiates
a scene (e.g., scene 850). The slots for the scene are populated through use of widget 860.
Specifically, input events, received from the user interface, are propagated, across the
network, to controller module 870. Model 830 provides data to support the user interface slots, and view module 865 supports the view. For this embodiment, both the
model and the view modules are implemented on application logic 810. As shown in
Figure 8, the view is communicated back to display client 840. The scene 850 aggregates
through the slots 855 to generate a screen for the user interface. The software for the
controller portion of a widget may reside locally on a client, or may be invoked across the
network from a remote network device.
Figure 10 is a block diagram illustrating one embodiment for implementing
widget-based controllers for the remote user interface of the present invention. For this
example, local display device 1030 instantiates a widget, widget_A, to control and render
one or more slots of the abstract scene. For this example, widget_A consists of software
located on both the local display device 1030 (local controller 1040) and on the client
network device 1010 (remote controller 1025). For this implementation, the local
controller 1040 processes certain events for widget_A. Typically, local controller 1040 may
process simple events that are less driven by the application logic. For example, an event may be generated when a user moves the cursor from one item on a list to another. In
response to this action, the user interface may highlight each item to indicate the placement of the user's cursor. For this example, a widget may use local controller 1040
to process the event to initiate a new model and view (e.g., render a highlighted menu item on the list).
Other events may require more sophisticated operations from the underlying
remote application. In one embodiment, to accomplish this, the remote application
(1020), operating on client network device (1010), instantiates a remote controller (1025).
In other embodiments, remote controller 1025 may not be a separate object, but may be
part of procedural code within remote application 1020. As shown in Figure 10, widget_A,
operating on local display device 1030, propagates an event to remote controller 1025
through interface 1050. In one embodiment, widget_A uses a remote procedure call
("RPC") mechanism to invoke remote controller 1025 on remote network device 1010 for
operation at the local display device 1030. For example, widget_A may receive an event from a user to select an application displayed on the screen from a menu list of available
applications. In response to the event, the remote controller 1025 may generate a top
menu screen for the new application. The new top menu screen may require a new scene
descriptor, or may use the existing scene descriptor.
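The local-versus-remote controller split might look like the following sketch. The event dictionaries and class names are assumptions, and a real system would invoke the remote controller through an RPC mechanism rather than a direct method call as shown here.

```python
class LocalController:
    # Handles simple, display-local events, e.g. moving the highlight.
    def handle(self, event, state):
        if event["type"] == "move":
            return dict(state, highlight=event["to"])
        return None  # not a local event; defer to the remote controller

class RemoteController:
    # Stand-in for a controller reached over the network; a plain object
    # here, so only the call pattern (not the transport) is shown.
    def handle(self, event, state):
        if event["type"] == "launch":
            return {"scene": "top_menu", "app": event["item"], "highlight": 0}
        return state

def dispatch(event, state, local, remote):
    # Try the local controller first; propagate application-level events
    # to the remote controller across the network.
    return local.handle(event, state) or remote.handle(event, state)
```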
Data may be supplied to a local display device either locally or from across the
network. Figure 11 is a block diagram illustrating one embodiment for providing a
model for a user interface. For this embodiment, local display device 1130 operates
remote application 1120 operating on remote network device 1110. The remote
application 1120 instantiates a data model object 1125. For this example, data model
object 1125 provides an interface to a data store 1140. The data store 1140 may reside on
the remote network device 1110, or it may reside anywhere accessible by the network (e.g., another network device or a service integrating the data store from an external source to the network). For this embodiment, the controller (not shown) interprets an event, and invokes the data model object in accordance with the interpretation of the event. For example, in a media system, the controller may interpret an event that requests all available musical tracks within a specified genre. For this request, data model object
1125 may generate a query to database 1140 for all musical tracks classified in the
specified genre. As shown in Figure 11, data model object 1125 communicates the model (data) to local display device 1130 through interface 1150.
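A data-model object of this kind might be sketched as follows; the in-memory list stands in for data store 1140, and the class and method names are illustrative.

```python
class DataModel:
    """Illustrative data-model object fronting a data store; a real store
    might be a database on another network device."""

    def __init__(self, store):
        self.store = store  # rows of (track_title, genre)

    def tracks_in_genre(self, genre):
        # Analogous to the query generated for all musical tracks
        # classified in the specified genre.
        return [title for title, g in self.store if g == genre]
```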
In other implementations, the model may comprise a text string. For example, a
current UI screen may consist of a list of all high-level functions available to a user. For this example, a user may select a function, and in response, the system may display a list, which consists of text strings, of all sub-functions available for the selected function. In another embodiment, the data model may be provided as a handle to the user interface implementation.
Figure 12 is a block diagram illustrating another embodiment for a remote application. For this embodiment, the widgets for a local display device are highly distributed. For example, one or more widgets may be implemented across multiple
devices (e.g., multiple application servers). For the example of Figure 12, three widgets
are utilized to implement the local display device 1175. Although the example of Figure 12 shows three widgets distributed among three devices, any number of widgets implemented over any number of devices may be implemented without deviating from the spirit or scope of the invention. A first widget, widget_1, comprises view_1 (1180)
implemented on local display device 1175. As shown in Figure 12, the model-controller (1157) implementation for widget_1 is implemented on application server_1 (1155).
Widget_2, which has a view component (1185) implemented on local display device 1175,
implements the model-controller component (1162) on a separate device (i.e.,
application server_2). A third device, application server_3 (1170), implements the
model-controller component (1172) for a third widget. As shown in Figure 12, the view
component (1190) for the third widget is implemented on local display device 1175.
The remote application technology of the present invention supports highly
distributed applications. For the example shown in Figure 12, a single display (e.g., local
display device 1175) may support three different remote applications. Information for
the three applications is integrated on a single display. Since the view component of the
widget is implemented on the local display device, the view of information for the remote applications may be tailored to the particular display device.
Figure 13 is a flow diagram illustrating one embodiment for a method to remote
an application. First, the remote application defines a set of widgets and selects a scene
descriptor that describes an abstract scene (block 1210, Figure 13). The remote
application obtains a handle to the local display device (block 1220, Figure 13). The
handle provides a means for the remote application to communicate with the local display
device. The remote application queries the local display device for a scene interface for
the defined scene descriptor (block 1230, Figure 13). In general, the scene interface
provides a means for the remote application and local display device to communicate in
terms of an abstract scene. To this end, the scene interface defines slots for the abstract
scene. For example, to provide a data model, the remote application populates a data
structure in the scene interface. The local display device instantiates software to locally implement the abstract
scene and one or more components of one or more widgets (block 1240, Figure 13). As described above, a widget may incorporate all or portions of the controller, model, view
subcomponents of a widget. In one embodiment, the remote application transfers the
initial scene data (model) to the local display device through the scene interface (block
1250, Figure 13). In turn, the local display device renders the initial scene using the data
model (block 1260, Figure 13). When the user submits input to the user interface (e.g.,
the user selects a menu item from a list), the widget executes control logic based on the
user input (i.e., event) (block 1270, Figure 13). In some embodiments, the controller may
be implemented by one or more remote applications. For example, the local display
device may include two widgets. The controller of a first widget may be implemented on
a first application server, and the controller for the second widget may be implemented on
a second application server. In other embodiments, the local display device may
implement the controller. A new model is implemented based on interpretation of the
user event (block 1280, Figure 13). In some embodiments, the model may be supplied
by one or more remote applications. The local display device may be implemented with
four widgets. For this example, a first remote application, running on a first application
server, may supply the model for two widgets and another remote application server,
running on a second application server, may supply the model for the other two widgets.
Also, the widget renders a new scene with the data model supplied (block 1290, Figure
13).
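The flow of Figure 13 can be condensed into a sketch. The DisplayClient class and its methods are hypothetical stand-ins for the handle and scene interface described above; none of these names come from the specification.

```python
class DisplayClient:
    # Hypothetical local display device reached through a handle.
    def scene_interface(self, descriptor):
        # Block 1230: hand back a scene interface that defines the slots.
        return {"descriptor": descriptor, "slots": {}}

    def set_model(self, iface, data):
        # Block 1250: the remote application populates the data structure.
        iface["slots"]["list"] = list(data)

    def render(self, iface):
        # The scene is rendered locally, using the supplied data model.
        return [iface["descriptor"] + ":"] + iface["slots"].get("list", [])

def run_remote_application(display):
    # The remote-application side of blocks 1210-1260, in order.
    iface = display.scene_interface("list_scene")   # select/query the scene
    display.set_model(iface, ["Music", "Photos"])   # transfer initial model
    return display.render(iface)                    # device renders locally
```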
Figure 14 is a flow diagram illustrating one embodiment for implementing a user
interface from a remote application. The process is initiated when the user interface
receives input from a user (block 1310, Figure 14). A widget, corresponding to the slot on the abstract scene, interprets the user input and generates an event (block 1320, Figure
14). For example, if the user selects a menu item displayed in a slot on the abstract scene,
then the widget that manages the corresponding slot generates an event to signify selection of the menu item.
If the widget controller for the event is local, then the widget controller, operating
on the local display device, interprets the event (blocks 1330 and 1350, Figure 14). Alternatively, if the widget controller is implemented remotely, then the widget remotes
the event across the network (block 1340, Figure 14). For example, the controller for the
widget may reside on one or more computers remote from the local display device. If the
widget interprets the event locally and the event does not require an external data model,
then the widget supplies the data model (blocks 1360 and 1370, Figure 14). If the event
does require an external data model from one or more remote devices (over the network)
or the widget remotes the controller over the network to one or more remote devices, then
one or more remote applications iterate through the active widgets on the remote scene
and provide a data model to the local display device (blocks 1360 and 1380, Figure 14).
Using the data model, the scene is rendered at the local display device (block 1390,
Figure 14).
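The decision points of Figure 14 (local vs. remote controller, local vs. external data model) can be sketched as a dispatch routine. All names here are illustrative assumptions; the network object stands in for the round trips the patent describes.

```python
# Hedged sketch of the event-dispatch flow in Figure 14. A real system
# would serialize the event over the network; FakeNetwork simulates the
# remote application's responses.

def interpret(event):
    # Local controller path (blocks 1330/1350): no external model needed.
    return {"action": event, "needs_external_data": False}

def render(widget, data):
    # Render the scene at the local display device (block 1390).
    return {"slot": widget["slot"], "content": data}

def dispatch(widget, event, network):
    if widget["controller_local"]:
        intent = interpret(event)                  # blocks 1330/1350
    else:
        intent = network.remote_interpret(event)   # block 1340
    if widget["model_local"] and not intent["needs_external_data"]:
        data = widget["local_data"]                # blocks 1360/1370
    else:
        data = network.fetch_model(intent)         # blocks 1360/1380
    return render(widget, data)

class FakeNetwork:
    def remote_interpret(self, event):
        return {"action": event, "needs_external_data": True}
    def fetch_model(self, intent):
        return ["track 1", "track 2"]

w = {"controller_local": False, "model_local": False,
     "local_data": None, "slot": 0}
print(dispatch(w, "select", FakeNetwork()))
```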
Two-Way Universal Remote Controller
The present invention has application to configure a two-way universal remote
controller. In general, the remote controller may be configured, on the fly, to control any
device on the network. Specifically, a user interface, operating as a remote application, is
implemented on a remote controller to control a target device. For this application, the
remote application (e.g., user interface) runs on a host computer device, and the remote controller, operating as a client device, renders the user interface. As a first "way" of
communications, the remote controller, operating as the rendering client, communicates
with the remote application to implement the user interface. Then, as a second "way" of
communication, the remote controller communicates with the target device to control the target device.
The remote controller may comprise a graphical display to implement a graphical
user interface. For example, the remote controller or rendering client may comprise a
personal digital assistant ("PDA"), a tablet personal computer ("PC"), a Java® phone, any
portable device operating Windows® CE, or a television user interface. In other
embodiments, the remote controller or rendering client may be implemented using a
character display.
In one embodiment, the two-way remote controller implements a television user
interface. For example, the remote application may implement an electronic
programming guide ("EPG") that operates as a user interface. Through use of the EPG, a
user may select programming for viewing on a television. Figure 15 illustrates an
example user interface for a television implemented on a client device. For this example,
television 1400 displays an electronic programming guide ("EPG"). The EPG permits a
user to select programming for viewing on a television. For this embodiment of an EPG,
multiple channels are displayed in a first vertical column. For the example EPG displayed
in Figure 15, channels 500 - 506 are shown. The user may view, on the EPG, additional
channels by scrolling up or down, through use of a remote control device, the list of
channels displayed in the EPG. Additional vertical columns are displayed that indicate
time slots for the channels (e.g., the time slots 4:30, 5:00 and 5:30). As shown in Figure 15, the EPG displays, beneath the time slot columns, the name of the program playing
during that time slot. For example, "Movie Showcase" is playing on channel 500 in time
slots 4:30 - 5:30. On the top of the EPG, information about a selected program is shown.
For the example shown in Figure 15, the program "April Morning" is selected, and
information about "April Morning", such as type of programming, genre, actors, short description, and program options, is displayed.
For the example of Figure 15, a portion of the EPG, displayed on television 1400,
is rendered on a client device 1410. For this example, client device 1410 comprises a
graphics display 1420. The client device display 1420 may also be sensitive to pressure
(e.g., touch sensitive) to permit user input through the display screen. For this example,
client device display 1420, which renders the user interface, is smaller than the display of
television 1400. Thus, it is not practical or effective to map the user interface directly to
the client device display 1420. Therefore, in rendering the user interface on the client
device display, only essential information is displayed. For the example rendering shown
in Figure 15, the television column, displaying channels 500 - 506, and one time slot
column, with corresponding programming information, are displayed. A user may scroll,
either vertically or horizontally, to view different channels or time slots of programming.
The example client device rendering of the user interface does not display metadata of a
selected program (e.g., the information regarding "April Morning" shown in television
1400). Thus, the television user interface is rendered on a client device with a small
display. The concepts of the present invention may also be applied to generate a user
interface for a personal video recorder ("PVR") or a digital video recorder ("DVR"). The present invention also has application for rendering non-graphical user
interfaces on a client device. For example, a user of home media network 200 (Figure 2)
may desire to use remote control 260 to control playback device 240. The playback
device 240 may not have a graphical user interface. Instead, the playback device 240 may
have a character-based display. Figure 16 illustrates one embodiment for rendering a user
interface of an audio application on a client device. In one embodiment shown in Figure
16, the target device may comprise an audio-video receiver ("AVR"). The AVR displays
information on a character-based display 1510. For example, display 1510 displays the
AVR input source (e.g., tuner) as well as additional information (e.g., band and station currently tuned). The user interface on the AVR 1500 further includes buttons 1520 for
user input (i.e., the AVR 1500 may also have a remote control specific to the AVR).
For this embodiment, a client device 1530 renders a user interface for the AVR.
The client device is not manufactured specifically for the AVR. Instead, a remote
application, residing on AVR 1500 or elsewhere, remotes the user interface of the AVR
(i.e., target device) to the client device 1530. For this embodiment, the client device 1530
comprises a graphical display 1540, also used as a user input device (i.e., user touches the
screen to interact with the user interface). For this AVR example, the client device 1530
renders, as part of the user interface, tuning control 1550 and volume control 1570 for
control of the tuning and volume on AVR 1500, respectively. The client device 1530 also
renders, as part of the user interface, additional information that specifies the source of the
AVR and the station currently tuned as well as a volume indicator 1560. The host
application and client device may also be configured to change the user interface based on
the mode of operation of the AVR. For purposes of explanation, a single AVR
application is presented; however, the target device may comprise a compact disc ("CD") device, a digital video disc ("DVD") device, a digital music playback device, or any
device that provides media services to the network.
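The AVR example can be sketched as a host publishing its state and a client rendering whichever controls and status lines apply. The state field names below are assumptions for illustration, not the patent's protocol.

```python
# Minimal sketch of remoting a character-display user interface: the
# host (e.g. the AVR application) exposes its current state, and the
# rendering client formats it for its own display, including a textual
# volume indicator in place of the graphical control 1560.

avr_state = {"source": "tuner", "band": "FM", "station": "101.5", "volume": 7}

def render_client_ui(state):
    lines = ["%s  %s %s" % (state["source"].upper(), state["band"], state["station"])]
    lines.append("volume: " + "#" * state["volume"])
    return lines

for line in render_client_ui(avr_state):
    print(line)
```

Because the client renders from state rather than from a fixed layout, the same approach adapts to a CD, DVD, or music playback target by changing only the published fields.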
The techniques of the present invention have application to render a user interface of a media convergence platform to a client device. A user interface for a media
convergence platform may present different types of media within a single user interface.
In one embodiment, the user interface is television based. For example, the user interface
may display, on a television display, selectable items to represent a music application, a
photo albums application, and a video application. The user selects an item displayed on
the television display to invoke an application. The music application permits a user to
select music available within the media convergence platform, and to playback the music
through a device accessible through the network. The photo albums application permits a
user to select one or more photos available within the media convergence platform, and to
view the photos through a device in the media convergence platform. The video
application permits a user to select one or more videos or video clips available within the
media convergence platform and to playback the video / video clips through a device
accessible on the network.
In one embodiment, a two-way remote control device is configured to implement a
user interface for a media convergence platform. For example, a host computer device,
such as a media server (e.g., PVR-media server 210, Figure 2), may run an application
program to implement the media convergence platform user interface. In one
embodiment, the media convergence platform user interface is television based (e.g.,
implemented on television 250, Figure 2). In another embodiment, the media
convergence platform user interface is implemented on a personal computer (e.g., implemented on personal computer 250, Figure 2). The application program may remote
the user interface to a client device, such as a PDA. Using the client device, the user may
control target devices on the network through a rendition of the media convergence platform user interface. This allows the user to use the user interface of the
media convergence platform even though the primary display for the user interface is not
available or convenient. For example, the media convergence platform may primarily use
a television, to display screens for the user interface, and a television remote control to
accept input from the user. For this example, a user of the media convergence platform
may desire to remote the user interface to a client device because the television is not
available or accessible to the user (e.g., the television is not in the same room as the user).
Using the client device as a two-way remote, the user may proceed to control a target
device (e.g., an audio player in the room with the user).
In one embodiment, the host computer device remotes a user interface to a client
device to control another target device on the network. For example, in the home network
of Figure 2, the PVR-media server 210 may run a user interface for playback device 240.
Under this scenario, the PVR-media server 210 remotes the user interface for playback
device 240 to remote control 260. As another example, media manager 280 may remote a
user interface for displaying photos to remote control 260. In turn, remote control 260
may be used to control an application to view photos on television 250. Accordingly, as
illustrated by the above examples, an application, which implements a user interface for a
remote device, may reside anywhere on the network to control any other device on the network.
Remote Display Applications:
The present invention has application to render display information at a client
device from an underlying remote application. An application program, operating on a
remote device (e.g., media server), may remote information to a client device (e.g.,
playback device in a home network). The client device may display the information on a
display, such as an LCD. In one embodiment, the client device renders display
information to identify media or information about the media. For example, the client
device may display information about media playing through the client playback device.
Figure 17 illustrates an application for the remote display of media information
from a media server to an A/V receiver. For this example, a media server operates a
program to browse, identify and select media. The media server may operate a media convergence platform user interface for browsing and selecting, for playback, video, audio
and photos. For the example shown in Figure 17, media server 1620, with output device
1600, is currently operating an audio application. For this application, output device 1600
displays a list of available audio selections (e.g., albums, tracks, artists, etc.). In addition,
output device 1600 displays an identification of the audio track currently playing (e.g.,
Now Playing Bruce Springsteen). An A/V receiver 1610, coupled to media server 1620, is
used as a playback device for media server 1620. The A/V receiver also includes a display
1630 (e.g., LCD). For this example, software, operating on media server 1620, remotes
display information to A/V receiver 1610 for display on display 1630. Specifically, for
this application, the A/V receiver displays information about the current music playing.
In another embodiment, the client device displays video information at a client
device. For example, a DVD player may be configured as a playback device to play video from a source on the network (e.g., media server). For example, a media server may
supply video (e.g., DVD) for playback at the DVD player. For this example, a display on the DVD player may display the name of the DVD currently playing as well as additional
information about the DVD. In other embodiments, a client device may display
information about photos or any other type of media.
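The "now playing" remoting above amounts to the server producing a short display string sized to the client's character display. A minimal sketch, with an assumed truncation rule for small LCDs:

```python
# Illustrative helper: the media server formats now-playing text and
# truncates it to fit the client's display width. The function name and
# truncation behavior are assumptions for this example.

def now_playing_text(track, artist, width):
    text = "Now Playing: %s - %s" % (track, artist)
    if len(text) <= width:
        return text
    # Truncate with an ellipsis for a narrow character display.
    return text[:width - 3].rstrip() + "..."

print(now_playing_text("Thunder Road", "Bruce Springsteen", 24))
```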
Although the present invention has been described in terms of specific exemplary
embodiments, it will be appreciated that various modifications and alterations might be
made by those skilled in the art without departing from the spirit and scope of the
invention.

Claims

What is claimed is:
1. A method for implementing a user interface on a display client from an
application program operating on a remote computer, said method comprising the steps of:
operating application logic for an application program on a remote computer;
transferring, from said remote computer to a display client, an identification of a
scene for a user interface of said application program, said scene defines an abstract
layout for at least one screen display of said user interface;
receiving an input event from a user through said user interface; interpreting said input event;
receiving data at said display client in response to said interpretation of said input
event;
interpreting said scene and said data at said display client to generate a display
device scene;
generating display data, for display at said display client, from said display device
scene; and
displaying said display data at said display client.
2. The method as set forth in claim 1, wherein the step of transferring an identification of a scene comprises the steps of:
receiving a scene descriptor from said application logic; and instantiating, at said display client, said scene based on an interpretation of said scene descriptor.
3. The method as set forth in claim 1, wherein:
the step of transferring an identification of a scene for said user interface further
comprises the step of assigning a plurality of slots to a scene; and
the step of generating display data comprises the step of iterating through said slots of said scene to generate said display data.
4. The method as set forth in claim 1, wherein the step of generating display
data comprises the step of implementing at least one widget comprising a controller, for interpreting said input event, a model, for generating data at said display client in
response to said interpretation of said input event, and a view for generating said data on
said display client.
5. The method as set forth in claim 4, wherein the step of implementing at
least one widget comprises the step of implementing a controller remote from said display
client.
6. The method as set forth in claim 4, wherein the step of implementing at
least one widget comprises the step of implementing a controller on said display client.
7. The method as set forth in claim 4, wherein the step of implementing at least one widget comprises the step of implementing a model remote from said display
client.
8. The method as set forth in claim 4, wherein the step of implementing at
least one widget comprises the step of implementing a model local to said display client.
9. The method as set forth in claim 4, wherein the step of implementing at
least one widget comprises the step of implementing a view remote from said display
client.
10. The method as set forth in claim 4, wherein the step of implementing at
least one widget comprises the step of implementing a view local to said display client.
11. A system comprising:
network; remote computer, coupled to said network, for operating application logic for an
application program, and for transferring to a display client, an identification of a scene
for a user interface of said application program, said scene defines an abstract layout for at
least one screen display of said user interface;
software for interpreting an input event and for generating data in response to said
interpretation of said input event; and
display client, coupled to said network, for implementing a user interface for said
application program, for receiving an input event from a user through said user interface, for interpreting said scene and said data to generate a display device scene, for generating display data from said display device scene, and for displaying said display data.
12. The system as set forth in claim 11, wherein:
said remote computer for storing a plurality of pre-defined scenes at said display client; and
said display client for receiving a scene descriptor from said remote computer, and
for instantiating said scene based on a pre-defined scene and said scene descriptor.
13. The system as set forth in claim 11 , wherein:
the step of transferring an identification of a scene for said user interface further
comprises the step of assigning a plurality of slots to a scene; and
the step of generating display data comprises the step of iterating through said
slots of said scene to generate said display data.
14. The system as set forth in claim 11, wherein said software comprises at
least one widget comprising a controller, for interpreting said input event, a model, for
generating data at said display client in response to said interpretation of said input event,
and a view for generating said data on said display client.
15. The system as set forth in claim 14, wherein said remote computer
comprises a controller for implementing said controller of said widget.
16. The system as set forth in claim 14, wherein said display client comprises a controller for implementing said controller of said widget.
17. The system as set forth in claim 14, wherein said model of said widget is implemented remote from said display client.
18. The system as set forth in claim 14, wherein said display client comprises a
model for implementing said model of said widget.
19. The system as set forth in claim 14, wherein said remote computer
comprises a view for implementing said view of said widget.
20. The system as set forth in claim 14, wherein said display client comprises a view for implementing said view of said widget.
21. A computer readable medium comprising a plurality of instructions, which
when executed on a computer, cause the computer to perform the steps of:
operating application logic for an application program on a remote computer;
transferring, from said remote computer to a display client, an identification of a
scene for a user interface of said application program, said scene defines an abstract
layout for at least one screen display of said user interface;
receiving an input event from a user through said user interface; interpreting said input event; receiving data at said display client in response to said interpretation of said input
event;
interpreting said scene and said data at said display client to generate a display device scene;
generating display data, for display at said display client, from said display device
scene; and
displaying said display data at said display client.
22. The computer readable medium as set forth in claim 21, wherein the step
of transferring an identification of a scene comprises the steps of:
storing a plurality of pre-defined scenes at said display client;
receiving a scene descriptor from said application logic; and
instantiating, at said display client, said scene based on a pre-defined scene and
said scene descriptor.
23. The computer readable medium as set forth in claim 21, wherein:
the step of transferring an identification of a scene for said user interface further
comprises the step of assigning a plurality of slots to a scene; and
the step of generating display data comprises the step of iterating through said
slots of said scene to generate said display data.
24. The computer readable medium as set forth in claim 21, wherein the step of generating display data comprises the step of implementing at least one widget
comprising a controller, for interpreting said input event, a model, for generating data at said display client in response to said interpretation of said input event, and a view for
generating said data on said display client.
25. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a controller
remote from said display client.
26. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a controller on
said display client.
27. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a model remote
from said display client.
28. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a model local to
said display client.
29. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a view remote
from said display client.
30. The computer readable medium as set forth in claim 24, wherein the step
of implementing at least one widget comprises the step of implementing a view local to said display client.
PCT/US2004/008278 2003-03-17 2004-03-17 Methods and apparatus for implementing a remote application over a network WO2004084039A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US10/391,116 2003-03-17
US10/391,116 US7213228B2 (en) 2003-03-17 2003-03-17 Methods and apparatus for implementing a remote application over a network
US10/779,953 US7574691B2 (en) 2003-03-17 2004-02-14 Methods and apparatus for rendering user interfaces and display information on remote client devices
US10/779,953 2004-02-14

Publications (2)

Publication Number Publication Date
WO2004084039A2 true WO2004084039A2 (en) 2004-09-30
WO2004084039A3 WO2004084039A3 (en) 2006-03-16

Family

ID=33032644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/008278 WO2004084039A2 (en) 2003-03-17 2004-03-17 Methods and apparatus for implementing a remote application over a network

Country Status (2)

Country Link
US (3) US7574691B2 (en)
WO (1) WO2004084039A2 (en)

US11601810B2 (en) 2007-06-12 2023-03-07 Icontrol Networks, Inc. Communication protocols in integrated systems
US20180198788A1 (en) * 2007-06-12 2018-07-12 Icontrol Networks, Inc. Security system integrated with social media platform
US10423309B2 (en) 2007-06-12 2019-09-24 Icontrol Networks, Inc. Device integration framework
US11237714B2 (en) 2007-06-12 2022-02-01 Icontrol Networks, Inc. Control system user interface
US10666523B2 (en) 2007-06-12 2020-05-26 Icontrol Networks, Inc. Communication protocols in integrated systems
US10498830B2 (en) 2007-06-12 2019-12-03 Icontrol Networks, Inc. Wi-Fi-to-serial encapsulation in systems
US11316753B2 (en) 2007-06-12 2022-04-26 Icontrol Networks, Inc. Communication protocols in integrated systems
US11423756B2 (en) 2007-06-12 2022-08-23 Icontrol Networks, Inc. Communication protocols in integrated systems
US10523689B2 (en) 2007-06-12 2019-12-31 Icontrol Networks, Inc. Communication protocols over internet protocol (IP) networks
US10389736B2 (en) 2007-06-12 2019-08-20 Icontrol Networks, Inc. Communication protocols in integrated systems
US11646907B2 (en) 2007-06-12 2023-05-09 Icontrol Networks, Inc. Communication protocols in integrated systems
US11089122B2 (en) 2007-06-12 2021-08-10 Icontrol Networks, Inc. Controlling data routing among networks
US10051078B2 (en) 2007-06-12 2018-08-14 Icontrol Networks, Inc. WiFi-to-serial encapsulation in systems
US10223903B2 (en) 2010-09-28 2019-03-05 Icontrol Networks, Inc. Integrated security system with parallel processing architecture
US11831462B2 (en) 2007-08-24 2023-11-28 Icontrol Networks, Inc. Controlling data routing in premises management systems
KR101472912B1 (en) * 2007-09-03 2014-12-15 삼성전자주식회사 Universal remote controller apparatus, universal remote controller system, and method thereof
GB0718362D0 (en) * 2007-09-20 2007-10-31 Armour Home Electronics Ltd Wireless communication device and system
US8127233B2 (en) * 2007-09-24 2012-02-28 Microsoft Corporation Remote user interface updates using difference and motion encoding
US8619877B2 (en) * 2007-10-11 2013-12-31 Microsoft Corporation Optimized key frame caching for remote interface rendering
US8121423B2 (en) * 2007-10-12 2012-02-21 Microsoft Corporation Remote user interface raster segment motion detection and encoding
US8106909B2 (en) * 2007-10-13 2012-01-31 Microsoft Corporation Common key frame caching for a remote user interface
US8234632B1 (en) 2007-10-22 2012-07-31 Google Inc. Adaptive website optimization experiment
US8423893B2 (en) * 2008-01-07 2013-04-16 Altec Lansing Australia Pty Limited User interface for managing the operation of networked media playback devices
US11916928B2 (en) 2008-01-24 2024-02-27 Icontrol Networks, Inc. Communication protocols over internet protocol (IP) networks
US20090210863A1 (en) * 2008-02-19 2009-08-20 Google Inc. Code-based website experiments
KR101490688B1 (en) * 2008-03-03 2015-02-06 삼성전자주식회사 Apparatus for storing and processing contents and method of transmitting object meta information about contents using media transfer protocol from the apparatus
TWI376109B (en) * 2008-04-23 2012-11-01 Compal Communications Inc Wireless access system capable of controlling electronic devices and control method thereof
WO2009135044A1 (en) * 2008-04-30 2009-11-05 Zeevee, Inc. System and method for channel selection for local broadcasting
US7886072B2 (en) * 2008-06-12 2011-02-08 Apple Inc. Network-assisted remote media listening
US20170185278A1 (en) 2008-08-11 2017-06-29 Icontrol Networks, Inc. Automation system user interface
US20100011135A1 (en) * 2008-07-10 2010-01-14 Apple Inc. Synchronization of real-time media playback status
KR101539461B1 (en) * 2008-07-16 2015-07-30 삼성전자주식회사 Apparatus and method for providing an user interface service in a multimedia system
US11729255B2 (en) 2008-08-11 2023-08-15 Icontrol Networks, Inc. Integrated cloud system with lightweight gateway for premises automation
US11758026B2 (en) 2008-08-11 2023-09-12 Icontrol Networks, Inc. Virtual device systems and methods
US11792036B2 (en) 2008-08-11 2023-10-17 Icontrol Networks, Inc. Mobile premises automation platform
US11258625B2 (en) 2008-08-11 2022-02-22 Icontrol Networks, Inc. Mobile premises automation platform
US10530839B2 (en) 2008-08-11 2020-01-07 Icontrol Networks, Inc. Integrated cloud system with lightweight gateway for premises automation
US8510710B2 (en) * 2008-10-06 2013-08-13 Sap Ag System and method of using pooled thread-local character arrays
US9232286B2 (en) * 2008-12-24 2016-01-05 Lg Electronics Inc. IPTV receiver and method for controlling an application in the IPTV receiver
US20140082511A1 (en) * 2009-03-31 2014-03-20 Yubitech Technologies Ltd. Method and system for emulating desktop software applications in a mobile communication network
US8638211B2 (en) 2009-04-30 2014-01-28 Icontrol Networks, Inc. Configurable controller and interface for home SMA, phone and multimedia
US8448074B2 (en) * 2009-05-01 2013-05-21 Qualcomm Incorporated Method and apparatus for providing portioned web pages in a graphical user interface
US8255955B1 (en) 2009-06-16 2012-08-28 Tivo Inc. Dynamic item highlighting system
KR101642111B1 (en) 2009-08-18 2016-07-22 삼성전자주식회사 Broadcast receiver, mobile device, service providing method, and broadcast receiver controlling method
KR101686413B1 (en) * 2009-08-28 2016-12-14 삼성전자주식회사 System and method for remote controling with multiple control user interface
US20110066971A1 (en) * 2009-09-14 2011-03-17 Babak Forutanpour Method and apparatus for providing application interface portions on peripheral computing devices
KR101612845B1 (en) * 2009-11-12 2016-04-15 삼성전자주식회사 Method and apparatus for providing remote UI service
US8700697B2 (en) * 2009-11-30 2014-04-15 Samsung Electronics Co., Ltd Method and apparatus for acquiring RUI-based specialized control user interface
US20130007793A1 (en) * 2010-04-30 2013-01-03 Thomson Licensing Primary screen view control through kinetic ui framework
US8554938B2 (en) * 2010-08-31 2013-10-08 Millind Mittal Web browser proxy-client video system and method
US8836467B1 (en) 2010-09-28 2014-09-16 Icontrol Networks, Inc. Method, system and apparatus for automated reporting of account and sensor zone information to a central station
KR102033764B1 (en) 2010-10-06 2019-10-17 삼성전자주식회사 User interface display method and remote controller using the same
US20120113091A1 (en) * 2010-10-29 2012-05-10 Joel Solomon Isaacson Remote Graphics
US20120117511A1 (en) * 2010-11-09 2012-05-10 Sony Corporation Method and apparatus for providing an external menu display
US11750414B2 (en) 2010-12-16 2023-09-05 Icontrol Networks, Inc. Bidirectional security sensor communication for a premises security system
US9147337B2 (en) 2010-12-17 2015-09-29 Icontrol Networks, Inc. Method and system for logging security event data
KR20120100045A (en) 2011-03-02 2012-09-12 삼성전자주식회사 User terminal apparatus, display apparatus, ui providing method and control method thereof
US8866701B2 (en) * 2011-03-03 2014-10-21 Citrix Systems, Inc. Transparent user interface integration between local and remote computing environments
US9210213B2 (en) * 2011-03-03 2015-12-08 Citrix Systems, Inc. Reverse seamless integration between local and remote computing environments
US9880796B2 (en) 2011-03-08 2018-01-30 Georgia Tech Research Corporation Rapid view mobilization for enterprise applications
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US8840466B2 (en) 2011-04-25 2014-09-23 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US9691086B1 (en) * 2011-05-13 2017-06-27 Google Inc. Adaptive content rendering
US9201709B2 (en) 2011-05-20 2015-12-01 Citrix Systems, Inc. Shell integration for an application executing remotely on a server
CN102438004B (en) * 2011-09-05 2017-02-08 深圳市创维软件有限公司 Method and system for acquiring metadata information of media file and multimedia player
US9197844B2 (en) 2011-09-08 2015-11-24 Cisco Technology Inc. User interface
WO2013041888A1 (en) * 2011-09-23 2013-03-28 Videojet Technologies Inc. Networking method
EP2766801A4 (en) * 2011-10-13 2015-04-22 Lg Electronics Inc Input interface controlling apparatus and method thereof
US9760236B2 (en) * 2011-10-14 2017-09-12 Georgia Tech Research Corporation View virtualization and transformations for mobile applications
US8854433B1 (en) 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
JP5994659B2 (en) * 2012-05-07 2016-09-21 株式会社デンソー VEHICLE DEVICE, INFORMATION DISPLAY PROGRAM, VEHICLE SYSTEM
US20130339871A1 (en) * 2012-06-15 2013-12-19 Wal-Mart Stores, Inc. Software Application Abstraction System and Method
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US9430937B2 (en) 2012-07-03 2016-08-30 Google Inc. Contextual, two way remote control
WO2014035936A2 (en) * 2012-08-31 2014-03-06 Citrix Systems Inc. Reverse seamless integration between local and remote computing environments
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9979960B2 (en) 2012-10-01 2018-05-22 Microsoft Technology Licensing, Llc Frame packing and unpacking between frames of chroma sampling formats with different chroma resolutions
US9244720B2 (en) * 2012-10-17 2016-01-26 Cisco Technology, Inc. Automated technique to configure and provision components of a converged infrastructure
US9116604B2 (en) 2012-10-25 2015-08-25 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Multi-device visual correlation interaction
US10942735B2 (en) * 2012-12-04 2021-03-09 Abalta Technologies, Inc. Distributed cross-platform user interface and application projection
KR101919796B1 (en) * 2013-01-11 2018-11-19 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands
US10348778B2 (en) * 2013-02-08 2019-07-09 Avaya Inc. Dynamic device pairing with media server audio substitution
US9298266B2 (en) * 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US20140325374A1 (en) * 2013-04-30 2014-10-30 Microsoft Corporation Cross-device user interface selection
US20140365906A1 (en) * 2013-06-10 2014-12-11 Hewlett-Packard Development Company, L.P. Displaying pre-defined configurations of content elements
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
CN105981338B (en) * 2013-12-08 2019-10-08 跨端口网路解决公司 For using I/O device link to establish the chain-circuit system of high- speed network communication and file transmission between host
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US11405463B2 (en) 2014-03-03 2022-08-02 Icontrol Networks, Inc. Media content management
US11146637B2 (en) 2014-03-03 2021-10-12 Icontrol Networks, Inc. Media content management
US9769227B2 (en) 2014-09-24 2017-09-19 Microsoft Technology Licensing, Llc Presentation of computing environment on multiple devices
US10448111B2 (en) 2014-09-24 2019-10-15 Microsoft Technology Licensing, Llc Content projection
US10635296B2 (en) 2014-09-24 2020-04-28 Microsoft Technology Licensing, Llc Partitioned application presentation across devices
US10025684B2 (en) * 2014-09-24 2018-07-17 Microsoft Technology Licensing, Llc Lending target device resources to host device computing environment
US20160092152A1 (en) * 2014-09-25 2016-03-31 Oracle International Corporation Extended screen experience
WO2016056223A1 (en) * 2014-10-06 2016-04-14 Sharp Kabushiki Kaisha System for terminal resolution adaptation for devices
US9584855B2 (en) * 2014-12-29 2017-02-28 Arris Enterprises, Inc. Transfer of content between screens
US11209972B2 (en) 2015-09-02 2021-12-28 D&M Holdings, Inc. Combined tablet screen drag-and-drop interface
WO2016183317A1 (en) * 2015-05-12 2016-11-17 D&M Holdings, Inc. Method, system and interface for controlling a subwoofer in a networked audio system
WO2016183263A1 (en) 2015-05-12 2016-11-17 D&M Holdings, Inc. System and method for negotiating group membership for audio controllers
US11113022B2 (en) 2015-05-12 2021-09-07 D&M Holdings, Inc. Method, system and interface for controlling a subwoofer in a networked audio system
US10368080B2 (en) 2016-10-21 2019-07-30 Microsoft Technology Licensing, Llc Selective upsampling or refresh of chroma sample values
CN108509242B (en) * 2018-03-15 2021-09-14 维沃移动通信有限公司 Application program operation guiding method and server
CN110753244B (en) * 2018-07-24 2022-10-28 中兴通讯股份有限公司 Scene synchronization method, terminal and storage medium
US11226727B2 (en) * 2018-11-12 2022-01-18 Citrix Systems, Inc. Systems and methods for live tiles for SaaS
US11880422B2 (en) 2019-02-04 2024-01-23 Cloudflare, Inc. Theft prevention for sensitive information
US10558824B1 (en) 2019-02-04 2020-02-11 S2 Systems Corporation Application remoting using network vector rendering
US10552639B1 (en) 2019-02-04 2020-02-04 S2 Systems Corporation Local isolator application with cohesive application-isolation interface
US10452868B1 (en) 2019-02-04 2019-10-22 S2 Systems Corporation Web browser remoting using network vector rendering
KR102198347B1 (en) * 2019-06-03 2021-01-04 삼성전자주식회사 User terminal apparatus, display apparatus, UI providing method and control method thereof
JP2023512410A (en) 2019-12-27 2023-03-27 アバルタ テクノロジーズ、 インク. Project, control, and manage user device applications using connection resources

Family Cites Families (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3092711B2 (en) * 1990-09-11 2000-09-25 キヤノン株式会社 Output control device and method
JP3793226B2 (en) * 1992-12-23 2006-07-05 オブジェクト テクノロジー ライセンシング コーポレイション Atomic command system
US5506932A (en) * 1993-04-16 1996-04-09 Data Translation, Inc. Synchronizing digital audio to digital video
US5930473A (en) * 1993-06-24 1999-07-27 Teng; Peter Video application server for mediating live video services
US6741617B2 (en) * 1995-04-14 2004-05-25 Koninklijke Philips Electronics N.V. Arrangement for decoding digital video signals
US5798921A (en) 1995-05-05 1998-08-25 Johnson; Todd M. Audio storage/reproduction system with automated inventory control
US5751672A (en) * 1995-07-26 1998-05-12 Sony Corporation Compact disc changer utilizing disc database
US5815297A (en) * 1995-10-25 1998-09-29 General Instrument Corporation Of Delaware Infrared interface and control apparatus for consumer electronics
US5835126A (en) * 1996-03-15 1998-11-10 Multimedia Systems Corporation Interactive system for a closed cable network which includes facsimiles and voice mail on a display
US5945988A (en) * 1996-06-06 1999-08-31 Intel Corporation Method and apparatus for automatically determining and dynamically updating user preferences in an entertainment system
US5793366A (en) * 1996-11-12 1998-08-11 Sony Corporation Graphical display of an animated data stream between devices on a bus
US5883621A (en) * 1996-06-21 1999-03-16 Sony Corporation Device control with topology map in a digital network
PT932398E (en) * 1996-06-28 2006-09-29 Ortho Mcneil Pharm Inc USE OF TOPIRAMATE OR ITS DERIVATIVES FOR THE PRODUCTION OF A MEDICINAL PRODUCT FOR THE TREATMENT OF MANIC-DEPRESSIVE BIPOLAR DISORDERS
US6359661B1 (en) * 1996-11-05 2002-03-19 Gateway, Inc. Multiple user profile remote control
US5969286A (en) * 1996-11-29 1999-10-19 Electronics Development Corporation Low impedence slapper detonator and feed-through assembly
US6177931B1 (en) * 1996-12-19 2001-01-23 Index Systems, Inc. Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information
US6243725B1 (en) * 1997-05-21 2001-06-05 Premier International, Ltd. List building system
ID24894A (en) * 1997-06-25 2000-08-31 Samsung Electronics Co Ltd Cs METHOD AND APPARATUS FOR THREE-OTO DEVELOPMENTS A HOME NETWORK
AU731906B2 (en) * 1997-07-18 2001-04-05 Sony Corporation Image signal multiplexing apparatus and methods, image signal demultiplexing apparatus and methods, and transmission media
US6008802A (en) * 1998-01-05 1999-12-28 Intel Corporation Method and apparatus for automatically performing a function based on the reception of information corresponding to broadcast data
US6038614A (en) * 1998-01-05 2000-03-14 Gateway 2000, Inc. Active volume control with hot key
US6032202A (en) * 1998-01-06 2000-02-29 Sony Corporation Of Japan Home audio/video network with two level device control
US6160796A (en) * 1998-01-06 2000-12-12 Sony Corporation Of Japan Method and system for updating device identification and status information after a local bus reset within a home audio/video network
US6237049B1 (en) * 1998-01-06 2001-05-22 Sony Corporation Of Japan Method and system for defining and discovering proxy functionality on a distributed audio video network
US6085236A (en) * 1998-01-06 2000-07-04 Sony Corporation Of Japan Home audio video network with device control modules for incorporating legacy devices
US6118450A (en) * 1998-04-03 2000-09-12 Sony Corporation Graphic user interface that is usable as a PC interface and an A/V interface
US6154206A (en) * 1998-05-06 2000-11-28 Sony Corporation Of Japan Method and apparatus for distributed conditional access control on a serial communication network
US6393430B1 (en) 1998-05-08 2002-05-21 Sony Corporation Method and system for automatically recording music data files by using the hard drive of a personal computer as an intermediate storage medium
US6219839B1 (en) * 1998-05-12 2001-04-17 Sharp Laboratories Of America, Inc. On-screen electronic resources guide
US7231175B2 (en) * 1998-06-16 2007-06-12 United Video Properties, Inc. Music information system for obtaining information on a second music program while a first music program is played
US5969283A (en) 1998-06-17 1999-10-19 Looney Productions, Llc Music organizer and entertainment center
US6446109B2 (en) * 1998-06-29 2002-09-03 Sun Microsystems, Inc. Application computing environment
US6535919B1 (en) * 1998-06-29 2003-03-18 Canon Kabushiki Kaisha Verification of image data
CN1867068A (en) * 1998-07-14 2006-11-22 联合视频制品公司 Client-server based interactive television program guide system with remote server recording
AR020608A1 (en) * 1998-07-17 2002-05-22 United Video Properties Inc A METHOD AND A PROVISION TO SUPPLY A USER REMOTE ACCESS TO AN INTERACTIVE PROGRAMMING GUIDE BY A REMOTE ACCESS LINK
US6208341B1 (en) * 1998-08-05 2001-03-27 U. S. Philips Corporation GUI of remote control facilitates user-friendly editing of macros
US6111677A (en) * 1998-08-31 2000-08-29 Sony Corporation Optical remote control interface system and method
US6564368B1 (en) * 1998-10-01 2003-05-13 Call Center Technology, Inc. System and method for visual application development without programming
US6324681B1 (en) * 1998-10-01 2001-11-27 Unisys Corporation Automated development system for developing applications that interface with both distributed component object model (DCOM) and enterprise server environments
US6498784B1 (en) * 1998-10-20 2002-12-24 Interdigital Technology Corporation Cancellation of pilot and traffic signals
US6169725B1 (en) * 1998-10-30 2001-01-02 Sony Corporation Of Japan Apparatus and method for restoration of internal connections in a home audio/video system
US7058635B1 (en) * 1998-10-30 2006-06-06 Intel Corporation Method and apparatus for searching through an electronic programming guide
US6594825B1 (en) * 1998-10-30 2003-07-15 Intel Corporation Method and apparatus for selecting a version of an entertainment program based on user preferences
US6408128B1 (en) * 1998-11-12 2002-06-18 Max Abecassis Replaying with supplementary information a segment of a video
US6816175B1 (en) 1998-12-19 2004-11-09 International Business Machines Corporation Orthogonal browsing in object hierarchies
US6342901B1 (en) * 1998-12-22 2002-01-29 Xerox Corporation Interactive device for displaying information from multiple sources
US6505343B1 (en) * 1998-12-31 2003-01-07 Intel Corporation Document/view application development architecture applied to ActiveX technology for web based application delivery
US6225938B1 (en) * 1999-01-14 2001-05-01 Universal Electronics Inc. Universal remote control system with bar code setup
US20020194260A1 (en) 1999-01-22 2002-12-19 Kent Lawrence Headley Method and apparatus for creating multimedia playlists for audio-visual systems
US6236395B1 (en) * 1999-02-01 2001-05-22 Sharp Laboratories Of America, Inc. Audiovisual information management system
US6577735B1 (en) * 1999-02-12 2003-06-10 Hewlett-Packard Development Company, L.P. System and method for backing-up data stored on a portable audio player
US6356971B1 (en) * 1999-03-04 2002-03-12 Sony Corporation System for managing multimedia discs, tracks and files on a standalone computer
US6738964B1 (en) * 1999-03-11 2004-05-18 Texas Instruments Incorporated Graphical development system and method
US6456714B2 (en) * 1999-03-18 2002-09-24 Sony Corporation Apparatus and method for interfacing between multimedia network and telecommunications network
US6487145B1 (en) * 1999-04-22 2002-11-26 Roxio, Inc. Method and system for audio data collection and management
US8099758B2 (en) 1999-05-12 2012-01-17 Microsoft Corporation Policy based composite file system and method
US6792615B1 (en) * 1999-05-19 2004-09-14 New Horizons Telecasting, Inc. Encapsulated, streaming media automation and distribution system
US6263503B1 (en) * 1999-05-26 2001-07-17 Neal Margulis Method for effectively implementing a wireless television system
US6901435B1 (en) * 1999-06-17 2005-05-31 Bmc Software, Inc. GUI interpretation technology for client/server environment
US6820260B1 (en) * 1999-06-17 2004-11-16 Avaya Technology Corp. Customized applet-on-hold arrangement
EP1197075A1 (en) * 1999-06-28 2002-04-17 United Video Properties, Inc. Interactive television program guide system and method with niche hubs
US6647417B1 (en) * 2000-02-10 2003-11-11 World Theatre, Inc. Music distribution systems
US20010042107A1 (en) * 2000-01-06 2001-11-15 Palm Stephen R. Networked audio player transport protocol and architecture
JP2001209586A (en) 2000-01-26 2001-08-03 Toshiba Corp Unit and method of controlling contents for computer
US6952737B1 (en) * 2000-03-03 2005-10-04 Intel Corporation Method and apparatus for accessing remote storage in a distributed storage cluster architecture
US20030068154A1 (en) * 2000-03-08 2003-04-10 Edward Zylka Gateway content storage system having database indexing, and method thereof
US20020059616A1 (en) * 2000-03-31 2002-05-16 Ucentric Holdings, Inc. System and method for providing video programming information to television receivers over a unitary set of channels
US6865593B1 (en) * 2000-04-12 2005-03-08 Webcollege, Inc. Dynamic integration of web sites
US8352331B2 (en) * 2000-05-03 2013-01-08 Yahoo! Inc. Relationship discovery engine
US6751402B1 (en) * 2000-06-28 2004-06-15 Keen Personal Media, Inc. Set-top box connectable to a digital video recorder via an auxiliary interface and selects between a recorded video signal received from the digital video recorder and a real-time video signal to provide video data stream to a display device
US6782528B1 (en) * 2000-06-16 2004-08-24 International Business Machines Corporation Method and system for visual programming using a relational diagram
US6882793B1 (en) * 2000-06-16 2005-04-19 Yesvideo, Inc. Video processing system
US6574617B1 (en) * 2000-06-19 2003-06-03 International Business Machines Corporation System and method for selective replication of databases within a workflow, enterprise, and mail-enabled web application server and platform
US6657116B1 (en) * 2000-06-29 2003-12-02 Microsoft Corporation Method and apparatus for scheduling music for specific listeners
US20020010652A1 (en) * 2000-07-14 2002-01-24 Sony Corporation Vendor ID tracking for e-marker
EP1314083A2 (en) * 2000-08-04 2003-05-28 Copan Inc. Method and system for presenting digital media
US6892228B1 (en) * 2000-08-23 2005-05-10 Pure Matrix, Inc. System and method for on-line service creation
AU2001288453B2 (en) * 2000-08-25 2006-05-18 Opentv, Inc. Personalized remote control
JP2002118451A (en) * 2000-10-10 2002-04-19 Fujitsu Ltd Constant current driver circuit
US20020113824A1 (en) * 2000-10-12 2002-08-22 Myers Thomas D. Graphic user interface that is usable as a commercial digital jukebox interface
US20020046315A1 (en) 2000-10-13 2002-04-18 Interactive Objects, Inc. System and method for mapping interface functionality to codec functionality in a portable audio device
US6907301B2 (en) * 2000-10-16 2005-06-14 Sony Corporation Method and system for selecting and controlling devices in a home network
US7206853B2 (en) * 2000-10-23 2007-04-17 Sony Corporation Content abstraction layer for use in home network applications
CA2428946C (en) * 2000-11-14 2010-06-22 Scientific-Atlanta, Inc. Networked subscriber television distribution
US6925200B2 (en) * 2000-11-22 2005-08-02 R2 Technology, Inc. Graphical user interface for display of anatomical information
US20020180803A1 (en) * 2001-03-29 2002-12-05 Smartdisk Corporation Systems, methods and computer program products for managing multimedia content
JP2002184114A (en) * 2000-12-11 2002-06-28 Toshiba Corp System for recording and reproducing musical data, and musical data storage medium
KR100520058B1 (en) * 2000-12-13 2005-10-11 삼성전자주식회사 System for upgrading device driver and method for upgrading the same
US8601519B1 (en) * 2000-12-28 2013-12-03 At&T Intellectual Property I, L.P. Digital residential entertainment system
US20020104091A1 (en) * 2001-01-26 2002-08-01 Amal Prabhu Home audio video interoperability implementation for high definition passthrough, on-screen display, and copy protection
US6938101B2 (en) * 2001-01-29 2005-08-30 Universal Electronics Inc. Hand held device having a browser application
US20020166123A1 (en) * 2001-03-02 2002-11-07 Microsoft Corporation Enhanced television services for digital video recording and playback
US7039643B2 (en) * 2001-04-10 2006-05-02 Adobe Systems Incorporated System, method and apparatus for converting and integrating media files
US6802058B2 (en) * 2001-05-10 2004-10-05 International Business Machines Corporation Method and apparatus for synchronized previewing user-interface appearance on multiple platforms
US7346917B2 (en) * 2001-05-21 2008-03-18 Cyberview Technology, Inc. Trusted transactional set-top box
US8291457B2 (en) * 2001-05-24 2012-10-16 Vixs Systems, Inc. Channel selection in a multimedia system
US6839769B2 (en) * 2001-05-31 2005-01-04 Intel Corporation Limiting request propagation in a distributed file system
US6826512B2 (en) * 2001-06-28 2004-11-30 Sony Corporation Using local devices as diagnostic tools for consumer electronic devices
US6901603B2 (en) * 2001-07-10 2005-05-31 General Instrument Corporation Methods and apparatus for advanced recording options on a personal versatile recorder
US20050039208A1 (en) * 2001-10-12 2005-02-17 General Dynamics Ots (Aerospace), Inc. Wireless data communications system for a transportation vehicle
US20040205498A1 (en) * 2001-11-27 2004-10-14 Miller John David Displaying electronic content
US20030110272A1 (en) * 2001-12-11 2003-06-12 Du Castel Bertrand System and method for filtering content
US7254777B2 (en) * 2001-12-20 2007-08-07 Universal Electronics Inc. System and method for controlling the recording functionality of an appliance using a program guide
US7634795B2 (en) * 2002-01-11 2009-12-15 Opentv, Inc. Next generation television receiver
US9485532B2 (en) 2002-04-11 2016-11-01 Arris Enterprises, Inc. System and method for speculative tuning
AU2003239385A1 (en) * 2002-05-10 2003-11-11 Richard R. Reisman Method and apparatus for browsing using multiple coordinated device
KR100485769B1 (en) 2002-05-14 2005-04-28 삼성전자주식회사 Apparatus and method for offering connection between network devices located in different home networks
AU2003238886A1 (en) * 2002-05-23 2003-12-12 Phochron, Inc. System and method for digital content processing and distribution
US8181205B2 (en) * 2002-09-24 2012-05-15 Russ Samuel H PVR channel and PVR IPG information
EP1427148B1 (en) 2002-12-04 2006-06-28 Thomson Licensing Method for communication between nodes in peer-to-peer networks using common group label
US20040117788A1 (en) * 2002-12-11 2004-06-17 Jeyhan Karaoguz Method and system for TV interface for coordinating media exchange with a media peripheral
US7574691B2 (en) * 2003-03-17 2009-08-11 Macrovision Corporation Methods and apparatus for rendering user interfaces and display information on remote client devices
US7213228B2 (en) * 2003-03-17 2007-05-01 Macrovision Corporation Methods and apparatus for implementing a remote application over a network
US7787010B2 (en) * 2003-03-20 2010-08-31 Pixar Video to film flat panel digital recorder and method
US7464110B2 (en) * 2004-06-30 2008-12-09 Nokia Corporation Automated grouping of image and other user data
US7260461B2 (en) 2005-10-31 2007-08-21 Ford Global Technologies, Llc Method for operating a pre-crash sensing system with protruding contact sensor
US20070162661A1 (en) * 2005-12-27 2007-07-12 Pei-Yuan Fu Memory extension apparatus and the method of data transfer applied therein

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ANDERSON D J: "Using MVC Pattern in Web Interactions" INTERNET ARTICLE, [Online] 22 July 2000 (2000-07-22), XP002310590 Retrieved from the Internet: URL:http://www.uidesign.net/Articles/Papers/WebMVC-BrowsersTransactio.html> [retrieved on 2004-12-14] *
CERI S ET AL: "Web Modeling Language (WebML): a modeling language for designing Web sites" COMPUTER NETWORKS, ELSEVIER SCIENCE PUBLISHERS B.V., AMSTERDAM, NL, vol. 33, no. 1-6, June 2000 (2000-06), pages 137-157, XP004304764 ISSN: 1389-1286 *
MARC ABRAMS & CONSTANTINOS PHANOURIOU: "UIML: An XML Language for Building Device-Independent User Interfaces" XML CONFERENCE PROCEEDINGS. PROCEEDINGS OF XML, December 1999 (1999-12), XP002161335 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1640855A3 (en) * 2004-09-24 2008-08-13 Magix AG Graphical user interface adaptable to be displayed on multiple display devices
WO2007031703A1 (en) * 2005-08-23 2007-03-22 Digifi Limited Media play system
GB2443145A (en) * 2005-08-23 2008-04-23 Digifi Ltd Media play system

Also Published As

Publication number Publication date
US20090307658A1 (en) 2009-12-10
US7574691B2 (en) 2009-08-11
US20040183756A1 (en) 2004-09-23
US20140201636A1 (en) 2014-07-17
WO2004084039A3 (en) 2006-03-16

Similar Documents

Publication Publication Date Title
US7574691B2 (en) Methods and apparatus for rendering user interfaces and display information on remote client devices
US7213228B2 (en) Methods and apparatus for implementing a remote application over a network
WO2021212668A1 (en) Screen projection display method and display device
WO2021109434A1 (en) Display device
KR101109264B1 (en) Configuration of user interfaces
WO2021169141A1 (en) Method for displaying audio track language on display device and display device
US20110289460A1 (en) Hierarchical display of content
US20100070925A1 (en) * 2008-09-08 2010-03-18 Systems and methods for selecting media content obtained from multiple sources
US20100064332A1 (en) Systems and methods for presenting media content obtained from multiple sources
US20100157168A1 (en) Multiple, Independent User Interfaces for an Audio/Video Device
WO2021189697A1 (en) Video display method, terminal, and server
US8386954B2 (en) Interactive media portal
CN111726673B (en) Channel switching method and display device
WO2021189712A1 (en) Method for switching webpage video from full-screen playing to small-window playing, and display device
EP2704397B1 (en) Presenting media content obtained from multiple sources
US20080046099A1 (en) Method and system for customizing access to content aggregated from multiple sources
WO2021139045A1 (en) Method for playing back media project and display device
WO2021109489A1 (en) Display device and electronic program guide presentation method
TW200814782A (en) Method and system for partitioning television channels in a platform
KR101708646B1 (en) Image Display Device and Method for Operating the Same
WO2021197068A1 (en) Display device and content recommendation method
CN113691852B (en) Display equipment and media asset playing method
CN112463733A (en) Method for protecting hard disk data and display device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN111405329A (en) Display device and control method for EPG user interface display

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase