US20140201636A1 - Methods and apparatus for rendering user interfaces and display information on remote client devices - Google Patents
- Publication number: US20140201636A1 (application US 14/051,619)
- Authority: United States
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element
- G06F3/0488: GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F9/451: Execution arrangements for user interfaces
- G06F9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
- H04N21/4122: Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
- H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
- H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4314: Generation of visual interfaces involving specific graphical features for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
- H04N21/43615: Interfacing a home network, e.g. for connecting the client to a plurality of peripherals
- H04N21/43637: Adapting the video or multiplex stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
- H04N21/47: End-user applications
Abstract
Description
- This application claims the benefit of U.S. patent application Ser. No. 10/391,116, filed Mar. 17, 2003, entitled “Methods and Apparatus For Implementing A Remote Application Over A Network.”
- 1. Field of the Invention
- The present invention is directed toward the field of network software and devices, and more particularly towards rendering user interfaces and displays on devices remote from a host device.
- 2. Art Background
- Prior art techniques exist to “remote” applications. In general, a remote application is an application that runs on a first computer but provides the functionality of the application to a second computer (e.g., implements a user interface) remote from the first computer. Remote application techniques have been used in client-server environments, wherein the application programs are stored on a server, and the client computers access the server to obtain functionality from the applications. The X Windows environment remotes applications such that thin client computers, or terminals, access a computer, such as a server, over a network to obtain the application's functionality at the terminals. For example, a server may host a word processing application. The thin client computer or terminal communicates with the server to operate the word processing program. The application program, running on the server, implements the user interface at the local computer for the underlying application program.
- One issue that arises when implementing remote applications is that the remote application requires specific knowledge about the display characteristics of the client computer or terminal. If the client-server environment has many client computers, then the remote application must know the requirements of each client computer. This limits the types of devices or computers that the remote application can support, or significantly increases the complexity of the server software needed to support the various types of devices. Therefore, it is desirable to develop software that permits a remote application to operate on a client computer or device without requiring the remote application to have any knowledge of the client's configuration.
- Typically, applications implement a user interface using a user interface tool kit, sometimes referred to as a widget set, a rendering engine, and underlying hardware to display the user interface. The application provides parameters to the user interface tool kit based on specifics of the application. For example, some applications define buttons, toolbars, menus, etc. for use with the application. The user interface tool kit provides specific layout information for the application requirements. For example, the user interface tool kit may specify placement of the buttons, toolbars and menus used in the application. This layout is sometimes referred to as a logical layout of the user interface. The rendering engine, which receives the logical layout from the user interface tool kit, defines how to translate the logical layout to a physical representation for rendering on an output display. For example, if the remote computer display is a graphics display, then the rendering engine may convert digital data to RGB data for storage in a frame buffer for rendering on the output display. The user interface hardware may include, for a graphics display, a frame buffer, graphics processor and raster scan display for rendering pixel data.
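The conventional pipeline described above (application parameters → user interface tool kit → logical layout → rendering engine → physical draw commands) can be sketched in Python. This is purely an illustrative sketch of the concept, not code from any actual toolkit; all names (`Widget`, `LogicalLayout`, `layout`, `render`) are assumptions introduced here.

```python
from dataclasses import dataclass, field

@dataclass
class Widget:
    kind: str     # e.g. "button", "toolbar", "menu"
    label: str

@dataclass
class LogicalLayout:
    # The toolkit's output: widgets placed in abstract slots,
    # not yet tied to any display resolution.
    top: list = field(default_factory=list)
    body: list = field(default_factory=list)

def layout(widgets: list) -> LogicalLayout:
    """Toolkit step: place toolbars at the top, everything else in the body."""
    out = LogicalLayout()
    for w in widgets:
        (out.top if w.kind == "toolbar" else out.body).append(w)
    return out

def render(logical: LogicalLayout, width: int, height: int) -> list:
    """Rendering-engine step: translate the logical layout into
    (x, y, label) draw commands for a display of the given resolution."""
    cmds, y = [], 0
    for w in logical.top + logical.body:
        cmds.append((0, y, w.label))
        y += height // 10   # row height derived from the physical display
    return cmds

cmds = render(layout([Widget("toolbar", "File"), Widget("button", "OK")]), 640, 480)
```

Note how only the final `render` step needs the display resolution; the application and toolkit steps work entirely in logical terms, which is the separation the following sections build on.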
- Typically, to remote an application, the application and user interface tool kit software are run on the remote computer (i.e., the computer running the remote application). The local computer (i.e., the computer providing the user interface) includes the rendering engine and display hardware. An interface between the remote computer and local computer defines specific parameters for displaying information on the local computer (e.g., screen resolution, graphics or textual display, color palettes supported, etc.). Using this interface, the remote application specifies a logical layout supported by the physical rendering of the local client. For example, the remote application may specify, in a logical layout, the toolbar at the top of the output display. In order to ensure that the toolbar is readable, the remote application knows the overall resolution of the output display on the local client. The rendering engine at the local client translates the data for the toolbar, and stores the graphics data in the frame buffer at the local client. The contents of the frame buffer are thereafter rendered on the local client's output display.
- Internet technology uses markup languages and Web browsers to display Web applications on local computer displays. Using Web browser technology, the application running on the web server does not need to know the specifics of the client display characteristics. However, the application logic is typically tuned to a specific display resolution, particularly if the web page contains graphic information. Oftentimes, Web applications specify to the user at the local computer a viewing resolution for the Web site because the Web application was designed for that specific resolution. Thus, Web technology still requires the application to have specific knowledge of the display characteristics of a computer that displays Web pages. In addition, user interfaces on the Web are very disconnected, requiring the application logic to be split between the server and the client (e.g., JavaScript); alternatively, the Web applications are not very smooth and interactive. Although Web technology may be useful because most users view the information from desktop or notebook computers with a pre-defined resolution, the technology is not effective for use in systems that integrate devices with different types of displays. Accordingly, it is desirable to develop a remote application technology that permits using local client displays for remote applications regardless of the type of display at the local client. It is also desirable to develop applications that utilize remote user interfaces and remote display of information.
- A user interface is implemented on a client device remote from a host device. In one embodiment, the client device comprises a portable electronic device that includes a graphical display. The host device operates an application program that implements a user interface that permits a user to control at least one target device. For example, the user interface may comprise an electronic programming guide to control a television or a guide for a personal video recorder. In other embodiments, the user interface comprises an interface to control a media playback device. The host device transfers to the client device an identification of at least one scene. In one embodiment, the host and client devices communicate over a wireless network. In general, a scene defines an abstract layout for at least one screen display of the user interface. The client device generates at least one screen display for the scene based on its interpretation of the scene. The client device then displays the screen as an implementation of the user interface. Thereafter, a user initiates, using the client device, an operation to control the target device. In response, the target device performs the operation.
- In other embodiments, the host device displays information at a client device. For this embodiment, the client device receives information from the host device for display at the client device. The client device may have an LCD or a graphical user interface. The host device may transmit information about media currently playing at the client device. For example, the host device may comprise a media server and the client device may comprise a playback device (e.g., CD player). For this application, the host device transmits information about the music playing at the playback device.
- FIG. 1 illustrates a media space configured in accordance with one embodiment of the present invention.
- FIG. 2 illustrates one embodiment for integrating devices into a single media space.
- FIG. 3 is a block diagram illustrating one embodiment for implementing a remote application.
- FIG. 4 is a block diagram illustrating an example abstract scene layout.
- FIG. 5 is a block diagram illustrating one embodiment for implementing an abstract scene with widgets.
- FIG. 6 illustrates an example screen display generated for a graphics display.
- FIG. 7 illustrates an example screen display generated for a liquid crystal display (“LCD”).
- FIG. 8 is a block diagram illustrating a further embodiment for implementing a remote application.
- FIG. 9 is a block diagram illustrating another embodiment for implementing a remote application.
- FIG. 10 is a block diagram illustrating one embodiment for implementing widget-based controllers for the remote user interface of the present invention.
- FIG. 11 is a block diagram illustrating one embodiment for providing a model for a user interface.
- FIG. 12 is a block diagram illustrating another embodiment for a remote application.
- FIG. 13 is a flow diagram illustrating one embodiment for a method to remote an application.
- FIG. 14 is a flow diagram illustrating one embodiment for implementing a user interface from a remote application.
- FIG. 15 illustrates an example user interface for a television implemented on a client device.
- FIG. 16 illustrates one embodiment for rendering a user interface of an audio application on a client device.
- FIG. 17 illustrates an application for the remote display of media information from a media server to an A/V receiver.
- A media convergence platform provides an efficient and easy way for one or more users to manage and play back media within a “media space.” As used herein, a “media space” connotes one or more media storage devices coupled to one or more media players for use by one or more users. The integration of media storage devices and media players into a single media space permits distributed management and control of content available within the media space.
- FIG. 1 illustrates a media space configured in accordance with one embodiment of the present invention. As shown in FIG. 1, the media space 100 includes “n” media storage devices 110, where “n” is any integer value greater than or equal to one. The media storage devices 110 store any type of media. In one embodiment, the media storage devices 110 store digital media, such as digital audio, digital video (e.g., DVD, MPEG, etc.), and digital images. The media space 100 also includes “m” media players 120, where “m” is any integer value greater than or equal to one. In general, the media players 120 are devices suitable for playing and/or viewing various types of media. For example, a media player may comprise a stereo system for playing music or a television for playing DVDs or viewing digital photos.
- As shown in FIG. 1, the media storage devices 110 are coupled to the media players 120. The media storage devices 110 and the media players 120 are shown in FIG. 1 as separate devices to depict the separate functions of media storage and media playback; however, the media players may perform both the storage and playback functions. For example, a media player may comprise a DVD player that includes a hard drive for the storage and playback of digital video. In other embodiments, the storage of media and the playback/viewing of media are performed by separate devices. For this embodiment, the media players 120 play back content stored on the media storage devices 110. For example, a video clip stored on media storage device “1” may be played on any of the applicable “m” media players 120.
- The storage devices 110 and media players 120 are controlled by management component 130. In general, management component 130 permits users to aggregate, organize, control (e.g., add, delete or modify), browse, and play back media available within the media space 100. The management component 130 may be implemented across multiple devices. The media space of FIG. 1 shows a plurality of users 140 to depict that more than one user may play back or view media through different media players. The system supports playback of different media through multiple media players (i.e., the system provides multiple streams of media simultaneously). The users 140, through management component 130, may also organize, control, and browse media available within the media space. The management component 130 provides a distributed means to manage and control all media within the media space.
- FIG. 2 illustrates one embodiment for integrating devices into a single media space. For example, system 200, shown in FIG. 2, may be a home media system. For this embodiment, the media space 200 includes at least one personal video recorder (“PVR”)-media server 210 (i.e., the media space may include many media servers). The media server 210 stores media for distribution throughout the media space 200. In addition, the media server 210 stores system software to integrate the components of the media space, to distribute media through the media space, and to provide a user interface for the components of the media space. The PVR-media server 210 may also include one or more television tuners and software to record television signals on a storage medium.
- As shown in FIG. 2, the PVR-media server 210 is coupled to different types of media players, including one or more televisions (e.g., television 250) and one or more media players (e.g., audio and video playback devices), such as playback device 240. The media playback devices may comprise A/V receivers, CD players, digital music players (e.g., MP3), DVD players, VCRs, etc. For this embodiment, the PVR-media server 210 is also coupled to one or more media managers 280 and to external content provider(s) 290.
- For this embodiment, the PVR-media server 210 executes software to perform a variety of functions within the media space. Thus, in this configuration, the PVR-media server 210 operates as a “thick client.” A user accesses and controls the functions of the media convergence platform through a system user interface. The user interface utilizes the thick and thin clients, as well as some media players (e.g., television 250 and media playback device 240). In one embodiment, the user interface includes a plurality of interactive screens displayed on media player output devices to permit a user to access the functionality of the system. A screen of the user interface includes one or more items for selection by a user. The user navigates through the user interface using a remote control device (e.g., remote control 260). The user, through use of a remote control, controls the display of screens in the user interface and selects items displayed on the screens. The user interface permits the user, through use of a remote control, to perform a variety of functions pertaining to the media available in the media space.
- The components of the media convergence platform are integrated through a network. For example, in the embodiment of FIG. 2, the devices (e.g., PVR-media server 210, television 250, remote control 260, media player 240 and media manager 280) are integrated through network 225. Network 225 may comprise any type of network, including wireless networks. For example, network 225 may comprise networks implemented in accordance with standards such as Ethernet 10/100 on Category 5, HPNA, HomePlug, IEEE 802.11x, IEEE 1394, and USB 1.1/2.0.
- For the embodiment of FIG. 2, one or more thin video clients may be integrated into the media space. For example, a thin video client may be coupled to PVR-media server 210 to provide playback of digital media on television 250. A thin video client does not store media. Instead, a thin video client receives media from PVR-media server 210, and processes the media for display or playback on a standard television. For example, PVR-media server 210 transmits a digital movie over network 225, and the thin video client processes the digital movie for display on television 250. In one embodiment, the thin video client processes the digital movie “on the fly” to provide NTSC or PAL formatted video for playback on a standard television. A thin video client may be integrated into a television.
- The media convergence platform system also optionally integrates one or more thin audio clients into the media space. For example, a thin audio client may receive digital music (e.g., MP3 format) from PVR-media server 210 over network 225, and may process the digital music for playback on a standard audio system. In one embodiment, the thin audio client includes a small display (e.g., liquid crystal display “LCD”) and buttons for use as a user interface. The PVR-media server 210 transmits items and identifiers for the items for display on the thin audio client. For example, the thin audio client may display lists of tracks for playback on an audio system. The user selects items displayed on the screen using the buttons to command the system. For example, the thin audio client screen may display a list of albums available in the media space, and the user, through use of the buttons, may command the user interface to display a list of tracks for a selected album. Then, the user may select a track displayed on the screen for playback on the audio system.
- The media manager 280 is an optional component for the media convergence platform system. In general, the media manager 280 permits the user to organize, download, and edit media in the personal computer (“PC”) environment. The media manager 280 may store media for integration into the media space (i.e., store media for use by other components in the media space). In one embodiment, the media manager 280 permits the user to perform system functions on a PC that are less suitable for implementation on a television-based user interface.
- The media space may be extended to access media stored external to those components located in the same general physical proximity (e.g., a house). In one embodiment, the media convergence platform system integrates content from external sources into the media space. For example, as shown in FIG. 2, the PVR-media server 210 may access content external to the local network 225. The external content may include any type of media, such as digital music and video. The media convergence platform system may be coupled to external content 290 through a broadband connection (i.e., a high-bandwidth communications link) to permit downloading of media-rich content. The external content may be delivered to the media convergence platform system through use of the Internet, or the external content may be delivered through use of private distribution networks. In other embodiments, the external content may be broadcast. For example, the media server 210 may access external content 290 through a datacasting service (i.e., data modulated and broadcast using RF, microwave, or satellite technology).
- As used herein, a “remote application” connotes software, operating on a device other than a local device, used to provide functionality on a local device. As described herein, the techniques of the present invention do not require the remote application to possess pre-existing information about the characteristics of the local display device (e.g., display resolution, graphics capabilities, etc.).
- In one embodiment, the software system separates the user interface (“UI”) application logic from the UI rendering. In one implementation, the system defines user interface displays in terms of “abstract scenes.” In general, an abstract scene is a layout for a screen display, and it consists of logical entities or elements. For example, an abstract scene may define, for a particular display, a title at the top of the display, a message at the bottom of the display, and a list of elements in the middle of the display. The scene itself does not define the particular data for the title, message and list. In one implementation, the software comprises pre-defined scenes, UI application logic, a scene manager, and a UI rendering engine. In general, pre-defined scenes describe an abstract layout in terms of logical entities for a UI display. Typically, the application logic determines the scene and provides data to populate the scene based on the logical flow of the application. For example, a user may select a first item displayed on the current UI display. In response, the application logic selects, if applicable, a new abstract scene and data to populate the new scene based on the user selection.
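An abstract scene of this kind can be sketched as a plain data structure whose logical slots carry no display-specific information, with the application logic doing nothing more than filling the slots. This is an illustrative sketch only; the scene name, slot names, and sample data are assumptions, not taken from the patent's actual implementation.

```python
# Hypothetical abstract scene: logical slots (title, message, list)
# with no resolution, font, or layout information attached.
SCENE_TRACK_LIST = {
    "title":   None,   # slot populated by application logic at runtime
    "message": None,
    "list":    None,
}

def populate(scene: dict, **data) -> dict:
    """Application-logic step: fill the scene's logical slots with data.
    No knowledge of the client's display is required or used."""
    filled = dict(scene)   # leave the pre-defined scene untouched
    for slot, value in data.items():
        if slot not in filled:
            raise KeyError(f"scene has no slot {slot!r}")
        filled[slot] = value
    return filled

scene = populate(SCENE_TRACK_LIST,
                 title="Album: Kind of Blue",
                 message="Select a track",
                 list=["So What", "Freddie Freeloader", "Blue in Green"])
```

The populated scene is what gets handed off for rendering; the application logic never learns whether the client is a television or a two-line LCD.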
- The application logic is implemented independent of the scene and the UI rendering. The application logic selects a scene descriptor, to define an abstract layout, in terms of the abstract elements. The application logic then populates the logical elements with data, and transfers the abstract layout (scene descriptors) with data to the display client. A scene manager, running on the local client, interprets the scene descriptors based on the display capabilities of the display client. For example, if the display for a display client is only capable of displaying lists, then the scene manager translates the scene with data to display only lists. This translation may result in deleting some information from the scene to render the display. The scene manager may convert other logical elements to a list for display on the LCD display. The UI rendering engine renders display data for the scene with display elements particular to the output display for the display client. The display elements include display resolution, font size for textual display, the ability to display graphics, etc. For example, if the output device is a television screen, then the UI rendering engine generates graphics data (i.e., RGB data) suitable for display of the scene on the television screen (e.g., proper resolution, font size, etc.). If the output display is a liquid crystal display (“LCD”), the UI rendering engine translates the scene logical entities to a format suitable for display on the LCD display.
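The scene-manager translation just described can be sketched as follows: the same populated scene yields a trimmed list of lines for a list-only LCD (dropping elements the display cannot show) and the full element set for a graphics display. All function names here are illustrative assumptions, not part of the described system.

```python
def to_lcd_lines(scene: dict, rows: int) -> list:
    """List-only LCD: keep the title and as many list items as fit;
    delete elements (e.g. the message) the display cannot render."""
    lines = [scene["title"]] if scene.get("title") else []
    lines += scene.get("list") or []
    return lines[:rows]

def to_graphics(scene: dict) -> dict:
    """Graphics display: keep every populated logical element; the UI
    rendering engine would next choose resolution, fonts, and RGB data."""
    return {k: v for k, v in scene.items() if v is not None}

scene = {"title": "Albums", "message": "Use arrows to scroll",
         "list": ["A Love Supreme", "Giant Steps", "Blue Train"]}
lcd = to_lcd_lines(scene, rows=2)   # title plus one item fits
gfx = to_graphics(scene)            # all three logical elements kept
```

The point of the sketch is the asymmetry: the scene is identical in both calls, and only the client-side translation differs per display.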
- A user interface implementation that separates the UI application logic from the UI rendering has several advantages. First, the application logic does not require any information regarding the capabilities of the output display. Instead, the application logic only views the UI display in terms of logical entities, and populates data for those logic entities based on user input and logical flow of the user interface. Second, this separation permits a graphical designer of a user interface system to easily change the scenes of the user interface. For example, if a graphical designer desires to change a scene in the user interface, the graphical designer only changes the mapping from abstract to physical layout of the scene. During runtime, the application logic receives the revised scene descriptor, populates the revised scene descriptor with data via slots, and transmits the scene descriptor with data to the local client. Software on the local client determines those display elements to display the scene based on the device's display. Thus, a change to the scene does not require a change to the display elements particular to each output display because the conversion from the scene to the display elements occurs locally.
- In one embodiment, the media convergence platform permits implementing user interface software remote from a device. In one implementation, the application logic is executed on a device remote from the device displaying a user interface. The device displaying the user interface contains the UI rendering software. For this implementation, the data and scenes for a user interface (e.g., scene descriptors) exist on a remote device. Using this implementation, the scene interface (interface between the scene descriptors and the application logic) is remote from the device rendering the display. The remote device (e.g., server) does not transfer large bitmaps across the network because only scene descriptor information with data is transferred. This delineation of functions provides a logical boundary between devices on a network that maximizes throughput over the network. In addition, a remote device hosting the application logic does not require information regarding display capabilities of each device on the home network. Thus, this implementation pushes the UI rendering software to the device rendering the images, while permitting the application logic to reside on other devices. This architecture permits implementing a thin client in a media convergence platform because the thin client need not run the application logic software. In addition, the architecture permits implementing a “thin application server” because the application server does not need to know about every possible rendering client type.
-
FIG. 3 is a block diagram illustrating one embodiment for implementing a remote application. For this example embodiment, a remote application 310 includes scene descriptors 320 and application logic 330. For example, remote application 310 may comprise a media server with considerable processing capabilities, such as a computer or set-top box. A client device, 370, has a display 360, for displaying information to a user (e.g., displaying data to implement a user interface), a rendering engine 355, and a scene manager 350. The rendering engine 355 receives, as input, a data model from scene manager 350, and generates, as output, display data. The display data is the type of data necessary to render an image on the display 360. For example, if the display 360 comprises a graphics display, then the display data includes information (e.g., RGB data) to render a graphical image on a display. -
FIG. 3 illustrates separating a UI rendering, implemented on a client device, from application logic implemented on a remote device (310). In an example operation, a list of objects (e.g., musical albums) may be displayed on display 360. In this example, the user may select an album for playback. A scene descriptor (320) may define an abstract layout for this application. For example, the scene descriptor may define a list of audio track elements and control information. The application logic 330 receives the scene descriptor. The application logic 330 populates the elements of the scene descriptor with data particular to the selection. Thus, for this example, application logic 330 populates the list of audio track elements with the names of the audio tracks for the album selected by the user. The application logic 330 then transmits, through interface 340, the scene data to the scene manager 350 on client 370. The scene manager 350 converts the scene elements with data to the display elements. The rendering engine 355 generates data in a format suitable for display on display 360. For example, if display 360 is an LCD display, then rendering engine 355 generates a textual list of audio tracks. In another example, if display 360 is a graphics display, then rendering engine 355 generates graphics data (e.g., RGB) for the list of audio tracks. - In one embodiment, the techniques use "abstract scenes", defined by scene descriptors, to implement a user interface. In one embodiment, each application communicates in terms of at least one scene descriptor. A scene descriptor, in its simplest form, may constitute a list (e.g., a list scene). In general, a scene descriptor defines a plurality of slots and the relative locations of those slots for rendering the scene on an output display. The slots of a scene provide the framework for an application to render specific information on a display.
However, an abstract scene defined by a scene descriptor does not define specific content for a slot. The abstract scene is developed in the application layout section on the remote computer (i.e., the computer operating the remote application).
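The populate-and-transfer step described above can be sketched as follows. The descriptor layout, slot names, and track titles are hypothetical, and JSON stands in for whatever wire format the interface actually uses; the point is that only the descriptor plus data crosses the network, never rendered bitmaps.

```python
import json

def populate_scene(descriptor, data):
    """Fill each slot named in the scene descriptor with application data."""
    return {slot: data.get(slot) for slot in descriptor["slots"]}

# Hypothetical list-scene descriptor: slots only, no content.
descriptor = {"name": "list_scene", "slots": ["title", "tracks"]}

# Application logic supplies the content for the user's selection.
album_data = {"title": "Selected Album",
              "tracks": ["Track One", "Track Two", "Track Three"]}

# Only descriptor-shaped data is serialized for transfer to the client.
payload = json.dumps(populate_scene(descriptor, album_data))
```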
- In one embodiment, the system divides labor between the remote application computer and the local display computer through use of scene descriptors. Specifically, the remote application communicates the scene descriptor, in terms of logical coordinates, to the local display computer. The local display computer translates the scene descriptor based on its underlying display capabilities. In other embodiments, the remote application may define additional information about a scene, so as to shift more UI operations to the remote application. In yet other embodiments, the remote application may provide less information about a scene, thereby assigning more UI operations to the local client computer.
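One way to picture this division of labor is a logical coordinate system: the remote application describes slot positions as fractions of the scene, and the local display computer maps them onto its own resolution. The 0.0-1.0 coordinate convention below is an assumption for illustration, not a convention stated in the disclosure.

```python
def to_pixels(logical_rect, width, height):
    """Map an (x, y, w, h) rectangle in logical units to physical units."""
    x, y, w, h = logical_rect
    return (round(x * width), round(y * height),
            round(w * width), round(h * height))

# A slot occupying the lower-left corner of the abstract scene.
slot_c = (0.0, 0.8, 0.2, 0.2)

tv = to_pixels(slot_c, 1920, 1080)   # television rendering, in pixels
lcd = to_pixels(slot_c, 20, 4)       # 20x4 character display, in cells
```

The remote application transmits only `slot_c`; each client computes its own physical rectangle, which is exactly the "translate based on underlying display capabilities" step described above.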
- As an example, a scene descriptor may include one or more titles, a message, and a list of elements.
FIG. 4 is a block diagram illustrating an example scene descriptor. As shown in FIG. 4, the example scene includes a plurality of slots (i.e., A, B and C). A slotA, located on the top of the scene, may be used to display a major title (e.g., the title of the application). SlotB, located in the center of the scene, includes a plurality of elements, 1-n, to display a list. For example, slotB may be used by the application to display menu items. For this example, each subcomponent (e.g., slotB1, slotB2 . . . slotBn) may represent a menu item. In one application, the menu items comprise media items (e.g., music, video, etc.) available in a media system. The number of menu items displayed may be variable and dependent upon the display capabilities of the local computer. The third slot shown in FIG. 4, slotC, is displayed in the lower left corner. The remote application may use slotC to display an icon (e.g., a logo for the remote application software publisher). - In one embodiment, the remote application constructs a list of elements for a scene descriptor, which includes data for display in the slots, and transfers the list of elements in a block defined by the interface (e.g.,
interface 340, FIG. 3) to the local display device. In one embodiment, the remote application interrogates the scene (at the client) to determine the number of visible elements for display, and then retrieves the list items for those visible elements. For example, the list elements may include a data model, abstract interface model, raw string, etc. - In one embodiment, "widgets", a software implementation, are used in the user interface. For this embodiment, an abstract scene is implemented with a collection of widgets. A widget corresponds to one or more slots on an abstract scene. In one implementation, a widget comprises a controller, model, and view subcomponents. A view is an interpretation of the abstract scene suitable for a specific display. For example, a first view of an abstract scene may be suitable for rendering on a graphical display, and a second view of an abstract scene may be suitable for rendering the abstract scene on an LCD display. The model provides the underlying data for slots of an abstract scene. For example, if a slot consists of a list of menu items, then the model for that slot may include a list of text strings to display the menu items. Finally, a controller provides the logic to interpret user interface events (i.e., user input to the user interface). For example, if a user selects a menu item displayed on the user interface, an event is generated to indicate the selection of the item. The controller provides the logic to interpret the event, and initiate, if necessary, a new model and view.
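The controller/model/view split described for a widget can be sketched as a small class. The class name, method names, and the highlight behavior below are illustrative assumptions, not the patent's implementation.

```python
class ListWidget:
    """Widget managing one list slot: model (data), view (format), controller."""

    def __init__(self, items):
        self.model = list(items)       # underlying data for the slot
        self.selected = 0              # current cursor position

    def controller(self, event):
        """Interpret a UI event and update the selection state."""
        if event == "down" and self.selected < len(self.model) - 1:
            self.selected += 1
        elif event == "up" and self.selected > 0:
            self.selected -= 1

    def view(self):
        """Render the slot for display; the selected item is marked with '>'."""
        return [("> " if i == self.selected else "  ") + item
                for i, item in enumerate(self.model)]

w = ListWidget(["Music Jukebox", "Photo Albums", "Video Clips"])
w.controller("down")   # user moves the cursor; controller updates, view re-renders
```

As the text notes, these three subcomponents need not live on one device; the sketch keeps them together only for readability.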
-
FIG. 5 is a block diagram illustrating one embodiment for implementing an abstract scene with widgets. The example abstract scene of FIG. 4 is shown in FIG. 5. A widget corresponds to each slot on the abstract scene. Specifically, widgetA is instantiated for slotA, widgetB is instantiated for slotB, and widgetC is instantiated for slotC. Also, as shown in FIG. 5, each widget (A, B and C) includes a controller, model and view. Note that slotB on the abstract interface includes a number of subcomponents. WidgetB may be configured to render slotB, and its subcomponents. -
FIGS. 6 and 7 illustrate two different example screens supported by the techniques of the present invention. An example user interface display (e.g., screen) for a graphics display is shown in FIG. 6. A second example screen for a liquid crystal display ("LCD") is shown in FIG. 7. The example screens utilize the example scene descriptor of FIG. 4. The text string "Home Media Applications" is populated in SlotA (FIG. 4) on screen 600 of FIG. 6 and on screen 700 of FIG. 7. However, the underlying widget for screen 600 presents the text string, "Home Media Applications", in a box. For the LCD display 700, the text string "Home Media Applications" is displayed on the first line of the display. SlotB (FIG. 4) contains a plurality of elements. For this example, the elements represent menu items (i.e., home media applications available). Each element (i.e., "Music Jukebox", "Photo Albums", "Video Clips", and "Internet Content") is individually displayed in a graphics box on display 600. For display 700, the menu items are displayed on individual lines of the LCD display. A third slot, SlotC, for the scene descriptor (FIG. 4) is displayed on screen 600 as a graphical symbol. The LCD display 700 (FIG. 7) cannot display graphics, and therefore the graphics symbol is not displayed. The example user interface displays of FIGS. 6 and 7 illustrate two different screens generated for the same remote application. -
FIG. 9 is a block diagram illustrating another embodiment for implementing a remote application. The application logic 910 implements the functionality of an application program, and display client 940 implements a user interface. As shown in FIG. 9, to implement the application, application logic 910 communicates with display client 940 in a manner to divide functionality between application logic 910 and display client 940. For this embodiment, display client 940 performs more functions than a purely "thin" client (i.e., display client 940 may be described as a "thicker" client). The display client 940 includes modules to implement a user interface for the application specific to the display capabilities of the display client. To this end, display client 940 includes scene manager 945, scene 950, slots 955, and pre-defined scenes 365. A widget, 960, includes model 930, view 965, and controller 970 components implemented in both application logic 910 and display client 940. The dashed line around widget 960 indicates that the widget is implemented across both application logic 910 and display client 940. Specifically, display client 940 implements controller 970 and view 965 portions of widget 960. The model portion of widget 960 is implemented on application logic 910 (i.e., model 930). As shown in FIG. 9, application logic 910 also includes scene descriptors 920. - In operation,
application logic 910 selects an abstract scene for the user interface. To this end, application logic 910 interrogates display client 940 to determine the scenes supported by display client 940 (i.e., scenes available in pre-defined scenes 365). The application logic 910 transmits a scene descriptor (one of scene descriptors 920) to display client 940 to identify the abstract scene. Based on the scene descriptor, the scene manager module 945 instantiates a scene for the user interface. The instantiated scene is depicted in FIG. 9 as scene 950. The scene 950 aggregates through the slots 955 to compose a user interface screen. The slots 955 of scene 950 are populated through use of widget 960. Specifically, input events, input from the user through the user interface, are processed by controller 970. The model to support the slots is provided from the model 930 in application logic 910. Finally, the view of each slot is supported by view module 965, implemented by the display client 940. -
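The interrogation step, in which the application logic asks the display client which pre-defined scenes it supports before choosing a descriptor, might look like the following sketch. The scene names and the fallback rule are invented for illustration.

```python
def choose_scene(preferred, supported):
    """Pick the first preferred scene the client supports, else a fallback.

    `preferred` is the application's ordered wish list; `supported` is the
    set of pre-defined scenes reported by the display client.
    """
    for scene in preferred:
        if scene in supported:
            return scene
    return "list_scene"   # assume every client supports a plain list scene

client_scenes = {"list_scene", "grid_scene"}   # reported by the display client
chosen = choose_scene(["detail_scene", "grid_scene"], client_scenes)
```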
FIG. 8 is a block diagram illustrating a further embodiment for implementing a remote application. For this embodiment, widget 860, supporting the slots for a scene, is implemented entirely on application logic 810. Thus, for this embodiment, display client 840 may be characterized as a "thin" client. Similar to the embodiment of FIG. 9, application logic 810 interrogates display client 840 to determine the available scenes (i.e., scenes available in predefined scenes 865). To implement the user interface on display client 840, application logic 810 transmits, over a network, a scene descriptor to identify an abstract scene. Based on the scene descriptor, scene manager 845 instantiates a scene (e.g., scene 850). The slots for the scene are populated through use of widget 860. Specifically, input events, received from the user interface, are propagated, across the network, to controller module 870. Model 830 provides data to support the user interface slots, and view module 865 supports the view. For this embodiment, both the model and the view modules are implemented on application logic 810. As shown in FIG. 8, the view is communicated back to display client 840. The scene 850 aggregates through the slots 855 to generate a screen for the user interface. The software for the controller portion of a widget may reside locally on a client, or may be invoked across the network from a remote network device. -
FIG. 10 is a block diagram illustrating one embodiment for implementing widget-based controllers for the remote user interface of the present invention. For this example, local display device 1030 instantiates a widget, widgetA, to control and render one or more slots of the abstract scene. For this example, widgetA consists of software located on both the local display device 1030 (local controller 1040) and on the client network device 1010 (remote controller 1025). For this implementation, the local controller 1040 processes certain events for widgetA. Typically, local controller 1040 may process simple events that are less driven by the application logic. For example, an event may be generated when a user moves the cursor from one item on a list to another. In response to this action, the user interface may highlight each item to indicate the placement of the user's cursor. For this example, a widget may use local controller 1040 to process the event to initiate a new model and view (e.g., render a highlighted menu item on the list). - Other events may require more sophisticated operations from the underlying remote application. In one embodiment, to accomplish this, the remote application (1020), operating on client network device (1010), instantiates a remote controller (1025). In other embodiments,
remote controller 1025 may not be a separate object, but may be part of procedural code within remote application 1020. As shown in FIG. 10, widgetA, operating on local display device 1030, propagates an event to remote controller 1025 through interface 1050. In one embodiment, widgetA uses a remote procedure call ("RPC") mechanism to invoke remote controller 1025 on remote network device 1010 for operation at the local display device 1030. For example, widgetA may receive an event from a user to select an application displayed on the screen from a menu list of available applications. In response to the event, the remote controller 1025 may generate a top menu screen for the new application. The new top menu screen may require a new scene descriptor, or may use the existing scene descriptor. - Data may be supplied to a local display device either locally or from across the network.
FIG. 11 is a block diagram illustrating one embodiment for providing a model for a user interface. For this embodiment, local display device 1130 operates with remote application 1120, running on remote network device 1110. The remote application 1120 instantiates a data model object 1125. For this example, data model object 1125 provides an interface to a data store 1140. The data store 1140 may reside on the remote network device 1110, or it may reside anywhere accessible by the network (e.g., another network device or a service integrating the data store from an external source to the network). For this embodiment, the controller (not shown) interprets an event, and invokes the data model object in accordance with the interpretation of the event. For example, in a media system, the controller may interpret an event that requests all available musical tracks within a specified genre. For this request, data model object 1125 may generate a query to data store 1140 for all musical tracks classified in the specified genre. As shown in FIG. 11, data model object 1125 communicates the model (data) to local display device 1130 through interface 1150. - In other implementations, the model may comprise a text string. For example, a current UI screen may consist of a list of all high-level functions available to a user. For this example, a user may select a function, and in response, the system may display a list, which consists of text strings, of all sub-functions available for the selected function. In another embodiment, the data model may be provided as a handle to the user interface implementation.
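The data model object of FIG. 11 can be sketched as a small class fronting a data store. The genre query and track records below are invented for the example; a real store could equally be a remote database reached over the network.

```python
class DataModel:
    """Interface between the controller's interpreted events and a data store."""

    def __init__(self, store):
        # `store` stands in for data store 1140; here it is an in-memory list,
        # but it could be a handle to a remote database.
        self.store = store

    def tracks_by_genre(self, genre):
        """Answer a controller request: all track titles in a genre."""
        return [t["title"] for t in self.store if t["genre"] == genre]

store = [{"title": "Track A", "genre": "jazz"},
         {"title": "Track B", "genre": "rock"},
         {"title": "Track C", "genre": "jazz"}]

model = DataModel(store).tracks_by_genre("jazz")   # the model sent to the client
```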
-
FIG. 12 is a block diagram illustrating another embodiment for a remote application. For this embodiment, the widgets for a local display device are highly distributed. For example, one or more widgets may be implemented across multiple devices (e.g., multiple application servers). For the example of FIG. 12, three widgets are utilized to implement the local display device 1175. Although the example of FIG. 12 shows three widgets distributed among three devices, any number of widgets implemented over any number of devices may be implemented without deviating from the spirit or scope of the invention. A first widget, widget1, comprises view1 (1180) implemented on local display device 1175. As shown in FIG. 12, the model-controller (1157) implementation for widget1 is implemented on application server1 (1155). Widget2, which has a view component (1185) implemented on local display device 1175, implements the model-controller component (1162) on a separate device (i.e., application server2). A third device, application server3 (1170), implements the model-controller component (1172) for a third widget. As shown in FIG. 12, the view component (1190) for the third widget is implemented on local display device 1175. - The remote application technology of the present invention supports highly distributed applications. For the example shown in
FIG. 12, a single display (e.g., local display device 1175) may support three different remote applications. Information for the three applications is integrated on a single display. Since the view component of the widget is implemented on the local display device, the view of information for the remote applications may be tailored to the particular display device. -
FIG. 13 is a flow diagram illustrating one embodiment for a method to remote an application. First, the remote application defines a set of widgets and selects a scene descriptor that describes an abstract scene (block 1210, FIG. 13). The remote application obtains a handle to the local display device (block 1220, FIG. 13). The handle provides a means for the remote application to communicate with the local display device. The remote application queries the local display device for a scene interface for the defined scene descriptor (block 1230, FIG. 13). In general, the scene interface provides a means for the remote application and local display device to communicate in terms of an abstract scene. To this end, the scene interface defines slots for the abstract scene. For example, to provide a data model, the remote application populates a data structure in the scene interface. - The local display device instantiates software to locally implement the abstract scene and one or more components of one or more widgets (
block 1240, FIG. 13). As described above, a widget may incorporate all or portions of the controller, model, and view subcomponents of a widget. In one embodiment, the remote application transfers the initial scene data (model) to the local display device through the scene interface (block 1250, FIG. 13). In turn, the local display device renders the initial scene using the data model (block 1250, FIG. 13). When the user submits input to the user interface (e.g., the user selects a menu item from a list), the widget executes control logic based on the user input (i.e., event) (block 1270, FIG. 13). In some embodiments, the controller may be implemented by one or more remote applications. For example, the local display device may include two widgets. The controller of a first widget may be implemented on a first application server, and the controller for the second widget may be implemented on a second application server. In other embodiments, the local display device may implement the controller. A new model is implemented based on interpretation of the user event (block 1280, FIG. 13). In some embodiments, the model may be supplied by one or more remote applications. For example, the local display device may be implemented with four widgets. For this example, a first remote application, running on a first application server, may supply the model for two widgets, and another remote application, running on a second application server, may supply the model for the other two widgets. Also, the widget renders a new scene with the data model supplied (block 1290, FIG. 13). -
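The flow of FIG. 13 can be compressed into a sketch in which each stand-in function marks one block of the diagram. Every name here is hypothetical; the client is modeled as a table of callables purely to show the order of operations.

```python
def remote_application_flow(client, descriptor, initial_model):
    """Run the FIG. 13 sequence against a (stand-in) display client."""
    log = []
    interface = client["query_scene"](descriptor)   # query scene interface (block 1230)
    log.append(("interface", interface))
    client["instantiate"](interface)                # client instantiates the scene (block 1240)
    screen = client["render"](initial_model)        # transfer model and render (block 1250)
    log.append(("screen", screen))
    return log

# Stand-in display client: supports a scene interface, instantiation, rendering.
client = {
    "query_scene": lambda d: {"slots": d["slots"]},
    "instantiate": lambda i: None,
    "render": lambda m: " | ".join(m),
}

trace = remote_application_flow(client,
                                {"slots": ["title", "menu"]},
                                ["Home Media", "Music Jukebox"])
```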
FIG. 14 is a flow diagram illustrating one embodiment for implementing a user interface from a remote application. The process is initiated when the user interface receives input from a user (block 1310, FIG. 14). A widget, corresponding to the slot on the abstract scene, interprets the user input and generates an event (block 1320, FIG. 14). For example, if the user selects a menu item displayed in a slot on the abstract scene, then the widget that manages the corresponding slot generates an event to signify selection of the menu item. - If the widget controller for the event is local, then the widget controller, operating on the local display device, interprets the event (
blocks, FIG. 14). Alternatively, if the widget controller is implemented remotely, then the widget remotes the event across the network (block 1340, FIG. 14). For example, the controller for the widget may reside on one or more computers remote from the local display device. If the widget interprets the event locally and the event does not require an external data model, then the widget supplies the data model (blocks, FIG. 14). If the event does require an external data model from one or more remote devices (over the network) or the widget remotes the controller over the network to one or more remote devices, then one or more remote applications iterate through the active widgets on the remote scene and provide a data model to the local display device (blocks, FIG. 14). Using the data model, the scene is rendered at the local display device (block 1390, FIG. 14). - The present invention has application to configure a two-way universal remote controller. In general, the remote controller may be configured, on the fly, to control any device on the network. Specifically, a user interface, operating as a remote application, is implemented on a remote controller to control a target device. For this application, the remote application (e.g., user interface) runs on a host computer device, and the remote controller, operating as a client device, renders the user interface. As a first "way" of communications, the remote controller, operating as the rendering client, communicates with the remote application to implement the user interface. Then, as a second "way" of communication, the remote controller communicates with the target device to control the target device.
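The FIG. 14 branch between a local and a remote widget controller can be sketched with a plain callable standing in for the RPC mechanism. The event names, state fields, and return values below are assumptions for illustration.

```python
def make_widget(remote_controller):
    """Build a widget whose controller splits work between local and remote."""
    state = {"highlight": 0, "screen": "menu"}

    def handle(event):
        if event in ("up", "down"):                 # simple event: handle locally
            state["highlight"] += 1 if event == "down" else -1
        else:                                       # application logic needed:
            state["screen"] = remote_controller(event)   # remote the event
        return state

    return handle

def fake_remote_controller(event):
    """Stands in for the remote application's controller (the RPC endpoint)."""
    return "top_menu:" + event

widget = make_widget(fake_remote_controller)
widget("down")                    # cursor movement resolved on the local device
result = widget("select_app")     # selection forwarded across the "network"
```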
- The remote controller may comprise a graphical display to implement a graphical user interface. For example, the remote controller or rendering client may comprise a personal digital assistant (“PDA”), a tablet personal computer (“PC”), a Java® phone, any portable device operating Windows® CE, or a television user interface. In other embodiments, the remote controller or rendering client may be implemented using a character display.
- In one embodiment, the two-way remote controller implements a television user interface. For example, the remote application may implement an electronic programming guide (“EPG”) that operates as a user interface. Through use of the EPG, a user may select programming for viewing on a television.
FIG. 15 illustrates an example user interface for a television implemented on a client device. For this example, television 1400 displays an electronic programming guide ("EPG"). The EPG permits a user to select programming for viewing on a television. For this embodiment of an EPG, multiple channels are displayed in a first vertical column. For the example EPG displayed in FIG. 15, channels 500-506 are shown. The user may view, on the EPG, additional channels by scrolling up or down, through use of a remote control device, the list of channels displayed in the EPG. Additional vertical columns are displayed that indicate time slots for the channels (e.g., the time slots 4:30, 5:00 and 5:30). As shown in FIG. 15, the EPG displays, beneath the time slot columns, the name of the program playing during that time slot. For example, "Movie Showcase" is playing on channel 500 in time slots 4:30-5:30. On the top of the EPG, information about a selected program is shown. For the example shown in FIG. 15, the program "April Morning" is selected, and information about "April Morning", such as type of programming, genre, actors, short description, and program options, is displayed. - For the example of
FIG. 15, a portion of the EPG, displayed on television 1400, is rendered on a client device 1410. For this example, client device 1410 comprises a graphics display 1420. The client device display 1420 may also be sensitive to pressure (e.g., touch sensitive) to permit user input through the display screen. For this example, client device display 1420, which renders the user interface, is smaller than the display of television 1400. Thus, it is not practical or effective to map the user interface directly to the client device display 1420. Therefore, in rendering the user interface on the client device display, only essential information is displayed. For the example rendering shown in FIG. 15, the channel column, displaying channels 500-506, and one time slot column, with corresponding programming information, are displayed. A user may scroll, either vertically or horizontally, to view different channels or time slots of programming. The example client device rendering of the user interface does not display metadata of a selected program (e.g., the information regarding "April Morning" shown in television 1400). Thus, the television user interface is rendered on a client device with a small display. The concepts of the present invention may also be applied to generate a user interface for a personal video recorder ("PVR") or a digital video recorder ("DVR"). - The present invention also has application for rendering non-graphical user interfaces on a client device. For example, a user of home media network 200 (
FIG. 2) may desire to use remote control 260 to control playback device 240. The playback device 240 may not have a graphical user interface. Instead, the playback device 240 may have a character-based display. FIG. 16 illustrates one embodiment for rendering a user interface of an audio application on a client device. In one embodiment shown in FIG. 16, the target device may comprise an audio-video receiver ("AVR"). The AVR displays information on a character-based display 1510. For example, display 1510 displays the AVR input source (e.g., tuner) as well as additional information (e.g., band and station currently tuned). The user interface on the AVR 1500 further includes buttons 1520 for user input (i.e., the AVR 1500 may also have a remote control specific to the AVR). - For this embodiment, a
client device 1530 renders a user interface for the AVR. The client device is not manufactured specifically for the AVR. Instead, a remote application, residing on AVR 1500 or elsewhere, remotes the user interface of the AVR (i.e., target device) to the client device 1530. For this embodiment, the client device 1530 comprises a graphical display 1540, also used as a user input device (i.e., the user touches the screen to interact with the user interface). For this AVR example, the client device 1530 renders, as part of the user interface, tuning control 1550 and volume control 1570 for control of the tuning and volume on AVR 1500, respectively. The client device 1530 also renders, as part of the user interface, additional information that specifies the source of the AVR and the station currently tuned, as well as a volume indicator 1560. The host application and client device may also be configured to change the user interface based on the mode of operation of the AVR. For purposes of explanation, a single AVR application is presented; however, the target device may comprise a compact disc ("CD") device, a digital video disc ("DVD") device, a digital music playback device, or any device that provides media services to the network. - The techniques of the present invention have application to render a user interface of a media convergence platform to a client device. A user interface for a media convergence platform may present different types of media within a single user interface. In one embodiment, the user interface is television based. For example, the user interface may display, on a television display, selectable items to represent a music application, a photo albums application, and a video application. The user selects an item displayed on the television display to invoke an application. The music application permits a user to select music available within the media convergence platform, and to playback the music through a device accessible through the network.
The photo albums application permits a user to select one or more photos available within the media convergence platform, and to view the photos through a device in the media convergence platform. The video application permits a user to select one or more videos or video clips available within the media convergence platform and to playback the video/video clips through a device accessible on the network.
- In one embodiment, a two-way remote control device is configured to implement a user interface for a media convergence platform. For example, a host computer device, such as a media server (e.g., PVR-
media server 210, FIG. 2), may run an application program to implement the media convergence platform user interface. In one embodiment, the media convergence platform user interface is television based (e.g., implemented on television 250, FIG. 2). In another embodiment, the media convergence platform user interface is implemented on a personal computer (e.g., implemented on personal computer 250, FIG. 2). The application program may remote the user interface to a client device, such as a PDA. Using the client device, the user may control target devices on the network through a rendition of the media convergence platform user interface. This allows the user the ability to use the user interface of the media convergence platform even though the primary display for the user interface is not available or convenient. For example, the media convergence platform may primarily use a television, to display screens for the user interface, and a television remote control to accept input from the user. For this example, a user of the media convergence platform may desire to remote the user interface to a client device because the television is not available or accessible to the user (e.g., the television is not in the same room as the user). Using the client device as a two-way remote, the user may proceed to control a target device (e.g., an audio player in the room with the user). - In one embodiment, the host computer device remotes a user interface to a client device to control another target device on the network. For example, in the home network of
FIG. 2, the PVR-media server 210 may run a user interface for playback device 240. Under this scenario, the PVR-media server 210 remotes the user interface for playback device 240 to remote control 260. As another example, media manager 280 may remote a user interface for displaying photos to remote control 260. In turn, remote control 260 may be used to control an application to view photos on television 250. Accordingly, as illustrated by the above examples, an application, which implements a user interface for a remote device, may reside anywhere on the network to control any other device on the network. - The present invention has application to render display information at a client device from an underlying remote application. An application program, operating on a remote device (e.g., media server), may remote information to a client device (e.g., a playback device in a home network). The client device may display the information on a display, such as an LCD. In one embodiment, the client device renders display information to identify media or information about the media. For example, the client device may display information about media playing through the client playback device.
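The pattern described above, in which a host application owns the user interface state, remotes scenes to a client device, and translates the client's selections into commands for a separate target device, can be sketched in a few lines. This is a minimal illustrative sketch, not the patent's implementation: the `Scene`, `UserEvent`, `HostApplication`, and `AudioPlayer` names and the in-process message passing are assumptions standing in for the network protocol the patent leaves unspecified.

```python
from dataclasses import dataclass

# Hypothetical message types; the patent does not specify a wire format,
# so Scene and UserEvent here are illustrative stand-ins.
@dataclass
class Scene:
    title: str
    items: list  # menu entries the client should render

@dataclass
class UserEvent:
    item_index: int  # selection made on the client's rendition of the scene

class AudioPlayer:
    """Stand-in for a target device (e.g., an audio player on the network)."""
    def execute(self, action: str) -> str:
        return f"audio-player: {action.lower()}"

class HostApplication:
    """Host-side application (e.g., a media server): owns the UI state,
    remotes scenes to the client, and translates client events into
    commands for the target device."""
    def __init__(self, target):
        self.target = target
        self.scene = Scene("Audio", ["Play", "Pause", "Stop"])

    def remote_scene(self) -> Scene:
        # A real system would serialize this and send it over the network.
        return self.scene

    def handle_event(self, event: UserEvent) -> str:
        action = self.scene.items[event.item_index]
        return self.target.execute(action)

# The client renders the remoted scene and sends back a selection;
# the host turns that selection into a command for the target device.
host = HostApplication(AudioPlayer())
scene = host.remote_scene()
result = host.handle_event(UserEvent(item_index=0))
```

Note that the client never talks to the target device directly: it only renders the scene and reports events, which is what lets the controlling application reside anywhere on the network.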
-
FIG. 17 illustrates an application for the remote display of media information from a media server to an A/V receiver. For this example, a media server operates a program to browse, identify and select media. The media server may operate a media convergence platform user interface for browsing and selecting, for playback, video, audio and photos. For the example shown in FIG. 17, media server 1620, with output device 1600, is currently operating an audio application. For this application, output device 1600 displays a list of available audio selections (e.g., albums, tracks, artists, etc.). In addition, output device 1600 displays an identification of the audio track currently playing (e.g., Now Playing Bruce Springsteen). An A/V receiver 1610, coupled to media server 1620, is used as a playback device for media server 1620. The A/V receiver also includes a display 1630 (e.g., LCD). For this example, software, operating on media server 1620, remotes display information to A/V receiver 1610 for display on display 1630. Specifically, for this application, the A/V receiver displays information about the current music playing. - In another embodiment, the client device displays video information at a client device. For example, a DVD player may be configured as a playback device to play video from a source on the network (e.g., media server). For example, a media server may supply video (e.g., DVD) for playback at the DVD player. For this example, a display on the DVD player may display the name of the DVD currently playing as well as additional information about the DVD. In other embodiments, a client device may display information about photos or any other type of media.
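A client display such as the A/V receiver's LCD in FIG. 17 typically has a fixed character width, so the remoted "now playing" information must be fitted to it. The sketch below is a hedged illustration of that step, assuming a hypothetical two-line, 16-character display and illustrative field names (`artist`, `track`) that are not taken from the patent.

```python
# Hypothetical "now playing" message and LCD formatter; the field names
# and the 16-character, 2-line display geometry are illustrative only.
def format_for_lcd(info: dict, width: int = 16) -> list:
    """Render track metadata as fixed-width lines for a 2-line character LCD."""
    lines = [info.get("artist", ""), info.get("track", "")]
    # Truncate long fields and pad short ones so each line fills the display.
    return [line[:width].ljust(width) for line in lines]

now_playing = {"artist": "Bruce Springsteen", "track": "Thunder Road"}
lcd_lines = format_for_lcd(now_playing)
```

Because the media server owns the metadata, the client needs no knowledge of the media itself; it only renders whatever fixed-width lines it receives.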
- Although the present invention has been described in terms of specific exemplary embodiments, it will be appreciated that various modifications and alterations might be made by those skilled in the art without departing from the spirit and scope of the invention.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/051,619 US20140201636A1 (en) | 2003-03-17 | 2013-10-11 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/391,116 US7213228B2 (en) | 2003-03-17 | 2003-03-17 | Methods and apparatus for implementing a remote application over a network |
US10/779,953 US7574691B2 (en) | 2003-03-17 | 2004-02-14 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US12/455,686 US20090307658A1 (en) | 2003-03-17 | 2009-06-05 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US14/051,619 US20140201636A1 (en) | 2003-03-17 | 2013-10-11 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/455,686 Continuation US20090307658A1 (en) | 2003-03-17 | 2009-06-05 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140201636A1 true US20140201636A1 (en) | 2014-07-17 |
Family
ID=33032644
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/779,953 Active 2026-01-17 US7574691B2 (en) | 2003-03-17 | 2004-02-14 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US12/455,686 Abandoned US20090307658A1 (en) | 2003-03-17 | 2009-06-05 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US14/051,619 Abandoned US20140201636A1 (en) | 2003-03-17 | 2013-10-11 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/779,953 Active 2026-01-17 US7574691B2 (en) | 2003-03-17 | 2004-02-14 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US12/455,686 Abandoned US20090307658A1 (en) | 2003-03-17 | 2009-06-05 | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Country Status (2)
Country | Link |
---|---|
US (3) | US7574691B2 (en) |
WO (1) | WO2004084039A2 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080320395A1 (en) * | 2006-12-05 | 2008-12-25 | Sony Corporation | Electronic apparatus, an imaging apparatus, a display control method for the same and a program which allows a computer to execute the method |
US20140199947A1 (en) * | 2013-01-11 | 2014-07-17 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20160217093A1 (en) * | 2013-12-08 | 2016-07-28 | Crossport Network Solutions Inc. | Link system for establishing high speed network communications and file transfer between hosts using i/o device links |
CN108509242A (en) * | 2018-03-15 | 2018-09-07 | 维沃移动通信有限公司 | A kind of guidance method and server of application program operation |
RU2688246C2 (en) * | 2014-09-24 | 2019-05-21 | МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи | Providing target device resource for rent to a computer environment host device |
US10448111B2 (en) | 2014-09-24 | 2019-10-15 | Microsoft Technology Licensing, Llc | Content projection |
US10635296B2 (en) | 2014-09-24 | 2020-04-28 | Microsoft Technology Licensing, Llc | Partitioned application presentation across devices |
US20200150838A1 (en) * | 2018-11-12 | 2020-05-14 | Citrix Systems, Inc. | Systems and methods for live tiles for saas |
TWI835041B (en) | 2021-12-24 | 2024-03-11 | 瑞昱半導體股份有限公司 | Signal processing device, dongle and adaptor cable |
Families Citing this family (198)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020046407A1 (en) * | 2000-02-18 | 2002-04-18 | Alexander Franco | Use of web pages to remotely program a broadcast content recording system |
US6658091B1 (en) | 2002-02-01 | 2003-12-02 | @Security Broadband Corp. | LIfestyle multimedia security system |
US8840475B2 (en) | 2002-12-10 | 2014-09-23 | Ol2, Inc. | Method for user session transitioning among streaming interactive video servers |
US20090118019A1 (en) | 2002-12-10 | 2009-05-07 | Onlive, Inc. | System for streaming databases serving real-time applications used through streaming interactive video |
US8893207B2 (en) | 2002-12-10 | 2014-11-18 | Ol2, Inc. | System and method for compressing streaming interactive video |
US8661496B2 (en) | 2002-12-10 | 2014-02-25 | Ol2, Inc. | System for combining a plurality of views of real-time streaming interactive video |
US8495678B2 (en) | 2002-12-10 | 2013-07-23 | Ol2, Inc. | System for reporting recorded video preceding system failures |
US8468575B2 (en) | 2002-12-10 | 2013-06-18 | Ol2, Inc. | System for recursive recombination of streaming interactive video |
US9032465B2 (en) | 2002-12-10 | 2015-05-12 | Ol2, Inc. | Method for multicasting views of real-time streaming interactive video |
US9108107B2 (en) | 2002-12-10 | 2015-08-18 | Sony Computer Entertainment America Llc | Hosting and broadcasting virtual events using streaming interactive video |
US8832772B2 (en) | 2002-12-10 | 2014-09-09 | Ol2, Inc. | System for combining recorded application state with application streaming interactive video output |
US9003461B2 (en) | 2002-12-10 | 2015-04-07 | Ol2, Inc. | Streaming interactive video integrated with recorded video segments |
US8387099B2 (en) | 2002-12-10 | 2013-02-26 | Ol2, Inc. | System for acceleration of web page delivery |
US8949922B2 (en) | 2002-12-10 | 2015-02-03 | Ol2, Inc. | System for collaborative conferencing using streaming interactive video |
US8549574B2 (en) | 2002-12-10 | 2013-10-01 | Ol2, Inc. | Method of combining linear content and interactive content compressed together as streaming interactive video |
US7574691B2 (en) * | 2003-03-17 | 2009-08-11 | Macrovision Corporation | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US7092693B2 (en) * | 2003-08-29 | 2006-08-15 | Sony Corporation | Ultra-wide band wireless / power-line communication system for delivering audio/video content |
US7483694B2 (en) * | 2004-02-24 | 2009-01-27 | Research In Motion Limited | Method and system for remotely testing a wireless device |
US11368429B2 (en) | 2004-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US20160065414A1 (en) | 2013-06-27 | 2016-03-03 | Ken Sundermeyer | Control system user interface |
US10382452B1 (en) | 2007-06-12 | 2019-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US9141276B2 (en) | 2005-03-16 | 2015-09-22 | Icontrol Networks, Inc. | Integrated interface for mobile device |
US7711796B2 (en) | 2006-06-12 | 2010-05-04 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US9609003B1 (en) | 2007-06-12 | 2017-03-28 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
AU2005223267B2 (en) | 2004-03-16 | 2010-12-09 | Icontrol Networks, Inc. | Premises management system |
US20090077623A1 (en) | 2005-03-16 | 2009-03-19 | Marc Baum | Security Network Integrating Security System and Network Devices |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US10237237B2 (en) | 2007-06-12 | 2019-03-19 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US20170118037A1 (en) | 2008-08-11 | 2017-04-27 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US10127802B2 (en) | 2010-09-28 | 2018-11-13 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US8635350B2 (en) | 2006-06-12 | 2014-01-21 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US10339791B2 (en) | 2007-06-12 | 2019-07-02 | Icontrol Networks, Inc. | Security network integrated with premise security system |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US20120066608A1 (en) | 2005-03-16 | 2012-03-15 | Ken Sundermeyer | Control system user interface |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10444964B2 (en) | 2007-06-12 | 2019-10-15 | Icontrol Networks, Inc. | Control system user interface |
US10200504B2 (en) | 2007-06-12 | 2019-02-05 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10375253B2 (en) | 2008-08-25 | 2019-08-06 | Icontrol Networks, Inc. | Security system with networked touchscreen and gateway |
US8988221B2 (en) | 2005-03-16 | 2015-03-24 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US9531593B2 (en) | 2007-06-12 | 2016-12-27 | Icontrol Networks, Inc. | Takeover processes in security network integrated with premise security system |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US9729342B2 (en) | 2010-12-20 | 2017-08-08 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US10522026B2 (en) | 2008-08-11 | 2019-12-31 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US10313303B2 (en) | 2007-06-12 | 2019-06-04 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US8963713B2 (en) | 2005-03-16 | 2015-02-24 | Icontrol Networks, Inc. | Integrated security network with security alarm signaling system |
US10156959B2 (en) | 2005-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US10142392B2 (en) | 2007-01-24 | 2018-11-27 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US9191228B2 (en) | 2005-03-16 | 2015-11-17 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US20060067654A1 (en) * | 2004-09-24 | 2006-03-30 | Magix Ag | Graphical user interface adaptable to multiple display devices |
US20060080725A1 (en) * | 2004-10-13 | 2006-04-13 | Nokia Corporation | Systems and methods for recording digital media content |
US7885622B2 (en) | 2004-10-27 | 2011-02-08 | Chestnut Hill Sound Inc. | Entertainment system with bandless tuning |
US8090309B2 (en) | 2004-10-27 | 2012-01-03 | Chestnut Hill Sound, Inc. | Entertainment system with unified content selection |
US20190278560A1 (en) | 2004-10-27 | 2019-09-12 | Chestnut Hill Sound, Inc. | Media appliance with auxiliary source module docking and fail-safe alarm modes |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US20170180198A1 (en) | 2008-08-11 | 2017-06-22 | Marc Baum | Forming a security network including integrated security system components |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US9306809B2 (en) | 2007-06-12 | 2016-04-05 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US20120324566A1 (en) | 2005-03-16 | 2012-12-20 | Marc Baum | Takeover Processes In Security Network Integrated With Premise Security System |
US20110128378A1 (en) | 2005-03-16 | 2011-06-02 | Reza Raji | Modular Electronic Display Platform |
US7519681B2 (en) * | 2005-06-30 | 2009-04-14 | Intel Corporation | Systems, methods, and media for discovering remote user interface applications over a network |
KR100752630B1 (en) * | 2005-07-11 | 2007-08-30 | 주식회사 로직플랜트 | A method and system of computer remote control that optimized for low bandwidth network and low level personal communication terminal device |
GB2429573A (en) * | 2005-08-23 | 2007-02-28 | Digifi Ltd | Multiple input and output media playing network |
JP4533295B2 (en) * | 2005-10-07 | 2010-09-01 | キヤノン株式会社 | Information processing apparatus and control method therefor, information processing system, and computer program |
US7702279B2 (en) * | 2005-12-20 | 2010-04-20 | Apple Inc. | Portable media player as a low power remote control and method thereof |
US8806347B2 (en) * | 2005-12-27 | 2014-08-12 | Panasonic Corporation | Systems and methods for providing distributed user interfaces to configure client devices |
US20070250900A1 (en) * | 2006-04-07 | 2007-10-25 | Andrew Marcuvitz | Media gateway and server |
US10079839B1 (en) | 2007-06-12 | 2018-09-18 | Icontrol Networks, Inc. | Activation of gateway device |
US20080065722A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Media device playlists |
US20080066135A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Search user interface for media device |
US9565387B2 (en) * | 2006-09-11 | 2017-02-07 | Apple Inc. | Perspective scale video with navigation menu |
US20080062137A1 (en) * | 2006-09-11 | 2008-03-13 | Apple Computer, Inc. | Touch actuation controller for multi-state media presentation |
US8243017B2 (en) | 2006-09-11 | 2012-08-14 | Apple Inc. | Menu overlay including context dependent menu icon |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
KR101333033B1 (en) * | 2007-02-27 | 2013-11-27 | 삼성전자주식회사 | Home A/V network system, Settop-box, image display apparatus, and UI offer method |
US7633385B2 (en) | 2007-02-28 | 2009-12-15 | Ucontrol, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US8451986B2 (en) | 2007-04-23 | 2013-05-28 | Icontrol Networks, Inc. | Method and system for automatically providing alternate network access for telecommunications |
EP2001223B1 (en) | 2007-06-04 | 2016-09-21 | fm marketing gmbh | Multi-media configuration |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10389736B2 (en) | 2007-06-12 | 2019-08-20 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10423309B2 (en) | 2007-06-12 | 2019-09-24 | Icontrol Networks, Inc. | Device integration framework |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10616075B2 (en) | 2007-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10666523B2 (en) | 2007-06-12 | 2020-05-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US20180198788A1 (en) * | 2007-06-12 | 2018-07-12 | Icontrol Networks, Inc. | Security system integrated with social media platform |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US10498830B2 (en) | 2007-06-12 | 2019-12-03 | Icontrol Networks, Inc. | Wi-Fi-to-serial encapsulation in systems |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10051078B2 (en) | 2007-06-12 | 2018-08-14 | Icontrol Networks, Inc. | WiFi-to-serial encapsulation in systems |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Control Networks, Inc. | Control system user interface |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
KR101472912B1 (en) * | 2007-09-03 | 2014-12-15 | 삼성전자주식회사 | Universal remote controller apparatus, universal remote controller system, and method thereof |
GB0718362D0 (en) * | 2007-09-20 | 2007-10-31 | Armour Home Electronics Ltd | Wireless communication device and system |
US8127233B2 (en) * | 2007-09-24 | 2012-02-28 | Microsoft Corporation | Remote user interface updates using difference and motion encoding |
US8619877B2 (en) * | 2007-10-11 | 2013-12-31 | Microsoft Corporation | Optimized key frame caching for remote interface rendering |
US8121423B2 (en) | 2007-10-12 | 2012-02-21 | Microsoft Corporation | Remote user interface raster segment motion detection and encoding |
US8106909B2 (en) * | 2007-10-13 | 2012-01-31 | Microsoft Corporation | Common key frame caching for a remote user interface |
US8234632B1 (en) | 2007-10-22 | 2012-07-31 | Google Inc. | Adaptive website optimization experiment |
WO2009086599A1 (en) * | 2008-01-07 | 2009-07-16 | Avega Systems Pty Ltd | A user interface for managing the operation of networked media playback devices |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US20090210863A1 (en) * | 2008-02-19 | 2009-08-20 | Google Inc. | Code-based website experiments |
KR101490688B1 (en) * | 2008-03-03 | 2015-02-06 | 삼성전자주식회사 | Apparatus for storing and processing contents and method of transmitting object meta information about contents using media transfer protocol from the apparatus |
TWI376109B (en) * | 2008-04-23 | 2012-11-01 | Compal Communications Inc | Wireless access system capable of controlling electronic devices and control method thereof |
WO2009134972A1 (en) * | 2008-04-30 | 2009-11-05 | Zeevee, Inc. | Dynamically modifying video and coding behavior |
US7886072B2 (en) | 2008-06-12 | 2011-02-08 | Apple Inc. | Network-assisted remote media listening |
US20170185278A1 (en) | 2008-08-11 | 2017-06-29 | Icontrol Networks, Inc. | Automation system user interface |
US20100011135A1 (en) * | 2008-07-10 | 2010-01-14 | Apple Inc. | Synchronization of real-time media playback status |
KR101539461B1 (en) * | 2008-07-16 | 2015-07-30 | 삼성전자주식회사 | Apparatus and method for providing an user interface service in a multimedia system |
US11792036B2 (en) | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11258625B2 (en) | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US10530839B2 (en) | 2008-08-11 | 2020-01-07 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US8510710B2 (en) * | 2008-10-06 | 2013-08-13 | Sap Ag | System and method of using pooled thread-local character arrays |
WO2010074535A2 (en) * | 2008-12-24 | 2010-07-01 | Lg Electronics Inc. | An iptv receiver and method for controlling an application in the iptv receiver |
WO2010113160A1 (en) * | 2009-03-31 | 2010-10-07 | Yubitech Technologies Ltd. | A method and system for emulating desktop software applications in a mobile communication network |
US8638211B2 (en) | 2009-04-30 | 2014-01-28 | Icontrol Networks, Inc. | Configurable controller and interface for home SMA, phone and multimedia |
US8448074B2 (en) * | 2009-05-01 | 2013-05-21 | Qualcomm Incorporated | Method and apparatus for providing portioned web pages in a graphical user interface |
US8255955B1 (en) | 2009-06-16 | 2012-08-28 | Tivo Inc. | Dynamic item highlighting system |
KR101642111B1 (en) | 2009-08-18 | 2016-07-22 | 삼성전자주식회사 | Broadcast reciver, mobile device, service providing method, and broadcast reciver controlling method |
KR101686413B1 (en) * | 2009-08-28 | 2016-12-14 | 삼성전자주식회사 | System and method for remote controling with multiple control user interface |
US20110066971A1 (en) * | 2009-09-14 | 2011-03-17 | Babak Forutanpour | Method and apparatus for providing application interface portions on peripheral computing devices |
KR101612845B1 (en) * | 2009-11-12 | 2016-04-15 | 삼성전자주식회사 | Method and apparatus for providing remote UI service |
US8700697B2 (en) * | 2009-11-30 | 2014-04-15 | Samsung Electronics Co., Ltd | Method and apparatus for acquiring RUI-based specialized control user interface |
BR112012027437B1 (en) * | 2010-04-30 | 2021-10-26 | Interdigital Madison Patent Holdings | METHOD FOR PROVIDING AND USING A DYNAMIC USER INTERFACE ON A SECOND SCREEN CONTROL DEVICE, AND SYSTEM FOR CONTROLLING CONTENT ON A MAIN DISPLAY USING A DYNAMICLY CREATED USER INTERFACE ON A SECOND SCREEN CONTROL DEVICE |
US8554938B2 (en) * | 2010-08-31 | 2013-10-08 | Millind Mittal | Web browser proxy-client video system and method |
US8836467B1 (en) | 2010-09-28 | 2014-09-16 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
KR102033764B1 (en) | 2010-10-06 | 2019-10-17 | 삼성전자주식회사 | User interface display method and remote controller using the same |
US20120113091A1 (en) * | 2010-10-29 | 2012-05-10 | Joel Solomon Isaacson | Remote Graphics |
US20120117511A1 (en) * | 2010-11-09 | 2012-05-10 | Sony Corporation | Method and apparatus for providing an external menu display |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US9147337B2 (en) | 2010-12-17 | 2015-09-29 | Icontrol Networks, Inc. | Method and system for logging security event data |
KR20120100045A (en) | 2011-03-02 | 2012-09-12 | 삼성전자주식회사 | User terminal apparatus, display apparatus, ui providing method and control method thereof |
US9210213B2 (en) * | 2011-03-03 | 2015-12-08 | Citrix Systems, Inc. | Reverse seamless integration between local and remote computing environments |
US8866701B2 (en) * | 2011-03-03 | 2014-10-21 | Citrix Systems, Inc. | Transparent user interface integration between local and remote computing environments |
US9880796B2 (en) | 2011-03-08 | 2018-01-30 | Georgia Tech Research Corporation | Rapid view mobilization for enterprise applications |
US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface |
US8840466B2 (en) | 2011-04-25 | 2014-09-23 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game |
US9691086B1 (en) * | 2011-05-13 | 2017-06-27 | Google Inc. | Adaptive content rendering |
US9201709B2 (en) | 2011-05-20 | 2015-12-01 | Citrix Systems, Inc. | Shell integration for an application executing remotely on a server |
CN102438004B (en) * | 2011-09-05 | 2017-02-08 | 深圳市创维软件有限公司 | Method and system for acquiring metadata information of media file and multimedia player |
CN103782603B (en) * | 2011-09-08 | 2017-05-31 | Nds有限公司 | The system and method that user interface shows |
WO2013041888A1 (en) * | 2011-09-23 | 2013-03-28 | Videojet Technologies Inc. | Networking method |
US20140229847A1 (en) * | 2011-10-13 | 2014-08-14 | Lg Electronics Inc. | Input interface controlling apparatus and method thereof |
US9760236B2 (en) * | 2011-10-14 | 2017-09-12 | Georgia Tech Research Corporation | View virtualization and transformations for mobile applications |
US8854433B1 (en) | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
JP5994659B2 (en) * | 2012-05-07 | 2016-09-21 | 株式会社デンソー | VEHICLE DEVICE, INFORMATION DISPLAY PROGRAM, VEHICLE SYSTEM |
US20130339871A1 (en) * | 2012-06-15 | 2013-12-19 | Wal-Mart Stores, Inc. | Software Application Abstraction System and Method |
US9098739B2 (en) | 2012-06-25 | 2015-08-04 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching |
US9111135B2 (en) | 2012-06-25 | 2015-08-18 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera |
US9430937B2 (en) | 2012-07-03 | 2016-08-30 | Google Inc. | Contextual, two way remote control |
EP2891038B1 (en) | 2012-08-31 | 2020-06-24 | Citrix Systems, Inc. | Reverse seamless integration between local and remote computing environments |
US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
US9979960B2 (en) | 2012-10-01 | 2018-05-22 | Microsoft Technology Licensing, Llc | Frame packing and unpacking between frames of chroma sampling formats with different chroma resolutions |
US9244720B2 (en) * | 2012-10-17 | 2016-01-26 | Cisco Technology, Inc. | Automated technique to configure and provision components of a converged infrastructure |
US9116604B2 (en) | 2012-10-25 | 2015-08-25 | Lenovo Enterprise Solutions (Singapore) Pte. Ltd. | Multi-device visual correlation interaction |
US10942735B2 (en) * | 2012-12-04 | 2021-03-09 | Abalta Technologies, Inc. | Distributed cross-platform user interface and application projection |
US9092665B2 (en) | 2013-01-30 | 2015-07-28 | Aquifi, Inc | Systems and methods for initializing motion tracking of human hands |
US9129155B2 (en) | 2013-01-30 | 2015-09-08 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map |
US10348778B2 (en) * | 2013-02-08 | 2019-07-09 | Avaya Inc. | Dynamic device pairing with media server audio substitution |
US9298266B2 (en) * | 2013-04-02 | 2016-03-29 | Aquifi, Inc. | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
US20140325374A1 (en) * | 2013-04-30 | 2014-10-30 | Microsoft Corporation | Cross-device user interface selection |
US20140365906A1 (en) * | 2013-06-10 | 2014-12-11 | Hewlett-Packard Development Company, L.P. | Displaying pre-defined configurations of content elements |
US9798388B1 (en) | 2013-07-31 | 2017-10-24 | Aquifi, Inc. | Vibrotactile system to augment 3D input systems |
US9507417B2 (en) | 2014-01-07 | 2016-11-29 | Aquifi, Inc. | Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
US9619105B1 (en) | 2014-01-30 | 2017-04-11 | Aquifi, Inc. | Systems and methods for gesture based interaction with viewpoint dependent user interfaces |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11405463B2 (en) | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US9769227B2 (en) | 2014-09-24 | 2017-09-19 | Microsoft Technology Licensing, Llc | Presentation of computing environment on multiple devices |
US20160092152A1 (en) * | 2014-09-25 | 2016-03-31 | Oracle International Corporation | Extended screen experience |
WO2016056223A1 (en) * | 2014-10-06 | 2016-04-14 | Sharp Kabushiki Kaisha | System for terminal resolution adaptation for devices |
US9584855B2 (en) * | 2014-12-29 | 2017-02-28 | Arris Enterprises, Inc. | Transfer of content between screens |
EP3295667B1 (en) * | 2015-05-12 | 2022-01-19 | D&M Holdings, Inc. | Method, system and interface for controlling a subwoofer in a networked audio system |
US11209972B2 (en) | 2015-09-02 | 2021-12-28 | D&M Holdings, Inc. | Combined tablet screen drag-and-drop interface |
US11113022B2 (en) | 2015-05-12 | 2021-09-07 | D&M Holdings, Inc. | Method, system and interface for controlling a subwoofer in a networked audio system |
US9999091B2 (en) | 2015-05-12 | 2018-06-12 | D&M Holdings, Inc. | System and method for negotiating group membership for audio controllers |
US10368080B2 (en) | 2016-10-21 | 2019-07-30 | Microsoft Technology Licensing, Llc | Selective upsampling or refresh of chroma sample values |
CN110753244B (en) * | 2018-07-24 | 2022-10-28 | 中兴通讯股份有限公司 | Scene synchronization method, terminal and storage medium |
US10558824B1 (en) | 2019-02-04 | 2020-02-11 | S2 Systems Corporation | Application remoting using network vector rendering |
US10452868B1 (en) | 2019-02-04 | 2019-10-22 | S2 Systems Corporation | Web browser remoting using network vector rendering |
US11880422B2 (en) | 2019-02-04 | 2024-01-23 | Cloudflare, Inc. | Theft prevention for sensitive information |
US10552639B1 (en) | 2019-02-04 | 2020-02-04 | S2 Systems Corporation | Local isolator application with cohesive application-isolation interface |
KR102198347B1 (en) * | 2019-06-03 | 2021-01-04 | 삼성전자주식회사 | User terminal apparatus, display apparatus, UI providing method and control method thereof |
JP2023512410A (en) | 2019-12-27 | 2023-03-27 | アバルタ テクノロジーズ、 インク. | Project, control, and manage user device applications using connection resources |
Family Cites Families (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3092711B2 (en) | 1990-09-11 | 2000-09-25 | キヤノン株式会社 | Output control device and method |
EP0664901B1 (en) | 1992-12-23 | 1996-09-18 | Otlc | Atomic command system |
US5506932A (en) | 1993-04-16 | 1996-04-09 | Data Translation, Inc. | Synchronizing digital audio to digital video |
US5930473A (en) * | 1993-06-24 | 1999-07-27 | Teng; Peter | Video application server for mediating live video services |
US6741617B2 (en) | 1995-04-14 | 2004-05-25 | Koninklijke Philips Electronics N.V. | Arrangement for decoding digital video signals |
US5798921A (en) | 1995-05-05 | 1998-08-25 | Johnson; Todd M. | Audio storage/reproduction system with automated inventory control |
US5751672A (en) | 1995-07-26 | 1998-05-12 | Sony Corporation | Compact disc changer utilizing disc database |
US5815297A (en) * | 1995-10-25 | 1998-09-29 | General Instrument Corporation Of Delaware | Infrared interface and control apparatus for consumer electronics |
US5835126A (en) | 1996-03-15 | 1998-11-10 | Multimedia Systems Corporation | Interactive system for a closed cable network which includes facsimiles and voice mail on a display |
US5945988A (en) * | 1996-06-06 | 1999-08-31 | Intel Corporation | Method and apparatus for automatically determining and dynamically updating user preferences in an entertainment system |
US5793366A (en) | 1996-11-12 | 1998-08-11 | Sony Corporation | Graphical display of an animated data stream between devices on a bus |
US5883621A (en) | 1996-06-21 | 1999-03-16 | Sony Corporation | Device control with topology map in a digital network |
PT932398E (en) * | 1996-06-28 | 2006-09-29 | Ortho Mcneil Pharm Inc | USE OF THE SURFACE OR ITS DERIVATIVES FOR THE PRODUCTION OF A MEDICINAL PRODUCT FOR THE TREATMENT OF MANIC-DEPRESSIVE BIPOLAR DISORDERS |
US6359661B1 (en) * | 1996-11-05 | 2002-03-19 | Gateway, Inc. | Multiple user profile remote control |
US5969286A (en) * | 1996-11-29 | 1999-10-19 | Electronics Development Corporation | Low impedance slapper detonator and feed-through assembly |
US6177931B1 (en) * | 1996-12-19 | 2001-01-23 | Index Systems, Inc. | Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information |
US6243725B1 (en) | 1997-05-21 | 2001-06-05 | Premier International, Ltd. | List building system |
EP1257094B8 (en) * | 1997-06-25 | 2007-08-08 | Samsung Electronics Co., Ltd. | Browser based command and control network |
EP0933939A4 (en) | 1997-07-18 | 1999-12-22 | Sony Corp | Method and system for multiplexing image signal, method and system for demultiplexing image signal, and transmission medium |
US6038614A (en) * | 1998-01-05 | 2000-03-14 | Gateway 2000, Inc. | Active volume control with hot key |
US6008802A (en) * | 1998-01-05 | 1999-12-28 | Intel Corporation | Method and apparatus for automatically performing a function based on the reception of information corresponding to broadcast data |
US6032202A (en) * | 1998-01-06 | 2000-02-29 | Sony Corporation Of Japan | Home audio/video network with two level device control |
US6085236A (en) * | 1998-01-06 | 2000-07-04 | Sony Corporation Of Japan | Home audio video network with device control modules for incorporating legacy devices |
US6160796A (en) * | 1998-01-06 | 2000-12-12 | Sony Corporation Of Japan | Method and system for updating device identification and status information after a local bus reset within a home audio/video network |
US6237049B1 (en) * | 1998-01-06 | 2001-05-22 | Sony Corporation Of Japan | Method and system for defining and discovering proxy functionality on a distributed audio video network |
US6118450A (en) | 1998-04-03 | 2000-09-12 | Sony Corporation | Graphic user interface that is usable as a PC interface and an A/V interface |
US6154206A (en) * | 1998-05-06 | 2000-11-28 | Sony Corporation Of Japan | Method and apparatus for distributed conditional access control on a serial communication network |
US6393430B1 (en) | 1998-05-08 | 2002-05-21 | Sony Corporation | Method and system for automatically recording music data files by using the hard drive of a personal computer as an intermediate storage medium |
US6219839B1 (en) * | 1998-05-12 | 2001-04-17 | Sharp Laboratories Of America, Inc. | On-screen electronic resources guide |
US7231175B2 (en) | 1998-06-16 | 2007-06-12 | United Video Properties, Inc. | Music information system for obtaining information on a second music program while a first music program is played |
US5969283A (en) | 1998-06-17 | 1999-10-19 | Looney Productions, Llc | Music organizer and entertainment center |
US6535919B1 (en) | 1998-06-29 | 2003-03-18 | Canon Kabushiki Kaisha | Verification of image data |
US6446109B2 (en) | 1998-06-29 | 2002-09-03 | Sun Microsystems, Inc. | Application computing environment |
CN1867068A (en) | 1998-07-14 | 2006-11-22 | 联合视频制品公司 | Client-server based interactive television program guide system with remote server recording |
AR020608A1 (en) | 1998-07-17 | 2002-05-22 | United Video Properties Inc | A METHOD AND AN ARRANGEMENT FOR PROVIDING A USER WITH REMOTE ACCESS TO AN INTERACTIVE PROGRAM GUIDE VIA A REMOTE ACCESS LINK |
US6208341B1 (en) * | 1998-08-05 | 2001-03-27 | U. S. Philips Corporation | GUI of remote control facilitates user-friendly editing of macros |
US6111677A (en) * | 1998-08-31 | 2000-08-29 | Sony Corporation | Optical remote control interface system and method |
US6564368B1 (en) | 1998-10-01 | 2003-05-13 | Call Center Technology, Inc. | System and method for visual application development without programming |
US6324681B1 (en) | 1998-10-01 | 2001-11-27 | Unisys Corporation | Automated development system for developing applications that interface with both distributed component object model (DCOM) and enterprise server environments |
US6498784B1 (en) | 1998-10-20 | 2002-12-24 | Interdigital Technology Corporation | Cancellation of pilot and traffic signals |
US6594825B1 (en) * | 1998-10-30 | 2003-07-15 | Intel Corporation | Method and apparatus for selecting a version of an entertainment program based on user preferences |
US6169725B1 (en) * | 1998-10-30 | 2001-01-02 | Sony Corporation Of Japan | Apparatus and method for restoration of internal connections in a home audio/video system |
US7058635B1 (en) * | 1998-10-30 | 2006-06-06 | Intel Corporation | Method and apparatus for searching through an electronic programming guide |
US6408128B1 (en) | 1998-11-12 | 2002-06-18 | Max Abecassis | Replaying with supplementary information a segment of a video |
US6816175B1 (en) | 1998-12-19 | 2004-11-09 | International Business Machines Corporation | Orthogonal browsing in object hierarchies |
US6342901B1 (en) | 1998-12-22 | 2002-01-29 | Xerox Corporation | Interactive device for displaying information from multiple sources |
US6505343B1 (en) | 1998-12-31 | 2003-01-07 | Intel Corporation | Document/view application development architecture applied to ActiveX technology for web based application delivery |
US20020194260A1 (en) | 1999-01-22 | 2002-12-19 | Kent Lawrence Headley | Method and apparatus for creating multimedia playlists for audio-visual systems |
US6236395B1 (en) * | 1999-02-01 | 2001-05-22 | Sharp Laboratories Of America, Inc. | Audiovisual information management system |
US6577735B1 (en) | 1999-02-12 | 2003-06-10 | Hewlett-Packard Development Company, L.P. | System and method for backing-up data stored on a portable audio player |
US6356971B1 (en) | 1999-03-04 | 2002-03-12 | Sony Corporation | System for managing multimedia discs, tracks and files on a standalone computer |
US6738964B1 (en) | 1999-03-11 | 2004-05-18 | Texas Instruments Incorporated | Graphical development system and method |
US6456714B2 (en) * | 1999-03-18 | 2002-09-24 | Sony Corporation | Apparatus and method for interfacing between multimedia network and telecommunications network |
US6487145B1 (en) | 1999-04-22 | 2002-11-26 | Roxio, Inc. | Method and system for audio data collection and management |
US8099758B2 (en) | 1999-05-12 | 2012-01-17 | Microsoft Corporation | Policy based composite file system and method |
US6792615B1 (en) | 1999-05-19 | 2004-09-14 | New Horizons Telecasting, Inc. | Encapsulated, streaming media automation and distribution system |
US6263503B1 (en) | 1999-05-26 | 2001-07-17 | Neal Margulis | Method for effectively implementing a wireless television system |
US6901435B1 (en) | 1999-06-17 | 2005-05-31 | Bmc Software, Inc. | GUI interpretation technology for client/server environment |
US6820260B1 (en) | 1999-06-17 | 2004-11-16 | Avaya Technology Corp. | Customized applet-on-hold arrangement |
CA2377941A1 (en) | 1999-06-28 | 2001-01-04 | United Video Properties, Inc. | Interactive television program guide system and method with niche hubs |
US6647417B1 (en) | 2000-02-10 | 2003-11-11 | World Theatre, Inc. | Music distribution systems |
US20010042107A1 (en) | 2000-01-06 | 2001-11-15 | Palm Stephen R. | Networked audio player transport protocol and architecture |
JP2001209586A (en) | 2000-01-26 | 2001-08-03 | Toshiba Corp | Unit and method of controlling contents for computer |
US6952737B1 (en) | 2000-03-03 | 2005-10-04 | Intel Corporation | Method and apparatus for accessing remote storage in a distributed storage cluster architecture |
US20030068154A1 (en) * | 2000-03-08 | 2003-04-10 | Edward Zylka | Gateway content storage system having database indexing, and method thereof |
US20010039660A1 (en) | 2000-03-31 | 2001-11-08 | Ucentric Holdings, Inc. | Home area network including arrangement for distributing television programming over local cable |
US6865593B1 (en) | 2000-04-12 | 2005-03-08 | Webcollege, Inc. | Dynamic integration of web sites |
US8352331B2 (en) | 2000-05-03 | 2013-01-08 | Yahoo! Inc. | Relationship discovery engine |
US6751402B1 (en) | 2000-06-28 | 2004-06-15 | Keen Personal Media, Inc. | Set-top box connectable to a digital video recorder via an auxiliary interface and selects between a recorded video signal received from the digital video recorder and a real-time video signal to provide video data stream to a display device |
US6782528B1 (en) | 2000-06-16 | 2004-08-24 | International Business Machines Corporation | Method and system for visual programming using a relational diagram |
US6882793B1 (en) * | 2000-06-16 | 2005-04-19 | Yesvideo, Inc. | Video processing system |
US6574617B1 (en) | 2000-06-19 | 2003-06-03 | International Business Machines Corporation | System and method for selective replication of databases within a workflow, enterprise, and mail-enabled web application server and platform |
US6657116B1 (en) | 2000-06-29 | 2003-12-02 | Microsoft Corporation | Method and apparatus for scheduling music for specific listeners |
US20020010652A1 (en) | 2000-07-14 | 2002-01-24 | Sony Corporation | Vendor ID tracking for e-marker |
AU2001283538A1 (en) | 2000-08-04 | 2002-02-18 | Tom C. Hill | Method and system for presenting digital media |
US6892228B1 (en) | 2000-08-23 | 2005-05-10 | Pure Matrix, Inc. | System and method for on-line service creation |
DK1312209T3 (en) | 2000-08-25 | 2017-06-26 | Opentv Inc | Individualized remote control |
JP2002118451A (en) | 2000-10-10 | 2002-04-19 | Fujitsu Ltd | Constant current driver circuit |
US20020113824A1 (en) | 2000-10-12 | 2002-08-22 | Myers Thomas D. | Graphic user interface that is usable as a commercial digital jukebox interface |
US20020046315A1 (en) | 2000-10-13 | 2002-04-18 | Interactive Objects, Inc. | System and method for mapping interface functionality to codec functionality in a portable audio device |
US6907301B2 (en) * | 2000-10-16 | 2005-06-14 | Sony Corporation | Method and system for selecting and controlling devices in a home network |
US7206853B2 (en) * | 2000-10-23 | 2007-04-17 | Sony Corporation | Content abstraction layer for use in home network applications |
US7861272B2 (en) | 2000-11-14 | 2010-12-28 | Russ Samuel H | Networked subscriber television distribution |
US6925200B2 (en) | 2000-11-22 | 2005-08-02 | R2 Technology, Inc. | Graphical user interface for display of anatomical information |
US20020180803A1 (en) | 2001-03-29 | 2002-12-05 | Smartdisk Corporation | Systems, methods and computer program products for managing multimedia content |
JP2002184114A (en) | 2000-12-11 | 2002-06-28 | Toshiba Corp | System for recording and reproducing musical data, and musical data storage medium |
KR100520058B1 (en) | 2000-12-13 | 2005-10-11 | 삼성전자주식회사 | System for upgrading device driver and method for upgrading the same |
US8601519B1 (en) | 2000-12-28 | 2013-12-03 | At&T Intellectual Property I, L.P. | Digital residential entertainment system |
US20020104091A1 (en) * | 2001-01-26 | 2002-08-01 | Amal Prabhu | Home audio video interoperability implementation for high definition passthrough, on-screen display, and copy protection |
US6938101B2 (en) * | 2001-01-29 | 2005-08-30 | Universal Electronics Inc. | Hand held device having a browser application |
US20020166123A1 (en) | 2001-03-02 | 2002-11-07 | Microsoft Corporation | Enhanced television services for digital video recording and playback |
US7039643B2 (en) | 2001-04-10 | 2006-05-02 | Adobe Systems Incorporated | System, method and apparatus for converting and integrating media files |
US6802058B2 (en) | 2001-05-10 | 2004-10-05 | International Business Machines Corporation | Method and apparatus for synchronized previewing user-interface appearance on multiple platforms |
US7346917B2 (en) | 2001-05-21 | 2008-03-18 | Cyberview Technology, Inc. | Trusted transactional set-top box |
US8291457B2 (en) | 2001-05-24 | 2012-10-16 | Vixs Systems, Inc. | Channel selection in a multimedia system |
US6839769B2 (en) * | 2001-05-31 | 2005-01-04 | Intel Corporation | Limiting request propagation in a distributed file system |
US6826512B2 (en) | 2001-06-28 | 2004-11-30 | Sony Corporation | Using local devices as diagnostic tools for consumer electronic devices |
US6901603B2 (en) | 2001-07-10 | 2005-05-31 | General Instrument Corporation | Methods and apparatus for advanced recording options on a personal versatile recorder |
US20050039208A1 (en) | 2001-10-12 | 2005-02-17 | General Dynamics Ots (Aerospace), Inc. | Wireless data communications system for a transportation vehicle |
US20040205498A1 (en) * | 2001-11-27 | 2004-10-14 | Miller John David | Displaying electronic content |
US20030110272A1 (en) | 2001-12-11 | 2003-06-12 | Du Castel Bertrand | System and method for filtering content |
US7254777B2 (en) * | 2001-12-20 | 2007-08-07 | Universal Electronics Inc. | System and method for controlling the recording functionality of an appliance using a program guide |
US7634795B2 (en) * | 2002-01-11 | 2009-12-15 | Opentv, Inc. | Next generation television receiver |
US9485532B2 (en) | 2002-04-11 | 2016-11-01 | Arris Enterprises, Inc. | System and method for speculative tuning |
WO2003096669A2 (en) * | 2002-05-10 | 2003-11-20 | Reisman Richard R | Method and apparatus for browsing using multiple coordinated device |
KR100485769B1 (en) | 2002-05-14 | 2005-04-28 | 삼성전자주식회사 | Apparatus and method for offering connection between network devices located in different home networks |
CA2526165A1 (en) | 2002-05-23 | 2003-12-04 | Phochron, Inc. | System and method for digital content processing and distribution |
US8181205B2 (en) * | 2002-09-24 | 2012-05-15 | Russ Samuel H | PVR channel and PVR IPG information |
EP1427148B1 (en) | 2002-12-04 | 2006-06-28 | Thomson Licensing | Method for communication between nodes in peer-to-peer networks using common group label |
US20040117788A1 (en) * | 2002-12-11 | 2004-06-17 | Jeyhan Karaoguz | Method and system for TV interface for coordinating media exchange with a media peripheral |
US7213228B2 (en) * | 2003-03-17 | 2007-05-01 | Macrovision Corporation | Methods and apparatus for implementing a remote application over a network |
US7787010B2 (en) | 2003-03-20 | 2010-08-31 | Pixar | Video to film flat panel digital recorder and method |
US7464110B2 (en) | 2004-06-30 | 2008-12-09 | Nokia Corporation | Automated grouping of image and other user data |
US7260461B2 (en) | 2005-10-31 | 2007-08-21 | Ford Global Technologies, Llc | Method for operating a pre-crash sensing system with protruding contact sensor |
US20070162661A1 (en) * | 2005-12-27 | 2007-07-12 | Pei-Yuan Fu | Memory extension apparatus and the method of data transfer applied therein |
2004
- 2004-02-14 US US10/779,953 patent/US7574691B2/en active Active
- 2004-03-17 WO PCT/US2004/008278 patent/WO2004084039A2/en active Application Filing

2009
- 2009-06-05 US US12/455,686 patent/US20090307658A1/en not_active Abandoned

2013
- 2013-10-11 US US14/051,619 patent/US20140201636A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6225938B1 (en) * | 1999-01-14 | 2001-05-01 | Universal Electronics Inc. | Universal remote control system with bar code setup |
US7574691B2 (en) * | 2003-03-17 | 2009-08-11 | Macrovision Corporation | Methods and apparatus for rendering user interfaces and display information on remote client devices |
US20090307658A1 (en) * | 2003-03-17 | 2009-12-10 | Pedro Freitas | Methods and apparatus for rendering user interfaces and display information on remote client devices |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080320395A1 (en) * | 2006-12-05 | 2008-12-25 | Sony Corporation | Electronic apparatus, an imaging apparatus, a display control method for the same and a program which allows a computer to execute the method |
US20140199947A1 (en) * | 2013-01-11 | 2014-07-17 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9014763B2 (en) * | 2013-01-11 | 2015-04-21 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US10133695B2 (en) * | 2013-12-08 | 2018-11-20 | Crossport Network Solutions Inc. | Link system for establishing high speed network communications and file transfer between hosts using I/O device links |
US20160217093A1 (en) * | 2013-12-08 | 2016-07-28 | Crossport Network Solutions Inc. | Link system for establishing high speed network communications and file transfer between hosts using i/o device links |
RU2688246C2 (en) * | 2014-09-24 | 2019-05-21 | МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи | Providing target device resource for rent to a computer environment host device |
US10448111B2 (en) | 2014-09-24 | 2019-10-15 | Microsoft Technology Licensing, Llc | Content projection |
US10635296B2 (en) | 2014-09-24 | 2020-04-28 | Microsoft Technology Licensing, Llc | Partitioned application presentation across devices |
US10824531B2 (en) | 2014-09-24 | 2020-11-03 | Microsoft Technology Licensing, Llc | Lending target device resources to host device computing environment |
CN108509242A (en) * | 2018-03-15 | 2018-09-07 | 维沃移动通信有限公司 | A kind of guidance method and server of application program operation |
US20200150838A1 (en) * | 2018-11-12 | 2020-05-14 | Citrix Systems, Inc. | Systems and methods for live tiles for saas |
US11226727B2 (en) * | 2018-11-12 | 2022-01-18 | Citrix Systems, Inc. | Systems and methods for live tiles for SaaS |
TWI835041B (en) | 2021-12-24 | 2024-03-11 | 瑞昱半導體股份有限公司 | Signal processing device, dongle and adaptor cable |
Also Published As
Publication number | Publication date |
---|---|
WO2004084039A3 (en) | 2006-03-16 |
US20040183756A1 (en) | 2004-09-23 |
WO2004084039A2 (en) | 2004-09-30 |
US20090307658A1 (en) | 2009-12-10 |
US7574691B2 (en) | 2009-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7574691B2 (en) | Methods and apparatus for rendering user interfaces and display information on remote client devices | |
US7213228B2 (en) | Methods and apparatus for implementing a remote application over a network | |
WO2021212668A1 (en) | Screen projection display method and display device | |
WO2021169141A1 (en) | Method for displaying audio track language on display device and display device | |
CN111327931B (en) | Viewing history display method and display device | |
US20110289460A1 (en) | Hierarchical display of content | |
US20120078952A1 (en) | Browsing hierarchies with personalized recommendations | |
WO2021189697A1 (en) | Video display method, terminal, and server | |
US20100157168A1 (en) | Multiple, Independent User Interfaces for an Audio/Video Device | |
US20120078937A1 (en) | Media content recommendations based on preferences for different types of media content | |
US8386954B2 (en) | Interactive media portal | |
JP2012141990A (en) | Configuration of user interface | |
CN111726673B (en) | Channel switching method and display device | |
US20050149991A1 (en) | Method and apparatus for finding applications and relating icons loaded on a television | |
CN111291238A (en) | Display device and search display method | |
WO2021189712A1 (en) | Method for switching webpage video from full-screen playing to small-window playing, and display device | |
US11943514B2 (en) | EPG interface presentation method and display apparatus | |
CN112380420A (en) | Searching method and display device | |
US20080046099A1 (en) | Method and system for customizing access to content aggregated from multiple sources | |
US20050149990A1 (en) | Actuating selected Java Applets on a TV using a remote control | |
WO2021139045A1 (en) | Method for playing back media project and display device | |
KR101714661B1 (en) | Method for data input and image display device thereof | |
TW200814782A (en) | Method and system for partitioning television channels in a platform | |
CN111324215A (en) | Display device and search display method | |
CN113347482B (en) | Method for playing data and display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MACROVISION CORPORATION, CALIFORNIA
Free format text: MERGER;ASSIGNOR:MEDIABOLIC, INC.;REEL/FRAME:031398/0146
Effective date: 20061220
Owner name: MEDIABOLIC, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PUTTERMAN, DANIEL;DIETRICH, BRAD;DOORNBOS, JOHN;AND OTHERS;SIGNING DATES FROM 20031001 TO 20031003;REEL/FRAME:031387/0812
Owner name: ROVI SOLUTIONS CORPORATION, CALIFORNIA
Free format text: CHANGE OF NAME;ASSIGNOR:MACROVISION CORPORATION;REEL/FRAME:031398/0197
Effective date: 20091001 |
|
AS | Assignment |
Owner name: MEDIABOLIC, INC., CALIFORNIA
Free format text: CORRECTIVE ASSIGNMENT TO REMOVE INCRRECT INVENTOR JOHN DOORNBOS AND ADD CORRENT INVENTOR JEREMY TOEMAN AND UPDATE EXECUTION DATES PREVIOUSLY RECORDED ON REEL 031387, FRAME 0812. ASSIGNORS HEREBY CONFIRMS THE CORRECT INVENTORS;ASSIGNORS:FREITAS, PEDRO;PUTTERMAN, DANIEL;TOEMAN, JEREMY;AND OTHERS;REEL/FRAME:031775/0355
Effective date: 20040416 |
|
AS | Assignment |
Owner name: MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT, MARYLAND
Free format text: PATENT SECURITY AGREEMENT;ASSIGNORS:APTIV DIGITAL, INC.;GEMSTAR DEVELOPMENT CORPORATION;INDEX SYSTEMS INC.;AND OTHERS;REEL/FRAME:033407/0035
Effective date: 20140702 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner names: UNITED VIDEO PROPERTIES, INC.; GEMSTAR DEVELOPMENT CORPORATION; APTIV DIGITAL INC.; INDEX SYSTEMS INC.; STARSIGHT TELECAST, INC.; VEVEO, INC.; SONIC SOLUTIONS LLC; ROVI SOLUTIONS CORPORATION; ROVI TECHNOLOGIES CORPORATION; ROVI GUIDES, INC. (all CALIFORNIA)
Free format text: RELEASE OF SECURITY INTEREST IN PATENT RIGHTS;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT;REEL/FRAME:051145/0090
Effective date: 20191122 |