Publication number: US 9271044 B2
Publication type: Grant
Application number: US 12/880,965
Publication date: Feb 23, 2016
Filing date: Sep 13, 2010
Priority date: Sep 14, 2009
Also published as: CN102025933A, EP2328347A2, EP2328347A3, US8819732, US8832747, US8839307, US8931015, US8947350, US8990854, US9043833, US9081422, US9098128, US9110517, US9110518, US9137577, US9197941, US9258617, US9462345, US20110063206, US20110063509, US20110063511, US20110063521, US20110063522, US20110063523, US20110066929, US20110067047, US20110067051, US20110067052, US20110067054, US20110067055, US20110067056, US20110067057, US20110067060, US20110067061, US20110067062, US20110067063, US20110067064, US20110067065, US20110067069, US20110067071, US20140366062, US20140380381, US20140380401, US20150007222, US20150012939, US20150106857, US20150135217, US20150172769, US20150296263, US20150304721, US20150326931, US20160007090
Publication number (alternate formats): 12880965, 880965, US 9271044 B2, US 9271044B2, US-B2-9271044, US9271044 B2, US9271044B2
Inventors: Jeyhan Karaoguz, Nambirajan Seshadri
Original assignee: Broadcom Corporation
System and method for providing information of selectable objects in a television program
US 9271044 B2
Abstract
A system and method for providing information of selectable objects in a television program as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
Claims (26)
What is claimed is:
1. A method for communicating television program information, the method comprising:
by a television or television receiver:
receiving, by the television or television receiver, moving picture information for a television program;
receiving, by the television or television receiver, user-selectable object information corresponding to a user-selectable object in the television program; and
combining, by the television or television receiver, the received moving picture information and the received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program, the completed moving picture data set is formatted in accordance with a moving picture standard;
wherein:
said receiving moving picture information for the television program comprises receiving an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and
said combining comprises modifying the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the completed moving picture data set that are not assigned by the moving picture standard.
2. The method of claim 1, comprising communicating the combined data set in at least one serial data stream over a communication network to at least one recipient, the at least one serial data stream comprising a serial data stream that comprises moving picture information and user-selectable object information.
3. The method of claim 1, comprising storing the combined data set on a computer readable medium, the combined data set comprising user-selectable object information interleaved with moving picture information.
4. The method of claim 1, wherein the moving picture information for the television program is formatted for communicating the television program without information describing user-selectable objects in the television program.
5. The method of claim 1, wherein said modifying comprises changing at least a portion of the initial object information in accordance with the received user-selectable object information.
6. The method of claim 1, wherein the received user-selectable object information corresponding to the user-selectable object in the television program comprises customized user-selectable object information that is customized to a particular set of one or more users.
7. The method of claim 1, wherein the received user-selectable object information corresponding to the user-selectable object in the television program comprises information describing location of the user-selectable object in a frame of the television program.
8. A television receiver comprising:
at least one processor in the television receiver operable to, at least:
receive moving picture information for a television program;
receive user-selectable object information corresponding to a user-selectable object in the television program;
combine the received moving picture information and the received user-selectable object information into a combined data set, the combined data set is formatted in accordance with a moving picture standard; and
communicate the combined data set comprising interleaved moving picture information and user-selectable object information;
wherein:
the at least one processor is operable to receive the moving picture information for the television program by, at least in part, operating to receive an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and
the at least one processor is operable to combine the received moving picture information and the received user-selectable object information into the combined data set by, at least in part, operating to modify the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the combined data set that are not assigned by the moving picture standard.
9. The television receiver of claim 8, wherein the at least one processor is operable to communicate the combined data set in at least one serial data stream over a communication network to at least one recipient, the at least one serial data stream comprising a serial data stream that comprises moving picture information and user-selectable object information.
10. The television receiver of claim 8, wherein the at least one processor is operable to store the combined data set on a computer readable medium, the combined data set comprising user-selectable object information interleaved with moving picture information.
11. The television receiver of claim 8, wherein the moving picture information for the television program is formatted for communicating the television program without information describing user-selectable objects in the television program.
12. The television receiver of claim 11, wherein the at least one processor is operable to combine the received moving picture information and the received user-selectable object information in the combined data set by, at least in part, operating to insert the received user-selectable object information in the completed moving picture data set to create the combined data set comprising a moving picture data set and the received user-selectable object information.
13. The television receiver of claim 8, wherein said at least one processor is operable to receive moving picture information for the television program by, at least in part, operating to receive moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program.
14. The television receiver of claim 13, wherein said at least one processor is operable to combine the received moving picture information and the received user-selectable object information into the combined data set by, at least in part, operating to combine the received moving picture information and the received user-selectable object information into the completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program.
15. The television receiver of claim 8, wherein the at least one processor is operable to modify the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by, at least in part, operating to change at least a portion of the initial object information in accordance with the received user-selectable object information.
16. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises customized user-selectable object information that is customized to a particular set of one or more users.
17. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises information describing location of the user-selectable object in a frame of the television program.
18. The television receiver of claim 8, where the received user-selectable object information corresponding to the user-selectable object in the television program comprises information identifying at least one action to be performed upon user-selection of the user-selectable object.
19. The method of claim 1, further comprising communicating the combined data set in parallel data streams, each of the parallel data streams comprising interleaved moving picture information and user-selectable object information.
20. The method of claim 1, further comprising aggregating the user-selectable object information received from a plurality of data sources into a single user-selectable object data set prior to the combining.
21. A method for communicating television program information, the method comprising:
by a television or television receiver system:
receiving, by the television or television receiver, moving picture information for a television program;
receiving, by the television or television receiver, user-selectable object information corresponding to a user-selectable object in the television program;
combining, by the television or television receiver, the received moving picture information and the received user-selectable object information into a combined data set, the combined set is formatted in accordance with a moving picture standard; and
communicating, by the television or television receiver, the combined data set in parallel data streams, each of the parallel data streams comprising interleaved moving picture information and user-selectable object information;
wherein:
the at least one processor is operable to receive the moving picture information for the television program by, at least in part, operating to receive an initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program; and
the at least one processor is operable to combine the received moving picture information and the received user-selectable object information into the combined data set by, at least in part, operating to modify the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information by inserting the received user-selectable object information in data fields of the combined data set that are not assigned by the moving picture standard.
22. The method according to claim 1, wherein modifying the initial user-selectable object information comprises changing information defining the user-selectable object presented in the television program.
23. The method according to claim 1, wherein modifying the initial user-selectable object information comprises changing information regarding an action performed upon selection of the user-selectable object.
24. The method according to claim 1, wherein modifying the initial user-selectable object information comprises deleting information regarding the user-selectable object.
25. The method according to claim 1, wherein modifying the initial user-selectable object information comprises encrypting information regarding the user-selectable object.
26. The method according to claim 1, wherein the initial combined television program data set comprising initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program is received in a single serial data stream.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE

This patent application is related to and claims priority from provisional patent application Ser. No. 61/242,234 filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. 12/881,004, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A TELEVISION PROGRAM IN AN INFORMATION STREAM INDEPENDENT OF THE TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/881,031, filed concurrently herewith, titled “SYSTEM AND METHOD FOR PROVIDING INFORMATION OF SELECTABLE OBJECTS IN A STILL IMAGE FILE AND/OR DATA STREAM”. This patent application is further related to U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.

FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[Not Applicable]

SEQUENCE LISTING

[Not Applicable]

MICROFICHE/COPYRIGHT REFERENCE

[Not Applicable]

BACKGROUND OF THE INVENTION

Present television systems are incapable of providing for and/or conveniently providing for user-selection of objects in a television program. Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.

BRIEF SUMMARY OF THE INVENTION

Various aspects of the present invention provide a system and method for providing information of selectable objects in a television program, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.

BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS

FIG. 1 is a diagram illustrating an exemplary television system, in accordance with various aspects of the present invention.

FIG. 2 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention.

FIG. 3 is a flow diagram illustrating an exemplary method for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention.

FIG. 4 is a diagram illustrating an exemplary television system, in accordance with various aspects of the present invention.

FIG. 5 is a diagram illustrating exemplary modules and/or sub-modules for a television system, in accordance with various aspects of the present invention.

DETAILED DESCRIPTION OF VARIOUS ASPECTS OF THE INVENTION

The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.

Additionally, the following discussion will refer to various television system modules (e.g., television modules, television receiver modules, television controller modules, modules of a user's local television system, modules of a geographically distributed television system, etc.). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.

The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television control device, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), a general data communication network (e.g., the Internet), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.

The following discussion may at times refer to an on-screen pointing location. Such a pointing location refers to a location on the television screen (e.g., a primary television screen, a secondary television screen, etc.) to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing. Various aspects of the present invention, while referring to on-screen pointing location, are also readily extensible to such other forms of on-screen location identification.
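
By way of a non-limiting illustrative sketch (not part of the original disclosure; the names, screen dimensions, and mapping below are assumptions introduced for illustration), an on-screen pointing location reported in display coordinates might be mapped into the coordinate system of a television program frame as follows:

    # Illustrative sketch only: maps a pointing location reported in screen
    # coordinates to the coordinate system of the television program frame.
    # All names (ScreenGeometry, map_to_frame, etc.) are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ScreenGeometry:
        screen_width: int    # pixels of the display
        screen_height: int
        frame_width: int     # pixels of the program frame (e.g., 1920x1080)
        frame_height: int

    def map_to_frame(geom: ScreenGeometry, screen_x: int, screen_y: int):
        """Scale an on-screen pointing location into frame coordinates."""
        fx = int(screen_x * geom.frame_width / geom.screen_width)
        fy = int(screen_y * geom.frame_height / geom.screen_height)
        return fx, fy

    # Example: a 3840x2160 display presenting a 1920x1080 program frame.
    geom = ScreenGeometry(3840, 2160, 1920, 1080)
    print(map_to_frame(geom, 1920, 1080))   # -> (960, 540), the frame center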

Additionally, the following discussion will at times refer to television programming. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded television programming, broadcast/multicast/unicast television programming, etc.). Such television programming may, for example, comprise real-time television broadcast programming (or multicast or unicast television programming) and/or user-stored television programming that is stored in a user device (e.g., a VCR, PVR, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.). Various aspects of the present invention may, for example in a television program source system and/or television program distribution system, comprise embedding information in a television program, where such information describes various aspects of user-selectable objects in the television program. Various aspects of the present invention may also, for example in a television, comprise receiving television programming, presenting such received television programming to a user, determining an on-screen pointing location pointed to by the user and processing information of user-selectable objects embedded in the received television programming to identify a user-selected object in the television programming and/or associated actions.

Also, the following discussion will at times refer to user-selectable objects in television programming. Such user-selectable objects include both animate (i.e., living) and inanimate (i.e., non-living) objects, both still and moving. Such objects may, for example, comprise characteristics of any of a variety of objects present in television programming. Such objects may, for example and without limitation, comprise inanimate objects, such as consumer good objects (e.g., clothing, automobiles, shoes, jewelry, furniture, food, beverages, appliances, electronics, toys, artwork, cosmetics, recreational vehicles, sports equipment, safety equipment, computer equipment, communication devices, books, etc.), premises objects (e.g., business locations, stores, hotels, signs, doors, buildings, landmarks, historical sites, entertainment venues, hospitals, government buildings, etc.), objects related to services (e.g., objects related to transportation, objects related to emergency services, objects related to general government services, objects related to entertainment services, objects related to food and/or drink services, etc.), objects related to location (e.g., parks, landmarks, streets, signs, road signs, etc.), etc. Such objects may, for example, comprise animate objects, such as people (e.g., actors/actresses, athletes, musicians, salespeople, commentators, reporters, analysts, hosts/hostesses, entertainers, etc.), animals (e.g., pets, zoo animals, wild animals, etc.) and plants (e.g., flowers, trees, shrubs, fruits, vegetables, cacti, etc.).

Turning first to FIG. 1, such figure is a diagram illustrating a non-limiting exemplary television system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a television provider 110. The television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc. The television provider 110 may, for example, be an original source of television programming (or related information). Also for example, the television provider 110 may be a communication company that provides television programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.). The television provider 110 may, for example, provide television programming and non-programming information and/or video content. The television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.). As will be discussed below in more detail, the television provider 110 may operate to create a television program (or television program data set, television program data stream, etc.) that includes embedded information of user-selectable objects in the television program. For example and without limitation, such a television provider 110 may operate to receive a completed television program (e.g., a data file, a data stream, etc.), for example via a communication network and/or on a physical media, and embed information of user-selectable objects in the completed television program. Also for example, such a television provider 110 may operate to form the original television program and embed information of user-selectable objects in the original television program during such formation (e.g., in the studio).

The exemplary television system 100 may also include a third party program information provider 120. Such a provider may, for example, provide information related to a television program. Such information may, for example, comprise information describing user-selectable objects in programming, program guide information, etc. As will be discussed below in more detail, such a third party program information provider (e.g., a party independent of a television program source, television program network operator, etc.) may operate to create a television program (or television program data set, television program data stream, etc.) that includes embedded information of user-selectable objects in the television program. For example and without limitation, such a third party program information provider 120 may operate to receive a completed television program (e.g., a data file, a data stream, etc.), for example via a communication network and/or on a physical media, and embed information of user-selectable objects in the completed television program.

The exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which television programming and/or information related to television programming may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of any one or more of: a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.

The exemplary television system 100 may include a first television 140. Such a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130. For example, the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.). Also for example, the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.). The first television 140 may, for example, operate to (which includes “operate when enabled to”) perform any or all of the functionality discussed herein. The first television 140 may, for example, operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 may include a first television controller 160. Such a first television controller 160 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the first television 140. The first television controller 160 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.

The first television controller 160 (or television control device) may, for example, transmit signals directly to the first television 140 to control operation of the first television 140. The first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the first television 140, or to conduct other transactions (e.g., business transactions, etc.).

As will be discussed in more detail later, the first television controller 160 may operate to communicate screen pointing information with the first television 140 and/or other devices. Also, as will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location.

Additionally, for example in a scenario in which the first television controller 160 comprises an on-board display, the first television controller 160 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

As will be mentioned throughout the following discussion, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local television system. The first television 140 and first television controller 160 provide a non-limiting example of a user's local television system. Such a user's local television system, for example, generally refers to the television-related devices that are local to the television system currently being utilized by the user. For example, when a user is utilizing a television system located at the user's home, the user's local television system generally refers to the television-related devices that make up the user's home television system. Also for example, when a user is utilizing a television system at a premises away from the user's home (e.g., at another home, at a hotel, at an office, etc.), the user's local television system generally refers to the television-related devices that make up the premises television system. Such a user's local television system does not, for example, comprise television network infrastructure devices that are generally outside of the user's current premises (e.g., cable and/or satellite head-end apparatus, cable and/or satellite communication intermediate communication network nodes) and/or programming source devices that are generally managed by television enterprises and generally exist outside of the user's home. Such entities, which may be communicatively coupled to the user's local television system, may be considered to be entities remote from the user's local television system (or “remote entities”).

The exemplary television system 100 may also include a television receiver 151. The television receiver 151 may, for example, operate to (e.g., which may include “operate when enabled to”) provide a communication link between a television and/or television controller and a communication network and/or information provider. For example, the television receiver 151 may operate to provide a communication link between the second television 141 and the communication network 130, or between the second television 141 and the television provider 110 (and/or third party program information provider 120) via the communication network 130.

The television receiver 151 may comprise characteristics of any of a variety of types of television receivers. For example and without limitation, the television receiver 151 may comprise characteristics of a cable television receiver, a satellite television receiver, etc. Also for example, the television receiver 151 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The television receiver 151 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.).

Additionally, for example in a scenario in which the television receiver 151 comprises an on-board display and/or provides audio/video information to a television communicatively coupled thereto, the television receiver 151 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 may include a second television controller 161. Such a second television controller 161 may, for example, operate to (e.g., which may include “operate when enabled to”) control operation of the second television 141 and the television receiver 151. The second television controller 161 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.

The second television controller 161 may, for example, operate to transmit signals directly to the second television 141 to control operation of the second television 141. The second television controller 161 may, for example, operate to transmit signals directly to the television receiver 151 to control operation of the television receiver 151. The second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 151 and the communication network 130) to the television provider 110 to control television programming (or related information) being provided to the television receiver 151, or to conduct other transactions (e.g., business transactions, etc.).

As will be discussed in more detail later, various aspects of the present invention include a user selecting a user-selectable object in programming. Such selection may, for example, comprise the user pointing to a location on a television screen (e.g., pointing to an animate or inanimate object presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location. Also, in a scenario in which the second television controller 161 comprises a touch screen, a user may touch a location of such touch screen to point to an on-screen location (e.g., to select a user-selectable object).

As will be mentioned throughout the following discussion, and as mentioned previously in the discussion of the first television 140 and television controller 160, various aspects of the invention will be performed by one or more devices, components and/or modules of a user's local television system. The second television 141, television receiver 151 and second television controller 161 provide another non-limiting example of a user's local television system.

Additionally, for example in a scenario in which the second television controller 161 comprises an on-board display, the second television controller 161 may operate to receive and process television program information (e.g., via a communication network, stored on a physical medium or computer readable medium, etc.), where such television program information comprises embedded information of user-selectable objects.

The exemplary television system 100 was presented to provide a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.

FIG. 2 is a flow diagram illustrating an exemplary method 200 for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention. Any or all aspects of the exemplary method 200 may, for example, be implemented in a television system component (e.g., the television provider 110, third party program information provider 120, a component of a communication network 130, first television 140, first television controller 160, second television 141, television receiver 151, second television controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such television system components operating in conjunction. For example, any or all aspects of the exemplary method 200 may be implemented in one or more television system components remote from the user's local television system. Also for example, any or all aspects of the exemplary method 200 may be implemented in one or more components of the user's local television system.

The exemplary method 200 may, for example, begin executing at step 205. The exemplary method 200 may begin executing in response to any of a variety of causes and/or conditions, non-limiting examples of which will now be provided. For example, the exemplary method 200 may begin executing in response to a user command to begin (e.g., a user at a television program source, a user at a television production studio, a user at a television distribution enterprise, etc.), in response to television program information and/or information of user-selectable objects in a television program arriving at a system entity implementing the method 200, in response to an electronic request communicated from an external entity to a system entity implementing the method 200, in response to a timer, in response to a request from an end user and/or a component of a user's local television system for a television program including information of user-selectable objects, in response to a request from a user for a television program where such user is associated in a database with television programming comprising user-selectable objects, upon reset and/or power-up of a system component implementing the exemplary method 200, in response to identification of a user and/or user equipment for which object selection capability is to be provided, in response to user payment of a fee, etc.

The exemplary method 200 may, for example at step 210, comprise receiving moving picture information for a television program. Many non-limiting examples of such television programs were provided above. Note that, depending on the particular implementation, such moving picture information may also, for example, be received with corresponding audio information.

Step 210 may comprise receiving the moving picture information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 210 may comprise receiving the moving picture information from a television broadcasting company, from a movie streaming company, from a television studio, from a television program database or server, from a video camera or other video recording device, an Internet television programming provider, etc.

Step 210 may comprise receiving the moving picture information via any of a variety of types of communication networks. Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or cable television network. Such networks may, for example, comprise any of variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).

Step 210 may comprise receiving the moving picture information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), computer memory device (e.g., flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200. For example, in a scenario including the utilization of such hard media, step 210 may comprise receiving the moving picture information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).

In an exemplary scenario, step 210 may comprise receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program without information describing user-selectable objects in the television program. For example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.). For example, such a data set may be a data file (or set of logically linked data files) formatted in an MPEG or DVD format for normal presentation on a user's local television system. Such a data set of a television program, when received at step 210, might not have information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be added, as will be explained below.

In another exemplary scenario, step 210 may comprise receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. In an exemplary implementation, step 210 may comprise receiving moving picture information (e.g., frame-by-frame bitmaps, partially encoded moving picture information, etc.) that will be formatted in accordance with a moving picture standard, but which has not yet been so formatted. Such a data set of a television program, when received at step 210, might not have information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be added, as will be explained below.

In yet another exemplary scenario, step 210 may comprise receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program with information describing user-selectable objects in the television program. For example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.), or a variant thereof, that specifically accommodates information of user-selectable objects in the television program. Also for example, the received completed moving picture data set may be in conformance with a moving picture standard (e.g., MPEG, MPEG-2, MPEG-4, MPEG-4 AVC, DVD, way, etc.), or a variant thereof, that while not specifically accommodating information of user-selectable objects in the television program, allows for the incorporation of such information in unassigned data fields. For example, such a data set may be a data file (or set of logically linked data files) formatted in an MPEG or DVD format for normal presentation on a user's local television system. Such a data set of a television program, when received at step 210, might comprise information of user-selectable objects in the television program. Such information of user-selectable objects may then, for example, be deleted, modified and/or appended, as will be explained below.
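
As a non-limiting illustrative sketch of this general approach (the 4-byte tag, length prefix, and JSON payload below are assumptions chosen for illustration and do not reproduce the syntax of any particular moving picture standard), user-selectable object information might be serialized into an opaque blob that a formatter could carry in otherwise unassigned or private data fields of the completed data set:

    # Illustrative sketch only (not the MPEG syntax): serializes user-selectable
    # object information and wraps it as an opaque "private data" blob that a
    # formatter could place in data fields left unassigned by the moving picture
    # standard in use. The tag value and layout below are assumptions.
    import json
    import struct

    PRIVATE_TAG = b"USEL"   # hypothetical 4-byte identifier for object info

    def pack_object_info(objects):
        payload = json.dumps(objects).encode("utf-8")
        # tag (4 bytes) + big-endian length (4 bytes) + payload
        return PRIVATE_TAG + struct.pack(">I", len(payload)) + payload

    def unpack_object_info(blob):
        assert blob[:4] == PRIVATE_TAG, "not a user-selectable object blob"
        (length,) = struct.unpack(">I", blob[4:8])
        return json.loads(blob[8:8 + length].decode("utf-8"))

    objects = [{"object_id": 1, "label": "wristwatch",
                "frame": 120, "bbox": [640, 300, 720, 380]}]
    blob = pack_object_info(objects)
    print(unpack_object_info(blob)[0]["label"])   # -> wristwatch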

Step 210 may, for example, comprise receiving the moving picture information in digital and/or analog signals. Though the examples provided above generally concerned the receipt of digital data, such examples are readily extendible to the receipt of analog moving picture information (e.g., the receipt of composite and/or component video signals, etc.).

In general, step 210 may comprise receiving moving picture information for a television program. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of moving picture information or by any particular manner of receiving moving picture information unless explicitly claimed.

The exemplary method 200 may, at step 220, comprise receiving object information corresponding to a user-selectable object in the television program. Many non-limiting examples of receiving such object information will now be provided.

Step 220 may comprise receiving the user-selectable object information from any of a variety of sources, non-limiting examples of which will now be provided. For example and without limitation, step 220 may comprise receiving the user-selectable object information from a television broadcasting company, from a movie streaming company, from a television studio, from a television program database or server, from an advertising company, from a commercial enterprise associated with a user-selectable object in a television program, from a person or organization associated with a user-selectable object in a television program, from an Internet television programming provider, from a third party television program information source, etc.

Step 220 may comprise receiving the user-selectable object information from a plurality of independent sources. For example, in an exemplary scenario in which a television program includes user-selectable objects corresponding to a plurality of respective interested parties (e.g., respective product sponsors, respective leagues or other associations, respective people, etc.), step 220 may comprise receiving the user-selectable object information from each of such respective interested parties. For example, step 220 may comprise receiving user-selectable object information corresponding to a user-selectable consumer good in a television program from a provider of such consumer good, receiving user-selectable object information corresponding to an entertainer in the television program from the entertainer's management company, receiving user-selectable object information corresponding to a user-selectable historical landmark in the television program from a society associated with the historical landmark, receiving user-selectable object information corresponding to a user-selectable object in the television program associated with a service from a provider of such service, etc. In such a multiple-source scenario, step 220 may comprise aggregating the user-selectable object information received from the plurality of sources (e.g., into a single user-selectable object data set) for ultimate combination of such user-selectable object information with received moving picture information.
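
As a non-limiting illustrative sketch of such aggregation (the record fields and source names below are hypothetical), user-selectable object information arriving from several independent sources might be merged into a single user-selectable object data set prior to the combining:

    # Illustrative sketch only: aggregates user-selectable object information
    # received from several independent sources into a single object data set
    # prior to combining it with the moving picture information. Field names
    # are assumptions for illustration.
    def aggregate_object_info(per_source):
        merged = {}
        for source, records in per_source.items():
            for record in records:
                entry = merged.setdefault(record["object_id"], {"sources": []})
                entry.update({k: v for k, v in record.items() if k != "object_id"})
                entry["sources"].append(source)
        return [{"object_id": oid, **info} for oid, info in sorted(merged.items())]

    per_source = {
        "consumer_good_provider": [{"object_id": 7, "label": "running shoe"}],
        "management_company":     [{"object_id": 9, "label": "entertainer"}],
    }
    print(len(aggregate_object_info(per_source)))   # -> 2 aggregated objects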

Step 220 may, for example, comprise receiving the user-selectable object information from a same source as that from which the moving picture information was received at step 210 or may comprise receiving the user-selectable object information from a different source. For example and without limitation, step 220 may comprise receiving the user-selectable object information from an advertising company, while step 210 comprises receiving the moving picture information from a television studio. In another example, step 220 may comprise receiving the user-selectable object information from a commercial enterprise associated with a consumer good object presented in the television program, while step 210 comprises receiving the moving picture information from a head-end server of a sports network.

In yet another example, step 220 may comprise receiving the user-selectable object information directly from a computer process that generates such information. For example, an operator may play a moving picture (e.g., at a normal rate, a slower-than-normal rate, frame-by-frame, etc.) and utilize graphical tools (e.g., boxes or other polygons, edge detection routines, etc.) to define and track movement of a user-selectable object in the moving picture. Such a computer process may then output information describing the object and/or movement thereof in the moving picture. Step 220 may comprise receiving the information output from such process.
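
As a non-limiting illustrative sketch of the output of such a process (the linear interpolation and record layout below are assumptions; an actual tool could instead use edge detection or other tracking techniques), operator-defined bounding boxes at two keyframes might be expanded into per-frame object location records:

    # Illustrative sketch only: a simple authoring-tool routine that takes
    # operator-defined bounding boxes at two keyframes and emits per-frame
    # object location records by linear interpolation. Names are illustrative.
    def interpolate_boxes(object_id, frame_a, box_a, frame_b, box_b):
        records = []
        span = frame_b - frame_a
        for frame in range(frame_a, frame_b + 1):
            t = (frame - frame_a) / span
            box = [round(a + t * (b - a)) for a, b in zip(box_a, box_b)]
            records.append({"object_id": object_id, "frame": frame, "bbox": box})
        return records

    # Object 3 drifts from x=100 to x=160 over 30 frames.
    track = interpolate_boxes(3, 0, [100, 200, 180, 280], 30, [160, 200, 240, 280])
    print(track[15]["bbox"])   # -> [130, 200, 210, 280]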

Step 220 may comprise receiving the user-selectable object information via any of a variety of types of communication networks. Such networks may, for example, comprise a wireless television network (e.g., terrestrial and/or satellite) and/or cable television network. Such networks may, for example, comprise any of variety of general data communication networks (e.g., the Internet, a local area network, a personal area network, a metropolitan area network, etc.).

Step 220 may, for example, comprise receiving the user-selectable object information via a same communication network as that via which the moving picture information was received at step 210 or may comprise receiving the user-selectable object information from a different communication network. For example and without limitation, step 220 may comprise receiving the user-selectable object information via a general data communication network (e.g., the Internet), while step 210 comprises receiving the moving picture information via a television network. In another example, step 220 may comprise receiving the user-selectable object information via a general data network, while step 210 comprises receiving the moving picture information from a computer readable medium.

Step 220 may comprise receiving the user-selectable object information from any of a variety of types of hard media (e.g., optical storage media, magnetic storage media, etc.). Such hard media may, for example, comprise characteristics of optical storage media (e.g., compact disc, digital versatile disc, Blu-ray®, laser disc, etc.), magnetic storage media (e.g., hard disc, diskette, magnetic tape, etc.), computer memory device (e.g., flash memory, one-time-programmable memory, read-only memory, random access memory, thumb drive, etc.). Such memory may, for example, be a temporary and/or permanent component of the system entity implementing the method 200. For example, in a scenario including the utilization of such hard media, step 220 may comprise receiving the user-selectable object information from such a device and/or from a reader of such a device (e.g., directly via an end-to-end conductor or via a communication network).

The object information corresponding to one or more user-selectable objects that is received at step 220 may comprise any of a variety of characteristics, non-limiting examples of which will now be provided.

For example, such user-selectable object information may comprise information describing and/or defining the user-selectable object that is shown in the television program. Such information may, for example, be processed by a recipient of such information to identify an object that is being selected by a user. Such information may, for example, comprise information describing boundaries associated with a user-selectable object in the television program (e.g., actual object boundaries (e.g., an object outline), areas generally coinciding with a user-selectable object (e.g., a description of one or more geometric shapes that generally correspond to a user-selectable object), selection areas that when selected indicate user-selection of a user-selectable object (e.g., a superset and/or subset of a user-selectable object in the television program), etc.). Such information may, for example, describe and/or define the user-selectable object in a television program frame coordinate system.

Such information describing and/or defining the user-selectable object that is shown in the television program may comprise information describing movement of a user-selectable object in the television program. For example, such information may comprise information describing the location of the object on a frame-by-frame basis, information describing movement of a user-selectable object in television screen coordinates as a function of time and/or frame, information describing location of a user-selectable object in a video frame relative to a previous object location in a previous video frame, etc.
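
As a non-limiting illustrative sketch of one such movement encoding (the delta-based record layout below is an assumption introduced for illustration), an object location in a reference frame plus per-frame offsets might be expanded back into absolute frame-by-frame locations:

    # Illustrative sketch only: one possible encoding of object movement as an
    # absolute location in a reference frame followed by per-frame deltas, and
    # a routine that reconstructs the absolute frame-by-frame locations. The
    # record layout is an assumption for illustration.
    def expand_movement(start_frame, start_box, deltas):
        boxes = [{"frame": start_frame, "bbox": list(start_box)}]
        x0, y0, x1, y1 = start_box
        for i, (dx, dy) in enumerate(deltas, start=1):
            x0, y0, x1, y1 = x0 + dx, y0 + dy, x1 + dx, y1 + dy
            boxes.append({"frame": start_frame + i, "bbox": [x0, y0, x1, y1]})
        return boxes

    # Object boundary starts at [100, 50, 150, 120] and shifts right 2 pixels
    # per frame for three frames.
    print(expand_movement(240, [100, 50, 150, 120], [(2, 0), (2, 0), (2, 0)])[-1])
    # -> {'frame': 243, 'bbox': [106, 50, 156, 120]}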

Many examples of such object description information are provided in a variety of related U.S. patent applications. For example, as mentioned previously, U.S. patent application Ser. No. 12/774,380, filed May 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,832, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,866, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION RECEIVER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,911, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/850,945, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION CONTROLLER FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/851,036, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/851,075, filed Aug. 5, 2010, titled “SYSTEM AND METHOD IN A PARALLEL TELEVISION SYSTEM FOR PROVIDING USER-SELECTION OF OBJECTS IN A TELEVISION PROGRAM”; which are hereby incorporated herein by reference in their entirety, provide many examples of information describing (or otherwise related to) user-selectable objects in television programming.

Also for example, such user-selectable object information may comprise information describing the object, where such information may be presented to the user upon user-selection of a user-selectable object. For example, such object information may comprise information describing physical characteristics of a user-selectable object, background information, historical information, general information of interest, location information, financial information, travel information, commerce information, personal information, etc.

Additionally for example, such user-selectable object information may comprise information describing and/or defining actions that may be taken upon user-selection of a user-selectable object. Non-limiting examples of such actions and/or related information corresponding to a respective user-selectable object will now be presented.

For example, such user-selectable object information may comprise information describing one or more manners of determining information to present to the user (e.g., retrieving such information from a known location, conducting a search for such information, etc.), establishing a communication session by which a user may interact with networked entities associated with a user-selected object, interacting with a user regarding display of a user-selected object and/or associated information, etc.

For example, such user-selectable object information may comprise information describing one or more manners of obtaining one or more sets of information, where such information may then, for example, be presented to the user. For example, such information may comprise a memory address (or data storage address) and/or a communication network address (e.g., an address of a networked data server, a URL, etc.), where such address may correspond to a location at which information corresponding to the identified object may be obtained. Such information may, for example, comprise a network address of a component with which a communication session may be initiated and/or conducted (e.g., to obtain information regarding the user-selected object, to interact with the user regarding the selected object, etc.).

In an exemplary scenario in which the user-selectable object information comprises information to present to a user upon user-selection of a selectable object in a television program, such information may comprise any of a variety of different types of information related to the user-selected object. For example and without limitation, such information may comprise information describing the user-selectable object (e.g., information describing aspects of the object, history of the object, design of the object, source of the object, price of the object, critiques of the object, information provided by commercial enterprises producing and/or providing such object, etc.), information indicating to the user how the user may obtain the selected object, information indicating how the user may utilize the selected object, etc. The information may, for example, comprise information of one or more non-commercial organizations associated with, and/or having information pertaining to, the identified user-selected object (e.g., non-profit and/or government organization contact information, web site address information, etc.).

In another exemplary scenario, the information corresponding to a user-selectable object in the television program may comprise information related to conducting a search for information corresponding to the user-selectable object. Such information may, for example, comprise network search terms that may be utilized in a search engine to search for information corresponding to the user-selected object. Such information may also comprise information describing the network boundaries of such a search, for example, identifying particular search networks, particular servers, particular addresses, particular databases, etc.

In an exemplary scenario, the information corresponding to a user-selectable object may describe a manner in which a system is to interact with a user to more clearly identify information desired by the user. For example, such information may comprise information specifying user interaction that should take place when the amount of information available and corresponding to a user-selectable object exceeds a particular threshold. Such user interaction may, for example, help to reduce the amount of information that may ultimately be presented to the user. For example, such information may comprise information describing a user interface that provides a list (or menu) of types of information available to the user and solicits a selection of one or more of the listed types of information from the user.
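
As a non-limiting, purely illustrative sketch (the threshold value, the text-menu interaction, and the function name below are assumptions and not part of the present disclosure), such threshold-triggered user interaction might, for example, be expressed as follows:

    # Illustrative sketch only; the threshold and the text-menu interaction are assumed.
    def present_object_information(categories, max_items=5):
        """Present object information directly if small; otherwise ask the user to narrow it."""
        items = [item for cat in categories.values() for item in cat]
        if len(items) <= max_items:
            return items
        print("Several kinds of information are available for this object:")
        for number, name in enumerate(categories, start=1):
            print(f"  {number}. {name}")
        choice = int(input("Select a category number: "))
        selected_name = list(categories)[choice - 1]
        return categories[selected_name]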

In yet another exemplary scenario, in which an action associated with a user-selectable object comprises the establishment and/or management of a communication session between the user and one or more networked entities, the user-selectable object information may comprise information describing the manner in which a communication session may be established and/or managed.

In still another exemplary scenario, in which an action associated with a user-selectable object comprises providing a user interface by which a user may initiate and perform a commercial transaction regarding a user-selectable object, the user-selectable object information may comprise information describing the manner in which the commercial transaction is to be performed (e.g., order forms, financial information exchange, order tracking, etc.).

As shown above, various user-selectable objects (or types of objects) may, for example, be associated with any of a variety of respective actions that may be taken upon selection of a respective user-selectable object by a user. Such actions (e.g., information retrieval, information searching, communication session management, commercial transaction management, etc.) may, for example, be included in a table or other data structure indexed by the identity of a respective user-selectable object.
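
For example, and purely as a hypothetical illustration (the object identifiers, action types, and addresses shown are assumptions made for this sketch), such an action table might be realized as a simple mapping from object identity to an action descriptor:

    # Illustrative sketch only; object identifiers, action types, and addresses are assumed.
    ACTION_TABLE = {
        "car-01":      {"action": "retrieve", "source": "http://example.com/info/car-01"},
        "athlete-07":  {"action": "search",   "terms": ["athlete statistics", "career history"]},
        "landmark-03": {"action": "session",  "address": "info-server.example.com:5000"},
    }

    def action_for(object_id):
        """Look up the action associated with a user-selected object identity."""
        return ACTION_TABLE.get(object_id, {"action": "retrieve", "source": None})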

Other non-limiting examples of object information corresponding to user-selectable objects in a television program may comprise: athlete information (e.g., statistics, personal information, professional information, history, etc.), entertainer information (e.g., personal information, discography and/or filmography information, information of related organizations, fan club information, photograph and/or video information, etc.), landmark information (e.g., historical information, visitation information, location information, mapping information, photo album information, visitation diary, charitable donation information, etc.), political figure information (e.g., party affiliation, stances on particular issues, history, financial information, voting record, attendance record, etc.), information regarding general types of objects (e.g., information describing actions to take upon user-selection of a person object, of a consumer good object, of a landmark object, etc.) and/or specific objects (e.g., information describing actions to take when a particular person object is selected, when a particular consumer good object is selected, when a particular landmark object is selected, etc.).

For additional non-limiting examples of actions that may be performed related to user selectable objects in television programming, and related user-selectable object information that may be combined with television program moving picture information, the reader is directed to U.S. patent application Ser. No. 12/880,530, filed concurrently herewith, titled “SYSTEM AND METHOD IN A DISTRIBUTED SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,594, filed concurrently herewith, titled “SYSTEM AND METHOD IN A LOCAL TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,668, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM BASED ON USER LOCATION”, U.S. patent application Ser. No. 12/881,067, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/881,096, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR PRESENTING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,749, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION SYSTEM FOR RESPONDING TO USER-SELECTION OF AN OBJECT IN A TELEVISION PROGRAM UTILIZING AN ALTERNATIVE COMMUNICATION NETWORK”; U.S. patent application Ser. No. 12/880,851, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING ADVERTISING INFORMATION ASSOCIATED WITH A USER-SELECTED OBJECT IN A TELEVISION PROGRAM”; U.S. patent application Ser. No. 12/880,888, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED PERSON IN A TELEVISION PROGRAM”; and U.S. patent application Ser. No. 12/881,110, filed concurrently herewith, titled “SYSTEM AND METHOD IN A TELEVISION FOR PROVIDING INFORMATION ASSOCIATED WITH A USER-SELECTED INFORMATION ELEMENT IN A TELEVISION PROGRAM”. The entire contents of each of such applications are hereby incorporated herein by reference in their entirety.

In general, the above-mentioned types of information corresponding to user-selectable objects in television programming may be general to all eventual viewers of the television program, but may also be customized to a particular target user and/or end user. For example, such information may be customized to a particular user (e.g., based on income level, demographics, age, employment status and/or type, education level and/or type, family characteristics, religion, purchasing history, neighborhood characteristics, home characteristics, health characteristics, etc.). Also for example, such information may be customized to a particular geographical location or region.
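
As a non-limiting illustrative sketch (the region codes and the per-region variant structure shown are assumptions), such region-based customization might, for example, amount to selecting among per-region variants of the object information:

    # Illustrative sketch only; region codes and the per-region variant structure are assumed.
    def customize_object_info(variants, region, default="generic"):
        """Select the object-information variant appropriate to the viewer's region."""
        return variants.get(region, variants[default])

    info = customize_object_info(
        {"generic": {"text": "General landmark information"},
         "US":      {"text": "Landmark information with U.S. visitation details"}},
        region="US")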

In general, step 220 may comprise receiving object information corresponding to a user-selectable object in the television program. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of such user-selectable object information or by any particular manner of receiving such user-selectable object information unless explicitly claimed.

The exemplary method 200 may, at step 230, comprise combining the received moving picture information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Many non-limiting examples of such combining will now be provided.

As mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving a completed moving picture data set for the television program, where the completed moving picture data set is formatted for communicating the television program without information describing user-selectable objects in the television program. In such an exemplary scenario, step 230 may comprise combining the received moving picture information and the received user-selectable object information by, at least in part, inserting the received user-selectable object information in the completed moving picture data set to create a combined data set comprising the received moving picture data set and the received user-selectable object information.

For example, in an exemplary scenario in which the received completed moving picture data set, as received, is formatted in accordance with a moving picture standard (e.g., an MPEG standard), step 230 may comprise inserting the received user-selectable object information in data fields of the completed moving picture data set that are not assigned by the moving picture standard for any specific type of information (e.g., inserting such information into unassigned data fields provided by the moving picture standard, adding new data fields to the moving picture standard, etc.).

Such inserting may, for example, comprise inserting the received user-selectable object information in data fields of the completed moving picture data set that are interleaved with data fields carrying moving picture data. For example, such inserting may be performed in accordance with a format alternating moving picture data and user-selectable object information on a frame-by-frame basis (e.g., sequencing frame 1 moving picture data, frame 1 user-selectable object information, frame 2 moving picture data, frame 2 user-selectable object information, etc.), by groups of frames (e.g., frames 1-A moving picture data, frames 1-A user-selectable object information, frames A-N moving picture data, frames A-N user-selectable object information, etc.), by sub-frames, etc. Also for example, utilizing time information, user-selectable object information need not be strictly placed with the moving picture data for the frame(s) in which the user-selectable object appears. For example, information of user-selectable objects in frame N+1 may be communicated with frame N moving picture information.
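
A minimal, non-limiting sketch of such frame-by-frame interleaving is shown below; the simple type-byte and length-prefix container used here is an assumption made for illustration and does not correspond to any particular moving picture standard.

    # Illustrative sketch only; the type-byte/length-prefix container layout is assumed
    # and does not correspond to any particular moving picture standard.
    import struct

    PICTURE, OBJECT_INFO = 0x01, 0x02

    def interleave(frames, object_info_per_frame):
        """Alternate moving picture data and user-selectable object information frame by frame."""
        out = bytearray()
        for picture, object_info in zip(frames, object_info_per_frame):
            for kind, payload in ((PICTURE, picture), (OBJECT_INFO, object_info)):
                out += struct.pack(">BI", kind, len(payload)) + payload
        return bytes(out)

    stream = interleave([b"frame-1-pixels", b"frame-2-pixels"],
                        [b'{"objects": []}', b'{"objects": ["car-01"]}'])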

Also for example, in another exemplary scenario in which the received completed moving picture data set, as received, is formatted in accordance with a moving picture data standard that specifically assigns data fields to information of user-selectable objects, step 230 may comprise inserting the received user-selectable object information in the data fields of the completed moving picture data set that are specifically assigned by the moving picture standard to contain information of user-selectable objects.

Also as mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. For example, such a scenario may comprise receiving information describing the television program moving picture that has yet to be formatted into a data set that conforms to a particular moving picture standard (e.g., bitmap information, still frame information, movement vector information, etc., which has yet to be placed into a self-contained MPEG data set for communicating the television program). In such an exemplary scenario, step 230 may comprise combining the received moving picture information and the received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program (e.g., into a single cohesive data set, for example, a single data file or other data structure, into a plurality of logically linked data files or other data structures, etc.).

In an exemplary scenario, such a completed moving picture data set may be formatted in accordance with a moving picture standard that specifically assigns respective data fields (or elements) to moving picture information and user-selectable object information. In another exemplary scenario, such a completed moving picture data set may be formatted in accordance with a moving picture standard that specifically assigns data fields to moving picture information, but does not specifically assign data fields to user-selectable object information (e.g., utilizing general-purpose unassigned data fields, adding new data fields to the standard, etc.).

Also as mentioned previously, step 210 may comprise receiving moving picture information for a television program by, at least in part, receiving an initial combined television program data set that comprises initial moving picture information and initial user-selectable object information corresponding to user-selectable objects in the television program. For example, prior to being received, the received initial combined television program data set may have already been formed into a single cohesive data set that comprises the moving picture information for the television program and information of user-selectable objects in the television program.

In such an exemplary scenario, step 230 may comprise modifying the initial user-selectable object information of the initial combined television program data set in accordance with the received user-selectable object information (e.g., as received at step 220). Such modifying may, for example and without limitation, comprise adding the received object information to the initial object information in the initial combined television program data set (e.g., in unused unassigned data fields and/or in unused data fields that have been specifically assigned to contain user-selectable object information, etc.).

Also such modifying may comprise changing at least a portion of the initial object information of the initial combined television program data set in accordance with the received user-selectable object information (e.g., changing information defining a user-selectable object in a presented television program, changing information about a user-selectable object to be presented to a user, changing information regarding any action that may be performed upon user-selection of a user-selectable object, etc.). Additionally, such modifying may comprise deleting at least a portion of the initial object information in accordance with the received user-selectable object information (e.g., in a scenario in which the received user-selectable object information includes a command or directive to remove a portion or all information corresponding to a particular user-selectable object).
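
By way of non-limiting illustration (representing the initial combined television program data set as a dictionary is an assumption made only for this sketch), such adding, changing, and deleting of object information might proceed as follows:

    # Illustrative sketch only; representing the combined data set as a dictionary is assumed.
    def apply_object_updates(combined, updates):
        """Add, change, or delete user-selectable object entries in a combined data set."""
        objects = combined.setdefault("objects", {})
        for object_id, update in updates.items():
            if update is None:            # directive to remove this object's information
                objects.pop(object_id, None)
            elif object_id in objects:    # change existing object information
                objects[object_id].update(update)
            else:                         # add new object information
                objects[object_id] = dict(update)
        return combined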

In the previously provided examples of combining the received moving picture information and the received user-selectable object information, step 230 may comprise performing such operations automatically (i.e., without real-time interaction with a user while such operations are being performed), but such operations may also be performed with user interaction. For example, the received moving picture information and the received user-selectable object information may each be time-stamped to assist in merging such information. For example, step 230 may comprise analyzing such respective time-stamps to determine the location in a serial stream of moving picture information at which the user-selectable object information is to be inserted. For example, the user-selectable object information for a particular user-selectable object may comprise information of the time and/or frame numbers at which the user-selectable object appears in the television program. Such information may be utilized at step 230 to determine the appropriate location in the moving picture data set at which to place the user-selectable object information.
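
For example, under the illustrative assumption that both the moving picture frames and the user-selectable object information carry comparable timestamps, the insertion location might be found with a simple binary search, as in the following non-limiting sketch:

    # Illustrative sketch only; timestamp fields and units are assumed.
    import bisect

    def insertion_index(frame_timestamps, object_info_timestamp):
        """Return the position in the serial moving picture stream at which object
        information with the given timestamp should be inserted."""
        return bisect.bisect_left(frame_timestamps, object_info_timestamp)

    # Object information stamped 2.51 s is placed at index 4, just before the frame at 2.533 s.
    index = insertion_index([0.0, 0.033, 0.067, 2.5, 2.533], 2.51)
    assert index == 4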

In another example, step 230 may comprise presenting an operator with a view of the moving picture of a television program and a view of a user-selectable object in such moving picture for which information is being added to a combined dataset. Step 230 may then comprise interacting with the operator to obtain permission and/or directions for combining the moving picture and user-selectable object information.

Note that step 230 may comprise encrypting the user-selectable object information or otherwise restricting access to such information. Such information protection may be beneficial, for example, in a scenario in which access to such information is provided on a subscription basis, in a scenario in which providers of such information desire to protect such information from undesirable access and/or manipulation, etc.
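
As one non-limiting illustration of such protection, only the object-information payloads might be encrypted with a symmetric cipher while the moving picture data remains in the clear; the use of the third-party Python cryptography package and its Fernet scheme in the sketch below is an assumption for illustration, not a requirement of the present disclosure.

    # Illustrative sketch only; the third-party "cryptography" package and the Fernet
    # symmetric scheme are assumptions, not requirements of this disclosure.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # e.g., distributed only to authorized subscribers
    cipher = Fernet(key)

    def protect_object_info(object_info_bytes):
        """Encrypt user-selectable object information before it is combined with picture data."""
        return cipher.encrypt(object_info_bytes)

    def recover_object_info(token):
        """Decrypt user-selectable object information at an authorized recipient."""
        return cipher.decrypt(token)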

In general, step 230 may comprise combining the received moving picture information (e.g., as received at step 210) and the received user-selectable object information (e.g., as received at step 220) in a combined data set. Accordingly, the scope of various aspects of the present invention should not be limited by any particular manner of performing such combining and/or any particular format in which such a combined data set may be placed unless specifically claimed.

The exemplary method 200 may, at step 240, comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices. Such communication may comprise characteristics of any of a variety of types of communication, non-limiting examples of which will now be presented.

Step 240 may, for example, comprise communicating the combined data set(s) via a communication network (e.g., a television communication network, a telecommunication network, a general data communication network (e.g., the Internet, a LAN, etc.), etc.). Many non-limiting examples of such communication networks were provided previously. Step 240 may, for example, comprise broadcasting, multi-casting and/or uni-casting the combined data set over one or more communication networks. Step 240 may also, for example, comprise communicating the combined data set(s) to another system and/or device via a direct conductive path (e.g., via a wire, circuit board trace, conductive trace on a die, etc.).

Additionally for example, step 240 may comprise storing the combined data set(s) on a computer readable medium (e.g., a DVD, a CD, a Blu-ray® disc, a laser disc, a magnetic tape, a hard drive, a diskette, etc.). Such a computer readable medium may then, for example, be shipped to a distributor and/or ultimate recipient of the computer readable medium. Further for example, step 240 may comprise storing the combined data set(s) in a volatile and/or non-volatile memory device (e.g., a flash memory device, a one-time-programmable memory device, an EEPROM, a RAM, etc.).

Further for example, step 240 may comprise storing (or causing or otherwise participating in the storage of) the combined data set(s) in a television system component (e.g., a component or device of the user's local television system and/or a component or device of a television program provider and/or a component or device of any television program source). For example and without limitation, step 240 may comprise storing the combined data set(s), or otherwise participating in the storage of the combined data set(s), in a component of the user's local television system (e.g., in a digital video recorder, a television receiver, a television, a television controller, a personal communication device, a local networked database, a local networked personal computer, etc.).

Step 240 may, for example, comprise communicating the combined data set in serial fashion. For example, step 240 may comprise communicating the combined data set (comprising interleaved moving picture information and user-selectable object information) in a single data stream (e.g., via a television network, via a general data network, stored on a hard medium in such serial fashion, etc.). Also for example, step 240 may comprise communicating the combined data set in parallel data streams, each of which comprises interleaved moving picture information and user-selectable object information (e.g., as opposed to separate distinct respective data streams for each of moving picture information and user-selectable object information).
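
A non-limiting sketch of distributing one interleaved combined data set across a chosen number of parallel streams, each still carrying both moving picture data and user-selectable object information, is shown below; the round-robin assignment of records to streams is an assumption made for illustration.

    # Illustrative sketch only; the round-robin assignment of records to streams is assumed.
    def split_into_parallel_streams(interleaved_records, stream_count=2):
        """Distribute interleaved (picture, object-info) record pairs across parallel streams
        so that every stream still carries both moving picture data and object information."""
        streams = [[] for _ in range(stream_count)]
        for position, record_pair in enumerate(interleaved_records):
            streams[position % stream_count].append(record_pair)
        return streams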

In general, step 240 may comprise communicating the combined data set(s) (e.g., as formed at step 230) to one or more recipient systems or devices (e.g., an end user or associated system, television programming provider or associated system, an advertiser or associated system, a television program producer or associated system, a television program database, a television program server, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such communicating or by any particular recipient of such communication unless explicitly claimed.

The exemplary method 200 may, for example at step 295, comprise performing continued operations. Step 295 may comprise performing any of a variety of continued operations, non-limiting examples of which will be presented below. For example, step 295 may comprise returning execution flow to any of the previously discussed method steps. For example, step 295 may comprise returning execution flow of the exemplary method 200 to step 220 for receiving additional user-selectable object information to combine with television program information. Also for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 210 for receiving additional television program moving picture information and user-selectable object information to combine with such received television program information. Additionally for example, step 295 may comprise returning execution flow of the exemplary method 200 to step 240 for additional communication of the combined information to additional recipients.

In general, step 295 may comprise performing continued operations (e.g., performing additional operations corresponding to combining television program information and information of user-selectable objects in such programming, etc.). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular type of continued processing unless explicitly claimed.

Turning next to FIG. 3, such figure is a flow diagram illustrating an exemplary method 300 for providing embedded information of selectable objects in a television program, in accordance with various aspects of the present invention. The exemplary method 300 may, for example, share any or all characteristics with the exemplary method 200 illustrated in FIG. 2 and discussed previously. Any or all aspects of the exemplary method 300 may, for example, be implemented in a television system component (e.g., the television provider 110, third party program information provider 120, a component of a communication network 130, first television 140, first television controller 160, second television 141, television receiver 151, second television controller 161, shown in FIG. 1 and discussed previously) and/or a plurality of such television system components operating in conjunction. For example, any or all aspects of the exemplary method 300 may be implemented in one or more television system components remote from the user's local television system. Also for example, any or all aspects of the exemplary method 300 may be implemented in one or more components of the user's local television system.

The exemplary method 300 may, for example, begin executing at step 305. The exemplary method 300 may begin executing in response to any of a variety of causes or conditions. Step 305 may, for example, share any or all characteristics with step 205 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

The exemplary method 300 may, for example at step 310, comprise receiving moving picture information for a television program. Step 310 may, for example, share any or all characteristics with step 210 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 310 may comprise receiving any of the various types of moving picture information from any of the various sources of moving picture information via any of the various communication media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.

For example, step 310 may comprise, for example at sub-step 312, receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program without information describing user-selectable objects in the television program. Alternatively for example, step 310 may comprise, for example at sub-step 314, receiving moving picture information for the television program prior to the moving picture information being formatted into a completed moving picture data set for communicating the television program. Alternatively for example, step 310 may comprise, for example at sub-step 316, receiving a completed moving picture data set for the television program, the completed moving picture data set formatted for communicating the television program with information describing user-selectable objects in the television program.

The exemplary method 300 may, for example at step 320, comprise receiving object information corresponding to a user-selectable object in the television program. Step 320 may, for example, share any or all characteristics with step 220 of the exemplary method 200 illustrated in FIG. 2 and discussed previously. For example, step 320 may comprise receiving any of the various types of user-selectable object information from any of the various sources of user-selectable object information via any of the various types of media discussed previously with regard to the method 200 of FIG. 2 and the system 100 of FIG. 1 and elsewhere herein.

For example, step 320 may comprise, for example at sub-step 322, receiving user-selectable object information comprising information describing and/or defining the user-selectable object that is shown in the television program (e.g., object dimension information, object movement information, etc.). Also for example, step 320 may comprise, for example at sub-step 324, receiving user-selectable object information comprising information regarding the user-selectable object that may be presented to the user upon user-selection of such object in a television program.

Additionally for example, step 320 may comprise, for example at sub-step 326, receiving user-selectable object information comprising information describing and/or defining actions that may be taken upon user-selection of a user-selectable object (e.g., retrieving and/or obtaining and/or searching for information about a user-selectable object, information specifying a manner in which a system is to interact with a user regarding a user-selected object, establishing and/or maintaining communication sessions, information describing the manner in which a commercial transaction is to be performed, etc.).

The exemplary method 300 may, for example at step 330, comprise combining the received moving picture information (e.g., as received at step 310) and the received user-selectable object information (e.g., as received at step 320) in a combined data set. Step 330 may, for example, share any or all characteristics with step 230 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

For example, step 330 may comprise, for example at sub-step 332, inserting the received user-selectable object information in a completed moving picture data set that was received at step 310 (e.g., inserting such user-selectable object information in fields of the moving picture data set that are specified by a standard for carrying such user-selectable object information, inserting such user-selectable object information in fields of the moving picture data set that are not specifically allocated for a particular type of data, etc.).

Also for example, step 330 may comprise, for example at sub-step 334, combining received moving picture data and received user-selectable object information into a completed moving picture data set that is formatted for communicating the television program with information describing user-selectable objects in the television program. Additionally for example, step 330 may comprise, for example at sub-step 336, modifying initial user-selectable object information of an initial combined television program data set in accordance with received user-selectable object information.

The exemplary method 300 may, for example at step 340, comprise communicating the combined data set(s) (e.g., as formed at step 330) to one or more recipient systems or devices. Step 340 may, for example, share any or all characteristics with step 240 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

For example, step 340 may comprise, for example at sub-step 342, communicating the combined data set(s) via a communication network (e.g., any of a variety of communication networks discussed herein, etc.). Also for example, step 340 may comprise, for example, at sub-step 344, communicating the combined data set(s) by storing the combined data set(s) on a computer readable medium and/or by transmitting the combined data set(s) to another device or system to perform such storage. Additionally for example, step 340 may comprise, for example, at sub-step 346, communicating the combined data set in a single serial stream (e.g., comprising interleaved moving picture data and user-selectable object information). Further for example, step 340 may comprise, for example, at sub-step 348, communicating the combined data set in a plurality of parallel serial streams (e.g., each of such streams comprising interleaved moving picture data and user-selectable object information).

The exemplary method 300 may, for example at step 395, comprise performing continued operations. Step 395 may, for example, share any or all characteristics with step 295 of the exemplary method 200 illustrated in FIG. 2 and discussed previously.

Turning next to FIG. 4, such figure is a diagram illustrating an exemplary television system (e.g., single television system component and/or plurality of television system components) 400, in accordance with various aspects of the present invention. The exemplary television system 400 may, for example, share any or all characteristics with one or more of the television system components illustrated in FIG. 1 and discussed previously. For example, the exemplary television system 400 may correspond to any of the television system components illustrated in FIG. 1 (or the like) or any group of the television system components illustrated in FIG. 1 (or the like). Also, the exemplary television system 400 may comprise characteristics of a computing system (e.g., a personal computer, a mainframe computer, a digital signal processor, etc.). The exemplary television system 400 (e.g., various modules thereof) may operate to perform any or all of the functionality discussed previously with regard to the exemplary methods 200 and 300 illustrated in FIGS. 2-3 and discussed previously.

The exemplary television system 400 includes a first communication interface module 410. The first communication interface module 410 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the first communication interface module 410 is illustrated coupled to a wireless RF antenna via a wireless port 412, the wireless medium is merely illustrative and non-limiting. The first communication interface module 410 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television-related information (e.g., moving picture information, information of user-selectable objects, television programming with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the first communication interface module 410 may operate to communicate with local sources of television-related content or other data (e.g., disc drives, computer-readable medium readers, video recorders, video cameras, computers, receivers, etc.). Additionally, for example, the first communication interface module 410 may operate to communicate with a remote controller (e.g., directly or via one or more intermediate communication networks).

The exemplary television system 400 includes a second communication interface module 420. The second communication interface module 420 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the second communication interface module 420 may communicate via a wireless RF communication port 422 and antenna, or may communicate via a non-tethered optical communication port 424 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the second communication interface module 420 may communicate via a tethered optical communication port 426 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 428 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The second communication interface module 420 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, general data communication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television-related information (e.g., moving picture information, information of user-selectable objects, television programming with and without embedded information of user-selectable objects) and/or other data is communicated. Also for example, the second communication module 420 may operate to communicate with local sources of television-related information (e.g., disc drives, computer-readable medium readers, video recorders, video cameras, computers, receivers, etc.). Additionally, for example, the second communication module 420 may operate to communicate with a remote controller (e.g., directly or via one or more intervening communication networks).

The exemplary television system 400 may also comprise additional communication interface modules, which are not illustrated (some of which may also be shown in FIG. 5). Such additional communication interface modules may, for example, share any or all aspects with the first 410 and second 420 communication interface modules discussed above.

The exemplary television system 400 may also comprise a communication module 430. The communication module 430 may, for example, operate to control and/or coordinate operation of the first communication interface module 410 and the second communication interface module 420 (and/or additional communication interface modules as needed). The communication module 430 may, for example, provide a convenient communication interface by which other components of the television system 400 may utilize the first 410 and second 420 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 430 may coordinate communications to reduce collisions and/or other interference between the communication interface modules.

The exemplary television system 400 may additionally comprise one or more user interface modules 440. The user interface module 440 may generally operate to provide user interface functionality to a user of the television system 400. For example, and without limitation, the user interface module 440 may operate to provide for user control of any or all standard television system commands (e.g., channel control, volume control, on/off, screen settings, input selection, etc.). The user interface module 440 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television system (e.g., buttons, etc.) and may also utilize the communication module 430 (and/or first 410 and second 420 communication interface modules) to communicate with other systems and/or components thereof, regarding television-related information, regarding user interaction that occurs during the formation of combined dataset(s), etc. (e.g., a television system controller (e.g., a dedicated television system remote control, a universal remote control, a cellular telephone, personal computing device, gaming controller, etc.)). In various exemplary scenarios, the user interface module(s) 440 may operate to utilize the optional display 470 to communicate with a user regarding user-selectable object information and/or to present television programming to a user.

The user interface module 440 may also comprise one or more sensor modules that operate to interface with and/or control operation of any of a variety of sensors that may be utilized during the formation of the combined data set(s). For example, the one or more sensor modules may be utilized to ascertain an on-screen pointing location, which may for example be utilized to input and/or receive user-selectable object information (e.g., to indicate and/or define user-selectable objects in a moving picture). For example and without limitation, the user interface module 440 (or sensor module(s) thereof) may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices, via the communication interface modules 410, 420, etc.). Also for example, in scenarios in which such sensors are active sensors (as opposed to purely passive sensors), the user interface module 440 (or sensor module(s) thereof) may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors. Additionally, the user interface module 440 may perform any of a variety of video output functions (e.g., presenting moving picture information to a user, presenting user-selectable object information to a user, presenting television programming to a user, providing visual feedback to a user regarding an identified user-selected object in a presented moving picture, etc.).

The exemplary television system 400 may comprise one or more processors 450. The processor 450 may, for example, comprise a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, the processor 450 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules in FIG. 4, such illustrative modules, or a portion thereof, may be implemented by the processor 450.

The exemplary television system 400 may comprise one or more memories 460. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 460. Such memory 460 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 460 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc.

The exemplary television system 400 may comprise one or more modules 452 (e.g., moving picture information receiving module(s)) that operate to receive moving picture information for a television program. Such one or more modules 452 may, for example, operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to receive such television program moving picture information. For example, such one or more modules 452 may operate to perform step 210 of the exemplary method 200 discussed previously and/or step 310 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more module(s) 454 (e.g., user-selectable object information receiving module(s)) that operate to receive object information corresponding to one or more user-selectable objects in a television program. Such one or more modules 454 may, for example, operate to utilize the communication module 430 (e.g., and at least one of the communication interface modules 410, 420) to receive such television program user-selectable object information. For example, such one or more modules 454 may operate to perform step 220 of the exemplary method 200 discussed previously and/or step 320 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more modules 456 (e.g., moving picture and user-selectable object combining module(s)) that operate to combine received moving picture information (e.g., as received by the module(s) 452) and received user-selectable object information (e.g., as received by the module(s) 454) into a combined data set. Such one or more modules 456 may, for example, operate to receive moving picture information from the module(s) 452, receive user-selectable object information from the module(s) 454, combine such received moving picture information and user-selectable object information into a combined data set, and output such combined data set. Such one or more modules 456 may operate to perform step 230 of the exemplary method 200 discussed previously and/or step 330 of the exemplary method 300 discussed previously.

The exemplary television system 400 may comprise one or more modules 458 (e.g., combined data set communication module(s)) that operate to communicate the combined data set to at least one recipient system and/or device. For example, such module(s) 458 may operate to utilize the communication module(s) 430 (and, for example, one or both of the first communication interface module(s) 410 and second communication interface module(s) 420) to communicate the combined data set. Also for example, such module(s) 458 may operate to communicate the combined data set to one or more system devices that store the combined data set on a physical medium (e.g., a computer-readable medium). Such one or more modules 458 may operate to perform step 240 of the exemplary method 200 discussed previously and/or step 340 of the exemplary method 300 discussed previously.

Though not illustrated, the exemplary television system 400 may, for example, comprise one or more modules that operate to perform any or all of the continued processing discussed previously with regard to step 295 of the exemplary method 200 and step 395 of the exemplary method 300. Such modules (e.g., as with the one or more modules 452, 454, 456 and 458) may, for example, be implemented by the processor(s) 450 executing instructions stored in the memory 460.

Turning next to FIG. 5, such figure is a diagram illustrating exemplary modules and/or sub-modules for a television system 500, in accordance with various aspects of the present invention. The exemplary television system 500 may share any or all aspects with the television system 400 illustrated in FIG. 4 and discussed previously. For example, the exemplary television system 500 may share any or all characteristics with one or more of the television system components illustrated in FIG. 1 and discussed previously. For example, the exemplary television system 500 may correspond to any of the television system components illustrated in FIG. 1 (or the like) or any group of the television system components illustrated in FIG. 1 (or the like). For example, the exemplary television system 500 (or various modules thereof) may operate to perform any or all functionality discussed herein with regard to the exemplary method 200 illustrated in FIG. 2 and the exemplary method 300 illustrated in FIG. 3.

For example, the television system 500 comprises a processor 530. Such a processor 530 may, for example, share any or all characteristics with the processor 450 discussed with regard to FIG. 4. Also for example, the television system 500 comprises a memory 540. Such memory 540 may, for example, share any or all characteristics with the memory 460 discussed with regard to FIG. 4.

Also for example, the television system 500 may comprise any of a variety of user interface module(s) 550. Such user interface module(s) 550 may, for example, share any or all characteristics with the user interface module(s) 440 discussed previously with regard to FIG. 4. For example and without limitation, the user interface module(s) 550 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen), a vibrating mechanism, a keypad, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).

The exemplary television system 500 may also, for example, comprise any of a variety of communication modules (505, 506, and 510). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 410, 420 discussed previously with regard to FIG. 4. For example and without limitation, the communication interface module(s) 510 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, FireWire, RS-232, HDMI, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplary television system 500 is also illustrated as comprising various wired 506 and/or wireless 505 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby.

The exemplary television system 500 may also comprise any of a variety of signal processing module(s) 590. Such signal processing module(s) 590 may share any or all characteristics with modules of the exemplary television system 400 that perform signal processing. Such signal processing module(s) 590 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 590 may comprise: video/graphics processing modules (e.g. MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., Keypad I/O, touch screen processing, motor control, etc.).

In summary, various aspects of the present invention provide a system and method for providing information of selectable objects in a television program. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

US200201364329 nov. 200126 sept. 2002Hiroyuki KoikeMethod and apparatus for processing information of an object
US2002016212025 avr. 200131 oct. 2002Slade MitchellApparatus and method to provide supplemental content from an interactive television system to a remote device
US2003000544516 mars 20012 janv. 2003Schein Steven M.Systems and methods for linking television viewers with advertisers and broadcasters
US2003002398125 juil. 200130 janv. 2003Thomas LemmonsMethod and apparatus for transmission of interactive and enhanced television data
US200300288732 août 20026 févr. 2003Thomas LemmonsPost production visual alterations
US2003003507520 août 200120 févr. 2003Butler Michelle A.Method and system for providing improved user input capability for interactive television
US2003005125316 août 200213 mars 2003Barone Samuel T.Interactive television tracking system
US2003005487820 sept. 200120 mars 2003International Game TechnologyPoint of play registration on a gaming machine
US2003007922422 oct. 200124 avr. 2003Anton KomarSystem and method to provide additional information associated with selectable display areas
US2003011560231 janv. 200319 juin 2003Knee Robert AlanElectronic television program guide schedule system and method with data feed access
US2003014532631 janv. 200231 juil. 2003Koninklijke Philips Electronics N.V.Subscription to TV channels/shows based on recommendation generated by a TV recommender
US2003021299613 avr. 200113 nov. 2003Wolzien Thomas R.System for interconnection of audio program data transmitted by radio to remote vehicle or individual with GPS location
US20030217360 *17 juin 200320 nov. 2003Gordon Donald F.System for generating, distributing and receiving an interactive user interface
US2003023675219 juin 200225 déc. 2003Eastman Kodak CompanyMethod and system for selling goods and/or services over a communication network between multiple users
US2004000341227 juin 20021 janv. 2004Digeo, Inc.Method and apparatus for secure transactions in an interactive television ticker
US2004007881429 mars 200222 avr. 2004Digeo, Inc.Module-based interactive television ticker
US2004010908721 mars 200310 juin 2004Maryse RobinsonMethod and apparatus for digital shopping
US2004011970119 déc. 200224 juin 2004Mulligan Roger C.Lattice touch-sensing system
US2004016785524 sept. 200326 août 2004Cambridge Vivien JohanAutomatic billing system for remote internet services
US2004022102529 avr. 20034 nov. 2004Johnson Ted C.Apparatus and method for monitoring computer networks
US2004023686522 janv. 200425 nov. 2004Actv, Inc.Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
US2004026840130 juin 200330 déc. 2004Gray James HaroldSystem and method for providing interactive media content over a network
US2005002820826 août 20043 févr. 2005United Video Properties, Inc.Interactive television program guide with remote access
US2005008669016 oct. 200321 avr. 2005International Business Machines CorporationInteractive, non-intrusive television advertising
US2005013242010 déc. 200416 juin 2005Quadrock Communications, IncSystem and method for interaction with television content
US2005013795823 déc. 200323 juin 2005Thomas HuberAdvertising methods for advertising time slots and embedded objects
US2005013866819 déc. 200323 juin 2005Bellsouth Intellectual Property CorporationSystem and method for enhanced hot key delivery
US2005015368717 mars 200414 juil. 2005Nokia CorporationProviding location information
US2005017786131 mars 200311 août 2005Matsushita Electric Industrial Co., LtdAsynchronous integration of portable handheld device
US200501934251 mars 20051 sept. 2005Sanghoon SullDelivery and presentation of content-relevant information associated with frames of audio-visual programs
US2005022922712 oct. 200413 oct. 2005Evenhere, Inc.Aggregation of retailers for televised media programming product placement
US2005023478214 avr. 200420 oct. 2005Schackne Raney JClothing and model image generation, combination, display, and selection
US200502518357 mai 200410 nov. 2005Microsoft CorporationStrategies for pausing and resuming the presentation of programs
US2005026254212 août 200424 nov. 2005United Video Properties, Inc.Television chat system
US2006003704414 oct. 200516 févr. 2006Microsoft CorporationPausing television programming in response to selection of hypertext link
US200600647342 déc. 200323 mars 2006Yue MaPortable device for viewing real-time synchronized information from broadcasting sources
US200600999645 nov. 200411 mai 2006Ebay Inc.System and method for location based content correlation
US2006015248921 juil. 200513 juil. 2006John SweetserHandheld vision based absolute pointing system
US2006017427328 oct. 20053 août 2006Samsung Electronics Co., Ltd.Method of displaying service in DMB, and method and apparatus for managing preferred service
US200601958782 mai 200631 août 2006Lg Electronics Inc.Apparatus and method for providing and obtaining product information through a broadcast signal
US2006024186431 janv. 200626 oct. 2006Outland Research, LlcMethod and apparatus for point-and-send data transfer within an ubiquitous computing environment
US200602599302 déc. 200516 nov. 2006Rothschild Leigh MSystem and method for obtaining information on digital media content
US2006026889517 mai 200530 nov. 2006Kotzin Michael DLinking a mobile wireless communication device to a proximal consumer broadcast device
US2006028284710 juin 200514 déc. 2006Aniruddha GupteEnhanced media method and apparatus for use in digital distribution system
US2007009727515 sept. 20063 mai 2007Universal Electronics Inc.Two way communication using light links
US20070130581 *3 janv. 20017 juin 2007Del Sesto Eric EInteractive content delivery methods and apparatus
US2007013761121 déc. 200521 juin 2007Yu Robert CActive radical initiator for internal combustion engines
US2007015652129 déc. 20055 juil. 2007United Video Properties, Inc.Systems and methods for commerce in media program related merchandise
US2007015726029 déc. 20055 juil. 2007United Video Properties, Inc.Interactive media guidance system having multiple devices
US2007019520521 févr. 200623 août 2007Lowe Jerry BRemote control system and method
US2007019901422 févr. 200623 août 2007E-Cast, Inc.Consumer portal
US2007025090130 mars 200725 oct. 2007Mcintire John PMethod and apparatus for annotating media streams
US2007026107929 juin 20078 nov. 2007Lg Electronics Inc.Method and video device for accessing information
US200702664062 mai 200715 nov. 2007Murali AravamudanMethod and system for performing actions using a non-intrusive television with reduced text input
US200702772019 août 200729 nov. 2007Microsoft CorporationSystem and method to facilitate programming of an associated recording device
US2007030026324 juil. 200627 déc. 2007Barton James MMethod and apparatus for advertisement placement in a user dialog on a set-top box
US2008001652624 sept. 200717 janv. 2008Asmussen Michael LAdvanced Set Top Terminal Having A Program Pause Feature With Voice-to-Text Conversion
US2008005275012 juil. 200728 févr. 2008Anders Grunnet-JepsenDirect-point on-demand information exchanges
US2008006609712 oct. 200513 mars 2008Woodhyun ParkMethod Of Realizing Interactive Advertisement Under Digital Braodcasting Environment By Extending Program Associated Data-Broadcasting To Internet Area
US200800661298 nov. 200713 mars 2008Goldpocket Interactive, Inc.Method and Apparatus for Interaction with Hyperlinks in a Television Broadcast
US2008007175014 sept. 200720 mars 2008Nokia CorporationMethod, Apparatus and Computer Program Product for Providing Standard Real World to Virtual World Links
US2008008955116 oct. 200617 avr. 2008Ashley HeatherInteractive TV data track synchronization system and method
US2008010985123 oct. 20068 mai 2008Ashley HeatherMethod and system for providing interactive video
US200801321634 févr. 20085 juin 2008Optinetix (Israel) Ltd.Systems and methods for distributing information through broadcast media
US20080134342 *29 oct. 20075 juin 2008Shamoon Talal GMethods and Apparatus for Persistent Control and Protection of Content
US2008013675414 nov. 200712 juin 2008Sony CorporationDisplay apparatus, display-apparatus control method and program
US20080172693 *16 janv. 200717 juil. 2008Microsoft CorporationRepresenting Television Programs Using Video Objects
US2008017757017 juil. 200724 juil. 2008Ari CraineMethods, Systems, and Computer-Readable Media for Disease Management
US2008018413231 janv. 200731 juil. 2008Zato Thomas JMedia content tagging
US2008020460329 janv. 200828 août 2008Hideharu HattoriVideo displaying apparatus and video displaying method
US2008020460528 févr. 200728 août 2008Leonard TsaiSystems and methods for using a remote control unit to sense television characteristics
US2008020948019 déc. 200728 août 2008Eide Kurt SMethod for enhanced video programming system for integrating internet data for on-demand interactive retrieval
US200900062111 juil. 20081 janv. 2009Decisionmark Corp.Network Content And Advertisement Distribution System and Method
US200900214738 déc. 200322 janv. 2009Grant Danny AHaptic Communication Devices
US200900347843 août 20075 févr. 2009Mcquaide Jr Arnold ChesterMethods, systems, and products for indexing scenes in digital media
US2009003794730 juil. 20075 févr. 2009Yahoo! Inc.Textual and visual interactive advertisements in videos
US2009007739411 sept. 200819 mars 2009Jr-Shian TsaiTechniques for communications based power management
US2009008381519 sept. 200826 mars 2009Mcmaster OrlandoGenerating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time
US2009011347520 août 200830 avr. 2009Yi LiSystems and methods for integrating search capability in interactive video
US2009016504121 déc. 200725 juin 2009Penberthy John SSystem and Method for Providing Interactive Content with Video Content
US2009016504820 déc. 200725 juin 2009United Video Properties, Inc.Methods and devices for presenting guide listings and guidance data in three dimensions in an interactive media guidance application
US2009018786222 janv. 200823 juil. 2009Sony CorporationMethod and apparatus for the intuitive browsing of content
US2009019925926 janv. 20096 août 2009Rachad AlaoService gateway for interactive television
US2009021731726 févr. 200827 août 2009At&T Knowledge Ventures, L.P.System and method for promoting marketable items
US2009023531211 mars 200817 sept. 2009Amir MoradTargeted content with broadcast material
US2009023757212 juil. 200724 sept. 2009Kazuyuki KishimotoImage display device and image display method
US2009025681115 avr. 200815 oct. 2009Sony Ericsson Mobile Communications AbOptical touch screen
US2009027181531 mai 200629 oct. 2009Laura ContinMethod and Tv Receiver for Storing Contents Associated to Tv Programs
US2009029668627 mai 20083 déc. 2009At & T Delaware Intellectual Property, Inc.Methods, communications devices, and computer program products for selecting an advertisement to initiate device-to-device communications
US2009032789414 avr. 200931 déc. 2009Novafora, Inc.Systems and methods for remote control of interactive video
US201000054888 janv. 20097 janv. 2010Novafora, Inc.Contextual Advertising
US2010006432013 mars 200611 mars 2010Verizon Services Corp.Integrating data on program popularity into an on-screen program guide
US2010009734825 févr. 200922 avr. 2010Inha Industry Partnership InstituteTouch screen tool
US2010009807422 oct. 200822 avr. 2010Backchannelmedia Inc.Systems and methods for providing a network link between broadcast content and content located on a computer network
US2010015715219 nov. 200924 juin 2010Thomson LicensingDisplay device with feedback elements and method for monitoring
US20100162303 *23 déc. 200824 juin 2010Cassanova Jeffrey PSystem and method for selecting an object in a video data stream
US2010021822820 févr. 200926 août 2010Walter Edward ASystem and method for processing image objects in video data
US201002574486 avr. 20097 oct. 2010Interactical LlcObject-Based Interactive Programming Device and Method
US201100321914 août 200910 févr. 2011Cooke Benjamin TVideo system and remote control with touch interface for supplemental content display
US201100635235 août 201017 mars 2011Jeyhan KaraoguzSystem and method in a television controller for providing user-selection of objects in a television program
US20110066929 *13 sept. 201017 mars 2011Jeyhan KaraoguzSystem and method for providing information of selectable objects in a still image file and/or data stream
US2011006706213 sept. 201017 mars 2011Jeyhan KaraoguzSystem and method for providing information of selectable objects in a television program
US2011006706313 sept. 201017 mars 2011Jeyhan KaraoguzSystem and method in a television system for presenting information associated with a user-selected object in a televison program
US2011006706413 sept. 201017 mars 2011Jeyhan KaraoguzSystem and method in a television system for presenting information associated with a user-selected object in a television program
US201100670695 août 201017 mars 2011Jeyhan KaraoguzSystem and method in a parallel television system for providing for user-selection of an object in a television program
US2011014101314 déc. 200916 juin 2011Alcatel-Lucent Usa, IncorporatedUser-interface apparatus and method for user control
US2011017943531 mars 201121 juil. 2011Charles CordraySystems and methods for managing content
US201200795252 déc. 201129 mars 2012United Video Properties, Inc.Interactive television program guide with remote access
US2012015426816 déc. 201121 juin 2012Apple Inc.Remote control systems that can distinguish stray light sources
US2012016377629 févr. 201228 juin 2012United Video Properties, Inc.Interactive program guide with continuous data stream and client-server data supplementation
US201401016901 oct. 201310 avr. 2014Nant Holdings Ip, LlcImage Capture and Identification System and Process
CN1193869A8 janv. 199823 sept. 1998三星电子株式会社Method for selecting menu in television receiver
CN1300501A28 janv. 200020 juin 2001皇家菲利浦电子有限公司Method and apparatus for presenting a electronic performance progam
CN1329796A29 oct. 19992 janv. 2002联合视频制品公司Interactive program guide with continuous data stream and client-server data supplementation
WO1999004559A121 juil. 199828 janv. 1999Samsung Information Systems AmericaTv graphical user interface having cursor position indicator
WO2007137611A131 mai 20066 déc. 2007Telecom Italia S.P.A.Method and tv receiver for storing contents associated to tv programs
WO2009033500A114 sept. 200719 mars 2009Nec Europe Ltd.Method and system for optimizing network performances
Legal Events

Date: Nov. 2, 2010; Code: AS; Event: Assignment
Owner name: BROADCOM CORPORATION, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KARAOGUZ, JEYHAN; SESHADRI, NAMBIRAJAN; SIGNING DATES FROM 20100910 TO 20100913; REEL/FRAME: 025233/0186

Date: Feb. 11, 2016; Code: AS; Event: Assignment
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH
Free format text: PATENT SECURITY AGREEMENT; ASSIGNOR: BROADCOM CORPORATION; REEL/FRAME: 037806/0001
Effective date: 20160201

Date: Feb. 3, 2017; Code: AS; Event: Assignment
Owner name: BROADCOM CORPORATION, CALIFORNIA
Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS; ASSIGNOR: BANK OF AMERICA, N.A., AS COLLATERAL AGENT; REEL/FRAME: 041712/0001
Effective date: 20170119