US20140327677A1 - Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen
- Publication number
- US20140327677A1 (application No. US14/322,566)
- Authority: US (United States)
- Prior art keywords: screen, content, displayed, messages, social
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06Q50/40
- H04N21/41265: The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04L65/403: Arrangements for multi-party communication, e.g. for conferences
- H04L43/045: Processing captured monitoring data for graphical visualisation of monitoring data
- G06F16/95: Retrieval from the web
- G06F3/1423: Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
- G06Q30/02: Marketing; Price estimation or determination; Fundraising
- G06Q30/0255: Targeted advertisements based on user history
- G06T11/206: Drawing of charts or graphs
- G09G5/12: Synchronisation between the display unit and other units, e.g. other display units, video-disc players
- G11B27/32: Indexing; Addressing; Timing or synchronising by using information signals recorded on separate auxiliary tracks of the same or an auxiliary record carrier
- H04L43/10: Active monitoring, e.g. heartbeat, ping or trace-route
- H04L51/52: User-to-user messaging for supporting social networking services
- H04N21/4122: Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
- H04N21/42203: Input-only peripherals; sound input device, e.g. microphone
- H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42224: Touch pad or touch panel provided on the remote control
- H04N21/43079: Synchronising the rendering of additional data with content streams on multiple devices
- H04N21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device or monitoring the downstream bandwidth
- H04N21/4788: Supplemental services communicating with other users, e.g. chatting
- H04N21/8133: Additional data specifically related to the content, e.g. biography of the actors in a movie
- H04N21/845: Structuring of content, e.g. decomposing content into time segments
- H04N21/8547: Content authoring involving timestamps for synchronizing content
- H04N9/87: Regeneration of colour television signals
- G06T2215/16: Using real world measurements to influence rendering
- G09G2370/06: Consumer Electronics Control, i.e. control of another device by a display or vice versa
- G09G2370/16: Use of wireless transmission of display information
Definitions
- the present invention generally relates to providing additional content related to displayed content.
- the presented graphical representation, referred to here as a Social Heatmap, organizes social information to correspond with the timeline of a media item, such as content being displayed on a first screen device.
- a method for providing a graphical representation of social messages on a second screen relating to content displayed on a first screen involves monitoring social media for one or more messages relating to content being displayed on a first screen, processing the one or more messages to associate each message with a time segment of the content being displayed on the first screen, and providing, on the second screen, a graphical representation of the one or more social messages associated with one or more time segments of the content being displayed on the first screen.
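The monitor/associate steps above can be sketched in a few lines of Python; the message tuple format, the shared clock, and the 30-second segment length are illustrative assumptions, not details taken from this disclosure.

```python
from collections import Counter

def bin_messages(messages, program_start, segment_seconds=30):
    """Associate each social message with a time segment (bin) of the
    program, keyed by how far into the program it was posted.

    `messages` is assumed to be an iterable of (posted_at, text) pairs,
    where `posted_at` is seconds on the same clock as `program_start`.
    Returns a Counter mapping bin index -> message count.
    """
    bins = Counter()
    for posted_at, _text in messages:
        offset = posted_at - program_start
        if offset < 0:
            continue  # posted before the program began; not associated
        bins[int(offset // segment_seconds)] += 1
    return bins

# Example: three messages during the first minute of a program.
msgs = [(1000.0, "wow"), (1010.0, "great scene"), (1040.0, "haha")]
print(bin_messages(msgs, program_start=1000.0))  # counts per 30 s segment: {0: 2, 1: 1}
```

In a full system the same per-bin counts could feed whatever chart the second screen draws; only the association of messages to time segments is shown here.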
- also disclosed is a second screen device capable of displaying a graphical representation of social messages associated with content being displayed on a first screen.
- the second screen device includes a screen, storage, and a processor.
- the screen is configured to display content.
- the storage is for storing data.
- the processor is configured to monitor social media for one or more messages relating to content being displayed on a first screen; process the one or more messages to associate each message with a time segment of the content being displayed on the first screen; and provide a graphical representation of the one or more social messages associated with one or more time segments of the content being displayed on the first screen for display on the screen of the second screen device.
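As one hypothetical way to realize the "provide a graphical representation" step, the sketch below renders a text bar per time segment, with bar length proportional to message volume; the input dict format and the scaling are assumptions made for illustration only.

```python
def render_heatmap(bin_counts, num_bins, width=10):
    """Render per-segment social-message counts as proportional bars,
    one row per time segment -- a text stand-in for a heatmap drawn on
    a second-screen display.  Input format and scaling are assumed."""
    peak = max(bin_counts.values(), default=0)
    rows = []
    for i in range(num_bins):
        count = bin_counts.get(i, 0)
        # Scale each bar relative to the busiest segment.
        bar = "#" * (round(width * count / peak) if peak else 0)
        rows.append(f"{i:3d} |{bar}")
    return "\n".join(rows)

# Segment 1 has the most chatter, so it gets the longest bar.
print(render_heatmap({0: 2, 1: 5, 3: 1}, num_bins=4))
```

A real second-screen application would draw this with a graphics toolkit, but the mapping from per-segment counts to bar heights is the same.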
- FIG. 1 is a system diagram outlining the delivery of video and audio content to the home in accordance with one embodiment.
- FIG. 2 is a system diagram showing further detail of a representative set top box receiver.
- FIG. 3 is a diagram depicting a touch panel control device in accordance with one embodiment.
- FIG. 4 is a diagram depicting some exemplary user interactions for use with a touch panel control device in accordance with one embodiment.
- FIG. 5 is a system diagram depicting one embodiment of a system for implementing techniques of the present invention.
- FIG. 6 is a flow diagram depicting an exemplary process in accordance with one embodiment.
- FIG. 7 is a diagram depicting an exemplary methodology of synching between devices in accordance with one embodiment.
- FIG. 8 is a diagram depicting an exemplary methodology of synching between devices in accordance with one embodiment.
- FIGS. 9A-9F are exemplary skeletal screen views depicting features in accordance with one embodiment when used in passive mode.
- FIGS. 10A-10D are exemplary skeletal screen views depicting features in accordance with one embodiment when used in active mode.
- FIGS. 11A-11C are exemplary skeletal views depicting a social media sharing feature in accordance with one embodiment.
- FIGS. 12A and 12B are exemplary skeletal views depicting content selection features in accordance with one embodiment.
- FIGS. 13A-13E are exemplary skeletal views depicting additional features in accordance with one embodiment.
- FIGS. 14A-14L are exemplary skinned screen views depicting how certain features could appear to a user.
- FIG. 15 is an exemplary skeletal view depicting social media features in accordance with one embodiment.
- FIG. 16 is a flow diagram depicting the functionality of social media features in accordance with one embodiment.
- FIG. 17 is an exemplary skinned screen view depicting social media features in accordance with one embodiment.
- FIG. 18 is a flow diagram providing a general methodology for providing content on a second screen based on social messages regarding content being displayed on a first screen in accordance with one embodiment.
- FIG. 19 is an exemplary view of a social quote event in accordance with one embodiment.
- FIG. 20 is an exemplary view of dynamic advertising on a second screen based on social messages in accordance with one embodiment.
- FIG. 21 is an exemplary view of how a social message can be associated with content in accordance with one embodiment.
- FIG. 22 is a flow diagram providing a methodology for generating a social message associated with content in accordance with one embodiment.
- FIG. 23 is a flow diagram providing a methodology for processing a social message associated with content in accordance with one embodiment.
- FIG. 24 is a flow diagram providing a methodology for providing a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- FIG. 25 is an exemplary representation of bins that make up a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- FIG. 26 is an exemplary representation of the association of messages to the bins that make up a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- FIG. 27 is an exemplary skeletal view depicting a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- FIG. 28 is an exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- FIG. 29 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment.
- FIG. 30 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment.
- FIG. 31 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment.
- FIG. 32 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment.
- FIG. 33 is an exemplary skinned screen view depicting a graphical representation of social messages associated with time periods of content in accordance with one embodiment.
- Referring now to FIG. 1, a block diagram of an embodiment of a system 100 for delivering content to a home or end user is shown.
- the content originates from a content source 102 , such as a movie studio or production house.
- the content may be supplied in at least one of two forms.
- One form may be a broadcast form of content.
- the broadcast content is provided to the broadcast affiliate manager 104 , which is typically a national broadcast service, such as the American Broadcasting Company (ABC), National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), etc.
- the broadcast affiliate manager may collect and store the content, and may schedule delivery of the content over a delivery network, shown as delivery network 1 ( 106 ).
- Delivery network 1 ( 106 ) may include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 ( 106 ) may also include local content delivery using local delivery systems such as over the air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to a receiving device 108 in a user's home, where the content will subsequently be searched by the user. It is to be appreciated that the receiving device 108 can take many forms and may be embodied as a set top box/digital video recorder (DVR), a gateway, a modem, etc. Further, the receiving device 108 may act as entry point, or gateway, for a home network system that includes additional devices configured as either client or peer devices in the home network.
- Special or additional content may include content delivered as premium viewing, pay-per-view, or other content otherwise not provided to the broadcast affiliate manager, e.g., movies, video games or other video elements.
- the special content may be content requested by the user.
- the special content may be delivered to a content manager 110 .
- the content manager 110 may be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service.
- the content manager 110 may also incorporate Internet content into the delivery system.
- the content manager 110 may deliver the content to the user's receiving device 108 over a separate delivery network, delivery network 2 ( 112 ).
- Delivery network 2 ( 112 ) may include high-speed broadband Internet type communications systems.
- the content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 ( 112 ) and content from the content manager 110 may be delivered using all or parts of delivery network 1 ( 106 ).
- the user may also obtain content directly from the Internet via delivery network 2 ( 112 ) without necessarily having the content managed by the content manager 110 .
- the additional content is provided as an augmentation to the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc.
- the additional content may completely replace some programming content provided as broadcast content.
- the additional content may be completely separate from the broadcast content, and may simply be a media alternative that the user may choose to utilize.
- the additional content may be a library of movies that are not yet available as broadcast content.
- the receiving device 108 may receive different types of content from one or both of delivery network 1 and delivery network 2.
- the receiving device 108 processes the content, and provides a separation of the content based on user preferences and commands.
- the receiving device 108 may also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the receiving device 108 and features associated with playing back stored content will be described below in relation to FIG. 2 .
- the processed content is provided to a display device 114 .
- the display device 114 may be a conventional 2-D type display or may alternatively be an advanced 3-D display.
- the receiving device 108 may also be interfaced to a second screen such as a touch screen control device 116 .
- the touch screen control device 116 may be adapted to provide user control for the receiving device 108 and/or the display device 114 .
- the touch screen device 116 may also be capable of displaying video content.
- the video content may be graphics entries, such as user interface entries, or may be a portion of the video content that is delivered to the display device 114 .
- the touch screen control device 116 may interface to receiving device 108 using any well known signal transmission system, such as infra-red (IR) or radio frequency (RF) communications and may include standard protocols such as infra-red data association (IRDA) standard, Wi-Fi, Bluetooth and the like, or any other proprietary protocols.
- the touch screen control device 116 can be interfaced directly with delivery networks 1 and 2. Operations of touch screen control device 116 will be described in further detail below.
- the system 100 also includes a back end server 118 and a usage database 120 .
- the back end server 118 includes a personalization engine that analyzes the usage habits of a user and makes recommendations based on those usage habits.
- the usage database 120 is where the usage habits for a user are stored. In some cases, the usage database 120 may be part of the back end server 118 .
- the back end server 118 (as well as the usage database 120 ) is connected to the system 100 and accessed through the delivery network 2 ( 112 ).
- Receiving device 200 may operate similar to the receiving device described in FIG. 1 and may be included as part of a gateway device, modem, set-top box, or other similar communications device.
- the device 200 shown may also be incorporated into other systems including an audio device or a display device. In either case, several components necessary for complete operation of the system are not shown in the interest of conciseness, as they are well known to those skilled in the art.
- the input signal receiver 202 may be one of several known receiver circuits used for receiving, demodulation, and decoding signals provided over one of the several possible networks including over the air, cable, satellite, Ethernet, fiber and phone line networks.
- the desired input signal may be selected and retrieved by the input signal receiver 202 based on user input provided through a control interface or touch panel interface 222 .
- Touch panel interface 222 may include an interface for a touch screen device. Touch panel interface 222 may also be adapted to interface to a cellular phone, a tablet, a mouse, a high end remote or the like.
- the decoded output signal is provided to an input stream processor 204 .
- the input stream processor 204 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream.
- the audio content is provided to an audio processor 206 for conversion from the received format, such as compressed digital signal, to an analog waveform signal.
- the analog waveform signal is provided to an audio interface 208 and further to the display device or audio amplifier.
- the audio interface 208 may provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or alternate audio interface such as via a Sony/Philips Digital Interconnect Format (SPDIF).
- the audio interface may also include amplifiers for driving one or more sets of speakers.
- the audio processor 206 also performs any necessary conversion for the storage of the audio signals.
- the video output from the input stream processor 204 is provided to a video processor 210 .
- the video signal may be one of several formats.
- the video processor 210 provides, as necessary, a conversion of the video content, based on the input signal format.
- the video processor 210 also performs any necessary conversion for the storage of the video signals.
- a storage device 212 stores audio and video content received at the input.
- the storage device 212 allows later retrieval and playback of the content under the control of a controller 214 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 216 and/or touch panel interface 222 .
- the storage device 212 may be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM), or dynamic RAM (DRAM), or may be an interchangeable optical disk storage system such as a compact disk (CD) drive or digital video disk (DVD) drive.
- the converted video signal from the video processor 210 , either originating from the input or from the storage device 212 , is provided to the display interface 218 .
- the display interface 218 further provides the display signal to a display device of the type described above.
- the display interface 218 may be an analog signal interface such as red-green-blue (RGB) or may be a digital interface such as HDMI. It is to be appreciated that the display interface 218 will generate the various screens for presenting the search results in a three dimensional grid as will be described in more detail below.
- the controller 214 is interconnected via a bus to several of the components of the device 200 , including the input stream processor 204 , audio processor 206 , video processor 210 , storage device 212 , and a user interface 216 .
- the controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display.
- the controller 214 also manages the retrieval and playback of stored content.
- the controller 214 performs searching of content and the creation and adjusting of the grid display representing the content, either stored or to be delivered via the delivery networks, described above.
- the controller 214 is further coupled to control memory 220 (e.g., volatile or nonvolatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), electronically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 214 .
- Control memory 220 may store instructions for controller 214 .
- Control memory may also store a database of elements, such as graphic elements containing content. The database may be stored as a pattern of graphic elements. Alternatively, the memory may store the graphic elements in identified or grouped memory locations and use an access or location table to identify the memory locations for the various portions of information related to the graphic elements. Additional details related to the storage of the graphic elements will be described below.
- control memory 220 may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the memory may be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit.
- the user interface process of the present disclosure employs an input device that can be used to express functions, such as fast forward, rewind, etc.
- a touch panel device 300 may be interfaced via the user interface 216 and/or touch panel interface 222 of the receiving device 200 , as shown in FIG. 3 .
- the touch panel device 300 allows operation of the receiving device or set top box based on hand movements, or gestures, and actions translated through the panel into commands for the set top box or other control device.
- the touch panel 300 may simply serve as a navigational tool to navigate the grid display.
- the touch panel 300 will additionally serve as the display device allowing the user to more directly interact with the navigation through the grid display of content.
- the touch panel device may be included as part of a remote control device containing more conventional control functions such as actuator or activator buttons.
- the touch panel 300 can also include at least one camera element. In some embodiments, the touch panel 300 may also include a microphone.
- As shown in FIG. 4 , the use of a gesture sensing controller or touch screen provides for a number of types of user interaction.
- the inputs from the controller are used to define gestures and the gestures, in turn, define specific contextual commands.
- the configuration of the sensors may permit defining movement of a user's fingers on a touch screen or may even permit defining the movement of the controller itself in either one dimension or two dimensions.
- Two-dimensional motion, such as a diagonal, and a combination of yaw, pitch and roll can be used to define any three-dimensional motion, such as a swing.
- a number of gestures are illustrated in FIG. 4 . Gestures are interpreted in context and are identified by defined movements made by the user.
- Bumping 420 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left or right.
- the bumping gesture is associated with specific commands in context. For example, in a TimeShifting mode, a left-bump gesture 420 indicates rewinding, and a right-bump gesture indicates fast-forwarding. In other contexts, a bump gesture 420 is interpreted to increment a particular value in the direction designated by the bump.
- Checking 430 is defined as drawing a checkmark. It is similar to a downward bump gesture 420 . Checking is identified in context to designate a reminder or user tag, or to select an item or element. Circling 440 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished.
- Dragging 450 is defined as an angular movement of the controller (a change in pitch and/or yaw) while pressing a button (virtual or physical) on the tablet 300 (i.e., a “trigger drag”).
- the dragging gesture 450 may be used for navigation, speed, distance, time-shifting, rewinding, and forwarding.
- Dragging 450 can be used to move a cursor or a virtual cursor, or to effect a change of state, such as highlighting, outlining, or selecting on the display. Dragging 450 can be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is preferred to modify the response to the dragging command.
- Nodding 460 is defined by two fast trigger-drag up-and-down vertical movements. Nodding 460 is used to indicate “Yes” or “Accept.”
- X-ing 470 is defined as drawing the letter “X.” X-ing 470 is used for “Delete” or “Block” commands.
- Wagging 480 is defined by two trigger-drag fast back-and-forth horizontal movements. The wagging gesture 480 is used to indicate “No” or “Cancel.”
- a simple right or left movement on the sensor as shown here may produce a fast forward or rewind function.
- multiple sensors could be included and placed at different locations on the touch screen. For instance, a horizontal sensor for left and right movement may be placed in one spot and used for volume up/down, while a vertical sensor for up and down movement may be placed in a different spot and used for channel up/down. In this way specific gesture mappings may be used.
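- The contextual gesture handling described above can be sketched as a lookup table keyed by gesture and context. This is a minimal illustration only; the gesture names, context names, and command names below are assumptions for the sketch, not identifiers from this disclosure.

```python
# Hypothetical gesture-to-command table: the same gesture maps to
# different commands depending on the current context (see FIG. 4).
GESTURE_COMMANDS = {
    ("bump_left", "timeshift"): "rewind",        # left bump 420 rewinds
    ("bump_right", "timeshift"): "fast_forward", # right bump 420 fast-forwards
    ("check", "browse"): "set_reminder",         # checking 430 tags an item
    ("nod", "dialog"): "accept",                 # nodding 460 means "Yes"
    ("x", "browse"): "delete",                   # X-ing 470 means "Delete"
    ("wag", "dialog"): "cancel",                 # wagging 480 means "No"
}

def interpret(gesture: str, context: str) -> str:
    """Resolve a recognized gesture into a contextual command."""
    return GESTURE_COMMANDS.get((gesture, context), "ignore")
```

A gesture with no mapping in the current context is simply ignored, which matches the context-dependent interpretation described above.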
- the system and methodology can be implemented in any number of ways depending on the hardware and the content involved. Examples of such deployment include DVD, Blu-Ray disc (BD); streaming video or video on demand (VOD), and broadcast (satellite, cable, over the air).
- Each of these deployments would have different architectures but one could standardize the triggers for each of these events (the additional content) that represents what would be queued by the application running on the second screen. For example, event A and event B would be triggered by a synching mechanism associated with any of these sources of a video.
- When the second screen device (e.g. tablet) encounters “event A”, the program running on the second screen device will enact “event A”. Similarly, if “event B” is encountered, the program running on the second screen device will enact “event B”.
- FIG. 5 depicts a generic system 500 on which such methodology could be implemented.
- the system 500 includes a first screen device 510 , a second screen device 520 , a playback device 530 , a network 540 and server 550 . Each of these elements will be discussed in more detail below.
- the first screen device 510 is a display device, such as display device 114 described above in relation to FIG. 1 , for displaying content such as television programs, movies, and websites. Examples of such first screen display devices include, but are not limited to, a television, monitor, projector, or the like.
- the first screen device 510 is connected to the playback device 530 which can provide the primary content to the first screen device 510 for display. Examples of such communication include, but are not limited to HDMI, VGA, Display port, USB, component, composite, radio frequency (RF), and infrared (IR), and the like.
- the first screen display device 510 may be connected to the network 540 , in either a wired or wireless (WiFi) manner, providing additional connection to the second screen device 520 and server 550 .
- the first display device 510 may include the functionality of the playback device 530 .
- the first screen display device 510 may be in non-networked communication 560 with the second screen device 520 . Examples of such non-networked communication 560 include, but are not limited to, RF, IR, Blue-Tooth, and other audio communication techniques and protocols.
- the second screen device 520 is a device capable of displaying additional content related to the primary content being displayed on the first screen device 510 .
- the second screen device may be a touch screen control device 116 or touch screen device 300 as described above. Examples of second screen devices include, but are not limited to, a smart phone, tablet, laptop, personal media player (e.g., iPod), or the like.
- the second screen device 520 is in communication with playback device 530 using either network 540 , non-networked communication 560 , or both.
- the second screen device 520 is also in communication with the server 550 via the network 540 for requesting and receiving additional content related to the primary content being displayed on the first screen device 510 .
- the second screen device 520 may be in networked or non-networked communication 560 with the first screen device 510 .
- non-networked communication 560 include, but are not limited to, RF, IR, Blue-Tooth (BT), audio communication techniques and protocols, or the like.
- the playback device 530 is a device capable of providing primary content for display on the first screen device 510 .
- Examples of such playback devices include, but are not limited to, a DVD player, Blu-Ray Disc (BD) player, game console, receiver device (cable or satellite), Digital Video Recorder (DVR), streaming device, personal computer, or the like.
- the playback device 530 is connected to the first screen device 510 for providing the primary content to the first screen device 510 for display. Examples of such connections include, but are not limited to HDMI, VGA, Display port, USB, component, composite, radio frequency (RF), and infrared (IR), and the like.
- the playback device 530 is also connected to the network 540 , in either a wired or wireless (WiFi) manner, providing connection to the second screen device 520 and server 550 .
- the functionality of the playback device 530 may be included in the first screen display device 510 .
- the playback device 530 may be in non-networked communication 560 with the second screen device 520 . Examples of such non-networked communication 560 include, but are not limited to, RF, IR, Blue-Tooth (BT), and other audio communication techniques and protocols.
- the network 540 can be a wired or wireless communication network implemented using Ethernet, MoCA, and wireless protocols or a combination thereof. Examples of such a network include, but are not limited to, delivery network 1 ( 106 ) and delivery network 2 ( 112 ) discussed above.
- the server 550 is a content server configured to provide additional content to the second screen device 520 .
- the server may also provide the primary content for display on the first screen device 510 .
- the server is connected to the network 540 and can communicate with any of the devices that are also connected. Examples of such a server include, but are not limited to, content source 102 , broadcast affiliate manager 104 , content manager 110 , and the back end server described above.
- FIG. 6 depicts a flow diagram 600 of a methodology for displaying additional content related to primary content being viewed.
- the method includes the following steps: Displaying primary content on a first screen device 510 (step 610 ). Providing, in association with the display of the primary content on the first screen, a synching mechanism to synch additional content (step 620 ). Displaying, on a second screen device 520 , additional content related to the primary content on the first screen 510 that is synched to the content displayed on the first screen device according to the synching mechanism (step 630 ).
- the method also includes the steps of receiving commands from the second screen device 520 to control the display of primary content on the first screen device 510 (step 640 ) and controlling the display of the primary content on the first screen device 510 based on the commands received from the second screen device 520 (step 650 ).
- the step of displaying primary content is performed on the first screen device 510 .
- the primary content can be provided by the playback device 530 or be received directly from a content provider at the first screen display device 510 .
- the primary content is then shown or otherwise displayed on the first screen device 510 .
- the display of the primary content also includes the control of the content being displayed. This can include the traditional playback commands of play, stop, pause, rewind, and fast forward as well as the navigation of on screen menus to select the content and other playback options.
- the display on the first screen device 510 (step 620 ) further includes displaying an indicator of the type of additional content being displayed on the second screen device 520 .
- the provided synching mechanism (step 620 ) can be implemented in a number of ways.
- the synching mechanism is performed by an application running on the second screen device 520 , the playback mechanism 530 , the first screen device 510 or any combination thereof.
- the second screen device 520 is configured (via an application) to detect synching signals, cues, or other type of indicators that directs the second screen device 520 to update the display of additional content to coincide with the primary content being displayed on the first screen 510 .
- the synching signals, cues or other type of indicators can be provided as part of the primary content or can be generated by the playback device 530 or first screen device 510 (via an application) in accordance with the chapter, scene, time-code, subject matter, or content being displayed.
- the synching signals, cues or other type of indicators can be transmitted to the second screen device 520 using the network, in either a wired or wireless (WiFi) manner, or using non-networked communication 560 such as audio signals. Examples of some of the implementations are given below. Other possible implementations will be apparent given the benefit of this disclosure.
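- The cue-driven update on the second screen device can be sketched as follows. The cue identifiers and the shape of the content map are assumptions for illustration; in practice, cues could arrive over the network or as audio indicators as described above.

```python
# Minimal sketch of a second-screen application that updates its
# display of additional content whenever a synching cue is detected.
class SecondScreenApp:
    def __init__(self, additional_content: dict):
        # Maps cue identifiers (e.g., chapter or time-code markers)
        # to the additional content shown for each cue. The mapping
        # could be stored locally or fetched from a server.
        self.additional_content = additional_content
        self.displayed = None

    def on_cue(self, cue_id: str):
        """Update the display when a synching cue is detected;
        unknown cues leave the current display unchanged."""
        if cue_id in self.additional_content:
            self.displayed = self.additional_content[cue_id]
        return self.displayed

app = SecondScreenApp({"ch1_ev1": "trivia card", "ch1_ev2": "map card"})
app.on_cue("ch1_ev1")  # second screen now shows the trivia card
```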
- the step of displaying the additional content is performed on the second screen device 520 .
- the additional content can be stored locally on the second screen device 520 or be provided by the server 550 , playback device 530 , or first screen device 510 .
- the display of the additional content is synched to the primary content being displayed on the first screen device 510 according to the synching mechanism. For example, when the second screen device 520 detects a synching signal, cue or other type of indicator, the second screen device 520 updates the display of the additional content accordingly.
- this further involves contacting and requesting the additional content from the server 550 , playback device 530 , or first screen device 510 and subsequently downloading and displaying the additional content.
- the additional content to be displayed can be selected, modified, or omitted based on the user using the system.
- the display on the second screen device 520 (step 630 ) further includes displaying the status of the display of the primary content on the first screen device 510 such as whether the display of the primary content on the first screen device 510 has been paused. In certain other embodiments, the display on the second screen device 520 (step 630 ) further includes displaying the status of the synch between the additional content on the second screen device 520 and the primary content on the first screen device 510 .
- the second screen device 520 is capable of transmitting as well as receiving.
- the optional steps 640 and 650 address this capability.
- commands are received from the second screen device 520 .
- these commands are received at the device controlling the playback of the primary content on the first screen device 510 .
- the playback device 530 is the device receiving the commands.
- the commands can be sent via the network 540 or non-networked communication 560 .
- the commands can control the display of the primary content (step 650 ). Examples of such control include, but are not limited to, play, stop, pause, rewind, fast-forward, as well as chapter, scene, and selection. These commands can also be used to synch the primary content displayed on the first screen device 510 with the additional content being displayed on the second screen device 520 .
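- The command handling of steps 640 and 650 can be sketched as a small dispatcher on the device controlling playback. The command names and the seek-based synch semantics are illustrative assumptions rather than part of this disclosure.

```python
# Hedged sketch of the playback-device side: receive a command from
# the second screen device and apply it to the primary display.
class PlaybackDevice:
    def __init__(self):
        self.state = "stopped"
        self.position = 0.0  # seconds into the primary content

    def handle_command(self, command: str, value: float = 0.0):
        """Apply a transport command received from the second screen."""
        if command in ("play", "pause", "stop"):
            self.state = {"play": "playing",
                          "pause": "paused",
                          "stop": "stopped"}[command]
        elif command == "seek":
            # Used to synch the first screen to an event selected
            # on the second screen device.
            self.position = max(0.0, value)
            self.state = "playing"
        return self.state, self.position
```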
- FIG. 7 provides a high level overview of one example of system 700 with a synching mechanism implemented using a non-networked communication 560 .
- the non-networked communication synching mechanism is audio watermarking 710 .
- audio watermarking 710 involves inserting a high-frequency signal, cue, or other indicator into the audio signal of the primary content being displayed on the first screen device 510 .
- the audio watermark is inaudible to humans but can be detected by a microphone in the second screen device 520 .
- the second screen device 520 detects an audio watermark
- the displayed additional content is updated to synch with the primary content being displayed on the first screen device 510 based on the detected watermark.
- the audio watermarks can be incorporated into the primary content at the source of the content or inserted locally by the playback device 530 or first screen device 510 .
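- One plausible way for the second screen device to detect a high-frequency cue in captured audio is single-bin tone detection with the Goertzel algorithm. This is a sketch only: the 19 kHz cue frequency, sample rate, and detection threshold are assumptions for illustration, not values from this disclosure.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the signal power at target_hz using the Goertzel
    algorithm (a single-frequency DFT, cheap enough for a tablet)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)
    omega = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(omega)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def watermark_present(samples, sample_rate=44100, cue_hz=19000,
                      threshold=1000.0):
    """Detect a near-inaudible high-frequency cue in a microphone
    capture window; threshold is an illustrative assumption."""
    return goertzel_power(samples, sample_rate, cue_hz) > threshold
```

When a cue is detected in the microphone capture, the application would update the additional content as described above.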
- FIG. 8 provides a high-level overview of one example of a system 800 with a synching mechanism implemented using the network 540 .
- the synching mechanism is wireless communication (WiFi) 810 between a playback device 530 (a Blu-Ray Disc player) and the second screen device 520 (an iOS device running an application).
- the features and protocols of a BD-Live enabled device are used. There are two main components of this protocol: connection and communication. Both are described below.
- the second screen iOS application will be referred to as the “iPad” and the BD-Live enabled device will be referred to as the “disc”.
- Connection occurs when an iOS enabled device 520 first launches the second screen application and attempts to connect to a BD-Live enabled device 530 on the same Wi-Fi network 540 .
- An advantage of such wireless communication, as seen in this example, is that it is bi-directional, allowing the second screen device to transmit as well as receive commands. This allows for two-way synching as well as control of playback from the second screen device 520 .
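- A minimal sketch of such a bi-directional exchange, assuming a hypothetical JSON message format (the message types and field names are invented for illustration and are not part of the BD-Live protocol):

```python
import json

def make_message(msg_type: str, payload: dict) -> str:
    """Serialize a message for transmission over the Wi-Fi link."""
    return json.dumps({"type": msg_type, "payload": payload})

def parse_message(raw: str) -> tuple:
    """Decode a received message into (type, payload)."""
    msg = json.loads(raw)
    return msg["type"], msg["payload"]

# The disc side announces its playback position; because the link is
# bi-directional, the iPad side can answer with a control command.
announce = make_message("position", {"chapter": 3, "seconds": 42.0})
command = make_message("control", {"action": "pause"})
```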
- the application of the second screen device 520 could be specific to a specific program or movie on a specific system (e.g. BD).
- the second screen application could be generic to a studio with available plug-ins to configure the application to a particular program or movie.
- the second screen application could be universal to system (BD, VOD, broadcast), content, or both.
- the system can be operated with either a passive approach or an interactive approach.
- icons displayed on first screen device 510 prompt the user to look at the second screen device 520 for an additional content event being displayed that is related to the primary content displayed on the first screen device 510 .
- the icon preferably indicates what type of additional content event is available on the second screen device 520 (e.g., a shopping cart icon indicates a purchase event, an “I” icon indicates an information event, a stickman icon indicates a character information event, etc.)
- FIGS. 9A-F depict some of the aspects that may be displayed to the user in passive mode.
- FIGS. 9A-F depict skeletal examples of what may be displayed on the screen 900 of the second screen device to a user when using an application in passive mode that provides additional content on the second screen device 520 that is synched with the primary content on the first screen device 510 .
- FIG. 9A depicts a splash screen that may be displayed to the user when the application is launched. It includes the product logo and an indication of the primary content 902 . Here new content screens transition in from the right in a conveyer-belt-like manner, as indicated by arrow 904 .
- FIG. 9B depicts a pop-up message 906 that is displayed to a user when no playback device 530 is detected by second screen device 520 .
- the screen 900 of FIG. 9C shows a synch button/icon 908 , chapter timeline 910 , active chapter indicator 912 , chapter-event indicator 914 , chapter number indicator 916 , event timeline 918 , chapter background 920 , event card 922 , and timeline view icons 924 .
- the synch button 908 provides a mechanism to synch the content between the first and second screen devices 510 , 520 .
- the synch button 908 may also indicate the status of the synch between the content on the first and second screen devices 510 , 520 .
- the chapter timeline 910 indicates the chapters of the primary content.
- the movie title leader is in the background of the chapter timeline 910 and indicates the primary content.
- the chapter-event indicator 914 indicates that events displayed in the event timeline 918 are part of the active chapter shown in the chapter timeline 910 .
- the event timeline 918 displays event cards 922 indicating events that correspond to what is transpiring in the current chapter of the primary content. For each chapter, the first displayed event card 922 indicates the chapter that the following events occur in.
- Each chapter may be provided with a unique background 920 for the events of that particular chapter.
- the timeline view icon/button 924 indicates that the viewer is in timeline view showing the chapter timeline 910 and event timeline 918 as well as provides a mechanism to access the timeline view.
- FIGS. 9D and 9E depict how event cards 922 progress across the event timeline 918 .
- the synch button/icon 908 indicates that the timeline view of the additional content is in synch with the primary content on the first screen device 510 .
- the current triggered event card 926 is shown in the center position of the event timeline 918 and represents the first triggered event.
- To the left of the current triggered event card 926 in the event timeline 918 is the previous event card 928 , in this case the card indicating the chapter.
- To the right of the current triggered event card 926 in the event timeline 918 is the next event card 930 , in this case the card indicating the next scheduled event.
- the current triggered event card 926 includes the additional content 932 related to the primary content.
- the current triggered event card 926 also provides an indicator 934 as to what type of additional content is displayed. In certain embodiments this indicator matches an indicator shown on the first screen display 510 .
- the current event card 926 also includes buttons/icons for synching 936 and sharing 938 .
- the synch button/icon 936 provides a mechanism that causes the primary content displayed on the first screen device 510 to be synched with the current event.
- the share button/icon 938 provides a mechanism to share the additional content of the event with a social network.
- the elements of the screen 900 of FIG. 9E are similar to the elements of FIG. 9D except that the current triggered event card 926 is for an event that happens later in the timeline as indicated by the chapter indicator 916 which indicates the current chapter is chapter 3.
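- The three-slot presentation of FIGS. 9D-E can be sketched as a window over a chapter's ordered list of event cards: the current triggered card sits in the center, flanked by the previous and next cards. The card names below are illustrative assumptions.

```python
def timeline_window(cards, current_index):
    """Return the (previous, current, next) event cards for the event
    timeline 918, with None where no neighboring card exists."""
    prev_card = cards[current_index - 1] if current_index > 0 else None
    next_card = (cards[current_index + 1]
                 if current_index + 1 < len(cards) else None)
    return prev_card, cards[current_index], next_card

# For each chapter, the first card identifies the chapter itself, so on
# the first triggered event the chapter card fills the previous slot.
cards = ["chapter 1", "trivia", "map", "photo"]
timeline_window(cards, 1)  # → ("chapter 1", "trivia", "map")
```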
- FIG. 9F depicts examples of other possible functionality that may be provided as part of the display on the second screen device 520 .
- the chapter timeline 910 is provided with a collapse icon/button 940 which provides a mechanism to toggle the chapter timeline between visible 940 a and hidden 940 b .
- the synch button/icon 908 can toggle between status indicating whether synch is currently active 908 a and status indicating synch has been lost and re-synching is available 908 b .
- a volume button icon 942 is provided. The volume button/icon 942 provides a mechanism to turn the sound of the first screen display “OFF” or “ON”.
- the volume button 942 may also indicate the status of whether the volume is “ON” indicating muting is available 942 a , or “OFF” indicating sound is available 942 b .
- a play/pause button/icon 944 is provided.
- the play/pause button 944 provides a mechanism to pause or resume playback of content on the first screen display 510 .
- the pause/play button may also indicate the status of whether the playback can be paused 944 a or resumed 944 b.
- the user selects an additional content event on the second screen device 520 and what is displayed on the primary screen device 510 is synched to the selected event.
- the events of additional content are synched to the primary content.
- if the user swipes the movie timeline or the events, the events become out of synch with the movie being shown on the main screen.
- the timeline or events are then synched back to what is being displayed on the main screen.
- a user can select a trivia event or map event, touch the synch button, and the scene in the movie related to the selected trivia or map event will be played on the main screen. Examples of this can be seen in FIG. 10A-D .
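- The synch-to-event behavior can be sketched as a time-code lookup followed by a seek command sent to the first screen. The event names and time-codes below are invented for illustration; the transport used for `send_seek` (network 540 or non-networked communication 560) is left abstract.

```python
# Hypothetical mapping from event cards to time-codes (in seconds)
# within the primary content.
EVENT_TIMECODES = {"trivia_1": 95.0, "map_1": 310.5}

def synch_first_screen(selected_event: str, send_seek):
    """When the user selects an event and touches the synch button,
    look up the event's time-code and issue a seek to the first
    screen; returns the time-code, or None for unknown events."""
    timecode = EVENT_TIMECODES.get(selected_event)
    if timecode is not None:
        send_seek(timecode)
    return timecode

sent = []
synch_first_screen("map_1", sent.append)  # first screen seeks to 310.5 s
```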
- FIG. 10A depicts how a user may interact with the chapter timeline 910 and event timeline 918 on the screen 900 .
- icons 1000 and 1002 represent how the user can touch the screen to scroll left or right in the chapter or event timelines 910 , 918 .
- FIG. 10B depicts one embodiment of the screen 900 when a user interacts with the chapter timeline 910 .
- the synch button/icon 908 indicates that the additional content on the second screen display 520 is out of synch with the primary content on the first screen display 510 .
- Icon 1000 represents the user scrolling through the chapter timeline 910 .
- the current chapter remains highlighted 912 until the transition to the new chapter is completed.
- a chapter position indicator 1004 is provided that indicates what chapter of the available chapters is selected.
- the chapter indicator 916 also indicates the selected chapter and updates when the transition to the new chapter is complete.
- the event timeline 918 is dimmed.
- the user may jump directly to a particular chapter by selecting the chapter from the timeline 910 .
- FIG. 10C depicts one embodiment of the screen 900 when a user interacts with the event timeline 918 .
- Icon 1002 represents the user scrolling through the event timeline 918 .
- the timeline 918 is being transitioned from the current triggered event card 926 to the next event card 930 .
- an event position indicator 1004 is provided that indicates what event of the available events is selected.
- FIG. 10D depicts one embodiment of the screen 900 when a user interacting with the event timeline 918 causes a transition from one chapter to another.
- Icon 1002 represents the user scrolling through the event timeline 910 causing a chapter change.
- the timeline 918 is being transitioned to a new event card 922 indicating a new set of events related to a new chapter.
- the event position indicator 1004 is centered until the new series of events begins.
- FIGS. 11A-C and 12A-B indicate some of the other interactive activities that can be accessed via the event cards 922 .
- FIGS. 11A-C depict the social media sharing feature.
- FIGS. 12A-B depict the chapter selection as well as selection and playback of additional media files.
- FIGS. 11A-C show various pop-up fields on the display 900 when the sharing feature is active via the share button/icon 938 .
- FIG. 11A shows the field 1100 displayed when the user has logged into their social network (in this case Facebook). Area 1102 indicates the event being shared and area 1104 indicates the comments the user is going to share about the event. Button 1106 provides the mechanism to submit the event and comments to be shared.
- FIG. 11B shows the field 1100 displayed when the user has not yet signed in to the social network. In this example button 1108 is provided to sign into Facebook and button 1110 is provided to sign into twitter. Options to sign into other social networks may also be provided.
- FIG. 11C shows an onscreen QWERTY keyboard 1112 that may be used to enter the user's comments into area 1104 . In certain embodiments, this may be a default keyboard provided by the second screen device 520 .
- FIGS. 12A-B show the selection of chapters as well as media content for playback by the user.
- the playback on the first screen device 510 is paused.
- if the user double taps 1202 the currently playing chapter shown in the chapter timeline, playback on the first screen device will jump to the beginning of the chapter and the event timeline 918 will be set to the first event of that chapter.
- the event cards 922 may include media files 1204 such as video or audio clips. If the media file is an audio clip, then selection of the audio clip results in playback on the current screen 900 .
- if the media file is a video clip, selection of the video clip results in the launching of a full-screen media player 1206 as seen in FIG. 12B .
- the media player includes on-screen controls 1208 .
- To return to the previous screen, the user only needs to tap the non-video surface 1210 of the media player.
- FIGS. 13A-E depict some other possible features regarding the additional content. These include a map view 1300 , family tree 1310 , and settings 1320 .
- FIG. 13A depicts the menu bars for these options. In this example each of these menu bars is provided with first screen device controls 1330 including pause/resume and mute/un-mute.
- FIG. 13B depicts the map view display 1300 .
- the map view display 1300 includes a map 1302 including marked locations 1304 and information about the locations 1306 . Icons are also provided to select other maps 1308 .
- FIG. 13C depicts the family tree view 1310 .
- the family tree view shows the family tree with fields 1312 indicating the relationship between the family members.
- a button/icon 1314 at the bottom indicates what view is currently being shown (i.e. the family tree view). If a field 1312 is selected, a pop-up field 1316 is displayed, as shown in FIG. 13D , providing information about the person in the field 1312 .
- FIG. 13E depicts the settings view 1320 . In view 1320 the user is provided with controls for adjusting the preferences for the audio and video 1322 , events 1324 , and social network sharing 1326 .
- FIGS. 14A-L depict skinned examples of what may be displayed on the screen 900 of the second screen device to a user when using an application that provides additional content on the second screen device 520 that is synched with the primary content on the first screen device 510 .
- FIG. 14A is a skinned version of the splash screen as shown and described in relation to FIG. 9A .
- FIGS. 14B-F depict skinned versions of the timeline view as seen and described in relation to FIGS. 9C-F and 10 A-D.
- FIG. 14G depicts a skinned version of a screen display wherein all the available video clips that are part of the additional content are displayed for the user.
- FIG. 14H depicts a skinned version of a screen display wherein all the available audio clips that are part of the additional content are displayed for the user.
- FIG. 14I depicts a skinned version of the maps view as shown and described in relation to FIG. 13B .
- FIGS. 14J and 14K depict skinned versions of the family tree view as shown and described in relation to FIGS. 13C and 13D respectively.
- FIG. 14L depicts a skinned version of the settings view as shown and described in relation to FIG. 13E .
- a user may be able to configure or otherwise select what events they wish to be shown (e.g., don't show me purchase events).
- the user may be able to select or bookmark events for viewing at a later time.
- certain events may be unavailable or locked out depending on the version of the program being viewed (i.e. purchased vs. rented or BD vs. VOD vs. Broadcast).
- the events available can be personalized for a user based on previous viewing habits (i.e. in system such as TIVO where a user's viewing habits are tracked or using the personalization engine 118 of FIG. 1 ).
- a store front could be provided and accessible from the second screen for purchasing movie merchandise.
- points or awards could be provided to a user for watching, reviewing, or recommending a program or film. For example, the more movies watched or shared with friends, the more points awarded. The points can then be used for prizes or discounts on related goods.
- achievements can also be awarded. These achievements could be pushed to a social networking site.
- Example achievements could include:
- a Wiki feature could be implemented.
- a running Wiki could let a user and other users of a disc comment on certain scenes.
- tracking metadata could be created which is pushed to a web-based Wiki.
- metadata could include:
- This pushed information can be used to create a running Wiki which lets others comment on the movie. These comments could then be reintegrated into the second screen application as events which can be accessed.
- activity on one or more social networks that is related to content displayed on the first screen can be monitored and used to provide additional content on the second screen.
- the application on the second screen device 520 can support social media such as Facebook and Twitter. Additional examples of this can be seen in FIGS. 15-17 .
- FIG. 15 depicts a wireframe screenshot 1500 showing the panel 1510 displayed over the background, in this case a “greyed-out” or “dimmed” timeline view 1520 , when the user has not yet signed in to the social network.
- panel 1510 is for signing into Twitter.
- the panel 1510 provides an area 1512 to provide a username or email address as well as an area 1512 to provide a password.
- Button 1516 authorizes the application to access Twitter.
- Button 1518 declines the sign-in.
- a similar panel or field can be provided to sign into Facebook, or other social media networks.
- FIG. 16 depicts a flow diagram 1600 of the screens displayed to a user based on whether they are signed into a social network.
- At junction 1610 it is determined if the user is signed in or otherwise authorized on the social network. If the user has not provided authorization ( 1612 ), screen 1500 with panel 1510 of FIG. 15 is displayed to the user, prompting them to sign in. If the user has previously provided authorization ( 1614 ) or signs in using screen 1500 ( 1618 ), screen 1620 with message panel 1622 is displayed. From the message panel 1622 the user may cancel or send a message using buttons 1624 and 1626 respectively. The text of the message can be entered using an onscreen keyboard 1628 . Once a message is sent or canceled ( 1630 ), or if authorization is never provided ( 1632 ), junction 1634 is reached, wherein the panels are dismissed and the user is returned to the previous screen view ( 1640 ).
- FIG. 17 depicts a skinned version screen 1700 of the screen 1620 of FIG. 16 .
- a message panel 1710 is provided overtop the skinned background 1720 .
- the message panel 1710 includes a text area 1712 as well as cancel 1714 and send 1716 buttons. Text can be entered into the text area 1712 using an onscreen keyboard 1730 .
- the user's experience can be further enhanced by tracking comments on social media relating to content being viewed on the first screen device 510 and providing additional content on the second screen device 520 , synched to the primary content on the first screen device 510 , based on the tracked comments relating to the primary content.
- FIG. 18 shows a flow diagram 1800 depicting one possible methodology for providing such functionality on a second screen device 520 .
- the method includes three steps. First, social media activity regarding content being displayed on the first screen device 510 is monitored (step 1810 ). The monitored social media activity is then processed (step 1820 ). Finally, additional content based on the social media activity is provided on the second screen device 520 which is synched to the primary content on the first screen device 510 (step 1830 ). Each of these steps will be discussed in more detail below in reference to specific exemplary implementations that rely on social media activity.
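The three-step flow of FIG. 18 can be sketched in Python. All function names and the message format below are illustrative assumptions for the sketch, not part of the disclosure:

```python
# Sketch of FIG. 18: monitor social media (step 1810), process the
# monitored activity (step 1820), and provide a synched timeline event
# on the second screen (step 1830).

def monitor(feed, keywords):
    """Step 1810: collect messages mentioning the primary content."""
    return [m for m in feed if any(k in m.lower() for k in keywords)]

def process(messages):
    """Step 1820: curate the monitored messages (here, simply de-duplicate)."""
    seen, curated = set(), []
    for m in messages:
        if m not in seen:
            seen.add(m)
            curated.append(m)
    return curated

def provide(curated, playback_position):
    """Step 1830: package curated messages as an event synched to the
    primary content's current playback position (in seconds)."""
    return {"time": playback_position, "messages": curated}

feed = ["Loving #ijkcs!", "unrelated post", "Loving #ijkcs!"]
event = provide(process(monitor(feed, ["#ijkcs"])), playback_position=95.0)
```

As the later sections note, each step could equally run on the second screen device, a server, or a third-party service; the division of labor is an implementation choice.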
- Second screen applications designed to support particular broadcasts have been developed to extend the branded experience. These second screen applications may include a relevant social feed by filtering messages by a hashtag or other keyword. Even with this filtering in place, the message count can quickly become unwieldy for popular events (e.g. Super Bowl XLVI, where Tweets per second peaked at 10,245). What is disclosed herein is a mechanism to identify and surface relevant social messages for the user to see, without the user being inundated with large numbers of messages to scan.
- This disclosure offers an approach that curates social messages, which are then offered to the user with timing appropriate to a second screen application that may not be entirely focused on the social messaging aspect. This is valuable for second-screen applications that want to integrate social messaging with other aspects of the media experience, such as bonus content, trivia, and advertising.
- curated social information can be interleaved with other “timeline events”.
- For example, an actor trivia “timeline event” may be presented in conjunction with a character's appearance on screen.
- a “Social Quote Event” may be timed for display just after an intense action scene. This latter event displays only the high-level/relevant social messages based on targeting or frequency. This provides the user a sense of what is being communicated in the social network while not requiring the user to scan through hundreds or even thousands of messages. An example of this can be seen in FIG. 19 .
- FIG. 19 depicts a wireframe screenshot 1900 showing a “Social Quote” event panel 1910 as part of the timeline of other events 1920 .
- the panel 1910 provides a text area 1912 displaying the relevant curated social quotes.
- one or more additional buttons are provided.
- Button 1914 provides the ability to post or in this case re-tweet a given quote.
- First, social media activity is monitored (step 1810 ). This involves looking for keywords, hashtags, or the like that include the name of the content being displayed on the first screen device 510 , the names of the actors, the name of the director, or other related information, using techniques and methodologies that are well known in the art. Monitoring of social media can be performed on the second screen device 520 , on a server 530 , by a third party service, or a combination thereof.
- The step of processing (step 1820 ) is where the data is curated to provide only the most relevant comments or messages.
- the curated “Social Quote” the following heuristics may be applied:
- social messages can be “sponsored” so that they automatically gain relative importance above the crowd of other messages. For example, social media comments that were posted from a second screen application may be given priority. A further refinement would be along the lines of targeted advertising: the importance of sponsored messages can be weighted against user-specific criteria to achieve their relative ranking amongst other messages.
- This processing or curating can be performed on the second screen device 520 , on a server 530 , by a third party service, or a combination thereof.
- One or a few social messages thus ranked can be displayed to the user to provide targeted and other “significant” messaging without having to resort to scanning ever-flowing lists of social messages (step 1830 ).
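As a rough illustration of such ranking (the disclosure does not specify exact heuristics, so the scoring below, including the field names and the size of the sponsored boost, is an assumption):

```python
# Hypothetical ranking for step 1820: score each message by frequency
# (retweet count) and boost "sponsored" messages above the crowd, then
# keep only the top few for the "Social Quote" event.

def rank_messages(messages, sponsored_boost=100, top_n=2):
    def score(msg):
        s = msg.get("retweets", 0)
        if msg.get("sponsored"):
            s += sponsored_boost  # sponsored messages gain relative importance
        return s
    return sorted(messages, key=score, reverse=True)[:top_n]

messages = [
    {"text": "Great scene!", "retweets": 5},
    {"text": "Buy the soundtrack", "retweets": 1, "sponsored": True},
    {"text": "meh", "retweets": 0},
]
top = rank_messages(messages)
# The sponsored message and the most-retweeted organic message survive.
```

In a targeted-advertising refinement, the `sponsored_boost` would itself be computed from user-specific criteria rather than being a constant.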
- “Social Quote” events can be distributed throughout the timeline such that regular exposure to social networks is purposefully interleaved with other event types for a thoughtfully designed experience.
- a key to effective advertising is to deliver relevant offers at the right time.
- Current techniques involve extensive data collection on user behavior (e.g. Google tracking your searches, application use, viewing habits, etc.) which is then used to select advertisements that best suit the user's profile.
- a great deal of specificity can be derived using public and private data to develop these personalization profiles.
- Social messaging may indeed be used as a source for personalization, but may be handled independently from other activities that the user may be engaged in simultaneously. For example, a user may tweet about a particular movie, and a back-end system in the cloud can record the interest in that movie. This data could then be used the next time the user visits a website that takes advantage of this data to deliver advertisements. While this may potentially result in higher advertisement relevance to the user during the subsequent visit, the context of the original social message is lost.
- Second-screen applications offer a known environment from which social messages can be monitored to enhance advertising personalization.
- An application on a second screen device 520 can be designed to display scheduled “events” in sync with primary video playback. Events can represent trivia, social content, voting, bonus material, advertisements, etc., that are timed for display at relevant points in the primary video playback.
- the context that the second-screen application (and device) provides is rich, including:
- the disclosed embodiments interleave “events” to support an overall experience. Some of these event types afford a social interaction element such as voting or can offer social messaging anytime throughout the presentation. Subsequent “events” can be reserved for advertising and can react to previous activity to include social messaging. This can be accomplished by identifying keywords within the social message itself and providing that to an advertising service along with other relevant information such as location, metadata associated with the current media time, etc.
- the application consolidates all these variables and offers advertisements within the context of the overall experience.
- a James Bond film could be playing on a first screen device and a location such as Hong Kong is displayed on screen.
- the second-screen app displays a supporting media item such as trivia about the location.
- the user could send a social message (tweet) to describe her desire to go on a vacation.
- the word “vacation” is parsed by the application in preparation for later advertising display. Later in the second-screen timeline an advertisement “event” focusing on a vacation offer is displayed.
- the monitoring takes place when it is detected that the user has sent a social message including the keyword “vacation.”
- the monitoring is performed on the second screen device 520 , but it could also be performed at the server 530 , by a third party service, or a combination thereof.
- the specific advertisement is selected from an ad service by submitting the keyword “vacation”. Additional information such as the specifically known location in the movie can be derived from metadata associated with the movie at the particular time of the tweet. Additionally, the advertisement can recommend a local travel agent using the user's location information.
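The “vacation” example above can be sketched as follows. The keyword list, metadata fields, and query format are illustrative assumptions; the disclosure only specifies that a keyword from the message is combined with media metadata and user location for the ad service:

```python
# Sketch of steps 1810-1820 for dynamic advertising: parse a keyword
# from the user's social message, combine it with metadata for the
# current media time (the on-screen location) and the user's location,
# and build an ad-service query.

AD_KEYWORDS = {"vacation", "travel", "holiday"}  # illustrative list

def keywords_from_message(text):
    """Return any ad-relevant keywords found in the message text."""
    return {w.strip(".,!?").lower() for w in text.split()} & AD_KEYWORDS

def build_ad_query(message, media_metadata, user_location):
    kws = keywords_from_message(message)
    if not kws:
        return None  # nothing to advertise against
    return {
        "keywords": sorted(kws),
        "scene_location": media_metadata.get("location"),
        "user_location": user_location,
    }

query = build_ad_query(
    "I want a vacation like this!",
    media_metadata={"location": "Hong Kong"},  # from movie metadata at tweet time
    user_location="Los Angeles",
)
```

The resulting query would be held until a later advertisement “event” slot in the second-screen timeline, so the offer appears within the context of the overall experience.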
- the processing can be performed on the second screen device 520 , but it could also be performed at the server 530 , by a third party service, or a combination thereof.
- the advertisement can be displayed on the second screen device (Step 1830 ).
- An example of this can be seen in FIG. 20 .
- FIG. 20 depicts a wireframe screenshot 2000 showing a dynamic advertising event panel 2010 as part of the timeline of other events 2020 .
- the panel 2010 provides a text area 2012 displaying the relevant advertising offers.
- Other possible features and implementation will be apparent to one skilled in the art given the benefit of this disclosure.
- the advertisement thus becomes a natural part of the “conversation” all within the context of the media consumption experience.
- Social messages are typically ephemeral and relevant only to the present moment in time.
- a common convention for associating a message to a particular event or topic is to use hashtags. These hashtags provide a means to filter for social messages of interest. Filtering for these hashtags provides a “real-time” view of messages on the topic.
- This disclosure takes this notion a step further and describes a mechanism for associating a message with a specific point in time relative to the start of media content (such as video).
- Social messages employing these techniques can be associated to specific points in time within the particular content. For example, social messages can be associated with an opening scene that sparks social commentary and later for other arbitrary points in the media timeline. This example becomes even more relevant when a piece of media is replayed at a later time.
- hashtags are used to add a timestamp.
- the current practice of using hashtags is well known.
- the present disclosure provides a mechanism that can be implemented without custom extensions to the social messaging protocol.
- the encoded time offset is sent as part of the message itself. It is possible to obviate this approach by providing specific metadata that is not typically displayed as part of the social message itself.
- FIG. 21 depicts exemplary social messages 2100 that include a hashtag identifying the media 2110 (this is common use today) with the addition of another hashtagged string of characters encoding the time offset 2120 .
- the hashtag #ijkcs ( 2110 ) is used to identify Indiana Jones and the Kingdom of the Crystal Skull.
- the hashtag #1054675 ( 2120 ) is a checksummed encoding of the frame offset.
- FIG. 22 depicts an exemplary flowchart of a method for creating time-stamped social messages.
- the method begins at block 2210 . It is then determined that a user desires to create a social message (block 2220 ). In this embodiment, this occurs when the user selects a comment button provided as part of the application running on the second screen device 520 . A timestamp of the position of the playback is then created (block 2230 ). In this example, this involves generating a checksum. To create the checksum the application receives positioning information directly if the media is being played back within the application or from other methods that provide position information from external sources. This numeric data is then checksummed. The user can then enter the text for the social message and request it be sent (block 2240 ).
- the timestamp is then added to the message and the message is sent or otherwise committed to the social network (block 2250 ).
- this involves appending the checksum to the social message text.
- the checksum is used to ensure integrity of the data and to allow a consuming application to ignore bad position data that might be maliciously or inadvertently created. This ends the method (block 2260 ).
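The creation side (blocks 2230-2250) can be sketched as follows. The disclosure does not specify the checksum scheme, so the one here (last digit equals the digit sum of the frame offset modulo 10) is purely an illustrative assumption, as are the function names:

```python
# Sketch of FIG. 22: encode the playback position as a checksummed
# hashtag and append it to the social message alongside the media tag.

def checksum(frame_offset):
    """Single-digit checksum: digit sum of the offset, modulo 10 (assumed)."""
    return sum(int(d) for d in str(frame_offset)) % 10

def timestamp_hashtag(frame_offset):
    """Block 2230: encode the position plus its checksum as one hashtag."""
    return f"#{frame_offset}{checksum(frame_offset)}"

def compose_message(text, media_tag, frame_offset):
    """Blocks 2240-2250: append media tag and timestamp to the message text."""
    return f"{text} {media_tag} {timestamp_hashtag(frame_offset)}"

msg = compose_message("Great stunt!", "#ijkcs", 105467)
```

Because the timestamp travels inside the message body as an ordinary hashtag, no custom extension to the social messaging protocol is required, matching the mechanism described above.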
- FIG. 23 depicts an exemplary flowchart of a method for consuming or otherwise reading and decoding a time-stamped social message.
- the method includes receiving a social message pertaining to the particular content in question, in this case, the content being displayed on the first screen device 510 (block 2320 ).
- this involves the selection of a message based on the hashtag representing the video of interest (basic search on hashtag).
- the position information (timestamp) can then be extracted (block 2330 ).
- this involves the application looking for an additional hashtag that immediately follows, which is processed as an encoded position. If the position information is found (block 2340 ), the position information is separated into a position and a checksum. The position is confirmed against the checksum (block 2350 ).
- the position can then be used to associate the social message to a particular time within the video and the message information can be displayed synched with the display on the first screen device 510 (block 2360 ). If the position information cannot be confirmed, the message can be discarded or displayed without being synched with the display on the first screen device 510 (block 2370 ). This ends the method (block 2380 ).
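The consuming side (blocks 2320-2370) can be sketched as follows. It assumes an illustrative checksum scheme (last digit equals the digit sum of the position modulo 10); the disclosure does not specify the actual encoding, and the parsing details here are likewise assumptions:

```python
# Sketch of FIG. 23: locate the media hashtag, read the hashtag that
# immediately follows it as an encoded position, split off the checksum
# digit, and verify it. Bad or missing position data yields None, so the
# message can be discarded or shown unsynched (block 2370).

def parse_position(message, media_tag):
    words = message.split()
    if media_tag not in words:
        return None  # message not about this media
    idx = words.index(media_tag)
    if idx + 1 >= len(words) or not words[idx + 1].startswith("#"):
        return None  # no timestamp hashtag follows (block 2340 fails)
    encoded = words[idx + 1][1:]
    if not encoded.isdigit() or len(encoded) < 2:
        return None
    position, check = int(encoded[:-1]), int(encoded[-1])
    # Block 2350: confirm the position against the checksum.
    if sum(int(d) for d in str(position)) % 10 != check:
        return None
    return position

pos = parse_position("Great stunt! #ijkcs #1054673", "#ijkcs")  # valid timestamp
bad = parse_position("Great stunt! #ijkcs #1054679", "#ijkcs")  # corrupted digit
```

A verified position can then be mapped to a time within the video for synched display (block 2360); a `None` result corresponds to the discard/unsynched path.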
- a separate service can monitor and record real-time feeds of social messages of interest (say for a particular television broadcast).
- This service could use known broadcast schedules and correlate the expected time of the television event with the time the messages are seen in real-time.
- the service then records the social message with the appended position information.
- the resulting repository of time-stamped messages is then accessed in lieu of a direct connection to the social network.
- origin information for the social messages can be used to correlate to specific regions with specific broadcast times.
- the presented graphical representation, referred to here as a Social Heatmap, organizes social information to correspond with the timeline of a media item such as content being displayed on a first screen device 510 .
- the social message timestamp can be used to make the correlation.
- messages need to be stamped with the time relative to the start of the media. This can be done within the application since the relative time of the media playback can be acquired via syncing mechanisms.
- the sync time is added to the social message which is then made available via a social network to others using an application that is aware of the time information. In this way each social message has a relative media timestamp which aids in visual placement on the screen when displaying the information on a second screen device 520 .
- FIG. 24 depicts an exemplary flowchart of a method for consuming or otherwise reading and decoding a time-stamped social message.
- the method includes receiving a social message pertaining to the particular content in question, in this case, the content being displayed on the first screen device 510 (block 2440 ). This can involve the selection of a message based on the hashtag representing the video of interest (basic search on hashtag) as discussed above, or it could be based on keywords in the messages themselves. Timestamp information associated with the message is then looked for, and messages without associated timestamp information are discarded (block 2430 ). In certain embodiments this involves the application looking for an additional hashtag that immediately follows and is processed as an encoded position.
- the messages can be allocated to bins (block 2450 ). This process is discussed in more detail below.
- a graphical representation of the social messages associated with specific time periods of the primary content on the first screen device 510 can then be displayed on the second screen device 520 while the primary content is being displayed on the first screen device 510 .
- the graphical representation of the social messages associated with specific time periods also graphically represents the intensity or frequency (i.e. the number) of social messages associated with the specific periods of time, and as such is referred to as a heatmap.
- the user can then use the heatmap to navigate through the content on the first screen and the associated social messages (block 2460 ). Such navigation is discussed in more detail below. This ends the method (block 2370 ).
- Bins are a mechanism for grouping messages into discrete sections of time.
- the number of messages in each bin corresponds to the activity level for that section of time. It has been found that the screen width of each bin should be wide enough to afford navigation (e.g. using a touch device) but small enough to provide navigation resolution. An example of this can be seen in FIGS. 25 and 26 .
- a tablet application allocates 1000 pixels of width for the heatmap 2500 . These 1000 pixels represent the entirety of the media playback time, say 50 minutes. This allows approximately 20 pixels of screen width per minute of content. If we use a bin width of 20 pixels then our navigation resolution will be to the nearest minute and the sensitivity for navigation will require a move of 20 pixels from bin to bin.
- FIG. 26 depicts an exemplary plot of sample count for values allocated to bins for rendering. The number of messages in a bin may be graphically represented using color, plots, or other indicators. The actual sensitivity that should be used will depend on the input device, user demographics and other factors.
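The bin arithmetic above (1000 pixels over 50 minutes at 20 pixels per bin, i.e. one-minute bins) can be sketched directly; the function and constant names are illustrative:

```python
# Sketch of block 2450: allocate message timestamps (seconds from media
# start) into heatmap bins. With 1000 px of width, 20 px per bin, and a
# 50-minute programme, there are 50 bins of 60 seconds each; the per-bin
# counts drive the heatmap's color or peak height.

HEATMAP_WIDTH_PX = 1000
BIN_WIDTH_PX = 20
MEDIA_LENGTH_S = 50 * 60

NUM_BINS = HEATMAP_WIDTH_PX // BIN_WIDTH_PX   # 50 bins
BIN_LENGTH_S = MEDIA_LENGTH_S // NUM_BINS     # 60 s of content per bin

def allocate_to_bins(timestamps_s):
    bins = [0] * NUM_BINS
    for t in timestamps_s:
        if 0 <= t < MEDIA_LENGTH_S:           # ignore out-of-range timestamps
            bins[int(t) // BIN_LENGTH_S] += 1
    return bins

bins = allocate_to_bins([5, 30, 59, 60, 2999])
# 5 s, 30 s, 59 s fall in bin 0; 60 s in bin 1; 2999 s in the last bin.
```

With this layout, navigation resolution is one minute, and moving from bin to bin requires a 20-pixel gesture, matching the sensitivity trade-off described above.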
- FIG. 27 depicts an exemplary wireframe screenshot 2700 including a social heatmap or timeline.
- event panels 2710 are displayed as part of a timeline view on the second screen device 520 , which a user can scroll through as described previously.
- a Comment button 2720 is provided to allow the user to send social messages regarding content being displayed on the first 510 or second 520 screen devices.
- Social messages generated using such a function can include content and timestamp information for processing and inclusion in the heatmap 2750 .
- Sync indicator button 2730 allows the user to synchronize the events of the timeline view with the content being displayed on the first screen device 510 .
- Button 2730 can also indicate the status of the synchronization.
- At the bottom of the screenshot 2700 there is also a playback position indicator 2740 and the social heatmap 2750 .
- the playback position indicator graphically displays the current position in the playback of the primary content on the first screen device 510 .
- a user can adjust the indicator to change the current playback position in the primary content being displayed on the first screen device 510 .
- the social heatmap 2750 has bins representing social messages associated with time periods in the playback of the primary content. Selecting bins, for example by sliding or “scrubbing” along the social heatmap 2750 causes the associated social messages to be displayed. Examples of this can be seen in FIGS. 28-32 .
- FIG. 28 depicts the screenshot 2700 of FIG. 27 with a pop-over panel 2800 that displays the social messages associated with a bin 2810 in the social heatmap 2750 when a specific bin is selected by a user, as represented by icon 2820 .
- FIG. 29 depicts one embodiment of how messages can be graphically transitioned between (as indicated by arrow 2900 ) in the pop-over panel as a user scrolls or scrubs along the heatmap 2750 (as indicated by icon 2820 ).
- FIG. 30 indicates one possible embodiment of a panel 3000 that can be displayed if there are no social messages associated with a bin 2810 .
- FIG. 31 depicts how multiple messages can be scrolled through within a panel 3100 .
- messages can be scrolled through vertically as indicated by arrows 3110 .
- a scroll bar indicator 3120 is provided to indicate that there are multiple messages to be scrolled through.
- FIG. 32 depicts another embodiment wherein selecting individual messages within the panel 3200 provides the user with additional functionality.
- selecting a message provides additional buttons that allow the user to resend (“re-tweet”) a message 3210 or go to the specific instance in the playback of the content on the first screen device 510 with which the message is associated.
- FIG. 33 depicts a skinned version screen 3300 of the screen 2700 of FIG. 27 .
- color and peaks are used to graphically indicate the intensity of social messages along the heatmap 3310 .
- processor or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read only memory (“ROM”) for storing software, random access memory (“RAM”), and nonvolatile storage.
- any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- Computer program or application in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
Abstract
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 61/584,134, filed Jan. 6, 2012, which is incorporated by reference herein in its entirety.
- This application is also related to the applications entitled: “METHODS AND SYSTEMS FOR SYNCHRONIZING CONTENT ON A SECOND SCREEN”, “METHOD AND SYSTEM FOR SYNCHING SOCIAL MESSAGES WITH CONTENT TIMELINE”, “METHOD AND SYSTEM FOR PROVIDING A DISPLAY OF SOCIAL MESSAGES ON A SECOND SCREEN WHICH IS SYNCHED WITH CONTENT ON A FIRST SCREEN”, “ALTERNATE VIEW PLAYBACK ON A SECOND SCREEN”, and “METHOD AND SYSTEM FOR PROVIDING DYNAMIC ADVERTISING ON SECOND SCREEN BASED ON SOCIAL MESSAGES”, which have been filed concurrently and are incorporated by reference herein in their entirety.
- 1. Technical Field
- The present invention generally relates to providing additional content related to displayed content.
- 2. Description of Related Art
- Social messaging is becoming a ubiquitous feature across various software applications. One problem this presents is that the sheer quantity of messages can become overwhelming especially for popular topics. At some point the individual messages get lost in the crowd and the quantity of messages becomes the interesting social aspect. This disclosure describes a way to visualize the volume of social activity over time which can then be used to identify interesting points in time for media content and also be used to navigate to those points in time.
- Many second-screen applications provide a social message “feed” that simply streams social messages as they happen. There is little organization of this information beyond displaying the most recent at the top of the list or perhaps the notion of a “promoted” message that advertisers use to keep their message at the top of the stack. Messages are quickly replaced by new messages sometimes faster than a user can scan them. Once the messages have been buried in the stack their effective relevance to time is diminished.
- The presented graphical representation, referred to here as a Social Heatmap, organizes social information to correspond with the timeline of a media item such as content being displayed on a first screen device.
- In accordance with one embodiment, a method for providing a graphical representation of social messages on a second screen relating to content displayed on a first screen is provided. The method involves monitoring social media for one or more messages relating to content being displayed on a first screen, processing the one or more messages relating to content being displayed on the first screen device to associate each message with a time segment of the content being displayed on the first screen, and providing, on the second screen, a graphical representation of the one or more social messages associated with one or more time segments of the content being displayed on the first screen.
- In accordance with another embodiment, a second screen device capable of displaying a graphical representation of social messages associated with content being displayed on a first screen is provided. The second screen device includes a screen, storage, and a processor. The screen is configured to display content. The storage is for storing data. The processor is configured to monitor social media for one or more messages relating to content being displayed on a first screen; process the one or more messages relating to the content being displayed on the first screen device to associate each message with a time segment of the content being displayed on the first screen; and provide a graphical representation of the one or more social messages associated with one or more time segments of the content being displayed on the first screen for display on the screen of the second screen device.
-
FIG. 1 is a system diagram outlining the delivery of video and audio content to the home in accordance with one embodiment. -
FIG. 2 is a system diagram showing further detail of a representative set top box receiver. -
FIG. 3 is a diagram depicting a touch panel control device in accordance with one embodiment. -
FIG. 4 is a diagram depicting some exemplary user interactions for use with a touch panel control device in accordance with one embodiment. -
FIG. 5 is a system diagram depicting one embodiment of a system for implementing techniques of the present invention. -
FIG. 6 is a flow diagram depicting an exemplary process in accordance with one embodiment. -
FIG. 7 is a diagram depicting an exemplary methodology of synching between devices in accordance with one embodiment. -
FIG. 8 is a diagram depicting an exemplary methodology of synching between devices in accordance with one embodiment. -
FIGS. 9A-9F are exemplary skeletal screen views depicting features in accordance with one embodiment when used in passive mode. -
FIGS. 10A-10D are exemplary skeletal screen views depicting features in accordance with one embodiment when used in active mode. -
FIGS. 11A-11C are exemplary skeletal views depicting a social media sharing feature in accordance with one embodiment. -
FIGS. 12A and 12B are exemplary skeletal views depicting content selection features in accordance with one embodiment. -
FIGS. 13A-13E are exemplary skeletal views depicting additional features in accordance with one embodiment. -
FIGS. 14A-14L are exemplary skinned screen views depicting how certain features could appear to a user. -
FIG. 15 is an exemplary skeletal view depicting social media features in accordance with one embodiment. -
FIG. 16 is a flow diagram depicting the functionality of social media features in accordance with one embodiment. -
FIG. 17 is an exemplary skinned screen view depicting social media features in accordance with one embodiment. -
FIG. 18 is a flow diagram providing a general methodology for providing content on a second screen based on social messages regarding content being displayed on a first screen in accordance with one embodiment. -
FIG. 19 is an exemplary view of a social quote event in accordance with one embodiment. -
FIG. 20 is an exemplary view of dynamic advertising on a second screen based on social messages in accordance with one embodiment. -
FIG. 21 is an exemplary view of how a social message can be associated with content in accordance with one embodiment. -
FIG. 22 is a flow diagram providing a methodology for generating a social message associated with content in accordance with one embodiment. -
FIG. 23 is a flow diagram providing a methodology for processing a social message associated with content in accordance with one embodiment. -
FIG. 24 is a flow diagram providing a methodology for providing a graphical representation of social messages associated with time periods of content in accordance with one embodiment. -
FIG. 25 is an exemplary representation of bins that make up a graphical representation of social messages associated with time periods of content in accordance with one embodiment. -
FIG. 26 is an exemplary representation of the association of messages to the bins that make up a graphical representation of social messages associated with time periods of content in accordance with one embodiment. -
FIG. 27 is an exemplary skeletal view depicting a graphical representation of social messages associated with time periods of content in accordance with one embodiment. -
FIG. 28 is an exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with one embodiment. -
FIG. 29 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment. -
FIG. 30 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment. -
FIG. 31 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment. -
FIG. 32 is another exemplary view of the operation of a graphical representation of social messages associated with time periods of content in accordance with another embodiment. -
FIG. 33 is an exemplary skinned screen view depicting a graphical representation of social messages associated with time periods of content in accordance with one embodiment. - Turning now to
FIG. 1, a block diagram of an embodiment of a system 100 for delivering content to a home or end user is shown. The content originates from a content source 102, such as a movie studio or production house. The content may be supplied in at least one of two forms. One form may be a broadcast form of content. The broadcast content is provided to the broadcast affiliate manager 104, which is typically a national broadcast service, such as the American Broadcasting Company (ABC), National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), etc. The broadcast affiliate manager may collect and store the content, and may schedule delivery of the content over a delivery network, shown as delivery network 1 (106). Delivery network 1 (106) may include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 (106) may also include local content delivery using local delivery systems such as over the air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to a receiving device 108 in a user's home, where the content will subsequently be searched by the user. It is to be appreciated that the receiving device 108 can take many forms and may be embodied as a set top box/digital video recorder (DVR), a gateway, a modem, etc. Further, the receiving device 108 may act as an entry point, or gateway, for a home network system that includes additional devices configured as either client or peer devices in the home network. - A second form of content is referred to as special or additional content. Special or additional content may include content delivered as premium viewing, pay-per-view, or other content otherwise not provided to the broadcast affiliate manager, e.g., movies, video games or other video elements. In many cases, the special content may be content requested by the user. The special content may be delivered to a
content manager 110. The content manager 110 may be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service. The content manager 110 may also incorporate Internet content into the delivery system. The content manager 110 may deliver the content to the user's receiving device 108 over a separate delivery network, delivery network 2 (112). Delivery network 2 (112) may include high-speed broadband Internet type communications systems. It is important to note that the content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 (112) and content from the content manager 110 may be delivered using all or parts of delivery network 1 (106). In addition, the user may also obtain content directly from the Internet via delivery network 2 (112) without necessarily having the content managed by the content manager 110. - Several adaptations for utilizing the separately delivered additional content may be possible. In one possible approach, the additional content is provided as an augmentation to the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc. In another embodiment, the additional content may completely replace some programming content provided as broadcast content. Finally, the additional content may be completely separate from the broadcast content, and may simply be a media alternative that the user may choose to utilize. For instance, the additional content may be a library of movies that are not yet available as broadcast content.
- The receiving
device 108 may receive different types of content from one or both of delivery network 1 and delivery network 2. The receiving device 108 processes the content, and provides a separation of the content based on user preferences and commands. The receiving device 108 may also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the receiving device 108 and features associated with playing back stored content will be described below in relation to FIG. 2. The processed content is provided to a display device 114. The display device 114 may be a conventional 2-D type display or may alternatively be an advanced 3-D display. - The receiving
device 108 may also be interfaced to a second screen such as a touch screen control device 116. The touch screen control device 116 may be adapted to provide user control for the receiving device 108 and/or the display device 114. The touch screen device 116 may also be capable of displaying video content. The video content may be graphics entries, such as user interface entries, or may be a portion of the video content that is delivered to the display device 114. The touch screen control device 116 may interface to the receiving device 108 using any well known signal transmission system, such as infra-red (IR) or radio frequency (RF) communications, and may include standard protocols such as the infra-red data association (IRDA) standard, Wi-Fi, Bluetooth and the like, or any other proprietary protocols. In some embodiments, the touch screen control device 116 can be interfaced directly with delivery networks 1 and 2. The operation of the touch screen control device 116 will be described in further detail below. - In the example of
FIG. 1, the system 100 also includes a back end server 118 and a usage database 120. The back end server 118 includes a personalization engine that analyzes the usage habits of a user and makes recommendations based on those usage habits. The usage database 120 is where the usage habits for a user are stored. In some cases, the usage database 120 may be part of the back end server 118. In the present example, the back end server 118 (as well as the usage database 120) is connected to the system 100 and accessed through delivery network 2 (112). - Turning now to
FIG. 2, a block diagram of an embodiment of a receiving device 200 is shown. Receiving device 200 may operate similar to the receiving device described in FIG. 1 and may be included as part of a gateway device, modem, set-top box, or other similar communications device. The device 200 shown may also be incorporated into other systems including an audio device or a display device. In either case, several components necessary for complete operation of the system are not shown in the interest of conciseness, as they are well known to those skilled in the art. - In the
device 200 shown in FIG. 2, the content is received by an input signal receiver 202. The input signal receiver 202 may be one of several known receiver circuits used for receiving, demodulating, and decoding signals provided over one of the several possible networks including over the air, cable, satellite, Ethernet, fiber and phone line networks. The desired input signal may be selected and retrieved by the input signal receiver 202 based on user input provided through a control interface or touch panel interface 222. Touch panel interface 222 may include an interface for a touch screen device. Touch panel interface 222 may also be adapted to interface to a cellular phone, a tablet, a mouse, a high end remote or the like. - The decoded output signal is provided to an
input stream processor 204. The input stream processor 204 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream. The audio content is provided to an audio processor 206 for conversion from the received format, such as a compressed digital signal, to an analog waveform signal. The analog waveform signal is provided to an audio interface 208 and further to the display device or an audio amplifier. Alternatively, the audio interface 208 may provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or an alternate audio interface such as the Sony/Philips Digital Interconnect Format (SPDIF). The audio interface may also include amplifiers for driving one or more sets of speakers. The audio processor 206 also performs any necessary conversion for the storage of the audio signals. - The video output from the
input stream processor 204 is provided to a video processor 210. The video signal may be one of several formats. The video processor 210 provides, as necessary, a conversion of the video content based on the input signal format. The video processor 210 also performs any necessary conversion for the storage of the video signals. - A
storage device 212 stores audio and video content received at the input. The storage device 212 allows later retrieval and playback of the content under the control of a controller 214 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 216 and/or touch panel interface 222. The storage device 212 may be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM) or dynamic RAM (DRAM), or may be an interchangeable optical disk storage system such as a compact disk (CD) drive or digital video disk (DVD) drive. - The converted video signal, from the
video processor 210, either originating from the input or from the storage device 212, is provided to the display interface 218. The display interface 218 further provides the display signal to a display device of the type described above. The display interface 218 may be an analog signal interface such as red-green-blue (RGB) or may be a digital interface such as HDMI. It is to be appreciated that the display interface 218 will generate the various screens for presenting the search results in a three dimensional grid as will be described in more detail below. - The
controller 214 is interconnected via a bus to several of the components of the device 200, including the input stream processor 204, audio processor 206, video processor 210, storage device 212, and a user interface 216. The controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 214 also manages the retrieval and playback of stored content. Furthermore, as will be described below, the controller 214 performs searching of content and the creation and adjusting of the grid display representing the content, either stored or to be delivered via the delivery networks described above. - The
controller 214 is further coupled to control memory 220 (e.g., volatile or nonvolatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), electronically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 214. Control memory 220 may store instructions for controller 214. Control memory may also store a database of elements, such as graphic elements containing content. The database may be stored as a pattern of graphic elements. Alternatively, the memory may store the graphic elements in identified or grouped memory locations and use an access or location table to identify the memory locations for the various portions of information related to the graphic elements. Additional details related to the storage of the graphic elements will be described below. Further, the implementation of the control memory 220 may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the memory may be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit. - The user interface process of the present disclosure employs an input device that can be used to express functions, such as fast forward, rewind, etc. To allow for this, a
touch panel device 300 may be interfaced via the user interface 216 and/or touch panel interface 222 of the receiving device 200, as shown in FIG. 3. The touch panel device 300 allows operation of the receiving device or set top box based on hand movements, or gestures, and actions translated through the panel into commands for the set top box or other control device. In one embodiment, the touch panel 300 may simply serve as a navigational tool to navigate the grid display. In other embodiments, the touch panel 300 will additionally serve as the display device allowing the user to more directly interact with the navigation through the grid display of content. The touch panel device may be included as part of a remote control device containing more conventional control functions such as actuator or activator buttons. The touch panel 300 can also include at least one camera element. In some embodiments, the touch panel 300 may also include a microphone. - Turning now to
FIG. 4, the use of a gesture sensing controller or touch screen, such as shown, provides for a number of types of user interaction. The inputs from the controller are used to define gestures and the gestures, in turn, define specific contextual commands. The configuration of the sensors may permit defining movement of a user's fingers on a touch screen or may even permit defining the movement of the controller itself in either one dimension or two dimensions. Two-dimensional motion, such as a diagonal, and a combination of yaw, pitch and roll can be used to define any three-dimensional motion, such as a swing. A number of gestures are illustrated in FIG. 4. Gestures are interpreted in context and are identified by defined movements made by the user. - Bumping 420 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left or right. The bumping gesture is associated with specific commands in context. For example, in a TimeShifting mode, a left-
bump gesture 420 indicates rewinding, and a right-bump gesture indicates fast-forwarding. In other contexts, a bump gesture 420 is interpreted to increment a particular value in the direction designated by the bump. Checking 430 is defined as in drawing a checkmark. It is similar to a downward bump gesture 420. Checking is identified in context to designate a reminder, user tag or to select an item or element. Circling 440 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished. However, to avoid confusion, a circle is identified as a single command regardless of direction. Dragging 450 is defined as an angular movement of the controller (a change in pitch and/or yaw) while pressing a button (virtual or physical) on the tablet 300 (i.e., a “trigger drag”). The dragging gesture 450 may be used for navigation, speed, distance, time-shifting, rewinding, and forwarding. Dragging 450 can be used to move a cursor, a virtual cursor, or a change of state, such as highlighting, outlining, or selecting on the display. Dragging 450 can be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is preferred to modify the response to the dragging command. For example, in some interfaces, operation in one dimension or direction is favored with respect to other dimensions or directions depending upon the position of the virtual cursor or the direction of movement. Nodding 460 is defined by two fast trigger-drag up-and-down vertical movements. Nodding 460 is used to indicate “Yes” or “Accept.” X-ing 470 is defined as in drawing the letter “X.” X-ing 470 is used for “Delete” or “Block” commands. Wagging 480 is defined by two trigger-drag fast back-and-forth horizontal movements. The wagging gesture 480 is used to indicate “No” or “Cancel.” - Depending on the complexity of the sensor system, only simple one dimensional motions or gestures may be allowed.
For instance, a simple right or left movement on the sensor as shown here may produce a fast forward or rewind function. In addition, multiple sensors could be included and placed at different locations on the touch screen. For instance, a horizontal sensor for left and right movement may be placed in one spot and used for volume up/down, while a vertical sensor for up and down movement may be placed in a different spot and used for channel up/down. In this way specific gesture mappings may be used.
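The context-dependent interpretation of gestures described above amounts to a lookup from a (gesture, context) pair to a command. A minimal sketch, assuming hypothetical gesture, context, and command names (none of these identifiers come from the disclosure):

```python
# Map (gesture, context) pairs to commands; a None context entry is the
# context-free fallback meaning of the gesture.
GESTURE_COMMANDS = {
    ("bump_left", "timeshift"): "rewind",
    ("bump_right", "timeshift"): "fast_forward",
    ("nod", None): "accept",
    ("x", None): "delete",
    ("wag", None): "cancel",
}

def interpret(gesture, context=None):
    """Resolve a gesture in its current context, falling back to the
    context-free meaning when no context-specific mapping exists."""
    return (GESTURE_COMMANDS.get((gesture, context))
            or GESTURE_COMMANDS.get((gesture, None)))
```

Keeping the table data-driven mirrors the text's point that the same physical gesture can carry different meanings in different modes.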
- The system and methodology can be implemented in any number of ways depending on the hardware and the content involved. Examples of such deployments include DVD, Blu-ray Disc (BD), streaming video or video on demand (VOD), and broadcast (satellite, cable, over the air). Each of these deployments would have a different architecture, but one could standardize the triggers for each of these events (the additional content) that represent what would be queued by the application running on the second screen. For example, event A and event B would be triggered by a synching mechanism associated with any of these sources of a video. When the second screen device (e.g., a tablet) encounters “event A”, the program running on it will enact “event A”. Similarly, if “event B” is encountered, the program running on the second screen device would do “event B”.
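The standardized triggers described above could be realized as a small dispatch table shared across deployments: the content source emits an event identifier, and the second-screen application looks up and enacts the corresponding action. A sketch, with hypothetical event identifiers and actions (the disclosure does not specify any concrete registry):

```python
# Hypothetical registry mapping standardized trigger IDs to the actions
# the second-screen application performs when each event is encountered.
EVENT_ACTIONS = {
    "event_a": lambda: "display cast biography",
    "event_b": lambda: "display trivia card",
}

def on_trigger(event_id):
    """Enact the action registered for a trigger; ignore unknown IDs so
    a newer disc or stream cannot break an older application."""
    action = EVENT_ACTIONS.get(event_id)
    return action() if action is not None else None
```

Because only the trigger IDs are standardized, the same application logic works whether the events arrive from a disc, a stream, or a broadcast.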
-
FIG. 5 depicts a generic system 500 on which such methodology could be implemented. Here the system 500 includes a first screen device 510, a second screen device 520, a playback device 530, a network 540 and a server 550. Each of these elements will be discussed in more detail below. - The
first screen device 510 is a display device, such as display device 114 described above in relation to FIG. 1, for displaying content such as television programs, movies, and websites. Examples of such first screen display devices include, but are not limited to, a television, monitor, projector, or the like. The first screen device 510 is connected to the playback device 530, which can provide the primary content to the first screen device 510 for display. Examples of such communication include, but are not limited to, HDMI, VGA, DisplayPort, USB, component, composite, radio frequency (RF), infrared (IR), and the like. In certain embodiments, the first screen display device 510 may be connected to the network 540, in either a wired or wireless (WiFi) manner, providing additional connection to the second screen device 520 and server 550. In some embodiments, the first display device 510 may include the functionality of the playback device 530. In still other embodiments, the first screen display device 510 may be in non-networked communication 560 with the second screen device 520. Examples of such non-networked communication 560 include, but are not limited to, RF, IR, Bluetooth, and other audio communication techniques and protocols. - The
second screen device 520 is a device capable of displaying additional content related to the primary content being displayed on the first screen device 510. The second screen device may be a touch screen control device 116 or touch screen device 300 as described above. Examples of second screen devices include, but are not limited to, a smart phone, tablet, laptop, personal media player (e.g., iPod), or the like. The second screen device 520 is in communication with the playback device 530 using either the network 540, non-networked communication 560, or both. The second screen device 520 is also in communication with the server 550 via the network 540 for requesting and receiving additional content related to the primary content being displayed on the first screen device 510. In some embodiments, the second screen device 520 may be in networked or non-networked communication 560 with the first screen device 510. Examples of such non-networked communication 560 include, but are not limited to, RF, IR, Bluetooth (BT), audio communication techniques and protocols, or the like. - The
playback device 530 is a device capable of providing primary content for display on the first screen device 510. Examples of such playback devices include, but are not limited to, a DVD player, Blu-ray Disc (BD) player, game console, receiver device (cable or satellite), Digital Video Recorder (DVR), streaming device, personal computer, or the like. The playback device 530 is connected to the first screen device 510 for providing the primary content to the first screen device 510 for display. Examples of such connections include, but are not limited to, HDMI, VGA, DisplayPort, USB, component, composite, radio frequency (RF), infrared (IR), and the like. The playback device 530 is also connected to the network 540, in either a wired or wireless (WiFi) manner, providing connection to the second screen device 520 and server 550. In some embodiments, the functionality of the playback device 530 may be included in the first screen display device 510. In still other embodiments, the playback device 530 may be in non-networked communication 560 with the second screen device 520. Examples of such non-networked communication 560 include, but are not limited to, RF, IR, Bluetooth (BT), and other audio communication techniques and protocols. - The
network 540 can be a wired or wireless communication network implemented using Ethernet, MoCA, or wireless protocols, or a combination thereof. Examples of such a network include, but are not limited to, delivery network 1 (106) and delivery network 2 (112) discussed above. - The
server 550 is a content server configured to provide additional content to the second screen device 520. In certain embodiments, the server may also provide the primary content for display on the first screen device 510. The server is connected to the network 540 and can communicate with any of the devices that are also connected. Examples of such a server include, but are not limited to, content source 102, broadcast affiliate manager 104, content manager 110, and the back end server described above. -
FIG. 6 depicts a flow diagram 600 of a methodology for displaying additional content related to primary content being viewed. The method includes the following steps: displaying primary content on a first screen device 510 (step 610); providing, in association with the display of the primary content on the first screen, a synching mechanism to synch additional content (step 620); and displaying, on a second screen device 520, additional content related to the primary content on the first screen 510 that is synched to the content displayed on the first screen device according to the synching mechanism (step 630). In certain embodiments, the method also includes the steps of receiving commands from the second screen device 520 to control the display of primary content on the first screen device 510 (step 640) and controlling the display of the primary content on the first screen device 510 based on the commands received from the second screen device 520 (step 650). Each of these steps will be described in more detail below. - The step of displaying primary content (step 610), such as a movie or television show, is performed on the
first screen device 510. This involves the primary content being provided to the first screen display 510. The primary content can be provided by the playback device 530 or be received directly from a content provider at the first screen display device 510. The primary content is then shown or otherwise displayed on the first screen device 510. The display of the primary content also includes the control of the content being displayed. This can include the traditional playback commands of play, stop, pause, rewind, and fast forward as well as the navigation of on screen menus to select the content and other playback options. In certain embodiments, the display on the first screen device 510 (step 610) further includes displaying an indicator of the type of additional content being displayed on the second screen device 520. - The provided synching mechanism (step 620) can be implemented in a number of ways. In certain embodiments the synching mechanism is performed by an application running on the
second screen device 520, the playback device 530, the first screen device 510, or any combination thereof. At its most basic, the second screen device 520 is configured (via an application) to detect synching signals, cues, or other types of indicators that direct the second screen device 520 to update the display of additional content to coincide with the primary content being displayed on the first screen 510. The synching signals, cues, or other types of indicators can be provided as part of the primary content or can be generated by the playback device 530 or first screen device 510 (via an application) in accordance with the chapter, scene, time-code, subject matter, or content being displayed. The synching signals, cues, or other types of indicators can be transmitted to the second screen device 520 using the network, in either a wired or wireless (WiFi) manner, or using non-networked communication 560 such as audio signals. Examples of some of the implementations are given below. Other possible implementations will be apparent given the benefit of this disclosure. - The step of displaying the additional content, such as supplemental materials, video clips, websites, and the like (step 630) is performed on the
second screen device 520. The additional content can be stored locally on the second screen device 520 or be provided by the server 550, playback device 530, or first screen device 510. The display of the additional content is synched to the primary content being displayed on the first screen device 510 according to the synching mechanism. For example, when the second screen device 520 detects a synching signal, cue, or other type of indicator, the second screen device 520 updates the display of the additional content accordingly. In some embodiments, this further involves contacting and requesting the additional content from the server 550, playback device 530, or first screen device 510 and subsequently downloading and displaying the additional content. In some embodiments, the additional content to be displayed can be selected, modified, or omitted based on the user using the system. - In certain embodiments, the display on the second screen device 520 (step 630) further includes displaying the status of the display of the primary content on the
first screen device 510, such as whether the display of the primary content on the first screen device 510 has been paused. In certain other embodiments, the display on the second screen device 520 (step 630) further includes displaying the status of the synch between the additional content on the second screen device 520 and the primary content on the first screen device 510. - In certain embodiments, the
second screen device 520 is capable of transmitting as well as receiving. In the optional step 640, commands are received from the second screen device 520. Ideally, these commands are received at the device controlling the playback of the primary content on the first screen device 510. In certain embodiments, the playback device 530 is the device receiving the commands. The commands can be sent via the network 540 or non-networked communication 560. Once received, the commands can control the display of the primary content (step 650). Examples of such control include, but are not limited to, play, stop, pause, rewind, and fast-forward, as well as chapter and scene selection. These commands can also be used to synch the primary content displayed on the first screen device 510 with the additional content being displayed on the second screen device 520. -
FIG. 7 provides a high level overview of one example of a system 700 with a synching mechanism implemented using a non-networked communication 560. In this system 700, the non-networked communication synching mechanism is audio watermarking 710. In this example, audio watermarking 710 involves inserting a high-frequency signal, cue, or other indicator into the audio signal of the primary content being displayed on the first screen device 510. The audio watermark is inaudible to humans but can be detected by a microphone in the second screen device 520. When the second screen device 520 detects an audio watermark, the displayed additional content is updated to synch with the primary content being displayed on the first screen device 510 based on the detected watermark. The audio watermarks can be incorporated into the primary content at the source of the content or inserted locally by the playback device 530 or first screen device 510. -
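As one illustrative sketch (not the disclosure's implementation), a second screen application could detect a single high-frequency cue tone with the Goertzel algorithm, which measures the energy of one target frequency in a block of microphone samples. The 19 kHz cue frequency, sample rate, and detection threshold below are assumptions, not values from the disclosure:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the power of target_hz in the sample block (Goertzel algorithm)."""
    n = len(samples)
    k = int(0.5 + n * target_hz / sample_rate)   # nearest frequency bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def watermark_present(samples, sample_rate=44100, cue_hz=19000, threshold=1e3):
    # A cue tone near the top of the audible band is inaudible to most
    # listeners but detectable by the second screen device's microphone.
    return goertzel_power(samples, sample_rate, cue_hz) > threshold
```

In practice a real watermarking scheme would encode data (e.g., a cue identifier) rather than a bare tone, but the detection principle is the same.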
FIG. 8 provides a high-level overview of one example of a system 800 with a synching mechanism implemented using the network 540. In this system 800 the synching mechanism is wireless communication (WiFi) 810 between a playback device 530 (a Blu-Ray Disc player) and the second screen device 520 (an iOS device running an application). In the example of FIG. 8, the features and protocols of a BD-Live enabled device are used. There are two main components of this protocol: connection and communication. Both are described below. For simplicity the second screen iOS application will be referred to as the “iPad” and the BD-Live enabled device will be referred to as the “disc”. - Connection occurs when an iOS enabled
device 520 first launches the second screen application and attempts to connect to a BD-Live enabled device 530 on the same Wi-Fi network 540. -
- 1. Disc is inserted into BD player
- 2. Disc enters UDP ‘listening’ loop
- 3. iPad launches second screen application
- 4. iPad performs UDP broadcast of authentication token
- 5. Disc receives authentication token and authenticates
- 6. Disc retrieves IP from token's sender (iPad's IP)
- 7. Disc responds to authentication with its IP and PORT
- 8. iPad confirms IP and PORT
- 9. iPad closes UDP socket communication
- 10. iPad establishes direct TCP socket communication with disc based on IP and PORT provided.
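The ten connection steps above can be sketched with standard sockets. The shared token, the discovery port, and the use of loopback in place of a true UDP broadcast are all assumptions made for illustration:

```python
import socket
import threading

AUTH_TOKEN = b"secret-token"   # assumed shared secret; the token format is not specified
DISCOVERY_PORT = 40123         # illustrative discovery port
disc_ready = threading.Event()

def disc_side():
    """BD-Live device ("disc"): UDP 'listening' loop, then a direct TCP session."""
    tcp = socket.socket()
    tcp.bind(("127.0.0.1", 0))                       # OS-assigned PORT for the TCP session
    tcp.listen(1)
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.bind(("127.0.0.1", DISCOVERY_PORT))          # step 2: enter UDP 'listening' loop
    disc_ready.set()
    token, sender = udp.recvfrom(1024)               # steps 5-6: receive token, note sender's IP
    if token == AUTH_TOKEN:                          # step 5: authenticate
        ip, port = tcp.getsockname()
        udp.sendto(f"{ip}:{port}".encode(), sender)  # step 7: respond with IP and PORT
    conn, _ = tcp.accept()                           # step 10: direct TCP socket communication
    conn.sendall(b"hello from disc")
    conn.close(); tcp.close(); udp.close()

def ipad_side():
    """Second screen application ("iPad"): send token, then connect over TCP."""
    disc_ready.wait()
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(AUTH_TOKEN, ("127.0.0.1", DISCOVERY_PORT))  # step 4 (loopback stands in for broadcast)
    reply, _ = udp.recvfrom(1024)                    # step 8: confirm IP and PORT
    udp.close()                                      # step 9: close UDP socket communication
    ip, port = reply.decode().split(":")
    tcp = socket.create_connection((ip, int(port)))  # step 10: establish direct TCP socket
    data = tcp.recv(1024)
    tcp.close()
    return data

t = threading.Thread(target=disc_side)
t.start()
message = ipad_side()
t.join()
```

A production implementation would also need timeouts, retries, and a real broadcast address on the shared Wi-Fi network 540.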
- Communication occurs after a connection has been established between the second screen iOS application and a BD-Live enabled device.
-
- 1. iPad and Disc are aware of each other's IPs as well as the PORT over which communication should occur
- 2. TCP socket communication is maintained for the duration of the application's lifecycle.
- One advantage of such wireless communication, as seen in this example, is that it is bi-directional, allowing the second screen device to transmit as well as receive commands. This allows for two-way synching as well as control of playback from the
second screen device 520. - In certain embodiments, the application of the
second screen device 520 could be specific to a particular program or movie on a particular system (e.g., BD). In other embodiments, the second screen application could be generic to a studio, with available plug-ins to configure the application to a particular program or movie. In still other embodiments the second screen application could be universal across systems (BD, VOD, broadcast), content, or both. Other possible implementations and configurations will be apparent to one skilled in the art given the benefit of this disclosure. - The system can be operated with a passive approach or an interactive approach. In the passive approach, icons displayed on the
first screen device 510 prompt the user to look at the second screen device 520 for an additional content event being displayed that is related to the primary content displayed on the first screen device 510. The icon preferably indicates what type of additional content event is available on the second screen device 520 (e.g., a shopping cart icon indicates a purchase event, an “I” icon indicates an information event, a stickman icon indicates a character information event, etc.). FIGS. 9A-F depict some of the aspects that may be displayed to the user in passive mode. -
FIGS. 9A-F depict skeletal examples of what may be displayed on the screen 900 of the second screen device to a user when using an application in passive mode that provides additional content on the second screen device 520 that is synched with the primary content on the first screen device 510. -
FIG. 9A depicts a splash screen that may be displayed to the user when the application is launched. It includes the product logo and an indication of the primary content 902. Here new content screens transition in from the right in a conveyer-belt-like manner as indicated by arrow 904. -
FIG. 9B depicts a pop-up message 906 that is displayed to a user when no playback device 530 is detected by the second screen device 520. - The
screen 900 of FIG. 9C shows a synch button/icon 908, chapter timeline 910, active chapter indicator 912, chapter-event indicator 914, chapter number indicator 916, event timeline 918, chapter background 920, event card 922, and timeline view icons 924. The synch button 908 provides a mechanism to synch the content between the first and second screen devices 510, 520. The synch button 908 may also indicate the status of the synch between the content on the first and second screen devices 510, 520. The chapter timeline 910 indicates the chapters of the primary content. The movie title leader is in the background of the chapter timeline 910 and indicates the primary content. As the primary content progresses, the chapters move along the chapter timeline in a conveyer-belt-like fashion, with the active chapter indicator 912 indicating the current chapter in the primary content via highlight and the center position of the chapter timeline 910. The chapter-event indicator 914 indicates that events displayed in the event timeline 918 are part of the active chapter shown in the chapter timeline 910. The event timeline 918 displays event cards 922 indicating events that correspond to what is transpiring in the current chapter of the primary content. For each chapter, the first displayed event card 922 indicates the chapter that the following events occur in. As the primary content progresses, the event cards 922 move along the event timeline 918 in a conveyer-belt-like fashion, with the current event in the center position of the event timeline 918. Each chapter may be provided with a unique background 920 for the events of that particular chapter. The timeline view icon/button 924 indicates that the viewer is in timeline view showing the chapter timeline 910 and event timeline 918, as well as provides a mechanism to access the timeline view. - The
screens 900 of FIGS. 9D and 9E depict how event cards 922 progress across the event timeline 918. Here the synch button/icon 908 indicates that the timeline view of the additional content is in synch with the primary content on the first screen device 510. In FIG. 9D, the current triggered event card 926 is shown in the center position of the event timeline 918 and represents the first triggered event. To the left of the current triggered event card 926 in the event timeline 918 is the previous event card 928, in this case the card indicating the chapter. To the right of the current triggered event card 926 in the event timeline 918 is the next event card 930, in this case the card indicating the next scheduled event. Since, in FIG. 9D, the current triggered event card 926 is for the first triggered event, the chapter indicator 916 indicates that it is chapter 1. The current triggered event card 926 includes the additional content 932 related to the primary content. The current triggered event card 926 also provides an indicator 934 as to what type of additional content is displayed. In certain embodiments this indicator matches an indicator shown on the first screen display 510. The current event card 926 also includes buttons/icons for synching 936 and sharing 938. The synch button/icon 936 provides a mechanism that causes the primary content displayed on the first screen device 510 to be synched with the current event. The share button/icon 938 provides a mechanism to share the additional content of the event with a social network. The elements of the screen 900 of FIG. 9E are similar to the elements of FIG. 9D except that the current triggered event card 926 is for an event that happens later in the timeline, as indicated by the chapter indicator 916, which indicates the current chapter is chapter 3. -
FIG. 9F depicts examples of other possible functionality that may be provided as part of the display on the second screen device 520. Here the chapter timeline 910 is provided with a collapse icon/button 940, which provides a mechanism to toggle the chapter timeline between visible 940a and hidden 940b. Likewise, the synch button/icon 908 can toggle between a status indicating that synch is currently active 908a and a status indicating that synch has been lost and re-synching is available 908b. In some embodiments a volume button/icon 942 is provided. The volume button/icon 942 provides a mechanism to turn the sound of the first screen display “OFF” or “ON”. The volume button 942 may also indicate the status of whether the volume is “ON”, indicating muting is available 942a, or “OFF”, indicating sound is available 942b. In some other embodiments a play/pause button/icon 944 is provided. The play/pause button 944 provides a mechanism to pause or resume playback of content on the first screen display 510. The pause/play button may also indicate the status of whether the playback can be paused 944a or resumed 944b. - In the interactive approach, the user selects an additional content event on the
second screen device 520 and what is displayed on the primary screen device 510 is synched to the selected event. As indicated previously, the events of additional content are synched to the primary content. If the user swipes the movie timeline or the events, the events become out of synch with the movie being shown on the main screen. To re-synch, the user touches the synch button on the tablet. The timeline or events are then synched back to what is being displayed on the main screen. Likewise, a user can select a trivia event or map event, touch the synch button, and the scene in the movie related to the selected trivia or map event will be played on the main screen. Examples of this can be seen in FIGS. 10A-D. -
FIG. 10A depicts how a user may interact with the chapter timeline 910 and event timeline 918 on the screen 900. Here icons 1000 and 1002 represent the user's interactions with the chapter and event timelines 910, 918. -
FIG. 10B depicts one embodiment of the screen 900 when a user interacts with the chapter timeline 910. In this example the synch button/icon 908 indicates that the additional content on the second screen display 520 is out of synch with the primary content on the first screen display 510. Icon 1000 represents the user scrolling through the chapter timeline 910. The current chapter remains highlighted 912 until the transition to the new chapter is completed. When navigating through the chapter timeline 910, a chapter position indicator 1004 is provided that indicates what chapter of the available chapters is selected. The chapter indicator 916 also indicates the selected chapter and updates when the transition to the new chapter is complete. In this example, while the user is navigating through the chapter timeline 910, the event timeline 918 is dimmed. In certain embodiments, the user may jump directly to a particular chapter by selecting the chapter from the timeline 910. -
FIG. 10C depicts one embodiment of the screen 900 when a user interacts with the event timeline 918. Icon 1002 represents the user scrolling through the event timeline 918. Here, the timeline 918 is being transitioned from the current triggered event card 926 to the next event card 930. When navigating through the event timeline 918, an event position indicator 1004 is provided that indicates what event of the available events is selected. -
FIG. 10D depicts one embodiment of the screen 900 when a user interacting with the event timeline 918 causes a transition from one chapter to another. Icon 1002 represents the user scrolling through the event timeline 918, causing a chapter change. Here, the timeline 918 is being transitioned to a new event card 922 indicating a new set of events related to a new chapter. When navigating through the event timeline 918 causes a transition to a new chapter, the event position indicator 1004 is centered until the new series of events begins. -
FIGS. 11A-C and 12A-B indicate some of the other interactive activities that can be accessed via the event cards 922. FIGS. 11A-C depict the social media sharing feature. FIGS. 12A-B depict the chapter selection as well as the selection and playback of additional media files. -
FIGS. 11A-C show various pop-up fields on the display 900 when the sharing feature is activated via the share button/icon 938. FIG. 11A shows the field 1100 displayed when the user has logged into their social network (in this case Facebook). Area 1102 indicates the event being shared and area 1104 indicates the comments the user is going to share about the event. Button 1106 provides the mechanism to submit the event and comments to be shared. FIG. 11B shows the field 1100 displayed when the user has not yet signed in to the social network. In this example button 1108 is provided to sign into Facebook and button 1110 is provided to sign into Twitter. Options to sign into other social networks may also be provided. FIG. 11C shows an onscreen QWERTY keyboard 1112 that may be used to enter the user's comments into area 1104. In certain embodiments, this may be a default keyboard provided by the second screen device 520. -
FIGS. 12A-B show the selection of chapters as well as media content for playback by the user. In the example of FIG. 12A, if the user single taps 1200 the currently playing chapter shown in the chapter timeline 910, the playback on the first screen device 510 is paused. If the user double taps 1202 the currently playing chapter shown in the chapter timeline, playback on the first screen device will jump to the beginning of the chapter and the events timeline 918 will be set to the first event of that chapter. In some embodiments, the event cards 922 may include media files 1204 such as video or audio clips. If the media file is an audio clip, then selection of the audio clip results in playback on the current screen 900. If the media file is a video clip, then selection of the video clip results in the launching of a full-screen media player 1206 as seen in FIG. 12B. In this example the media player includes on-screen controls 1208. To return to the previous screen, the user only needs to tap the non-video surface 1210 of the media player. -
FIGS. 13A-E depict some other possible features regarding the additional content. These include a map view 1300, family tree 1310, and settings 1320. FIG. 13A depicts the menu bars for these options. In this example each of these menu bars is provided with first screen device controls 1330, including pause/resume and mute/un-mute. FIG. 13B depicts the map view display 1300. The map view display 1300 includes a map 1302 including marked locations 1304 and information about the locations 1306. Icons are also provided to select other maps 1308. FIG. 13C depicts the family tree view 1310. The family tree view shows the family tree with fields 1312 indicating the relationships between the family members. In this example the button/icon 1314 at the bottom indicates what view is currently being shown (i.e., the family tree view). If a field 1312 is selected, a pop-up field 1316 is displayed, as shown in FIG. 13D, providing information about the person in the field 1312. FIG. 13E depicts the settings view 1320. In view 1320 the user is provided with controls for adjusting the preferences for the audio and video 1322, events 1324, and social network sharing 1326. -
FIGS. 14A-L depict skinned examples of what may be displayed on the screen 900 of the second screen device to a user when using an application that provides additional content on the second screen device 520 that is synched with the primary content on the first screen device 510. FIG. 14A is a skinned version of the splash screen as shown and described in relation to FIG. 9A. FIGS. 14B-F depict skinned versions of the timeline view as seen and described in relation to FIGS. 9C-F and 10A-D. FIG. 14G depicts a skinned version of a screen display wherein all the available video clips that are part of the additional content are displayed for the user. FIG. 14H depicts a skinned version of a screen display wherein all the available audio clips that are part of the additional content are displayed for the user. FIG. 14I depicts a skinned version of the maps view as shown and described in relation to FIG. 13B. FIGS. 14J and 14K depict skinned versions of the family tree view as shown and described in relation to FIGS. 13C and 13D, respectively. FIG. 14L depicts a skinned version of the settings view as shown and described in relation to FIG. 13E. - The events and features shown in the figures are just some examples of possible events. In certain embodiments, a user may be able to configure or otherwise select what events they wish to be shown (e.g., don't show me purchase events). In other embodiments the user may be able to select or bookmark events for viewing at a later time. In still other embodiments certain events may be unavailable or locked out depending on the version of the program being viewed (i.e., purchased vs. rented, or BD vs. VOD vs. broadcast). In other embodiments, the events available can be personalized for a user based on previous viewing habits (i.e., in a system such as TIVO where a user's viewing habits are tracked, or using the
personalization engine 118 of FIG. 1). - Other possible configurations include shopping features. For example, a store front could be provided and accessible from the second screen for purchasing movie merchandise. In another embodiment, points or awards could be provided to a user for watching, reviewing, or recommending a program or film. For example, the more movies watched or shared with friends, the more points awarded. The points can then be used for prizes or discounts on related goods.
- Similarly, achievements can also be awarded. These achievements could be pushed to a social networking site. Example achievements could include:
- Watching certain scenes—Achievement
- Watching certain discs in a series—Achievement
- Watching certain discs by a particular studio or actor—Achievement
- In still other implementations a Wiki feature could be implemented. A running Wiki could let a user and other users of a disc comment on certain scenes. For example, tracking metadata could be created which is pushed to a web-based Wiki. Such metadata could include:
- Chapter Information
- Time Codes
- Thumbnails of Scenes
- Actor/Director Information
- This pushed information can be used to create a running Wiki which lets others comment on the movie. These comments could then be reintegrated into the second screen application as events which can be accessed.
- Additional features and screens are also possible. For example, in some embodiments activity on one or more social networks that is related to content displayed on the first screen can be monitored and used to provide additional content on the second screen.
- As discussed in regard to
FIGS. 11A-C, the application on the second screen device 520 can support social media such as Facebook and Twitter. Additional examples of this can be seen in FIGS. 15-17. -
FIG. 15 depicts a wireframe screenshot 1500 showing the panel 1510 displayed over the background, in this case a “greyed-out” or “dimmed” timeline view 1520, when the user has not yet signed in to the social network. In this example panel 1510 is for signing into Twitter. The panel 1510 provides an area 1512 to provide a username or email address as well as an area 1514 to provide a password. Button 1516 authorizes the application to access Twitter. Button 1518 declines the sign-in. A similar panel or field can be provided to sign into Facebook or other social media networks. -
FIG. 16 depicts a flow diagram 1600 of the screens displayed to a user based on whether they are signed into a social network. At junction 1610 it is determined whether the user is signed in or otherwise authorized on the social network. If the user has not provided authorization (1612), screen 1500 with panel 1510 of FIG. 15 is displayed to the user, prompting them to sign in. If the user has previously provided authorization (1614) or signs in using screen 1500 (1618), screen 1620 with message panel 1622 is displayed. From the message panel 1622 the user may cancel or send a message using the provided buttons and onscreen keyboard 1628. Once a message is sent or canceled (1630), or if authorization is never provided (1632), junction 1634 is reached, wherein the panels are dismissed and the user is returned to the previous screen view (1640). -
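The screen flow of FIG. 16 can be sketched as a small control function. The `sign_in` and `compose` callables below are placeholders standing in for panel 1510 and message panel 1622; they are not part of the disclosure:

```python
def social_post_flow(authorized, sign_in, compose):
    """Sketch of the FIG. 16 screen flow with placeholder callables."""
    if not authorized:                # junction: is the user signed in / authorized?
        authorized = sign_in()        # screen 1500 with sign-in panel 1510 (1612)
    if authorized:                    # previously authorized (1614) or just signed in (1618)
        compose()                     # screen 1620 with message panel 1622
    # junction 1634: panels are dismissed either way
    return "previous screen view"     # (1640)
```

For example, an unauthorized user is first routed through the sign-in panel and, on success, straight to the compose panel before returning to the previous view.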
FIG. 17 depicts a skinned version screen 1700 of the screen 1620 of FIG. 16. A message panel 1710 is provided overtop the skinned background 1720. The message panel 1710 includes a text area 1712 as well as cancel 1714 and send 1716 buttons. Text can be entered into the text area 1712 using an onscreen keyboard 1730. - In addition to providing the ability to post social messages from the application on the
second screen device 520, the user's experience can be further enhanced by tracking comments on social media relating to content being viewed on the first screen device 510 and by providing additional content on the second screen device 520, synched to the primary content on the first screen device 510, based on the tracked comments relating to the primary content. -
FIG. 18 shows a flow diagram 1800 depicting one possible methodology for providing such functionality on a second screen device 520. At a basic level, the method includes three steps. First, social media activity regarding content being displayed on the first screen device 510 is monitored (step 1810). The monitored social media activity is then processed (step 1820). Finally, additional content based on the social media activity is provided on the second screen device 520, synched to the primary content on the first screen device 510 (step 1830). Each of these steps will be discussed in more detail below in reference to specific exemplary implementations that rely on social media activity. - Popular broadcast media can potentially generate an overwhelming amount of related social messages. Second screen applications designed to support particular broadcasts have been developed to extend the branded experience. These second screen applications may include a relevant social feed by filtering messages by a hashtag or other keyword. Even with this filtering in place, the message count can quickly become unwieldy for popular events (e.g., Superbowl 46, where the Tweets per second peaked at 10,245). What is disclosed herein is a mechanism to identify and surface relevant social messages for the user to see without being inundated with large numbers of messages to scan.
- Current practice in many social messaging applications is to simply present the messages as they are received, resulting in an almost continuous scrolling of messages as they are rendered in the user interface. Other implementations may throttle requests for messages to reduce load on the back-end servers, but the number of results for each request may be large and it would be difficult for a user to scan the list before the next request is fulfilled.
- This disclosure offers an approach that curates social messages that are then offered to the user with timing appropriate to a second screen application that may not be entirely focused on the social messaging aspect. This is valuable for second-screen applications that want to integrate social messaging with other aspects of the media experience, such as bonus content, trivia, and advertising.
- In the context of an application on a second-
screen device 520 where multiple items are displayed to the user in a timed fashion in sync with the media being displayed on a first screen device 510, curated social information can be interleaved with other “timeline events”. For example, an actor trivia “timeline event” may be presented in conjunction with a character's appearance on screen. A “Social Quote Event” may be timed for display just after an intense action scene. This latter event displays only the high-level/relevant social messages based on targeting or frequency. This provides the user a sense of what is being communicated in the social network while not requiring the user to scan through hundreds or even thousands of messages. An example of this can be seen in FIG. 19. -
FIG. 19 depicts a wireframe screenshot 1900 showing a “Social Quote” event panel 1910 as part of the timeline of other events 1920. The panel 1910 provides a text area 1912 displaying the relevant curated social quotes. In certain embodiments one or more additional buttons are provided. In this example button 1914 provides the ability to post, or in this case re-tweet, a given quote. Other possible features and implementations will be apparent to one skilled in the art given the benefit of this disclosure. - The process of providing, on a
second screen device 520, social media comments relevant to content being displayed on a first screen device 510 follows the general methodology set forth in FIG. 18. - First, social media activity is monitored (step 1810). This involves looking for keywords, hashtags, or the like that include the name of the content being displayed on the
first screen device 510, the name of the actors, the name of the director, or other related information, using techniques and methodologies that are well known in the art. Monitoring of social media can be performed on the second screen device 520, a server 530, provided by a third party service, or a combination thereof. - The step of processing (step 1820) is where the data is curated to provide only the most relevant comments or messages. To achieve the curated “Social Quote”, the following heuristics may be applied:
-
- If the social message is targeted to the current user, it should be deemed important for display. As an example, this is like an “at” message (@user) in Twitter. Other targeting or addressing schemes can be employed as well (e.g., a post on a wall in Facebook).
- Many social messages are simply re-sending original messages. Twitter provides a mechanism of “re-tweets” that adds metadata to a message to identify how many times it has been passed along. In this case the re-tweet count provides a metric of “importance” which is then used to select these messages for display.
- If the message contains multiple keywords, it can be given greater “importance”.
- A user may have specified user preferences against which the messages can be compared.
- In certain embodiments, it is also conceivable that social messages can be “sponsored” so that they automatically gain relative importance above the crowd of other messages. For example, social media comments that were posted from a second screen application may be given priority. A further refinement to this would be along the lines of targeted advertising: the importance of sponsored messages can be weighted against user-specific criteria to achieve their relative ranking amongst other messages. This processing or curating can be performed on the
second screen device 520, a server 530, provided by a third party service, or a combination thereof. - One or a few social messages thus ranked can be displayed to the user to provide targeted and other “significant” messaging without having to resort to scanning ever-flowing lists of social messages (step 1830).
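The curation heuristics above can be sketched as a simple scoring function. The numeric weights, the message dictionary fields, and the example hashtag below are illustrative assumptions, not values specified in the disclosure:

```python
def message_score(msg, current_user, keywords, sponsored_boost=5.0):
    """Rank one social message using the heuristics above; weights are illustrative."""
    text = msg["text"].lower()
    score = 0.0
    if f"@{current_user}".lower() in text:
        score += 10.0                                        # targeted ("@user") messages
    score += msg.get("retweet_count", 0)                     # re-send count as "importance"
    score += 2.0 * sum(kw.lower() in text for kw in keywords)  # multiple keyword matches
    if msg.get("sponsored"):
        score += sponsored_boost                             # sponsored messages rise above the crowd
    return score

def curate(messages, current_user, keywords, top_n=3):
    """Select the top_n messages for a "Social Quote" event (step 1820)."""
    ranked = sorted(messages,
                    key=lambda m: message_score(m, current_user, keywords),
                    reverse=True)
    return ranked[:top_n]
```

Comparison against stored user preferences could be added as one more additive term in the same fashion.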
- In the case of the timeline-based application on a
second screen device 520, “Social Quote” events can be distributed throughout the timeline such that regular exposure to social networks is purposefully interleaved with other event types for a thoughtfully designed experience. - A key to effective advertising is to deliver relevant offers at the right time. Current techniques involves extensive data collection on user behavior (e.g. Google tracking your searches, application use, viewing habits, etc.) which is then used to select advertisements that best suit the user's profile. A great deal of specificity can be derived using public and private data to develop these personalization profiles.
- Providing a focused context for collecting data has the potential for improving advertisement delivery even further. Social messaging may indeed be used as a source for personalization but may be handled independently from other activities that the user may be engaged in simultaneously. For example a user may tweet about a particular movie and a back-end system in the cloud can record the interest in that movie. This data could then be used the next time a user visits a website that takes advantage of this data to deliver advertisements. While this may potentially result in a higher advertisement relevance to the user during the subsequent visit the context of the original social message is lost. Second-screen applications offer a known environment from which social messages can be monitored to enhance advertising personalization.
- An application on a
second screen device 520 can be designed to display scheduled “events” in sync with primary video playback. Events can represent trivia, social content, voting, bonus material, advertisements, etc., that are timed for display at relevant points in the primary video playback. - The context that the second-screen application (and device) provides is rich, to include:
-
- Media information. This establishes an interest in a particular media item (broadcast TV, Video, etc)
- Specific Media time information. The application knows where you are within a piece of media and any specific metadata associated with that time.
- Location information. Second-
screen devices 520 are typically equipped with mechanisms for identifying your current location
- The disclosed embodiments interleave “events” to support an overall experience. Some of these event types afford a social interaction element such as voting or can offer social messaging anytime throughout the presentation. Subsequent “events” can be reserved for advertising and can react to previous activity to include social messaging. This can be accomplished by identifying keywords within the social message itself and providing that to an advertising service along with other relevant information such as location, metadata associated with the current media time, etc.
- The application consolidates all these variables and offers advertisements within the context of the overall experience.
- For Example a James Bond film could be playing on a first screen device and a location such as Hong Kong is displayed on screen. The second-screen app displays a supporting media item such as trivia about the location. The user could send a social message (tweets) to describe her desire to go on a vacation. The word “vacation” is parsed by the application in preparation for later advertising display. Later in the second-screen timeline an advertisement “event” focusing on a vacation offer is displayed.
- This process also follows the general methodology 1800 of FIG. 18. The monitoring (step 1810) takes place when it is detected that the user has sent a social message including the keyword "vacation." In this example, the monitoring is performed on the second screen device 520, but it could also be performed at the server 530, by a third-party service, or a combination thereof. - In the processing step (step 1820), the specific advertisement is selected from an ad service by submitting the keyword "vacation". Additional information, such as the specific location known from the movie, can be derived from metadata associated with the movie at the particular time of the tweet. Additionally, the advertisement can recommend a local travel agent using the user's location information. The processing can be performed on the second screen device 520, but it could also be performed at the server 530, by a third-party service, or a combination thereof. - Finally, the advertisement can be displayed on the second screen device (step 1830). An example of this can be seen in FIG. 20. -
FIG. 20 depicts a wireframe screenshot 2000 showing a dynamic advertising event panel 2010 as part of the timeline of other events 2020. The panel 2010 provides a text area 2012 displaying the relevant advertising offers. Other possible features and implementations will be apparent to one skilled in the art given the benefit of this disclosure. - The advertisement thus becomes a natural part of the "conversation," all within the context of the media consumption experience.
- While the addition of social messaging to a second screen application as discussed in the examples above enhances the experience for the user, further functionality and enhancement of the experience can be achieved if the social messaging can be time-stamped or otherwise synched with the playback of content on the first screen device 510. - Social messages are typically ephemeral and relevant only to the present moment in time. A common convention for associating a message with a particular event or topic is to use hashtags. These hashtags provide a means to filter for social messages of interest, and filtering for them provides a "real-time" view of messages on the topic. This disclosure takes this notion a step further and describes a mechanism for associating a message with a specific point in time relative to the start of a media content item (such as a video). Social messages employing these techniques can be associated with specific points in time within the particular content. For example, social messages can be associated with an opening scene that sparks social commentary, and later with other arbitrary points in the media timeline. This example becomes even more relevant when a piece of media is replayed at a later time.
- In one embodiment, hashtags are used to add a timestamp. The current practice of using hashtags is well known, but there is currently no widely established mechanism for describing time offset information within a social message or as additional metadata. The present disclosure provides a mechanism that can be implemented without custom extensions to the social messaging protocol: the encoded time offset is sent as part of the message itself. Alternatively, this approach could be obviated by providing specific metadata that is not typically displayed as part of the social message itself.
-
FIG. 21 depicts exemplary social messages 2100 that include a hashtag identifying the media 2110 (this is common use today) together with another hashtagged string of characters encoding the time offset 2120. In this example, the hashtag #ijkcs (2110) is used to identify Indiana Jones and the Kingdom of the Crystal Skull. The hashtag #1054675 (2120) is a checksummed encoding of the frame offset. -
FIG. 22 depicts an exemplary flowchart of a method for creating time-stamped social messages. The method begins at block 2210. It is then determined that a user desires to create a social message (block 2220). In this embodiment, this occurs when the user selects a comment button provided as part of the application running on the second screen device 520. A timestamp of the position of the playback is then created (block 2230). In this example, this involves generating a checksum. To create the checksum, the application receives positioning information directly if the media is being played back within the application, or from other methods that provide position information from external sources. This numeric data is then checksummed. The user can then enter the text for the social message and request that it be sent (block 2240). The timestamp is then added to the message and the message is sent or otherwise committed to the social network (block 2250). In this example, this involves appending the checksum to the social message text. The checksum is used to ensure the integrity of the data and allows a consuming application to ignore bad position data that might be maliciously or inadvertently created. This ends the method (block 2260). -
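The message-creation steps of FIG. 22 (blocks 2230-2250) can be sketched as follows. The disclosure does not specify a particular checksum algorithm, so the trailing digit-sum check digit used here is purely illustrative, as are the function names.

```python
def encode_position_hashtag(frame_offset):
    """Encode a frame offset as a hashtag ending in a check digit.

    The check digit (sum of the offset's digits mod 10) is an illustrative
    choice; the disclosure leaves the checksum algorithm unspecified.
    """
    digits = str(frame_offset)
    check = sum(int(d) for d in digits) % 10
    return "#{}{}".format(digits, check)

def compose_timestamped_message(text, media_hashtag, frame_offset):
    """Append the media hashtag and the encoded position (block 2250)."""
    return "{} {} {}".format(text, media_hashtag,
                             encode_position_hashtag(frame_offset))

msg = compose_timestamped_message("Great opening scene!", "#ijkcs", 105467)
# msg == "Great opening scene! #ijkcs #1054673" under this illustrative scheme
```

Because the encoding is ordinary message text, it survives transport through any social network without protocol extensions, which is the key property the disclosure relies on.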
FIG. 23 depicts an exemplary flowchart of a method for consuming, or otherwise reading and decoding, a time-stamped social message. Once begun (block 2310), the method includes receiving a social message pertaining to the particular content in question, in this case the content being displayed on the first screen device 510 (block 2320). In this example, this involves the selection of a message based on the hashtag representing the video of interest (a basic search on the hashtag). The position information (timestamp) can then be extracted (block 2330). In this example, this involves the application looking for an additional hashtag that immediately follows, which is processed as an encoded position. If the position information is found (block 2340), it is separated into a predetermined position and checksum, and the position is confirmed against the checksum (block 2350). If the position information is valid, the position can then be used to associate the social message with a particular time within the video, and the message information can be displayed synched with the display on the first screen device 510 (block 2360). If the position information cannot be confirmed, the message can be discarded or displayed without being synched with the display on the first screen device 510 (block 2370). This ends the method (block 2380). - Assuming not all social commentary will be generated using an application employing the techniques of the present disclosure, the following approach may be used to modify messages so they can be consumed by applications sensitive to the position information.
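A consuming application following FIG. 23 could extract and validate the position as sketched below. The layout assumed here (an all-digit position hashtag immediately after the media hashtag, ending in a digit-sum check digit) is one illustrative convention; the disclosure does not mandate an exact format.

```python
import re

def extract_position(message_text, media_hashtag):
    """Return the frame offset carried by the message, or None.

    Assumes an all-digit position hashtag immediately following the media
    hashtag, whose last digit is a digit-sum check digit. Both conventions
    are illustrative assumptions, not a format fixed by the disclosure.
    """
    m = re.search(re.escape(media_hashtag) + r"\s+#(\d{2,})\b", message_text)
    if m is None:
        return None                      # block 2340: no position information
    body, check = m.group(1)[:-1], int(m.group(1)[-1])
    if sum(int(d) for d in body) % 10 != check:
        return None                      # block 2350: checksum failed, discard
    return int(body)                     # block 2360: valid, sync with playback

offset = extract_position("Great opening scene! #ijkcs #1054673", "#ijkcs")
# offset is the decoded frame position; a corrupted tag yields None
```

Returning None for both missing and invalid positions lets the caller fall back to the unsynched display of block 2370 without distinguishing malicious from absent data.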
- A separate service (e.g., in the cloud) can monitor and record real-time feeds of social messages of interest (say, for a particular television broadcast). This service could use known broadcast schedules and correlate the expected time of the television event with the time the messages are seen in real time. The service then records the social message with the appended position information. The resulting repository of time-stamped messages is then accessed in lieu of a direct connection to the social network. In certain embodiments, origin information for the social messages can be used to correlate specific regions with specific broadcast times.
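One way such a service could derive a position for a live broadcast is to subtract the scheduled start time from the time the message was observed. The function below is a sketch under that assumption; its name is hypothetical and it ignores the regional schedule differences mentioned above.

```python
from datetime import datetime, timezone

def media_offset_seconds(message_time, broadcast_start):
    """Offset of a message into a broadcast, derived from the known schedule.

    Returns whole seconds past the scheduled start, or None for messages
    seen before the broadcast began. A production service would also use
    message origin information to pick the correct regional start time.
    """
    delta = (message_time - broadcast_start).total_seconds()
    return int(delta) if delta >= 0 else None

start = datetime(2012, 1, 6, 20, 0, tzinfo=timezone.utc)
seen = datetime(2012, 1, 6, 20, 12, 30, tzinfo=timezone.utc)
offset = media_offset_seconds(seen, start)  # 750 seconds into the broadcast
```

The computed offset would then be appended to the stored copy of the message, so downstream applications see it in the same form as messages stamped at creation time.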
- As discussed previously, social messaging is becoming a ubiquitous feature across various software applications. One problem this presents is that the sheer quantity of messages can become overwhelming, especially for popular topics. At some point the individual messages get lost in the crowd and the quantity of messages becomes the interesting social aspect. This disclosure describes a way to visualize the volume of social activity over time, which can then be used to identify interesting points in time for media content and to navigate to those points in time.
- Many second-screen applications provide a social message “feed” that simply streams social messages as they happen. There is little organization of this information beyond displaying the most recent at the top of the list or perhaps the notion of a “promoted” message that advertisers use to keep their message at the top of the stack. Messages are quickly replaced by new messages sometimes faster than a user can scan them. Once the messages have been buried in the stack their effective relevance to time is diminished.
- The presented graphical representation, referred to here as a Social Heatmap, organizes social information to correspond with the timeline of a media item, such as content being displayed on a first screen device 510. In the case of a live broadcast, the social message timestamp can be used to make the correlation. In the case of recorded content, messages need to be stamped with the time relative to the start of the media. This can be done within the application, since the relative time of the media playback can be acquired via syncing mechanisms. The sync time is added to the social message, which is then made available via a social network to others using an application that is aware of the time information. In this way each social message has a relative media timestamp, which aids in visual placement on the screen when displaying the information on a second screen device 520. -
FIG. 24 depicts an exemplary flowchart of a method for consuming, or otherwise reading and decoding, time-stamped social messages. Once begun (block 2410), the method includes receiving a social message pertaining to the particular content in question, in this case the content being displayed on the first screen device 510 (block 2440). This can involve the selection of a message based on the hashtag representing the video of interest (a basic search on the hashtag), as discussed above, or it could be based on keywords in the messages themselves. Timestamp information associated with the message is then looked for, and messages without associated timestamp information are discarded (block 2430). In certain embodiments this involves the application looking for an additional hashtag that immediately follows, which is processed as an encoded position. Based on the associated timestamp information, the messages can be allocated to bins (block 2450). This process is discussed in more detail below. A graphical representation of the social messages associated with specific time periods of the primary content on the first screen device 510 can then be displayed on the second screen device 520 while the primary content is being displayed on the first screen device 510. In certain embodiments the graphical representation of the social messages associated with specific time periods also graphically represents the intensity or frequency (i.e., the number) of social messages associated with the specific periods of time, and as such is referred to as a heatmap. The user can then use the heatmap to navigate through the content on the first screen and the associated social messages (block 2460). Such navigation is discussed in more detail below. This ends the method (block 2370). - Bins are a mechanism used for grouping messages into discrete sections of time. The number of messages in each bin corresponds to the activity level for that section of time. 
It has been found that the screen width of each bin should be wide enough to afford navigation (e.g., using a touch device) but small enough to provide navigation resolution. An example of this can be seen in
FIGS. 25 and 26 . - In the example of
FIG. 25, a tablet application allocates 1000 pixels of width for the heatmap 2500. These 1000 pixels represent the entirety of the media playback time, say 50 minutes. This allows approximately 20 pixels of screen width per minute of content. If we use a bin width of 20 pixels, then the navigation resolution will be to the nearest minute and navigating from bin to bin will require a move of 20 pixels. FIG. 26 depicts an exemplary plot of sample counts for values allocated to bins for rendering. The number of messages in a bin may be graphically represented using color, plots, or other indicators. The actual sensitivity that should be used will depend on the input device, user demographics, and other factors. -
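The binning arithmetic of the FIG. 25 example (1000 px of heatmap, 50 minutes of media, 20 px bins, hence one bin per minute) can be sketched as follows; the function name and the representation of messages as second offsets are assumptions for the example.

```python
def allocate_bins(message_offsets, media_length_seconds,
                  heatmap_width_px=1000, bin_width_px=20):
    """Count messages per heatmap bin.

    With the default widths and 50 minutes of media, 1000 / 20 = 50 bins
    each cover one minute, matching the example in the text.
    """
    n_bins = heatmap_width_px // bin_width_px
    seconds_per_bin = media_length_seconds / n_bins
    counts = [0] * n_bins
    for t in message_offsets:              # offsets in seconds from media start
        if 0 <= t < media_length_seconds:  # ignore out-of-range timestamps
            counts[int(t // seconds_per_bin)] += 1
    return counts

counts = allocate_bins([5, 30, 59, 60, 2999], media_length_seconds=50 * 60)
# messages at 5 s, 30 s, and 59 s share bin 0; 60 s starts bin 1;
# 2999 s falls in the last bin
```

The resulting counts drive the visual intensity of each bin, and the same bin index maps a touch position back to a playback time for navigation.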
FIG. 27 depicts an exemplary wireframe screenshot 2700 including a social heatmap or timeline. In this embodiment, event panels 2710 are displayed as part of a timeline view on the second screen device 520, which a user can scroll through as described previously. A Comment button 2720 is provided to allow the user to send social messages regarding content being displayed on the first 510 or second 520 screen devices. Social messages generated using such a function can include content and timestamp information for processing and inclusion in the heatmap 2750. A sync indicator button 2730 allows the user to synchronize the events of the timeline view with the content being displayed on the first screen device 510. Button 2730 can also indicate the status of the synchronization. At the bottom of the screenshot 2700 there are also a playback position indicator 2740 and the social heatmap 2750. The playback position indicator graphically displays the current position in the playback of the primary content on the first screen device 510. In certain embodiments, a user can adjust the indicator to change the current playback position in the primary content being displayed on the first screen device 510. The social heatmap 2750 has bins representing social messages associated with time periods in the playback of the primary content. Selecting bins, for example by sliding or "scrubbing" along the social heatmap 2750, causes the associated social messages to be displayed. Examples of this can be seen in FIGS. 28-32. -
FIG. 28 depicts the screenshot 2700 of FIG. 27 with a pop-over panel 2800 that displays the social messages associated with a bin 2810 in the social heatmap 2750 when a specific bin is selected by a user, as represented by icon 2820. FIG. 29 depicts one embodiment of how messages can be graphically transitioned between (as indicated by arrow 2900) in the pop-over panel as a user scrolls or scrubs along the heatmap 2750 (as indicated by icon 2820). FIG. 30 indicates one possible embodiment of a panel 3000 that can be displayed if there are no social messages associated with a bin 2810. FIG. 31 depicts how multiple messages can be scrolled through within a panel 3100. In this example, messages can be scrolled through vertically, as indicated by arrows 3110. In certain embodiments a scroll bar indicator 3120 is provided to indicate that there are multiple messages to be scrolled through. FIG. 32 depicts another embodiment wherein selecting individual messages within the panel 3200 provides the user with additional functionality. In this example, selecting a message provides additional buttons that allow the user to resend ("re-tweet") a message 3210 or go to the specific instance in the playback of the content on the first screen device 510 with which the message is associated. -
FIG. 33 depicts a skinned version 3300 of the screen 2700 of FIG. 27. In this example, color and peaks are used to graphically indicate the intensity of social messages along the heatmap 3310. - The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope.
- All examples and conditional language recited herein are intended for informational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
- Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
- Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herewith represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
- The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read only memory (“ROM”) for storing software, random access memory (“RAM”), and nonvolatile storage.
- Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- Although embodiments which incorporate the teachings of the present disclosure have been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings. Having described preferred embodiments for a method and system for providing media recommendations (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings.
- While the example set forth above has focused on an electronic device, it should be understood that the present invention can also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods. Computer program or application in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- Additionally, the description above is intended by way of example only and is not intended to limit the present invention in any way, except as set forth in the following claims.
Claims (15)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/322,566 US20140327677A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261584134P | 2012-01-06 | 2012-01-06 | |
US14/322,566 US20140327677A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen |
PCT/US2012/071811 WO2013103578A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140327677A1 true US20140327677A1 (en) | 2014-11-06 |
Family
ID=47472156
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/322,566 Abandoned US20140327677A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen |
US14/370,460 Expired - Fee Related US9578072B2 (en) | 2012-01-06 | 2012-12-27 | Method and system for synchronising content on a second screen |
US14/370,456 Abandoned US20150003798A1 (en) | 2012-01-06 | 2012-12-27 | Alternate view video playback on a second screen |
US14/370,458 Abandoned US20150020096A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for synchronising social messages with a content timeline |
US14/370,453 Abandoned US20140365302A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing dynamic advertising on a second screen based on social messages |
US14/370,448 Abandoned US20150019644A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a display of socialmessages on a second screen which is synched to content on a first screen |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/370,460 Expired - Fee Related US9578072B2 (en) | 2012-01-06 | 2012-12-27 | Method and system for synchronising content on a second screen |
US14/370,456 Abandoned US20150003798A1 (en) | 2012-01-06 | 2012-12-27 | Alternate view video playback on a second screen |
US14/370,458 Abandoned US20150020096A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for synchronising social messages with a content timeline |
US14/370,453 Abandoned US20140365302A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing dynamic advertising on a second screen based on social messages |
US14/370,448 Abandoned US20150019644A1 (en) | 2012-01-06 | 2012-12-27 | Method and system for providing a display of socialmessages on a second screen which is synched to content on a first screen |
Country Status (7)
Country | Link |
---|---|
US (6) | US20140327677A1 (en) |
EP (6) | EP2801021A1 (en) |
JP (6) | JP2015509240A (en) |
KR (6) | KR20140121395A (en) |
CN (6) | CN104081782A (en) |
BR (2) | BR112014016761A8 (en) |
WO (6) | WO2013103578A1 (en) |
Families Citing this family (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8352983B1 (en) | 2002-07-11 | 2013-01-08 | Tvworks, Llc | Programming contextual interactive user interface for television |
US20170041649A1 (en) * | 2011-06-14 | 2017-02-09 | Watchwith, Inc. | Supplemental content playback system |
WO2013019259A1 (en) | 2011-08-01 | 2013-02-07 | Thomson Licensing | Telepresence communications system and method |
US10924582B2 (en) | 2012-03-09 | 2021-02-16 | Interdigital Madison Patent Holdings | Distributed control of synchronized content |
US9866899B2 (en) | 2012-09-19 | 2018-01-09 | Google Llc | Two way control of a set top box |
US9788055B2 (en) * | 2012-09-19 | 2017-10-10 | Google Inc. | Identification and presentation of internet-accessible content associated with currently playing television programs |
US10348821B2 (en) * | 2012-12-21 | 2019-07-09 | Dropbox, Inc. | Prioritizing structural operations and distributing changes in a synced online content management system |
US11375347B2 (en) * | 2013-02-20 | 2022-06-28 | Disney Enterprises, Inc. | System and method for delivering secondary content to movie theater patrons |
US9311651B2 (en) * | 2013-03-07 | 2016-04-12 | Cable Television Laboratories, Inc. | Identity-Media Measurement Model (IMMM) |
US9553927B2 (en) | 2013-03-13 | 2017-01-24 | Comcast Cable Communications, Llc | Synchronizing multiple transmissions of content |
US20140282683A1 (en) * | 2013-03-14 | 2014-09-18 | Samsung Electronics Co., Ltd. | Computing system with device interaction mechanism and method of operation thereof |
US9756288B2 (en) | 2013-04-10 | 2017-09-05 | Thomson Licensing | Tiering and manipulation of peer's heads in a telepresence system |
WO2014204459A1 (en) | 2013-06-20 | 2014-12-24 | Thomson Licensing | System and method to assist synchronization of distributed play out of control |
CN104252468B (en) * | 2013-06-26 | 2019-04-12 | Sap欧洲公司 | For maximizing the method and its system of the information gain of continuous events |
JP6271194B2 (en) * | 2013-09-17 | 2018-01-31 | 特定非営利活動法人メディア・アクセス・サポートセンター | Providing second screen information to mobile devices |
CN103581598A (en) * | 2013-11-13 | 2014-02-12 | 惠州Tcl移动通信有限公司 | Interconnection and interworking and multi-screen interaction equipment, system and implementation method |
WO2015084244A1 (en) * | 2013-12-05 | 2015-06-11 | Gold & Dragon Ab | Method and system for enabling live event |
US20150281119A1 (en) * | 2014-03-26 | 2015-10-01 | United Video Properties, Inc. | Methods and systems for transferring authorization to access media content between multiple user devices |
TWM484091U (en) * | 2014-03-31 | 2014-08-11 | Unibase Information Corp | Aircraft guiding system having auditing and image synchronization function, and auditing and image synchronization device |
US10331095B2 (en) | 2014-04-29 | 2019-06-25 | Cox Communications | Systems and methods for development of an automation control service |
US9875242B2 (en) | 2014-06-03 | 2018-01-23 | Google Llc | Dynamic current results for second device |
US10152543B1 (en) * | 2014-06-03 | 2018-12-11 | Google Llc | Generating content labels according to user search queries |
WO2016033545A1 (en) * | 2014-08-29 | 2016-03-03 | Sling Media Inc. | Systems and processes for delivering digital video content based upon excitement data |
US10536758B2 (en) | 2014-10-09 | 2020-01-14 | Thuuz, Inc. | Customized generation of highlight show with narrative component |
US11863848B1 (en) | 2014-10-09 | 2024-01-02 | Stats Llc | User interface for interaction with customized highlight shows |
US10433030B2 (en) | 2014-10-09 | 2019-10-01 | Thuuz, Inc. | Generating a customized highlight sequence depicting multiple events |
US10506295B2 (en) | 2014-10-09 | 2019-12-10 | Disney Enterprises, Inc. | Systems and methods for delivering secondary content to viewers |
EP3214774A4 (en) * | 2014-10-29 | 2018-05-02 | LG Electronics Inc. | Broadcast signal transmission apparatus, broadcast signal reception apparatus, broadcast signal transmission method, and broadcast signal reception method |
US9551161B2 (en) | 2014-11-30 | 2017-01-24 | Dolby Laboratories Licensing Corporation | Theater entrance |
CN106999788A (en) | 2014-11-30 | 2017-08-01 | 杜比实验室特许公司 | The large format theater design of social media link |
CN105992046B (en) * | 2015-02-26 | 2020-07-07 | 阿里巴巴集团控股有限公司 | Business data pushing method, device and system |
US10459991B2 (en) | 2015-04-23 | 2019-10-29 | International Business Machines Corporation | Content contribution validation |
US10277693B2 (en) * | 2015-06-04 | 2019-04-30 | Twitter, Inc. | Trend detection in a messaging platform |
JP6504453B2 (en) * | 2015-07-01 | 2019-04-24 | Casio Computer Co., Ltd. | Image transmitting apparatus, image transmitting method and program |
US10489812B2 (en) | 2015-07-15 | 2019-11-26 | International Business Machines Corporation | Acquiring and publishing supplemental information on a network |
CN105307001A (en) * | 2015-09-30 | 2016-02-03 | 天脉聚源(北京)科技有限公司 | Method and device for displaying release information on a video program in real time |
US10057651B1 (en) * | 2015-10-05 | 2018-08-21 | Twitter, Inc. | Video clip creation using social media |
KR102578982B1 (en) * | 2015-11-30 | 2023-09-18 | Samsung Electronics Co., Ltd. | A method for providing a translation service and an electronic device therefor |
US10771508B2 (en) | 2016-01-19 | 2020-09-08 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
ES2629484B1 (en) * | 2016-02-09 | 2018-03-08 | Eloi MOLINAS LOMBART | VIDEO AND DATA SYNCHRONIZATION METHOD AND SYSTEM |
US10382823B2 (en) * | 2016-03-28 | 2019-08-13 | Oath Inc. | Video content deep diving |
KR102570379B1 (en) * | 2016-04-22 | 2023-08-25 | LG Electronics Inc. | Display device for providing a screen mirroring function and operating method thereof |
CN106911953A (en) | 2016-06-02 | 2017-06-30 | Alibaba Group Holding Limited | Video playback control method and device, and audio/video playback system |
US10540158B2 (en) | 2016-07-18 | 2020-01-21 | Google Llc | Post-install application interaction |
US10331750B2 (en) | 2016-08-01 | 2019-06-25 | Facebook, Inc. | Systems and methods to manage media content items |
WO2018078768A1 (en) * | 2016-10-27 | 2018-05-03 | Evixar Inc. | Content reproduction program and content reproduction apparatus |
CN110800018A (en) * | 2017-04-27 | 2020-02-14 | Snap Inc. | Friend location sharing mechanism for social media platform |
CN109729436B (en) * | 2017-10-31 | 2021-03-16 | Tencent Technology (Shenzhen) Company Limited | Advertisement bullet screen processing method and device |
CN108108442A (en) * | 2017-12-21 | 2018-06-01 | University of Electronic Science and Technology of China | Method for optimizing web page browsing |
CN108259574A (en) * | 2017-12-26 | 2018-07-06 | Beijing Haihang Communication Technology Co., Ltd. | Method for establishing a personal self-media system and intelligent terminal therefor |
CN108200287B (en) * | 2017-12-29 | 2020-10-30 | Zhejiang Peiding Big Data Technology Co., Ltd. | Information processing method, terminal and computer readable storage medium |
CN108418950B (en) * | 2018-01-31 | 2019-10-18 | Vivo Mobile Communication Co., Ltd. | Message prompt method and mobile terminal |
US10628115B2 (en) * | 2018-08-21 | 2020-04-21 | Facebook Technologies, Llc | Synchronization of digital content consumption |
JP7338935B2 (en) * | 2018-12-19 | 2023-09-05 | LINE Corporation | Terminal display method, terminal, and terminal program |
US11006191B2 (en) * | 2019-08-02 | 2021-05-11 | The Nielsen Company (Us), Llc | Use of watermarking to control abandonment of dynamic content modification |
CN112788378B (en) * | 2019-11-04 | 2023-04-25 | Hisense Visual Technology Co., Ltd. | Display device and content display method |
CN111190518B (en) * | 2019-12-30 | 2022-05-17 | China Central Television | Interaction method and device between first screen and second screen, terminal and storage medium |
AU2020439978B2 (en) * | 2020-04-01 | 2023-07-06 | Google Llc | Enabling media features provided on a first screen device to be presented on a second screen device |
US11502978B2 (en) * | 2020-06-30 | 2022-11-15 | Snap Inc. | Messaging system for resurfacing content items |
US11741502B2 (en) * | 2021-02-03 | 2023-08-29 | Ohana Corp | System and methods for symbiotic display of ads on mobile devices |
CN113221078B (en) * | 2021-03-25 | 2024-03-12 | Guizhou University | Watermark tracking method for screen-capture leakage of instant messaging system information |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100241961A1 (en) * | 2009-03-23 | 2010-09-23 | Peterson Troy A | Content presentation control and progression indicator |
US20130086159A1 (en) * | 2011-09-29 | 2013-04-04 | Nader Gharachorloo | Media content recommendations based on social network relationship |
US20140282013A1 (en) * | 2013-03-15 | 2014-09-18 | Afzal Amijee | Systems and methods for creating and sharing nonlinear slide-based multimedia presentations and visual discussions comprising complex story paths and dynamic slide objects |
US20140317535A1 (en) * | 2011-05-10 | 2014-10-23 | Echostar Technologies L.L.C. | Apparatus, systems and methods for facilitating social networking via a media device |
Family Cites Families (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5842216A (en) * | 1996-05-03 | 1998-11-24 | Mitsubishi Electric Information Technology Center America, Inc. | System for sending small positive data notification messages over a network to indicate that a recipient node should obtain a particular version of a particular data item |
US20050210101A1 (en) * | 1999-03-04 | 2005-09-22 | Universal Electronics Inc. | System and method for providing content, management, and interactivity for client devices |
US6737957B1 (en) | 2000-02-16 | 2004-05-18 | Verance Corporation | Remote control signaling using audio watermarks |
JP2002132602A (en) * | 2000-07-31 | 2002-05-10 | Hewlett Packard Co <Hp> | Method for introducing and linking picture equipment |
CA2421775C (en) * | 2000-09-08 | 2013-03-12 | Kargo Inc. | Video interaction |
US20020188959A1 (en) * | 2001-06-12 | 2002-12-12 | Koninklijke Philips Electronics N.V. | Parallel and synchronized display of augmented multimedia information |
US6741684B2 (en) * | 2001-06-26 | 2004-05-25 | Koninklijke Philips Electronics N.V. | Interactive TV using remote control with built-in phone |
EP1464172B1 (en) * | 2001-12-24 | 2013-04-24 | Intrasonics S.A.R.L. | Captioning system |
WO2004051909A2 (en) * | 2002-12-02 | 2004-06-17 | Matsushita Electric Industrial Co., Ltd. | Portable device for viewing real-time synchronized information from broadcasting sources |
EP2456104A1 (en) * | 2003-02-10 | 2012-05-23 | Nielsen Media Research, Inc. | Methods and apparatus to adaptively gather audience measurement data |
US8406341B2 (en) | 2004-01-23 | 2013-03-26 | The Nielsen Company (Us), Llc | Variable encoding and detection apparatus and methods |
JP2005223534A (en) * | 2004-02-04 | 2005-08-18 | Victor Co Of Japan Ltd | Receiver and method for generating summary graph |
JP2007096971A (en) * | 2005-09-29 | 2007-04-12 | Toshiba Corp | Wireless transmitter and wireless receiver |
WO2007076150A2 (en) * | 2005-12-23 | 2007-07-05 | Facebook, Inc. | Systems and methods for generating a social timeline |
KR100782858B1 (en) * | 2006-04-11 | 2007-12-06 | Samsung Electronics Co., Ltd. | Method and apparatus for synchronizing contents of home network devices |
WO2009049323A1 (en) * | 2007-10-11 | 2009-04-16 | Visible Technologies, Llc. | Systems and methods for consumer-generated media reputation management |
US20080133327A1 (en) * | 2006-09-14 | 2008-06-05 | Shah Ullah | Methods and systems for securing content played on mobile devices |
US20080092164A1 (en) * | 2006-09-27 | 2008-04-17 | Anjana Agarwal | Providing a supplemental content service for communication networks |
CN100588260C (en) * | 2007-04-13 | 2010-02-03 | Shenzhen Rongchuang Tianxia Technology Development Co., Ltd. | Method for inserting network advertisements into a video program |
EP1988731B1 (en) * | 2007-05-02 | 2011-10-12 | Alcatel Lucent | Method for establishing a parameterized wireless communication channel |
JP2009100470A (en) * | 2007-09-28 | 2009-05-07 | Fujifilm Corp | Device and method for reproducing data |
US8875212B2 (en) * | 2008-04-15 | 2014-10-28 | Shlomo Selim Rakib | Systems and methods for remote control of interactive video |
US8775647B2 (en) * | 2007-12-10 | 2014-07-08 | Deluxe Media Inc. | Method and system for use in coordinating multimedia devices |
WO2009100093A1 (en) * | 2008-02-05 | 2009-08-13 | Dolby Laboratories Licensing Corporation | Associating information with media content |
JP4524703B2 (en) * | 2008-02-29 | 2010-08-18 | Sony Corporation | Information processing apparatus and method, and program |
US20090235298A1 (en) * | 2008-03-13 | 2009-09-17 | United Video Properties, Inc. | Systems and methods for synchronizing time-shifted media content and related communications |
US20090249222A1 (en) | 2008-03-25 | 2009-10-01 | Square Products Corporation | System and method for simultaneous media presentation |
CN101256598A (en) * | 2008-04-07 | 2008-09-03 | Huawei Technologies Co., Ltd. | Method and device for improving network user satisfaction |
CA2721917A1 (en) * | 2008-04-24 | 2009-10-29 | Churchill Downs Technology Initiatives Company | Personalized transaction management and media delivery system |
EP2124449A1 (en) * | 2008-05-19 | 2009-11-25 | THOMSON Licensing | Device and method for synchronizing an interactive mark to streaming content |
JP2009289191A (en) * | 2008-05-30 | 2009-12-10 | Nippon Telegr & Teleph Corp <Ntt> | Contribution information provision device, contribution information browsing method, program, and storage medium |
KR20110076988A (en) * | 2008-10-08 | 2011-07-06 | AdKeeper Inc. | Managing internet advertising and promotional content |
US20100205628A1 (en) * | 2009-02-12 | 2010-08-12 | Davis Bruce L | Media processing methods and arrangements |
JP5244674B2 (en) * | 2009-03-31 | 2013-07-24 | Zenrin DataCom Co., Ltd. | Advertisement providing apparatus and program |
AU2010242814B2 (en) * | 2009-05-01 | 2014-07-31 | The Nielsen Company (Us), Llc | Methods, apparatus and articles of manufacture to provide secondary content in association with primary broadcast media content |
KR101944469B1 (en) * | 2009-07-16 | 2019-01-31 | Bluefin Labs, Inc. | Estimating and displaying social interest in time-based media |
KR20110020619A (en) * | 2009-08-24 | 2011-03-03 | Samsung Electronics Co., Ltd. | Method for play synchronization and device using the same |
US20110055017A1 (en) * | 2009-09-01 | 2011-03-03 | Amiad Solomon | System and method for semantic based advertising on social networking platforms |
CN102484686B (en) * | 2009-09-15 | 2016-01-20 | Thomson Licensing | Method and apparatus for providing side information |
EP2484040B1 (en) * | 2009-10-02 | 2018-03-07 | Telefonaktiebolaget LM Ericsson (publ) | Method for retransmission using checksums for identifying lost data packets |
CN102598109A (en) * | 2009-10-29 | 2012-07-18 | Thomson Licensing | Multiple-screen interactive screen architecture |
KR101702659B1 (en) * | 2009-10-30 | 2017-02-06 | Samsung Electronics Co., Ltd. | Apparatus and method for synchronizing moving picture contents and e-book contents, and system thereof |
US8463100B2 (en) * | 2009-11-05 | 2013-06-11 | Cosmo Research Company Limited | System and method for identifying, providing, and presenting content on a mobile device |
US20110153330A1 (en) * | 2009-11-27 | 2011-06-23 | i-SCROLL | System and method for rendering text synchronized audio |
US9094726B2 (en) * | 2009-12-04 | 2015-07-28 | At&T Intellectual Property I, Lp | Apparatus and method for tagging media content and managing marketing |
JP2011129009A (en) * | 2009-12-21 | 2011-06-30 | Cybird Co Ltd | Short sentence communication method |
US8660545B1 (en) * | 2010-01-06 | 2014-02-25 | ILook Corporation | Responding to a video request by displaying information on a TV remote and video on the TV |
JP5277184B2 (en) * | 2010-01-25 | 2013-08-28 | Japan Broadcasting Corporation (NHK) | Choice generation and presentation apparatus and choice generation and presentation program |
JP5573202B2 (en) * | 2010-01-29 | 2014-08-20 | Funai Electric Co., Ltd. | Portable terminal and information display interlocking system |
US8396874B2 (en) * | 2010-02-17 | 2013-03-12 | Yahoo! Inc. | System and method for using topic messages to understand media relating to an event |
US9084096B2 (en) * | 2010-02-22 | 2015-07-14 | Yahoo! Inc. | Media event structure and context identification using short messages |
US20110221962A1 (en) * | 2010-03-10 | 2011-09-15 | Microsoft Corporation | Augmented reality via a secondary channel |
US9185458B2 (en) * | 2010-04-02 | 2015-11-10 | Yahoo! Inc. | Signal-driven interactive television |
US20110307917A1 (en) * | 2010-06-11 | 2011-12-15 | Brian Shuster | Method and apparatus for interactive mobile coupon/offer delivery, storage and redemption system |
JP2011259383A (en) * | 2010-06-11 | 2011-12-22 | Nippon Telegr & Teleph Corp <Ntt> | Tv-program-related information display system, interterminal event synchronizing apparatus, interterminal event synchronizing method, and program |
JP5618404B2 (en) * | 2010-06-29 | 2014-11-05 | Rakuten, Inc. | Information providing apparatus, information providing method, information providing program, and recording medium on which information providing program is recorded |
US8424037B2 (en) * | 2010-06-29 | 2013-04-16 | Echostar Technologies L.L.C. | Apparatus, systems and methods for accessing and synchronizing presentation of media content and supplemental media rich content in response to selection of a presented object |
US8423409B2 (en) * | 2010-09-02 | 2013-04-16 | Yahoo! Inc. | System and method for monetizing user-generated web content |
US9071871B2 (en) | 2010-12-08 | 2015-06-30 | Microsoft Technology Licensing, Llc | Granular tagging of content |
US8918465B2 (en) * | 2010-12-14 | 2014-12-23 | Liveperson, Inc. | Authentication of service requests initiated from a social networking site |
US9729694B2 (en) * | 2010-12-29 | 2017-08-08 | Avaya Inc. | Method and apparatus for providing priority indicia associated with a plurality of messages |
CN103535028A (en) | 2010-12-30 | 2014-01-22 | Thomson Licensing | Method and system for providing additional content related to a displayed content |
US8898698B2 (en) * | 2011-01-21 | 2014-11-25 | Bluefin Labs, Inc. | Cross media targeted message synchronization |
US9100669B2 (en) * | 2011-05-12 | 2015-08-04 | At&T Intellectual Property I, Lp | Method and apparatus for associating micro-blogs with media programs |
US8949333B2 (en) * | 2011-05-20 | 2015-02-03 | Alejandro Backer | Systems and methods for virtual interactions |
US20110289532A1 (en) * | 2011-08-08 | 2011-11-24 | Lei Yu | System and method for interactive second screen |
EP2595405B1 (en) * | 2011-11-15 | 2020-02-26 | LG Electronics Inc. | Electronic device and method for providing contents recommendation service |
US20130173742A1 (en) * | 2011-12-28 | 2013-07-04 | United Video Properties, Inc. | Systems and methods for latency-based synchronized playback at multiple locations |
EP2611051B1 (en) * | 2011-12-29 | 2014-06-04 | Thomson Licensing | Method for synchronizing media services |
- 2012
- 2012-12-27 EP EP12809561.9A patent/EP2801021A1/en not_active Ceased
- 2012-12-27 JP JP2014551284A patent/JP2015509240A/en active Pending
- 2012-12-27 US US14/322,566 patent/US20140327677A1/en not_active Abandoned
- 2012-12-27 US US14/370,460 patent/US9578072B2/en not_active Expired - Fee Related
- 2012-12-27 US US14/370,456 patent/US20150003798A1/en not_active Abandoned
- 2012-12-27 CN CN201280066117.9A patent/CN104081782A/en active Pending
- 2012-12-27 EP EP12810026.0A patent/EP2801207A1/en not_active Withdrawn
- 2012-12-27 WO PCT/US2012/071811 patent/WO2013103578A1/en active Application Filing
- 2012-12-27 KR KR1020147018532A patent/KR20140121395A/en not_active Application Discontinuation
- 2012-12-27 CN CN201280066345.6A patent/CN104040479A/en active Pending
- 2012-12-27 WO PCT/US2012/071822 patent/WO2013103583A1/en active Application Filing
- 2012-12-27 KR KR1020147018652A patent/KR20140113934A/en not_active Application Discontinuation
- 2012-12-27 KR KR1020147018700A patent/KR20140121400A/en not_active Application Discontinuation
- 2012-12-27 JP JP2014551285A patent/JP6416626B2/en active Active
- 2012-12-27 JP JP2014551282A patent/JP2015512069A/en active Pending
- 2012-12-27 CN CN201280066353.0A patent/CN104081783A/en active Pending
- 2012-12-27 CN CN201280066095.6A patent/CN104041057A/en active Pending
- 2012-12-27 US US14/370,458 patent/US20150020096A1/en not_active Abandoned
- 2012-12-27 US US14/370,453 patent/US20140365302A1/en not_active Abandoned
- 2012-12-27 EP EP12810027.8A patent/EP2801208B1/en not_active Not-in-force
- 2012-12-27 KR KR1020147018653A patent/KR20140121399A/en not_active Application Discontinuation
- 2012-12-27 CN CN201280066136.1A patent/CN104205854A/en active Pending
- 2012-12-27 WO PCT/US2012/071813 patent/WO2013103580A1/en active Application Filing
- 2012-12-27 EP EP12814107.4A patent/EP2801209A1/en not_active Withdrawn
- 2012-12-27 KR KR1020147018701A patent/KR20140117387A/en not_active Application Discontinuation
- 2012-12-27 BR BR112014016761A patent/BR112014016761A8/en not_active IP Right Cessation
- 2012-12-27 BR BR112014016229A patent/BR112014016229A8/en not_active IP Right Cessation
- 2012-12-27 WO PCT/US2012/071824 patent/WO2013103584A1/en active Application Filing
- 2012-12-27 US US14/370,448 patent/US20150019644A1/en not_active Abandoned
- 2012-12-27 JP JP2014551283A patent/JP2015511418A/en active Pending
- 2012-12-27 KR KR1020147017549A patent/KR20140121387A/en not_active Application Discontinuation
- 2012-12-27 WO PCT/US2012/071817 patent/WO2013103581A1/en active Application Filing
- 2012-12-27 CN CN201280066352.6A patent/CN104041058A/en active Pending
- 2012-12-27 EP EP12810024.5A patent/EP2801205A1/en not_active Withdrawn
- 2012-12-27 EP EP12810025.2A patent/EP2801206A1/en not_active Ceased
- 2012-12-27 WO PCT/US2012/071820 patent/WO2013103582A1/en active Application Filing
- 2012-12-27 JP JP2014551286A patent/JP2015510305A/en active Pending
- 2012-12-27 JP JP2014551281A patent/JP2015513129A/en active Pending
Cited By (85)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10602225B2 (en) | 2001-09-19 | 2020-03-24 | Comcast Cable Communications Management, Llc | System and method for construction, delivery and display of iTV content |
US10149014B2 (en) | 2001-09-19 | 2018-12-04 | Comcast Cable Communications Management, Llc | Guide menu based on a repeatedly-rotating sequence |
US10587930B2 (en) | 2001-09-19 | 2020-03-10 | Comcast Cable Communications Management, Llc | Interactive user interface for television applications |
US11388451B2 (en) | 2001-11-27 | 2022-07-12 | Comcast Cable Communications Management, Llc | Method and system for enabling data-rich interactive television using broadcast database |
US11412306B2 (en) | 2002-03-15 | 2022-08-09 | Comcast Cable Communications Management, Llc | System and method for construction, delivery and display of iTV content |
US11070890B2 (en) | 2002-08-06 | 2021-07-20 | Comcast Cable Communications Management, Llc | User customization of user interfaces for interactive television |
US10491942B2 (en) | 2002-09-19 | 2019-11-26 | Comcast Cable Communications Management, Llc | Prioritized placement of content elements for iTV application |
US10687114B2 (en) | 2003-03-14 | 2020-06-16 | Comcast Cable Communications Management, Llc | Validating data of an interactive content application |
US11381875B2 (en) | 2003-03-14 | 2022-07-05 | Comcast Cable Communications Management, Llc | Causing display of user-selectable content types |
US10171878B2 (en) | 2003-03-14 | 2019-01-01 | Comcast Cable Communications Management, Llc | Validating data of an interactive content application |
US10237617B2 (en) | 2003-03-14 | 2019-03-19 | Comcast Cable Communications Management, Llc | System and method for blending linear content, non-linear content or managed content |
US11089364B2 (en) | 2003-03-14 | 2021-08-10 | Comcast Cable Communications Management, Llc | Causing display of user-selectable content types |
US10616644B2 (en) | 2003-03-14 | 2020-04-07 | Comcast Cable Communications Management, Llc | System and method for blending linear content, non-linear content, or managed content |
US10664138B2 (en) * | 2003-03-14 | 2020-05-26 | Comcast Cable Communications, Llc | Providing supplemental content for a second screen experience |
US11785308B2 (en) | 2003-09-16 | 2023-10-10 | Comcast Cable Communications Management, Llc | Contextual navigational control for digital television |
US10848830B2 (en) | 2003-09-16 | 2020-11-24 | Comcast Cable Communications Management, Llc | Contextual navigational control for digital television |
US11765445B2 (en) | 2005-05-03 | 2023-09-19 | Comcast Cable Communications Management, Llc | Validation of content |
US10575070B2 (en) | 2005-05-03 | 2020-02-25 | Comcast Cable Communications Management, Llc | Validation of content |
US11272265B2 (en) | 2005-05-03 | 2022-03-08 | Comcast Cable Communications Management, Llc | Validation of content |
US11832024B2 (en) | 2008-11-20 | 2023-11-28 | Comcast Cable Communications, Llc | Method and apparatus for delivering video and video-related content at sub-asset level |
USRE48546E1 (en) | 2011-06-14 | 2021-05-04 | Comcast Cable Communications, Llc | System and method for presenting content with time based metadata |
US20170041644A1 (en) * | 2011-06-14 | 2017-02-09 | Watchwith, Inc. | Metadata delivery system for rendering supplementary content |
US10306324B2 (en) | 2011-06-14 | 2019-05-28 | Comcast Cable Communication, Llc | System and method for presenting content with time based metadata |
US20170339462A1 (en) | 2011-06-14 | 2017-11-23 | Comcast Cable Communications, Llc | System And Method For Presenting Content With Time Based Metadata |
US20130191745A1 (en) * | 2012-01-10 | 2013-07-25 | Zane Vella | Interface for displaying supplemental dynamic timeline content |
US9301016B2 (en) | 2012-04-05 | 2016-03-29 | Facebook, Inc. | Sharing television and video programming through social networking |
US11126394B2 (en) * | 2012-05-01 | 2021-09-21 | Lisnr, Inc. | Systems and methods for content delivery and management |
US9805118B2 (en) * | 2012-06-29 | 2017-10-31 | Change Healthcare Llc | Transcription method, apparatus and computer program product |
US20140006020A1 (en) * | 2012-06-29 | 2014-01-02 | Mckesson Financial Holdings | Transcription method, apparatus and computer program product |
US10425671B2 (en) | 2012-08-31 | 2019-09-24 | Facebook, Inc. | Sharing television and video programming through social networking |
US9743157B2 (en) | 2012-08-31 | 2017-08-22 | Facebook, Inc. | Sharing television and video programming through social networking |
US20140067947A1 (en) * | 2012-08-31 | 2014-03-06 | Ime Archibong | Sharing Television and Video Programming Through Social Networking |
US10028005B2 (en) | 2012-08-31 | 2018-07-17 | Facebook, Inc. | Sharing television and video programming through social networking |
US10142681B2 (en) | 2012-08-31 | 2018-11-27 | Facebook, Inc. | Sharing television and video programming through social networking |
US9912987B2 (en) | 2012-08-31 | 2018-03-06 | Facebook, Inc. | Sharing television and video programming through social networking |
US10154297B2 (en) | 2012-08-31 | 2018-12-11 | Facebook, Inc. | Sharing television and video programming through social networking |
US10158899B2 (en) | 2012-08-31 | 2018-12-18 | Facebook, Inc. | Sharing television and video programming through social networking |
US9854303B2 (en) | 2012-08-31 | 2017-12-26 | Facebook, Inc. | Sharing television and video programming through social networking |
US9549227B2 (en) | 2012-08-31 | 2017-01-17 | Facebook, Inc. | Sharing television and video programming through social networking |
US10257554B2 (en) | 2012-08-31 | 2019-04-09 | Facebook, Inc. | Sharing television and video programming through social networking |
US9807454B2 (en) | 2012-08-31 | 2017-10-31 | Facebook, Inc. | Sharing television and video programming through social networking |
US9386354B2 (en) | 2012-08-31 | 2016-07-05 | Facebook, Inc. | Sharing television and video programming through social networking |
US10405020B2 (en) | 2012-08-31 | 2019-09-03 | Facebook, Inc. | Sharing television and video programming through social networking |
US9992534B2 (en) | 2012-08-31 | 2018-06-05 | Facebook, Inc. | Sharing television and video programming through social networking |
US9578390B2 (en) | 2012-08-31 | 2017-02-21 | Facebook, Inc. | Sharing television and video programming through social networking |
US9723373B2 (en) | 2012-08-31 | 2017-08-01 | Facebook, Inc. | Sharing television and video programming through social networking |
US9699485B2 (en) | 2012-08-31 | 2017-07-04 | Facebook, Inc. | Sharing television and video programming through social networking |
US9461954B2 (en) | 2012-08-31 | 2016-10-04 | Facebook, Inc. | Sharing television and video programming through social networking |
US9686337B2 (en) | 2012-08-31 | 2017-06-20 | Facebook, Inc. | Sharing television and video programming through social networking |
US9674135B2 (en) | 2012-08-31 | 2017-06-06 | Facebook, Inc. | Sharing television and video programming through social networking |
US9667584B2 (en) | 2012-08-31 | 2017-05-30 | Facebook, Inc. | Sharing television and video programming through social networking |
US9491133B2 (en) | 2012-08-31 | 2016-11-08 | Facebook, Inc. | Sharing television and video programming through social networking |
US9660950B2 (en) | 2012-08-31 | 2017-05-23 | Facebook, Inc. | Sharing television and video programming through social networking |
US9497155B2 (en) * | 2012-08-31 | 2016-11-15 | Facebook, Inc. | Sharing television and video programming through social networking |
US11361542B2 (en) | 2012-09-12 | 2022-06-14 | 2Mee Ltd | Augmented reality apparatus and method |
US11115722B2 (en) | 2012-11-08 | 2021-09-07 | Comcast Cable Communications, Llc | Crowdsourcing supplemental content |
US10880609B2 (en) | 2013-03-14 | 2020-12-29 | Comcast Cable Communications, Llc | Content event messaging |
US11601720B2 (en) | 2013-03-14 | 2023-03-07 | Comcast Cable Communications, Llc | Content event messaging |
US11070892B2 (en) * | 2013-03-14 | 2021-07-20 | The Nielsen Company (Us), Llc | Methods and apparatus to present supplemental media on a second screen |
US10868682B2 (en) | 2013-06-13 | 2020-12-15 | Pushfor, Ltd. | System and method for monitoring usage of an electronic document |
US10938883B2 (en) * | 2013-06-21 | 2021-03-02 | Tencent Technology (Shenzhen) Company Limited | Method and system for controlling media information display on multiple terminals |
US20150135071A1 (en) * | 2013-11-12 | 2015-05-14 | Fox Digital Entertainment, Inc. | Method and apparatus for distribution and presentation of audio visual data enhancements |
US20150178290A1 (en) * | 2013-12-25 | 2015-06-25 | Canon Kabushiki Kaisha | Display control apparatus, display control method, and computer-readable storage medium |
US11783842B2 (en) * | 2014-02-28 | 2023-10-10 | Comcast Cable Communications, Llc | Voice-enabled screen reader |
US11363325B2 (en) * | 2014-03-20 | 2022-06-14 | 2Mee Ltd | Augmented reality apparatus and method |
US11094131B2 (en) | 2014-06-10 | 2021-08-17 | 2Mee Ltd | Augmented reality apparatus and method |
US10324619B2 (en) | 2014-07-17 | 2019-06-18 | Facebook, Inc. | Touch-based gesture recognition and application navigation |
US10007419B2 (en) * | 2014-07-17 | 2018-06-26 | Facebook, Inc. | Touch-based gesture recognition and application navigation |
US20160018982A1 (en) * | 2014-07-17 | 2016-01-21 | Facebook, Inc. | Touch-Based Gesture Recognition and Application Navigation |
US11770446B2 (en) * | 2014-08-28 | 2023-09-26 | Ebay Inc. | Systems and methods for providing complementary content on linked machines |
US20230388377A1 (en) * | 2014-08-28 | 2023-11-30 | Ebay Inc. | Systems and methods for providing complementary content on linked machines |
US20160063015A1 (en) * | 2014-08-28 | 2016-03-03 | Kent Andrew Edmonds | Systems and methods for providing complimentary content on linked machines |
US11330319B2 (en) | 2014-10-15 | 2022-05-10 | Lisnr, Inc. | Inaudible signaling tone |
US11783382B2 (en) | 2014-10-22 | 2023-10-10 | Comcast Cable Communications, Llc | Systems and methods for curating content metadata |
US20170180293A1 (en) * | 2015-12-17 | 2017-06-22 | International Business Machines Corporation | Contextual temporal synchronization markers |
US20170264585A1 (en) * | 2016-02-26 | 2017-09-14 | Shanghai Hode Information Technology Co.,Ltd. | Method and apparatus for displaying comment information |
US10708215B2 (en) * | 2016-02-26 | 2020-07-07 | Shanghai Hode Information Technology Co., Ltd. | Method and apparatus for displaying comment information |
US10996845B2 (en) * | 2016-03-25 | 2021-05-04 | Alibaba Group Holding Limited | Method, application, browser, and electronic device for providing webpage content |
US20170330292A1 (en) * | 2016-05-16 | 2017-11-16 | Adobe Systems Incorporated | Correlator |
US11461806B2 (en) * | 2016-09-12 | 2022-10-04 | Sonobeacon Gmbh | Unique audio identifier synchronization system |
US10841531B2 (en) | 2018-10-18 | 2020-11-17 | Fujitsu Limited | Display control apparatus and display control method |
USD944281S1 (en) * | 2019-03-26 | 2022-02-22 | Facebook, Inc. | Display device with graphical user interface |
CN111352627A (en) * | 2020-02-27 | 2020-06-30 | Zhengcaiyun Co., Ltd. | Page skeleton screen generation method, device, equipment and readable storage medium |
US11470389B2 (en) * | 2020-11-23 | 2022-10-11 | The Boston Consulting Group, Inc. | Methods and systems for context-sensitive manipulation of an object via a presentation software |
US20220210506A1 (en) * | 2020-11-23 | 2022-06-30 | The Boston Consulting Group, Inc. | Methods And Systems For Context-Sensitive Manipulation of an Object via a Presentation Software |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140327677A1 (en) | Method and system for providing a graphical representation on a second screen of social messages related to content on a first screen | |
WO2012092247A1 (en) | Method and system for providing additional content related to a displayed content | |
US20140150023A1 (en) | Contextual user interface | |
US20130007793A1 (en) | Primary screen view control through kinetic ui framework | |
US20130054319A1 (en) | Methods and systems for presenting a three-dimensional media guidance application | |
US9782681B2 (en) | Methods and systems for controlling media guidance application operations during video gaming applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: THOMSON LICENSING, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WALKER, MARK LEROY;REEL/FRAME:033259/0731 Effective date: 20130801 |
|
AS | Assignment |
Owner name: THOMSON LICENSING DTV, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THOMSON LICENSING;REEL/FRAME:041370/0433 Effective date: 20170113 |
|
AS | Assignment |
Owner name: THOMSON LICENSING DTV, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:THOMSON LICENSING;REEL/FRAME:041378/0630 Effective date: 20170113 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |