US20130033642A1 - Data transmission across independent streams - Google Patents


Info

Publication number
US20130033642A1
US20130033642A1 (application US13/566,254)
Authority
US
United States
Prior art keywords
related data
data components
components
synchronization
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/566,254
Other versions
US9001728B2
Inventor
Wade Wan
Rajesh Mamidwar
Xuemin Chen
Marcus Kellerman
Brett Tischler
Current Assignee
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US13/566,254, granted as US9001728B2
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TISHLER, BRETT, KELLERMAN, MARCUS, CHEN, XUEMIN, MAMIDWAR, RAJESH, WAN, WADE
Publication of US20130033642A1
Priority to US14/677,757, granted as US9538199B2
Application granted
Publication of US9001728B2
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED reassignment AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED MERGER (SEE DOCUMENT FOR DETAILS). Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED reassignment AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED CORRECTIVE ASSIGNMENT TO CORRECT THE EFFECTIVE DATE PREVIOUSLY RECORDED ON REEL 047229 FRAME 0408. ASSIGNOR(S) HEREBY CONFIRMS THAT THE EFFECTIVE DATE IS 09/05/2018. Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED reassignment AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED CORRECTIVE ASSIGNMENT TO CORRECT THE PATENT NUMBER 9,385,856 TO 9,385,756 PREVIOUSLY RECORDED AT REEL: 47349 FRAME: 001. ASSIGNOR(S) HEREBY CONFIRMS THE MERGER. Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/65Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience
    • H04N19/68Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience involving the insertion of resynchronisation markers into the bitstream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234327Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L45/00Routing or path finding of packets in data switching networks
    • H04L45/74Address processing for routing
    • H04L45/745Address table lookup; Address filtering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/70Media network packetisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H04N19/39Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability involving multiple description coding [MDC], i.e. with separate layers being structured as independently decodable descriptions of input picture data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/65Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience
    • H04N19/66Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using error resilience involving data partitioning, i.e. separation of data into packets or partitions according to importance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23602Multiplexing isochronously with the video sync, e.g. according to bit-parallel or bit-serial interface formats, as SDI
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2383Channel coding or modulation of digital bit-stream, e.g. QPSK modulation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242Synchronization processes, e.g. processing of PCR [Program Clock References]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving MPEG packets from an IP network
    • H04N21/4382Demodulation or channel decoding, e.g. QPSK demodulation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet

Definitions

  • Transmission of program data is typically carried out over a single common layer. For example, audio and video content for the program is carried over the same transport stream.
  • the program data is synchronized within the same transport stream.
  • data for multiple programs is multiplexed for transport in a single stream.
  • the modulation of the transport stream is often optimized for the programs being transmitted.
  • FIGS. 1 and 2 are graphical representations of examples of systems for transmission of related data components across independent streams in accordance with various embodiments of the present disclosure.
  • FIG. 3 is a graphical representation of an example of a transmitting device of FIGS. 1 and 2 in accordance with various embodiments of the present disclosure.
  • FIG. 4 is a graphical representation of an example of a receiving device of FIG. 1 in accordance with various embodiments of the present disclosure.
  • FIG. 5 is a graphical representation of an example of a receiving device of FIG. 2 in accordance with various embodiments of the present disclosure.
  • FIGS. 6 and 7 are flowcharts illustrating examples of transmission of related data components across independent streams in accordance with various embodiments of the present disclosure.
  • Transmission is typically performed over a single bitstream that carries all of the data associated with a channel or a source.
  • high definition (HD) channels may be carried over the same layer using a quadrature amplitude modulation (QAM) that has been optimized for the existing data.
  • the data may be separated into components that may be transmitted over different bitstreams and recombined after receipt.
  • related data such as audio, video, and/or other content (e.g., channel guides, closed captioning, encryption information, etc.) may be separated and transmitted on independent transport streams.
  • additional information may be included to enhance some or all of the existing channels.
  • Enhanced information or services may be transmitted over an independent stream without modifying the modulation (e.g., QAM) of the existing transport stream.
  • This may be applied to layered coding techniques such as, e.g., scalable video coding (SVC) where different temporal, spatial, and/or quality resolutions may be transported in separate layers. The separate data components may then be recombined for processing at the receiving end.
  • a transmitting device 103 sends one or more streams 106 of data to a receiving device 109 (e.g., a receiver or transceiver).
  • the transmitting device 103 may comprise suitable circuitry that may include, e.g., processor(s), application specific hardware (ASIC), interfaces and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive, process, and distribute data to the receiving device 109 via a bitstream such as the transport stream 106 .
  • the transmitting device 103 which may be included in a headend system, may be configured to provide various services such as, e.g., distribution, multicast, and/or quality of service for reliable and timely transmission of the data to the receiving device 109 .
  • the transmitting device 103 may utilize, for example, a cable TV network, a satellite broadcasting network, the Internet protocol (IP) data network such as the Internet, and/or a wireless communication network for delivery of the data to the receiving device 109 .
  • Data from multiple channels 112 may be combined and modulated 115 for transmission through a common transport stream (e.g., QAM 1) 106 .
  • the transport stream 106 may carry data components related to a channel as well as data components that are unrelated to that channel (e.g., data components related to another channel).
  • information from a channel 112 may be separated for transmission over N different transport streams 106 .
  • the data may be separated into a base layer component 118 and an enhancement layer component 121 .
  • the base layer component 118 may be used to produce, e.g., standard or HD video using legacy decoders.
  • the enhanced layer component 121 can include additional information that, when combined with information in the base layer component 118 , may be used by more advanced (or enhanced) decoders to produce enhanced video for display.
  • the base layer component 118 may include channel information to support decoding of a 1080p24 HD video.
  • the enhancement layer component 121 may include additional channel information to support decoding the HD video at a higher resolution such as, e.g., 2K×4K.
  • the enhancement layer component 121 may include additional channel information to support decoding the HD video at a different rate or format such as, e.g., 1080p60 or 1080p120. In either case, there may be multiple enhancement layers to support different display rates or resolutions.
  • the base layer component 118 may support decoding at 1080p24, a first enhancement layer component may allow decoding at 1080p60 or 1080i60, and a second enhancement layer component (not shown) may allow decoding at 1080p120 or 1080i120. These first and second enhancement layer components may be transmitted in the same or different transport streams 106 .
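  • The layered decoding scheme above can be sketched in code. The following is a hypothetical illustration: the layer names, format strings, and selection logic are assumptions for the sketch, not taken from the patent.

```python
# Hypothetical sketch: a decoder requests the ordered set of layer
# components it needs to reach a target format. A legacy decoder asks
# only for the base layer; enhanced decoders consume additional layers.
LAYERS = [
    ("base", "1080p24"),            # decodable by legacy decoders alone
    ("enhancement_1", "1080p60"),   # base + enh1 -> higher frame rate
    ("enhancement_2", "1080p120"),  # base + enh1 + enh2 -> highest rate
]

def layers_for_target(target):
    """Return the layer components needed to decode the requested
    format; each enhancement layer builds on all layers below it."""
    needed = []
    for name, fmt in LAYERS:
        needed.append(name)
        if fmt == target:
            return needed
    raise ValueError("unsupported target format: %s" % target)

print(layers_for_target("1080p24"))   # base layer only
print(layers_for_target("1080p120"))  # base plus both enhancement layers
```

The cumulative structure mirrors the bullet above: each higher rate requires every lower layer, so the first and second enhancement layers are only useful alongside the base layer component.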
  • the enhancement layer components 121 may also include content other than audio and/or video information.
  • an enhancement layer component 121 may include guide information that may be associated with that specific channel or may be associated with multiple channels. If the guide information is associated with multiple channels, then a single enhancement layer component 121 including the guide information may be associated with each of the base layer components 118 for the channels.
  • the enhancement layer components 121 may include mosaic information associated with the base layer component 118 .
  • the mosaic information may include a miniature video (e.g., a network icon, program advertisement, or information streamer) that may be decoded at the same time as the video content in the base layer component 118 and rendered over the base layer video.
  • Mosaic information may also include multiple small video components that may be decoded at the same time and recombined for display.
  • enhancement layer components may include information for multiple small videos that may be transported across multiple streams. The base layer and enhancement layer components may be decoded at the same time to provide the video output.
  • Other implementations may include one or more encryption keys or other encryption information in the enhancement layer component(s) 121 .
  • the base layer components 118 of four channels (A-D) are combined and modulated 115 in the transmitting device 103 before being sent in a first transport stream (e.g., QAM 1) 106 .
  • data components from more or fewer channels may be combined for a transport stream 106 .
  • Corresponding enhancement layer components 121 of the four channels (A-D) are also combined and modulated 115 in the transmitting device 103 before being sent in a second transport stream (e.g., QAM 2) 106 . While QAM is illustrated for the transport streams 106 of FIG. 1 , other modulation schemes may also be utilized.
  • the transport streams 106 are received and demodulated 124 by the receiving device 109 .
  • the receiving device 109 may comprise suitable circuitry that may include, e.g., processor(s), application specific hardware (ASIC), interfaces and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive and process the data components from the transmitting device 103 via a bitstream such as the transport stream 106 .
  • the receiving device 109 (e.g., a set-top box (STB)) is configured to demodulate 124 the received transport stream(s) and provide the appropriate data components to decoders 127 and/or 130 for decoding.
  • While a legacy decoder 127 may only be able to decode content in the base layer component 118 , a newer enhanced decoder 130 may be able to decode higher level or enhanced content in the enhancement layer components 121 . If only the basic HD video of a channel is desired, the base layer component 118 can be decoded by a legacy decoder 127 and the enhanced layer component 121 may be ignored.
  • the base layer component 118 and the enhanced layer component 121 are provided to a more advanced (or enhanced) decoder 130 for decoding.
  • an enhanced layer component 121 may be provided to a plurality of decoders for use in decoding.
  • the demodulation 124 may also allow concurrent demodulation of multiple channels in a transport stream 106 and provision of the demodulated data components to different decoders for decoding.
  • the separated components are recombined at the receiving end. Synchronization of the component data is needed to ensure proper decoding. For example, separating SVC components across different QAM transport streams can introduce issues such as, e.g., synchronization deviations across the different transport streams, and the need to properly define encryption, guide, and mosaic information so that it can be used across different streams when such data would typically reside in a single stream.
  • the transport streams are demodulated and the demodulated components synchronized. Synchronization of the data components may be accomplished using one or more synchronization tags including, e.g., timestamps and/or frame numbers. Formatting of the component data may also provide information that may be used for synchronization of the data components obtained from the separate streams.
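  • The timestamp-based recombination described above can be sketched as follows. This is a hypothetical illustration; the component representation and field names (`timestamp`, `layer`) are assumptions for the sketch.

```python
# Hypothetical receiver-side pairing of data components arriving on
# independent transport streams, using the time stamp carried in each
# component's synchronization tag.
from collections import defaultdict

def pair_by_timestamp(streams):
    """streams: one list of components per demodulated transport
    stream; each component dict carries a 'timestamp' sync tag.
    Returns {timestamp: [components...]} ready for recombination."""
    groups = defaultdict(list)
    for stream in streams:
        for component in stream:
            groups[component["timestamp"]].append(component)
    return dict(groups)

qam1 = [{"timestamp": 100, "layer": "base"},
        {"timestamp": 101, "layer": "base"}]
qam2 = [{"timestamp": 100, "layer": "enhancement"},
        {"timestamp": 101, "layer": "enhancement"}]
paired = pair_by_timestamp([qam1, qam2])
# Components that share a timestamp belong to the same frame and can
# be handed to the enhanced decoder together.
```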
  • Referring to FIG. 2 , shown is another example of a system for transmission of related data components across independent streams.
  • the data components are sent to the receiving device 109 through a plurality of transport streams 106 .
  • a wideband tuner 224 demodulates components from multiple transport streams and distributes the information from one or more streams to the appropriate decoder(s) 127 and/or 130 , e.g., based upon its capabilities.
  • the wideband tuner 224 may be configured to combine the base and enhanced components before distributing the content to the appropriate decoder. This distribution of channel content allows simultaneous decoding by different decoders.
  • the transmitting device 103 comprises suitable circuitry that may include, e.g., encoder(s) 303 , processor(s) 306 , memory 309 , application specific hardware (ASIC), interfaces, and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to process the data before modulation and transmission of the data components via one or more transport streams 106 .
  • the transmitting device 103 receives data 312 such as audio, video, and/or other content (e.g., channel guides, closed captioning, encryption information, etc.) for transmission and separates the received data 312 into related data components for transmission through the one or more transport streams 106 .
  • the transmitting device 103 receives the transmission data 312 and separates the received data 312 into data components.
  • the encoder 303 may be configured to encode the received data 312 into a SVC base layer component and one or more SVC enhancement layer components.
  • Other data components such as, e.g., guide information may be received by the transmitting device 103 as separate data 312 .
  • mosaic information may be separated into different data components and sent through different transport streams 106 .
  • the transmitting device 103 may also be configured to add a synchronization tag to the separated data components that may be used for synchronization of related components that are sent via different streams 106 .
  • the synchronization tag may be placed in a predefined location in the header information of the data components.
  • the synchronization tag may include a time stamp that may be used for synchronization of the separated data components.
  • the program clock reference (PCR) for each transport stream 106 may be used for synchronization.
  • the PCR for the different streams 106 may be synchronized by the transmitting device 103 and a time stamp corresponding to the synchronized PCRs may be added to each of the separated components. In other implementations, the clocks may not be synchronized.
  • the PCR associated with the base layer component may be used as the master clock and offset corrections may be determined by the transmitting device 103 for the PCRs associated with the other streams 106 .
  • the offset correction value may be included in the synchronization tag of the data component corresponding to the transport stream 106 .
  • the differences in the PCRs may then be compensated using the offset correction values when the transmitted data components are received in the receiving device 109 .
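  • The master-clock-with-offsets scheme can be sketched as arithmetic on PCR tick counts. The function names and the example values below are illustrative assumptions; only the idea (offset computed at the transmitter, applied at the receiver) comes from the description above.

```python
# Hypothetical sketch of PCR offset correction: the PCR of the
# base-layer stream acts as the master clock, and each other stream's
# synchronization tag carries an offset so the receiver can map that
# stream's PCR back onto the master timeline.

def offset_correction(master_pcr, stream_pcr):
    """Computed at the transmitting device and placed in the
    synchronization tag of the corresponding data component."""
    return master_pcr - stream_pcr

def to_master_time(stream_pcr, correction):
    """Applied at the receiving device so timestamps from different
    transport streams become directly comparable."""
    return stream_pcr + correction

master = 27_000_000   # illustrative PCR of the base-layer stream
enh = 26_998_500      # illustrative PCR of an enhancement-layer stream
corr = offset_correction(master, enh)
assert to_master_time(enh, corr) == master  # streams now comparable
```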
  • the synchronization tag includes a frame identifier in the separated data components to indicate the relationship between the different components.
  • the frame identifiers comprise a corresponding frame number and/or a channel or program identifier.
  • the channel or program identifier may be stored in a program association table or a program mapping table in the transmitting device 103 and the receiving device 109 .
  • the frame identifier of the common data component may include an identification code that may be used to determine its association with each of the other data components.
  • the identification code may be included in the other data components or may include information that may be used to determine the other data components.
  • the identification code may indicate which of the other transport streams 106 include the related data components.
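  • A possible layout of such a synchronization tag can be sketched as a small record. The field names below are illustrative assumptions; the description above mentions timestamps, frame numbers, channel/program identifiers, and an identification code linking a shared component to its peers.

```python
# Hypothetical structure of a synchronization tag placed in a
# predefined location in a data component's header information.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SyncTag:
    timestamp: Optional[int] = None    # e.g., tied to a synchronized PCR
    frame_number: Optional[int] = None
    program_id: Optional[int] = None   # resolved via a program map table
    related_streams: List[int] = field(default_factory=list)  # id code

# A guide component shared by channels carried on streams 1-4 could
# advertise which transport streams hold its related components:
guide_tag = SyncTag(frame_number=42, program_id=7,
                    related_streams=[1, 2, 3, 4])
print(guide_tag.related_streams)
```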
  • the separated data components may then be sent to a plurality of multiplexers 315 , which are configured to merge the data components for modulation 115 and transmission to the receiving device 109 via different transport streams 106 .
  • the transmitting device 103 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to merge and modulate the data components for transmission.
  • the modulated data 318 is then transmitted to the receiving device 109 in the corresponding transport stream 106 . As illustrated in the example of FIG. 3 , a plurality of base layer components 118 are merged by a first multiplexer 315 for modulation 115 and transmission via a first transport stream (QAM 1) 106 and a plurality of enhanced layer components 121 are merged by a second multiplexer 315 for modulation 115 and transmission via a second transport stream (QAM 2) 106 .
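  • The multiplexing step can be sketched as grouping components by layer before modulation. This is a minimal illustration under assumed data shapes; real multiplexers interleave packets rather than filter lists.

```python
# Hypothetical sketch of the multiplexing arrangement: base-layer
# components from several channels are merged for one transport stream
# (QAM 1) and the corresponding enhancement-layer components for
# another (QAM 2).

def multiplex(components, layer):
    """Merge the components of the given layer into one payload
    destined for modulation onto a single transport stream."""
    return [c for c in components if c["layer"] == layer]

channels = ["A", "B", "C", "D"]
components = ([{"channel": ch, "layer": "base"} for ch in channels] +
              [{"channel": ch, "layer": "enhancement"} for ch in channels])

qam1 = multiplex(components, "base")         # first transport stream
qam2 = multiplex(components, "enhancement")  # second transport stream
print(len(qam1), len(qam2))
```

As the bullet after FIG. 1 notes, nothing forces this split: a mix of base and enhancement components could equally be merged for a single stream.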
  • While FIG. 1 shows the base and enhanced layer components 118 and 121 being sent through separate streams 106 , a combination of base and enhanced layer components 118 and 121 may be merged by a multiplexer 315 for modulation 115 and transmission via a transport stream 106 .
  • the modulation 115 of the transmitting device 103 may also include one or more buffer(s) to correct for slight differences between clock speeds of the modulation 115 .
  • the buffer(s) may be at the modulation output. In the ideal case, all of the modulation 115 is performed at the same clock speed so there is no long term drift. In reality, slight differences exist between the modulation speeds (e.g., modulation 115 associated with one transport stream 106 is running at 27 MHz and modulation 115 associated with another transport stream 106 is running at 27+Δ MHz) so that over time a long term drift across the transport streams 106 can occur. This may compromise the ability of the receiving device 109 to resynchronize the related data components for decoding using the synchronization tags.
  • the transmitting device 103 may utilize the modulation output buffer(s) and monitor the data flow across the transport streams 106 . The transmitting device 103 may then adjust the flow over the streams 106 to keep the transmission of the related data components close to each other.
  • the transmitting device 103 may be configured to correct for the drift by adjusting the modulation rate by reducing (or increasing) the clock speed if more (or less) data is being modulated for transmission on the corresponding transport stream 106 compared to the other streams 106 .
  • the transmitting device 103 can monitor the condition of the modulation buffer(s) by, e.g., monitoring the buffer levels or the rate at which data is removed from the buffer(s) and adjusting the modulation rate accordingly.
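  • The buffer-driven rate adjustment can be sketched as a simple proportional control loop. The gain, buffer levels, and control direction below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of drift correction: the transmitting device
# monitors each modulation output buffer and nudges that stream's
# clock so no transport stream drifts ahead of the others.

def adjust_clock(clock_hz, buffer_level, target_level, gain=1e-6):
    """If the buffer sits below target, data is being modulated out
    faster than it arrives, so the clock is slowed; above target,
    the clock is sped up. A small proportional gain keeps the
    correction gentle."""
    error = buffer_level - target_level
    return clock_hz * (1.0 + gain * error)

clock = 27_000_000.0  # nominal 27 MHz modulation clock
slower = adjust_clock(clock, buffer_level=800, target_level=1000)
faster = adjust_clock(clock, buffer_level=1200, target_level=1000)
assert slower < clock < faster  # corrections pull toward the target
```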
  • the receiving device 109 comprises suitable circuitry that may include, e.g., decoder(s) 127 and 130 , processor(s), memory, application specific hardware (ASIC), interfaces, and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to process the data components for decoding.
  • the receiving device 109 receives and demodulates 124 each of the plurality of transport streams 106 including the data components.
  • each transport stream 106 is separately demodulated 124 before sending to a demultiplexer 403 to separate the different data components in the transport stream 106 .
  • the demultiplexer 403 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to separate the data components for decoding by the appropriate decoder 127 and/or 130 .
  • the receiving device 109 is configured to identify related data components based at least in part upon the synchronization tag included in the data component.
  • the synchronization tag may be obtained from the data components and used for the identification of related data components.
  • the program clock reference (PCR) for the different streams 106 may be synchronized by the transmitting device 103 and a time stamp corresponding to the synchronized PCRs may be included in each of the related data components.
  • the receiving device 109 may identify related data components with the same time stamp.
  • the clocks for each transport stream 106 may not be synchronized.
  • the transmitting device 103 may use the clock for one transport stream 106 as the master and provide offset corrections for the other clocks.
  • the PCR associated with the base layer component may be used as the master clock and offset corrections for the PCRs associated with the other streams 106 may be included in the corresponding data components.
  • the receiving device 109 may use the offset correction value included in the synchronization tag of the data component and the PCR of the corresponding transport stream 106 to identify data components that are related.
  • the synchronization tag includes a frame identifier in the data components to indicate the relationship between the components.
  • the frame identifiers may comprise a corresponding frame number, a channel or program identifier, and/or other relationship information.
  • the receiving device 109 may use the channel or program identifier to identify related data components.
  • where a data component is related to a plurality of other data components in different channels or to multiple data components in the same channel (e.g., guide information that may be related to multiple channels or encryption keys that may be used to decrypt the base layer component and the corresponding enhancement layer components), the frame identifier of the common data component (e.g., guide or encryption information) may include an identification code that may be used to determine its association with each of the other data components.
  • the identification code may be included in the other data components or may include information that may be used to determine the other data components.
  • the identification code may indicate which of the other transport streams 106 include the related data components.
  • the receiving device 109 may include a program association table or a program mapping table that may be accessed by the receiving device 109 for identification of related data components. When the synchronization tag does not include a time stamp or the time stamps may not be relied upon, the receiving device 109 may use information from the frame identifier to synchronize the related data components.
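The frame-identifier scheme above can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the `DataComponent` fields and `group_related` helper are assumed names standing in for the program/channel identifier and frame number carried in each synchronization tag.

```python
# Hypothetical sketch: grouping data components received on several
# transport streams by the frame identifier in each synchronization tag.
# All names here are illustrative assumptions, not from the patent.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class DataComponent:
    stream_id: int      # transport stream the component arrived on
    program_id: str     # channel/program identifier from the frame identifier
    frame_number: int   # frame number from the frame identifier
    kind: str           # e.g. "base", "enhancement", "guide", "encryption"

def group_related(components):
    """Group components that share a program identifier and frame number."""
    groups = defaultdict(list)
    for c in components:
        groups[(c.program_id, c.frame_number)].append(c)
    return dict(groups)

received = [
    DataComponent(1, "A", 42, "base"),
    DataComponent(2, "A", 42, "enhancement"),
    DataComponent(1, "B", 42, "base"),
]
related = group_related(received)
# Components of program "A", frame 42 arrived on streams 1 and 2
```

A program association or mapping table would play the role of `program_id` lookup here; the grouping key is whatever relationship information the frame identifier carries.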
  • the frame identifier may include information such as, e.g., the number of related data components and/or an indication of the relationship between the related data components (e.g., base layer, enhancement layer, encryption, guide, or mosaic component).
  • the relationship information may also be used to determine which decoder(s) 127 and/or 130 may receive the data component for decoding.
  • the demultiplexers 403 may route the related data components to the appropriate decoder. For example, as illustrated in FIG. 4 , a base layer component may be sent to a legacy decoder 127 to produce, e.g., standard or HD video.
  • An enhanced layer component can include additional information that, when combined with information in the base layer component, may be used by more advanced (or enhanced) decoders to produce enhanced video with a higher resolution, a higher rate, etc.
  • the base layer component and one or more enhanced layer components can be routed to an enhanced decoder 130 for decoding.
  • the decoder 130 may be configured to decode a SVC base layer component and one or more SVC enhancement layer components to generate video with higher temporal, spatial, and/or quality resolutions than can be produced from the base layer alone. While the example of FIG. 4 depicts two demultiplexers 403 and two decoders 127 and 130 , additional demultiplexers 403 and decoders 127 and/or 130 may be included.
  • Legacy decoders 127 may also be configured to receive and process related data components. For instance, a related data component including encryption or guide information may be routed to a legacy decoder 127 for use with the base layer component.
  • the legacy and enhanced (or advanced) decoders 127 and 130 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to decode the data components and provide the decoded data 406 for rendering or further processing.
  • the demultiplexers 403 and/or decoders 127 and 130 may also include a buffer to store related data components to allow for variations in routing times between the data components.
  • the receiving device 109 may be configured to control the routing of related data components from the demultiplexers 403 to coordinate the arrival of the related data components at the decoder 127 or 130. If a delay occurs between related data components reaching the appropriate decoder, the buffer may be utilized to adjust for the delay.
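The buffering described above can be modeled as a small reorder buffer that holds components until the full related set for a frame has arrived. This is a minimal sketch under assumed names (`SyncBuffer`, component "kinds"), not the patent's design.

```python
# Illustrative sketch (names are assumptions): a reorder buffer that holds
# related components until the complete set for a frame has arrived,
# absorbing differences in routing time between demultiplexers.
class SyncBuffer:
    def __init__(self, expected_kinds):
        self.expected = set(expected_kinds)   # e.g. {"base", "enhancement"}
        self.pending = {}                     # frame number -> {kind: payload}

    def push(self, frame, kind, payload):
        """Buffer one component; return the complete set once all arrive."""
        slot = self.pending.setdefault(frame, {})
        slot[kind] = payload
        if set(slot) == self.expected:
            return self.pending.pop(frame)    # ready for the decoder
        return None                           # still waiting on a component

buf = SyncBuffer({"base", "enhancement"})
assert buf.push(7, "base", b"...") is None    # enhancement still in transit
ready = buf.push(7, "enhancement", b"...")    # set now complete
```

The buffer depth needed depends on the worst-case skew between streams, which is why the transmit side also works to bound long-term drift.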
  • the receiving device 109 includes a wideband tuner 224 and legacy and enhanced decoders 127 and 130 .
  • the wideband tuner 224 may comprise suitable circuitry that may include, e.g., processor(s) 403 , memory 406 , application specific hardware (ASIC), interfaces and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive, process, and distribute the data components received via the multiple transport streams 106 .
  • the wideband tuner 224 receives and demodulates 409 the plurality of transport streams 106 including the data components and separates the data components by, e.g., demultiplexing 412 .
  • the receiving device 109 is configured to identify related data components based at least in part upon the synchronization tag included in the data component as previously described. In some embodiments, identification of the related data components is carried out by the wideband tuner 224 .
  • the wideband tuner 224 distributes the related data components to the appropriate decoder(s) 127 and/or 130 , e.g., based upon its capabilities. Related components may be combined by the wideband tuner 224 before distributing the content to the appropriate decoder. In some implementations, related data components from different channels may be simultaneously provided to different decoders for decoding.
  • Referring to FIG. 6, shown is a flowchart illustrating an example of transmission of related data components across independent streams in a transmitting device 103.
  • the transmission data may include audio, video, and/or other content such as, e.g., channel guides, closed captioning, encryption information, etc.
  • the transmission data is separated into related data components for transmission to a receiving device 109 via different streams 106 ( FIGS. 1 and 2 ).
  • HD video content received by the transmission device 103 may be separated into a base layer component and one or more enhanced layer components during encoding for transmission over multiple transport streams 106 .
  • related data components may include a video content, audio content, and guide information. Encryption information corresponding to the content of the related data components may also be included as another related data component.
  • mosaic information may be sent over a plurality of transport streams 106 .
  • the video content may be divided into smaller video portions that may be transmitted via different transport streams 106 .
  • the smaller video portions may be encoded in parallel and sent to the receiving device 109 as related data components.
  • a synchronization tag is included in each of the related data components by the transmitting device 103 .
  • the synchronization tag may include, e.g., a time stamp, an offset correction value, a frame identifier, and/or other information that may be used for synchronization of the related data components by the receiving device 109 .
  • the related data components are then transmitted in different streams in block 612 .
  • the related data components may be merged with other unrelated and/or related data components (e.g., by multiplexing) and modulated for transmission to the receiving device 109 .
  • a quadrature amplitude modulation (QAM) or other appropriate modulation of the merged data components may be used.
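The transmit-side flow just described (separate the data into related components, include a synchronization tag in each, then merge and modulate per stream) can be sketched as follows. The function and field names are assumptions made for illustration; they do not appear in the patent.

```python
# A minimal, hypothetical model of the FIG. 6 flow: split content into
# base/enhancement components, stamp each with a synchronization tag,
# and merge components destined for the same transport stream.
def make_components(program_id, base, enhancement):
    """Separate transmission data into related components (illustrative)."""
    return [
        {"program": program_id, "layer": "base", "payload": base},
        {"program": program_id, "layer": "enhancement", "payload": enhancement},
    ]

def add_sync_tags(components, time_stamp, frame_id):
    """Include the same synchronization tag in each related component."""
    for c in components:
        c["sync_tag"] = {"time_stamp": time_stamp, "frame_id": frame_id}
    return components

def multiplex(components, stream_for_layer):
    """Assign each component to its transport stream before modulation."""
    streams = {}
    for c in components:
        streams.setdefault(stream_for_layer[c["layer"]], []).append(c)
    return streams

comps = add_sync_tags(make_components("A", b"base", b"enh"), 90000, 1)
streams = multiplex(comps, {"base": "QAM1", "enhancement": "QAM2"})
# streams["QAM1"] carries the base component, streams["QAM2"] the enhancement
```

QAM modulation of each merged stream would follow the `multiplex` step; it is omitted here since only the tagging and routing logic is at issue.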
  • Referring to FIG. 7, shown is a flowchart illustrating an example of transmission of related data components across independent streams in a receiving device 109.
  • multiple transport streams 106 including data components are received by the receiving device 109 in block 703 .
  • the received transport streams 106 may be demodulated and then separated into the data components by demultiplexing.
  • the demodulation and demultiplexing may be performed by a wideband tuner 224 of the receiving device 109.
  • the data components are identified by the receiving device 109 .
  • Related data components may be identified based at least in part upon the synchronization tag included in each of the related data components.
  • the related components from different transport streams 106 are then routed (block 709 ) to the appropriate decoder for decoding in block 712 .
  • the routing 709 and decoding 712 may be based at least in part upon the synchronization tag.
  • the synchronization tag may include information such as, e.g., a time stamp, an offset correction value, a frame identifier, and/or other information that may be used for synchronization of the related data components by the receiving device 109.
  • mosaic information sent over a plurality of transport streams 106 may be decoded and recombined to reform a video based at least in part upon the synchronization tag.
  • the smaller video portions may be decoded in parallel and reformed to provide the video data for rendering on a display device.
  • encryption information included in a related data component may be used to process the related data component.
  • the routing may be based upon the capabilities of the decoder.
  • SVC base layer and enhanced layer components may be routed to an enhanced (or advanced) decoder 130 ( FIGS. 1 and 2 ) that may be capable of decoding the video content at higher resolutions.
  • Legacy decoders 127 ( FIGS. 1 and 2 ) that are not capable of utilizing the enhanced layer information may only receive the base layer component.
  • Other related data components including, e.g., channel guide information, closed captioning, encryption information, etc. may also be routed to a legacy decoder 127 and/or an enhanced decoder 130 if they are capable of decoding and providing the content for rendering or processing.
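The capability-based routing in block 709 can be sketched briefly. This is a hedged illustration under assumed names (`route`, a boolean capability flag); an actual receiving device 109 would consult richer decoder capability information.

```python
# Hypothetical sketch of block 709: route related components to a decoder
# based on its capabilities. A legacy decoder receives only the base layer;
# an enhanced decoder receives base and enhancement layers.
def route(components, decoder_supports_enhancement):
    """Select which related components a given decoder should receive."""
    if decoder_supports_enhancement:
        return [c for c in components if c["layer"] in ("base", "enhancement")]
    return [c for c in components if c["layer"] == "base"]

related = [{"layer": "base"}, {"layer": "enhancement"}]
enhanced_set = route(related, decoder_supports_enhancement=True)
legacy_set = route(related, decoder_supports_enhancement=False)
```

Guide, closed-caption, or encryption components would be additional `layer` values routed to whichever decoder can process them.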
  • each block may represent a module, segment, or portion of code that comprises program instructions to implement the specified logical function(s).
  • the program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a transmitting device 103 or receiving device 109 .
  • the machine code may be converted from the source code, etc.
  • each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).
  • FIGS. 6 and 7 show a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIGS. 6 and 7 may be executed concurrently or with partial concurrence. Further, in some embodiments, one or more of the blocks shown in FIGS. 6 and 7 may be skipped or omitted. In addition, any number of counters, state variables, warning semaphores, or messages might be added to the logical flow described herein, for purposes of enhanced utility, accounting, performance measurement, or providing troubleshooting aids, etc. It is understood that all such variations are within the scope of the present disclosure.
  • any code or application described herein that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a transmitting device 103 or receiving device 109.
  • the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system.
  • a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system.
  • the computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media.
  • examples of a suitable computer-readable medium include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs.
  • the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM).
  • the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.

Abstract

Various systems and methods are provided for transmission of related data components across independent streams. In one embodiment, among others, a transmitting device may separate transmission data into related data components and transmit each related data component in an associated transport stream. Each related data component includes a synchronization tag associated with synchronization of the related data component within the transmission data. In another embodiment, a receiving device may receive related data components transmitted in separate transport streams and decode the related data components based at least in part upon a synchronization tag included in each related data component. In another embodiment, among others, a method includes receiving data components transmitted on a plurality of transport streams, separating related data components from unrelated data components in the transport streams based at least in part upon a synchronization tag of each related data component, and decoding the related data components.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to copending U.S. provisional application entitled “DATA TRANSMISSION ACROSS INDEPENDENT STREAMS” having Ser. No. 61/515,543, filed Aug. 5, 2011, the entirety of which is hereby incorporated by reference.
  • BACKGROUND
  • Transmission of program data is typically carried out over a single common layer. For example, audio and video content for the program is carried over the same transport stream. The program data is synchronized within the same transport stream. In many cases, data for multiple programs is multiplexed for transport in a single stream. The modulation of the transport stream is often optimized for the programs being transmitted.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
  • FIGS. 1 and 2 are graphical representations of examples of systems for transmission of related data components across independent streams in accordance with various embodiments of the present disclosure.
  • FIG. 3 is a graphical representation of an example of a transmitting device of FIGS. 1 and 2 in accordance with various embodiments of the present disclosure.
  • FIG. 4 is a graphical representation of an example of a receiving device of FIG. 1 in accordance with various embodiments of the present disclosure.
  • FIG. 5 is a graphical representation of an example of a receiving device of FIG. 2 in accordance with various embodiments of the present disclosure.
  • FIGS. 6 and 7 are flowcharts illustrating examples of transmission of related data components across independent streams in accordance with various embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • Disclosed herein are various embodiments of methods related to transmission of related data components across independent streams. Reference will now be made in detail to the description of the embodiments as illustrated in the drawings, wherein like reference numbers indicate like parts throughout the several views.
  • Transmission is typically performed over a single bitstream that carries all of the data associated with a channel or a source. For example, six high definition (HD) channels may be carried over the same layer using a quadrature amplitude modulation (QAM) that has been optimized for the existing data. However, the data may be separated into components that may be transmitted over different bitstreams and recombined after receipt. For example, related data such as audio, video, and/or other content (e.g., channel guides, closed captioning, encryption information, etc.) may be separated and transmitted on independent transport streams. In some cases, additional information may be included to enhance some or all of the existing channels. Enhanced information or services may be transmitted over an independent stream without modifying the modulation (e.g., QAM) of the existing transport stream. This may be applied to layered coding techniques such as, e.g., scalable video coding (SVC) where different temporal, spatial, and/or quality resolutions may be transported in separate layers. The separate data components may then be recombined for processing at the receiving end.
  • Referring to FIG. 1, shown is an example of a system for transmission of related data components across independent streams. In the example of FIG. 1, a transmitting device 103 (e.g., a transmitter or transceiver) sends one or more streams 106 of data to a receiving device 109 (e.g., a receiver or transceiver). The transmitting device 103 may comprise suitable circuitry that may include, e.g., processor(s), application specific hardware (ASIC), interfaces and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive, process, and distribute data to the receiving device 109 via a bitstream such as the transport stream 106. The transmitting device 103, which may be included in a headend system, may be configured to provide various services such as, e.g., distribution, multicast, and/or quality of service for reliable and timely transmission of the data to the receiving device 109. The transmitting device 103 may utilize, for example, a cable TV network, a satellite broadcasting network, an Internet protocol (IP) data network such as the Internet, and/or a wireless communication network for delivery of the data to the receiving device 109.
  • Data from multiple channels 112 may be combined and modulated 115 for transmission through a common transport stream (e.g., QAM 1) 106. In this way, data components related to a channel and data components that are unrelated to the channel (e.g., data components related to another channel) may be sent via the same transport stream 106. In some cases, information from a channel 112 may be separated for transmission over N different transport streams 106. For instance, in multi-tiered video coding such as SVC, the data may be separated into a base layer component 118 and an enhancement layer component 121. The base layer component 118 may be used to produce, e.g., standard or HD video using legacy decoders. The enhanced layer component 121 can include additional information that, when combined with information in the base layer component 118, may be used by more advanced (or enhanced) decoders to produce enhanced video for display. For example, the base layer component 118 may include channel information to support decoding of a 1080p24 HD video. In some embodiments, the enhancement layer component 121 may include additional channel information to support decoding the HD video at a higher resolution such as, e.g., 2K×4K. In other embodiments, the enhancement layer component 121 may include additional channel information to support decoding the HD video at a different rate or format such as, e.g., 1080p60 or 1080p120. In either case, there may be multiple enhancement layers to support different display rates or resolutions. For instance, the base layer component 118 may support decoding at 1080p24, a first enhancement layer component may allow decoding at 1080p60 or 1080i60, and a second enhancement layer component (not shown) may allow decoding at 1080p120 or 1080i120. These first and second enhancement layer components may be transmitted in the same or different transport streams 106.
  • The enhancement layer components 121 may also include content other than audio and/or video information. For instance, an enhancement layer component 121 may include guide information that may be associated with that specific channel or may be associated with multiple channels. If the guide information is associated with multiple channels, then a single enhancement layer component 121 including the guide information may be associated with each of the base layer components 118 for the channels. In other implementations, the enhancement layer components 121 may include mosaic information associated with the base layer component 118. For instance, the mosaic information may include a miniature video (e.g., a network icon, program advertisement, or information streamer) that may be decoded at the same time as the video content in the base layer component 118 and rendered over the base layer video. Mosaic information may also include multiple small video components that may be decoded at the same time and recombined for display. In other implementations, enhancement layer components may include information for multiple small videos that may be transported across multiple streams. The base layer and enhancement layer components may be decoded at the same time to provide the video output. Other implementations may include one or more encryption keys or other encryption information in the enhancement layer component(s) 121.
  • In FIG. 1, the base layer components 118 of four channels (A-D) are combined and modulated 115 in the transmitting device 103 before being sent in a first transport stream (e.g., QAM 1) 106. In other embodiments, data components from more or fewer channels may be combined for a transport stream 106. Corresponding enhancement layer components 121 of the four channels (A-D) are also combined and modulated 115 in the transmitting device 103 before being sent in a second transport stream (e.g., QAM 2) 106. While QAM is illustrated for the transport streams 106 of FIG. 1, other modulation schemes may also be utilized. The transport streams 106 are received and demodulated 124 by the receiving device 109. The receiving device 109 may comprise suitable circuitry that may include, e.g., processor(s), application specific hardware (ASIC), interfaces and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive and process the data components from the transmitting device 103 via a bitstream such as the transport stream 106.
  • In the example of FIG. 1, the receiving device 109 (e.g., a set-top box (STB)) is configured to demodulate 124 the received transport stream(s) and provide the appropriate data components to decoders 127 and/or 130 for decoding. For example, a legacy decoder 127 may only be able to decode content in the base layer component 118, while a newer enhanced decoder 130 may be able to decode higher level or enhanced content in the enhancement layer components 121. If only the basic HD video of a channel is desired, the base layer component 118 can be decoded by a legacy decoder 127 and the enhanced layer component 121 may be ignored. However, if enhanced video is desired, then the base layer component 118 and the enhanced layer component 121 are provided to a more advanced (or enhanced) decoder 130 for decoding. In some implementations, an enhanced layer component 121 may be provided to a plurality of decoders for use in decoding. The demodulation 124 may also allow concurrent demodulation of multiple channels in a transport stream 106 and provision of the demodulated data components to different decoders for decoding.
  • The separated components are recombined at the receiving end. Synchronization of the component data is needed to ensure proper decoding. For example, separating SVC components across different QAM transport streams can introduce issues such as, e.g., synchronization deviations across the different transport streams and the need to properly define encryption, guide, and mosaic information or data so that it can be carried in different streams when it would typically be in the same stream. The transport streams are demodulated and the demodulated components synchronized. Synchronization of the data components may be accomplished using one or more synchronization tags including, e.g., timestamps and/or frame numbers. Formatting of the component data may also provide information that may be used for synchronization of the data components obtained from the separate streams.
  • Referring to FIG. 2, shown is another example of a system for transmission of related data components across independent streams. As in FIG. 1, the data components are sent to the receiving device 109 through a plurality of transport streams 106. In the example of FIG. 2, a wideband tuner 224 demodulates components from multiple transport streams and distributes the information from one or more streams to the appropriate decoder(s) 127 and/or 130, e.g., based upon its capabilities. The wideband tuner 224 may be configured to combine the base and enhanced components before distributing the content to the appropriate decoder. This distribution of channel content allows simultaneous decoding by different decoders.
  • Referring next to FIG. 3, shown is an example of a transmitting device 103 for transmission of related data components across independent streams 106. The transmitting device 103 comprises suitable circuitry that may include, e.g., encoder(s) 303, processor(s) 306, memory 309, application specific hardware (ASIC), interfaces, and/or other components as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to process the data before modulation and transmission of the data components via one or more transport streams 106. The transmitting device 103 receives data 312 such as audio, video, and/or other content (e.g., channel guides, closed captioning, encryption information, etc.) for transmission and separates the received data 312 into related data components for transmission through the one or more transport streams 106. For example, the encoder 303 may be configured to encode the received data 312 into a SVC base layer component and one or more SVC enhancement layer components. Other data components such as, e.g., guide information may be received by the transmitting device 103 as separate data 312. In other embodiments, mosaic information may be separated into different data components and sent through different transport streams 106.
  • The transmitting device 103 may also be configured to add a synchronization tag to the separated data components that may be used for synchronization of related components that are sent via different streams 106. For example, the synchronization tag may be placed in a predefined location in the header information of the data components. The synchronization tag may include a time stamp that may be used for synchronization of the separated data components. For example, the program clock reference (PCR) for each transport stream 106 may be used for synchronization. The PCR for the different streams 106 may be synchronized by the transmitting device 103 and a time stamp corresponding to the synchronized PCRs may be added to each of the separated components. In other implementations, the clocks may not be synchronized. For example, the PCR associated with the base layer component may be used as the master clock and offset corrections may be determined by the transmitting device 103 for the PCRs associated with the other streams 106. The offset correction value may be included in the synchronization tag of the data component corresponding to the transport stream 106. The differences in the PCRs may then be compensated using the offset correction values when the transmitted data components are received in the receiving device 109.
  • In some embodiments, the synchronization tag includes a frame identifier in the separated data components to indicate the relationship between the different components. In some implementations, the frame identifiers comprise a corresponding frame number and/or a channel or program identifier. In some cases, the channel or program identifier may be stored in a program association table or a program mapping table in the transmitting device 103 and the receiving device 109. Where a data component is related to a plurality of other components (e.g., guide information that may be related to multiple channels or encryption keys that may be used to decrypt the base layer component and the corresponding enhancement layer components), the frame identifier of the common data component (e.g., guide or encryption information) may include an identification code that may be used to determine its association with each of the other data components. The identification code may be included in the other data components or may include information that may be used to determine the other data components. For example, the identification code may indicate which of the other transport streams 106 include the related data components. When the synchronizing tag does not include a time stamp or the time stamps may not be relied upon, then the information of the frame identifier may be used to synchronize the related data components.
  • The separated data components may then be sent to a plurality of multiplexers 315, which are configured to merge the data components for modulation 115 and transmission to the receiving device 109 via different transport streams 106. The transmitting device 103 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to merge and modulate the data components for transmission. The modulated data 318 is then transmitted to the receiving device 109 in the corresponding transport stream 106. As illustrated in the example of FIG. 1, a plurality of base layer components 118 are merged by a first multiplexer 315 for modulation 115 and transmission via a first transport stream (QAM 1) 106 and a plurality of enhanced layer components 121 are merged by a second multiplexer 315 for modulation 115 and transmission via a second transport stream (QAM 2) 106. While the example of FIG. 1 shows the base and enhanced layer components 118 and 121 being sent through separate streams 106, in other implementations a combination of base and enhanced layer components 118 and 121 may be merged by a multiplexer 315 for modulation 115 and transmission via a transport stream 106.
  • The modulation 115 of the transmitting device 103 may also include one or more buffer(s) to correct for slight differences between clock speeds of the modulation 115. In some implementations, the buffer(s) may be at the modulation output. In the ideal case, all of the modulation 115 is performed at the same clock speed so there is no long term drift. In reality, slight differences exist between the modulation speeds (e.g., modulation 115 associated with one transport stream 106 is running at 27 MHz and modulation 115 associated with another transport stream 106 is running at 27+Δ MHz) so that over time a long term drift across the transport streams 106 can occur. This may compromise the ability of the receiving device 109 to resynchronize the related data components for decoding using the synchronization tags. Over time, the long term drift across the transport streams 106 results in a delay between the related data components that, no matter how large the buffers are at the receiving device 109, can eventually prevent matching the related data components for decoding. To compensate, the transmitting device 103 may utilize the modulation output buffer(s) and monitor the data flow across the transport streams 106. The transmitting device 103 may then adjust the flow over the streams 106 to keep the transmission of the related data components close to each other. For example, the transmitting device 103 may be configured to correct for the drift by adjusting the modulation rate by reducing (or increasing) the clock speed if more (or less) data is being modulated for transmission on the corresponding transport stream 106 compared to the other streams 106. For instance, the transmitting device 103 can monitor the condition of the modulation buffer(s) by, e.g., monitoring the buffer levels or the rate at which data is removed from the buffer(s) and adjusting the modulation rate accordingly.
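The drift-correction feedback just described can be sketched as a simple control loop. This is a hedged illustration: the function name, gain constant, and fill-level representation are assumptions for the sketch, not values from the patent.

```python
# Illustrative control loop (names and constants are assumptions): nudge a
# modulator's clock rate based on its output-buffer fill level so that slow
# drift between transport streams does not accumulate over time.
def adjust_clock(nominal_hz, buffer_fill, target_fill, gain=0.001):
    """Raise the clock when the buffer backs up, lower it when it runs dry."""
    error = buffer_fill - target_fill      # positive: data accumulating
    return nominal_hz * (1.0 + gain * error)

# Buffer above target: data is backing up, so the returned rate exceeds
# the nominal 27 MHz modulation clock.
clk = adjust_clock(27_000_000, buffer_fill=0.6, target_fill=0.5)
assert clk > 27_000_000
```

Monitoring either the buffer level or the drain rate, as the text describes, yields the same error signal; the key point is that the correction bounds the inter-stream skew the receiving device 109 must absorb.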
  • Referring to FIG. 4, shown is an example of a receiving device 109 of FIG. 1 for transmission of related data components across independent streams 106. The receiving device 109 comprises suitable circuitry that may include, e.g., decoder(s) 127 and 130, processor(s), memory, application specific hardware (ASIC), interfaces, and/or other components, as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to process the data components for decoding. The receiving device 109 receives and demodulates 124 each of the plurality of transport streams 106 including the data components. In the example of FIG. 4, each transport stream 106 is separately demodulated 124 before being sent to a demultiplexer 403 to separate the different data components in the transport stream 106. The demultiplexer 403 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to separate the data components for decoding by the appropriate decoder 127 and/or 130.
  • The receiving device 109 is configured to identify related data components based at least in part upon the synchronization tag included in the data component. The synchronization tag may be obtained from the data components and used for the identification of related data components. For example, the program clock reference (PCR) for the different streams 106 may be synchronized by the transmitting device 103 and a time stamp corresponding to the synchronized PCRs may be included in each of the related data components. By comparing the time stamps of data components from different demultiplexers 403, the receiving device 109 may identify related data components with the same time stamp. In other implementations, the clocks for each transport stream 106 may not be synchronized. In that case, the transmitting device 103 may use the clock for one transport stream 106 as the master and provide offset corrections for the other clocks. For example, the PCR associated with the base layer component may be used as the master clock and offset corrections for the PCRs associated with the other streams 106 may be included in the corresponding data components. The receiving device 109 may use the offset correction value included in the synchronization tag of the data component and the PCR of the corresponding transport stream 106 to identify data components that are related.
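Matching components by corrected time stamp can be sketched as below. This is a minimal illustration assuming a dictionary-shaped tag; the field names `time_stamp` and `pcr_offset` and the component names are hypothetical, not from the patent.

```python
def normalize_timestamp(tag):
    """Map a component's synchronization tag onto the master clock.

    When per-stream clocks are not synchronized, the tag carries an offset
    correction relative to the master PCR (e.g., the base layer's clock),
    so related components compare equal after correction.
    """
    return tag["time_stamp"] + tag.get("pcr_offset", 0)

def find_related(components):
    """Group components whose corrected time stamps match."""
    groups = {}
    for c in components:
        groups.setdefault(normalize_timestamp(c["tag"]), []).append(c["name"])
    return groups

components = [
    {"name": "base",  "tag": {"time_stamp": 9000}},                    # master clock
    {"name": "enh-1", "tag": {"time_stamp": 8970, "pcr_offset": 30}},  # drifted clock
]
related = find_related(components)
```

After the offset correction, the enhancement component lines up with the base component at the same master-clock instant, which is exactly the association the receiving device needs before routing.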
  • In some embodiments, the synchronization tag includes a frame identifier in the data components to indicate the relationship between the components. The frame identifiers may comprise a corresponding frame number, a channel or program identifier, and/or other relationship information. The receiving device 109 may use the channel or program identifier to identify related data components. Where a data component is related to a plurality of other data components in different channels or to multiple data components in the same channel (e.g., guide information that may be related to multiple channels or encryption keys that may be used to decrypt the base layer component and the corresponding enhancement layer components), the frame identifier of the common data component (e.g., guide or encryption information) may include an identification code that may be used to determine its association with each of the other data components. The identification code may be included in the other data components or may include information that may be used to determine the other data components. For example, the identification code may indicate which of the other transport streams 106 include the related data components. In some cases, the receiving device 109 may include a program association table or a program mapping table that may be accessed by the receiving device 109 for identification of related data components. When the synchronization tag does not include a time stamp or the time stamps cannot be relied upon, the receiving device 109 may use information from the frame identifier to synchronize the related data components. For example, the frame identifier may include information such as, e.g., the number of related data components and/or an indication of the relationship between the related data components (e.g., base layer, enhancement layer, encryption, guide, or mosaic component). The relationship information may also be used to determine which decoder(s) 127 and/or 130 may receive the data component for decoding.
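The two association paths above (an explicit identification code versus a program mapping table lookup) can be sketched as follows. The table contents, field names, and program identifiers are hypothetical assumptions for illustration only.

```python
# Hypothetical program mapping table: program identifier -> the transport
# streams that carry its related data components.
PROGRAM_MAP = {"prog-7": {1, 2, 3}}

def related_streams(frame_identifier, program_map):
    """Resolve a common component (e.g., guide or encryption information)
    to the transport streams carrying the components it applies to.

    An identification code in the frame identifier may name the streams
    directly; otherwise the program mapping table is consulted.
    """
    if "id_code_streams" in frame_identifier:  # code names the streams directly
        return set(frame_identifier["id_code_streams"])
    return program_map[frame_identifier["program"]]

guide_tag = {"frame": 42, "program": "prog-7"}           # resolved via the table
enc_tag = {"frame": 42, "id_code_streams": [1, 2]}       # resolved via the code
```

Either path yields the set of streams whose components the common component must be synchronized with before decoding.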
  • When the receiving device 109 identifies related data components, the demultiplexers 403 may route them to the appropriate decoder. For example, as illustrated in FIG. 4, a base layer component may be sent to a legacy decoder 127 to produce, e.g., standard or HD video. An enhanced layer component can include additional information that, when combined with information in the base layer component, may be used by more advanced (or enhanced) decoders to produce enhanced video with a higher resolution, a higher rate, etc. As shown in FIG. 4, the base layer component and one or more enhanced layer components can be routed to an enhanced decoder 130 for decoding. For example, the decoder 130 may be configured to decode an SVC base layer component and one or more SVC enhancement layer components to generate video with higher temporal, spatial, and/or quality resolutions than can be produced from the base layer alone. While the example of FIG. 4 depicts two demultiplexers 403 and two decoders 127 and 130, additional demultiplexers 403 and decoders 127 and/or 130 may be included.
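The capability-based routing decision can be sketched as below; a base layer is usable by every decoder, while enhancement layers are only delivered where they can be consumed. The dictionary shape and the `enhanced` flag are illustrative assumptions.

```python
def route(component, decoders):
    """Route a data component to the decoders able to use it.

    Base layer components go to every decoder (cf. legacy decoder 127);
    enhancement layer components only go to decoders that advertise
    enhanced capability (cf. enhanced decoder 130).
    """
    if component["layer"] == "base":
        return [d["name"] for d in decoders]
    return [d["name"] for d in decoders if d["enhanced"]]

decoders = [{"name": "legacy", "enhanced": False},
            {"name": "enhanced", "enhanced": True}]
base_targets = route({"layer": "base"}, decoders)
enh_targets = route({"layer": "enhancement"}, decoders)
```

This mirrors the figure: the legacy decoder receives only the base layer, while the enhanced decoder receives both layers and can combine them.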
  • Legacy decoders 127 may also be configured to receive and process related data components. For instance, a related data component including encryption or guide information may be routed to a legacy decoder 127 for use with the base layer component. The legacy and enhanced (or advanced) decoders 127 and 130 may comprise suitable circuitry and/or code executed by hardware included in the circuitry (e.g., a processor) configured to decode the data components and provide the decoded data 406 for rendering or further processing. The demultiplexers 403 and/or decoders 127 and 130 may also include a buffer to store related data components to allow for variations in routing times between the data components. For example, the receiving device 109 may be configured to control the routing of related data components from the demultiplexers 403 to coordinate the arrival of the related data components at the decoder 127 or 130. If a delay occurs between related data components reaching the appropriate decoder, the buffer may be utilized to adjust for the delay.
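The buffering that absorbs routing-time differences can be sketched as a hold-and-release structure: components are held until every related component carrying the same tag has arrived. `SyncBuffer` and its interface are hypothetical names, not from the patent.

```python
class SyncBuffer:
    """Hold components until every related component with the same
    synchronization tag has arrived, absorbing routing-time differences
    between transport streams."""

    def __init__(self, expected_count):
        self.expected = expected_count  # number of related components per tag
        self.pending = {}

    def push(self, tag, component):
        """Buffer a component; return the complete group once all related
        components for this tag are present, else None."""
        group = self.pending.setdefault(tag, [])
        group.append(component)
        if len(group) == self.expected:
            return self.pending.pop(tag)
        return None

buf = SyncBuffer(expected_count=2)
first = buf.push(9000, "base")  # arrives early, so it is buffered
both = buf.push(9000, "enh")    # completes the pair for tag 9000
```

The expected count could be taken from the frame identifier, which (per the text) may carry the number of related data components.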
  • Referring next to FIG. 5, shown is an example of a receiving device 109 of FIG. 2 for transmission of related data components across independent streams 106. In the example of FIG. 5, the receiving device 109 includes a wideband tuner 224 and legacy and enhanced decoders 127 and 130. The wideband tuner 224 may comprise suitable circuitry that may include, e.g., processor(s) 403, memory 406, application specific hardware (ASIC), interfaces, and/or other components, as well as code executed by hardware included in the circuitry (e.g., a processor) that may be configured to receive, process, and distribute the data components received via the multiple transport streams 106. The wideband tuner 224 receives and demodulates 409 the plurality of transport streams 106 including the data components and separates the data components by, e.g., demultiplexing 412. The receiving device 109 is configured to identify related data components based at least in part upon the synchronization tag included in the data component as previously described. In some embodiments, identification of the related data components is carried out by the wideband tuner 224. The wideband tuner 224 distributes the related data components to the appropriate decoder(s) 127 and/or 130, e.g., based upon their capabilities. Related components may be combined by the wideband tuner 224 before distributing the content to the appropriate decoder. In some implementations, related data components from different channels may be simultaneously provided to different decoders for decoding.
  • Referring now to FIG. 6, shown is a flowchart illustrating an example of transmission of related data components across independent streams in a transmitting device 103. Initially, data is received by the transmitting device 103 for transmission in block 603. The transmission data may include audio, video, and/or other content such as, e.g., channel guides, closed captioning, encryption information, etc. In block 606, the transmission data is separated into related data components for transmission to a receiving device 109 via different streams 106 (FIGS. 1 and 2). For example, HD video content received by the transmitting device 103 may be separated into a base layer component and one or more enhanced layer components during encoding for transmission over multiple transport streams 106. In other implementations, related data components may include video content, audio content, and guide information. Encryption information corresponding to the content of the related data components may also be included as another related data component. In some implementations, mosaic information may be sent over a plurality of transport streams 106. For example, the video content may be divided into smaller video portions that may be transmitted via different transport streams 106. The smaller video portions may be encoded in parallel and sent to the receiving device 109 as related data components.
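The separation step (block 606) can be sketched as a stand-in for scalable encoding: each input frame yields one base layer component plus one or more enhancement layer components, each destined for its own stream. The function name and dictionary fields are illustrative assumptions, not the patent's encoder.

```python
def split_layers(frames, enhancement_layers=1):
    """Illustrative stand-in for scalable encoding: every input frame
    contributes a base layer component plus N enhancement layer components,
    each of which may be carried by a different transport stream."""
    components = []
    for i, frame in enumerate(frames):
        components.append({"frame": i, "layer": "base", "data": frame})
        for n in range(enhancement_layers):
            components.append({"frame": i, "layer": f"enh-{n}", "data": frame})
    return components

components = split_layers(["f0", "f1"], enhancement_layers=1)
```

A mosaic separation would follow the same pattern, except the per-frame components would be spatial tiles rather than quality layers.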
  • In block 609, a synchronization tag is included in each of the related data components by the transmitting device 103. The synchronization tag may include, e.g., a time stamp, an offset correction value, a frame identifier, and/or other information that may be used for synchronization of the related data components by the receiving device 109. The related data components are then transmitted in different streams in block 612. The related data components may be merged with other unrelated and/or related data components (e.g., by multiplexing) and modulated for transmission to the receiving device 109. A quadrature amplitude modulation (QAM) or other appropriate modulation of the merged data components may be used.
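The tagging step (block 609) can be sketched as stamping each related component with the same synchronization information before the components diverge onto separate streams. The tag field names (`time_stamp`, `frame`, `count`) are hypothetical.

```python
def tag_components(related, time_stamp, frame_number):
    """Attach a synchronization tag to each related component so the
    receiving device can re-associate them after independent transport.
    The tag also records how many related components to expect."""
    for c in related:
        c["tag"] = {"time_stamp": time_stamp,
                    "frame": frame_number,
                    "count": len(related)}
    return related

tagged = tag_components([{"layer": "base"}, {"layer": "enh"}], 9000, 42)
```

An offset correction value would be added to the tag instead of (or alongside) the time stamp when the per-stream clocks are not synchronized.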
  • Referring next to FIG. 7, shown is a flowchart illustrating an example of transmission of related data components across independent streams in a receiving device 109. Initially, multiple transport streams 106 including data components are received by the receiving device 109 in block 703. The received transport streams 106 may be demodulated and then separated into the data components by demultiplexing. In some embodiments, the demodulation and demultiplexing may be performed by a wideband tuner 224 of the receiving device 109. In block 706, the data components are identified by the receiving device 109. Related data components may be identified based at least in part upon the synchronization tag included in each of the related data components. The related components from different transport streams 106 are then routed (block 709) to the appropriate decoder for decoding in block 712.
  • The routing 709 and decoding 712 may be based at least in part upon the synchronization tag. Information such as, e.g., a time stamp, an offset correction value, a frame identifier, and/or other information in the synchronization tag may be used for synchronization of the related data components by the receiving device 109. For example, mosaic information sent over a plurality of transport streams 106 may be decoded and recombined to reform a video based at least in part upon the synchronization tag. The smaller video portions may be decoded in parallel and reformed to provide the video data for rendering on a display device. In some cases, encryption information included in a related data component may be used to process the related data component. In addition, the routing may be based upon the capabilities of the decoder. For example, SVC base layer and enhanced layer components may be routed to an enhanced (or advanced) decoder 130 (FIGS. 1 and 2) that may be capable of decoding the video content at higher resolutions. Legacy decoders 127 (FIGS. 1 and 2) that are not capable of utilizing the enhanced layer information may only receive the base layer component. Other related data components including, e.g., channel guide information, closed captioning, encryption information, etc. may also be routed to a legacy decoder 127 and/or an enhanced decoder 130 if they are capable of decoding and providing the content for rendering or processing.
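The mosaic recombination can be sketched as ordering independently decoded portions by a position carried in their synchronization tags. The `position` field and tile names are hypothetical, introduced only for this illustration.

```python
def reassemble_mosaic(portions):
    """Reorder mosaic portions, decoded in parallel from different transport
    streams, back into display order using their synchronization tags."""
    ordered = sorted(portions, key=lambda p: p["tag"]["position"])
    return [p["data"] for p in ordered]

# Portions may arrive out of order because each stream is independent.
portions = [{"data": "tile-B", "tag": {"position": 1}},
            {"data": "tile-A", "tag": {"position": 0}}]
video = reassemble_mosaic(portions)
```

Because the tag, not arrival order, drives reassembly, the per-stream delays tolerated by the receive buffers do not affect the reconstructed video.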
  • The flow charts of FIGS. 6 and 7 show the functionality and operation of a transmitting device 103 and receiving device 109. If embodied in software, each block may represent a module, segment, or portion of code that comprises program instructions to implement the specified logical function(s). The program instructions may be embodied in the form of source code that comprises human-readable statements written in a programming language or machine code that comprises numerical instructions recognizable by a suitable execution system such as a processor in a transmitting device 103 or receiving device 109. The machine code may be converted from the source code, etc. If embodied in hardware, each block may represent a circuit or a number of interconnected circuits to implement the specified logical function(s).
  • Although the flow charts of FIGS. 6 and 7 show a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIGS. 6 and 7 may be executed concurrently or with partial concurrence. Further, in some embodiments, one or more of the blocks shown in FIGS. 6 and 7 may be skipped or omitted. In addition, any number of counters, state variables, warning semaphores, or messages might be added to the logical flow described herein, for purposes of enhanced utility, accounting, performance measurement, or providing troubleshooting aids, etc. It is understood that all such variations are within the scope of the present disclosure.
  • Also, any code or application described herein that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a transmitting device 103 or receiving device 109. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system. The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
  • It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims (20)

1. A transmitting device, comprising:
circuitry configured to separate transmission data into related data components, each related data component associated with one of a plurality of transport streams; and
circuitry configured to transmit each related data component in the associated transport stream, each related data component including a synchronization tag associated with synchronization of the related data component within the transmission data.
2. The transmitting device of claim 1, further comprising circuitry configured to include the synchronization tag in the related data component.
3. The transmitting device of claim 1, wherein the transmission data comprises video content.
4. The transmitting device of claim 3, wherein the video content is separated into a base layer component and at least one enhanced layer component, wherein the base layer component and the at least one enhanced layer component are transmitted in different transport streams.
5. The transmitting device of claim 4, wherein a plurality of enhanced layer components of the video content are transmitted in the same transport stream.
6. The transmitting device of claim 1, further comprising circuitry configured to merge related data components with unrelated data components before transmitting in the associated transport stream.
7. The transmitting device of claim 1, further comprising circuitry configured to modulate the related data components at a modulation rate for transmission in the associated transport stream, the modulation circuitry comprising a modulation buffer, wherein the modulation rate is adjusted based at least in part upon a condition of the modulation buffer.
8. The transmitting device of claim 1, wherein the synchronization tag comprises a time stamp.
9. The transmitting device of claim 1, wherein the synchronization tag comprises a frame identifier.
10. A receiving device, comprising:
circuitry configured to receive a plurality of related data components transmitted in separate transport streams, each related data component including a synchronization tag associated with synchronization of the related data components received in the separate transport streams; and
circuitry configured to decode the plurality of related data components based at least in part upon the synchronization tag.
11. The receiving device of claim 10, further comprising circuitry configured to separate related data components from unrelated data components in the same transport stream based at least in part upon the synchronization tags of the related data components.
12. The receiving device of claim 10, further comprising a wideband tuner configured to separate related data components from unrelated data components in each of a plurality of separate transport streams based at least in part upon the synchronization tag of the related data components.
13. The receiving device of claim 12, wherein the wideband tuner is configured to identify related data components based at least in part upon information in the synchronization tag.
14. The receiving device of claim 10, wherein the synchronization tag comprises a frame identifier.
15. The receiving device of claim 10, wherein the synchronization tag comprises a time stamp.
16. The receiving device of claim 10, wherein the plurality of related data components comprise a base layer component transmitted in a first transport stream and an enhanced layer component transmitted in a second transport stream.
17. The receiving device of claim 16, wherein the base layer component and the enhanced layer component are scalable video coding (SVC) layer components.
18. A method for data transmission across independent streams, comprising:
receiving a plurality of data components transmitted on a plurality of transport streams, each transport stream including related data components and unrelated data components, each related data component including a synchronization tag associated with synchronization of the related data components;
separating the related data components from the unrelated data components in each of the plurality of transport streams based at least in part upon the synchronization tag of each related data component; and
decoding the related data components based at least in part upon the synchronization tag.
19. The method of claim 18, wherein the synchronization tag comprises a clock offset compensation value.
20. The method of claim 19, wherein the related data components are identified based at least in part upon the clock offset compensation value and a program clock reference (PCR) of the corresponding transport stream.





Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WAN, WADE;MAMIDWAR, RAJESH;CHEN, XUEMIN;AND OTHERS;SIGNING DATES FROM 20120801 TO 20120809;REEL/FRAME:028860/0142

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119

AS Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:047229/0408

Effective date: 20180509

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE EFFECTIVE DATE PREVIOUSLY RECORDED ON REEL 047229 FRAME 0408. ASSIGNOR(S) HEREBY CONFIRMS THE EFFECTIVE DATE IS 09/05/2018;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:047349/0001

Effective date: 20180905

AS Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE PATENT NUMBER 9,385,856 TO 9,385,756 PREVIOUSLY RECORDED AT REEL: 47349 FRAME: 001. ASSIGNOR(S) HEREBY CONFIRMS THE MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:051144/0648

Effective date: 20180905

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8