DE102008020735A1 - Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor - Google Patents
- Publication number
- DE102008020735A1
- Authority
- DE
- Germany
- Prior art keywords
- synchronization
- datasets
- time
- display units
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43074—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234318—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4348—Demultiplexing of additional data and video streams
- H04N21/4349—Demultiplexing of additional data and video streams by extracting from data carousels, e.g. extraction of software modules from a DVB carousel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
- H04N21/8133—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
Abstract
Description
The invention relates to a technical solution that enables object-accurate synchronization of arbitrarily encoded video streams with additional information. The video is processed after the decoding step, i.e. an uncoded video stream is used as source material. Any kind of information that can be displayed on a terminal (PC, set-top box, etc.) in connection with the video signal can serve as additional information. The invention enables time synchronization between presentation units (e.g. frames) and the additional information. Through this anchoring, a nonlinearity relating to objects, defined by X/Y coordinates within the video stream, is added to the linear video stream. Objects within a sequence of presentation units of an uncoded video stream can thus be provided with interactivity for the user of the terminal (display of the additional information or further functionality such as linking), i.e. "clickable objects". The solution is applicable both to the Digital Video Broadcast (push) mode and to the video-on-demand (pull) mode. In the latter case, it does not matter whether the video material is stored on the terminal or on a remote server. The invention requires processing of the video source material and corresponding functions on the terminal, but places no requirements on the intervening network with regard to synchronous transmission of the packets of the encoded video source material and the additional data.
An approach of this kind was already demonstrated by the work of Bove [1] at the Massachusetts Institute of Technology, Media Laboratory, USA; there, however, uncompressed video was used as source material. In the solution referenced under [1], the anchoring of video objects and the associated time base, which enables synchronization on the terminal, is applied to uncompressed video material. In addition, an algorithm based on mesh contours is used for the extraction of the object data.
A method for synchronizing MPEG-2 encoded video material with additional information was published by Brunheroto [2]. There, the anchoring is attempted via the Program Clock Reference (PCR) parameter present in MPEG-2. This does not allow separate transmission of the video source material and the additional information. Moreover, this approach requires the intervening network to transmit the data packets of the video material and the additional data jitter-free.
In [3], a solution explicitly for MPEG-1/-2 encoded video material is presented, using the time values PTS and time code defined for MPEG-1/-2.
Task:
The object of the invention is to specify the use of suitable time values and coordinate data as well as measures for synchronization on the terminal.
a) Use of suitable time values and coordinate data
The task is to provide an anchoring for a marked frame, for an object within a frame, or for objects across a series of frames of a video stream with regard to their temporal and spatial (two-dimensional) extent, so that additional information (e.g. overlays or derived functions) can be displayed frame- and object-accurately on the terminal.
b) Synchronization on the terminal
The task is to provide a trigger mechanism so that, with the data obtained under a), a frame- or object-accurate overlay of an arbitrarily encoded video stream with additional information can be achieved.
The described solution was implemented and verified by way of example in a browser-based application. Figure 2 shows a screenshot in which the microphone in the video stream is marked by descriptive metadata (overlay) and provided with interactivity ("clickable").
Embodiment:
The invention solves the two subtasks as follows:
a) Obtaining suitable time values and coordinate data
Transform-based compression algorithms, such as those of the MPEG-1/-2 standards, decompose the different picture types, also called frames, into blocks (frame > macroblocks > blocks). The individual objects within the frames are not taken into account, in contrast to MPEG-4, which supports video objects; this invention, however, makes no use of that. The present solution exploits the fact that every video stream possesses a time axis (media-position time). For our solution, the object coordinates (X/Y) and the time values [t(i)] are extracted at recurring intervals of presentation units and stored in a dataset. A time axis with reference points is thus used. In addition to the time values, the associated coordinate values (X/Y) of the object are read out. Both parameters (time value and X/Y coordinates) are used as synchronization anchors for the object-accurate synchronization of additional information (e.g. overlays). The method is therefore applicable to video streams with constant as well as variable frame rates. For video streams with a constant frame rate, the time axis can also be derived from the frame presentation times.
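Such a dataset of synchronization anchors — a time value on the media-position axis plus the object's X/Y coordinates — could be sketched as follows. All field and variable names here are illustrative assumptions; the patent does not prescribe a storage format:

```python
from dataclasses import dataclass

@dataclass
class SyncAnchor:
    """One synchronization anchor: a time value t(i) on the stream's
    media-position axis plus the object's coordinates at that time."""
    t: float         # time value t(i) in seconds on the media-position axis
    x: float         # relative X coordinate of the object (0.0 .. 1.0)
    y: float         # relative Y coordinate of the object (0.0 .. 1.0)
    object_id: str   # which tracked object this anchor belongs to

# Anchors sampled at recurring intervals of presentation units;
# this works for constant and variable frame rates alike.
anchors = [
    SyncAnchor(t=1.00, x=0.42, y=0.31, object_id="microphone"),
    SyncAnchor(t=1.25, x=0.44, y=0.30, object_id="microphone"),
    SyncAnchor(t=1.50, x=0.47, y=0.29, object_id="microphone"),
]
```

Because only the media position and relative coordinates are stored, the dataset is independent of the codec used for the stream.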
Figure 1 shows an example in which the object data (time values, coordinates) have been recorded for different points in time (variable frame rate). The coordinates are converted into relative positions so that the object data can be reused in video streams of different resolutions or encodings.
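The conversion of absolute pixel coordinates into resolution-independent relative positions, as described above, can be sketched as follows (function names and rounding behavior are assumptions for illustration, not taken from the patent):

```python
def to_relative(x_px: int, y_px: int, width: int, height: int) -> tuple:
    """Convert absolute pixel coordinates to relative positions in [0, 1]."""
    return x_px / width, y_px / height

def to_absolute(x_rel: float, y_rel: float, width: int, height: int) -> tuple:
    """Map relative positions back to pixels for a target resolution."""
    return round(x_rel * width), round(y_rel * height)

# An object at pixel (640, 360) in a 1280x720 stream ...
x_rel, y_rel = to_relative(640, 360, 1280, 720)
# ... lands at the same relative spot in a 1920x1080 version of the stream:
x_px, y_px = to_absolute(x_rel, y_rel, 1920, 1080)
```

Storing only the relative pair means the same anchor dataset serves every resolution or re-encoding of the stream.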
These extracted time values, which every playback hardware and software uses to represent the linear time axis within the video stream, apply to all types of encoding.
No synchronous transmission of the encoded video material and the additional information is therefore required from the network between sender and terminal, although such transmission would also be conceivable.
The coordinate values of the individual frames are obtained with the aid of suitable object-analysis or object-tracking algorithms.
The process described above can be used both as an offline process applied to pre-compressed or uncompressed video material and in real-time video systems. The limiting factor for real-time capability is the object-recognition method, which this invention does not specify.
b) Synchronization on the terminal
To enable synchronization on the terminal, the extracted time values with the corresponding coordinate data must be present on the terminal before the video stream to be synchronized arrives. Within the terminal, the time parameters are continuously accessed during decoding of the video stream (hardware/software implementation) and compared with the extracted time values (datasets). Synchronization takes place by comparing the extracted and the currently read time values; i.e., the overlay or further functions are executed on a match. In practice, certain processing delays required for displaying or processing the additional information on the terminal must be taken into account.
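The trigger mechanism described above can be sketched as a comparison of the current media position against the stored anchor datasets, with an offset that compensates for the processing delay. All names, the tolerance, and the delay value are assumptions for illustration; the patent prescribes only the comparison itself:

```python
def due_anchors(anchors, media_position, processing_delay=0.05, tolerance=0.02):
    """Return the anchors whose time value matches the current media
    position. They fire early by `processing_delay` seconds so the
    overlay is ready when the frame is actually presented."""
    lookahead = media_position + processing_delay
    return [a for a in anchors if abs(a["t"] - lookahead) <= tolerance]

anchors = [
    {"t": 1.00, "x": 0.42, "y": 0.31, "action": "show_overlay"},
    {"t": 2.50, "x": 0.60, "y": 0.40, "action": "show_overlay"},
]

# The decoder loop reads the media position continuously; at t = 0.95 s
# the 1.00 s anchor is already due because of the 0.05 s processing delay.
hits = due_anchors(anchors, media_position=0.95)
```

In a real terminal this check would run in the decode loop (hardware or software), firing the overlay or derived function for each matching anchor.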
The invention makes it possible to enrich video streams on the terminal frame- or object-accurately with additional functionality (e.g. overlays or linking). Object-related nonlinear interactivity is thereby added to the originally linear video stream.
References
- [1] Bove, V. M. et al.: Hyperlinked television research at the MIT Media Laboratory. IBM Systems Journal, Vol. 39, Nos. 3&4, 2000
- [2] Brunheroto, J. et al.: Issues in Data Embedding and Synchronization for Digital Television. In: Proc. IEEE International Conference on Multimedia and Expo, New York, 30 July - 2 August 2000 (ISBN 0-7803-6536-4)
- [3] Jäger, R.: Objektgenaue Synchronisation von MPEG-1/2 kodierten Videoströmen mit Zusatzinformation. Patent DE 102 40 363 B3
Claims (3)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102008020735A DE102008020735A1 (en) | 2008-04-25 | 2008-04-25 | Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102008020735A DE102008020735A1 (en) | 2008-04-25 | 2008-04-25 | Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor |
Publications (1)
Publication Number | Publication Date |
---|---|
DE102008020735A1 true DE102008020735A1 (en) | 2009-10-29 |
Family
ID=41111762
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DE102008020735A Ceased DE102008020735A1 (en) | 2008-04-25 | 2008-04-25 | Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor |
Country Status (1)
Country | Link |
---|---|
DE (1) | DE102008020735A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106375870A (en) * | 2016-08-31 | 2017-02-01 | 北京旷视科技有限公司 | Video marking method and device |
US11600029B2 (en) | 2012-06-06 | 2023-03-07 | Sodyo Ltd. | Display synchronization using colored anchors |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5774666A (en) * | 1996-10-18 | 1998-06-30 | Silicon Graphics, Inc. | System and method for displaying uniform network resource locators embedded in time-based medium |
WO1998028907A2 (en) * | 1996-12-23 | 1998-07-02 | Corporate Media Partners Doing Business As Americast | Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol |
WO2000020976A2 (en) * | 1998-10-07 | 2000-04-13 | Hotv Inc. | Method and apparatus for synchronous presentation of interactive video and audio transmissions for tv and internet environments |
DE10033134A1 (en) * | 1999-10-21 | 2001-05-03 | Iplacet Softwareentwicklungs G | Method and device for displaying information on selected picture elements of pictures of a video sequence |
WO2002019719A1 (en) * | 2000-08-30 | 2002-03-07 | Watchpoint Media, Inc. | A method and apparatus for hyperlinking in a television broadcast |
DE10240363B3 (en) | 2002-09-02 | 2004-07-29 | Jäger, Rudolf, Dr.rer.nat. | Time synchronization method for synchronizing coded video stream with additional information e.g. for television broadcast or on-demand video stream |
Non-Patent Citations (5)
Title |
---|
Bove, V. M. et al.: Hyperlinked television research at the MIT Media Laboratory. In: IBM Systems Journal, 2000, Vol. 39, Nos. 3&4, pp. 470-478 |
Brunheroto, J. et al.: Issues in data embedding and synchronization for digital television. In: IEEE International Conference on Multimedia and Expo 2000, New York, 30 July - 2 August 2000, Vol. 3, pp. 1233-1236 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11600029B2 (en) | 2012-06-06 | 2023-03-07 | Sodyo Ltd. | Display synchronization using colored anchors |
CN106375870A (en) * | 2016-08-31 | 2017-02-01 | 北京旷视科技有限公司 | Video marking method and device |
CN106375870B (en) * | 2016-08-31 | 2019-09-17 | 北京旷视科技有限公司 | Video labeling method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE69727372T2 | A system and method for generating trick-play video data streams from a compressed normal-playback video data stream | |
DE3814627C2 | | |
DE60310514T2 | Method and apparatus for synchronizing the reproduction of audio and/or video frames | |
US11605403B2 | Time compressing video content | |
DE69722556T2 | Synchronization of a stereoscopic video sequence | |
DE69333789T2 | Coding of continuous image data | |
DE69531223T2 | Method and arrangement for data packet transmission | |
DE60103510T2 | Method and device for simultaneous recording and playback of two different video programs | |
DE2936263A1 | System for still-image operation | |
DE69836470T2 | Transmitter, receiver and medium for a progressive picture signal | |
DE102020108357A1 | Re-encoding predicted pictures in live video stream applications | |
DE19620186A1 | Method and device for synchronizing temporally related data streams | |
DE69915843T2 | Subband coding/decoding | |
WO2009082990A1 | Method and device for real-time multi-view production | |
DE60312711T2 | Method and device for coding picture and/or audio data | |
WO2002078352A1 | Method for compressing and decompressing video data | |
DE102008020735A1 | Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor | |
US20190141366A1 | System and method for insertion of an asset into a source dynamic media | |
US20200260075A1 | Systems and methods for group of pictures encoding | |
DE10240363B3 | Time synchronization method for synchronizing coded video stream with additional information e.g. for television broadcast or on-demand video stream | |
EP1516495B1 | Method for creating a system clock in a receiver device and corresponding receiver device | |
EP0981909B1 | Method and device for coding and decoding a digitized image | |
AT503668B1 | Method and device for presenting signals on a display device | |
EP1267306B1 | Method for marking digital video data | |
DE10308138B4 | Method for synchronizing picture and video phase of two or more MPEG-2 encoded video sequences for digital multi-projection systems | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
OP8 | Request for examination as to paragraph 44 patent law | ||
8122 | Nonbinding interest in granting licences declared | ||
8131 | Rejection |