DE102008020735A1 - Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor - Google Patents

Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor

Info

Publication number
DE102008020735A1
Authority
DE
Germany
Prior art keywords
synchronization
datasets
time
display units
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
DE102008020735A
Other languages
German (de)
Inventor
Rudolf Jaeger
Christopher Koehnen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jäger, Rudolf, Dr.rer.nat.
Köhnen, Christopher
Original Assignee
Jäger, Rudolf, Dr.rer.nat.
Köhnen, Christopher
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jäger, Rudolf, Dr.rer.nat. and Köhnen, Christopher
Priority to DE102008020735A priority Critical patent/DE102008020735A1/en
Publication of DE102008020735A1 publication Critical patent/DE102008020735A1/en
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/8456 Structuring of content, e.g. decomposing content into time segments, by decomposing the content in the time domain
    • H04N 21/43074 Synchronising the rendering of additional data with content streams on the same device, e.g. of EPG data or an interactive icon with a TV program
    • H04N 21/234318 Reformatting operations of video signals by decomposing into objects, e.g. MPEG-4 objects
    • H04N 21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N 21/4349 Demultiplexing of additional data and video streams by extracting from data carousels, e.g. extraction of software modules from a DVB carousel
    • H04N 21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H04N 21/8133 Additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program

Abstract

The method synchronizes frames or objects of video streams with additional information, e.g. overlays. Display units, e.g. frames, are associated with datasets (i) by means of a time axis and coordinates. The datasets form a synchronization anchor for an object of a video and enable the corresponding synchronization. For display units to be synchronized that have no time values (t), the time values of the intermediate display units are determined by linear interpolation, based on the existing number of display units and the synchronization time values lying nearest in the display sequence.

Description

The invention concerns a technical solution that enables object-accurate synchronization of arbitrarily encoded video streams with additional information. The video is processed after the decoding step, i.e. an uncoded video stream is used as source material. Any kind of information that can be displayed on a terminal (PC, set-top box, etc.) in connection with the video signal can serve as additional information. The invention enables time synchronization between display units (e.g. frames) and the additional information. The anchoring adds to the linear video stream a non-linearity that refers to objects, defined by X/Y coordinates, within the video stream. Objects within a sequence of display units of an uncoded video stream can thus be provided with interactivity for the user of the terminal (display of the additional information or further functionality such as linking), i.e. "clickable objects". The solution is applicable both to the Digital Video Broadcast (push) mode and to the video-on-demand (pull) mode. In the latter case it does not matter whether the video material is stored on the terminal or on a remote server. The invention requires processing of the video source material and corresponding functions on the terminal, but places no requirements on the intervening network with respect to synchronous transmission of the packets of the encoded video source material and the additional data.

An approach of this kind was already demonstrated by the work of Bove [1] at the Massachusetts Institute of Technology, Media Laboratory, USA; there, however, uncompressed video was used as source material. In the solution referenced under [1], the anchoring of the video objects and the associated time base, which enables synchronization on the terminal, is applied to uncompressed video material. Moreover, an algorithm based on mesh contours is used for the extraction of the object data.

A method for synchronization between MPEG-2 encoded video material and additional information was published by Brunheroto [2]. There, anchoring is attempted via the Program Clock Reference (PCR) parameter available in MPEG-2. This makes separate transmission of video source material and additional information impossible. Moreover, this approach requires the intervening network to transmit the data packets of the video material and the additional data jitter-free.

In [3], a solution explicitly for MPEG-1/-2 encoded video material is presented. It uses the time values PTS and time code defined for MPEG-1/-2.

Task:

The object of the invention is to specify the use of suitable time values and coordinate data, and measures for synchronization on the terminal.

a) Use of suitable time values and coordinate data

The task is to provide an anchoring for a marked frame, for an object within a frame, or for objects across a series of frames of a video stream with respect to their temporal and spatial (two-dimensional) extent, so that additional information (e.g. overlays or derived functions) can be displayed on the terminal with frame and object accuracy.

b) Synchronization on the terminal

The task is to provide a trigger mechanism so that, with the data obtained under a), a frame- or object-accurate overlay of an arbitrarily encoded video stream with additional information can be achieved.

The described solution was implemented and verified, by way of example, in a browser-based application. Figure 2 shows a screenshot in which the microphone in the video stream is marked by descriptive metadata (overlay) and provided with interactivity ("clickable").

Embodiment:

The invention solves its two subtasks as follows:

a) Obtaining suitable time values and coordinate data

Transform-based compression algorithms, such as those of the MPEG-1/-2 standards, decompose the different picture types, also called frames, into blocks (frame > macroblocks > blocks). The individual objects of the frames are not taken into account, in contrast to MPEG-4, which supports video objects; this invention, however, makes no use of that. The present solution exploits the fact that every video stream has a time axis (media position time). For our solution, the object coordinates (X/Y) and the time values t(i) are extracted at recurring intervals of display units and stored in a dataset. A time axis with reference points is thus used. In addition to the time values, the associated coordinate values (X/Y) of the object are also read out. Both parameters (time value and X/Y coordinates) are used as synchronization anchors for the object-accurate synchronization of additional information (e.g. overlay). The method is therefore applicable to video streams with constant as well as variable frame rate. For video streams with a constant frame rate, the time axis can also be derived from the frame presentation times.
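As a concrete illustration (not part of the patent text; the record layout and names are our own assumptions), such a dataset of synchronization anchors can be sketched in Python as a list of records, each holding a time value t(i) and the object's X/Y coordinates, sampled at a recurring interval of display units:

```python
from dataclasses import dataclass

@dataclass
class SyncAnchor:
    """One dataset entry: a time value plus the object coordinates."""
    t: float  # media position time in seconds
    x: float  # horizontal object coordinate
    y: float  # vertical object coordinate

def extract_anchors(samples, interval):
    """Keep every `interval`-th sampled (t, x, y) tuple as an anchor.

    `samples` stands in for the output of an object-tracking step;
    a real extraction would read the coordinates from the tracker.
    """
    return [SyncAnchor(t, x, y)
            for i, (t, x, y) in enumerate(samples)
            if i % interval == 0]

# Ten tracked positions at 25 fps, anchored every 5th display unit.
samples = [(0.04 * i, 100 + i, 50 + i) for i in range(10)]
anchors = extract_anchors(samples, interval=5)
print(len(anchors))  # 2 anchors: at indices 0 and 5
```

The anchors would then be delivered to the terminal ahead of the video stream, as described under b) below.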

Figure 1 shows an example in which the object data (time values, coordinates) have been recorded for different points in time (variable frame rate). The coordinates are converted into relative positions so that the object data can be reused in video streams of different resolution or encoding.
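The conversion into relative positions can be sketched as follows (a hypothetical helper under our own naming assumptions, assuming the frame dimensions are known): absolute pixel coordinates are divided by the frame size and can later be scaled back to any target resolution.

```python
def to_relative(x, y, width, height):
    """Convert absolute pixel coordinates into resolution-independent
    fractions of the frame size."""
    return x / width, y / height

def to_absolute(rx, ry, width, height):
    """Map relative coordinates back onto a (possibly different)
    target resolution."""
    return rx * width, ry * height

# Object at (360, 288) in a 720x576 frame ...
rx, ry = to_relative(360, 288, 720, 576)   # (0.5, 0.5)
# ... reused in a 1280x720 version of the same stream.
print(to_absolute(rx, ry, 1280, 720))      # (640.0, 360.0)
```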

These extracted time values, which every playback hardware and software uses to represent the linear time axis within the video stream, apply to all types of encoding.

Thus, no synchronous transmission of the encoded video material and the additional information is required from the network between sender and terminal, although it would also be conceivable.

The coordinate values of the individual frames are obtained by means of suitable object-analysis or object-tracking algorithms.

The process described above can be used both as an offline process applied to pre-compressed or uncompressed video material and in real-time video systems. The limiting factor for real-time capability is the object-recognition method, which this invention does not prescribe.

b) Synchronization on the terminal

To enable synchronization on the terminal, the extracted time values with the corresponding coordinate data must be present at the terminal before the video stream to be synchronized arrives. Within the terminal, the time parameters are accessed continuously during decoding of the video stream (hardware/software implementation) and compared with the extracted time values (datasets). Synchronization is performed by comparing the extracted and currently read time values; that is, the overlay or further functions are executed. In practice, certain processing delays required for displaying or processing the additional information on the terminal must be taken into account.
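A minimal sketch of this trigger mechanism (our own illustration; any real implementation depends on the player's API for reading the media position time): during playback, the currently read time value is compared against the pre-delivered anchor time values, and the overlay is fired when an anchor is reached, optionally shifted by a known processing delay.

```python
import bisect

def check_trigger(anchor_times, current_t, last_fired, delay=0.0):
    """Return the index of the next anchor to fire, or None.

    anchor_times -- sorted extracted time values (delivered to the
                    terminal before the stream arrives)
    current_t    -- time value currently read during decoding
    last_fired   -- index of the last anchor already executed
    delay        -- processing delay of the terminal, added up front
                    so the overlay appears on time
    """
    idx = bisect.bisect_right(anchor_times, current_t + delay) - 1
    if idx > last_fired:
        return idx  # execute overlay / linked function for this anchor
    return None

anchors = [0.5, 1.0, 2.0]
print(check_trigger(anchors, 0.9, last_fired=0))   # None (anchor 0 done)
print(check_trigger(anchors, 1.02, last_fired=0))  # 1
```

In a real terminal this check would run in the decoding loop; the `bisect` lookup keeps it cheap even for long anchor lists.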

The invention achieves that video streams can be enriched at the terminal with additional functionality (e.g. overlay or linking) with frame or object accuracy. Object-related non-linear interactivity is thereby added to the original, linearly running video stream.

Literature

  • [1] Bove, V. M. et al.: Hyperlinked television research at the MIT Media Laboratory. IBM Systems Journal, Vol. 39, Nos. 3&4, 2000
  • [2] Brunheroto, J. et al.: Issues in Data Embedding and Synchronization for Digital Television. In: Proc. IEEE International Conference on Multimedia and Expo, New York, July 30 - August 2, 2000 (ISBN 0-7803-6536-4)
  • [3] Jäger, R.: Objektgenaue Synchronisation von MPEG-1/2 kodierten Videoströmen mit Zusatzinformation. Patent DE 102 40 363 B3, 29.07.2004

Claims (3)

A temporal synchronization method that makes it possible to synchronize frames or objects of video streams with additional information, characterized in that the display units are linked with datasets by means of a time axis and coordinates. These datasets form the synchronization anchor for an object of a video and enable the corresponding synchronization. A temporal synchronization method according to claim 1, characterized in that, for display units to be synchronized that have no time values, the time values of the intermediate display units are determined by linear interpolation based on the existing number of display units and the synchronization time values lying nearest in the display sequence.
Use of the method according to claim 1 or 2 both for video streams that are broadcast (push operation) and for video streams that are retrieved for display on request from a local memory (memory within the terminal) or a remote memory (memory external to the terminal) (on-demand/pull operation).
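The linear interpolation of claim 2 can be sketched as follows (an illustrative helper under our own naming assumptions, not part of the claims): given the time values of the nearest anchored display units in the display sequence, the time values of the intermediate display units are spaced evenly between them.

```python
def interpolate_times(t_start, t_end, n_units):
    """Linearly interpolate time values for display units that carry
    no synchronization time value of their own.

    t_start, t_end -- time values of the nearest anchored display units
    n_units        -- number of intermediate display units between them
    Returns the interpolated time values, in display order.
    """
    step = (t_end - t_start) / (n_units + 1)
    return [t_start + step * (i + 1) for i in range(n_units)]

# Example: anchors at t=1.0 s and t=2.0 s with three un-anchored
# display units between them.
print(interpolate_times(1.0, 2.0, 3))  # [1.25, 1.5, 1.75]
```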
DE102008020735A 2008-04-25 2008-04-25 Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor Ceased DE102008020735A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102008020735A DE102008020735A1 (en) 2008-04-25 2008-04-25 Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102008020735A DE102008020735A1 (en) 2008-04-25 2008-04-25 Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor

Publications (1)

Publication Number Publication Date
DE102008020735A1 true DE102008020735A1 (en) 2009-10-29

Family

ID=41111762

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102008020735A Ceased DE102008020735A1 (en) 2008-04-25 2008-04-25 Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor

Country Status (1)

Country Link
DE (1) DE102008020735A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106375870A (en) * 2016-08-31 2017-02-01 北京旷视科技有限公司 Video marking method and device
US11600029B2 (en) 2012-06-06 2023-03-07 Sodyo Ltd. Display synchronization using colored anchors

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5774666A (en) * 1996-10-18 1998-06-30 Silicon Graphics, Inc. System and method for displaying uniform network resource locators embedded in time-based medium
WO1998028907A2 (en) * 1996-12-23 1998-07-02 Corporate Media Partners Doing Business As Americast Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol
WO2000020976A2 (en) * 1998-10-07 2000-04-13 Hotv Inc. Method and apparatus for synchronous presentation of interactive video and audio transmissions for tv and internet environments
DE10033134A1 (en) * 1999-10-21 2001-05-03 Iplacet Softwareentwicklungs G Method and device for displaying information on selected picture elements of pictures of a video sequence
WO2002019719A1 (en) * 2000-08-30 2002-03-07 Watchpoint Media, Inc. A method and apparatus for hyperlinking in a television broadcast
DE10240363B3 (en) 2002-09-02 2004-07-29 Jäger, Rudolf, Dr.rer.nat. Time synchronization method for synchronizing coded video stream with additional information e.g. for television broadcast or on-demand video stream

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5774666A (en) * 1996-10-18 1998-06-30 Silicon Graphics, Inc. System and method for displaying uniform network resource locators embedded in time-based medium
WO1998028907A2 (en) * 1996-12-23 1998-07-02 Corporate Media Partners Doing Business As Americast Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol
WO2000020976A2 (en) * 1998-10-07 2000-04-13 Hotv Inc. Method and apparatus for synchronous presentation of interactive video and audio transmissions for tv and internet environments
DE10033134A1 (en) * 1999-10-21 2001-05-03 Iplacet Softwareentwicklungs G Method and device for displaying information on selected picture elements of pictures of a video sequence
WO2002019719A1 (en) * 2000-08-30 2002-03-07 Watchpoint Media, Inc. A method and apparatus for hyperlinking in a television broadcast
DE10240363B3 (en) 2002-09-02 2004-07-29 Jäger, Rudolf, Dr.rer.nat. Time synchronization method for synchronizing coded video stream with additional information e.g. for television broadcast or on-demand video stream

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Bove, V. M. et al.: Hyperlinked television research at the MIT Media Laboratory. In: IBM Systems Journal, 2000, Vol. 39, Nos. 3&4, pp. 470-478 *
Brunheroto, J. et al.: Issues in data embedding and synchronization for digital television. In: IEEE International Conference on Multimedia and Expo 2000, New York, July 30 - August 2, 2000, Vol. 3, pp. 1233-1236 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11600029B2 (en) 2012-06-06 2023-03-07 Sodyo Ltd. Display synchronization using colored anchors
CN106375870A (en) * 2016-08-31 2017-02-01 北京旷视科技有限公司 Video marking method and device
CN106375870B (en) * 2016-08-31 2019-09-17 北京旷视科技有限公司 Video labeling method and device

Similar Documents

Publication Publication Date Title
DE69727372T2 (en) A SYSTEM AND METHOD FOR GENERATING TRICTION PLAY VIDEO DATA STREAMS FROM A COMPRESSED NORMAL PLAYBACK VIDEO DATA STREAM
DE3814627C2 (en)
DE60310514T2 (en) Method and apparatus for synchronizing the reproduction of audio and / or video frames
US11605403B2 (en) Time compressing video content
DE69722556T2 (en) Synchronization of a stereoscopic video sequence
DE69333789T2 (en) Encoding of continuous image data
DE69531223T2 (en) METHOD AND ARRANGEMENT FOR DATA PACKET TRANSFER
DE60103510T2 (en) METHOD AND DEVICE FOR SIMULTANEOUS RECORDING AND PLAYBACK OF TWO DIFFERENT VIDEO PROGRAMS
DE2936263A1 (en) SYSTEM FOR STILL IMAGE DRIVING
DE69836470T2 (en) TRANSMITTER, RECEIVER AND MEDIUM FOR PROGRESSIVE PICTURE SIGNAL
DE102020108357A1 (en) RE-ENCODING PREDICTED IMAGES IN LIVE VIDEOSTREAM APPLICATIONS
DE19620186A1 (en) Method and device for synchronizing temporally related data streams
DE69915843T2 (en) PART BAND CODING / decoding
WO2009082990A1 (en) Method and device for real-time multi-view production
DE60312711T2 (en) METHOD AND DEVICE FOR CODING PICTURE AND / OR AUDIO DATA
WO2002078352A1 (en) Method for compressing and decompressing video data
DE102008020735A1 (en) Time synchronization method for e.g. video stream that is broadcast during push operation, involves associating display units with datasets using time axis and coordinates, where datasets form synchronization anchor
US20190141366A1 (en) System and method for insertion of an asset into a source dynamic media
US20200260075A1 (en) Systems and methods for group of pictures encoding
DE10240363B3 (en) Time synchronization method for synchronizing coded video stream with additional information e.g. for television broadcast or on-demand video stream
EP1516495B1 (en) Method for creating a system clock in a receiver device and corresponding receiver device
EP0981909B1 (en) Method and device for coding and decoding a digitized image
AT503668B1 (en) METHOD AND DEVICE FOR PRESENTING SIGNALS ON A DISPLAY DEVICE
EP1267306B1 (en) Method for marking digital video data
DE10308138B4 (en) Method for synchronizing picture and video phase of two or more MPEG-2 encoded video sequences for digital multi-projection systems

Legal Events

Date Code Title Description
OP8 Request for examination as to paragraph 44 patent law
8122 Nonbinding interest in granting licences declared
8131 Rejection