WO2012028851A1 - Method and system for additional service synchronisation


Info

Publication number
WO2012028851A1
WO2012028851A1 PCT/GB2011/001288
Authority
WO
WIPO (PCT)
Prior art keywords
time
broadcast
user device
data
server
Application number
PCT/GB2011/001288
Other languages
French (fr)
Inventor
Michael Sparks
Original Assignee
British Broadcasting Corporation
Application filed by British Broadcasting Corporation filed Critical British Broadcasting Corporation
Publication of WO2012028851A1 publication Critical patent/WO2012028851A1/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85: Assembly of content; Generation of multimedia applications
    • H04N21/854: Content authoring
    • H04N21/8547: Content authoring involving timestamps for synchronizing content
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/4104: Peripherals receiving signals from specially adapted client devices
    • H04N21/4126: The peripheral being portable, e.g. PDAs or mobile phones
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04H: BROADCAST COMMUNICATION
    • H04H20/00: Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/18: Arrangements for synchronising broadcast or distribution via plural systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04H: BROADCAST COMMUNICATION
    • H04H60/00: Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/09: Arrangements for device control with a direct linkage to broadcast information or to broadcast space-time; Arrangements for control of broadcast-related services
    • H04H60/13: Arrangements for device control affected by the broadcast information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04H: BROADCAST COMMUNICATION
    • H04H60/00: Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/35: Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users
    • H04H60/38: Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users, for identifying broadcast time or space
    • H04H60/40: Arrangements for identifying or recognising characteristics with a direct linkage to broadcast information or to broadcast space-time, e.g. for identifying broadcast stations or for identifying users, for identifying broadcast time

Definitions

  • This invention relates to audience experience of broadcast audio-video content.
  • In particular, it relates to the degree to which a viewer can adapt and personalise the traditional broadcast experience using content derived from a secondary data source.
  • Television broadcasting has been around in commercial form since the 1930s, and whilst the technologies to deliver it have changed over time, the user experience remains largely the same - a single box with a single screen and one or more speakers playing video and audio chosen by a broadcaster. Since television, and broadcasting in general, is fundamentally about storytelling - having a specific take on a story by a given storyteller - this is unsurprising. However, how we enjoy a story can depend largely on the senses and capabilities we have available. If I am blind, I need more audio cues. If I am deaf, I need more visual cues. If I speak a different language, I need the story in my language.
  • Example adaptations include subtitles, teletext, interactive programming (eg MHEG), signing and audio description (also known as narrative subtitles). Whilst these adaptations have often been termed "accessibility", they are generally useful adaptations and the term accessibility should be understood broadly to include any functionality to improve access to content. Someone can watch subtitles whilst another person is on the phone. Another may listen to the audio and audio description whilst using a second device that uses their eyes.
  • Broadcast spectrum is limited. Better use of broadcast spectrum has been achieved by switching from analogue broadcast to digital broadcast. Balances between signal robustness and bit rate have been struck in an attempt to increase available bandwidth. Video and audio quality has been degraded in order to fit in ever more channels and ever more adaptation services. Despite all of these technological changes, there are both theoretical and practical limits on how many services can fit in the broadcast chain. There is, in essence, a broadcast bandwidth limit. Over the past 20-25 years, however, newer mechanisms for delivery of content have been growing in popularity. Examples here include the Internet and mobile telephone networks (both voice and data).
  • Clients of such networks have become very small, very powerful devices, more than capable of playing back audio and video, and a plethora of user-choice-driven services for audio and video have come into existence. Furthermore, the capabilities of the audience have grown beyond simply being able to have one video and audio display. They may have a laptop, several mobile phones, and even a network connected audio system, including a 3D surround sound system, all within a single room. Adapting the broadcast service to such systems is possible, but again hits the broadcast bandwidth limit, limiting the number of services which can expand to such a system.
  • An example here is of a television or set top box with integrated network connection.
  • These typically timestamp the Internet delivered secondary content (eg secondary audio track) with a presentation timestamp that correlates with a presentation timestamp in the broadcast content, allowing the two signals to be integrated in the receiver.
  • U.S. Pat. No. 7,634,798 to Watson discloses a system for enhanced interactive television. This does indeed take advantage of the multi-device nature of the modern home, allowing a user to participate in interactive experiences in time with a broadcast event.
  • However, this system does not attempt to enhance the linear nature of storytelling; in particular, it offers a process-oriented approach based upon the execution of commands, with interactivity requiring the use of a common clock. As a result this approach is primarily oriented towards "events" such as quizzes, rather than general programmes.
  • there is prior art aimed at enabling more detail regarding advertising to be displayed synchronously with a programme, which may be viewed as a form of content adaptation.
  • It is preferable for a local device to be able to synchronise retrieved additional content without modification to the broadcast chain and, in particular, without modification to the broadcast receiver device.
  • the invention provides a system for synchronising retrieved additional service data with a broadcast service as received at a broadcast receiver by additionally receiving the broadcast service at a local time server and supplying a time or synchronisation signal from the local time server to an additional service device.
  • the use of such an extra local time server allows the additional service device, such as a laptop, mobile or the like, to determine the time of programme content as received at the local time server and hence, within the needed tolerance, as received at the broadcast receiver, and thereby to synchronise assertion of additional services with the programme at the broadcast receiver.
  • the additional device is thereby provided with a synchronisation signal relative to the programme as received at the broadcast receiver, but without requiring a connection between the additional device and the broadcast receiver.
  • Figure 1 shows an overview of a system embodying the invention
  • Figure 2 is a schematic view of the core functional components of a system embodying the invention
  • Figure 3 is a time line showing receipt of timed signals from a broadcast time server to an additional service device
  • Figure 4 is a time line showing transmission of a test time signal from an additional service device to a broadcast time server and back again for calculation of transmission time delay;
  • Figure 5 is a schematic view of the core functional components of a user device.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • the embodiment of the invention seeks to provide the viewer the possibility to enhance their traditional television broadcast experience, by using the devices they own to play back additional adaptation content synchronously with the broadcast.
  • the audience's devices utilise a network connection to identify the programme being watched, the time into the programme and the broadcaster's concept of time as received at the user's receiver. This enables these additional devices to synchronise a local playout scheduler for playback of additional content.
  • additional adaptive content can be played back timed relative to a programme, or to a schedule.
  • This additional content can be obtained by the audience's devices via a number of means, including being pulled from a network (Internet, mobile, etc), pushed from the network by the broadcaster (eg over XMPP, SMS, MMS), retrieved from pre-recorded media (CDROM, DVD, memory stick etc), or even taken from a previous broadcast by the broadcaster.
  • the system does not require continuous connection to a central server, and enables each device the user or users own to be synchronised to the broadcast, allowing unlimited adaptations, removing the broadcast bandwidth limitation, without change to a pre-existing broadcast infrastructure.
  • Combining the adaptability of modern networks, such as the Internet, with broadcast enables broadcasters to send additional adaptation services over such modern networks, without the need for additional broadcast spectrum, essentially enabling them to break the broadcast bandwidth limit.
  • the broadcaster broadcasts the common core of the story being told using audio and video, but can present adaptations - for example subtitles, audio description, director's narrative - via another medium, such as a network based around user choice - such as the Internet or mobile.
  • a broadcast transmitter 10 transmits television programmes (over air or by cable) which are received at user receivers 12, which may be set-top-boxes, Freeview receivers, television receivers, or other receivers of broadcast audio-video services. It is noted now, for ease of understanding, that there is a broadcast delay between the broadcast transmitter 10 and the receiver 12 of a user.
  • the user also has an additional device, such as a mobile telephone, laptop or indeed any user device 14 capable of receiving data related to the programme being received by the receiver 12.
  • Such data may be considered as "additional data" 16 in the sense that it supplements or is additional to the corresponding programme being broadcast and received at the receivers.
  • the additional data may also be referred to as supplementary data, additional service data or the like, the important point being that the data in some way relates to the programme broadcast from the transmitter and received at the receivers.
  • the user device may be referred to as an additional service device for ease of description. It is desired that this additional service device should assert retrieved additional data relating to the programme being viewed on the receiver 12 in synchronisation with that programme.
  • the system does not have any connection between the additional service device 14 and the broadcast receiver 12 because it is preferred not to require any modification to a standard broadcast receiver 12.
  • a broadcast time server 18 which receives the broadcast signals and extracts from the signals broadcast timing information and programme time information which the additional service device can then receive.
  • the additional service device is thereby able to calculate based on its own internal clock when additional content should be asserted in relation to the programme as received at the local time server and therefore as received at the user's own receiver 12.
  • the additional service may comprise text, audio-video, 3D information, telemetry and a wide variety of other possible data, and can be retrieved by any suitable route, such as via the Internet or a mobile telephone network.
  • the system shown in Figure 2 comprises a broadcast transmitter 10 and receiver 12 as already described and an additional service device 14, such as a mobile telephone, laptop or the like which can retrieve additional service data from an additional service data store 16.
  • a broadcast time server 18 also receives the broadcast programme from the broadcast transmitter 10 over a broadcast channel 24.
  • the additional service device may communicate with the broadcast time server over a communication link 22.
  • the broadcaster makes available a television broadcast service (TBS) which includes a programme service, a time clock, along with now and next information, which identifies the currently broadcast programme.
  • This may be as simple as an analogue service with a clock on a teletext data service along with a now and next page.
  • the time is taken from a digital video broadcasting (DVB) time and date table of the programme status information, and the now and next information taken from the event information table of the programme status information.
  • the time clock broadcast by the transmitter 10 and received at the broadcast time server 18 is indicative of the broadcast time clock as received at the receiver 12.
  • the difference in time of receipt depends upon the difference between the time A of the transmission via the broadcast channel 20 in comparison to the time B of transmission of the broadcast channel 24.
  • the broadcast time server 18 may therefore be local to a locality, such as a particular city or geographical area of a country, such that the time difference between time A and time B is of the order of tens of milliseconds and therefore imperceptible to the user.
  • the locality may therefore be considered as a "transmitter region" in the sense that all receivers receiving the broadcast signal from a given transmitter are considered to be within the same locality.
  • the locality may therefore be a certain distance from the terrestrial broadcast antenna.
  • the locality may be the footprint of a satellite which could cover an entire country.
  • the concept of "locality" therefore relates to the fact that all receivers in the given locality receive the broadcast transmission at substantially the same time and any differences in time of arrival of the broadcast transmission are imperceptible to a user.
  • the now and next information mentioned above is a specific example of programme clock information and is the preferred choice in a DVB implementation of the system.
  • the now and next information indicates the start of a given programme with reference to the broadcast time clock.
  • the now and next information is typically broadcast once per second giving an indication of the current programme and next programme to be broadcast.
  • an extra now and next information signal is transmitted when a programme changes. This junction between different now and next signals may be used as the indicator of a new programme relative to the broadcast time clock.
  • the broadcast time server 18 thus receives a broadcast time clock within an allowable tolerance and also the timing of a programme being broadcast relative to the broadcast time clock.
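The junction-based programme timing described above can be sketched as a simple monitor over samples of the now and next information. The sample format and function name here are illustrative assumptions, not part of the specification:

```python
def programme_start_times(samples):
    """Watch a stream of (broadcast_time, current_programme) samples and
    record the junction at which each new programme first appears; that
    junction is taken as the programme's start on the broadcast clock."""
    starts = {}
    previous = None
    for broadcast_time, programme in samples:
        if programme != previous:            # now and next information changed
            starts[programme] = broadcast_time
            previous = programme
    return starts
```

Since the now and next information is typically broadcast once per second, the detected start time is accurate to within that sampling interval.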
  • the broadcaster makes the broadcast time server 18 (BTS) available over a communications network 22.
  • This time server provides the broadcaster concept of time, as received by receivers of the broadcast, rather than as generated by a playout system prior to transmission, and thus represents actual broadcast time, irrespective of delays integral to a playout system which may change over time.
  • the broadcast time server also provides a current programme server (CPS) available over the communications network 22. This receives the now and next information, for example from DVB's programme status information's event information table, to identify the current broadcast programme, and makes this available as a network queryable service.
  • the broadcast time server also provides a time into current programme server (TICPS) available over the communications network. This receives the now and next information, and monitors it for changes. When the now and next information changes, the system takes the junction point as the start time for the new programme.
  • the broadcast time server (BTS) thereby provides current programme and time into current programme with which the additional service can synchronise.
  • the broadcast time server may derive the time signal from the audio-video transmission in a variety of ways.
  • One approach may be based on timing of subtitles for a programme. Subtitles are timed accurately relative to a programme and so extracting the subtitles and program clock reference from the subtitles gives a direct indication of time into the current programme, and this may be used as the derived broadcast time signal.
  • the broadcaster also makes available an additional services server system 16 (ASSS) available over a communications network 26.
  • These services are made available as declarative timed schedules.
  • the simplest of these comprises a list of timestamps and textual information.
  • More complex schedules comprise a list of triples, where each triple is a timestamp, a datatype tag, and an arbitrary octet sequence. These timestamps may be relative to the current programme time, or relative to the broadcaster's concept of time.
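Such a schedule of triples might be represented as follows; the field names and example values are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    timestamp: float    # relative to programme start, or to the broadcast clock
    datatype_tag: str   # interpretation rule, defined by the broadcaster
    payload: bytes      # arbitrary octet sequence

# A minimal subtitle-like additional service description (hypothetical values):
schedule = [
    ScheduleEntry(1.5, "text/plain", b"Once upon a time..."),
    ScheduleEntry(4.0, "text/html", b"<i>later that day</i>"),
]
```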
  • the system requires the broadcaster to ensure that the now and next information as broadcast is synchronised with the programme. This is practical due to automated playout systems, and essentially a configuration for the broadcaster's existing equipment, rather than a change to the broadcaster's existing equipment.
  • the audience has a standard receiver 12 which receives the television broadcast service and displays this.
  • An example receiver is a traditional television, another would be a digital video broadcasting receiver.
  • the audience also has a secondary device 14 having a network client.
  • the audience may have a plurality of such devices.
  • Each network client is synchronised with the broadcast chain, and hence with the audience's receiver as will now be described.
  • the user configures the device for a particular broadcast service.
  • the configuring of a device for a particular broadcast service may be by any convenient user interface.
  • a typical approach on a smart mobile telephone would be to access a mobile compatible web page from which a given broadcast service may be selected by a user.
  • the additional service device may then immediately commence retrieval of additional service data and cache this data in readiness for presentation to the user at the appropriate time relative to the broadcast being viewed on the television receiver 12.
  • the additional service data may be retrieved from a remote server as shown over a network, but equally may be retrieved from a local store or from any other storage arrangement such as CDROM, memory and so on.
  • the device also contacts the time server 18 and synchronises a client local application clock (CLAC).
  • the synchronisation of the local clock of the additional service device 14 with the time server 18 can be achieved in a number of ways.
  • a possible approach is simply to receive a number of time samples from the time server to reproduce a clock locally based on calculating the skew and drift relative to the system clock of the client additional service device 14.
  • such an approach would omit any calculation of the network latency of the network 22 via which the time signals are provided.
  • a time delta D may exist between the broadcast time server and the additional service device.
  • the preferred approach to calculate this latency is for the additional service device to transmit a signal to the broadcast time server and for the broadcast time server to return a signal, so that the additional service device can calculate the time delta D from the transmitted and received signals.
  • a time server may choose to allow the client to synchronise using a known network time synchronisation algorithm, such as Marzullo's algorithm (as used by the Network Time Protocol (RFC 1305), the Simple Network Time Protocol (RFC 4330), etc).
  • the broadcast time server can repeatedly transmit a clock signal in the time server domain (TS1 , TS2...TSN). This is received after a latency time D at the additional service device.
  • the time signals are received at corresponding times in a local clock time domain (LC1 , LC2...LCN).
  • a broad view of time at the local device in the broadcast time domain may then be calculated, so that the connection does not need to be permanently maintained. This is done by calculating the relative drift of the two clocks, dividing the difference between TSN and TS1 by the difference between LCN and LC1 to derive the relative rate of the clocks.
  • Multiplying this relative rate by the difference between the local clock at any given point in time and the local clock at a start point provides a broad view of time in the broadcast time server domain effectively removing any different rates of the two clocks.
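Under the stated assumptions (repeated time samples, latency not yet accounted for), this calculation might be sketched as follows; the function name is an assumption:

```python
def broad_view_of_time(samples, local_now):
    """Estimate the broadcast time server's clock from (TS, LC) sample pairs,
    where TS is the server time carried in a received signal and LC is the
    local clock reading at receipt. The relative rate of the two clocks is
    (TSN - TS1) / (LCN - LC1); the latency delta D is NOT handled here."""
    ts1, lc1 = samples[0]
    tsn, lcn = samples[-1]
    rate = (tsn - ts1) / (lcn - lc1)      # relative rate of the two clocks
    return ts1 + (local_now - lc1) * rate # drift-corrected broad view of time
```

Once the rate is known, the device can compute broadcast time from its own clock alone, without staying connected to the time server.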
  • this does not take account of the latency delta D and for this purpose the client device can determine this delta D by transmission and reception of a test signal, as shown in Figure 4.
  • the preferred test signal is for the local device simply to transmit a time stamp at a given time which may be in the broad view of time domain calculated above, here shown as BVT1.
  • the time stamp is received at the broadcast time server and immediately transmitted back to the local device, along with a time indicator in the time server time domain, here shown as TS1.
  • This signal, containing BVT1 and TS1, is received at the local device at a second time BVT2.
  • the additional service device can then calculate the time delta D by simple subtraction, since TS1 - BVT1 spans the round trip and so gives twice the time delta, 2D. Alternatively, if it is assumed that the uplink and downlink times are the same, the additional service device can calculate BVT2 - BVT1, which likewise gives twice the time delta, 2D.
  • the broadcast time server provides to the additional service device the ability to derive the broadcast time clock as received at the receiver 12 and also start and stop times of programmes relative to the broadcast clock.
  • the synchronisation of the local clock of the user device with the time server as described above may be performed just once when a user requests additional data to be presented at their user device; thereafter the local clock in the user device is sufficiently accurate that content can be presented relative to the local clock. If a user requests the additional content just once, though, and then continues to use their device all day, perhaps using data related to a given television channel, it is possible for the local clock to become shifted relative to the broadcast time server. Accordingly, the broadcast time server may periodically push the synchronisation signals described above, or alternatively the user device may periodically pull the time synchronisation signals.
  • the device's network client can then query the current programme server for the current programme, and use this information to choose which additional service to use.
  • This may be one stored locally - for example on a DVD, CDROM, hard drive, or similar storage device - or from a network location, such as an HTTP download, FTP download, or by requesting the additional service description (ASD) from a mobile system, for example sending an SMS to a server and receiving an MMS reply with the ASD as the payload.
  • the network client can then use the additional service description to schedule events to occur at particular times relative to the network client's local application clock (CLAC).
  • the network client interprets the message according to rules appropriate for the specific additional service description.
  • the event is defined as simply the textual data.
  • the interpretation of such data is to simply display the data.
  • the textual data display system may choose to detect HTML formatted text, and render such fragments according to HTML rendering rules.
  • an additional service description triple is a timestamp, datatype tag, and an arbitrary octet sequence
  • the event is defined to be the contents of the octet sequence, interpreted according to rules defined relative to the datatype tag.
  • the rules for the datatype tag are defined relative to the broadcaster.
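This tag-driven interpretation might be sketched as a dispatch table; the tag names and handlers below are assumptions for illustration, since the actual rules are defined by the broadcaster:

```python
# Hypothetical broadcaster-defined rules, keyed by datatype tag.
HANDLERS = {
    "text/plain": lambda payload: payload.decode("utf-8"),
    "text/html": lambda payload: payload.decode("utf-8"),  # rendered per HTML rules
}

def interpret(timestamp, datatype_tag, payload):
    """Interpret one additional service description triple: the event is the
    octet sequence, interpreted according to the rule for its datatype tag.
    (The timestamp is consumed by the scheduler, not the interpreter.)"""
    handler = HANDLERS.get(datatype_tag)
    if handler is None:
        return None    # no rule defined for this tag by this broadcaster
    return handler(payload)
```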
  • Figure 5 is a schematic view of the core functional components of a user device, along with its external dependencies on the broadcast time server and additional service data store.
  • the broadcast time server provides three services on a given IP address, with each service on a separate port.
  • a programme time summary service is provided on port 1700.
  • the user client uses these, along with the data from the additional service data store, to control one or more output devices in a timely fashion.
  • the user client's broadcast time synchronisation subsystem 30 initiates a TCP connection to port 1800 on the broadcast time server 18.
  • the time server responds by sending an octet stream to the client.
  • the octet stream forms a string, terminated by the connection close.
  • the octet sequence forms a string of textual characters representing a floating point number.
  • the user client may then parse this string to turn it into an internal representation of a floating point number.
  • the floating point number relates to the number of seconds since the beginning of what is known as the Unix Epoch - or more specifically the precise length of time, according to the broadcast time server, that has elapsed, in seconds, since 00:00:00 on the 1st of January 1970.
  • This time is the remote baseline time.
  • the user client's broadcast time synchronisation subsystem 30 then reads the user device's system clock, and that time is denoted the local baseline time. For example, the octet stream "1249038001.709007" represents the time Friday Jul 31 11:00:01 2009 and 0.709007 seconds.
  • the broadcast time synchronisation subsystem 30 can initiate a second TCP connection to port 1800 on the broadcast time server. Again, it receives a time back. This time is denoted the remote current time. The system clock is read, and this is denoted the local current time.
  • the remote elapsed time is calculated by subtracting the remote baseline time from the remote current time.
  • the local elapsed time is calculated by subtracting the local baseline time from the local current time.
  • a ratio denoting the skew between the two clocks can be calculated by dividing the remote elapsed time by the local elapsed time. This enables a calculation that transforms a local time derived from the system clock into the remote time; that is, it allows the user device to map from local system clock time to the broadcast view of time as received by a broadcast receiver. This requires the triplet of information (local baseline time, remote baseline time, clock ratio).
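The calibration steps above might be sketched as follows. The wire format (a textual decimal, connection close as terminator) and port 1800 are from the text; the host name, helper names, and use of Python sockets are assumptions:

```python
import socket

def parse_time_octets(octets: bytes) -> float:
    """The server's octet stream is a textual decimal number of seconds
    elapsed since the Unix Epoch."""
    return float(octets.decode("ascii").strip())

def read_remote_time(host: str, port: int = 1800) -> float:
    """One sample: connect, read until the server closes, then parse."""
    with socket.create_connection((host, port)) as sock:
        buf = b""
        while chunk := sock.recv(4096):
            buf += chunk
    return parse_time_octets(buf)

def clock_ratio(remote_baseline, local_baseline, remote_current, local_current):
    """Skew ratio: remote elapsed time divided by local elapsed time."""
    return (remote_current - remote_baseline) / (local_current - local_baseline)
```

Calling read_remote_time twice, reading the system clock alongside each call, yields the (local baseline time, remote baseline time, clock ratio) triplet.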
  • the user application clock provides two services to other subsystems. One is the ability to provide the current time according to the broadcast view of time; the other is to sleep for a given number of seconds, including fractional seconds, according to the broadcast view of time. Thus, given a time "now" taken from the system clock, a first order approximation of the broadcast time can be derived from the calculation: remote baseline time + (now - local baseline time) * ratio.
  • the application clock simply divides the requested number of seconds to sleep by the ratio. This is to transform broadcast elapsed time into local elapsed time. The sleep service may then sleep for this time period, waking after the local elapsed time in sync with the remote elapsed time.
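Putting the two services together, a client local application clock might look like this sketch; the class and parameter names are assumptions, while the formulas follow the text:

```python
import time

class ApplicationClock:
    """Maps the user device's system clock onto the broadcast view of time
    using the (local baseline, remote baseline, ratio) calibration triplet."""

    def __init__(self, local_baseline, remote_baseline, ratio, delay=0.0):
        self.local_baseline = local_baseline
        self.remote_baseline = remote_baseline
        self.ratio = ratio    # remote elapsed time / local elapsed time
        self.delay = delay    # one-way network delay, once measured

    def now(self, system_now=None):
        """First order approximation of the broadcast time."""
        if system_now is None:
            system_now = time.time()
        return (self.remote_baseline
                + (system_now - self.local_baseline) * self.ratio
                + self.delay)

    def sleep(self, broadcast_seconds):
        """Sleep for a broadcast-time duration: dividing by the ratio
        transforms broadcast elapsed time into local elapsed time."""
        time.sleep(broadcast_seconds / self.ratio)
```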
  • the network delay measurement subsystem 34 retrieves the broadcast view of time from the application clock. This time is denoted the send time, represented by a floating point number giving the number of seconds elapsed since the 1st of January 1970. It then initiates a TCP connection to the broadcast time server on port 1801 and sends an octet sequence string representation of the send time. This string is terminated by the addition of a network end of line - that is "\r\n", or specifically the raw octet values 13 and 10 respectively.
  • the broadcast time server treats the network end of line "\r\n" as a message terminator, and throws the network end of line away. It then appends a space to the message, and then appends the current time encoded as an octet sequence, which is again a string representation of a floating point value representing the number of seconds elapsed since the 1st of January 1970.
  • the broadcast time server then terminates the TCP connection to signify that it has finished sending its response message. For example, if the network delay is 50ms, then the time on the time server will be 50ms ahead of the user device's application clock. Additionally, the message from the user device's network delay measurement subsystem will take an additional 50ms to reach the server.
  • Suppose the user device sends its message at application clock time 1249038300.000000. At that moment, the broadcast time server's clock would be 1249038300.050000.
  • When the message arrives at the server, the broadcast time server's clock would be 1249038300.100000.
  • the response message sent by the broadcast time server to the user device would therefore be "1249038300.000000 1249038300.100000"
  • the user client device can then parse these two timestamps to give the sent time and the remote time.
  • the user client also retrieves an expected time from the application clock. This time should match the remote time within a certain tolerance level.
  • the tolerance level used by the preferred embodiment is 10ms. If the difference between the remote time and the expected time is not within this tolerance, the user device restarts the clock calibration process.
  • the round trip network delay is then calculated by subtracting the sent time from the remote time.
  • the one way network delay, and hence error in the local application clock can then be determined by dividing the round trip network delay by two. This delay is then used to calibrate the user device application clock.
  • the user device application clock simply uses this network delay by adding it to the times it currently calculates.
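The delay measurement exchange above might look like the following in outline. The function names are assumptions; only the wire format (timestamp string plus "\r\n", server closing the connection to end its reply) and the halving arithmetic follow the text.

```python
import socket

def measure_network_delay(server_host, send_time, port=1801):
    """Sketch of the exchange described above: send the device's
    broadcast-view timestamp, read the server's space-separated reply
    (terminated by the server closing the connection), and return the
    parsed (sent_time, remote_time) pair."""
    with socket.create_connection((server_host, port)) as sock:
        sock.sendall(f"{send_time:.6f}\r\n".encode("ascii"))
        data = b""
        while chunk := sock.recv(4096):  # server closes to end the message
            data += chunk
    sent_str, remote_str = data.decode("ascii").split()
    return float(sent_str), float(remote_str)

def one_way_delay(sent_time, remote_time):
    """Round trip delay is the remote time minus the sent time;
    halve it to estimate the one way delay."""
    return (remote_time - sent_time) / 2.0
```

Using the worked example from the text, a sent time of 1249038300.000000 and a remote time of 1249038300.100000 give a 100ms round trip and hence a 50ms one way delay.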
  • the application clock is synchronised with the broadcast view of time as received by a broadcast receiver.
  • the user is then required to inform the user device what channel they are watching.
  • the user must do this because in the preferred embodiment the user's set top box and broadcast chain are unmodified.
  • other embodiments may take this from another system that is able to communicate to enhanced set top boxes that can inform external devices what channel the set top box is tuned to.
  • the channel could be determined via audio watermarking or by any other method. In the preferred embodiment, this is achieved by the user typing the channel name on a keyboard, though clearly a graphical menu system or voice recognition system could be used to achieve the same goal.
  • the programme time client 36 then creates a TCP connection to port 1700 on the broadcast time server to connect to the programme time summary service.
  • the programme time client sends the octet string "summary\r\n", that is the single word "summary" followed by a network end of line sequence.
  • the server then sends a response message to this, which is an octet sequence terminated by the connection being closed.
  • the programme time client 36 can then parse this response as follows.
  • the expected format of the response string is ("OK" | "ERROR"), followed by a space, the command tag, a space, the command result, and two network end of line sequences.
  • the command tag is the command the programme time client sent to the server.
  • the response begins with a response code. This can be parsed by searching the string for the first space character, and taking the response code - OK or ERROR - from the characters in the string preceding that first space.
  • the command tag can be found by searching for the second space character in the response.
  • the command can then be extracted from between the first and second space characters in the response.
  • the actual command result can be extracted from the response string by taking the string after the second space character up to, and excluding, the two network end of line characters at the end of the string.
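The parsing steps above amount to splitting on the first two space characters and trimming the terminator. A sketch, with an illustrative function name:

```python
def parse_response(response: str):
    """Split a raw service response into (response code, command tag,
    result), as described above. The trailing pair of network end of
    line sequences is stripped from the result."""
    first_space = response.index(" ")
    second_space = response.index(" ", first_space + 1)
    code = response[:first_space]              # "OK" or "ERROR"
    command_tag = response[first_space + 1:second_space]
    result = response[second_space + 1:]
    if result.endswith("\r\n\r\n"):
        result = result[:-4]
    return code, command_tag, result
```

Note that the result itself may contain spaces (it is JSON), which is why only the first two spaces are used as delimiters.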
  • the response is a JSON encoded message - irrespective of success or error.
  • JavaScript Object Notation (JSON) is an encoding format commonly used by internet clients, and is defined in RFC 4627, as published by the Internet Engineering Task Force (IETF). The standard can be found at this URL: http://www.ietf.org/rfc/rfc4627.txt . This is then decoded by the client using any suitable JSON parser.
  • the response represents a "dictionary" object, that is an object that maps keys to values.
  • the keys are channel names
  • the values are arrays. These arrays are pairs of values - the first is a timestamp at which the programme started, the second is the programme name.
  • the programme start time has been derived by the broadcast time server from the broadcast chain.
  • the preferred embodiment, which uses DVB-T, performs this by monitoring the now and next event information table for changes, and using the change as the programme start time. This could also be derived from subtitle junction changes. It could also be enhanced by the broadcaster inserting a marker indicating the start time of the programme into the broadcast transmission - for example using the related content table descriptor in DVB.
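The junction detection above can be sketched as a simple polling loop. Here `read_now_next` and `clock_now` are hypothetical callables standing in for access to the DVB event information table and to the broadcast clock; neither API is specified in the original.

```python
import time

def watch_for_programme_start(read_now_next, clock_now, poll_interval=0.25):
    """Poll the now/next information and treat the first change as the
    programme start. Returns (start_time, new_programme_info).

    read_now_next and clock_now are assumed callables (hypothetical),
    standing in for DVB event information table access and the
    broadcast-view clock respectively.
    """
    previous = read_now_next()
    while True:
        current = read_now_next()
        if current != previous:
            # the junction between now/next values marks the new programme
            return clock_now(), current
        time.sleep(poll_interval)
```

In a real DVB-T receiver the change would arrive as a table version update rather than by polling, but the junction-as-start-time principle is the same.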
  • the programme start time information is then passed onto Timed Event Scheduler 38.
  • the programme start time and programme name are passed onto the events retriever subsystem 40.
  • the events retriever subsystem 40 uses the programme name to read a given scheduled events file from a file system on the user device.
  • the events retriever subsystem is preconfigured to connect to an additional services server system and makes a request for the scheduled events file for the given programme name being broadcast at the given time.
  • the server is an HTTP (web) server, which responds to POST requests on a preconfigured URL containing the programme name and programme start time.
  • the additional services server system then responds with the scheduled events file.
  • the timed event scheduler subsystem takes the scheduled events file and parses it.
  • the scheduled events file is a JSON format file containing one object representing a schedule.
  • the schedule object is an array of event objects. Each event object is an array consisting of 3 parts - a timestamp, an event type and event data. In the case of the events file being timed against broadcast time, this schedule object can be used "as is" to drive the timed event scheduler 38.
  • the timestamps will be relative to programme time.
  • the timed event scheduler has to take the programme start time - as provided by the programme time server - and add this to each of the timestamps in the schedule object, mapping programme time to broadcast time. Now that the timestamps are relative to broadcast time, the schedule object can then be used to drive the timed event scheduler.
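The rebasing described above is a simple addition over the schedule array. A sketch, with an illustrative function name; the event layout of [timestamp, event type, event data] follows the text:

```python
def to_broadcast_time(schedule, programme_start_time):
    """Rebase a schedule of [timestamp, event_type, event_data] entries
    from programme-relative time onto broadcast time by adding the
    programme start time to each timestamp."""
    return [[timestamp + programme_start_time, event_type, event_data]
            for timestamp, event_type, event_data in schedule]
```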
  • the timed event scheduler then consists of two key portions - a timed scheduler, and an event handler.
  • the timed scheduler uses the application clock to drive a local scheduler. This works through the schedule object - that is the array of events in order, looking at the timestamps.
  • for each timestamp, it looks at the current (broadcast) time as retrieved from the application clock, and subtracts that from the next event's timestamp. It then uses the sleep service from the application clock to sleep for the given (broadcast) time period (or 0s if the difference is less than zero). Once the scheduler has finished sleeping for the given period, the time for the scheduled event has been reached. The scheduler then looks at the event type to determine how to handle the event data. In particular, the scheduler can then send the event data to different output subsystems.
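The scheduler loop above can be sketched as follows. Names are illustrative; `clock` is assumed to provide the now/sleep services of the application clock, and `handlers` maps event types to output subsystems.

```python
def run_schedule(schedule, clock, handlers):
    """Sketch of the timed scheduler: for each event, ordered by
    broadcast timestamp, sleep on the application clock until the event
    time is reached, then dispatch the event data to the handler
    registered for its event type (if any)."""
    for timestamp, event_type, event_data in schedule:
        wait = timestamp - clock.now()
        clock.sleep(max(0.0, wait))  # 0s if the event time has passed
        handler = handlers.get(event_type)
        if handler is not None:
            handler(event_data)
```

Because the sleep is performed in broadcast time via the application clock, the dispatch stays aligned with the programme as received at the user's receiver.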
  • there will be one or more output subsystems 42. These subsystems can be audio systems, text display systems, video display systems or even systems that control physical devices via interface boards such as Arduino. Thus the system can use the event data to produce a variety of synchronised responses in addition to the main broadcast channel.
  • the event type "text" causes the text to be sent to a text display output subsystem;
  • the event type "audio” causes the data to be passed to an audio output subsystem and played back;
  • the event type "arduino” causes the data to be sent to an arduino output subsystem which passes the event data unmodified over a serial port to an attached arduino device which may spin a motor or flash a light, etc in response to the command.
  • an Arduino is just a microcontroller based electronic breakout box. This allows the system to use the event data to control anything from robots through motion systems, lighting systems, etc. This means that the broadcast content can be synchronously augmented by this system by anything within the imagination of the programme maker.
  • the user device makes HTTP requests for each of the 3 services described above.
  • the MIME type of the response is application/json, containing a dictionary object with 3 representations of time - a timestamp as before, an English textual form, and a 9 part array of year, month, day, hours, minutes, seconds, weekday (0..6, Monday is 0), day in year, and whether daylight saving is active.
  • An example response is: {"localtime": [2010, 7, 5, 17, 21, 10, 0, 186, 1], "asctime": "Mon Jul 5 17:21:10 2010", "time": 1278346870.0}.
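Decoding the example response above with a standard JSON parser might look like this; the function name is an assumption, and the three keys follow the example given in the text.

```python
import json

def parse_time_response(body: str):
    """Decode the application/json time service response described
    above and return its three representations of time as a tuple:
    (timestamp, textual form, 9-part local time array)."""
    obj = json.loads(body)
    return obj["time"], obj["asctime"], obj["localtime"]
```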
  • the MIME type of the response is also application/json, and contains a dictionary object with the same 3 representations of time as the time service, but additionally includes an extra field which contains the send time from the user device.
  • the MIME type of the response is again application/json, and contains a dictionary object representing a summary of all channels, programme start times and programme names as before.
  • the format of this response is exactly the same as the response previously described.
  • the user device uses this data in precisely the same way as before.
  • An example network client may connect to a service based around interpreting events in the service in the same way as a web browser interprets content.
  • An example datatype tag for a broadcaster could be "text/html”, which the network client could interpret the octet sequence as HTML to be rendered according to HTML rendering rules.
  • Another example may be "base64/audio/wav”, which the network client could interpret the octet sequence as a base 64 encoded audio file in wav form, which would then be played at that point in time.
  • Another example may be "link", which the network client would interpret the octet sequence as an URL to be downloaded and interpreted as a web browser would in the usual way as soon as possible.
  • a network client could choose to precache a local copy of the link's content.
  • Such a network client could render textual data - such as subtitles, links, textual footnotes & comments, audio, video, flash, (and so on) synchronously with the broadcast.
  • Another network client may connect to a service based solely around audio, and play back audio events synchronously with the broadcast.
  • audio services could include audio description, director narratives, audio footnotes, or even 3D surround sound (such as ambisonics).
  • Another network client may be a games console or similar device rich in computer processing power.
  • Such a device may connect to an additional service description which is a moving 3D model of the programme as broadcast. (This may be captured via a system such as ORIGAMI, or i3DLive).
  • This network client may also act as a receiver system, and project the broadcast video as a texture onto the 3D model, providing the audience with a 3D video experience.
  • video data may also be provided as additional services, and synchronised using an appropriate additional service description. These secondary views could then be applied to the 3D model providing higher quality 3D video.
  • Such a platform could also be capable of creating a local stereoscopic 3D rendering for playback on a suitable stereoscopic display.
  • Other possible service descriptions may include information such as telemetry, motion vectors, olfactory or even gustatory data, enabling the control of the location of local devices, force feedback (games consoles), and the generation of timely smells or tastes.
  • the system described is applicable to analogue or digital, terrestrial, cable or satellite television.
  • the receiver described may be a terrestrial, satellite or cable digital television, set-top-box, or other separate audio or video decoder.
  • the network over which the additional service data is provided may be the Internet, a mobile phone data service or the like and may use HTTP, TCP, UDP, XMPP, IRC or another known protocol. Similarly, the request for time may use any of these protocols and may be sent by SMS or MMS.
  • the processing mechanism which is used to process the additional service data may be specific to a given network or may be according to a standard, and will depend upon the type of the additional service data. As already discussed, this may be text, audio, video or telemetry and may include data such as 3D model data, motion vectors, footnotes or alternative view points from camera angles being viewed and, indeed, any data which may be related to the broadcast programme and which may be asserted by a user device. Within the scope of the term "asserted" is included any delivery to a user, which can include control of other devices within the user's environment.
  • the broadcast time server may generate an appropriate time signal of a variety of forms.
  • the time signal may be a teletext time stamp from a digital video broadcast system and the time stamp may be from a program status information or from a time and date table.
  • timing may be provided from a now and next page on analog teletext.
  • the preferred embodiment is a digital implementation using the event information table.
  • the receiver of the audio-video broadcast may therefore be an analog or digital television, cable receiver, set-top box, or similar. It is noted, for the avoidance of doubt, that no change is required of the receiver for implementation of the invention. In particular, the receiver does not need to have any additional connection for providing timing information to the user device, because the user device retrieves timing information from the separate broadcast time server.
  • the user device can be a variety of different types of device including mobile telephones, laptops, personal digital assistants (PDAs), music players and similar user devices.
  • the user device is a portable device.
  • the connection from the user device to the broadcast time server is preferably over a network such as the internet or a mobile phone data service using any known protocol such as HTTP, TCP, UDP, XMPP or IRC.
  • the request for the broadcast time signal may be sent from the user device to the broadcast time server by SMS, MMS or similar protocol.
  • the additional data is considered additional in the sense that it supplements the content of the broadcast audio-video and is asserted by a presentation synchronised to the audio-video as presented at a receiver near the user of the user device.
  • the additional service data may be provided to the user device by download, in advance, or retrieved dynamically alongside receipt of the broadcast audio-video at the receiver.
  • the data may be provided as a file containing a list of timed events, having one or more time stamps relative to broadcast time, programme time or the beginning of a given sequence of audio-video data.
  • the additional data may be text, images, sound of many different types, and may also include data that causes the user device to instruct another device. A particular example of this would be receipt of additional audio at the user device which is then provided to a separate audio decoder.
  • a further example would be movement data which may be received and asserted to cause movement of a further user device such as any device for providing special effects.

Abstract

A system and method for providing data such as text, graphics or sound to a portable user device in synchronisation with a service broadcast and received by separate receivers includes a broadcast time server. The broadcast time server receives a broadcast service at substantially the same time as other broadcast receivers in the locality of a transmitter and provides time synchronisation signals by a separate network to user devices. A user device such as a mobile telephone can then obtain the synchronisation signal from the broadcast time server so that the additional data can be asserted in synchronisation with a television programme being viewed by the user on a television. The user device does not require any connection to the user's television receiver and so no modification to the broadcast transmitter or receiver is required.

Description

METHOD AND SYSTEM FOR ADDITIONAL SERVICE SYNCHRONISATION
BACKGROUND OF THE INVENTION

This invention relates to audience experience of broadcast audio-video content. In particular, it relates to the degree to which a viewer can adapt & personalise the traditional broadcast experience using content derived from a secondary data source. Television broadcasting has been around in commercial form since the 1930s, and whilst the technologies to deliver it have changed over time, the user experience remains largely the same - a single box with a single screen, and one or more speakers playing video and audio chosen by a broadcaster. Since television, and broadcasting in general, is fundamentally about storytelling - having a specific take on a story by a given storyteller - this is unsurprising. However, how we enjoy a story can depend largely on the senses and capabilities we have available. If I am blind, I need more audio cues. If I am deaf, I need more visual cues. If I speak a different language, I need the story in my language.
Over time, more channels have been added, causing a squeeze on broadcast spectrum, and some simple adaptations have been squeezed into an ever decreasing space. Example adaptations include subtitles, teletext, interactive programming (eg MHEG), signing and audio description (also known as narrative subtitles). Whilst these adaptations have often been termed "accessibility", they are generally useful adaptations and the term accessibility should be understood broadly to include any functionality to improve access to content. Someone can watch subtitles whilst another person is on the phone. Another may listen to the audio and audio description whilst using a second device that uses their eyes.
However, broadcast spectrum is limited. Better use of broadcast spectrum has been achieved by switching from analogue broadcast to digital broadcast. Balances between signal robustness and bit rate have been made, in an attempt to increase available bandwidth. Video and audio quality has been degraded, in order to fit in ever more channels, and ever more adaptation services. Despite all of these technological changes, there are both theoretical and practical limits as to how many services you can fit in the broadcast chain. There is in essence a broadcast bandwidth limit. Over the past 20-25 years however, newer mechanisms for delivery of content have been growing in popularity. Examples here include the Internet and mobile telephone networks (both voice and data).
Such networks and clients of such networks have become very small, very powerful devices, more than capable of playing back audio & video, and a plethora of user choice driven services for audio and video have come into existence. Furthermore, the capabilities of the audience have grown beyond simply being able to have one video and audio display. They may have a laptop, several mobile phones, and even a network connected audio system, including a 3D surround sound system all within a single room. Adapting the broadcast service to such systems is possible, but again hits the broadcast bandwidth limit, limiting the number of services which can expand to such a system.
There have been attempts to unify the broadcast and internet experience. For example, Web TV was a system that enabled a user to browse the web on their television. The Nintendo Wii provides access to the internet, and hence access to the plethora of audio/video services described earlier. However, these devices do not attempt to provide an integrated experience - they do not attempt to enhance and extend the broadcast experience, nor adapt the broadcast to the desires, limitations or capabilities of the user or their devices.
There are examples of prior art enabling the use of the internet to adapt the broadcast in some manner, they typically try to do so in a handful of different ways:
They modify the broadcast channel to include extra metadata as hints. However, this requires a modified broadcast chain - typically involving a change to playout systems at the broadcaster, or to the transmitter network, or to the audiences' receivers. This essentially stretches the broadcast bandwidth limitation further, but does not eradicate the issue, since future improvements require changes to a broadcast chain.
They seek to enhance the television experience as a single integrated unit. An example here is of a television or set top box with integrated network connection. These typically timestamp the Internet delivered secondary content (eg secondary audio track) with a presentation timestamp that correlates with a presentation timestamp in the broadcast content, allowing the two signals to be integrated in the receiver.
U.S. Pat. No. 7,634,798 by Watson discloses a system for enhanced interactive television. This does indeed take advantage of the multi-device nature of the modern home, allowing a user to participate in interactive experiences in time with a broadcast event. However, this system does not attempt to enhance the linear nature of storytelling, and in particular offers a process oriented approach based upon the execution of commands, for interactivity requiring the use of a common clock. As a result this approach is primarily oriented towards "events" such as quizzes, rather than general programmes. Finally, there is prior art aimed at enabling more detail regarding advertising to be displayed synchronously with a programme, which may be viewed as a form of content adaptation. However, such prior art requires the audience's device to communicate with a central server for additional information in order to "look up" what they were watching, and then in response to a query respond with information regarding the on screen products and advertisements at those times. Such systems are limited to advertising and do not seek to enhance the story telling environment. Such systems are aimed at providing timely links to web content, rather than adaptations of the existing broadcast. None of the known systems enable a user, or group of co-located users, to adapt a shared broadcast experience from an unmodified broadcast chain (broadcaster, transmitter, reception & display) using local devices such as laptops, mobiles and network connected media players.

SUMMARY OF THE INVENTION
We have also appreciated that it is preferable for a local device to be able to synchronise retrieved additional content without modification to the broadcast chain and, in particular, without modification to the broadcast receiver device.
In broad terms, the invention provides a system for synchronising retrieved additional service data with a broadcast service as received at a broadcast receiver by additionally receiving the broadcast service at a local time server and supplying a time or synchronisation signal from the local time server to an additional service device. The use of such an extra local time server allows the additional service device, such as a laptop, mobile or the like, to determine the time of program content as received at the local time server and, hence, within the tolerance needed, as received at the broadcast receiver, and thereby to synchronise assertion of additional services with the programme at the broadcast receiver. In essence, the additional device is thereby provided with a synchronisation signal relative to the programme as received at the broadcast receiver, but without requiring a connection between the additional device and the broadcast receiver. The invention is defined in the claims to which reference is now directed.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in more detail by way of example with reference to the drawings, in which:
Figure 1 shows an overview of a system embodying the invention;
Figure 2 is a schematic view of the core functional components of a system embodying the invention;
Figure 3 is a time line showing receipt of timed signals from a broadcast time server to an additional service device;
Figure 4 is a time line showing transmission of a test time signal from an additional service device to a broadcast time server and back again for calculation of transmission time delay; and
Figure 5 is a schematic view of the core functional components of a user device.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The embodiment of the invention seeks to provide the viewer the possibility to enhance their traditional television broadcast experience, by using the devices they own to play back additional adaptation content synchronously with the broadcast.
The audience's devices utilise a network connection to identify the programme being watched, the time into the programme and the broadcaster's concept of time as received at the user's receiver. This enables these additional devices to synchronise a local playout scheduler for playback of additional content. Thus additional adaptive content can be played back timed relative to a programme, or to a schedule. This additional content can be obtained by the audiences' devices via a number of means including being pulled from a network (Internet, mobile, etc), pushed from the network by the broadcaster (eg over XMPP, SMS, MMS), provided on pre-recorded media (CDROM, DVD, memory stick etc), or even taken from a previous broadcast by the broadcaster. Thus the system does not require continuous connection to a central server, and enables each device the user or users own to be synchronised to the broadcast, allowing unlimited adaptations and removing the broadcast bandwidth limitation, without change to a pre-existing broadcast infrastructure. Combining the adaptability of modern networks, such as the Internet, with broadcast enables broadcasters to send additional adaptation services over such modern networks, without the need for additional broadcast spectrum, essentially enabling them to break the broadcast bandwidth limit. Using the system, the broadcaster broadcasts the common core of the story being told using audio and video, but can present adaptations - for example subtitles, audio description, director's narrative - via another medium, such as a network based around user choice - such as the Internet or mobile.
This enables the audience to make use of all local devices to enhance their broadcast experience, for example with subtitles along with footnotes and additional text on a second screen; another audience member in the same room could enjoy audio description delivered over a personal device (eg mobile) on headphones, whilst the rest of the audience in the room listen to a 3D surround sound track delivered over IP, played back by another device.
An overview of a system embodying the invention is shown in Figure 1, though the invention may equally be embodied in a method or a computer program for executing a method. A broadcast transmitter 10 transmits television programmes (over air or by cable) which are received at user receivers 12 which may be set-top-boxes, Freeview receivers, television receivers, or other receivers of broadcast audio-video services. It is noted now for ease of understanding that there is a broadcast delay between the broadcast transmitter 10 and the receiver 12 of a user. The user also has an additional device, such as a mobile telephone, laptop or indeed any user device 14 capable of receiving data related to the programme being received by the receiver 12. Such data may be considered as "additional data" 16 in the sense that it supplements or is additional to the corresponding programme being broadcast and received at the receivers. The additional data may also be referred to as supplementary data, additional service data or the like, the important point being that the data in some way relates to the programme broadcast from the transmitter and received at the receivers. The user device may be referred to as an additional service device for ease of description. It is desired that this additional service device should assert retrieved additional data relating to the programme being viewed on the receiver 12 in synchronisation with that programme. However, the system does not have any connection between the additional service device 14 and the broadcast receiver 12 because it is preferred not to require any modification to a standard broadcast receiver 12. It is for this purpose that a broadcast time server 18 is provided which receives the broadcast signals and extracts from the signals broadcast timing information and programme time information which the additional service device can then receive.
The additional service device is thereby able to calculate, based on its own internal clock, when additional content should be asserted in relation to the programme as received at the local time server and therefore as received at the user's own receiver 12. As shown in Figure 1, the additional service may comprise text, audio-video, 3D information, telemetry and a wide variety of other possible data and can be retrieved by any suitable route such as via the Internet or mobile telephone network.
A more detailed explanation of the operation of the system embodying the invention will now be given with respect to the simplified diagram of the system shown in Figure 2.
The system shown in Figure 2 comprises a broadcast transmitter 10 and receiver 12 as already described and an additional service device 14, such as a mobile telephone, laptop or the like which can retrieve additional service data from an additional service data store 16. A broadcast time server 18 also receives the broadcast programme from the broadcast transmitter 10 over a broadcast channel 24. The additional service device may communicate with the broadcast time server over a communication link 22.
The broadcaster makes available a television broadcast service (TBS) which includes a programme service, a time clock, along with now and next information, which identifies the currently broadcast programme. This may be as simple as an analogue service with a clock on a teletext data service along with a now and next page. In the preferred embodiment the time is taken from a digital video broadcasting (DVB) time and date table of the programme status information, and the now and next information taken from the event information table of the programme status information. The time clock broadcast by the transmitter 10 and received at the broadcast time server 18 is indicative of the broadcast time clock as received at the receiver 12. The difference in time of receipt depends upon the difference between the time A of the transmission via the broadcast channel 20 in comparison to the time B of transmission of the broadcast channel 24. If the broadcast time server is arranged to be in the same general locality as the receiver 12, then time A = time B within the accuracy needed by the system. The broadcast time server 18 may therefore be local to a locality such as a particular city or geographical area of a country such that the time difference between time A and time B is of the order 10s of milliseconds and therefore imperceptible to the user. The locality may therefore be considered as a "transmitter region" in the sense that all receivers receiving the broadcast signal from a given transmitter are considered to be within the same locality. For a terrestrial transmitter, the locality may therefore be a certain distance from the terrestrial broadcast antenna. For satellite transmissions, the locality may be the footprint of a satellite which could cover an entire country. 
The concept of "locality" therefore relates to the fact that all receivers in the given locality receive the broadcast transmission at substantially the same time and any differences in time of arrival of the broadcast transmission are imperceptible to a user.
The now and next information mentioned above is a specific example of programme clock information and is the preferred choice in a DVB implementation of the system. The now and next information indicates the start of a given programme with reference to the broadcast time clock. In DVB, the now and next information is typically broadcast once per second giving an indication of the current programme and next programme to be broadcast. In addition, an extra now and next information signal is transmitted when a programme changes. This junction between different now and next signals may be used as the indicator of a new programme relative to the broadcast time clock. The broadcast time server 18 thus receives a broadcast time clock within an allowable tolerance and also the timing of a programme being broadcast relative to the broadcast time clock.
The broadcaster makes the broadcast time server 18 (BTS) available over a communications network 22. This time server provides the broadcaster concept of time, as received by receivers of the broadcast, rather than as generated by a playout system prior to transmission, and thus represents actual broadcast time, irrespective of delays integral to a playout system which may change over time.
The broadcast time server also provides a current programme server (CPS) available over the communications network 22. This receives the now and next information, for example from the event information table of DVB's programme status information, to identify the current broadcast programme, and makes this available as a network queryable service. The broadcast time server also provides a time into current programme server (TICPS) available over the communications network. This receives the now and next information and monitors it for changes. When the now and next information changes, the system takes the junction point as the start time for the new programme. The broadcast time server (BTS) thereby provides the current programme and the time into the current programme, with which the additional service can synchronise.
The broadcast time server may derive the time signal from the audio-video transmission in a variety of ways. One approach may be based on timing of subtitles for a programme. Subtitles are timed accurately relative to a programme and so extracting the subtitles and program clock reference from the subtitles gives a direct indication of time into the current programme, and this may be used as the derived broadcast time signal.
The broadcaster also makes available an additional services server system 16 (ASSS) over a communications network 26. These services are made available as declarative timed schedules. The simplest of these comprises a list of timestamps and textual information. More complex schedules comprise a list of triples, where each triple is a timestamp, a datatype tag and an arbitrary octet sequence. These timestamps may be relative to the current programme time, or relative to the broadcaster's concept of time.
Clearly, the system requires the broadcaster to ensure that the now and next information as broadcast is synchronised with the programme. This is practical due to automated playout systems, and is essentially a configuration of the broadcaster's existing equipment, rather than a change to that equipment. The audience has a standard receiver 12 which receives the television broadcast service and displays it. An example receiver is a traditional television; another would be a digital video broadcasting receiver.
The audience also has a secondary device 14 having a network client. The audience may have a plurality of such devices. Each network client is synchronised with the broadcast chain, and hence with the audience's receiver, as will now be described.
First, the user configures the device for a particular broadcast service. The configuring of a device for a particular broadcast service may be by any convenient user interface. A typical approach on a smart mobile telephone would be to access a mobile compatible web page from which a given broadcast service may be selected by a user. The additional service device may then immediately commence retrieval of additional service data and cache this data in readiness for presentation to the user at the appropriate time relative to the broadcast being viewed on the television receiver 12. The additional service data may be retrieved from a remote server as shown over a network, but equally may be retrieved from a local store or from any other storage arrangement such as CD-ROM, memory and so on. A further possibility is that a certain amount of the data is provided in the user device in advance, perhaps speculatively fetched or loaded, with the remainder of the data downloaded when the additional service is requested. The device also contacts the time server 18 and synchronises a client local application clock (CLAC). The synchronisation of the local clock of the additional service device 14 with the time server 18 can be achieved in a number of ways. A possible approach is simply to receive a number of time samples from the time server to reproduce a clock locally based on calculating the skew and drift relative to the system clock of the client additional service device 14. However, such an approach would omit any calculation of the network latency of the network 22 via which the time signals are provided. As shown, a time delta D may exist between the broadcast time server and the additional service device.
The preferred approach to calculate this latency is for the additional service device to transmit a signal to the broadcast time server and for the broadcast time server to return a signal, so that the additional service device can calculate the time delta D from the transmitted and received signals.
If greater accuracy is needed, a time server may choose to allow the client to synchronise using a known network time synchronisation algorithm such as Marzullo's algorithm (as used by the Network Time Protocol (RFC 1305), Simple Network Time Protocol (RFC 4330), etc.).
The calculation of the synchronisation will now be described with reference to Figures 3 and 4. As shown in Figure 3, when requested by the additional service device, the broadcast time server can repeatedly transmit a clock signal in the time server domain (TS1, TS2...TSN). This is received after a latency time D at the additional service device. The time signals are received at corresponding times in a local clock time domain (LC1, LC2...LCN). A broad view of time at the local device in the broadcast time domain may then be calculated so that the connection does not need to be permanently maintained. This is done by calculating the relative drift of the two clocks by dividing the difference between TSN and TS1 by the difference between LCN and LC1 to derive the relative rates of the clocks. Multiplying this relative rate by the difference between the local clock at any given point in time and the local clock at a start point provides a broad view of time in the broadcast time server domain, effectively removing any different rates of the two clocks. However, this does not take account of the latency delta D, and for this purpose the client device can determine this delta D by transmission and reception of a test signal, as shown in Figure 4. As shown in Figure 4, the preferred test signal is for the local device simply to transmit a time stamp at a given time, which may be in the broad view of time domain calculated above, here shown as BVT1. The time stamp is received at the broadcast time server and immediately transmitted back to the local device, along with a time indicator in the time server time domain, here shown as TS1. This signal, containing BVT1 and TS1, is received at the local device at a second time BVT2. The additional service device can then calculate the time delta D by the simple subtraction BVT1 - TS1.
Alternatively, if it is assumed that the uplink and downlink times are the same, then the additional service device can calculate BVT2 - BVT1, which gives twice the time delta, 2D.
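This latency estimate can be sketched as follows (a minimal illustration; the function name is not part of the specification, and symmetrical uplink and downlink times are assumed):

```python
def one_way_delta(bvt1: float, bvt2: float) -> float:
    """Estimate the one-way latency D between the user device and the
    broadcast time server: the test signal is sent at BVT1 and its echo
    arrives at BVT2, so BVT2 - BVT1 is the round trip time, 2D."""
    return (bvt2 - bvt1) / 2.0

# A 100 ms round trip implies a one-way delta D of 50 ms.
delta = one_way_delta(1249038300.0, 1249038300.1)
```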
By the various approaches described above, the broadcast time server provides to the additional service device the ability to derive the broadcast time clock as received at the receiver 12 and also start and stop times of programmes relative to the broadcast clock.
The synchronisation of the local clock of the user device with the time server as described above may be performed just once when a user requests additional data to be presented at their user device, and thereafter the local clock in the user device is sufficiently accurate that content can be presented relative to the local clock. If a user requests the additional content just once, though, and then continues to use their device all day, perhaps using data related to a given television channel, it is possible for the local clock to become shifted relative to the broadcast time server. Accordingly, the broadcast time server may periodically push the synchronisation signals described above, or alternatively the user device may periodically pull the time synchronisation signals.
The device's network client can then query the current programme server for the current programme, and use this information to choose which additional service to use. This may be one stored locally - for example on a DVD, CD-ROM, hard drive, or similar storage device - or from a network location, such as an HTTP download, FTP download, or by requesting the additional service description (ASD) from a mobile system, for example sending an SMS to a server and receiving an MMS reply with the ASD as the payload. The network client can then use the additional service description to schedule events to occur at particular times relative to the network client's local application clock (CLAC).
When an event occurs, the network client interprets the message according to rules appropriate for the specific additional service description. In the case of a basic additional service description, which consists of a list of timestamps and textual data, the event is defined as simply the textual data. The interpretation of such data is to simply display the data. The textual data display system may choose to detect HTML formatted text, and render such fragments according to HTML rendering rules.
In the case of an additional service description in which each triple is a timestamp, a datatype tag and an arbitrary octet sequence, the event is defined to be the contents of the octet sequence, interpreted according to rules defined relative to the datatype tag. The rules for the datatype tag are defined by the broadcaster.
Figure 5 is a schematic view of the core functional components of a user device, along with its external dependencies on the broadcast time server and additional service data store. The broadcast time server provides three services on a given IP address, with each service on a separate port.
• An overall broadcast time service is provided on port 1800
• An echoing time service is provided on port 1801
• A programme time summary service is provided on port 1700.
The user client uses these, along with the data from the additional service data store, to control one or more output devices in a timely fashion. The user client's broadcast time synchronisation subsystem 30 initiates a TCP connection to port 1800 on the broadcast time server 18. The time server responds by sending an octet stream to the client. The octet stream forms a string, terminated by the connection close. The octet sequence forms a string of textual characters representing a floating point number. The user client may then parse this string to turn it into an internal representation of a floating point number. In the preferred embodiment, the floating point number relates to the number of seconds since the beginning of what is known as the Unix Epoch - or more specifically the precise length of time, according to the broadcast time server, that has elapsed, in seconds, since 00:00:00 on the 1st of January 1970. This time is the remote baseline time. The user client's broadcast time synchronisation subsystem 30 then reads the user device's system clock, and that time is denoted the local baseline time. For example, the octet stream "1249038001.709007" represents the time Friday Jul 31 11:00:01 2009 and 0.709007 seconds.
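This baseline step can be sketched as follows (a minimal illustration using the Python standard library; the function name is hypothetical, and the wire format is simply the textual floating point number described above):

```python
import time

def read_baseline(octet_stream: bytes):
    """Parse the octet stream returned by the broadcast time service
    (a textual floating point number of seconds since the Unix Epoch)
    and pair it with the local system clock, yielding the
    (local baseline time, remote baseline time) pair described above."""
    remote_baseline = float(octet_stream.decode("ascii").strip())
    local_baseline = time.time()
    return local_baseline, remote_baseline

local_baseline, remote_baseline = read_baseline(b"1249038001.709007")
```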
After a brief delay (typically a few seconds), the broadcast time synchronisation subsystem 30 can initiate a second TCP connection to port 1800 on the broadcast time server. Again, it receives a time back. This time is denoted the remote current time. The system clock is read, and this is denoted the local current time.
The remote elapsed time is calculated by subtracting the remote baseline time from the remote current time. The local elapsed time is calculated by subtracting the local baseline time from the local current time. A ratio denoting the skew between the two clocks can be calculated by dividing the remote elapsed time by the local elapsed time. This enables a calculation to be performed that transforms a local time derived from the system clock into the remote time. That is, this allows the user device to map from local system clock time to the broadcast view of time as received by a broadcast receiver. To do this requires the triplet of information (local baseline time, remote baseline time, clock ratio).
This triplet of information is used to initialise the application clock 32. The user application clock provides two services to other subsystems. One is the ability to provide the current time according to the broadcast view of time; the other is to sleep for a given number of seconds, including fractional seconds, according to the broadcast view of time. Thus, given a time "now" taken from the system clock, a first order approximation of the broadcast time can be derived from the calculation: remote baseline time + (now - local baseline time) * ratio. In order to provide the sleep service, the application clock simply divides the requested number of seconds to sleep by the ratio. This transforms broadcast elapsed time into local elapsed time. The sleep service may then sleep for this time period, waking after the local elapsed time in sync with the remote elapsed time.
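The application clock might be sketched as follows (a simplified illustration; the class and method names are not from the specification, and the ratio is taken here as remote elapsed time over local elapsed time, consistent with the two formulas above):

```python
import time

class ApplicationClock:
    """Client local application clock (CLAC): maps the system clock onto
    the broadcast view of time using the triplet
    (local baseline time, remote baseline time, clock ratio)."""

    def __init__(self, local_baseline, remote_baseline, ratio):
        self.local_baseline = local_baseline
        self.remote_baseline = remote_baseline
        self.ratio = ratio  # remote elapsed time / local elapsed time

    def now(self, local_now=None):
        """First order approximation of broadcast time:
        remote baseline + (now - local baseline) * ratio."""
        if local_now is None:
            local_now = time.time()
        return self.remote_baseline + (local_now - self.local_baseline) * self.ratio

    def sleep(self, broadcast_seconds):
        """Sleep for a period expressed in broadcast time, dividing by
        the ratio to transform it into local elapsed time."""
        time.sleep(broadcast_seconds / self.ratio)

# A local clock running slightly fast: 10 local seconds map to ~9.99 broadcast seconds.
clock = ApplicationClock(local_baseline=1000.0, remote_baseline=2000.0, ratio=0.999)
broadcast_now = clock.now(local_now=1010.0)
```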
If the granularity of event data is significantly greater than the likely network delay, the network delay measurement can be skipped. The preferred embodiment measures the network delay as follows. The network delay measurement subsystem 34 retrieves the broadcast view of time from the application clock. This time is denoted the send time, and is a floating point number representing the number of seconds elapsed since 1st January 1970. It then initiates a TCP connection to the broadcast time server on port 1801. It then sends an octet sequence string representation of the send time. This string is terminated by the addition of a network end of line - that is "\r\n", or specifically the raw octet values 13 and 10 respectively. For example, the time Friday Jul 31 11:05:00 2009 relates to 1249038300.000000 seconds since 1st January 1970. This would be encoded over the network as a message of the form "1249038300.000000\r\n", or more precisely the raw octet values 49, 50, 52, 57, 48, 51, 56, 51, 48, 48, 46, 48, 48, 48, 48, 48, 48, 13, 10.
The broadcast time server treats the network end of line "\r\n" as a message terminator, and throws the network end of line away. It then appends a space to the message, and then appends the current time encoded as an octet sequence which is again a string representation of a floating point value representing the number of seconds elapsed since 1st January 1970. The broadcast time server then terminates the TCP connection to signify that it has finished sending its response message. For example, if the network delay is 50ms, then the time on the time server will be 50ms ahead of the user device's application clock. Additionally, the message from the user device network delay measurement subsystem will take an additional 50ms to reach the server. Thus when the user device sends its message encoded with the timestamp 1249038300.000000, the broadcast time server's clock would be 1249038300.050000. When the message reaches the server, the broadcast time server's clock would be 1249038300.100000. Thus in this example, the response message sent by the broadcast time server to the user device would be "1249038300.000000 1249038300.100000".
The user client device can then parse these two timestamps to give the sent time and the remote time. The user client also retrieves an expected time from the application clock. This time should match the remote time within a certain tolerance level. The tolerance level used by the preferred embodiment is 10ms. If the difference between the remote time and the expected time is not within tolerance, the user device restarts the clock calibration process.
If the difference between the remote time and the expected time is within tolerance, the round trip network delay is then calculated by subtracting the sent time from the remote time. The one way network delay, and hence the error in the local application clock, can then be determined by dividing the round trip network delay by two. This delay is then used to calibrate the user device application clock. The user device application clock simply uses this network delay by adding it to the times it currently calculates.
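The delay measurement can be sketched with the sample values from the text (the function names are illustrative, and no real socket is opened here):

```python
def encode_send_time(send_time: float) -> bytes:
    """Encode the send time as the wire message, e.g. 1249038300.0
    becomes b"1249038300.000000\r\n"."""
    return ("%.6f" % send_time).encode("ascii") + b"\r\n"

def one_way_network_delay(response: bytes, expected_time: float,
                          tolerance: float = 0.010):
    """Parse the server's "<sent time> <remote time>" reply. If the
    remote time differs from the expected time by more than the
    tolerance (10 ms in the preferred embodiment), return None to
    signal that calibration should restart; otherwise halve the round
    trip (remote time minus sent time) to give the one way delay."""
    sent_text, remote_text = response.decode("ascii").split()
    sent, remote = float(sent_text), float(remote_text)
    if abs(remote - expected_time) > tolerance:
        return None
    return (remote - sent) / 2.0

message = encode_send_time(1249038300.0)
delay = one_way_network_delay(b"1249038300.000000 1249038300.100000",
                              expected_time=1249038300.1)
```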
At this point the application clock is synchronised with the broadcast view of time as received by a broadcast receiver.
The user is then required to inform the user device what channel they are watching. In particular, the user must do this because in the preferred embodiment the user's set top box and broadcast chain are unmodified. However, since this is a signal to the user device, other embodiments may take this from another system that is able to communicate with enhanced set top boxes that can inform external devices what channel the set top box is tuned to. Similarly the channel could be determined via audio watermarking or by any other method. In the preferred embodiment, this is achieved by the user typing the channel name on a keyboard, though clearly a graphical menu system or voice recognition system could be used to achieve the same goal. The programme time client 36 then creates a TCP connection to port 1700 on the broadcast time server to connect to the programme time summary service. Over this connection the programme time client sends the octet string "summary\r\n", that is the single word "summary" followed by a network end of line sequence. The server then sends a response message, which is an octet sequence terminated by the connection being closed.
The programme time client 36 can then parse this response as follows. The expected format of the response string is ( "OK" | "ERROR" ) " " <command tag> " " <response> "\r" "\n". The command tag is the command the programme time client sent to the server. The response is the required message. This can be parsed by searching the string for the first space character, and taking the response code - OK or ERROR - from the characters in the string preceding that first space. The command tag can be found by searching for the second space character in the response. The command can then be extracted from between the first and second space characters in the response. The actual command result can be extracted from the response string by extracting the string after the second space character, up to and excluding the two network end of line characters at the end of the string. The response is a JSON encoded message - irrespective of success or error. JavaScript Object Notation - JSON - is an encoding format commonly used by internet clients, and is defined in RFC 4627, as published by the Internet Engineering Task Force (IETF). The standard can be found at this URL: http://www.ietf.org/rfc/rfc4627.txt . This is then decoded by the client using any suitable JSON parser. In the case of the command "summary", the response represents a "dictionary" object, that is an object that maps keys to values. In particular the keys are channel names and the values are arrays. These arrays are pairs of values - the first is a timestamp at which the programme started, the second is the programme name. The programme start time has been derived by the broadcast time server from the broadcast chain. The preferred embodiment, which uses DVB-T, performs this by monitoring the now and next event information table for changes, and using these as the programme start time. This could also be derived from subtitle junction changes.
It could also be enhanced by the broadcaster inserting a marker indicating the start time of the programme into the broadcast transmission - for example using the related content table descriptor in DVB. It could also be achieved by inserting a message into the teletext data channel. For example in the response message, the key "BBC ONE" could map to the value [1278346448.0, "The Weakest Link"]. A sample response for all BBC freeview channels in the northwest of the UK is this: {"bbc one": [1278346448.0, "The Weakest Link"], "bbc parliament": [1278340200.0, "Live House of Commons"], "bbc radio 1": [1278342000.0, "Scott Mills"], "bbc radio 2": [1278345900.0, "Simon Mayo"], "bbc three": [1278304200.0, "This Is BBC Three"], "bbc radio 3": [1278345610.0, "In Tune"], "bbc red button": [1278327600.0, "BBC Red Button"], "301": [1278306000.0, "Weekend Wogan: Highlights"], "bbc r5sx": [1278344700.0, "Cricket"], "bbc two": [1278346554.0, "Escape to the Country"], "bbc world sv.": [1278345959.0, "Europe Today"], "cbeebies": [1278346632.0, "ZingZillas"], "cbbc channel": [1278346613.0, "ROY"], "bbc r1x": [1278342000.0, "Westwood"], "bbc four": [1278304200.0, "This Is BBC Four"], "bbc radio 4": [1278345610.0, "PM"], "bbc radio 7": [null, "The Small, Intricate Life of Gerald C..."], "bbc 6 music": [1278342000.0, "Andrew Collins"], "bbc news": [null, "BBC News at Five O'clock"], "bbc asian net.": [1278338400.0, "Noreen Khan"], "bbc r5l": [1278342000.0, "5 live Drive"]}
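The parsing steps above can be sketched as follows (a simplified illustration using Python's standard json module; the function name is not from the specification):

```python
import json

def parse_summary_response(raw: bytes):
    """Split a "(OK|ERROR) <command tag> <result>\r\n" response into its
    response code, command tag and JSON decoded result, following the
    first-space / second-space parsing rule described in the text."""
    text = raw.decode("utf-8")
    status, rest = text.split(" ", 1)      # up to the first space
    command, result = rest.split(" ", 1)   # up to the second space
    return status, command, json.loads(result.rstrip("\r\n"))

raw = b'OK summary {"bbc one": [1278346448.0, "The Weakest Link"]}\r\n'
status, command, summary = parse_summary_response(raw)
start_time, programme_name = summary["bbc one"]
```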
The programme start time information is then passed on to the Timed Event Scheduler 38. The programme start time and programme name are passed on to the events retriever subsystem 40. In one embodiment of the system, the events retriever subsystem 40 uses the programme name to read a given scheduled events file from a file system on the user device. In the preferred embodiment the events retriever subsystem is preconfigured to connect to an additional services server system and makes a request for the scheduled events file for the given programme name being broadcast at the given time. In particular, in the preferred embodiment, the server is an HTTP (web) server, which responds to POST requests on a preconfigured URL containing the programme name and programme start time. The additional services server system then responds with the scheduled events file. In either case of the file being retrieved from a store on the user device or from the network, this file can then be passed on to the Timed Event Scheduler subsystem. Finally, the timed event scheduler subsystem takes the scheduled events file and parses it. In the preferred embodiment, the scheduled events file is a JSON format file containing one object representing a schedule. The schedule object is an array of event objects. Each event object is an array consisting of three parts - a timestamp, an event type and event data. In the case of the events file being timed against broadcast time, this schedule object can be used "as is" to drive the timed event scheduler 38.
However, in the majority of cases, the timestamps will be relative to programme time. In that case, the timed event scheduler has to take the programme start time - as provided by the programme time server and add this to each of the timestamps in the schedule object - mapping programme time to broadcast time. Now that the timestamps are relative to broadcast time, the schedule object can then be used to drive the timed event scheduler. The timed event scheduler then consists of two key portions - a timed scheduler, and an event handler. The timed scheduler uses the application clock to drive a local scheduler. This works through the schedule object - that is the array of events in order, looking at the timestamps. For each timestamp, it looks at the current (broadcast) time as retrieved from the application clock, and subtracts that from the next event's timestamp. It then uses the sleep service from the application clock to sleep for the given (broadcast) time period (or 0s if the difference is less than zero). Once the scheduler has finished sleeping for the given period, the time for the scheduled event has been reached. The scheduler then looks at the event type to determine how to handle the event data. In particular, the scheduler can then send the event data to different output subsystems.
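The mapping and scheduling loop just described might be sketched as follows (illustrative names; the now and sleep callables stand in for the application clock's two services):

```python
def to_broadcast_time(schedule, programme_start):
    """Map programme-relative timestamps to broadcast time by adding the
    programme start time to each [timestamp, event type, event data]."""
    return [[ts + programme_start, etype, data] for ts, etype, data in schedule]

def run_schedule(schedule, now, sleep, handle_event):
    """Timed scheduler: for each event in order, sleep until its
    (broadcast) timestamp is reached, then hand it to the event handler."""
    for timestamp, event_type, event_data in schedule:
        delay = timestamp - now()
        sleep(max(0.0, delay))  # 0s if the event time has already passed
        handle_event(event_type, event_data)

# Illustrative run with stub clock and sleep functions.
events = [[5.0, "text", "Hello"], [12.0, "text", "World"]]
schedule = to_broadcast_time(events, programme_start=1278346448.0)
seen = []
run_schedule(schedule, now=lambda: 1278346460.0,
             sleep=lambda s: None,
             handle_event=lambda t, d: seen.append(d))
```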
There will be one or more such output subsystems 42. These subsystems can be audio systems, text display systems, video display systems or even systems that control physical devices via interface boards such as an Arduino. Thus the system can use the event data to produce a variety of synchronised responses in addition to the main broadcast channel.
In the preferred embodiment, the event type "text" causes the text to be sent to a text display output system; the event type "audio" causes the data to be passed to an audio output subsystem and played back; the event type "arduino" causes the data to be sent to an Arduino output subsystem which passes the event data unmodified over a serial port to an attached Arduino device, which may spin a motor or flash a light, etc. in response to the command. Given that an Arduino is simply a microcontroller based electronic breakout box, this allows the system to use the event data to control anything from robots to motion systems, lighting systems and so on. This means that the broadcast content can be synchronously augmented by this system by anything within the imagination of the programme maker.
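Dispatch by event type could be sketched as follows (hypothetical structure and names; the specification does not prescribe this shape):

```python
def make_dispatcher(output_subsystems):
    """Return an event handler that routes event data to the output
    subsystem registered for its event type ("text", "audio",
    "arduino", ...); events with no registered subsystem are ignored."""
    def dispatch(event_type, event_data):
        handler = output_subsystems.get(event_type)
        if handler is not None:
            handler(event_data)
    return dispatch

displayed = []
dispatch = make_dispatcher({"text": displayed.append})
dispatch("text", "A synchronised caption")
dispatch("arduino", b"\x01")  # no Arduino subsystem registered here; ignored
```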
Finally, as well as making the services available as TCP based services, the broadcast time server also provides the same services to a user client over HTTP. In this case, the user device makes HTTP requests for each of the three services described above. The overall broadcast time service is provided via a URL of the form http://broadcasttimeserver:8082/dvb-bridge?command=time . The MIME type of the response is application/json, containing a dictionary object with three representations of time - the timestamp as before, an English textual form, and a 9 part array of year, month, day, hours, minutes, seconds, weekday (0..6, Monday is 0), day in year, and whether daylight saving is active. An example response is: {"localtime": [2010, 7, 5, 17, 21, 10, 0, 186, 1], "asctime": "Mon Jul 5 17:21:10 2010", "time": 1278346870.0} . Aside from the initial parsing step, the user device uses this data in precisely the same way as before. The echoing time service is provided via a URL of the form http://broadcasttimeserver:8082/dvb-bridge?command=summary . The MIME type of the response is also application/json, and contains a dictionary object with the same three representations of time as the time service, but additionally includes an extra field which contains the send time from the user device. An example response is: {"echo": "1278346870.0", "localtime": [2010, 7, 5, 17, 21, 15, 0, 186, 1], "asctime": "Mon Jul 5 17:21:15 2010", "time": 1278346875.0} . Aside from the initial parsing step, the user device uses this data in precisely the same way as before.
The programme time summary service is provided via a URL of the form http://broadcasttimeserver:8082/dvb-bridge?command=summary . The MIME type of the response is again application/json, and contains a dictionary object representing a summary of all channels, programme start times and programme names as before. The format of this response is exactly the same as the response previously described. The user device uses this data in precisely the same way as before.
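Decoding the JSON body of the HTTP time service can be sketched as follows (using the sample body from the text; the function name is illustrative):

```python
import json

def parse_time_service_body(body: str):
    """Decode the application/json body of the HTTP broadcast time
    service, returning its three representations of time: the
    timestamp, the English textual form and the 9 part localtime array."""
    obj = json.loads(body)
    return obj["time"], obj["asctime"], obj["localtime"]

body = ('{"localtime": [2010, 7, 5, 17, 21, 10, 0, 186, 1], '
        '"asctime": "Mon Jul 5 17:21:10 2010", "time": 1278346870.0}')
timestamp, textual, localtime = parse_time_service_body(body)
```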
The provision of the user service over HTTP rather than a basic TCP service enables clients to be written for a browser in exactly the same way as described above. An example network client may connect to a service based around interpreting events in the service in the same way as a web browser interprets content. An example datatype tag for a broadcaster could be "text/html", for which the network client could interpret the octet sequence as HTML to be rendered according to HTML rendering rules. Another example may be "base64/audio/wav", for which the network client could interpret the octet sequence as a base 64 encoded audio file in wav form, which would then be played at that point in time. Another example may be "link", for which the network client would interpret the octet sequence as a URL to be downloaded and interpreted, as a web browser would, in the usual way as soon as possible. A network client could choose to precache a local copy of the link's content. Thus such a network client could render textual data - such as subtitles, links, textual footnotes and comments - audio, video, flash, and so on, synchronously with the broadcast.
Another network client may connect to a service based solely around audio, and play back audio event synchronous with the broadcast. Such audio services could include audio description, director narratives, audio footnotes, or even 3D surround sound (such as ambisonics).
Another network client may be a games console or similar device rich in computer processing power. Such a device may connect to an additional service description which is a moving 3D model of the programme as broadcast. (This may be captured via a system such as ORIGAMI, or i3DLive). This network client may also act as a receiver system, and project the broadcast video as a texture onto the 3D model, providing the audience with a 3D video experience. Furthermore, given such models are often created using secondary video cameras from alternative angles, such video data may also be provided as additional services, and synchronised using an appropriate additional service description. These secondary views could then be applied to the 3D model providing higher quality 3D video. Such a platform could also be capable of creating a local stereoscopic 3D rendering for playback on a suitable stereoscopic display. Other possible service descriptions may include information such as telemetry, motion vectors, olfactory or even gustatory data, enabling the control of the location of local devices, force feedback (games consoles), and the generation of timely smells or tastes. The system described is applicable to analogue or digital terrestrial, cable or satellite television. The receiver described may be a terrestrial, satellite or cable digital television, set-top-box, or other separate audio or video decoder. The network over which the additional service data is provided may be the Internet, a mobile phone data service or the like, and may use HTTP, TCP, UDP, XMPP, IRC or another known protocol. Similarly, the request for time may use any of these protocols and may be sent by SMS or MMS.
The processing mechanism which is used to process the additional service data may be specific to a network or may be according to a standard, and will depend upon the type of the additional service data. As already discussed, this may be text, audio, video or telemetry, and may include data such as 3D model data, motion vectors, footnotes or alternative viewpoints from other camera angles and, indeed, any data which may be related to the broadcast programme and which may be asserted by a user device. Within the scope of the term "asserted" is included any delivery to a user, which can include control of other devices within the user's environment.
The broadcast time server may generate an appropriate time signal of a variety of forms. The time signal may be a teletext time stamp from a digital video broadcast system, and the time stamp may be from programme status information or from a time and date table. In an analogue implementation, timing may be provided from a now-and-next page on analogue teletext. Preferably, the embodiment is a digital implementation using the event information table.
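By way of illustration only (this sketch forms no part of the claimed method), the broadcast clock carried in a DVB time and date table encodes UTC as a 16-bit Modified Julian Date followed by six BCD digits for hours, minutes and seconds; the MJD arithmetic below follows the published conversion in ETSI EN 300 468 Annex C:

```python
def mjd_to_date(mjd: int):
    """Convert a Modified Julian Date to (year, month, day), using the
    conversion algorithm given in ETSI EN 300 468 Annex C."""
    yp = int((mjd - 15078.2) / 365.25)
    mp = int((mjd - 14956.1 - int(yp * 365.25)) / 30.6001)
    day = mjd - 14956 - int(yp * 365.25) - int(mp * 30.6001)
    k = 1 if mp in (14, 15) else 0           # month wrap-around correction
    return 1900 + yp + k, mp - 1 - k * 12, day

def bcd_to_hms(hh: int, mm: int, ss: int):
    """Decode the three BCD bytes (hh mm ss) that follow the MJD field."""
    dec = lambda b: 10 * (b >> 4) + (b & 0x0F)
    return dec(hh), dec(mm), dec(ss)
```

A broadcast time server parsing such a table could combine the decoded date and time into a single broadcast-clock value for delivery to user devices.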
The receiver of the audio-video broadcast may therefore be an analogue or digital television, cable receiver, set-top box, or similar. It is noted, for the avoidance of doubt, that no change is required of the receiver for implementation of the invention. In particular, the receiver does not need to have any additional connection for providing timing information to the user device, because the user device retrieves timing information from the separate broadcast time server.
The user device can be any of a variety of different types of device, including mobile telephones, laptops, personal digital assistants (PDAs), music players and similar user devices. Typically, the user device is a portable device. The connection from the user device to the broadcast time server is preferably over a network such as the Internet or a mobile phone data service using any known protocol such as HTTP, TCP, UDP, XMPP or IRC. Similarly, the request for the broadcast time signal may be sent from the user device to the broadcast time server by SMS, MMS or a similar protocol. The additional data is considered additional in the sense that it supplements the content of the broadcast audio-video and is asserted by a presentation synchronised to the audio-video as presented at a receiver near the user of the user device. The additional service data may be provided to the user device by download in advance, or retrieved dynamically alongside receipt of the broadcast audio-video at the receiver. The data may be provided as a file containing a list of timed events, each having one or more time stamps relative to broadcast time, programme time or the beginning of a given sequence of audio-video data. As already noted, the additional data may be text, images or sound of many different types, and may also include data that causes the user device to instruct another device. A particular example of this would be receipt of additional audio at the user device which is then provided to a separate audio decoder. A further example would be movement data which may be received and asserted to cause movement of a further user device, such as any device for providing special effects.
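As a concrete, non-normative sketch of the behaviour described above, a user device might estimate the offset between its local clock and broadcast time from one request round trip to the broadcast time server (assuming, NTP-style, that the server produced its time stamp at the midpoint of the round trip), then assert each timed event from the events file once the estimated broadcast time reaches that event's time stamp. All names below are illustrative:

```python
def estimate_offset(server_time: float, t_sent: float, t_recv: float) -> float:
    """Estimate broadcast time minus local time from one round trip,
    assuming the server time stamp was taken at the round-trip midpoint."""
    return server_time - (t_sent + t_recv) / 2.0

class EventScheduler:
    """Assert timed events in synchronisation with the broadcast.

    `events` is a list of (broadcast_time, payload) pairs, as might be
    loaded from a file containing a list of timed events."""

    def __init__(self, events, offset: float):
        self.queue = sorted(events)   # earliest assertion time first
        self.offset = offset

    def due(self, local_now: float):
        """Return every payload whose assertion time has been reached."""
        broadcast_now = local_now + self.offset
        ready = []
        while self.queue and self.queue[0][0] <= broadcast_now:
            ready.append(self.queue.pop(0)[1])
        return ready
```

A device would poll `due()` with its local clock (or sleep until the next queued time stamp) and pass each returned payload to the appropriate presentation mechanism: text display, audio decoder, or control of another local device.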

Claims

1. A method of providing data to a user device for assertion synchronised to a broadcast transmission received by a receiver in the locality of the user device, comprising:
broadcasting an audio-video programme to a plurality of receivers in a given locality;
receiving the broadcast audio-video transmission at a broadcast time server in the given locality;
deriving a time signal at the broadcast time server from the transmission;
providing the time signal to the user device;
retrieving the data for the user device, the data including time data indicating the time for assertion relative to the time signal provided by the broadcast time server; and
asserting the data at the user device in synchronisation with the received broadcast transmission using the time signal from the broadcast time server.
2. A method of retrieving data at a user device for assertion synchronised to a broadcast transmission received by a receiver in the locality of the user device, comprising:
retrieving a time signal from a broadcast time server in the given locality, the time signal being derived from the broadcast transmission as received at the broadcast time server;
retrieving the data for the user device, the data including time data indicating the time for assertion relative to the time signal provided by the broadcast time server; and
asserting the data at the user device in synchronisation with the received broadcast transmission using the time signal from the broadcast time server.
3. A method according to claim 1 or 2, wherein the time signal comprises a broadcast clock inserted in the broadcast transmission.
4. A method according to claim 3, wherein the time signal comprises a programme time indicating the time of presentation of a programme relative to the broadcast clock.
5. A method according to any preceding claim, further comprising determining a transmission delay from the broadcast time server to the user device and asserting the data at the user device in synchronisation with the received broadcast transmission using the time signal and the transmission delay.
6. A method according to claim 5, wherein determining the transmission delay comprises transmitting a test signal between the user device and the broadcast time server and determining the transmission delay from the time taken for the test signal to be received.
7. A method according to any preceding claim, wherein the locality is a geographical area of a size such that any difference in time of reception of the broadcast audio-video program at the receiver and at the broadcast time server is imperceptible to a user.
8. A method according to claim 7, wherein multiple broadcast time servers are provided, and wherein the step of providing the time signal to the user device comprises providing the time signal from the broadcast time server closest to the receiver in the locality of the user device.
9. A method according to any preceding claim, wherein deriving the time signal comprises extracting at least one of a broadcast time, a programme time or a time into current programme signal from the audio-video programme.
10. A method according to any preceding claim, wherein retrieving the data comprises retrieving the data over a network separate from the broadcast transmission.
11. A method according to claim 10, wherein the network is the Internet.
12. A method according to any of claims 1 to 9, wherein retrieving the data comprises retrieving the data from a store within the user device.
13. A method according to any preceding claim, wherein asserting the data comprises presenting one or more of text, audio, graphics or video to a user.
14. A system for providing data to a user device for assertion synchronised to a broadcast transmission, comprising:
a broadcast time server arranged to receive a broadcast audio-video transmission and to derive a time signal from the transmission and to provide the time signal to the user device in the locality of the broadcast time server; and
a data store arranged to provide the data to the user device.
15. A system according to claim 14, wherein the time signal comprises a broadcast clock inserted in the broadcast transmission.
16. A system according to claim 15, wherein the time signal comprises a programme time indicating the time of presentation of a programme relative to the broadcast clock.
17. A system according to claim 14, 15 or 16, wherein the time signal comprises one or more of a broadcast clock, a programme time or a time into current programme.
18. A system according to any of claims 14 to 17, wherein the broadcast time server comprises means for determining a transmission delay between the broadcast time server and the user device.
19. A system according to any of claims 14 to 18, wherein the user device is one of a laptop, mobile phone or PDA.
20. A user device arranged to retrieve data for assertion synchronised to a broadcast transmission received by a receiver in the locality of the user device, comprising:
means for retrieving a time signal from a broadcast time server in the given locality, the time signal being derived from the broadcast transmission as received at the broadcast time server;
means for retrieving the data for the user device, the data including time data indicating the time for assertion relative to the time signal provided by the broadcast time server; and
means for asserting the data at the user device in synchronisation with the received broadcast transmission using the time signal from the broadcast time server.
21. Computer program code comprising instructions which, when executed, cause a user device to undertake the method of claim 2.
PCT/GB2011/001288 2010-09-02 2011-09-01 Method and system for additional service synchronisation WO2012028851A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1014608.2 2010-09-02
GB1014608.2A GB2483277A (en) 2010-09-02 2010-09-02 Additional service synchronisation using time server

Publications (1)

Publication Number Publication Date
WO2012028851A1 true WO2012028851A1 (en) 2012-03-08

Family

ID=43013584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2011/001288 WO2012028851A1 (en) 2010-09-02 2011-09-01 Method and system for additional service synchronisation

Country Status (2)

Country Link
GB (1) GB2483277A (en)
WO (1) WO2012028851A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2677764B1 (en) * 2012-06-22 2017-10-25 Orange Triggering an action relative to a flow
FR2993742A1 (en) * 2012-07-18 2014-01-24 France Telecom Method for triggering action relative to e.g. audio visual flow on e.g. tablet, during broadcast and restitution of multimedia content, involves obtaining triggering instant according to timestamps and triggering action at instant
FR2992516A1 (en) * 2012-06-22 2013-12-27 France Telecom Method for triggering action on content of stream returned by broadcasting reproduction device i.e. TV, involves obtaining given moment of trigger according to temporal time stamps, and triggering action at known moment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003003743A2 (en) * 2001-06-29 2003-01-09 Lightmotive Technologies Method and apparatus for synchronization of parallel media networks
US6630963B1 (en) * 2001-01-23 2003-10-07 Digeo, Inc. Synchronizing a video program from a television broadcast with a secondary audio program
WO2008084947A1 (en) * 2007-01-08 2008-07-17 Sk Telecom Co., Ltd. System and method for synchroning broadcast content with supplementary information
US7634798B2 (en) 2000-11-03 2009-12-15 The Walt Disney Company System and method for enhanced broadcasting and interactive television

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001039506A2 (en) * 1999-11-22 2001-05-31 Spiderdance, Inc. System and method for synchronizing online activities with broadcast programming
TWI220036B (en) * 2001-05-10 2004-08-01 Ibm System and method for enhancing broadcast or recorded radio or television programs with information on the world wide web
US20070022437A1 (en) * 2005-07-19 2007-01-25 David Gerken Methods and apparatus for providing content and services coordinated with television content
JP5191493B2 (en) * 2006-11-20 2013-05-08 エスケー プラネット カンパニー、リミテッド Additional information service providing system related to broadcast content, additional information service providing server, and additional information service providing method


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9237368B2 (en) 2009-02-12 2016-01-12 Digimarc Corporation Media processing methods and arrangements
EP2579605A1 (en) * 2011-10-07 2013-04-10 Accenture Global Services Limited Synchronising digital media content
US9986282B2 (en) 2012-03-14 2018-05-29 Digimarc Corporation Content recognition and synchronization using local caching
US9292894B2 (en) 2012-03-14 2016-03-22 Digimarc Corporation Content recognition and synchronization using local caching
US9787768B1 (en) * 2013-03-15 2017-10-10 Arris Enterprises Llc M-CMTS, Edge-QAM and upstream receiver core timing synchronization
US9615122B2 (en) 2014-01-30 2017-04-04 Echostar Technologies L.L.C. Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data
US9942599B2 (en) 2014-01-30 2018-04-10 Echostar Technologies Llc Methods and apparatus to synchronize second screen content with audio/video programming using closed captioning data
WO2015116984A1 (en) * 2014-01-30 2015-08-06 Echostar Uk Holdings Limited Methods and apparatus for creation of a reference time index for audio/video programming
US9971319B2 (en) 2014-04-22 2018-05-15 At&T Intellectual Property I, Lp Providing audio and alternate audio simultaneously during a shared multimedia presentation
US10754313B2 (en) 2014-04-22 2020-08-25 At&T Intellectual Property I, L.P. Providing audio and alternate audio simultaneously during a shared multimedia presentation
US10673609B2 (en) 2015-12-07 2020-06-02 Fujitsu Limited Synchronization device, method, program and system
WO2018009287A1 (en) * 2016-07-02 2018-01-11 Qualcomm Incorporated Distributed implementation architecture for broadcast receiver
CN113923522A (en) * 2021-10-14 2022-01-11 深圳市华曦达科技股份有限公司 Time updating method and device for set top box and computer readable storage medium

Also Published As

Publication number Publication date
GB2483277A (en) 2012-03-07
GB201014608D0 (en) 2010-10-13

Similar Documents

Publication Publication Date Title
WO2012028851A1 (en) Method and system for additional service synchronisation
JP5903924B2 (en) Receiving apparatus and subtitle processing method
KR100449742B1 (en) Apparatus and method for transmitting and receiving SMIL broadcasting
CN1742492B (en) Automatic synchronization of audio and video based media services of media content
JP6935396B2 (en) Media content tag data synchronization
KR101727050B1 (en) Method for transmitting/receiving media segment and transmitting/receiving apparatus thereof
Howson et al. Second screen TV synchronization
CN101809965B (en) Communication technique able to synchronise received stream with that sent to another device
US20090106357A1 (en) Synchronized Media Playback Using Autonomous Clients Over Standard Internet Protocols
KR20170074866A (en) Receiving device, transmitting device, and data processing method
KR20120080214A (en) System, method and apparatus for dynamic media file streaming
US10503460B2 (en) Method for synchronizing an alternative audio stream
KR101192207B1 (en) System for providing real-time subtitles service of many languages for online live broadcasting and method thereof
US20190373296A1 (en) Content streaming system and method
Boronat et al. HbbTV-compliant platform for hybrid media delivery and synchronization on single-and multi-device scenarios
EP2891323B1 (en) Rendering time control
US20040244057A1 (en) System and methods for synchronizing the operation of multiple remote receivers in a broadcast environment
CA2938478A1 (en) Methods and apparatus for creation of a reference time index for audio/video programming
WO2014178796A1 (en) System and method for identifying and synchronizing content
CN107534792B (en) Receiving apparatus, transmitting apparatus, and data processing method
van Deventer et al. Media synchronisation for television services through HbbTV
KR101025274B1 (en) Mobile telecommunication ystem, and method for transforming data-information for Synchronization between broadcasting contents and data-information
CN102088625A (en) Automatic synchronization of audio-video-based media services of media content
JP2001094945A (en) Partial reproduction method for video and audio data in digital broadcast and receiver

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11752607

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11752607

Country of ref document: EP

Kind code of ref document: A1