WO2005071686A1 - Television production technique - Google Patents

Television production technique Download PDF

Info

Publication number
WO2005071686A1
WO2005071686A1 (PCT/US2005/002425, US2005002425W)
Authority
WO
WIPO (PCT)
Prior art keywords
production
television
mem
control
video
Prior art date
Application number
PCT/US2005/002425
Other languages
French (fr)
Other versions
WO2005071686A9 (en)
Inventor
Edward Marion Casaccia
David Alan Casper
Paul Martell Trethewey
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to US10/586,554 priority Critical patent/US8063990B2/en
Priority to CA2553603A priority patent/CA2553603C/en
Priority to JP2006551427A priority patent/JP4895825B2/en
Publication of WO2005071686A1 publication Critical patent/WO2005071686A1/en
Publication of WO2005071686A9 publication Critical patent/WO2005071686A9/en

Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268 - Signal distribution or switching


Abstract

A television production system (300) simplifies the automation of a television program, such as a news program, by making use of State Memory Objects (S-MEMs), each defining one or more operations for execution by one or more production devices. The S-MEMs serve to control one or more actuators on a control panel (302) so that each actuator on the control panel can control different functions of different pieces of production equipment depending on the S-MEM selected.

Description

TELEVISION PRODUCTION TECHNIQUE
CROSS REFERENCE TO RELATED APPLICATION
This application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Application Serial Number 60/537,875, filed January 20, 2004, herein incorporated by reference.
TECHNICAL FIELD
This invention relates to a system for pre-programming of television productions, and to a method that simplifies such pre-programming and enhances operator control of the exact timing of the production.
BACKGROUND ART
The production of a television program is a complex undertaking. Traditional methods require the cooperation and coordination of talent and technical staff, using a wide range of audio and video equipment. This is particularly apparent in the production of a television news program. Such programs are generally produced "live" and embody multiple pre-recorded elements, one or more live presenters, and complex production effects that contribute to the flow and interest level of the program. Many television organizations produce news broadcasts, and such organizations compete strongly to attract and retain the maximum number of viewers. Most viewers want fast-paced news programs that make use of sophisticated audio and video production techniques including, for example, complex visual effects. Such complexity requires a large number of equipment operators, thus increasing the likelihood of mistakes during production. For these reasons, there have been many attempts to automate the process to some degree, and to provide improved user interfaces that simplify the tasks of the operators. U.S. Patent 5,115,310 (Takano et al.) and U.S. Patent 5,892,507 (Moorby et al.) represent past attempts to add automation and improved user interfaces to the television production process.

U.S. Patent 6,452,612 (Holtz et al.) best exemplifies the state of the prior art of automated television production systems. Holtz et al. describe a system that allows pre-programming of most of the complex actions required for a television program, and particularly a news program, thus minimizing the work required by operators during production. The Holtz et al. system makes use of a time line. Each event, defined as a change in the status of, or any new command to, any piece of production equipment, receives an allocated slot on the timeline. Upon actuation of a timer, a processor executes each event at its designated time on the time line, thus allowing completion of the event within the allocated time. The Holtz et al. patent characterizes events as "transition macros," and each such transition macro can include a number of individual timed production activities such as an audio fade or a video wipe, for example.

Executing such transition macros automatically without interruption can present certain difficulties for a production that includes live talent. A person reading a script typically will do so at slightly different speeds at different times. For that reason, the time required for reading a particular item "on air" will likely differ to some degree from the time recorded during rehearsals. Unpredictable events, such as a cough or stumbling over a word, add to the uncertainty of the actual event time during live production. With experienced talent, the differences, while small, remain sufficiently significant to make a simple automatic timeline-driven system unsatisfactory. Television viewers have a high sensitivity to imperfections such as clipped words or inappropriate pauses. Any program that exhibits such problems likely would not retain its audience over a long period of time. A high-quality production requires manual triggering of events in response to the actual timing as determined by live performers and other factors. In their patent, Holtz et al. address this problem by introducing "step marks" or "pause events" for insertion in the timeline. A pause event effectively defeats the automatic triggering of a subsequent event by interrupting execution of the timer.
Within the Holtz et al. system, stored events refer only to a single controlled device. If a new program segment requires, for example, a video switcher selection, fade-up of a different microphone, and zoom of a camera lens, programming of these events must occur separately to accomplish a transition to the new segment. Other program transitions can have much more complexity than this simple example and will require creation of a larger number of events. Programming of a complex transition typically involves many separate events. Given that a typical television production system usually includes a large number of separate devices, arranging all of the events needed to accomplish a transition to achieve a particular scene for transmission often proves problematical. Selecting among the many individual operations of each of the various pieces of production equipment takes significant time, making programming an arduous task. In practice, the change from one program segment to the next will typically require simultaneous or closely coordinated changes in many of the controlled devices.
Advantageously, the system of Holtz et al. provides one or more Graphical User Interfaces (GUIs) for controlling one or more devices, obviating the need to provide the large and complex control panels that are normally used to control devices such as video switchers, audio mixers, and digital effects devices. However, this approach also incurs limitations. GUIs do not always constitute the preferred user interface for adjusting critical controls. Many operations, particularly on video equipment, require that the operator view the result of a control adjustment on a video screen while adjusting the control, but operation of the GUI frequently requires that the operator look at the GUI rather than the video image. There are many other circumstances where the "feel" of a physical control is preferred to the use of a GUI.

Within the Holtz et al. system, all dynamic transitions, such as video wipes, audio fades, etc., require pre-programming under the control of the program timer. However, to achieve a high-quality television production, the operator will sometimes need to change the speed of such a transition, or slightly offset the video and audio transitions. Such a refinement can occur only if the operator has access to the physical controls of the various pieces of production equipment during production. However, as discussed above, the physical control panels normally supplied with such equipment are large and complex, and it is not generally practical for a single person to be responsible for operation of an array of such control panels. The drawbacks associated with present-day production equipment, as discussed above, typically preclude a single operator from handling all of the controls of an array of control panels needed to effect the desired offset. Thus, a need exists for a television production technique that overcomes the aforementioned disadvantages.

SUMMARY OF THE INVENTION
In accordance with a preferred embodiment of the present principles, there is provided a method of controlling at least one production device for producing a television show. The method commences by first establishing a plurality of states of the at least one production device, each state corresponding to at least one operation executable by the device. The states of the at least one production device are stored as corresponding state memory objects which, upon execution, cause the production device to execute the at least one operation, resulting in generation of a scene. Responsive to selection of each state memory object, at least one actuator is actuated to control an operation of the at least one production device in accordance with the at least one operation associated with that state memory object.
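To make the state-memory-object idea concrete, the following is a minimal, purely illustrative Python sketch; it is not part of the patent disclosure, and names such as ProductionDevice, SMem and apply_state are invented for the example.

    # Minimal sketch of a State Memory Object (S-MEM): a named bundle of device
    # states that, when executed, drives each production device into the state
    # needed for one scene.  Names here are illustrative, not from the patent.

    class ProductionDevice:
        """Stand-in for a switcher, mixer, camera, etc."""
        def __init__(self, name):
            self.name = name
            self.state = {}

        def apply_state(self, state):
            # A real device driver would issue serial/network commands here.
            self.state = dict(state)
            print(f"{self.name} -> {self.state}")

    class SMem:
        """One scene: a mapping from device to the state it must enter."""
        def __init__(self, label, device_states, duration=None, absolute=False):
            self.label = label
            self.device_states = device_states   # {device: state dict}
            self.duration = duration             # seconds, or None
            self.absolute = absolute             # True = fixed-length material

        def execute(self):
            for device, state in self.device_states.items():
                device.apply_state(state)

    cam1 = ProductionDevice("CAM 1")
    mixer = ProductionDevice("AUDIO MIXER")

    two_shot = SMem("anchor two-shot",
                    {cam1: {"pan": 10, "tilt": -2, "zoom": 0.4},
                     mixer: {"mic_1": "on", "mic_2": "on"}},
                    duration=20, absolute=False)
    two_shot.execute()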
BRIEF DESCRIPTION OF DRAWINGS
FIGURE 1 depicts a work flow arrangement according to the prior art for producing a television program;
FIGURE 2 depicts a workflow arrangement according to the present principles for producing a television program;
FIGURE 3 depicts a block schematic diagram of a television production system embodying the present principles;
FIGURE 4 depicts a simplified block schematic diagram of a presentation system in the workflow of FIG. 2, showing State Memory Objects (S-MEMs) which, when executed, trigger the execution of one or more television production devices in the television production system;
FIGURE 5 depicts a plan view of a context sensitive control panel in accordance with another aspect of the present principles; and
FIGURE 6 depicts a block schematic diagram of elements comprising the context sensitive control panel of FIG. 5.
DETAILED DESCRIPTION
In accordance with the present principles, a television production system simplifies the automation of a television program, such as a news program, by parameterizing State Memory Objects (S-MEMs), each defining one or more operations for execution by one or more production devices. The S-MEMs are typically parameterized in accordance with the scenes they generate. In this way, the S-MEMs can be categorized by style, that is to say, by the "look" or appearance of the associated scene, so that a director can more easily select among available S-MEMs to choose those that maintain a particular appearance for a succession of scenes. While the description of the various embodiments in accordance with the present principles and the applicable background art will focus on live television news production, those practiced in the art will recognize that the principles apply equally to other complex television productions, whether live or recorded for future transmission.

To understand the television production technique of the present principles, a brief discussion of present-day production techniques will prove helpful. FIGURE 1 illustrates the general workflow arrangement associated with creating a television program, such as a television news program, according to the prior art. News reporters 10 prepare news items, some of which can take the form of complete program segments that include edited video and associated audio. Other news items will contain only partially complete stories, in the form of edited video, with a script for reading by live talent upon transmission of the video. Still other items might comprise only a script, perhaps with specifications for graphics that should be prepared for use with the script. A Newsroom Computer System (NRCS) 12, such as the News Edit System available from Thomson Broadcast and Media Solutions, Nevada City, California, registers the assets associated with each of these items.

A news producer 14, responsible for producing the news program, makes content decisions. The producer 14 will review all submitted news items for "news worthiness" and, in conjunction with known rules for program format, commercial break structure, etc., will decide which items to include, and will generate a "running order" 16. The running order 16 specifies the order of the items, as well as their duration. The producer 14 enters the running order into the NRCS 12 for further refinement by a Director 18. The Director 18 uses the running order in conjunction with knowledge of the technical facilities available for the program to create a Technical Rundown 20. Traditionally, the Technical Rundown constituted a printed document for use by all of the staff creating the program, including a Technical Director 22 who, along with other production staff 24, controls an audio mixer 38, a video switcher 40, and one or more cameras, possibly with robotic lenses and dollies 42.
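As an aside, the running order 16 described above is essentially an ordered list of items with planned durations. The sketch below is a hypothetical illustration of such a structure, not anything defined by the patent; the field names are assumptions.

    # Illustrative sketch (not from the patent) of a running order: an ordered
    # list of items with planned durations, as handed from producer to director.
    from dataclasses import dataclass

    @dataclass
    class RunningOrderItem:
        slug: str            # short name of the story
        duration_s: int      # planned duration in seconds
        has_package: bool    # True if fully edited video/audio exists

    running_order = [
        RunningOrderItem("flood-update", 90, True),
        RunningOrderItem("city-council", 45, False),   # live read over edited video
        RunningOrderItem("weather-tease", 15, False),
    ]
    print("planned run time:", sum(i.duration_s for i in running_order), "s")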
FIGURE 2 shows a revised workflow of a television production process in accordance with the present principles. The workflow of FIG. 2 bears many similarities to the workflow of FIG. 1, and like reference numbers refer to like elements. Like the workflow of FIG. 1, the workflow associated with the television production process of the present principles illustrated in FIG. 2 has the same sequence of operations up to generation of the running order 16. At this point, the Director 18 can pre-produce the show.

The Director 18 uses the running order 16, taking into account the available production equipment, such as that shown, including the audio mixer 38, the video switcher 40, and one or more cameras, possibly with robotic lenses and dollies 42. The Director 18 will create successive segments of the show in accordance with the rundown. However, instead of having live operators manipulate the various control panels normally associated with such equipment, a Presentation System 36 performs setup of the equipment. The Presentation System 36 can include one or more Graphical User Interfaces, and can optionally include one or more context sensitive control panels (described in greater detail with respect to FIGS. 4-7) that can operate some or all of the different pieces of production equipment, or subsets of the controls thereof, as required.

As each segment is finalized, the Director 18 establishes a State Memory Object (S-MEM) 30 that embodies all of the operations necessary for execution by the various pieces of production equipment to create that program segment, which manifests itself as a scene for display on a display device (not shown). A sequence of S-MEM objects comprises an event list 32. Thus, upon completion of the pre-production, the event list 32 comprises a sequence of S-MEM objects 30 that together represent all the segments of the show. Following such pre-production, the production phase can commence. In this phase, the Presentation System 36 controls the production equipment, such as that shown, including the audio mixer 38, the video switcher 40, and one or more cameras, possibly with robotic lenses and dollies 42, by triggering the events upon execution of the S-MEMs in accordance with the rundown. Upon recall of each S-MEM 30 from the event list 32, the Presentation System 36 issues commands to the various pieces of production equipment that cause each piece of equipment to enter the particular state recorded in the S-MEM. As each segment ends, the Director 18 issues a "Next" command 34, and the Presentation System 36 will issue commands so that the appropriate pieces of production equipment enter the particular state defined by the next S-MEM.

Each S-MEM 30 typically has a finite duration so that the Director 18 can see the expected run time of the show. Durations can be of two types. An absolute duration has a precise length and finds application for pre-recorded source material (video, audio, etc.) having a fixed run time. In this case, completion of an S-MEM having an absolute duration can serve to trigger the next event automatically, without the need for a manual "Next" command 34. However, for a segment that involves live talent, use of an approximate duration is preferred. The approximate duration aids in predicting the run time of the show, but progression to the subsequent event always requires manual initiation to accommodate the timing variations that are inherent in the use of live talent.
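The two duration types can be pictured with a short, hedged sketch of a playout loop: absolute durations advance automatically, while approximate durations wait for the director's "Next" command. The sketch reuses the toy SMem objects outlined earlier and is illustrative only; wait_for_next is an assumed callback, not an interface from the patent.

    # Sketch of the playout behaviour described above: S-MEMs with an absolute
    # duration advance automatically, while those with an approximate duration
    # wait for the director's "Next" command.  Purely illustrative.
    import time

    def run_event_list(event_list, wait_for_next):
        for smem in event_list:
            smem.execute()
            if smem.absolute and smem.duration is not None:
                time.sleep(smem.duration)        # fixed-length clip: auto-advance
            else:
                wait_for_next(smem)              # live talent: manual trigger

    # Example: block on the console until the operator presses Enter.
    # run_event_list(event_list, wait_for_next=lambda s: input(f"{s.label}: Next? "))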
FIGURE 3 depicts a block schematic diagram of a television production system 300 embodying the present principles for enabling automated production of a television program, such as a television news program. At the heart of the system 300 lies a context-sensitive control panel 302, described in greater detail in FIG. 4, allowing the director 18 to control multiple production devices individually by the use of S-MEMs as discussed above. Such production devices can include one or more video playout devices, such as a server 305 comprising part of an existing Digital News Production System 306.

Other devices controlled via the control panel 302 can include one or more television cameras 306, associated camera robotics 308, a character generator 310, and a still store 312 for storing still video images. Video signals from the cameras 306, the character generator 310, and the still store 312 pass to a video switcher 313 that selectively switches among input signals under the control of the control panel 302. In the illustrated embodiment, the switcher 313 can perform various digital video effects, obviating the need for a standalone DVE device. However, the system 300 could include one or more separate DVEs (not shown). The switcher 313 provides both a video program output for transmission and/or recording, as well as a preview output for receipt by a preview monitor (not shown). While not illustrated, the video switcher 313 can also receive video from one or more devices, such as videotape recorders, video cartridge machines, and/or satellite receivers, to name but a few.

The control panel 302 also controls an audio mixer 314 that receives audio input signals from a digital cart machine 316 as well as one or more studio microphones 318. Further, the audio mixer 314 can receive input signals from one or more devices, such as the playback server 304, as well as one or more audio tape recorders (not shown) and/or one or more satellite receivers (not shown). The audio mixer 314 provides a program audio output, as well as an intercom output and an output for audio monitoring, by way of a monitor speaker or the like (not shown).

A controller 320 serves to interface the control panel 302 to the video switcher 313, the audio mixer 314, and to a video switcher device selector 322. The video selector 322 enables the control panel 302 to select one or more of the cameras 306, the camera robotics 308, the character generator 310, and the still store 312 for control. The controller 320 can take the form of a personal computer or the like, suitably equipped with appropriate input/output drivers for interfacing with the various elements controllable by the control panel 302. Associated with the control panel 302 are one or more hardware control devices 324 that allow the director 18 of FIG. 2 to enter one or more commands for receipt by the controller 320 for ultimate transmission to the appropriate device for control. The control panel 302 also includes graphical user interfaces 326, 327 and 328 for the camera robotics 308, the cameras 306, and the audio mixer 314, respectively. Such graphical user interfaces can include visual displays.

The television production system 300 of FIG. 3 can also include a Media Object Server (MOS) gateway 330 coupled to a teleprompter 332 as well as to the character generator 310 and the still store 312. The MOS gateway 330 provides an interface to the Digital News Production System (not shown) to allow updates made by the producer 14 to be received by such devices.
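One way to picture the role of the controller 320, routing the device states recalled from an S-MEM to the equipment it interfaces, is the small sketch below. It is an assumption-laden illustration, not the patent's implementation; the driver registration API and device names are invented.

    # Illustrative routing sketch: a controller that forwards each device state
    # in an S-MEM to the driver responsible for that class of equipment.

    class Controller:
        def __init__(self):
            self.drivers = {}                    # device name -> driver callable

        def register(self, device_name, driver):
            self.drivers[device_name] = driver

        def apply(self, smem_states):
            """smem_states: {device name: state dict} recalled from an S-MEM."""
            for device_name, state in smem_states.items():
                self.drivers[device_name](state)

    controller = Controller()
    controller.register("video_switcher", lambda s: print("switcher <-", s))
    controller.register("audio_mixer",    lambda s: print("mixer    <-", s))

    controller.apply({"video_switcher": {"program_bus": "CAM 1"},
                      "audio_mixer":    {"fader_1": -6.0}})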
FIGURE 4 shows a simplified block schematic diagram of the Presentation System 36 of FIG. 2, showing the manner in which the presentation system establishes and parameterizes S-MEMs. The presentation system 36 includes a processing unit 100, in the form of a computer or the like. The processing unit 100 enjoys a link through a bi-directional bus to a memory 403 that stores a sequence of S-MEMs 30_1, 30_2, 30_3 ... 30_n, where n is an integer greater than zero, the sequence of S-MEMs representing a sequence of segments (scenes) of a television production. As discussed above, each of the S-MEMs, such as S-MEM 30_1, comprises a set of operations executable by one or more production devices to create a particular segment. In the illustrated embodiment, the S-MEM 30_1 includes a pan, tilt, and zoom operation associated with a first camera (CAM 1), as well as a pan, tilt, and zoom operation associated with a second camera (CAM 2), and the lighting of first, second and third lights (LIGHT 1, LIGHT 2 and LIGHT 3, respectively). In addition, the S-MEM 30_1 also includes two additional operations associated with placing a respective one of a video switcher 410 and a Digital Video Effects (DVE) device 412 in a particular state, in accordance with the contents of a memory location 23 within the switcher and a memory location 46 in the DVE, respectively. In practice, both the switcher 410 and the DVE 412 have memories whose locations can store a particular device state, such as a switch, fade, or wipe between two video sources in the case of the switcher, or a particular video effect in the case of the DVE. Upon execution of an S-MEM, such as S-MEM 30_1, which contains a reference to a particular production device memory location, the production device will enter the state specified by the contents of that location.

Various television production devices can execute a variety of operations. Similarly, the video switcher 410 and the DVE 412 can each have a variety of different states. Thus, for a television production system such as the television production system 300, there can exist an almost infinite number of S-MEMs. To facilitate S-MEM selection, the S-MEMs are parameterized in terms of the scene (i.e., the image that results from execution of the S-MEM, that is, what appears within an image), rather than in terms of commands (i.e., what each device must do to achieve such an image). Parameterizing the S-MEMs in this fashion greatly reduces the effort needed to pre-produce a television show. Of course, the processor 100 can establish and thereafter record each S-MEM by creating the required state of one or more associated production devices.

To achieve greater consistency of a particular television show, the director can define a number of "style" S-MEMs, thus parameterizing the S-MEMs. The director seeking to pre-program a scene can select, for example, a previously defined "style" S-MEM that represents, say, a standard "two shot." This style S-MEM would act as a template, establishing most of the required parameters for a standardized scene consistent with the established "look" of the show. The director would then apply only such control changes as may be necessary to establish the exact parameters for the precise scene envisioned for the show being pre-produced. Parameterizing the S-MEMs by the scenes they produce greatly facilitates S-MEM selection.
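The idea of a "style" S-MEM acting as a template, with only scene-specific overrides applied on top, can be sketched as a simple merge of parameter dictionaries. The sketch is illustrative only; the function and field names are assumptions, and the memory-location value 23 is borrowed from the example above purely for flavour.

    # Sketch of "style" parameterization: a style S-MEM acts as a template and a
    # specific scene overrides only the parameters that differ.

    def derive_scene(style_states, overrides):
        """Start from the style template, then apply scene-specific tweaks."""
        scene = {dev: dict(state) for dev, state in style_states.items()}
        for dev, changes in overrides.items():
            scene.setdefault(dev, {}).update(changes)
        return scene

    two_shot_style = {
        "CAM 1": {"pan": 0, "tilt": 0, "zoom": 0.5},
        "video_switcher": {"memory_location": 23},
        "lights": {"LIGHT 1": True, "LIGHT 2": True, "LIGHT 3": False},
    }

    interview_scene = derive_scene(two_shot_style,
                                   {"CAM 1": {"zoom": 0.7}})   # tighter framing only
    print(interview_scene)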
In practice, definition of styles (i.e., parameterization of the S-MEMs) occurs in advance of the production. In this sense, each style constitutes the equivalent of a vocabulary element of a show. Just as a writer chooses various vocabulary elements to create a piece of writing, the director selects various S-MEMs having a particular style, or even different styles if desired, to create a show. If the director seeks a particular appearance, the director will choose from among the S-MEMs associated with that style. Thus, parameterizing the S-MEMs by style greatly reduces the selection effort. In practice, the processor 100, or another element, can manufacture the styles themselves, placing the S-MEMs associated with each style in an associated style library. The producer 14 need not know very much about the particular operations associated with a particular S-MEM. Typically, a particular show will have a limited number of associated styles. For example, a news show will have a style associated with an individual anchorperson. When the producer selects that style, the producer can then select among those S-MEMs associated with that style, greatly reducing the effort needed to produce the show.

FIGURE 5 depicts a plan view of an exemplary physical layout of the control panel 302 of FIG. 4. The control panel 302 includes a plurality of lamps 500_1-500_x, where x constitutes an integer greater than zero. At least some of the lamps 500_1-500_x represent a particular condition in the context of a particular S-MEM. Thus, for example, in connection with the S-MEM 30_1 of FIG. 4, three of the lamps 500_1-500_x would represent the actuation of LIGHT 1, LIGHT 2 and LIGHT 3, respectively. Others of the lamps can represent other operations associated with the S-MEM 30_1, such as the particular state of the switcher 410 and the DVE 412. Some of the lamps 500_1-500_x can represent the state of one or more dedicated devices, such as one or more television cameras, or dedicated functions, e.g., "take", "program (PGM)", and "edit", to name but a few such functions. Thus, such lamps will always represent the state of such a device or a particular function regardless of the S-MEM.

In addition to the lamps 500_1-500_x, the control panel 302 includes a first set of actuators 502_1-502_y (where y is an integer greater than zero), a second set of actuators 504_1-504_z (where z is an integer greater than zero), a third set of actuators 506_1-506_p (where p is an integer greater than zero), a fourth set of actuators 508_1-508_c (where c is an integer greater than zero), as well as at least one joystick 510. In the illustrated embodiment, each of the first set of actuators 502_1-502_y comprises a push-button switch, which can execute a dedicated operation, e.g., a "take", or a specific operation in the context of the execution of a particular S-MEM. In the illustrated embodiment, each of the second set of actuators 504_1-504_z comprises a servo-controlled fader. As with each of the push buttons 502_1-502_y, each of these actuators can execute a dedicated operation, for example a master fade or wipe in the case of fader 504_1, or an operation dependent on the context of a specific S-MEM. Thus, for example, a particular one of the faders 504_2-504_z could execute an audio fade in the context of a particular S-MEM, whereas in the context of another S-MEM, that same fader could execute a video wipe. The actuators 506_1-506_p comprise rotary devices, such as potentiometers or rotatable shaft encoders. One or more of these actuators can have a dedicated function irrespective of the execution of a current S-MEM.
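The context-sensitive behaviour of the faders 504_2-504_z can be pictured as a binding table keyed by the active S-MEM: the same physical fader maps to an audio fade in one context and a video wipe in another. The sketch below is a hypothetical illustration; the S-MEM labels and parameter names are invented.

    # Sketch of the context-sensitive idea: the same physical fader is bound to a
    # different function depending on which S-MEM is active.

    bindings = {
        "anchor-intro": {"fader_2": ("audio_mixer", "mic_2_level")},
        "package-out":  {"fader_2": ("video_switcher", "wipe_progress")},
    }

    def on_fader_move(active_smem, fader_id, value):
        device, parameter = bindings[active_smem][fader_id]
        print(f"{device}.{parameter} = {value:.2f}")

    on_fader_move("anchor-intro", "fader_2", 0.80)   # acts as an audio fade
    on_fader_move("package-out",  "fader_2", 0.35)   # same fader, now a video wipe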
Others of the actuators 506_1-506_p can control a function associated with one or more devices in the context of a particular S-MEM, whereas, in the context of a different S-MEM, the same actuators will control a different function associated with the same or different devices. Like each of the actuators 502_1-502_y, each of the actuators 508_1-508_c typically comprises a push button. As compared to the push buttons 502_1-502_y, the majority of which are context dependent, the majority of the push buttons 508_1-508_c have dedicated roles, e.g., accomplishing "preview", "next page", "cut", and "transmit" operations, to name but a few. The actuator 510 comprises a joystick, the function of which is typically context dependent. Thus, depending on the execution of a particular S-MEM, the joystick 510 could serve to pan and tilt a first television camera, whereas in the context of another S-MEM, the joystick could operate a video replay device.

Lastly, the control panel 302 can include a plurality of audio level monitors 512_1-512_j, where j is an integer greater than zero. Each of the audio level monitors provides an indication, typically by means of a bar indicator, of the level of a particular audio device, such as a microphone, in the context of a particular S-MEM. Thus, for example, in connection with a particular S-MEM, a given one of the audio level monitors will indicate the audio level of a particular microphone, while in connection with a different S-MEM, the same audio indicator will indicate the level of a different microphone. In practice, each of the audio level monitors 512_1-512_j lies aligned with a corresponding one of the faders 504_2-504_z. To the extent that a particular one of the faders 504_2-504_z controls a particular audio device, such as a microphone, in connection with a particular S-MEM, the audio level monitor aligned with that fader will indicate the level of that controlled device.

FIGURE 6 depicts an electrical block diagram of the control panel 302 of FIG. 5. A single-board microcomputer 600 serves as the controller for the control panel 302. The microcomputer 600 has address, data, and control busses, through which the microcomputer connects to a Random Access Memory 602, a Flash Memory 604, and a mass storage device 606, typically in the form of a magnetic hard disk drive. In practice, the hard disk drive 606 will contain program instructions, whereas the flash memory 604 can contain a basic input/output system (BIOS). The microcomputer 600 has interfaces 608 and 610 for interfacing to an Ethernet network (not shown) and a console teletype, respectively. A background debugger 612, typically comprising a memory block or the like, contains a debugging program suitable for debugging errors. An optional USB port 614 enables the computer 600 to interface to a device via a USB connection. A clock 616, typically having a 25 MHz frequency, provides clock pulses to a multiplier 618 that provides clock signals to the microcomputer 600 at 2.5 times the frequency of the clock 616. A Media Access Control (MAC) storage block 620 provides storage for MAC addresses used by the microcomputer 600. Power supplies 621_1, 621_2 and 621_3 provide the microcomputer 600, as well as other elements associated with the control panel 302, with 5 volts, +12 volts and -12 volts DC, respectively. The microcomputer 600 has at least one display 622 for displaying information. In practice, the microcomputer could have several displays, including one or more touch screens (not shown), one or more Liquid Crystal Displays (LCDs), as well as at least one video monitor. One or more input/output devices 624, such as a keyboard and/or computer mouse, are connected to the microcomputer 600 to allow an operator to enter programming information and/or data.

A Field Programmable Gate Array (FPGA) 620 serves to interface the microcomputer 600 to the faders 504_1-504_z. In practice, each of the faders 504_1-504_z comprises an analog fader servo 624 supplied from the FPGA 620 with an analog signal via a digital-to-analog converter 626. An analog-to-digital converter 628 converts the analog signal produced by the analog fader servo 624 back to a digital signal for input to the FPGA 620. The FPGA 620 also interfaces the shaft encoders 506_1-506_p, the buttons 502_1-502_y and the joystick 510 to the microcomputer 600. An Electrically Programmable Read-Only Memory 622 stores program instructions for the FPGA 620 to facilitate its operation.
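The servo-fader path (FPGA, digital-to-analog converter 626, analog-to-digital converter 628) forms a closed loop: a target position is written out, the actual knob position is read back, and the servo is nudged until the two agree. The following is a rough software simulation of that idea only, under invented names; it does not model the actual FPGA design or any real driver API.

    # Very rough sketch of the motorised-fader idea: write a target position,
    # read the knob's actual position back, and drive the servo toward the
    # target until the error is small.  Software simulation for illustration.

    def settle_fader(target, read_position, write_drive, tolerance=0.01, max_steps=200):
        for _ in range(max_steps):
            error = target - read_position()
            if abs(error) <= tolerance:
                break
            write_drive(error * 0.5)             # proportional drive toward the target

    # Toy closed loop standing in for the DAC/ADC path:
    position = [0.0]
    settle_fader(0.8,
                 read_position=lambda: position[0],
                 write_drive=lambda d: position.__setitem__(0, position[0] + d))
    print(round(position[0], 3))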
In operation, the microcomputer 600 will serve to activate the appropriate ones of the lamps 500_1-500_x, taking into account which S-MEM is currently active. The microcomputer 600 will also activate one or more of the push buttons 502_1-502_y, one or more of the faders 504_1-504_z, one or more of the shaft encoders 506_1-506_p, one or more of the push buttons 508_1-508_c, as well as the joystick 510, depending on the active S-MEM. Similarly, the microcomputer 600 will assign the audio indicating devices 512_1-512_j depending on the active S-MEM.

The control panel 302 as described above advantageously allows an operator to control different aspects of various production devices for individual show segments. In the past, a single technical operator could not easily control all of the devices, by virtue of the inability to physically reach all of the separate control panels of all the devices. The context-sensitive control panel 302 provides a set of control elements, one or more of which can control different functions of different production devices at different times, depending on the context of a particular scene. In this way, the physical size of the control panel can be reduced while still affording an operator the necessary control functions.

The foregoing describes a television production technique, which affords greater simplification in pre-programming a television show.

Claims

CLAIMS

1. A method of controlling at least one production device for producing a television show, comprising the steps of:
establishing a plurality of states of the at least one production device, each state corresponding to at least one operation executable by the device;
storing the states of the at least one production device as corresponding memory objects which, upon execution, cause the at least one production device to execute the at least one operation, which results in generation of a scene; and
responsive to selection of each memory object, actuating at least one actuator to control an operation of the at least one production device in accordance with the at least one operation associated with that memory object.

2. Apparatus for controlling at least one production device for producing a television show, comprising:
means for establishing a plurality of states of the at least one production device, each state corresponding to at least one operation executable by the device;
means for storing the states of the at least one production device as corresponding memory objects which, upon execution, cause the at least one production device to execute the at least one operation, which results in generation of a scene; and
means, responsive to selection of each memory object, for actuating at least one actuator to control an operation of the at least one production device in accordance with the at least one operation associated with that memory object.
PCT/US2005/002425 2004-01-20 2005-01-20 Television production technique WO2005071686A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/586,554 US8063990B2 (en) 2004-01-20 2005-01-20 Television production technique
CA2553603A CA2553603C (en) 2004-01-20 2005-01-20 Television production technique
JP2006551427A JP4895825B2 (en) 2004-01-20 2005-01-20 TV production technology

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US53787504P 2004-01-20 2004-01-20
US60/537,875 2004-01-20

Publications (2)

Publication Number Publication Date
WO2005071686A1 (en) 2005-08-04
WO2005071686A9 (en) 2006-03-16

Family

ID=34807141

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/002425 WO2005071686A1 (en) 2004-01-20 2005-01-20 Television production technique

Country Status (4)

Country Link
US (1) US8063990B2 (en)
JP (1) JP4895825B2 (en)
CA (1) CA2553603C (en)
WO (1) WO2005071686A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9118888B1 (en) 2014-03-14 2015-08-25 Tribune Broadcasting Company, Llc News production system with integrated display
US9258516B2 (en) * 2014-03-14 2016-02-09 Tribune Broadcasting Company, Llc News production system with display controller
US11871138B2 (en) * 2020-10-13 2024-01-09 Grass Valley Canada Virtualized production switcher and method for media production

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3577675A 1969-07-15 1971-05-04 Kohner Bros Inc Child's bathing toy
US3758712A (en) * 1972-03-27 1973-09-11 Sarkes Tarzian Digital special effects generator
JP2629802B2 (en) 1988-04-16 1997-07-16 ソニー株式会社 News program broadcasting system
US5148154A (en) * 1990-12-04 1992-09-15 Sony Corporation Of America Multi-dimensional user interface
US5262865A (en) * 1991-06-14 1993-11-16 Sony Electronics Inc. Virtual control apparatus for automating video editing stations
US6553178B2 (en) 1992-02-07 2003-04-22 Max Abecassis Advertisement subsidized video-on-demand system
GB2268816B (en) 1992-07-14 1996-01-17 Sony Broadcast & Communication Controlling equipment
US6463205B1 (en) 1994-03-31 2002-10-08 Sentimental Journeys, Inc. Personalized video story production apparatus and method
DE19625954A1 (en) 1996-06-28 1998-01-02 Philips Patentverwaltung Control unit for a production unit of a television studio or television broadcasting van
US5864366A (en) 1997-02-05 1999-01-26 International Business Machines Corporation System and method for selecting video information with intensity difference
CA2284886A1 (en) 1997-04-04 1999-02-04 Avid Technology, Inc. A digital multimedia editing and data management system
AUPO960197A0 (en) 1997-10-03 1997-10-30 Canon Information Systems Research Australia Pty Ltd Multi-media editing method and apparatus
JP3276596B2 (en) 1997-11-04 2002-04-22 松下電器産業株式会社 Video editing device
US20030214605A1 (en) * 1998-12-18 2003-11-20 Snyder Robert J. Autokeying method, system, and computer program product
US7024677B1 (en) * 1998-12-18 2006-04-04 Thomson Licensing System and method for real time video production and multicasting
US20030001880A1 (en) 2001-04-18 2003-01-02 Parkervision, Inc. Method, system, and computer program product for producing and distributing enhanced media
US6760916B2 (en) 2000-01-14 2004-07-06 Parkervision, Inc. Method, system and computer program product for producing and distributing enhanced media downstreams
US6792469B1 (en) 1999-03-22 2004-09-14 General Electric Company System and method for monitoring and controlling the production of audio and video streams
JP4230599B2 (en) * 1999-03-23 2009-02-25 株式会社東芝 Broadcast system
JP4017290B2 (en) * 1999-07-07 2007-12-05 日本放送協会 Automatic program production device and recording medium recorded with automatic program production program
US7020381B1 (en) * 1999-11-05 2006-03-28 Matsushita Electric Industrial Co., Ltd. Video editing apparatus and editing method for combining a plurality of image data to generate a series of edited motion video image data
US6747706B1 (en) 2000-01-11 2004-06-08 International Business Machines Corporation Workflow process for managing digital broadcast program production
JP4470259B2 (en) 2000-01-27 2010-06-02 ソニー株式会社 Video processing device
JP2002013506A (en) * 2000-06-29 2002-01-18 Smc Corp Multi-axis actuator
JP2002124929A (en) * 2000-10-18 2002-04-26 Jisedai Joho Hoso System Kenkyusho:Kk Apparatus and method for processing information as well as recording medium
JP2003008943A (en) * 2001-06-21 2003-01-10 Techno Ikegami:Kk Operation console for video changeover device
JP4276393B2 (en) * 2001-07-30 2009-06-10 日本放送協会 Program production support device and program production support program
JP4090926B2 (en) * 2002-03-29 2008-05-28 富士フイルム株式会社 Image storage method, registered image retrieval method and system, registered image processing method, and program for executing these methods
US8872979B2 (en) 2002-05-21 2014-10-28 Avaya Inc. Combined-media scene tracking for audio-video summarization
US20040036632A1 (en) * 2002-08-21 2004-02-26 Intel Corporation Universal display keyboard, system, and methods

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5307456A (en) * 1990-12-04 1994-04-26 Sony Electronics, Inc. Integrated multi-media production and authoring system
US20020175931A1 (en) * 1998-12-18 2002-11-28 Alex Holtz Playlist for real time video production

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NOWARA T: "HALBZEIT IN DER DIGITALEN NONLINEAREN NACHBEARBEITUNG. PRAXISNAHE BETRACHTUNGEN UND SYSTEMÜBERBLICK", FERNSEH UND KINOTECHNIK, VDE VERLAG GMBH, BERLIN, DE, vol. 49, no. 12, 1 December 1995 (1995-12-01), pages 715 - 720, 722, XP000545308, ISSN: 0015-0142 *

Also Published As

Publication number Publication date
CA2553603C (en) 2015-03-10
JP4895825B2 (en) 2012-03-14
WO2005071686A9 (en) 2006-03-16
US20080231758A1 (en) 2008-09-25
CA2553603A1 (en) 2005-08-04
US8063990B2 (en) 2011-11-22
JP2007519381A (en) 2007-07-12

Similar Documents

Publication Publication Date Title
US6952221B1 (en) System and method for real time video production and distribution
US8560951B1 (en) System and method for real time video production and distribution
US6452612B1 (en) Real time video production system and method
US10013154B2 (en) Broadcast control
US7024677B1 (en) System and method for real time video production and multicasting
CA2553481C (en) Television production technique
JP4280767B2 (en) Combination of editing system and digital video recording system
US20040027368A1 (en) Time sheet for real time video production system and method
JPH11203837A (en) Editing system and method therefor
EP1262063B1 (en) System for real time video production and multicasting
CA2553603C (en) Television production technique
JPH11205670A (en) Device and method for editing and provision medium
JPH11205673A (en) Device and method for editing and provision medium
JP5050424B2 (en) Effect switcher and control method of video playback device in effect switcher
JPH11203835A (en) Edit device and method, and magnetic tape
JPH11205672A (en) Editing device and method and providing medium
JPH11205671A (en) Device and method for editing and provision medium
JPH11203834A (en) Device and method for editing and providing medium

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
COP Corrected version of pamphlet

Free format text: PAGES 1/6-6/6, DRAWINGS, REPLACED BY NEW PAGES 1/7-7/7; DUE TO LATE TRANSMITTAL BY THE RECEIVING OFFICE

WWE Wipo information: entry into national phase

Ref document number: 2553603

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 10586554

Country of ref document: US

Ref document number: 2006551427

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

122 Ep: pct application non-entry in european phase