WO2000054213A1 - System for color balancing animations and the like - Google Patents

System for color balancing animations and the like

Info

Publication number
WO2000054213A1
Authority
WO
WIPO (PCT)
Prior art keywords
color space
color
data
images
scanner
Prior art date
Application number
PCT/US1999/010517
Other languages
French (fr)
Inventor
Arjun Ramamurthy
Lemuel L. Davis
Original Assignee
Time Warner Entertainment Co.
Priority date
Filing date
Publication date
Application filed by Time Warner Entertainment Co. filed Critical Time Warner Entertainment Co.
Priority to AU18065/00A priority Critical patent/AU1806500A/en
Publication of WO2000054213A1 publication Critical patent/WO2000054213A1/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • H04N9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • This invention pertains to a system for making motion pictures, including feature animations, standard motion pictures, and combinations thereof. More particularly, the present invention pertains to a system of making a motion picture in which a plurality of devices are used to both generate and replay various scenes or portions of scenes of said motion picture digitally, wherein these scenes or portions thereof are automatically color corrected in a manner which ensures uniform color perception.
  • The subject invention was developed for managing color in the process of making feature animations and hence most comments and references are made to this art. However, one skilled in the art would understand that the invention is also applicable to processes for making motion pictures, television, HDTV and other digital or analog image generation, display and distribution systems.
  • One particular example where color matching is a problem is in the area of digital cameras and scanners.
  • The digital cameras' and scanners' response to the artwork is determined by the dyes used for the filters over charge coupled devices (CCDs), the calibration tables and the gain of the amplifiers.
  • CCDs charge coupled devices
  • the lights used to illuminate the artwork also dictate the color response of the scanner.
  • the image is then viewed on a monitor.
  • the color and tonal response of the monitor is affected by the phosphor set, the gamma of the cathode ray tube (CRT), the graphics card and its look-up table (LUT).
  • CRT cathode ray tube
  • LUT look-up table
  • One common practice for achieving a better scanner to monitor match was to apply custom LUTs to each channel of the scanned image. Another approach is to modify the hardware LUTs of the graphics card.
  • A second area where color matching is a problem is the display on a CRT monitor of scenes recorded on optical film.
  • On CRT monitors a combination of closely spaced phosphors generates the desired color, while the intensity is determined by the energy in the electron beam.
  • the reproduction of color is based on the calibration of the digital film recorder, the development process, the controls in the developing and printing lab, the response of each of the emulsion layers, and the light intensity and saturation at which the print was struck.
  • a common practice prior to the present invention has been to try to make the monitor emulate the response of film.
  • Production facilities accomplish this by either loading custom LUTs into the hardware LUTs of the video card or electrically overdriving the monitor, thereby enhancing the gain function. These schemes are approximate at best and do not compensate for the phosphor colors.
  • Another drawback of attempting tonal changes or gamma curves is that compositing and rendering software packages assume that images are in a linear space and hence any divergence from this paradigm may result in image artifacts. Further, it has been shown that 8-bit tables effectively lose steps when used for gamma adjustment.
  • Some facilities have schemes that model the printing process and allow an operator to vary the lights at which the print is made and also see the effect of different timing scenarios.
  • Yet another approach is to develop LUTs and then apply these either sequentially or as a group to the image, either before it is brought to the film recorder or within the film recorder itself. These LUTs are iteratively generated and require regular fine-tuning.
  • the camera department was also provided with a match clip of the given scene.
  • This match clip was most often a piece of daily or one-light print from the production. While viewing the clip on the light table and the corresponding digital image on the monitor, adjustments to the digital image were manually effected using proprietary software that allowed the user to adjust RGB values, color saturation and brightness. Once a visual match was achieved, the user would save the applied changes as a software LUT which could be later applied to the entire scene. This LUT would immediately be applied to the given frame and that frame would be wedged to film. Wedging the frame simply means that the digital frame, post LUT, is output to film with several variations of RGB value, saturation, contrast and brightness.
  • Another company (Cinesite, Digital Studio of Hollywood, California 90038) employs a scheme whereby the monitor's black level is adjusted by modifying the output of the electron guns, and the overall response is modified through a combination of analog controls and hardware look-up-tables to approximate a film stock's S-curve. The monitor's white level and contrast range are then adjusted to equal that of the film stock (compared on a light table). Artistic selections are made in this non-standard state and compositing is carried out by specialized software that composites in log space as opposed to linear space.
  • a further objective is to provide a method and system for producing moving images by defining a common color space and converting at each intermediate step the produced images into said common color space.
  • Yet another objective is to provide a system for producing moving pictures which is readily incorporated into current work flow without either disturbing or severely altering the same.
  • A further objective is to provide a color management system for production which speeds up the total amount of time required to complete a project by ensuring that colors are reproduced accurately at almost every step, thereby eliminating the need for retakes.
  • a production system for making moving pictures in accordance with this invention includes one or more scanners, cameras or other similar devices capable of generating electrical signals corresponding to art works and/or live action shots and sequences thereof. These electrical signals are digitized to define image data in a corresponding device dependent color space. That is, the electronic data can be used to represent images on a color space, said color space being dependent on the specific characteristics of the device generating the data.
  • The electronic or digital data is then converted to a common color space, such as the CIELab or CIE XYZ color spaces.
  • a common color space such as the CIELab or CIE XYZ color spaces.
  • The data representing the images in the common color space is transformed into new data in the color space of the corresponding device, i.e., the monitor color space or the printer color space.
  • the data is transferred to the recorder color space and a recorder is used to record the images on a media which can be used to replay the images, such as film, or video tape.
  • each of the devices associated with the system are calibrated to the manufacturer's specification, and then profiles are generated to characterize the respective devices so that the required transformations to and from the common color space can be accomplished.
  • profiles are generated to characterize the respective devices so that the required transformations to and from the common color space can be accomplished.
  • color charts are generated and used to generate the profiles.
  • Some of the charts are custom generated for a particular feature presentation by adding color patches to the charts which are particularly characteristic of the presentation.
  • a parameter for color reproduction error, ⁇ E is generated for the system that is either producing or reproducing color and the operation of the system is optimized in such a manner that reduces the value of this parameter.
  • ⁇ E parameter for color reproduction error
  • the notion of a system gamma is developed which permits the overall optimization of the operation of monitors, through adjustment of each intermediate step (or image control parameter, such as display-gamma). In this manner the system ensures that the colors are accurately reproduced and that various effects which may distort color rendition are reduced.
  • Fig. 1 shows somewhat diagrammatic block diagram of a color management system constructed in accordance with this invention
  • Fig. 1 A shows a CIE color chart with various device spaces outlined therein;
  • Fig. 2 shows a standard color chart that may be used for generating device profiles for the system of Fig. 1;
  • Fig. 3 shows a modified color chart that may be used for generating device profiles for the system Fig. 2;
  • Figs. 4A-4E show flow charts and associated image transformation diagrams for generating an animation feature using the system of Fig.1;
  • Figs. 5A-5E show flow charts and associated image transformation diagrams for generating a live action digital production using the system of Fig. 1;
  • Figs. 6A-6B show details for calibrating and generating profiles for a scanner;
  • Figs. 7A-7B show details for calibrating and generating a device profile for a monitor;
  • Figs. 8A-8C show details for calibrating and generating a device profile for a film recorder;
  • Fig. 9 shows a color chart used to generate a profile for a digital film recorder
  • Fig. 10 shows a color chart digitally generated on a monitor and used to verify and confirm the operation of the system
  • Fig. 11 shows the variation of an error assessing parameter ⁇ E for the color patches of color chart 10
  • Figs. 12A-12D show tables of data obtained for the chart of Fig. 10.
  • Fig. 1A shows a standard two dimensional CIE color chart generally referred to as the CIE 1931 2° chromaticity diagram.
  • The generally horseshoe-shaped color region represents the CIE spectral locus, with wavelengths given in nm.
  • the dark curve spanning across the image is the black body locus as a function of temperature (in °K).
  • the largest triangle shows a representative color response of a standard film used in cinematography.
  • the next (slightly smaller) triangle shows a representative response of a typical color monitor.
  • The smallest triangle shows the response of a typical RGB printer. The problem being addressed in this invention is "How to optimize a process of generating a sequence of images using various devices having the responses illustrated?"
  • An important part of the subject color management system is the designation and use of an objective device independent color space, such as a space known in the art as the CIE color space.
  • The CIE color space is promulgated by the Commission Internationale de l'Eclairage (i.e., the International Commission on Illumination) and is well defined, repeatable and reproducible. It allows the representation of any viewable and/or measurable color as a numerical triplet. Several different representations are possible, CIE XYZ and CIE Lab being the most common.
  • CIE XYZ describes colors as tristimulus values, attempting to simulate how the human eye sees color, X being the eye's response to red, Y being the green response, and Z being the blue response.
  • CIELab is a conical model with L being Lightness, a and b defining a hue/chroma wheel. It can also be described in polar coordinates as CIE LCH (Lightness, Chroma, Hue), which comes close to the way we think about color.
  • CIE color space can be measured by colorimeters or computed from spectral data captured by spectrophotometers. Theoretically any instrument measuring a specific color should produce the same CIE color coordinates as long as the viewing conditions are set identically. According to this invention, a color management system and method is presented in which optimized color space conversions are performed based on two criteria.
  • The best color rendition by an output device to represent an image from an input device is determined based on the color gamut (color gamut refers to the overall color range of a particular device) of both the input device and the output device, selecting the mapping that best minimizes the overall color reproduction metric ΔE, discussed below.
  • color gamut refers to the overall color range of a particular device
  • The second criterion is the algorithm used to perform the calculations required to map colors from the input space to the output space and to compute intermediary colors that provide a perceptual match between the input and output devices. Given the different color gamuts (both in number of colors and range) of different devices, the simplistic approach to gamut compression is to select a system parameter that corresponds, or is limited, to the color range of the most limiting device, say the monitor. However, this approach is not perceived to be practical, as color matches will not be obtained between devices. Another approach is described as follows. An ICC profile requires three gamut mappings to be provided: Perceptual (also called photographic), Saturation, and Relative Colorimetric. Perceptual is the most common rendering intent, especially for the photographic reproduction of images. There is no standard recipe to achieve perceptual rendering; every profiling application uses a different approach.
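  • Purely as an illustration (the patent itself prescribes no particular recipe), one very simple gamut-compression strategy is to pull an out-of-gamut CIELab color toward the neutral axis until it fits the destination gamut. The minimal Python sketch below assumes a toy gamut test supplied by the caller; a real profile would consult the device's measured gamut boundary.

      # Illustrative only: reduce chroma (a*, b*) until the color passes the
      # caller-supplied gamut test. The gamut test here is a stand-in.
      def compress_chroma(lab, in_gamut, step=0.98, max_iter=200):
          L, a, b = lab
          for _ in range(max_iter):
              if in_gamut((L, a, b)):
                  return (L, a, b)
              a *= step          # move toward the neutral (L*) axis
              b *= step
          return (L, 0.0, 0.0)   # fall back to a neutral of equal lightness

      if __name__ == "__main__":
          # hypothetical gamut: accept colors with chroma below a fixed radius
          toy_gamut = lambda lab: (lab[1] ** 2 + lab[2] ** 2) ** 0.5 <= 60.0
          print(compress_chroma((70.0, 85.0, -40.0), toy_gamut))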
  • The values Xn, Yn, Zn are the CIE tristimulus values for the perfect reflecting or transmitting diffuser (the reference white). These are:
  • ⁇ E a color difference metric
  • ΔE*ab = [(ΔL*)^2 + (Δa*)^2 + (Δb*)^2]^(1/2), where ΔL*, Δa*, Δb* are the differences in the L*, a*, b* values.
  • the ⁇ E value is widely accepted as a metric of color reproduction accuracy. Generally speaking, one ⁇ E represents the smallest color difference the average human eye can perceive. However, current research in color science is beginning to indicate that ⁇ E values of up to 3 are perceived as being acceptable by most users.
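  • A minimal Python sketch of the standard CIELab conversion and the ΔE*ab metric discussed above is given below. The white point used is the D50 reference white, chosen here only as an example, since the text does not fix a particular illuminant, and the sample XYZ triples are arbitrary.

      # Sketch of XYZ -> CIELab and the ΔE*ab color difference metric.
      def xyz_to_lab(X, Y, Z, white=(96.422, 100.0, 82.521)):
          Xn, Yn, Zn = white

          def f(t):
              # CIE 1976 cube-root function with the linear toe for small ratios
              return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

          fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
          L = 116.0 * fy - 16.0
          a = 500.0 * (fx - fy)
          b = 200.0 * (fy - fz)
          return L, a, b

      def delta_e_ab(lab1, lab2):
          # ΔE*ab = [(ΔL*)^2 + (Δa*)^2 + (Δb*)^2]^(1/2)
          return sum((p - q) ** 2 for p, q in zip(lab1, lab2)) ** 0.5

      if __name__ == "__main__":
          lab_ref = xyz_to_lab(41.24, 21.26, 1.93)    # example XYZ triple
          lab_meas = xyz_to_lab(40.80, 21.00, 2.10)   # slightly different color
          print(round(delta_e_ab(lab_ref, lab_meas), 2))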
  • the next step is to use some device-specific information, but again in a purely formulaic manner.
  • A linearization of the tone reproduction, or gamma, curves can be developed.
  • redg, greeng and blueg are the red, green and blue channel gamma curves and R, G and B are the red, green and blue components of the input pixel.
  • the three tone reproduction (gamma) curves linearize the raw values with respect to luminance (Y) and the 3x3 matrix converts these linearized values into XYZ values for the CIEXYZ space.
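  • A minimal sketch of that two-stage model (per-channel gamma linearization followed by a 3x3 matrix into CIE XYZ) is shown below in Python with numpy. The power-law curves standing in for redg, greeng and blueg, and the matrix values, are placeholders; in practice both come from measurements of the device.

      import numpy as np

      def linearize(value, gamma):
          # stand-in for a measured tone reproduction (gamma) curve, input in 0..1
          return value ** gamma

      RGB_TO_XYZ = np.array([            # placeholder matrix from a characterization
          [0.4124, 0.3576, 0.1805],
          [0.2126, 0.7152, 0.0722],
          [0.0193, 0.1192, 0.9505],
      ])

      def scanner_rgb_to_xyz(rgb, gammas=(2.2, 2.2, 2.2)):
          # linearize each channel, then map the linearized RGB to XYZ
          linear = np.array([linearize(c, g) for c, g in zip(rgb, gammas)])
          return RGB_TO_XYZ @ linear

      if __name__ == "__main__":
          print(scanner_rgb_to_xyz((0.8, 0.5, 0.2)))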
  • the ICC does not mandate a specific operating system or architecture, rather provides a framework which is ideally suited for a Color Management System (CMS).
  • CMS Color Management System
  • the ICC also placed the responsibility of computing the color transformations on the operating system.
  • the operating system presently in use does not support a CMS.
  • For that purpose, a batch processor and a color engine were developed. This color engine carries out the translations from one device space to the device independent space, i.e., the CIE space, and then performs the necessary conversions to translate from the CIE space to the output device's color space. Each frame is recalculated from either scanner space to monitor space or monitor space to film recorder space in 16 bits per color plane. This ensures smooth gradations and color fidelity.
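  • The sketch below illustrates the shape of that per-frame job: convert from a source device space through the device independent space to a destination space, keeping 16 bits per color plane. The "profiles" are plain callables standing in for real ICC profiles, and the identity transforms in the example exist only to show the plumbing.

      import numpy as np

      def convert_frame(frame_u16, src_to_cie, cie_to_dst):
          rgb = frame_u16.astype(np.float64) / 65535.0     # 16-bit to normalized
          cie = src_to_cie(rgb)                            # device -> CIE
          out = cie_to_dst(cie)                            # CIE -> device
          return np.clip(out * 65535.0 + 0.5, 0, 65535).astype(np.uint16)

      if __name__ == "__main__":
          frame = np.random.randint(0, 65536, size=(4, 4, 3), dtype=np.uint16)
          result = convert_frame(frame, lambda x: x, lambda x: x)
          assert np.array_equal(frame, result)             # identity round trip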
  • Each device of the system should be calibrated in accordance with the manufacturer's instructions and on a regular basis to verify that it is still in calibration. Therefore, the first preliminary step is to calibrate each device associated with the system.
  • the International Color Consortium (ICC) was established in 1993 by eight industry vendors, and now consists of ca. 60 companies, for the purpose of creating, promoting and encouraging a vendor-neutral cross platform color management architecture.
  • the ICC while not mandating a specific operating system or architecture, provided a framework to implement a Color Management System (CMS). It required that devices be individually characterized, and then allow a color engine (like ColorSync made by Apple Computer, Inc.) to perform the conversion of image data from one color space to another.
  • CMS Color Management System
  • the ICC also developed the ICC profile standard that serves as a cross- platform device profile format used to characterize color devices.
  • An ICC profile or, as referred to herein, a device profile is a mathematical expression for transforming device specific coordinates into and out of a device independent space, most commonly CIE Lab. This standard enables color management vendors to produce profile creation software and allows system level color management to work seamlessly with ICC profiles across applications and platforms.
  • The colorimetry of a set of colors from some imaging media or display is measured for each device to determine its characteristic profile. This can be done in a variety of ways, and is dictated to a large extent by the device being characterized.
  • a reference image is scanned and compared with a data file that indicates what the scanned values should be.
  • the profile making application compares the scanned image with the data file that indicates what the data should be in CIE space.
  • Building a profile for a digital film recorder and film inverts the process. Here a set of patches evenly distributed in the output color space are generated and printed. These patches are then measured to provide colorimetric data for the respective profile.
  • The target patch values are selected to derive the most information at places where precision is critical. This means that small steps are selected for highlights and shadows, and larger steps are selected in between. From the measured data points, over one million colors are filled in through spline interpolation. This process allows precise prediction of where in the device independent color space additional printed patches would have fallen. This collection of data points now represents a very accurate characterization of the printable colors.
  • the software creates two comprehensive tables, one four dimensional table which describes a CIE value (in our case CIELab) for every possible CMY(K)/RGB (i.e., either CMY, CMYK or RGB, depending on the device) combination.
  • the other table consists of a three-dimensional table which maps every possible CIELab value to a CMY(K)/RGB equivalent. Since large portions of this table are likely to be outside of the printer's color gamut, gamut mapping needs to be invoked to come up with a reasonable approximation for this scenario.
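  • One common way such a three-dimensional table is evaluated at run time is trilinear interpolation between the eight surrounding grid entries. The Python sketch below only demonstrates the indexing and weighting; the tiny random table is a stand-in, since a real profile table is far denser and is built from measured patches plus gamut mapping for unprintable colors.

      import numpy as np

      def sample_3d_lut(lut, coord):
          """lut: (N, N, N, 3) array; coord: three values in [0, 1]."""
          n = lut.shape[0] - 1
          pos = np.asarray(coord) * n
          i0 = np.floor(pos).astype(int)
          i1 = np.minimum(i0 + 1, n)
          f = pos - i0                                  # fractional position in the cell
          out = np.zeros(3)
          for dx in (0, 1):
              for dy in (0, 1):
                  for dz in (0, 1):
                      w = ((f[0] if dx else 1 - f[0]) *
                           (f[1] if dy else 1 - f[1]) *
                           (f[2] if dz else 1 - f[2]))
                      idx = (i1[0] if dx else i0[0],
                             i1[1] if dy else i0[1],
                             i1[2] if dz else i0[2])
                      out += w * lut[idx]
          return out

      if __name__ == "__main__":
          lut = np.random.rand(17, 17, 17, 3)           # 17^3 grid of RGB entries
          print(sample_3d_lut(lut, (0.25, 0.6, 0.9)))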
  • Each device profile also contains a media white point tag, which contains the actual measurement of the substrate (e.g. paper or film substrate).
  • the entire profile is calculated relative to this substrate, so that paper or film density from an input profile would get mapped onto paper white on any output, regardless of the paper color.
  • the human eye (and brain) is extremely adaptive, and we do not perceive paper color unless it is very dark or colorful, or it is presented to us next to a different paper color. Therefore photographic mapping is usually the preferred rendering in use today.
  • the digital camera is calibrated using manufacturer recommended practices so that we obtain uniform response in all three channels.
  • the digital cameras aperture settings and scan times are kept constant.
  • Scene illumination is provided by continuous band tungsten lamps and to maintain consistent light levels, the light output from the lamps is monitored constantly.
  • Camera calibration and light stability is also verified using a white target and checking the digital values returned from the scan.
  • a color target is measured with a spectrophotometer, producing a set of reference data. This target is subsequently scanned, and an ICC profile is computed that maps the RGB colors of the scanner to the CIE colors of the references set.
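  • A much simplified illustration of this profiling step is sketched below: given the spectrophotometer reference values and the scanner's readings for the same patches, fit a mapping from scanner RGB to CIE XYZ. A real ICC profile also carries tone curves and interpolation tables; this sketch fits only a single 3x3 matrix by least squares, and the demonstration data is simulated.

      import numpy as np

      def fit_scanner_matrix(scanner_rgb, reference_xyz):
          """scanner_rgb, reference_xyz: (num_patches, 3) arrays of patch values."""
          matrix, _, _, _ = np.linalg.lstsq(scanner_rgb, reference_xyz, rcond=None)
          return matrix.T            # so that xyz = matrix @ rgb

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          true_m = np.array([[0.41, 0.36, 0.18],
                             [0.21, 0.72, 0.07],
                             [0.02, 0.12, 0.95]])
          rgb = rng.random((24, 3))                  # simulated target patch scans
          xyz = rgb @ true_m.T                       # simulated reference data
          print(np.round(fit_scanner_matrix(rgb, xyz), 2))   # recovers true_m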
  • A typical target used by the industry as the standard for making color decisions on reflective material is the Kodak IT8 7.2 Reflective Target shown in Fig. 2, which may also be used for the calibration of the scanner. These targets have been in common use for the past 5 years, and are produced by Kodak, Agfa and Fuji, both for transparent and for reflective material. Each target is accompanied by its reference data file, expressing each color in terms of CIE values.
  • a modified target may be used.
  • the inventors have created and used a variant of the Kodak IT8 7.2 target.
  • This modified target, shown in Fig. 3, is generated by examining typical frames from several scenes of the animation, and selecting certain predominant or frequently used colors from these scenes which are not found on the standard target. Panels or squares of these colors are added to the chart of the other colors. More specifically, referring to Figs. 6A, 6B, each of the scanners and/or cameras is calibrated and profiled as follows.
  • step 60 the scanner black point and white point are calibrated in accordance with the specifications of the manufacturer.
  • step 62 one or more light sources having a uniform illumination and color are set.
  • step 64 the exposure time, scan speed (if any; obviously video and movie cameras do not have this parameter) and F-stops are set to obtain a maximum dynamic range over a designated target.
  • The target may be either the Kodak IT8 7.2 target of Fig. 2, or the modified target of Fig. 3 if one has already been developed as described below.
  • step 66 a user is given the choice of using 16 or 8 bit data. If 8 bit data is selected then in step 68, the appropriate correction curve, one that preserves precision in the low order bits, is applied and calibration is complete. If 16 bit data is preferred then the calibration step is complete. (Step 70). Next in step 72 the white field variation is determined. If this variation is smaller than a preselected tolerance, then in step 74 a check is performed to determine if there is an existing profile for the subject scanner. If a scanner profile has been previously calculated, or otherwise obtained (for example from a manufacturer) then the profiling process is complete (step 76).
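  • A small sketch of the white field check in step 72 is given below: scan a uniform white target and compare the spread of values against a tolerance before deciding whether an existing scanner profile can be reused. The tolerance value and the simulated scan are hypothetical.

      import numpy as np

      def white_field_ok(white_scan, tolerance=0.02):
          """white_scan: array of normalized values from a scanned white target."""
          spread = float(white_scan.max() - white_scan.min())
          return spread <= tolerance, spread

      if __name__ == "__main__":
          scan = 0.96 + 0.005 * np.random.rand(512, 512)   # simulated white field
          ok, spread = white_field_ok(scan)
          print(ok, round(spread, 4))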
  • step 78 color swatches used in the subject feature presentation or film are obtained from the manufacturer.
  • step 80 the color swatches are sorted using the standard lighting conditions of step 62.
  • step 82 a determination is made as to whether the color swatches include any metameric colors. If metameric colors are present then in step 84, spectral data for these colors is obtained using a spectrophotometer, and these colors are added to the customized color target of Fig. 3.
  • The color target of Fig. 3 is generated by adding the metameric colors to a standard target.
  • Metameric color pairs are two colors that exhibit metamerism. Metamerism is the phenomenon wherein two color samples appear to match when viewed under certain lighting (illuminant) conditions, but not under others.
  • the target of Fig. 3 may be generated by painting the patches on an appropriate media, or alternatively by using a high quality digital printer.
  • each of the color patches on the target is analyzed under controlled lighting using a spectrophotometer.
  • spectral data is obtained and used to generate a CIEXYZ and CIELAB color space for the target patches (step 88).
  • The target is scanned with the subject scanner under controlled light (the same conditions that were used to calibrate the scanner). The data thus obtained is used to generate the transformation matrix to the scanner color space.
  • interpolation tables for colors not found on the target are also developed (step 92), thereby completing the calculation of the scanner profile (step 94). This profile is then stored for use during the color management process as described below.
  • Profiles may be generated in a similar manner for all digital input devices.
  • A preferred monitor used in the present system is the BARCO Model 120T reference calibrator monitor made by BARCO Display Systems of Kennesaw, Georgia 30144. These monitors are deployed in all critical stations, i.e. the Color Models, Final Check and Film Print departments.
  • Each monitor is equipped with a microprocessor that measures the instability of each gun and feeds back correction data to the video amplifier. This dedicated microprocessor continuously counteracts instabilities in the picture tube and maintains correct black levels over time.
  • The unit also controls other visual parameters, including color gain, geometry and timings. This permits long term color stability and a stable contrast ratio, even at low intensities.
  • the BARCO Optisense measurement head is an optical instrument. It is meant for the calibration and color measurement on a BARCO monitor.
  • The Optisense is essentially a photometer, which when combined with the on-board microprocessor allows "color" calibration. Calibration for each individual monitor is carried out at the factory, during production, and stored in the on-board memory. In this way each monitor carries its individual signature. Recalibration with the photometer ensures that the monitor is maintained in this state.
  • the photometer is placed over the central portion of the screen, and the onboard software presents a series of color swatches of differing intensity and color. Color values are measured in CIE XYZ terms and offsets are fed back to the microprocessor controlling each of the electron guns.
  • the device profile may be generated.
  • Monitor profiles are generated by displaying a comprehensive range of colors on the monitor while measuring the CIE values for those colors with a spectral device. The relationship between the colors and the corresponding CIE data is then stored as the monitor profile. More particularly, referring to Fig. 7 A, first, in step 302, the subject monitors are set up in a windowless room in which lighting is controllable. In step 304 the geometric properties of the monitors are calibrated to eliminate distortions. In step 306 the white point of the monitors is set to a preselected color temperature standard. In step 308, a photometer such as Optisense is used to calibrate the cross gray and color ramps of the monitor.
  • step 310 the overall system gamma or gain is adjusted by varying the hardware lookup table.
  • step 312 a check is performed to determine if a profile exists for the subject monitors. If it does then the profiling procedure is complete. If not, then in step 316 a target is produced on the monitors digitally.
  • the target includes a plurality of patches of different color similar to the targets of Figs. 2 or 3.
  • each of the patches on the monitor is analyzed using a photometer to generate spectral data. This spectral data is then used to generate the values defining the monitor color space. From these values, CIEXYZ or CIELAB values are derived for the patches in the device independent color space. In step 320 interpolation tables are generated for colors not found on the target.
  • step 322 the output profile of the monitor is generated. Since the monitor may be used as either an input or output device, the generated profile is replicated so that it can serve as both an input and an output profile.
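  • A deliberately simplified stand-in for such a monitor profile is sketched below: a per-channel gamma plus a matrix built from the measured XYZ of the full red, green and blue patches. The patent stores measured patch data and interpolation tables rather than a parametric model, and the numbers here are invented for illustration.

      import numpy as np

      MEASURED_PRIMARIES_XYZ = np.array([   # columns: XYZ of full R, G, B patches
          [41.2, 35.8, 18.0],
          [21.3, 71.5,  7.2],
          [ 1.9, 11.9, 95.0],
      ])

      def monitor_rgb_to_xyz(rgb, gamma=2.4):
          # linearize the drive values, then weight the measured primaries
          linear = np.power(np.asarray(rgb, dtype=float), gamma)
          return MEASURED_PRIMARIES_XYZ @ linear

      if __name__ == "__main__":
          # full white should approximate the measured white point
          print(monitor_rgb_to_xyz((1.0, 1.0, 1.0)))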
  • the first step in calibrating the Digital Film Printer requires the film characteristic curve or D Log E plot.
  • A sensitometric 21-step density wedge is available for films from their respective manufacturers. These wedges are normally defined in 1/2 stop increments of exposure and illustrate the dynamic range of the chosen film stock.
  • the LAD Laboratory Aim Density
  • the LAD point is determined as a point of reference. This point LAD represents a neutral gray of about 16 % reflection in a normally exposed scene.
  • the LAD point is typically found around step 11 or 12.
  • Diffuse white, at 90% reflectance, represents five times the exposure of the LAD, or about 2-1/2 f-stops, and this can be found at or around step 17.
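  • As a quick check of those figures (assuming, as the text implies, that exposure scales directly with scene reflectance):

      \frac{90\%}{16\%} \approx 5.6 \approx 5 \quad\text{(about five times the LAD exposure)}, \qquad
      \log_2\!\left(\frac{0.90}{0.16}\right) \approx 2.5 \ \text{f-stops}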
  • Celco film recorders made by CELCO of Rancho Cucamonga, California 91730
  • an extra 1/2 stop of dynamic range was obtainable without inducing fogging.
  • a profile for the recorder itself may be generated.
  • Characterizing film recorders represents a more difficult task than standard color printers because of the small size of the output media and the material used. Since the output is negative film, the processing of the film becomes part of the profiling itself. Controlling the processing is therefore critical for the success of the profiling.
  • a CRT based film recorder produces flare, which makes it difficult to image lots of small color patches on the 35mm area as the neighborhood of each color patch starts to have an effect on other patches. This problem was solved by measuring large patches on subsequent frames. These patches were also added to the modified target shown in Fig. 3.
  • step 402 a sensitometric strip is obtained for the subject negative film.
  • step 404 using a densitometer such as X-Rite 310, the dynamic range of the exposed negative film is measured and the maximum and minimum density levels Dmax and Dmin are established.
  • step 406 using these limits a strip of film is shot on the digital recorder to record a linear gray scale between said limits.
  • step 408 the film strip with the gray scale is processed and the negative gray scale is analyzed.
  • step 410 the difference between the measured and the targeted densities are analyzed. If this difference is below a preselected tolerance level, then the calibration is complete (step 412). Otherwise, in step 414 the digital-to-analog conversion constants and other parameters are adjusted and linearized and the gray scale is reshot.
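  • The adjust-and-reshoot loop of steps 406-414 can be pictured with the toy simulation below. The film "response" model, the single gain constant being adjusted, and the tolerance are all invented for illustration; the real procedure adjusts the recorder's digital-to-analog conversion constants against measured densities.

      # Toy simulation of the calibrate/reshoot loop: expose a gray scale,
      # measure how far the densities fall from their targets, adjust a gain
      # constant, and repeat until the worst error is inside tolerance.
      def shoot_gray_scale(codes, gain):
          # stand-in for exposing and measuring a strip: density grows with code
          return [gain * (c / 255.0) * 2.0 for c in codes]

      def calibrate_recorder(target, codes, gain=0.8, tolerance=0.02, max_iter=50):
          for i in range(max_iter):
              measured = shoot_gray_scale(codes, gain)
              worst = max(abs(m - t) for m, t in zip(measured, target))
              if worst <= tolerance:
                  return gain, i
              # nudge the gain toward the target mid-scale density
              gain *= target[len(target) // 2] / measured[len(measured) // 2]
          return gain, max_iter

      if __name__ == "__main__":
          codes = list(range(0, 256, 16))
          target = [c / 255.0 * 2.0 for c in codes]   # desired linear densities
          print(calibrate_recorder(target, codes))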
  • step 416 a check is performed to determine if a profile corresponding to the parameter Dmax is available. If not, then a respective profile is determined as follows.
  • A special target is generated. This target may be obtained by modifying, for example, some existing targets to emphasize patches in the dark and light areas.
  • a film profiling target is shown in Fig. 9. Each patch of the target is printed on the subject film as individual frames, in step 422.
  • the exposed film is then processed (step 424) and in step 426 the exposed frames are analyzed first to determine if the gamma of the negative film is acceptable. If it is not, then in steps 428 and 430 negative strips are generated using different baths and the strips are analyzed and the strip with the least deviation from a mean density is selected.
  • step 432 the positive film is analyzed using a special transmissive spectral analyzer such as a Colortron II mated with a Spotlight Light Table.
  • Colortron II is available from X-Rite/California Tech. Center, San Rafael, California 94903.
  • The color space for the film recorder is developed, as well as the transformation parameters to transfer data between the device independent color space and the recorder color space.
  • interpolation tables are developed for colors not found on the target of Fig. 9.
  • The system for generating animated features in accordance with this invention is shown diagrammatically in Fig. 1.
  • the color management system 10 is used to generate a film strip on which the image of a tree 12 and a standing figure 14 are superimposed.
  • The image of the tree 12 is drawn, painted or otherwise created on artwork supporting media such as a sheet of cardboard 16.
  • the image is recorded by a scanner or digital camera 18 to generate an image 20 of tree 12 in the scanner color space.
  • a microprocessor 22 is used to transform the image 12 from the scanner color space to a device independent color space 24. ( In the following description several different microprocessors are shown and identified as separate elements, it being understood that in some instances a single microprocessor may perform the function of several of these microprocessors).
  • Color space 24 is preferably a CIE space such as CIELab.
  • Once image 12 is converted to the device independent color space 24, it can be transformed to any other color space specific to any device.
  • a microprocessor 28 is used to transform the image of the tree 12 in the color space 24 to a monitor color space.
  • Once the image 12 is displayed by monitor 26, it can be compared visually to the original art work on media 16. If the images are very similar, the algorithms used in the system 10 are appropriate.
  • the image 14 is also created in a well known manner, scanned and transformed to the color space 24. To show image 14 on monitor 26, its representation in color space 24 is transformed by microprocessor 30. If required, the images 12 and 14 in the monitor color space are combined into a single composite image by means well known in the art and then displayed on the monitor 26.
  • the combined image in the monitor color space is transformed to the color space 24 by a microprocessor 32.
  • The images 12, 14, or their composite can be transmitted and manipulated by other devices by appropriate transformation thereof from the color space 24 to the appropriate color space.
  • the required image is converted into a film recorder color space using microprocessor 38.
  • The image in the film recorder data space is provided to a digital recorder station 40 where said data is used to generate a film strip.
  • the film strip is sent to a lab 42 which processes it to form a negative film strip 44.
  • the negative film strip 44 is used to generate a positive film strip 46.
  • Since the computer monitor 26 is the device on which all approvals are carried out (as described in more detail below), all the images are transformed into and maintained in monitor color space. In this manner the production computer network or the host computer is not tasked with additional computation every time an image is brought up on the monitor 26. Even though a monitor 26 has a reduced color gamut when compared with either the scanner color space or the film color space, transforming of images to either of these spaces is performed without truncation, permitting easier integration into the production flow.
  • digital camera 18 uses a trilinear CCD scanning array attached to a 4x5 camera.
  • The trilinear CCD scanning array is available, for example, from Dicomed, Inc. of Burnsville, MN 55337.
  • Monitor 26 is preferably the BARCO Model 120T, especially in color critical areas. Less expensive monitors may be used in other less critical areas.
  • The digital film recorder station 40 includes a Celco film recorder available from CELCO. This recorder is used to expose Eastman Kodak Ektachrome color negative, stock# 5245. Alternatively, in some instances, a black and white negative film, Kodak stock# 5231, is used. The instruments used to calibrate each of the devices were a photometer for monitor calibration, an X-Rite densitometer for film recorder calibration and a chroma meter for scanner luminance calibration.
  • spectrophotometers were used so that we may determine CIEXYZ and CIELab values.
  • step 100 an artwork is generated on board 16.
  • step 102 the art work is viewed in a controlled environment.
  • Such an environment may consist of an enclosed windowless room having a plurality of lights individually calibrated to a color temperature of 5400 °K.
  • the room has a designated viewing area selected so that incident light on the artwork is diffused and uniform and has an illuminance on a flat white matte surface that is equivalent to the illuminance of the monitor.
  • The director determines if the art work is acceptable both esthetically and in coloration. If the artwork is not acceptable, in step 106 the art work is corrected and steps 102-104 are repeated until the director is satisfied.
  • step 108 the art work is sent to the scanner 18.
  • step 110 the scanner equipment is checked to determine if the scanner and equipment related to it are calibrated. If not, then in step 112 the scanner 18 is recalibrated as discussed above and shown in Figs. 6A-6C.
  • step 114 the art work is scanned, producing digital data representative of the image of the art work in the scanner color space. This data is sent to microprocessor 22.
  • step 116 the microprocessor 22 processes this data to convert the image from the scanner color space to the device independent color space (CIE) by applying the scanner's device profile. If a valid profile does not exist, a device profile is generated, as described in more detail above. This data is then converted again by the microprocessor 28 to generate data corresponding to said image in the monitor color space. This transformation of image data is illustrated in Fig. 4D.
  • step 118 a check is performed to determine if monitor 26 is calibrated. If it is not, then in step 120 the monitor is calibrated and its monitor device profile is applied. If a valid device profile does not exist, a device profile is generated, as described below. The monitor is then moved to a room having the controlled illumination described above. In step 122 animation and/or other special effects are generated in the usual manner. In step 124 the art work is combined with the images generated in step 122. In step 126, additional effects are generated and/or the images created thus far are corrected. In step 128 all the image elements are combined again. In step 130 the new and/or corrected images are viewed again by the director under controlled illumination conditions.
  • step 132 the director decides whether he is satisfied with the final product. If not, then in step 134 the images are reworked. During any of the steps 118-132 the director may request prints. The data corresponding to the current images is transformed to the printer color space and the images are printed on proofing device 36. If in step 132 the director is satisfied, then the data corresponding to the final images is sent to the film recording station 40 in step 136. In step 138 the film recording equipment is checked to ensure that it has been calibrated. If not, then in step 142 the film recording equipment is calibrated. The animation is then ready to be printed on film. The data is now transformed from the monitor color space to the device independent color space and then from the device independent color space to the recorder color space in step 144, as illustrated in Fig. 4E.
  • step 146 the data is recorded on film and the film is sent to the processing lab 42.
  • step 148 the gamma of the processed negative film is checked. If this parameter is within a preset tolerance then the negative film is used to generate a corresponding positive film. If the negative film is not satisfactory then the negative film is reshot.
  • step 152 the positive film is shot from the negative film and its positive film density gamma is checked. If it is not satisfactory then the positive film is reshot in step 154.
  • step 156 the positive film is sent to an editing room where it is edited and then a complete animation feature is generated using standard SMPTE procedures.
  • Standard live action motion picture production. Referring now to Figs. 5A-5E, a standard live action motion picture is produced using the subject color management system as follows.
  • step 200 the live action scenes are shot on film or video tape.
  • step 202 the director checks the live action scenes.
  • step 204 the unacceptable scenes are reshot.
  • step 206 the film is sent to a film scanner (not shown in Fig. 1).
  • step 208 the film scanner is checked for calibration. If necessary, the scanner is recalibrated and its profile recalculated in steps 208-210.
  • the film scanner then scans the film and generates corresponding digital data in the scanner color space in step 212.
  • the digital data corresponding to the film in the scanner color image space is transformed by microprocessors into corresponding data in the device independent space. This information is in turn transformed to the monitor color space as illustrated in Fig. 5D.
  • steps 216, 218 the monitor is checked, and if necessary, recalibrated and its profile is recalculated.
  • step 220 additional footage and/or special effects are created.
  • step 222 the additional material is added to the original data.
  • step 224 the new and the original material is combined.
  • step 226 the combined material is checked under controlled lighting conditions on monitor 26.
  • steps 228 and 230 the material undergoes one more check, and portions are reworked under instructions from the director.
  • step 232 the data is sent to the film recording station. Once again, the recording station is checked and, if necessary, the recorder is recalibrated and its profile is recalculated in step 234.
  • step 238 the digital data representative of images in the monitor color space is transformed first into the device independent color space CIE and then into the recorder color space, as shown in Fig. 5E.
  • the remaining steps 240-250 are similar to steps 146- 156 in Fig. 4C.
  • Verifying ΔE. B. Objectively measure the output of each device.
  • ΔE was shown above to be a universal metric of color reproduction accuracy. It is possible to compute the ΔE for a particular device by examining the quality of fit between the forward and reverse profile of a device. For the devices in the system of Fig. 1, the ΔE values obtained from this mathematical fit are shown in Table 1.
  • delta E is within the acceptable range for all the media.
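  • A minimal sketch of such a forward/reverse check is given below: round-trip sample colors through the two transforms and summarize the ΔE*ab error. The identity and noisy "profiles" are stand-ins for real profiles loaded from characterization data, and the sample ranges and noise level are invented.

      import numpy as np

      def delta_e(lab1, lab2):
          return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

      def round_trip_report(samples_lab, lab_to_device, device_to_lab):
          # ΔE between each sample and its value after a device round trip
          errors = np.array([delta_e(lab, device_to_lab(lab_to_device(lab)))
                             for lab in samples_lab])
          return {"mean": errors.mean(),
                  "rms": np.sqrt((errors ** 2).mean()),
                  "max": errors.max()}

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          samples = rng.uniform([0, -60, -60], [100, 60, 60], size=(100, 3))
          noisy = lambda lab: np.asarray(lab) + rng.normal(0, 0.3, 3)  # imperfect reverse path
          print(round_trip_report(samples, lambda lab: lab, noisy))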
  • a spectrophotometer is used to measure the output of a device using a standard target.
  • Fig 10 shows the target used.
  • This target is organized as 4 rows of 6 patches each, designated by rows and columns as shown.
  • Row 4 is the gray scale row, with A4 at full white (Digital value on monitor of 255,255,255) and dark gray at F4 (Digital value of 30,30,30 on the monitor).
  • Row 3 combines the primary and the secondary colors, i.e. BGRYMC for patches A through F, respectively.
  • Row 2 has colors that lie approximately between the primary colors.
  • Row 1 has colors that represent natural colors. As the image of the target is created on the monitor, it exists in the color monitor space.
  • a spectrophotometer was used to measure each of the patches, shown in Fig 10, on the Barco 120T monitor so that the CIELab values may be obtained.
  • the data thus obtained was transformed to the device independent color space using the monitor's output profile.
  • The data from the device independent color space was transformed again to the recorder color space using the recorder profile, and the target shown in Fig. 10 was recorded on film.
  • To eliminate color interaction within the 35mm frame each patch was shot to film as a separate frame. The negative was then developed, and a positive film was made from the negative. The positive film was measured using the spectrophotometer with the ability to measure in transmissive mode.
  • the data was also transformed to the printer color space and a color print of the target was generated on a semi-matte paper using an Iris 5030 InkJet Printer. This printer is available from Scitex.
  • the semi-matte print was scanned.
  • The output of the scanner was transformed from the scanner color space to the device independent color space, and then to the monitor color space, and the result was displayed on the monitor.
  • The tables are organized in a format similar to Fig. 10. For each of the patches, A1 through F4, the measurements obtained from the Iris semi-matte proof and the film recorder (on EK film) are shown alongside the values that were generated on the monitor (Monitor Digital).
  • The Iris print was scanned. Examination of the ΔE values between the scanned image (on the monitor) and the Iris print shows some large deviations in patches A3, B3, D2, E3. This highlights how easily one can violate one of the tenets of color management, this being that a device profile is only valid for the media and under the conditions under which the profile was created.
  • The Iris printer's pigments are not the same as the paint pigments, and hence the scanner interprets (sees) these as different colors.
  • the RMS ⁇ E for the process of transferring Iris Print to the monitor (i.e. scanned) is 19.96 and the average ⁇ E is 12.63. Although the pigments are incorrect, this test does permit us to show relative data and the comparison between starting Monitor data and the final scanned data.
  • color matches will only be as good as the instrumentation, the devices, and the consistency with which devices can reproduce color.
  • the key to good color management is correct target selection, calibration, maintenance and good process control. When so many components are being used it becomes imperative to analyze and control each component carefully. Correct calibration and maintenance of devices in the color generation and reproduction loop is perhaps the cornerstone of color management.
  • CMS color management system
  • the CMS was able to provide the artists with the creative freedom to select from a wider gamut of colors than previously available, and also have the confidence that the colors would reproduce faithfully.

Abstract

A system is described which provides consistent and repeatable color matches between reflective artwork, scanned images (12) viewed on monitor displays (26) and film print (38). The system is used in the production of motion pictures formed of feature animations, live action shots, or combinations thereof.

Description

SYSTEM FOR COLOR BALANCING ANIMATIONS AND THE LIKE
BACKGROUND OF THE INVENTION
A. FIELD OF INVENTION
This invention pertains to a system for making motion pictures, including feature animations, standard motion pictures, and combinations thereof. More particularly, the present invention pertains to a system of making a motion picture in which a plurality of devices are used to both generate and replay various scenes or portions of scenes of said motion picture digitally, wherein these scenes or portions thereof are automatically color corrected in a manner which ensures uniform color perception.
B. Description of the Prior Art
The subject invention was developed for managing color in the process of making feature animations and hence most comments and references are made to this art. However, one skilled in the art would understand that the invention is also applicable to processes for making motion pictures, television, HDTV and other digital or analog image generation, display and distribution systems.
Color decisions are an extremely important component of the feature animation production process. Before the advent of digital ink and paint systems, the colors and contrast of paints used for cels and backgrounds would usually appear very different on film as shot on an animation camera stand, with contrast and saturation greatly increased, resulting in images that exhibited more "depth". Feature animation artists and color modelists learned to compensate for this problem by producing backgrounds and cel colors that looked flat when directly viewed.
When digital ink and paint systems were brought into use for feature animation, character colors began to be chosen based on the colors represented on computer monitors. The process of getting a drawing into the computer system begins with the animator's pencil drawings being scanned using flatbed scanners or digital cameras. The contrast of the pencil lines is then increased to the point where the lines appear to be black. The Color Model department defines the character colors, the color palette, for each scene. The character colors are carefully chosen by interactively viewing the painted character drawing in the context of the scene and making adjustments to the associated color palette. Subsequently, the Director and/or Art Director are called in to approve the choice of colors. As noted earlier, all of this work and decision making is based on the colors represented on the computer monitors.
Backgrounds are still painted by hand and scanned with high resolution digital cameras. The background department paints the backgrounds for a scene prior to Color Models doing their work. The backgrounds are approved, by the Director and/or Art Director, by viewing of the actual artwork. Digital ink and paint systems allow computer generated imagery (CGI) to be easily combined with hand-drawn artwork. Most feature animation releases in recent years have made heavy use of CGI elements due to artistic and cost savings considerations. The colors for these elements are iteratively chosen on a computer monitor by making adjustments to the CGI model colors using one program and generating renders with another. As with backgrounds, this work is done prior to the scene arriving in Color Models. After selecting colors in the Color Model department, frames are Inked and Painted and then recorded onto film.
Generally, when a computer assisted prior art process or method (such as CGI) was used to produce color images, for instance for a feature animation, the resulting colors on film were different than the colors that had been seen on the computer monitors. Complicating matters even more was the fact that the background colors viewed on the monitors were different than the original artwork. Effectively, artists could not make final decisions on colors until the scene had been printed to film. The color change was different than what many had experienced using the traditional process, so they were not able to draw on that past experience to predict how a particular paint color would look when put to film. No one could predict with any degree of accuracy how colors would be changed by the various transformations imposed by the digital production process.
Similar problems occur during the creation of motion pictures, especially, now that computers are used extensively for various special effects.
One particular example where color matching is a problem is in the area of digital cameras and scanners. The digital cameras' and scanners' response to the artwork is determined by the dyes used for the filters over charge coupled devices (CCDs), the calibration tables and the gain of the amplifiers. The lights used to illuminate the artwork also dictate the color response of the scanner. Once scanned, the image is then viewed on a monitor. The color and tonal response of the monitor is affected by the phosphor set, the gamma of the cathode ray tube (CRT), the graphics card and its look-up table (LUT). One common practice for achieving a better scanner to monitor match was to apply custom LUTs to each channel of the scanned image. Another approach is to modify the hardware LUTs of the graphics card. In some cases commercially available software, such as Adobe Photoshop, is used to color correct each scan manually. Asking users to manually color correct each scan is clearly time-consuming, not very repeatable and labor intensive. The approaches that utilize custom LUTs compromise the overall color range (or color gamut) and the tonal range available from either scanner or monitor. This is of greater concern when it is observed that monitors have smaller color gamuts than most scanners and thus cannot reproduce the same kind of contrast ratio.
A second area where color matching is a problem is the display on a CRT monitor of scenes recorded on optical film. On CRT monitors a combination of closely spaced phosphors generates the desired color, while the intensity is determined by the energy in the electron beam. On film, the reproduction of color is based on the calibration of the digital film recorder, the development process, the controls in the developing and printing lab, the response of each of the emulsion layers, and the light intensity and saturation at which the print was struck.
A common practice prior to the present invention has been to try to make the monitor emulate the response of film. Production facilities accomplish this by either loading custom LUTs into the hardware LUTs of the video card or electrically overdriving the monitor, thereby enhancing the gain function. These schemes are approximate at best and do not compensate for the phosphor colors. Another drawback of attempting tonal changes or gamma curves is that compositing and rendering software packages assume that images are in a linear space and hence any divergence from this paradigm may result in image artifacts. Further, it has been shown that 8-bit tables effectively lose steps when used for gamma adjustment. Some facilities have schemes that model the printing process and allow an operator to vary the lights at which the print is made and also see the effect of different timing scenarios. Yet another approach is to develop LUTs and then apply these either sequentially or as a group to the image, either before it is brought to the film recorder or within the film recorder itself. These LUTs are iteratively generated and require regular fine-tuning.
Most production facilities often shoot out sample pieces of the scene to verify the match then apply a 'wedge' technique to achieve ideal matching. The 'wedge' or 'ring- around' process involves varying the printer lights, Red, Green and Blue by one point to obtain eight wedges (combinations) that are very slightly different from the starting point. This iteration allows the director to pick the ideal match.
In addition to having been created in an iterative fashion, requiring a fair amount of labor, the current schemes suffer from the inherent problems of LUTs, operator interaction, non-repeatability and do not handle color gamut compression issues. One of the biggest problems is that they do not address the differences in the RGB primaries (color space differences).
A variety of other schemes have been previously developed to adjust color, contrast and saturation of images as they move from one device to another. Some of these schemes are outlined below, as are the inherent drawbacks with these schemes. The practices outlined below are our understanding of the proprietary processes used by many production facilities. While numerous schemes are used, they are by and large flavors of the two outlined below.
The practice at BOSS film studio and Warner Digital was to modify, by eye, the monitor, via the hardware look-up tables (LUT) to approximate the 21 step gray ramp produced on the film stock. This was done using a light table that had a white point of 5000°K. In the production flow, images were either scanned from film, or delivered in digital format to the facility.
The camera department was also provided with a match clip of the given scene. This match clip was most often a piece of daily or one-light print from the production. While viewing the clip on the light table and the corresponding digital image on the monitor, adjustments to the digital image were manually effected using proprietary software that allowed the user to adjust RGB values, color saturation and brightness. Once a visual match was achieved, the user would save the applied changes as a software LUT which could be later applied to the entire scene. This LUT would immediately be applied to the given frame and that frame would be wedged to film. Wedging the frame simply means that the digital frame, post LUT, is output to film with several variations of RGB value, saturation, contrast and brightness. When these wedges returned from the processing lab the next morning they were placed next to the match clip on the light table and a suitable visual match was determined. Any choice made from the wedge was then applied to the original LUT inside the software. If a suitable match was not found, the wedging was repeated using greater or lesser steps. Once color correction had been carried out on a given scene, the LUT was stored in the camera department until the scene was completed and returned to camera. Upon return of a scene the LUT (chosen earlier) was applied to each frame and subsequently output to film via the film recorder.
The drawbacks to the approach by BOSS and Warner Digital are that the color gamut difference between monitor and film is not addressed in the monitor adjustment process.
There is also a large amount of manual interaction as well as multiple iterations in producing an acceptable color match on film.
Another company (Cinesite, Digital Studio of Hollywood, California 90038) employs a scheme whereby the monitor's black level is adjusted by modifying the output of the electron guns, and the overall response is modified through a combination of analog controls and hardware look-up-tables to approximate a film stock's S-curve. The monitor's white level and contrast range are then adjusted to equal those of the film stock (compared on a light table). Artistic selections are made in this non-standard state and compositing is carried out by specialized software that composites in log space as opposed to linear space.
The drawbacks of the approach by Cinesite are that even though the monitor is modified to represent film in terms of brightness and contrast, the difference in color gamut is ignored. Secondly, by operating in a non-standard fashion, the monitor life is dramatically curtailed, requiring replacement within 18 months. Lastly, by employing a proprietary compositing scheme, they are unable to take advantage of advances in compositing software.
OBJECTIVES AND SUMMARY OF THE INVENTION
In view of the abovementioned disadvantages, it is an objective of the present invention to provide a production arrangement for producing moving pictures composed of real action, animations or combinations thereof by using digital signal processing, wherein a color management scheme is used so that the color of various images is correctly, accurately and reproducibly transferred between various stages of the process.
A further objective is to provide a method and system for producing moving images by defining a common color space and converting at each intermediate step the produced images into said common color space.
Yet another objective is to provide a system for producing moving pictures which is readily incorporated into current work flow without either disturbing or severely altering the same.
A further objective is to provide a color management system for production which reduces the total amount of time required to complete a project by insuring that colors are reproduced accurately at almost every step, thereby eliminating the need for retakes.
Other objectives and advantages of the invention shall become apparent from the following description. Briefly, a production system for making moving pictures in accordance with this invention includes one or more scanners, cameras or other similar devices capable of generating electrical signals corresponding to art works and/or live action shots and sequences thereof. These electrical signals are digitized to define image data in a corresponding device dependent color space. That is, the electronic data can be used to represent images in a color space, said color space being dependent on the specific characteristics of the device generating the data.
Importantly, the electronic or digital data is then converted to a common color space, such as the CIELab or CIE XYZ color spaces. An advantage of these latter color spaces is that they are device independent. Therefore profiles can be generated characterizing each device associated with the system and the profiles can be used to transfer the data to and from the common color space as required.
To display, or otherwise reproduce these images for an intermediate purpose, such as editing, color checking and the like, for example on a monitor, or by generating proof prints, the data representing the images in the common color space is transformed to new data in the color space of the corresponding device, i.e., the monitor color space or the printer color space. Once the data has been transformed, it can be readily and accurately displayed or printed as desired.
When the editing is completed to the liking of the director, the data is transferred to the recorder color space and a recorder is used to record the images on a media which can be used to replay the images, such as film, or video tape.
Prior to the operation of the subject system, two preliminary steps are performed. Each of the devices associated with the system is calibrated to the manufacturer's specification, and then profiles are generated to characterize the respective devices so that the required transformations to and from the common color space can be accomplished. For at least some of the devices color charts are generated and used to generate the profiles. Some of the charts are custom generated for a particular feature presentation by adding color patches to the charts which are particularly characteristic of the presentation.
An important feature of the invention is that a parameter for color reproduction error, ΔE, is generated for the system that is either producing or reproducing color and the operation of the system is optimized in such a manner that reduces the value of this parameter. In addition, for display devices such as monitors, the notion of a system gamma is developed which permits the overall optimization of the operation of monitors, through adjustment of each intermediate step (or image control parameter, such as display-gamma). In this manner the system ensures that the colors are accurately reproduced and that various effects which may distort color rendition are reduced.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows a somewhat diagrammatic block diagram of a color management system constructed in accordance with this invention; Fig. 1A shows a CIE color chart with various device spaces outlined therein;
Fig. 2 shows a standard color chart that may be used for generating device profiles for the system of Fig. 1;
Fig. 3 shows a modified color chart that may be used for generating device profiles for the system of Fig. 1; Figs. 4A-4E show flow charts and associated image transformation diagrams for generating an animation feature using the system of Fig. 1; Figs. 5A-5E show flow charts and associated image transformation diagrams for generating a live action digital production using the system of Fig. 1; Figs. 6A-6B show details for calibrating and generating profiles for a scanner; Figs. 7A-7B show details for calibrating and generating a device profile for a monitor; Figs. 8A-8C show details for calibrating and generating a device profile for a film recorder;
Fig. 9 shows a color chart used to generate a profile for a digital film recorder; Fig. 10 shows a color chart digitally generated on a monitor and used to verify and confirm the operation of the system; Fig. 11 shows the variation of an error assessing parameter ΔE for the color patches of the chart of Fig. 10; and Figs. 12A-12D show tables of data obtained for the chart of Fig. 10.
DETAILED DESCRIPTION OF THE INVENTION
Theoretical Basis for the Invention
Fig. 1A shows a standard two-dimensional CIE color chart generally referred to as the CIE 1931 2° chromaticity diagram. On this diagram, the generally horseshoe-shaped color region represents the CIE spectral locus, labeled in nm. The dark curve spanning across the image is the black body locus as a function of temperature (in °K). The largest triangle shows a representative color response of a standard film used in cinematography. The next (slightly smaller) triangle shows a representative response of a typical color monitor. The smallest triangle shows the response of a typical RGB printer. The problem being addressed in this invention is "How to optimize a process of generating a sequence of images using various devices having the responses illustrated?" An important part of the subject color management system is the designation and use of an objective device independent color space, such as a space known in the art as the CIE color space. The CIE color space is promulgated by the Commission Internationale de l'Eclairage (i.e., the International Commission on Illumination) and is well defined, repeatable and reproducible. It allows the representation of any viewable and/or measurable color as a numerical triplet. Several different representations are possible, CIE XYZ and CIE Lab being the most common. CIE XYZ describes colors as tristimulus values, attempting to simulate how the human eye sees color, X being the eye's response to red, Y being the green response, and Z being the blue response. CIELab is a conical model with L being Lightness and a and b defining a hue/chroma wheel. It can also be described in polar coordinates as CIE LCH (Lightness, Chroma, Hue), which comes close to the way we think about color. In the Lab scale, defined by the CIE 1976 diagram, L (or more precisely L*) is a uniform lightness scale computed directly from luminance (Y) by the formula below:
L* = 116 (Y/Yn)^(1/3) - 16
where Yn is the CIE tristimulus value (normally expressed as luminance) for the perfect reflecting or transmitting diffuser and Y is the measured CIE tristimulus value.
CIE color space can be measured by colorimeters or computed from spectral data captured by spectrophotometers. Theoretically any instrument measuring a specific color should produce the same CIE color coordinates as long as the viewing conditions are set identically. According to this invention, a color management system and method is presented in which optimized color space conversions are performed based on two criteria.
First, the best color rendition by an output device to represent an image from an input device is determined, based on the color gamut (color gamut refers to the overall color range of a particular device) of both the input device and the output device, by selecting the mapping that best minimizes the overall color reproduction metric ΔE, discussed below.
Second, an algorithm is selected to perform the calculations required to map colors from the input space to the output space and also to compute intermediary colors to provide a perceptual match between the input and output devices. Given the different color gamuts (both in number of colors and range) of different devices, the simplistic approach to generate a system parameter for the gamut compression is to select said parameter to correspond, or be limited, to the range of colors of the most limiting device, say the monitor. However this approach is not perceived to be practical as color matches will not be obtained between devices. Another approach is described as follows. An ICC profile requires three gamut mappings to be provided: Perceptual (also called photographic), Saturation, and Relative Colorimetric. Perceptual is the most common rendering intent, especially for the photographic reproduction of images. There are no standard recipes to achieve perceptual rendering; every profiling application uses a different approach.
The most simplistic approach to color space transformation is given by an algorithmic transformation of one color space to another. This is acceptable for device independent color spaces. For example, in the CIE 1976 color space the following defines the transformation from XYZ to Lab (or L*a*b*):
L* = 116 * f(Y/Yn) - 16
a* = 500 * (f(X/Xn) - f(Y/Yn))
b* = 200 * (f(Y/Yn) - f(Z/Zn))
The index n marks the coordinates of the white point. In the above set:
f(X/Xn) = (X/Xn)^(1/3) for (X/Xn) >= 0.008856
f(X/Xn) = 7.787 * (X/Xn) + 16/116 for (X/Xn) < 0.008856
The same being true for f(Y/Yn) and f(Z/Zn).
The values Xn, Yn, Zn are the CIE tristimulus values for the perfect reflecting or transmitting diffuser. These are:
Xn = 96.422
Yn = 100.00
Zn = 82.521
It is also possible to define a color difference metric, ΔE, so that we may identify the separation between the sample and the target color.
ΔE*ab = [ (ΔL*)^2 + (Δa*)^2 + (Δb*)^2 ]^(1/2), where ΔL*, Δa*, Δb* are the deltas in the L*, a*, b* values.
The ΔE value is widely accepted as a metric of color reproduction accuracy. Generally speaking, one ΔE represents the smallest color difference the average human eye can perceive. However, current research in color science is beginning to indicate that ΔE values of up to 3 are perceived as being acceptable by most users.
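By way of illustration only, the following sketch (not part of the original disclosure) implements the XYZ to L*a*b* transformation and the ΔE metric defined above, using the perfect-diffuser white point values listed earlier; the sample XYZ triplets are placeholders, not measured data.

```python
# Minimal illustrative sketch of the CIE XYZ -> L*a*b* transformation and the
# CIE 1976 Delta-E metric. White point values are those quoted in the text;
# the example XYZ values are placeholders.

def _f(t):
    # Piecewise function used in the Lab transformation.
    return t ** (1.0 / 3.0) if t >= 0.008856 else 7.787 * t + 16.0 / 116.0

def xyz_to_lab(X, Y, Z, Xn=96.422, Yn=100.0, Zn=82.521):
    fx, fy, fz = _f(X / Xn), _f(Y / Yn), _f(Z / Zn)
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def delta_e(lab1, lab2):
    # Color difference between two L*a*b* triplets.
    return sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)) ** 0.5

target = xyz_to_lab(41.24, 21.26, 1.93)   # aim color of a patch
sample = xyz_to_lab(43.00, 22.00, 2.10)   # measured color of the same patch
print(delta_e(target, sample))            # difference in Delta-E units
```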
The next step is to use some device-specific information, but again in a purely formulaic manner. Here we use both data from the image and data from the device profile. Utilizing the individual colorant information relative to each of the XYZ tristimulus values, a linearization of the tone reproduction, or gamma, curves can be developed. We can represent this as follows:
lr = redg[R]
lg = greeng[G]
lb = blueg[B]
where lr, lg and lb are the linear components, redg, greeng and blueg are the red, green and blue channel gamma curves and R, G and B are the red, green and blue components of the input pixel.
The calculations can then be made as follows:
[X]   [Rx Gx Bx]   [lr]
[Y] = [Ry Gy By] * [lg]
[Z]   [Rz Gz Bz]   [lb]
where Rx, Gx, Bx (and correspondingly Ry through Bz) are the Red, Green and Blue colorants expressed as XYZ tristimulus values, and X, Y, Z are the Device Independent Space (CIE) tristimulus values.
The three tone reproduction (gamma) curves linearize the raw values with respect to luminance (Y) and the 3x3 matrix converts these linearized values into XYZ values for the CIEXYZ space.
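A compact numerical sketch of this forward device model is given below; it is illustrative only and not part of the original disclosure. The per-channel gamma values and the colorant matrix entries are placeholder assumptions standing in for the measured data a real profile would contain.

```python
import numpy as np

# Sketch of the forward device model: per-channel gamma curves linearize the
# raw R, G, B values, and a 3x3 colorant matrix maps the linear components
# (lr, lg, lb) to CIE XYZ. Gamma values and matrix entries are illustrative.

GAMMA = (2.2, 2.2, 2.2)              # redg, greeng, blueg modeled as power laws
COLORANT_MATRIX = np.array([          # rows: [Rx Gx Bx], [Ry Gy By], [Rz Gz Bz]
    [41.24, 35.76, 18.05],
    [21.26, 71.52,  7.22],
    [ 1.93, 11.92, 95.05],
])

def device_rgb_to_xyz(rgb, gamma=GAMMA, matrix=COLORANT_MATRIX):
    rgb = np.asarray(rgb, dtype=float) / 255.0
    linear = np.array([c ** g for c, g in zip(rgb, gamma)])   # lr, lg, lb
    return matrix @ linear                                     # X, Y, Z

print(device_rgb_to_xyz([200, 128, 64]))
```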
In moving an image between devices, i.e., across profiles, it is also necessary to take into account the different mapping strategies, or rendering intents, as explained above.
The ICC does not mandate a specific operating system or architecture, but rather provides a framework which is ideally suited for a Color Management System (CMS). The ICC also placed the responsibility of computing the color transformations on the operating system. However, the operating system presently in use does not support a CMS. In addition, in the production environment, there was a need for a batch processor. For that purpose, a batch processor and a color engine were developed. This color engine carries out the translations from one device space to the device independent space, i.e., the CIE space, and then performs the necessary conversions to translate from the CIE space to the output device's color space. Each frame is recalculated from either scanner space to monitor space or monitor space to film recorder space in 16 bits per color plane. This ensures smooth gradations and color fidelity.
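The batch processing flow can be summarized in a short sketch. The profile objects below are hypothetical stand-ins with to_cie()/from_cie() callables rather than actual ICC profile readers; the sketch only illustrates the two-step conversion carried out on each frame at 16 bits per color plane and is not the color engine itself.

```python
import numpy as np

# Illustrative sketch of the batch color engine flow: each frame is converted
# from the source device space to the device independent (CIE) space with the
# input profile, then from CIE space to the destination device space with the
# output profile, keeping 16 bits per color plane throughout.

def convert_frame(frame_u16, input_profile, output_profile):
    cie = input_profile.to_cie(frame_u16.astype(np.float64) / 65535.0)
    out = output_profile.from_cie(cie)
    return np.clip(np.rint(out * 65535.0), 0, 65535).astype(np.uint16)

def convert_sequence(frames, input_profile, output_profile):
    # Batch processor: the same two-step conversion is applied to every frame.
    return [convert_frame(f, input_profile, output_profile) for f in frames]
```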
Preliminary steps - determination of device profiles
Before the subject system can be implemented some preliminary steps must be performed.
In order to make use of the concepts of the present invention, each device of the system should be calibrated in accordance with the manufacturer's instructions and checked on a regular basis to verify that it is still in calibration. Therefore, the first preliminary step is to calibrate each device associated with the system.
The International Color Consortium (ICC) was established in 1993 by eight industry vendors, and now consists of ca. 60 companies, for the purpose of creating, promoting and encouraging a vendor-neutral cross-platform color management architecture. The ICC, while not mandating a specific operating system or architecture, provided a framework to implement a Color Management System (CMS). It requires that devices be individually characterized, and then allows a color engine (like ColorSync made by Apple Computer, Inc.) to perform the conversion of image data from one color space to another. The ICC also developed the ICC profile standard that serves as a cross-platform device profile format used to characterize color devices.
An ICC profile or, as referred to herein, a device profile is a mathematical expression for transforming device specific coordinates into and out of a device independent space, most commonly CIE Lab. This standard enables color management vendors to produce profile creation software and allows system level color management to work seamlessly with ICC profiles across applications and platforms.
In a second preliminary step of the present invention, the colorimetry of a set of colors from some imaging media or display is measured for each device to determine its characteristic profile. This can be done in a variety of ways, and is dictated to a large extent by the device being characterized.
For example, to build a scanner profile, a reference image is scanned and compared with a data file that indicates what the scanned values should be. The profile making application then compares the scanned image with the data file that indicates what the data should be in CIE space. Building a profile for a digital film recorder and film inverts the process. Here a set of patches evenly distributed in the output color space are generated and printed. These patches are then measured to provide colorimetric data for the respective profile.
While developing scanner profiles, a standard target is compared with the reference data available for the target. For film recorder profiles and CMY/CMYK printer profiles, profiling software is preferably used, such as, for example, ColorBlind® made by Color Solutions, Inc. of Cardiff-by-the-Sea, California 92007.
The target patch values are selected to derive the most information at places where precision is critical. This means that small steps are selected for highlights and shadows, and larger steps are selected in between. From the measured data points, over one million colors are filled in between through spline interpolation. This process ensures precise prediction of where in device independent color space additional printed patches would have fallen. This collection of data points now represents a very accurate characterization of the printable colors. In order to create an ICC printer profile the software creates two comprehensive tables: one four-dimensional table which describes a CIE value (in our case CIELab) for every possible CMY(K)/RGB (i.e., either CMY, CMYK or RGB, depending on the device) combination, and a three-dimensional table which maps every possible CIELab value to a CMY(K)/RGB equivalent. Since large portions of this table are likely to be outside of the printer's color gamut, gamut mapping needs to be invoked to come up with a reasonable approximation for this scenario.
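At run time a table of the second kind can be sampled by interpolating between its grid nodes. The sketch below is an illustration under stated assumptions rather than the profiling software's actual algorithm: it shows a trilinear lookup through a regular L*a*b*-indexed grid of device values, with an arbitrary grid size and axis ranges.

```python
import numpy as np

# Illustrative trilinear lookup through a 3-D table indexed by (L*, a*, b*),
# where each grid node stores a device RGB (or CMY) triplet. Grid size and
# value ranges below are assumptions, not values from the text.

L_RANGE, AB_RANGE, GRID = (0.0, 100.0), (-128.0, 128.0), 33

def lookup_lab_to_device(lab, lut):
    # lut has shape (GRID, GRID, GRID, 3); axis order is (L, a, b).
    lo = np.array([L_RANGE[0], AB_RANGE[0], AB_RANGE[0]])
    hi = np.array([L_RANGE[1], AB_RANGE[1], AB_RANGE[1]])
    pos = (np.asarray(lab, dtype=float) - lo) / (hi - lo) * (GRID - 1)
    i0 = np.clip(np.floor(pos).astype(int), 0, GRID - 2)
    t = pos - i0
    out = np.zeros(3)
    for corner in range(8):                       # blend the 8 surrounding nodes
        offs = np.array([(corner >> k) & 1 for k in range(3)])
        weight = np.prod(np.where(offs == 1, t, 1.0 - t))
        out += weight * lut[tuple(i0 + offs)]
    return out
```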
Each device profile also contains a media white point tag, which contains the actual measurement of the substrate (e.g. paper or film substrate). The entire profile is calculated relative to this substrate, so that paper or film density from an input profile would get mapped onto paper white on any output, regardless of the paper color. The human eye (and brain) is extremely adaptive, and we do not perceive paper color unless it is very dark or colorful, or it is presented to us next to a different paper color. Therefore photographic mapping is usually the preferred rendering in use today.
Mathematically, creating scanner and monitor profiles represents a subset of creating printer profiles. Different interpolation techniques are sometimes used depending on the number of colors available.
Scanner Calibration and Profiling
The digital camera is calibrated using manufacturer recommended practices so that we obtain uniform response in all three channels. The digital camera's aperture settings and scan times are kept constant. Scene illumination is provided by continuous-band tungsten lamps and, to maintain consistent light levels, the light output from the lamps is monitored constantly. Camera calibration and light stability are also verified using a white target and checking the digital values returned from the scan.
For scanner characterization a color target is measured with a spectrophotometer, producing a set of reference data. This target is subsequently scanned, and an ICC profile is computed that maps the RGB colors of the scanner to the CIE colors of the reference set. A typical target used by the industry as the standard for making color decisions on reflective material is the Kodak IT8 7.2 Reflective Target shown in Fig. 2, which may also be used for the calibration of the scanner. These targets have been in common use for the past 5 years, and are produced by Kodak, Agfa and Fuji, both for transparent and for reflective material. Each target is accompanied by its reference data file, expressing each color in terms of CIE values. However the inventors have found that using this target does not always lead to optimized results because in some animations certain colors or shades may be used which are not translated correctly. Therefore, if necessary, a modified target may be used. For example, the inventors have created and used a variant of the Kodak IT8 7.2 target. This modified target, shown in Fig. 3, is generated by examining typical frames from several scenes of the animation, and selecting certain predominant or frequently used colors from these scenes which are not found on the standard target. Panels or squares of these colors are added to the chart of the other colors. More specifically, referring to Figs. 6A, 6B, each of the scanners and/or cameras
(such as video or movie cameras) is calibrated as follows. First, in step 60, the scanner black point and white point are calibrated in accordance with the specifications of the manufacturer. Next, in step 62 one or more light sources having a uniform illumination and color are set. Using these lights, in step 64, the exposure time, scan speed (if any; video and movie cameras obviously do not have this parameter) and F-stops are set to obtain a maximum dynamic range over a designated target. For example the target may be either the Kodak IT8 7.2 target of Fig. 2, or the modified target of Fig. 3 if it has already been developed as described below.
In step 66 a user is given the choice of using 16 or 8 bit data. If 8 bit data is selected then in step 68, the appropriate correction curve, one that preserves precision in the low order bits, is applied and calibration is complete. If 16 bit data is preferred then the calibration step is complete. (Step 70). Next in step 72 the white field variation is determined. If this variation is smaller than a preselected tolerance, then in step 74 a check is performed to determine if there is an existing profile for the subject scanner. If a scanner profile has been previously calculated, or otherwise obtained (for example from a manufacturer) then the profiling process is complete (step 76).
If the white field variation is too large or if no standard profile exists yet then in step 78 color swatches used in the subject feature presentation or film are obtained from the manufacturer. In step 80 the color swatches are sorted using the standard lighting conditions of step 62. In step 82 a determination is made as to whether the color swatches include any metameric colors. If metameric colors are present then in step 84, spectral data for these colors are obtained using a spectrophotometer, and these colors are added to the customized color target of Fig. 3.
In step 86 the color target of Fig. 3 is generated by adding the metameric colors to a standard target. Metameric color pairs are two colors that exhibit metamerism. Metamerism is the phenomenon wherein two color samples appear to match when viewed under certain lighting (illuminant) conditions, but not under others. The target of Fig. 3 may be generated by painting the patches on an appropriate media, or alternatively by using a high quality digital printer.
After the target has been painted, each of the color patches on the target is analyzed under controlled lighting using a spectrophotometer. In this manner, spectral data is obtained and used to generate CIEXYZ and CIELAB color space values for the target patches (step 88). Next in step 90 the target is scanned with the subject scanner under controlled light (the same conditions that were used to calibrate the scanner). The data thus obtained is used to generate the transformation matrix to the scanner color space. As part of this step, interpolation tables for colors not found on the target are also developed (step 92), thereby completing the calculation of the scanner profile (step 94). This profile is then stored for use during the color management process as described below.
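One simple way to derive the transformation of step 90 is a least-squares fit of a matrix that maps the scanned patch values onto the measured CIE values. The sketch below is only an illustration of that idea under simplifying assumptions; commercial profiling software uses richer models with non-linear terms and the interpolation tables mentioned above.

```python
import numpy as np

# Illustrative least-squares fit of a 3x3 matrix mapping (linearized) scanner
# RGB patch values to the XYZ values measured with the spectrophotometer.
# Both arrays have shape (n_patches, 3), one row per patch of the target.

def fit_scanner_matrix(scanned_rgb, measured_xyz):
    M, _, _, _ = np.linalg.lstsq(scanned_rgb, measured_xyz, rcond=None)
    return M.T                                  # so that xyz = matrix @ rgb

def scanner_rgb_to_xyz(rgb, matrix):
    return matrix @ np.asarray(rgb, dtype=float)
```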
Profiles may be generated in a similar manner for all digital input devices.
Monitor Setup, Calibration and Profiling
A preferred monitor used in the present system is the Barco monitor made under the Model number 120T reference calibrator by BARCO Display Systems of Kennesaw, Georgia 30144. These monitors are deployed in all of the critical stations, i.e. the Color Models, Final Check and Film Print departments. Each monitor is equipped with a microprocessor that measures the instability of each gun and feeds back correction data to the video amplifier. This dedicated microprocessor continuously counteracts instabilities in the picture tube and maintains correct black levels over time. The unit also controls other visual parameters, including color gain, geometry and timings. This permits long term color stability and a stable contrast ratio, even at low intensities. Utilizing Automatic Kinescope Biasing techniques, Barco is able to guarantee low light level stability. Use of the BARCO Optisense permits high level stability. The BARCO Optisense measurement head is an optical instrument meant for calibration and color measurement on a BARCO monitor. The Optisense is essentially a photometer which, when combined with the on-board microprocessor, allows "color" calibration. Calibration for each individual monitor is carried out at the factory, during production, and stored in the on-board memory. In this way each monitor carries its individual signature. Recalibration with the photometer ensures that the monitor is maintained in this state. For recalibration, the photometer is placed over the central portion of the screen, and the onboard software presents a series of color swatches of differing intensity and color. Color values are measured in CIE XYZ terms and offsets are fed back to the microprocessor controlling each of the electron guns.
After calibration, the device profile may be generated. Monitor profiles are generated by displaying a comprehensive range of colors on the monitor while measuring the CIE values for those colors with a spectral device. The relationship between the colors and the corresponding CIE data is then stored as the monitor profile. More particularly, referring to Fig. 7A, first, in step 302, the subject monitors are set up in a windowless room in which lighting is controllable. In step 304 the geometric properties of the monitors are calibrated to eliminate distortions. In step 306 the white point of the monitors is set to a preselected color temperature standard. In step 308, a photometer such as Optisense is used to calibrate the cross gray and color ramps of the monitor.
In step 310 the overall system gamma or gain is adjusted by varying the hardware lookup table.
This completes the calibration of the monitors. In step 312 a check is performed to determine if a profile exists for the subject monitors. If it does then the profiling procedure is complete. If not, then in step 316 a target is produced on the monitors digitally. Preferably the target includes a plurality of patches of different color similar to the targets of Figs. 2 or 3.
In step 318 each of the patches on the monitor is analyzed using a photometer to generate spectral data. This spectral data is then used to generate the values defining the monitor color space. From these values, CIEXYZ or CIELAB values are derived for the patches in the device independent color space. In step 320 interpolation tables are generated for colors not found on the target.
In step 322 the output profile of the monitor is generated. Since the monitor may be used as either an input or output device, the generated profile is replicated (mathematically inverted) to serve as both an input profile and an output profile. The resulting target is analyzed with the photometer and the results are used to generate an input profile for the monitors.
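For a simple matrix/gamma model of the monitor, the mathematical inversion mentioned above amounts to inverting the 3x3 matrix and the gamma curve, as the following sketch illustrates. The gamma value and matrix entries are placeholders, not measured monitor data, and the sketch is not part of the original disclosure.

```python
import numpy as np

# Illustrative inversion of a matrix/gamma monitor profile: the output profile
# (RGB -> XYZ) is a per-channel gamma followed by a 3x3 matrix, so the input
# profile (XYZ -> RGB) is the inverse matrix followed by the inverse gamma.

GAMMA = 2.2
RGB_TO_XYZ = np.array([
    [41.24, 35.76, 18.05],
    [21.26, 71.52,  7.22],
    [ 1.93, 11.92, 95.05],
])
XYZ_TO_RGB = np.linalg.inv(RGB_TO_XYZ)

def monitor_output_profile(rgb01):
    return RGB_TO_XYZ @ (np.asarray(rgb01, dtype=float) ** GAMMA)

def monitor_input_profile(xyz):
    linear = np.clip(XYZ_TO_RGB @ np.asarray(xyz, dtype=float), 0.0, None)
    return linear ** (1.0 / GAMMA)
```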
Digital Film Printer Calibration and Profiling
The first step in calibrating the digital film printer requires the film characteristic curve or D Log E plot. A sensitometric 21-step density wedge is available for films from their respective manufacturers. These wedges are normally defined in 1/2 stop increments of exposure and illustrate the dynamic range of the chosen film stock. On the 21-step density wedge that characterizes the film, the LAD (Laboratory Aim Density) point is determined as a point of reference. This LAD point represents a neutral gray of about 16% reflection in a normally exposed scene. On the 21-step wedge that characterizes the film, the LAD point is typically found around step 11 or 12. Diffuse white, at 90% reflectance (film white), represents five times the exposure of the LAD, or about 2-1/2 f-stops, and this can be found at or around step 17. In practice, however, it was found that using Celco film recorders (made by CELCO of Rancho Cucamonga, California 91730), an extra 1/2 stop of dynamic range was obtainable without inducing fogging.
Once the maximum and minimum densities are obtained, generally different for Red, Green and Blue, the behavior of the film is linearized. In continuous tone pictures it is important that the relationship between the input and the output be linear, implying that when a numerical intensity value is scaled, the physical intensity of the light imaging onto film is scaled by the same amount. This linearizing/calibration stage is achieved by developing look-up-tables for each of the Red, Green and Blue channels. To develop this linearization, it is important to select the correct encoding scheme that will adequately avoid visible quantizing. It is known that it is necessary to employ a scheme such that the 'Just Noticeable Difference' (JND) is not violated. This JND pertains to the limit of human perception, which is approximately 1% of the luminance level. Utilizing a 12 bit film recording scheme, the inventors were able to apply a gamma 1.5 encoding scheme to select 21 pixel values over which the pixel value to AIM density plot is linearized. Schemes utilizing 42 pixel values were also tried but they did not provide noticeably improved performance. More particularly, a typical iterative process is used, wherein a LUT is generated for the first go-around and the 21 gray scale values are generated. The resulting densities are developed on a negative film and sensed using an Xrite-310 densitometer, available from X-Rite, Inc. of Grandville, MI 49418. A spline curve fitting algorithm is utilized to adjust the 12 bit values. Given a tolerance of +/- 0.02 density units, at least one more iteration is required to bring the measured values to the required values.
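The linearization loop can be pictured with a short sketch. It is an illustration under stated assumptions only: the 21 code values are spaced with a gamma-1.5 rule across the 12-bit range (one reading of the encoding described above), and plain interpolation stands in for the spline fitting actually used.

```python
import numpy as np

# Illustrative film recorder linearization: 21 gray-scale code values spaced by
# a gamma-1.5 rule over the 12-bit range are shot to film, the densities read
# back with a densitometer, and corrected codes are picked so the measured
# densities land on the aim ramp. The data passed in would be real measurements.

CODES = np.rint(4095 * (np.linspace(0.0, 1.0, 21) ** 1.5)).astype(int)

def corrected_codes(measured_density, aim_density):
    # Interpolate the inverse response (density -> code) at the aim densities.
    measured = np.asarray(measured_density, dtype=float)
    order = np.argsort(measured)
    return np.interp(aim_density, measured[order], CODES[order])

def within_tolerance(measured_density, aim_density, tol=0.02):
    # Iterate the LUT generation until the +/- 0.02 density tolerance is met.
    diff = np.asarray(measured_density, dtype=float) - np.asarray(aim_density, dtype=float)
    return bool(np.all(np.abs(diff) <= tol))
```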
Once film recorder calibration of the film stock is complete, a profile for the recorder itself may be generated. Characterizing film recorders represents a more difficult task than standard color printers because of the small size of the output media and the material used. Since the output is negative film, the processing of the film becomes part of the profiling itself. Controlling the processing is therefore critical for the success of the profiling. A CRT based film recorder produces flare, which makes it difficult to image lots of small color patches on the 35mm area as the neighborhood of each color patch starts to have an effect on other patches. This problem was solved by measuring large patches on subsequent frames. These patches were also added to the modified target shown in Fig. 3.
More specifically, referring to Figs. 8A-8C, in step 402, a sensitometric strip is obtained for the subject negative film. In step 404, using a densitometer such as the X-Rite 310, the dynamic range of the exposed negative film is measured and the maximum and minimum density levels Dmax and Dmin are established. In step 406, using these limits a strip of film is shot on the digital recorder to record a linear gray scale between said limits.
In step 408 the film strip with the gray scale is processed and the negative gray scale is analyzed. In step 410 the difference between the measured and the targeted densities is analyzed. If this difference is below a preselected tolerance level, then the calibration is complete (step 412). Otherwise, in step 414 the digital-to-analog conversion constants and other parameters are adjusted and linearized and the gray scale is reshot.
Once calibration is completed, in step 416 a check is performed to determine if a profile corresponding to the parameter Dmax is available. If not, then a respective profile is determined as follows.
In step 418 a special target is generated. This target may be obtained by modifying for example some existing targets by emphasizing patches with dark and light areas. A film profiling target is shown in Fig. 9. Each patch of the target is printed on the subject film as individual frames, in step 422. The exposed film is then processed (step 424) and in step 426 the exposed frames are analyzed first to determine if the gamma of the negative film is acceptable. If it is not, then in steps 428 and 430 negative strips are generated using different baths and the strips are analyzed and the strip with the least deviation from a mean density is selected.
In step 432 the positive film is analyzed using a special transmissive spectral analyzer such as a Colortron II mated with a Spotlight Light Table. Colortron II is available from X-Rite/California Tech. Center, San Rafael, California 94903. Using this data, the color space for the film recorder is developed, as well as the transformation parameters to transfer data between the device independent color space and the recorder color space. In step 434 interpolation tables are developed for colors not found on the target of Fig. 9.
General Description of the Preferred Embodiment of a System for Making a Feature Presentation
The system for generating animated features in accordance with this invention is shown diagrammatically in Fig. 1. In this Figure, the color management system 10 is used to generate a film strip on which the image of a tree 12 and a standing figure 14 are superimposed. First, the image of the tree 12 is drawn, painted or otherwise created on an art work supporting media such as a sheet of cardboard 16. The image is recorded by a scanner or digital camera 18 to generate an image 20 of tree 12 in the scanner color space. A microprocessor 22 is used to transform the image 12 from the scanner color space to a device independent color space 24. (In the following description several different microprocessors are shown and identified as separate elements, it being understood that in some instances a single microprocessor may perform the function of several of these microprocessors.)
Color space 24 is preferably a CIE space such as CIELab.
Once image 12 is converted to the device independent color space 24, it can be transformed to any other color space specific to any device. For example, to display the image 12 on a monitor 26, a microprocessor 28 is used to transform the image of the tree 12 in the color space 24 to a monitor color space. Once the image 12 is displayed by monitor 26, it can be compared visually to the original art work on media 16. If the images are very similar, the algorithms used in the system 10 are appropriate. The image 14 is also created in a well known manner, scanned and transformed to the color space 24. To show image 14 on monitor 26, its representation in color space 24 is transformed by microprocessor 30. If required, the images 12 and 14 in the monitor color space are combined into a single composite image by means well known in the art and then displayed on the monitor 26. The combined image in the monitor color space is transformed to the color space 24 by a microprocessor 32. Similarly, the images 12, 14, or their composite can be transmitted and manipulated by other devices by appropriate transformation thereof from the color space 24 to the appropriate color space. For example, in the art of making feature animations, it is desirable to have color prints for proofing. This can be accomplished by the present color management system by converting the required image from color space 24 to the printer color space, for example by using a microprocessor 34. Once this conversion is completed, the image is printed by printer 36.
To generate a film, the required image is converted into a film recorder color space using microprocessor 38. The image in the film recorder data space is provided to a digital recorder station 40 where said data is used to generate film strip. The film strip is sent to a lab 42 which processes it to form a negative film strip 44. The negative film strip 44 is used to generate a positive film strip 46.
Preferably, as the computer monitor 26 is the device under which all approvals are carried out (as described in more detail below), all the images are transformed into and maintained in monitor color space. In this manner the production computer network or the host computer is not tasked with additional computation every time an image is brought up on the monitor 26. Even though a monitor 26 has a reduced color gamut when compared with either the scanner color space or the film color space, transforming images to either of these spaces is performed without truncation, permitting easier integration into the production flow.
Preferably, digital camera 18 has a trilinear CCD scanning array attached to a 4x5 camera. Such cameras are available, for example, from Dicomed, Inc. of Burnsville, MN 55337. Monitor 26 is preferably a BARCO Model 120T, especially in color critical areas. Less expensive monitors may be used in other, less critical areas.
The digital film recorder station 40 includes a Celco film recorder available from CELCO. This recorder is used to expose Eastman Kodak Ektachrome color negative, stock# 5245. Alternatively, in some instances, a Black and White negative film, Kodak stock# 5231, is used. The instruments used to calibrate each of the devices were a photometer for the monitor calibration, an Xrite densitometer for film recorder calibration and a chroma meter for scanner luminance calibration. For device profiling, spectrophotometers were used so that CIEXYZ and CIELab values could be determined.
Production of a Feature Animation
A more detailed description of the steps required to generate a feature animation using the novel color management system is now provided in conjunction with the flow charts of Figs. 4A-4C. In step 100 an artwork is generated on board 16. In step 102 the art work is viewed in a controlled environment. Such an environment may consist of an enclosed windowless room having a plurality of lights individually calibrated to a color temperature of 5400 °K. The room has a designated viewing area selected so that incident light on the artwork is diffused and uniform and has an illuminance on a flat white matte surface that is equivalent to the illuminance of the monitor. In step 104, the director determines if the art work is acceptable both esthetically and in coloration. If the artwork is not acceptable, in step 106 the art work is corrected and steps 102-104 are repeated until the director is satisfied.
In step 108 the art work is sent to the scanner 18. In step 110 the scanner equipment is checked to determine if the scanner and equipment related to it are calibrated. If not, then in step 112 the scanner 18 is recalibrated as discussed above and shown in Figs. 6A-6C.
In step 114 the art work is scanned and the resulting digital data is representative of the image of the art work in the scanner color space. This data is sent to microprocessor 22. In step 116 the microprocessor 22 processes this data to convert the image from the scanner color space to the device independent color space (CIE); the scanner device profile is applied and, if a valid profile does not exist, a device profile is generated as described above. This data is then converted again by the microprocessor 28 to generate data corresponding to said image in the monitor color space. This transformation of image data is illustrated in Fig. 4D.
In step 118 a check is performed to determine if monitor 26 is calibrated. If it is not, then in step 120 the monitor is calibrated and its monitor device profile is applied. If a valid device profile does not exist, a device profile is generated, as described below. The monitor is then moved to a room having the controlled illumination described above. In step 122 animation and/or other special effects are generated in the usual manner. In step 124 the art work is combined with the images generated in step 122. In step 126, additional effects are generated and/or the images created thus far are corrected. In step 128 all the image elements are combined again. In step 130 the new and/or corrected images are viewed again by the director under controlled illumination conditions.
In step 132, the director decides whether he is satisfied with the final product. If not, then in step 134 the images are reworked. During any of the steps 118-132 the director may request prints. The data corresponding to the current images is transformed to the printer color space and the images are printed on proofing device 36. If in step 132 the director is satisfied, then the data corresponding to the final images is sent to the film recording station 40 in step 136. In step 138 the film recording equipment is checked to insure that it has been calibrated. If not, then in step 142 the film recording equipment is recalibrated and its profile is applied. The animation is then ready to be printed on film. The data is now transformed from the monitor color space to the device independent color space and then from the device independent color space to the recorder color space in step 144, as illustrated in Fig. 4E.
In step 146 the data is recorded on film and the film is sent to the processing lab 42. In step 148 the gamma of the processed negative film is checked. If this parameter is within a preset tolerance then the negative film is used to generate a corresponding positive film. If the negative film is not satisfactory then the negative film is reshot.
In step 152 the positive film is shot from the negative film and its positive film density gamma is checked. If it is not satisfactory then the positive film is reshot in step 154.
In step 156 the positive film is sent to an editing room where it is edited and then a complete animation feature is generated using standard SMPTE procedures.
Standard live action motion picture production
Referring now to Figs. 5A-5E, a standard, live action motion picture is produced using the subject color management system as follows. In step 200 the live action scenes are shot on film or video tape. In step 202 the director checks the live action scenes. In step 204 the unacceptable scenes are reshot.
In step 206 the film is sent to a film scanner (not shown in Fig. 1). In step 208 the film scanner is checked for calibration. If necessary, the scanner is recalibrated and its profile recalculated in steps 208-210.
The film scanner then scans the film and generates corresponding digital data in the scanner color space in step 212. In step 214 the digital data corresponding to the film in the scanner color image space is transformed by microprocessors into corresponding data in the device independent space. This information is in turn transformed to the monitor color space as illustrated in Fig. 5D.
In steps 216 and 218, the monitor is checked and, if necessary, recalibrated and its profile is recalculated.
In step 220 additional footage and/or special effects are created.
In step 222 the additional material is added to the original data. In step 224 the new and the original material is combined. In step 226 the combined material is checked under controlled lighting conditions on monitor 26. In steps 228 and 230 the material undergoes one more check, and portions are reworked under instructions from the director.
In step 232, the data is sent to the film recording station. Once again, the recording station is checked and, if necessary, the recorder is recalibrated and its profile is recalculated in step 234.
In step 238 the digital data representative of images in the monitor color space is transformed first into the device independent color space CIE and then into the recorder color space, as shown in Fig. 5E. The remaining steps 240-250 are similar to steps 146-156 in Fig. 4C.
Confirmation of the Procedure
There are numerous ways to evaluate the performance of the system that has been described above. These can be broken down into three areas:
A. Verifying ΔE
B. Objectively measuring the output of each device
C. Visual check
A. ΔE Results
As described above, ΔE serves as a universal metric of color reproduction accuracy. It is possible to compute the ΔE for a particular device by examining the quality of fit between the forward and reverse profile of a device. For the devices in the system of Fig. 1, the ΔE values obtained from this mathematical fit are shown in Table 1.
[Table 1: ΔE values obtained from the forward/reverse profile fit for each device of the system of Fig. 1]
As can be seen from the above, ΔE is within the acceptable range for all the media.
B. Objective Measurements
The premises and concepts have been further verified using objective measurements. In this verification process, a spectrophotometer is used to measure the output of a device using a standard target. Fig. 10 shows the target used. This target is organized as 4 rows of 6 patches each, designated by rows and columns as shown. Row 4 is the gray scale row, with A4 at full white (digital value on the monitor of 255,255,255) and dark gray at F4 (digital value of 30,30,30 on the monitor). Row 3 combines the primary and the secondary colors, i.e. BGRYMC for patches A through F, respectively. Row 2 has colors that lie approximately between the primary colors. Lastly, Row 1 has colors that represent natural colors. As the image of the target is created on the monitor, it exists in the monitor color space. A spectrophotometer was used to measure each of the patches shown in Fig. 10 on the Barco 120T monitor so that the CIELab values may be obtained. The data thus obtained was transformed to the device independent color space using the monitor's output profile. The data from the device independent color space was transformed again to the recorder color space using the recorder profile, and the target shown in Fig. 10 was recorded on film. To eliminate color interaction within the 35mm frame, each patch was shot to film as a separate frame. The negative was then developed, and a positive film was made from the negative. The positive film was measured using the spectrophotometer with the ability to measure in transmissive mode.
In addition, from the device independent color space, the data was also transformed to the printer color space and a color print of the target was generated on a semi-matte paper using an Iris 5030 InkJet Printer. This printer is available from Scitex.
Finally, to close the loop, the semi-matte print was scanned. The output of the scanner was transformed from the scanner color space to the device independent color space, and then to the monitor color space, and the result was displayed on the monitor.
A summary of results is shown in Tables 2, 3, 4 and 5, shown in Figs. 12A-12D, respectively.
The tables are organized in a format similar to Fig. 10. For each of the patches, A1 through F4, the measurements obtained from the Iris Semi-matte proof and the film recorder (on EK Film) are shown alongside the values that were generated on the Monitor (Monitor Digital).
The results show that going from Monitor to Film, we are able to reproduce the colors very well, with a lone outlier for patch A2 (ΔE of 6.91). The RMS value of the ΔE for the Monitor to EK Film match can be computed to be 3.68. Given the variability in the film developing and the one-light printing process, this is an extremely close match. The average ΔE is computed as 3.55.
The results for the color match between the Monitor and the Iris Semi-matte proof show that we have a very large ΔE for patches A3 and B3, 59.91 and 20.24, respectively. These two patches are the Blue and Green patches, highlighting the reduced gamut of the Iris printer. To allow for this gamut compression, within the perceptual rendering intent, the color engine has adjusted the relative match of all the colors in such a way that even if there is not a very good absolute match, perceptually the Iris print and the monitor appear to match under identical viewing conditions. The RMS value of the ΔE for Monitor to Iris Print on Semi-matte paper can be computed to be 13.9, while the average is 7.75.
As stated earlier, the Iris print was scanned. Examination of the ΔE values between the scanned image (on the Monitor) and the Iris Print shows some large deviations in patches A3, B3, D2 and E3. This highlights how easily one can violate one of the tenets of color management, namely that a device profile is only valid for the media and under the conditions for which the profile was created. The Iris printer's pigments are not the same as the paint pigments, and hence the scanner interprets (sees) these as different colors. The RMS ΔE for the process of transferring the Iris Print to the monitor (i.e. scanned) is 19.96 and the average ΔE is 12.63. Although the pigments are incorrect, this test does permit us to show relative data and the comparison between the starting Monitor data and the final scanned data.
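The RMS and average ΔE figures quoted in this section are straightforward to compute from the per-patch differences, as the following sketch shows; the numbers in the example list are placeholders rather than the values reported in Tables 2-5.

```python
import numpy as np

# Illustrative computation of the RMS and average Delta-E over a set of patches.

def rms_delta_e(delta_es):
    d = np.asarray(delta_es, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def avg_delta_e(delta_es):
    return float(np.mean(np.asarray(delta_es, dtype=float)))

patch_delta_es = [1.2, 2.8, 3.4, 6.9, 2.1, 1.7]   # one placeholder value per patch
print(rms_delta_e(patch_delta_es), avg_delta_e(patch_delta_es))
```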
In practice, with the painted pigments, we are able to achieve a very close match between scanner and monitor. The current RMS ΔE value is 3.75 with an average ΔE of 2.2.
C. Visual Verification
In spite of all the quantitative data that we can present on the quality of the profile and the color engine, it is imperative that the color match, or lack thereof, is verified visually. It is, however, important to view the test objects under the same viewing conditions. To this end, to verify the scanner to monitor match, special viewing booths were designed and installed with D54 (5400 K, color correct) lamps. In addition, the light output and uniformity were adjusted so that they corresponded with the monitor. For verifying the monitor to film color match, a light table was similarly modified. While the results from these tests cannot be quantified, the comparison indicated a close relationship between the two sets of images, confirming the validity of the approach disclosed.
As device manufacturers improve their devices, providing larger color gamuts and wider dynamic ranges, it is expected that they will also include device profiles. This will eliminate the need for users to profile each device. In our case we found the device profile provided by the Barco monitor was better than that generated using our instruments. We found that the low light level response of the chosen spectrophotometers was not as repeatable and stable as we would have liked, causing large variations in the dark regions of images. For this reason, we chose to utilize the individual signature or CCP supplied by the manufacturer. Of course, we had to select the average CCP from our entire set of monitors, but still the results proved acceptable. For film, tighter process control at the film laboratory is the most important factor in improving the monitor to film match. Further, ICC specifications do not currently provide for film negative profiling. This will be an important addition, since it will allow us to separate the one-light print's variability from the negative film's developing stability. Proper viewing conditions are also an important, yet often overlooked, aspect of color management evaluation. Viewing booths, light tables and ambient light conditions should be properly set up and checked regularly before color match verification is carried out.
It is, however, important to emphasize that color management is not infallible. As shown, it is important to maintain the same media and the same conditions under which the profile was generated. Further, as the results show, there is a ΔE between matches, and given the peculiarities of human vision, under some conditions, for some people, some mismatches will be more apparent than others.
In addition, color matches will only be as good as the instrumentation, the devices, and the consistency with which devices can reproduce color. The key to good color management is correct target selection, calibration, maintenance and good process control. When so many components are being used it becomes imperative to analyze and control each component carefully. Correct calibration and maintenance of devices in the color generation and reproduction loop is perhaps the cornerstone of color management.
Over the course of the QUEST FOR CAMELOT production the color management system (CMS) described above was implemented and integrated into the production flow, and it continues to be used in current productions. Savings in time, material and labor were the result. Prior to color management, the production had budgeted two hours for manual color correction on each background. After color management was implemented, this had dropped to almost zero. Camera retakes, due to incorrect color match, were also eliminated.
Perhaps equally important, the CMS was able to provide the artists with the creative freedom to select from a wider gamut of colors than previously available, and also have the confidence that the colors would reproduce faithfully.
Obviously numerous modifications may be made to the invention without departing from its scope as defined in the appended claims.

Claims

We claim:
1. A system for producing a sequence of color images from several art works and/or real live action shots, comprising: a digital scanner generating digital scanned data from said artworks and/or said live action shots, said digital scanned data corresponding to said artworks and/or said live action shots in a scanner color space; a first processor receiving said scanned digital data and converting said digital scanned data into an intermediate digital data defining images in an intermediate color space; a monitor receiving said intermediate digital data and displaying, in response, digital images to allow a director to review and edit said digital images; a second processor generating output digital data in a recorder color space from said intermediate data; and a recorder generating recorded images on a preselected media based on said output digital data.
2. The system of claim 1 wherein said first processor transforms said scanned digital data to intermediate data defining images in a device independent color space, said system further comprising a third processor converting said intermediate data into monitor data defining images in a monitor color space.
3. The system of claim 1 further comprising a third processor converting said intermediate data to printer data defining images in a printer color space, and a printer for printing images on a paper from said printer data.
4. The system of claim 1 wherein said recorder records said images on a transparent film.
5. The system of claim 1 wherein said recorder records said images on a video tape.
6. In a moving picture production for making a film composed of animations and/or live action shots by using scanners and/or cameras for generating scanner data, monitors for displaying images resulting from said scanner data, and recorders for recording on film or video tape said film, a color management system comprising: a first microprocessor which converts said scanner data from a scanner color space to a device independent color space in accordance with a scanner profile; and a second microprocessor which converts data from said device independent color space to another color space associated with one of said monitors and said recorder in accordance with a corresponding profile; said profiles and said device independent color space being selected to minimize color distortion.
7. The system of claim 1 further comprising an enclosed room with predetermined illumination characteristics, said monitors being disposed in said room.
8. A method of producing a film comprising the steps of: generating digital data corresponding to at least one of a plurality of art works and a plurality of live action shots in a scanner color space; converting said digital data to a device independent color space; converting said digital data from said device independent color space to a monitor color space; displaying said digital data in said monitor color space on a corresponding monitor; converting said digital data in device independent color space to a recorder color space; and recording images on a recording medium, said images being defined by said digital data in said recorder color space.
9. The method of claim 8 further comprising generating profiles for converting digital data to and from said device independent color space.
10. The method of claim 9 further comprising defining a target formed of several color patches, and using said target for generating said profiles.
11. The method of claim 9 further comprising generating a scanner profile for converting said digital data from said scanner color space to said device independent color space.
12. The method of claim 11 wherein the step of generating said scanner profile includes defining a target formed of several color patches, said color patches being selected based on some of the common colors of said art works and/or said live action shots, scanning said target to obtain test data, and analyzing said test data to determine the optical characteristics of said scanner.
13. The method of claim 11 further comprising calibrating said scanner before scanning said target.
14. The method of claim 8 further comprising converting said digital data from said device independent color space to a printer color space, and using said converted data to print images on a printer.
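
The conversion chain recited in claims 1 and 8 (scanner color space to a device independent color space, and from there to a monitor or recorder color space via per-device profiles) can be illustrated with a minimal sketch. The choice of CIE XYZ as the device independent space, the 3x3 matrices, the gamma values, and the sample pixel below are illustrative assumptions for the purpose of the example and are not values taken from the specification.

    # Minimal sketch of the claimed conversion chain: scanner RGB -> device
    # independent color space (CIE XYZ assumed here) -> monitor RGB.
    # The matrices, gamma values, and sample pixel are illustrative only.
    import numpy as np

    # Hypothetical scanner profile: linearized scanner RGB -> XYZ.
    SCANNER_TO_XYZ = np.array([
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ])

    # Hypothetical monitor profile: XYZ -> linear monitor RGB, followed by a display gamma.
    XYZ_TO_MONITOR = np.linalg.inv(SCANNER_TO_XYZ)
    MONITOR_GAMMA = 2.2

    def scanner_to_device_independent(rgb8, scanner_gamma=2.2):
        """Convert 8-bit scanner RGB to the device independent space (XYZ)."""
        linear = (rgb8 / 255.0) ** scanner_gamma      # undo scanner tone curve
        return linear @ SCANNER_TO_XYZ.T              # apply scanner profile

    def device_independent_to_monitor(xyz):
        """Convert XYZ to 8-bit monitor RGB using the monitor profile."""
        linear = np.clip(xyz @ XYZ_TO_MONITOR.T, 0.0, 1.0)
        return np.round(255.0 * linear ** (1.0 / MONITOR_GAMMA)).astype(np.uint8)

    scanned_pixel = np.array([200.0, 64.0, 32.0])     # one scanned pixel
    xyz = scanner_to_device_independent(scanned_pixel)
    print("XYZ:", xyz)
    print("monitor RGB:", device_independent_to_monitor(xyz))

In practice each scanner, monitor, and recorder would carry its own measured profile rather than the placeholder matrices used here.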
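
Claims 10 through 12 recite generating a scanner profile from a target formed of several color patches. The following sketch assumes a simple least-squares fit of a 3x3 matrix from hypothetical patch measurements; the patch values and the linear model are placeholders, not data from the patent.

    # Minimal sketch of deriving a scanner profile from a scanned target of
    # color patches: a least-squares fit of a 3x3 matrix mapping linearized
    # scanner RGB to reference XYZ. Patch values are made-up placeholders.
    import numpy as np

    # Scanner readings for each patch (linearized RGB, one row per patch).
    scanned_rgb = np.array([
        [0.80, 0.20, 0.10],
        [0.15, 0.70, 0.20],
        [0.10, 0.15, 0.75],
        [0.50, 0.50, 0.50],
    ])

    # Reference XYZ of the same patches, e.g. measured with a spectrophotometer.
    measured_xyz = np.array([
        [0.41, 0.25, 0.05],
        [0.32, 0.55, 0.18],
        [0.22, 0.15, 0.70],
        [0.48, 0.50, 0.52],
    ])

    # Solve scanned_rgb @ M ~= measured_xyz in the least-squares sense;
    # M.T is then the 3x3 scanner-to-XYZ profile matrix.
    profile, residuals, rank, _ = np.linalg.lstsq(scanned_rgb, measured_xyz, rcond=None)
    print("scanner -> XYZ profile matrix:\n", profile.T)
    print("fit residuals:", residuals)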
PCT/US1999/010517 1999-03-09 1999-06-15 System for color balancing animations and the like WO2000054213A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU18065/00A AU1806500A (en) 1999-03-09 1999-06-15 System for color balancing animations and the like

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26503799A 1999-03-09 1999-03-09
US09/265,037 1999-03-09

Publications (1)

Publication Number Publication Date
WO2000054213A1 true WO2000054213A1 (en) 2000-09-14

Family

ID=23008690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/010517 WO2000054213A1 (en) 1999-03-09 1999-06-15 System for color balancing animations and the like

Country Status (2)

Country Link
AU (1) AU1806500A (en)
WO (1) WO2000054213A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1528794A1 (en) * 2003-10-29 2005-05-04 Thomson Licensing S.A. Method for color correction of digital image data
EP1528791A1 (en) * 2003-10-29 2005-05-04 Thomson Licensing S.A. Method for colour correction of digital image data
WO2005043886A1 (en) * 2003-10-29 2005-05-12 Thomson Licensing S. A. Method and system for color correction of digital image data

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5257097A (en) * 1991-09-27 1993-10-26 Eastman Kodak Company Method and apparatus for selective interception of a graphics rendering operation for effecting image data modification
US5420979A (en) * 1989-12-22 1995-05-30 Eastman Kodak Company Method and apparatus for using composite transforms to form intermediary image data metrics which achieve device/media compatibility for subsequent imaging applications
US5666436A (en) * 1993-10-14 1997-09-09 Electronics For Imaging Method and apparatus for transforming a source image to an output image
US5687011A (en) * 1990-10-11 1997-11-11 Mowry; Craig P. System for originating film and video images simultaneously, for use in modification of video originated images toward simulating images originated on film
US5874988A (en) * 1996-07-08 1999-02-23 Da Vinci Systems, Inc. System and methods for automated color correction

Also Published As

Publication number Publication date
AU1806500A (en) 2000-09-28

Similar Documents

Publication Publication Date Title
EP1237379B1 (en) Image processing for digital cinema
US6473199B1 (en) Correcting exposure and tone scale of digital images captured by an image capture device
US6594388B1 (en) Color image reproduction of scenes with preferential color mapping and scene-dependent tone scaling
EP1139653B1 (en) Color image reproduction of scenes with preferential color mapping
US6671067B1 (en) Scanner and printer profiling system
US5809164A (en) System and method for color gamut and tone compression using an ideal mapping function
KR101680254B1 (en) Method of calibration of a target color reproduction device
US7298892B2 (en) Producing a balanced digital color image having minimal color errors
US6864915B1 (en) Method and apparatus for production of an image captured by an electronic motion camera/sensor that emulates the attributes/exposure content produced by a motion camera film system
US5721811A (en) Pre-press process and system for accurately reproducing color images
MacDonald Developments in colour management systems
US6211973B1 (en) Color transforming method
US6985253B2 (en) Processing film images for digital cinema
WO2000054213A1 (en) System for color balancing animations and the like
MXPA06004741A (en) Method and system for color correction of digital image data.
Sharma ICC color management: Architecture and implementation
Ramamurthy et al. Achieving color match between scanner, monitor, and film: a color management implementation for feature animation
WO2001078368A2 (en) Film and video bi-directional color matching system and method
MacDonald Color fidelity issues in image reproduction for print
US6882451B2 (en) Method and means for determining estimated relative exposure values from optical density values of photographic media
Kuznetsov et al. Color Management in Printing
Fdhal et al. Towards an automated soft proofing system using high dynamic range imaging and artificial neural networks
Kraushaar et al. ICC color management in the motion picture industry
JPH084321B2 (en) Method and apparatus for calibrating color value
Strgar Kurečić et al. Color Management Implementation in Digital Photography

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref country code: AU

Ref document number: 2000 18065

Kind code of ref document: A

Format of ref document f/p: F

AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642