US20080246855A1 - Image processing device, electronic camera, and image processing program - Google Patents

Image processing device, electronic camera, and image processing program

Info

Publication number
US20080246855A1
Authority
US
United States
Prior art keywords
color
space
image data
gamut
determining part
Prior art date
Legal status
Abandoned
Application number
US12/153,518
Inventor
Hideo Hoshuyama
Current Assignee
Nikon Corp
Original Assignee
Nikon Corp
Priority date
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to US12/153,518
Publication of US20080246855A1
Priority to US12/929,869 (US8520097B2)
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6058Reduction of colour to a range of reproducible colours, e.g. to ink-reproducible colour gamut
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/841Camera processing pipelines; Components thereof for processing colour signals to modify gamut

Definitions

  • the invention relates to an image processing device for converting color spaces of image data.
  • the invention also relates to an electronic camera on which the image processing device is mounted, and an image processing program.
  • image data created by a color image processing device such as an electronic camera, a digital video camera, and a scanner is initially subjected to processings including color conversion, tone processing, and contour enhancement processing.
  • the image data is then recorded on a recording medium such as a memory and a magnetic tape, or transmitted to external equipment via communication media.
  • the recorded image data is reproduced, for example, as a photograph by a developing machine, a printer, etc.
  • the transmitted image data is reproduced on a monitor as a moving image or a still image, for example.
  • the image-capturing side and the reproduction side need to process the image data by using the same standard.
  • for this purpose, various types of standards (color spaces) for expressing colors have been established; the color coordinates of the three principal colors R, G, and B differ from one standard to another.
  • FIG. 1 shows an xy chromaticity diagram showing NTSC color space and sRGB color space.
  • the horseshoe-shaped area is the range of colors that humans can perceive (hereinafter referred to as the visible region).
  • the image-capturing side can encode only colors inside the respective triangles with the coordinates of R, G, and B as the vertexes in the color space it uses.
  • the reproduction side can reproduce only colors inside the respective triangles with the coordinates of R, G, and B as the vertexes in the color space it uses.
  • the range of colors that can be thus expressed in a color space, as well as the range of color distribution of a subject shall be referred to as color gamut.
  • the ranges of colors that can be expressed in NTSC color space and sRGB color space are smaller than the visible region. This also holds for most other color spaces (including CIE RGB and Adobe RGB™).
  • when the color space determined by the color filters of an image sensor does not cover the color gamut of a subject, the colors of the subject are not reproducible accurately from the image data created by this image-capturing system. Additionally, even with an image-capturing system whose color space covers the color gamut of a subject, it is not possible to reproduce the colors of the subject with accuracy if the image data created by this image-capturing system is converted into image data rendered in a color space that does not cover the color gamut of the subject.
  • Japanese Unexamined Patent Application Publication No. 2002-109523 has proposed a method of establishing a new color space capable of expressing all colors and capturing an image in this color space.
  • This new color space differs from the known color spaces in the coordinates of the three principal colors.
  • the image data based on the new three principal colors is thus converted into image data based on known three principal colors before output to an existing image output apparatus.
  • image data yet to be compressed consists of pixels whose colors are encoded in a predetermined number of bits each (for example, 8 bits for each of the three principal colors). If encoded in a larger color space, the captured image data is therefore expected to have a greater color difference per tone. Once the image data has been encoded with such coarse tone steps, it is impossible to make the tones finer in subsequent processing. A greater color difference per tone results in unclear reproduced images and makes it difficult to process the image data.
  • An image processing device of the present invention includes a color-gamut determining part, a color-space determining part, and a color-space conversion part.
  • the color-gamut determining part determines a color gamut as a range of color distribution from input image data.
  • the color-space determining part determines a color space substantially containing the color gamut determined by the color-gamut determining part.
  • the color-space conversion part converts the input image data into image data which is rendered in the determined color space. It may be expected that the colors of the subject are accurately reproducible from the converted image data.
  • the color-space conversion part herein will sometimes be referred to as color correcting part.
  • the color-gamut determining part divides the input image data into a plurality of image regions, calculates a hue and a chroma for each of the image regions, and determines a maximum chroma for each of the hues calculated.
  • the color-space determining part selects a smallest color space from color spaces each having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated by the color-gamut determining part.
  • a color space substantially containing the color gamut corresponds to a color space having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated, for example.
  • a small color space signifies that an average of the maximum chroma determined for each of the hues is small, for example.
  • the color-gamut determining part maps the input image data onto a chromaticity diagram. Then, the color-space determining part selects a smallest color space from color spaces each containing a predetermined percentage or more of the color gamut of the input image data on the chromaticity diagram.
  • the color spaces each containing a predetermined percentage or more of the color gamut correspond to the above-mentioned color space substantially containing the color gamut. Specifically, for example, it corresponds to the color space containing the color gamut of the subject at or over a predetermined area ratio on the chromaticity diagram.
  • the small color space here refers to a color space of a small size on the chromaticity diagram, for example.
  • the color-space conversion part transmits information on the color space determined by the color-space determining part to a destination to which the converted image data is output.
  • the information on the color space refers to several bits of digital data indicating the name of the color space, for example.
  • An electronic camera of the present invention includes an image-capturing part and an image processing device.
  • the image-capturing part captures an optical image formed with a shooting lens to create image data.
  • this image-capturing part refers to a part having a release button, a CPU, a focal-plane shutter, a CCD, and a signal processing part, for example.
  • the image processing device includes a color-gamut determining part, a color-space determining part, and a color-space conversion part.
  • the color-gamut determining part determines a color gamut as a range of color distribution from image data obtained from the image-capturing part.
  • the color-space determining part determines a color space substantially containing the color gamut determined by the color-gamut determining part.
  • the color-space conversion part converts the input image data into image data which is rendered in the determined color space.
  • An image processing program of the present invention causes a computer to function as a color-gamut determining part, a color-space determining part, and a color-space conversion part.
  • the color-gamut determining part has a function of determining a color gamut as a range of color distribution from input image data.
  • the color-space determining part has a function of determining a color space substantially containing the color gamut determined by the color-gamut determining part.
  • the color-space conversion part has a function of converting the input image data into image data which is rendered in the determined color space.
  • the color-gamut determining part divides the input image data into a plurality of image regions, calculates a hue and a chroma for each of the image regions, and determines a maximum chroma for each of the calculated hues.
  • the color-space determining part selects a smallest color space from color spaces each having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated by the color-gamut determining part.
  • the color-gamut determining part maps the input image data onto a chromaticity diagram. Then, the color-space determining part selects a smallest color space from color spaces containing a predetermined percentage or more of the color gamut of the input image data on the chromaticity diagram.
  • the color-space conversion part transmits information on the color space determined by the color-space determining part to a destination to which the image data converted is output.
  • FIG. 1 is an xy chromaticity diagram showing NTSC color space and sRGB color space
  • FIG. 2 is a block diagram of an electronic camera on which an image processing device according to a first embodiment of the present invention is mounted;
  • FIG. 3 is a flowchart showing the operation of the image processing device of the first embodiment
  • FIG. 4 is an explanatory diagram showing an example of a hue calculation table to be used by the color-gamut determining part of FIG. 2 ;
  • FIG. 5 is an explanatory diagram showing an example of a chroma calculation table to be used by the color-gamut determining part of FIG. 2 ;
  • FIG. 6 shows a way of comparing the color gamut of a subject with the color gamuts of respective color spaces stored in advance by the color-space determining part of FIG. 2 ;
  • FIG. 7 is a flowchart showing the operation of the image processing device of a second embodiment
  • FIG. 8 is a block diagram of an electronic camera on which the image processing device according to a third embodiment of the present invention is mounted;
  • FIG. 9 is a flowchart showing the operation of the image processing device of the third embodiment.
  • FIGS. 10(A) , (B) are diagrams for illustrating the image processing device's processings of determining the color gamut of the subject and comparing it with the color gamuts of respective color spaces stored in advance according to the third embodiment.
  • FIG. 2 shows a first embodiment of the present invention.
  • a photographing device 10 A is made up of an electronic camera 12 A of the present invention, equipped with a shooting lens 14 and a recording medium 16 .
  • the shooting lens 14 consists of a lens group 20 and an aperture (diaphragm) 22 .
  • the electronic camera 12 A includes a release button 30 , a CPU 32 , a memory 34 , a focal-plane shutter 36 , a CCD 38 , a signal processing part 40 , a white balance adjusting part 42 , a color interpolation processing part 44 (hereinafter, to be referred to as Debayer processing part 44 because it performs Debayer processing on a Bayer array as a way of example in the present embodiment), an image processing device 50 of the present invention, a gamma correction part 52 , a contour enhancing part 54 , an image-data compressing part 56 , and a recording part 58 .
  • the CPU 32 controls each part of the electronic camera 12 A.
  • on its light receiving plane, the CCD 38 has color filters FR, FG, and FB (not shown) transmitting the three principal colors, red, green, and blue (hereinafter, abbreviated as R, G, and B), respectively. Each pixel of the CCD 38 thus converts only the intensity of a wavelength corresponding to one of R, G, and B into a stored charge.
  • the signal processing part 40 applies clamp processing, sensitivity correction processing, analog-to-digital conversion, and the like to the pixel outputs of the CCD 38 to create image data. Note that the present embodiment describes an example of the analog-to-digital conversion in which each of the R, G, and B pixel outputs is encoded in unit of 12 bits.
  • the signal processing part 40 inputs the created image data to the image processing device 50 and the white balance adjusting part 42 .
  • the white balance adjusting part 42 applies white balance processing to the image data by using gains for white balance processing to be described later as parameters.
  • the white balance adjusting part 42 inputs the processed image data to the Debayer processing part 44 .
  • the Debayer processing part 44 applies Debayer processing to the image data. This provides each pixel with 12 bits of digital data on all the three principal colors.
  • the Debayer processing part 44 inputs the Debayer-processed image data to the image processing device 50 .
  • the image processing device 50 includes an evaluation value calculation part 62 , a WB gain calculating part 64 (WB is short for white balance), a color-gamut determining part 66 , a color-space determining part 68 , and a color correcting part 70 .
  • the image processing device 50 converts the image data based on the color space of the three principal colors of the color filters on the CCD 38 into image data based on an appropriate color space, and inputs the same to the gamma correction part 52 (details will be given later).
  • the gamma correction part 52 applies gamma correction to the input image data, and then outputs the resultant to the contour enhancing part 54 .
  • the gamma correction part 52 reduces the tones of pre-converted image data in which every pixel has 12 bits for each of the three principal colors so that every pixel has 8 bits for each of the three principal colors in the processed image data.
  • the contour enhancing part 54 applies image sharpening processing to the image data, and inputs the resultant to the image-data compressing part 56 .
  • the image-data compressing part 56 applies, for example, JPEG conversion to the image data for compression.
  • the recording part 58 receives, from the image processing device 50 , color-space information indicating in what color space the image data input from the image-data compressing part 56 is rendered.
  • the recording part 58 records the image data onto the recording medium 16 along with this color-space information.
  • FIG. 3 is a flowchart showing the operation of the image processing device 50 described above.
  • FIG. 4 is an example of a hue calculation table for use in the processing of the color-gamut determining part 66 .
  • FIG. 5 is an example of a chroma calculation table for use in the processing of the color-gamut determining part 66 .
  • FIG. 6 is an explanatory diagram showing a way of comparing the color gamut of a subject with the color gamuts of respective color spaces stored in advance by the color-space determining part 68 .
  • the operation of the image processing device 50 will be described in the order of step numbers shown in FIG. 3 , with reference to FIGS. 4 to 6 . It should be appreciated that arithmetic expressions and numeric values to be seen below are given by way of example for the purpose of reference, not limitations on the present invention.
  • the CCD 38 converts light received from a subject through the shooting lens 14 into electric charges for storage.
  • the signal processing part 40 reads the stored charges from the CCD 38 to create image data.
  • the image data consists of 1000 vertical×1500 horizontal, i.e., 1.5 million pixels.
  • the signal processing part 40 inputs the created image data to the evaluation value calculation part 62. Note that this image data is not subjected to Debayer processing yet, and it consists of pixels each encoded in 12 bits for one of the three principal colors R, G, and B.
  • the evaluation value calculation part 62 divides the image data into 8 vertical×12 horizontal, i.e., 96 regions. Hereinafter, each of the divided regions will be referred to as small region. For each small region, the evaluation value calculation part 62 calculates averages Rav, Gav, and Bav of the values (expressed by digital data) that indicate the intensities of the three principal colors R, G, and B, respectively. Specifically, the average Rav is determined by averaging the digital data on all the pixels corresponding to R in a small region. The same operations are performed for G and B to calculate Gav and Bav. The evaluation value calculation part 62 transmits Rav, Gav, and Bav to the color-gamut determining part 66 and the WB gain calculating part 64. The WB gain calculating part 64 determines gains for white balance processing based on Rav, Gav, and Bav, and transmits the same to the white balance adjusting part 42.
  • for each small region, the color-gamut determining part 66 determines a representative hue and a representative chroma through the following procedure. Initially, R/G=Rav÷Gav×100 and B/G=Bav÷Gav×100 (equations (1) and (2)) are determined from Rav, Gav, and Bav calculated at step S2.
  • in the hue calculation table shown in FIG. 4, the data corresponding to the determined R/G and B/G is taken as the representative hue of the small region.
  • likewise, in the chroma calculation table shown in FIG. 5, the data corresponding to the determined R/G and B/G is taken as the representative chroma of the small region.
  • the hue calculation table mentioned above has values of 0 to 255 (8 bits) on both the ordinate (B/G) and abscissa (R/G), whereas FIG. 4 shows only some representative values. The same holds for the chroma calculation table of FIG. 5 .
  • the color-gamut determining part 66 treats any B/G or R/G value exceeding 255 as 255.
  • the color-gamut determining part 66 classifies all the small regions according to the representative hues. Next, a small region having a maximum representative chroma is determined from small regions having a same value of representative hue. The representative chroma of the determined small region shall be the maximum chroma for the representative hue of this small region. In this way, the color-gamut determining part 66 determines the maximum chroma for each of the representative hues obtained at step S 3 . The color-gamut determining part 66 transmits the maximum chromas for the respective representative hues to the color-space determining part 68 as the color gamut of the subject.
  • the color-space determining part 68 stores in advance a relationship between representative values of hue (0, 1, …, 15) and maximum chromas in several color spaces (such as CIE-RGB color space, NTSC color space, and sRGB color space).
  • FIG. 6 also shows an example of the color gamut of a subject.
  • the color-space determining part 68 compares the color gamut of each of the color spaces with that of the subject to select the smallest color space out of the color spaces that include the color gamut of the subject. Specifically, the color-space determining part 68 selects the color space having the smallest average maximum chroma from the color spaces each having a maximum chroma equal to or higher than that of the color gamut of the subject in all the representative hues.
  • FIG. 6 shows an example in which NTSC color space is selected as the optimum color space.
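  • As an illustrative sketch of this selection step in Python: the per-hue maximum chromas below are placeholders for the stored data of FIG. 6, and the fallback to the widest space when no candidate covers the subject is an assumption, not something stated in the patent.

        # Sketch of step S5: pick the smallest color space (lowest average
        # maximum chroma) that reaches the subject's maximum chroma in every hue.

        CANDIDATES = {                      # hue index -> maximum chroma (dummy)
            'sRGB':    {0: 60, 1: 55, 2: 70, 3: 50},
            'NTSC':    {0: 80, 1: 75, 2: 85, 3: 70},
            'CIE-RGB': {0: 95, 1: 90, 2: 98, 3: 92},
        }

        def select_color_space(subject_gamut, candidates=CANDIDATES):
            covering = [
                name for name, profile in candidates.items()
                if all(profile.get(h, 0) >= c for h, c in subject_gamut.items())
            ]
            if not covering:                # assumed fallback: widest space
                return max(candidates,
                           key=lambda n: sum(candidates[n].values()))
            return min(covering,
                       key=lambda n: sum(candidates[n].values()) / len(candidates[n]))

        print(select_color_space({0: 65, 2: 80}))   # -> 'NTSC' with these dummies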
  • the color-space determining part 68 transmits the information as to which color space has been selected (hereinafter, referred to as color-space information) to the color correcting part 70 and the recording part 58 .
  • the transmission is effected, for example, by setting four bits of digital data for indicating the names of the respective color spaces in advance and transmitting the digital data.
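  • A trivial sketch of such color-space information follows; the bit assignments are invented for illustration and are not given in the patent.

        # Hypothetical 4-bit codes identifying the selected color space.
        COLOR_SPACE_CODES = {'CIE-RGB': 0b0001, 'NTSC': 0b0010, 'sRGB': 0b0011}

        def encode_color_space(name):
            return COLOR_SPACE_CODES[name]            # sent to the recording part

        def decode_color_space(code):
            return {v: k for k, v in COLOR_SPACE_CODES.items()}[code]

        print(decode_color_space(encode_color_space('NTSC')))   # -> 'NTSC'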
  • the color correcting part 70 receives the image data transmitted from the Debayer processing part 44 . Incidentally, this image data is rendered in the color space determined by the three principal colors of the color filters on the CCD 38 .
  • the color correcting part 70 stores in advance therein matrix factors Ma, Mb, Mc, Md, Me, Mf, Mg, Mh, and Mi for each color space which are used for converting the transmitted image data into such image data that it is rendered in CIE-RGB color space, NTSC color space, sRGB color space, and the like. Note that the matrix factors Ma to Mi are intended not only for color-space conversion but also for color correction ascribable to the fact that neither the shooting lens 14 nor the CCD 38 has ideal spectral characteristics.
  • the color correcting part 70 selects matrix factors Ma to Mi corresponding to the color space selected at step S 5 .
  • the color correcting part 70 performs color-space conversion on the transmitted image data by using three equations, one for each of the three principal colors, collectively referred to as Equation (3); a sketch of the conversion follows the variable definitions below.
  • Rc, Gc, and Bc are pieces of digital data corresponding to the three principal colors of the image data transmitted from the Debayer processing part 44 .
  • Rm, Gm, and Bm are pieces of digital data corresponding to the three principal colors of the converted image data.
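  • The sketch below assumes the conventional row-wise reading of Equation (3), i.e. Rm=Ma·Rc+Mb·Gc+Mc·Bc, Gm=Md·Rc+Me·Gc+Mf·Bc, Bm=Mg·Rc+Mh·Gc+Mi·Bc; this reading, and the placeholder factor values, are assumptions since the equations themselves are not reproduced in this text.

        # Sketch of the color-space conversion of Equation (3) applied to one
        # pixel, with the matrix factors Ma..Mi used row-wise (assumed).

        MATRICES = {                        # placeholder factors, not the patent's
            'NTSC': (1.10, -0.05, -0.05,
                     -0.03,  1.08, -0.05,
                     -0.02, -0.06,  1.08),
        }

        def convert_pixel(Rc, Gc, Bc, space='NTSC', max_value=4095):
            Ma, Mb, Mc, Md, Me, Mf, Mg, Mh, Mi = MATRICES[space]
            Rm = Ma * Rc + Mb * Gc + Mc * Bc
            Gm = Md * Rc + Me * Gc + Mf * Bc
            Bm = Mg * Rc + Mh * Gc + Mi * Bc

            def clip(v):                    # keep results in the 12-bit range
                return max(0, min(max_value, round(v)))
            return clip(Rm), clip(Gm), clip(Bm)

        print(convert_pixel(2000, 1500, 1000))   # -> (2075, 1510, 950)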
  • the color correcting part 70 then transmits the converted image data to the gamma correction part 52 .
  • the description so far has been made on the operation of the image processing device 50 of the present embodiment.
  • the converted image data which is rendered in an appropriate color space in this way is subjected to the above-mentioned processings in the gamma correction part 52 , the contour enhancing part 54 , and the image-data compressing part 56 before recorded onto the recording medium 16 along with the color-space information.
  • the image processing device 50 of the present embodiment uses the table data shown in FIGS. 4 and 5 to determine representative hues and representative chromas in the respective small regions of the image data. Then, using the maximum chromas determined for the representative hues as an evaluation reference, the image processing device 50 determines the color gamut of the subject expressed by the image data based on the color space of the color filters on the CCD 38. Consequently, the color gamut of the subject can be obtained efficiently with fewer operations, which simplifies the configuration of the image processing device 50. Moreover, as shown in FIG. 6, whether or not the individual color spaces cover the color gamut of the subject can be determined easily by simply comparing the maximum chromas for the representative hues.
  • the smallest color space is selected from among the color spaces that cover the color gamut of the subject. More specifically, it is possible to automatically select a color space that covers the color gamut of the subject and has a minimum color difference per tone, for the image data obtained immediately after photographing and consisting of pixels whose colors are encoded in a predetermined number of bits. This holds true even if the image data is reduced in the number of bits by subsequent processings (gamma correction part 52 ).
  • the image data is converted into such image data that it is rendered in an appropriate color space selected, and thereafter it is recorded onto the recording medium 16 along with this color-space information (step S 6 ). Consequently, reproducing the image data based on the color-space information enables the colors of the captured subject to be reproduced accurately in favorable tones.
  • the user need not have expertise on color spaces for selecting a color space so that he or she can focus on taking photographs. Also, allowing the image processing device 50 to select an appropriate color space depending on the color gamut of the subject makes it possible to create better pictures. As a result, the user's usability improves greatly.
  • the evaluation value calculation part 62 calculates the averages Rav, Gav, and Bav of R, G, and B for each small region, and transmits the calculation results to the color-gamut determining part 66 and the WB gain calculating part 64 . It is therefore possible to use the calculation results of the evaluation value calculation part 62 both for the processing of determining the color gamut of the subject and for the white balance processing. This results in simplifying the configuration of the image processings of the electronic camera 12 A.
  • the present embodiment differs from the first embodiment only in that the calculations of the WB gain calculating part are also used for the processing in the color-gamut determining part (corresponding to the part shown by the broken-lined arrow in FIG. 2 ).
  • the image processing device of this embodiment shall be designated distinctively as 50b, the WB gain calculating part as 64b, and the color-gamut determining part as 66b; a separate block diagram is omitted.
  • FIG. 7 is a flowchart showing the operation of the image processing device 50 b of the present embodiment.
  • the operation of the image processing device 50 b will be described in the order of step numbers shown in FIG. 7 . It should be appreciated that arithmetic expressions and numeric values to be seen below are given by way of example for the purpose of reference, not limitations on the present invention.
  • image data is created and input to the evaluation value calculation part 62 .
  • the evaluation value calculation part 62 divides the image data into a plurality of small regions, and determines the averages Rav, Gav, and Bav of R, G, and B in each small region.
  • the evaluation value calculation part 62 transmits Rav, Gav, and Bav to the color-gamut determining part 66 b and the WB gain calculating part 64 b.
  • the WB gain calculating part 64 b determines gains Wr, Wg, and Wb for white balance processing based on Rav, Gav, and Bav, and transmits the same to the white balance adjusting part 42 and the color-gamut determining part 66 b.
  • based on the gains Wr, Wg, and Wb for white balance processing, the color-gamut determining part 66b converts Rav, Gav, and Bav into values Rav′, Gav′, and Bav′ that are adjusted in white balance.
  • This conversion method is the same as what the white balance adjusting part 42 applies to image data, being expressed by, e.g., the following three equations (collectively referred to as Equation (4)):
  • Rav′=Rav×Wr,  Gav′=Gav×Wg,  Bav′=Bav×Wb   (4)
  • the color-gamut determining part 66b then determines R/G (=Rav′÷Gav′×100) and B/G (=Bav′÷Gav′×100) in each small region, and determines a representative hue and a representative chroma in each small region by using the hue calculation table of FIG. 4 and the chroma calculation table of FIG. 5.
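  • A compact sketch of this step, under the multiplicative-gain reading of Equation (4); the gain values in the example are invented.

        # Apply the white-balance gains to the per-region averages and then
        # recompute R/G and B/G for the table lookups of FIGS. 4 and 5.

        def wb_adjusted_ratios(Rav, Gav, Bav, Wr, Wg, Wb):
            Rav_p, Gav_p, Bav_p = Rav * Wr, Gav * Wg, Bav * Wb   # Equation (4)
            r_g = min(int(Rav_p / Gav_p * 100), 255)
            b_g = min(int(Bav_p / Gav_p * 100), 255)
            return r_g, b_g

        # Example: a warm light source, so the red gain is low and the blue high.
        print(wb_adjusted_ratios(1800.0, 1500.0, 900.0, 0.8, 1.0, 1.6))   # -> (96, 96)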
  • the processing of steps S14, S15, and S16 is the same as that of steps S4, S5, and S6 of the first embodiment, respectively. Description thereof will thus be omitted.
  • the second embodiment can provide the same effects as those of the foregoing first embodiment.
  • the color-gamut determining part 66 b converts the averages Rav, Gav, and Bav of R, G, and B determined for each small region into the values Rav′, Gav′, and Bav′ that are adjusted in white balance, and then determines representative hues and representative chromas in the respective small regions. That is, the processing of the color-gamut determining part 66 b is equivalent to predicting how the image data is converted by the white balance adjusting part 42 and determining the color gamut of the subject to be expressed by the image data adjusted in white balance. As a result, it is possible to determine the color gamut of the subject more accurately regardless of the color temperature of the light source that has illuminated the subject at the time of shooting.
  • FIG. 8 shows a third embodiment of the present invention.
  • the same parts as those of the first embodiment will be designated by identical reference numbers. Description thereof will be omitted.
  • a photographing device 10 C is made up of an electronic camera 12 C of the present invention, equipped with a shooting lens 14 and a recording medium 16 .
  • the electronic camera 12 C includes the release button 30 , a CPU 32 c, the memory 34 , the focal-plane shutter 36 , the CCD 38 , a signal processing part 40 , an evaluation value calculation part 62 C, the WB gain calculating part 64 , a white balance adjusting part 42 c, the Debayer processing part 44 , an image processing device 50 c of the present invention, the gamma correction part 52 , the contour enhancing part 54 , the image-data compressing part 56 , and the recording part 58 .
  • the CPU 32 c controls each part of the electronic camera 12 C.
  • the evaluation value calculation part 62 c is identical to the evaluation value calculation part 62 of the first embodiment except that Rav, Gav, and Bav calculated for each small region are transmitted only to the WB gain calculating part 64 .
  • the white balance adjusting part 42 c is identical to the white balance adjusting part 42 of the first embodiment except that the image data adjusted in white balance is also input to the image processing device 50 c.
  • the image processing device 50 c includes a color-gamut determining part 66 c, a color-space determining part 68 c, and a color correcting part 70 .
  • the image processing device 50 c converts image data based on the color space of the three principal colors of the color filters on the CCD 38 into image data based on an appropriate color space, and inputs the same to the gamma correction part 52 .
  • FIG. 9 is a flowchart showing the operation of the image processing device 50 c described above.
  • FIGS. 10 (A), (B) are diagrams for explaining the processing of determining the color gamut of the subject and comparing it with the color gamuts of respective color spaces stored in advance by the image processing device 50 c.
  • the operation of the image processing device 50 c will be described in the order of step numbers shown in FIG. 9 , with reference to FIG. 10 .
  • the signal processing part 40 reads the stored charges from the CCD 38 to create image data, and inputs the same to the evaluation value calculation part 62 c and the white balance adjusting part 42 c.
  • the evaluation value calculation part 62 c divides the image data into a plurality of small regions, and determines the averages Rav, Gav, and Bav of R, G, and B, respectively, in each small region.
  • the WB gain calculating part 64 determines gains for white balance processing, and transmits the same to the white balance adjusting part 42 c.
  • the white balance adjusting part 42 c applies white balance processing to the image data, and then inputs the resultant to the color-gamut determining part 66 c and the Debayer processing part 44 .
  • the color-gamut determining part 66 c maps the input image data (based on the color space determined by the color filters on the CCD 38 ) onto an xy chromaticity diagram, for example. This mapping is performed in unit of pixels, and table data is created at the same time. For example, when the image data covers three pixels that show the color corresponding to an x-coordinate of 0.3 and a y-coordinate of 0.4, a row of table data is expressed as (0.3, 0.4, 3). Such table data is created on all the coordinates within the visible region.
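  • For illustration, the sketch below maps RGB triples to xy coordinates and accumulates (x, y, count) rows; the sRGB RGB-to-XYZ matrix is used only as a stand-in, since the camera would use a matrix matched to its own color filters, and the 0.01 quantization step is likewise assumed.

        # Sketch of step S32: build the (x, y, pixel count) table rows.
        from collections import Counter

        RGB_TO_XYZ = ((0.4124, 0.3576, 0.1805),      # sRGB matrix as a stand-in
                      (0.2126, 0.7152, 0.0722),
                      (0.0193, 0.1192, 0.9505))

        def xy_of(r, g, b):
            X, Y, Z = (m[0] * r + m[1] * g + m[2] * b for m in RGB_TO_XYZ)
            s = X + Y + Z
            return (0.0, 0.0) if s == 0 else (X / s, Y / s)

        def chromaticity_table(pixels, step=0.01):
            """pixels: iterable of (r, g, b) triples.  Returns rows such as
            (0.30, 0.40, 3), with coordinates quantized to `step`."""
            counts = Counter()
            for r, g, b in pixels:
                x, y = xy_of(r, g, b)
                counts[(round(x / step) * step, round(y / step) * step)] += 1
            return [(round(x, 2), round(y, 2), n) for (x, y), n in counts.items()]

        print(chromaticity_table([(400, 300, 200), (400, 300, 200), (50, 80, 200)]))
        # e.g. [(0.36, 0.37, 2), (0.23, 0.22, 1)]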
  • the color-gamut determining part 66 c divides the visible region on the xy chromaticity diagram into N regions based on MacAdam ellipse, for example.
  • each of the N divided regions will be referred to as a region of comparable colors.
  • the color-gamut determining part 66 c classifies the individual rows of table data created at step S 32 according to regions of comparable colors. From among the regions of comparable colors, the color-gamut determining part 66 c selects ones that include T or more pixels of the mapped image data.
  • the hatched area is an example of the regions of comparable colors selected here.
  • the color-gamut determining part 66 c informs the color-space determining part 68 c of which regions of comparable colors have been selected, as the color gamut of the subject.
  • the value of T mentioned above may be determined according to the value of N and the total number of pixels of the image data so that a difference between the actual color gamut of the subject and the color gamut of the subject determined by the color-gamut determining part 66 c falls to or below an acceptable value.
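  • As a sketch of this selection, a uniform grid stands in below for the MacAdam-ellipse-based division of the visible region, which is considerably more involved; the cell size and threshold are illustrative.

        # Sketch of step S33: keep the regions of comparable colors that hold
        # T or more mapped pixels.  Cells are centred on multiples of `cell`.

        def subject_gamut_regions(table_rows, cell=0.05, T=1000):
            """table_rows: (x, y, count) rows from step S32.  Returns the set of
            grid cells whose accumulated pixel count reaches T."""
            totals = {}
            for x, y, n in table_rows:
                key = (round(x / cell), round(y / cell))
                totals[key] = totals.get(key, 0) + n
            return {key for key, n in totals.items() if n >= T}

        rows = [(0.30, 0.40, 800), (0.31, 0.41, 900), (0.60, 0.33, 50)]
        print(subject_gamut_regions(rows, T=1000))   # -> {(6, 8)}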
  • the color-space determining part 68c stores in advance the ranges of distribution of several color spaces (such as NTSC color space and sRGB color space) on the xy chromaticity diagram. Then, the color-space determining part 68c selects the smallest color space out of the color spaces that cover the color gamut of the subject on the xy chromaticity diagram.
  • a small color space refers to a color area of a small size on the chromaticity diagram.
  • NTSC color space which covers the hatched color gamut of the subject is selected as the optimum color space.
  • the color-space determining part 68c transmits color-space information indicating which color space has been selected to the color correcting part 70 and the recording part 58.
  • the smallest color space is selected from among color spaces that cover the color gamut of the subject on the xy chromaticity diagram at or above a predetermined area ratio.
  • the predetermined area ratio may be set to a value which allows the ratio of the region not covered by the selected color space to the color gamut of the subject determined by the color-gamut determining part 66 c to fall to or below an acceptable value.
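  • The sketch below illustrates this selection with the published sRGB and NTSC (1953) primaries; testing coverage at region-centre points, and the 95% ratio, are simplifying assumptions.

        # Sketch of step S34: among the stored spaces, pick the one of smallest
        # area on the xy diagram that covers enough of the subject's gamut.

        PRIMARIES = {
            'sRGB': [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
            'NTSC': [(0.670, 0.330), (0.210, 0.710), (0.140, 0.080)],
        }

        def tri_area(tri):
            (x1, y1), (x2, y2), (x3, y3) = tri
            return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

        def inside(pt, tri):
            # Point-in-triangle test: all edge cross products share a sign.
            def cross(a, b, c):
                return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
            s = [cross(tri[i], tri[(i + 1) % 3], pt) for i in range(3)]
            return all(v >= 0 for v in s) or all(v <= 0 for v in s)

        def select_space(gamut_points, ratio=0.95, primaries=PRIMARIES):
            covering = [
                name for name, tri in primaries.items()
                if sum(inside(p, tri) for p in gamut_points) >= ratio * len(gamut_points)
            ]
            return min(covering, key=lambda n: tri_area(primaries[n])) if covering else None

        print(select_space([(0.35, 0.35), (0.45, 0.45), (0.25, 0.55)]))   # -> 'NTSC'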
  • the color correcting part 70 converts the image data transmitted from the Debayer processing part 44 into such image data that it is rendered in the color space selected at step S 34 .
  • the color correcting part 70 then transmits the converted image data to the gamma correction part 52 .
  • the third embodiment can provide the same effects as those of the first and second embodiments described above.
  • the foregoing first and second embodiments have dealt with the cases where the image data is divided into 8 vertical×12 horizontal, i.e., 96 regions.
  • the present invention is not limited to such embodiments. If the color gamut of the subject must be determined more precisely, the image data may be divided more finely. As a practical criterion, the image data should be divided finely enough that the difference between the actual color gamut of the subject and the color gamut determined by the image processing device 50 falls to or below an acceptable value (such as 1%).
  • the first and second embodiments have dealt with the cases where the evaluation value calculation part 62 calculates, at step S 2 (step S 12 ), the averages Rav, Gav, and Bav of the three principal colors R, G, and B for each small region.
  • the present invention is not limited to such embodiments.
  • values that occur with highest frequency may be determined from the digital data on all the pixels corresponding to R in the respective small regions.
  • the values corresponding to G and B may also be determined similarly.
  • the values occurring with highest frequency can be used in subsequent processing instead of the averages.
  • alternatively, maximum values Rmax, Gmax, and Bmax in the digital data on all the pixels corresponding to R, G, and B in the small regions, respectively, may be determined for use instead of the averages.
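  • A small sketch of these alternative evaluation values (illustrative only):

        # Most frequent value (mode) and maximum of one color's pixels in a
        # small region, usable in place of the averages Rav, Gav, and Bav.
        from collections import Counter

        def mode_and_max(values):
            return Counter(values).most_common(1)[0][0], max(values)

        print(mode_and_max([1000, 1000, 1020, 990, 1000, 4000]))   # -> (1000, 4000)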
  • the third embodiment has dealt with the case where the image data is mapped onto the xy chromaticity diagram.
  • the present invention is not limited to such an embodiment.
  • a uv chromaticity diagram may be used instead of the xy chromaticity diagram.
  • the first to third embodiments have dealt with the cases where the image processing device ( 50 , 50 b, 50 c ) performs color-space conversion on the image data before gamma processing.
  • the present invention is not limited to such embodiments.
  • the gamma correction part 52 may perform the gamma correction before the image data is input to the color correcting part 70 .
  • the first to third embodiments have dealt with the cases where one color space is selected from among a plurality of color spaces stored in advance.
  • the present invention is not limited to such embodiments. For example, it is possible to determine a triangle of a smallest size from triangles covering the color gamut of the subject on the chromaticity diagram, and establish a new color space having the vertexes of the determined triangle as the color coordinates of the three principal colors.
  • the first to third embodiments have dealt with the cases where the image sensor (CCD 38 ) has a color filter array of principal colors R, G, and B.
  • the present invention is not limited to such embodiments.
  • the present invention is also applicable to a color filter array of complementary colors, cyan, magenta, and yellow.
  • the first to third embodiments have dealt with the cases where the image processing device of the present invention is used for an electronic camera.
  • the present invention is not limited to such embodiments.
  • the image processing device of the present invention may be used for a scanner and the like.
  • steps S 1 to S 6 , steps S 11 to S 16 , or steps S 31 to S 35 described above may be coded into an image processing program.
  • the same effects as those of the first to third embodiments can be obtained if the image processing program is executed by, for example, the CPU of an electronic camera.

Abstract

An image processing device of the present invention includes a color-gamut determining part, a color-space determining part, and a color-space conversion part. The color-gamut determining part determines a color gamut as a range of color distribution from input image data. The color-space determining part determines a color space substantially covering the color gamut determined by the color-gamut determining part. The color-space conversion part converts the input image data into such image data that is rendered in the determined color space. The colors of the subject can thus be reproduced accurately from the converted image data.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application is a Continuation of application Ser. No. 10/730,057, filed Dec. 9, 2003, the disclosure of which is incorporated herein by reference in its entirety.
  • This application claims priority from Japanese Patent Application No. 2002-365476, filed on Dec. 17, 2002, the entire contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to an image processing device for converting color spaces of image data. The invention also relates to an electronic camera on which the image processing device is mounted, and an image processing program.
  • 2. Description of the Related Art
  • In general, image data created by a color image processing device such as an electronic camera, a digital video camera, and a scanner is initially subjected to processings including color conversion, tone processing, and contour enhancement processing. The image data is then recorded on a recording medium such as a memory and a magnetic tape, or transmitted to external equipment via communication media. The recorded image data is reproduced, for example, as a photograph by a developing machine, a printer, etc. The transmitted image data is reproduced on a monitor as a moving image or a still image, for example.
  • In order to reproduce the colors of the recorded or transmitted image data accurately, the image-capturing side and the reproduction side need to process the image data by using the same standard. For this purpose, various types of standards (color spaces) for expressing colors have been established. However, the color coordinates of the three principal colors (R, G, and B) differ from one standard to another.
  • FIG. 1 shows an xy chromaticity diagram showing NTSC color space and sRGB color space. Note that the horseshoe-shaped area is the range of colors that humans can perceive (hereinafter referred to as the visible region). The image-capturing side can encode only colors inside the triangle whose vertexes are the coordinates of R, G, and B of the color space it uses. Similarly, the reproduction side can reproduce only colors inside the triangle whose vertexes are the coordinates of R, G, and B of the color space it uses. In the present invention, the range of colors that can be thus expressed in a color space, as well as the range of color distribution of a subject, shall be referred to as color gamut. As is evident from FIG. 1, the ranges of colors that can be expressed in NTSC color space and sRGB color space are smaller than the visible region. This also holds for most other color spaces (including CIE RGB and Adobe RGB™).
  • When the color space determined according to the color filters of an image sensor does not cover the color gamut of a subject, the colors of the subject are not reproducible accurately from the image data created by this image-capturing system. Additionally, even with an image-capturing system whose color space covers the color gamut of a subject, it is not possible to reproduce the colors of the subject with accuracy if the image data created by this image-capturing system is converted into image data rendered in a color space that does not cover the color gamut of the subject.
  • In view of this, Japanese Unexamined Patent Application Publication No. 2002-109523 has proposed a method of establishing a new color space capable of expressing all colors and capturing an image in this color space. This new color space differs from the known color spaces in the coordinates of the three principal colors. The image data based on the new three principal colors is thus converted into image data based on known three principal colors before output to an existing image output apparatus.
  • In general, image data yet to be compressed consists of pixels whose colors are encoded in a predetermined number of bits each (for example, 8 bits for each of the three principal colors). If encoded in a larger color space, the captured image data is therefore expected to have a greater color difference per tone. Once the image data has been encoded with such coarse tone steps, it is impossible to make the tones finer in subsequent processing. A greater color difference per tone results in unclear reproduced images and makes it difficult to process the image data.
  • Besides, it is troublesome and difficult for the user to select an appropriate color space depending on the subject because he or she is required to have expertise on NTSC, sRGB, and other color spaces.
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to provide a technique for reproducing the color gamut of a subject with good chroma and tones without the necessity for the user to select a color space.
  • An image processing device of the present invention includes a color-gamut determining part, a color-space determining part, and a color-space conversion part. The color-gamut determining part determines a color gamut as a range of color distribution from input image data. The color-space determining part determines a color space substantially containing the color gamut determined by the color-gamut determining part. The color-space conversion part converts the input image data into image data which is rendered in the determined color space. It may be expected that the colors of the subject are accurately reproducible from the converted image data. Incidentally, the color-space conversion part herein will sometimes be referred to as color correcting part.
  • According to one of the aspects of the image processing device of the present invention, the color-gamut determining part divides the input image data into a plurality of image regions, calculates a hue and a chroma for each of the image regions, and determines a maximum chroma for each of the hues calculated. The color-space determining part selects a smallest color space from color spaces each having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated by the color-gamut determining part.
  • This calculation function of the color-gamut determining part will sometimes be referred to as evaluation value calculation part, and each of the divided image regions will sometimes be referred to as a small region. Moreover, in this aspect of the image processing device, the above-described “a color space substantially containing the color gamut” corresponds to a color space having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated, for example. A small color space signifies that an average of the maximum chroma determined for each of the hues is small, for example.
  • According to another aspect of the image processing device of the present invention, the color-gamut determining part maps the input image data onto a chromaticity diagram. Then, the color-space determining part selects a smallest color space from color spaces each containing a predetermined percentage or more of the color gamut of the input image data on the chromaticity diagram. Here, the color spaces each containing a predetermined percentage or more of the color gamut correspond to the above-mentioned color space substantially containing the color gamut. Specifically, for example, it corresponds to the color space containing the color gamut of the subject at or over a predetermined area ratio on the chromaticity diagram. The small color space here refers to a color space of a small size on the chromaticity diagram, for example.
  • According to another aspect of the image processing device of the present invention, the color-space conversion part transmits information on the color space determined by the color-space determining part to a destination to which the converted image data is output. Here, the information on the color space refers to several bits of digital data indicating the name of the color space, for example.
  • An electronic camera of the present invention includes an image-capturing part and an image processing device. The image-capturing part captures an optical image formed with a shooting lens to create image data. Incidentally, this image-capturing part refers to a part having a release button, a CPU, a focal-plane shutter, a CCD, and a signal processing part, for example.
  • The image processing device includes a color-gamut determining part, a color-space determining part, and a color-space conversion part. The color-gamut determining part determines a color gamut as a range of color distribution from image data obtained from the image-capturing part. The color-space determining part determines a color space substantially containing the color gamut determined by the color-gamut determining part. The color-space conversion part converts the input image data into image data which is rendered in the determined color space.
  • An image processing program of the present invention causes a computer to function as a color-gamut determining part, a color-space determining part, and a color-space conversion part. Here, the color-gamut determining part has a function of determining a color gamut as a range of color distribution from input image data. The color-space determining part has a function of determining a color space substantially containing the color gamut determined by the color-gamut determining part. The color-space conversion part has a function of converting the input image data into image data which is rendered in the determined color space.
  • According to one of the aspects of the image processing program of the present invention, the color-gamut determining part divides the input image data into a plurality of image regions, calculates a hue and a chroma for each of the image regions, and determines a maximum chroma for each of the calculated hues. The color-space determining part selects a smallest color space from color spaces each having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated by the color-gamut determining part.
  • According to another aspect of the image processing program of the present invention, the color-gamut determining part maps the input image data onto a chromaticity diagram. Then, the color-space determining part selects a smallest color space from color spaces containing a predetermined percentage or more of the color gamut of the input image data on the chromaticity diagram.
  • According to another aspect of the image processing program of the present invention, the color-space conversion part transmits information on the color space determined by the color-space determining part to a destination to which the image data converted is output.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The nature, principle, and utility of the invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings in which like parts are designated by identical reference numbers, in which
  • FIG. 1 is an xy chromaticity diagram showing NTSC color space and sRGB color space;
  • FIG. 2 is a block diagram of an electronic camera on which an image processing device according to a first embodiment of the present invention is mounted;
  • FIG. 3 is a flowchart showing the operation of the image processing device of the first embodiment;
  • FIG. 4 is an explanatory diagram showing an example of a hue calculation table to be used by the color-gamut determining part of FIG. 2;
  • FIG. 5 is an explanatory diagram showing an example of a chroma calculation table to be used by the color-gamut determining part of FIG. 2;
  • FIG. 6 shows a way of comparing the color gamut of a subject with the color gamuts of respective color spaces stored in advance by the color-space determining part of FIG. 2;
  • FIG. 7 is a flowchart showing the operation of the image processing device of a second embodiment;
  • FIG. 8 is a block diagram of an electronic camera on which the image processing device according to a third embodiment of the present invention is mounted;
  • FIG. 9 is a flowchart showing the operation of the image processing device of the third embodiment; and
  • FIGS. 10(A), (B) are diagrams for illustrating the image processing device's processings of determining the color gamut of the subject and comparing it with the color gamuts of respective color spaces stored in advance according to the third embodiment.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Hereinafter, embodiments of the present invention will be described with reference to the drawings.
  • First Embodiment
  • FIG. 2 shows a first embodiment of the present invention. In the diagram, a photographing device 10A is made up of an electronic camera 12A of the present invention, equipped with a shooting lens 14 and a recording medium 16. The shooting lens 14 consists of a lens group 20 and an aperture (diaphragm) 22.
  • The electronic camera 12A includes a release button 30, a CPU 32, a memory 34, a focal-plane shutter 36, a CCD 38, a signal processing part 40, a white balance adjusting part 42, a color interpolation processing part 44 (hereinafter, to be referred to as Debayer processing part 44 because it performs Debayer processing on a Bayer array as a way of example in the present embodiment), an image processing device 50 of the present invention, a gamma correction part 52, a contour enhancing part 54, an image-data compressing part 56, and a recording part 58.
  • The CPU 32 controls each part of the electronic camera 12A.
  • On its light receiving plane, the CCD 38 has color filters FR, FG, and FB (not shown) transmitting the three principal colors, red, green, and blue (hereinafter, abbreviated as R, G, and B), respectively. Each pixel of the CCD 38 thus converts only the intensity of a wavelength corresponding to one of R, G, and B into a stored charge.
  • The signal processing part 40 applies clamp processing, sensitivity correction processing, analog-to-digital conversion, and the like to the pixel outputs of the CCD 38 to create image data. Note that the present embodiment describes an example of the analog-to-digital conversion in which each of the R, G, and B pixel outputs is encoded in unit of 12 bits. The signal processing part 40 inputs the created image data to the image processing device 50 and the white balance adjusting part 42.
  • The white balance adjusting part 42 applies white balance processing to the image data by using gains for white balance processing to be described later as parameters. The white balance adjusting part 42 inputs the processed image data to the Debayer processing part 44.
  • The Debayer processing part 44 applies Debayer processing to the image data. This provides each pixel with 12 bits of digital data on all the three principal colors. The Debayer processing part 44 inputs the Debayer-processed image data to the image processing device 50.
  • The image processing device 50 includes an evaluation value calculation part 62, a WB gain calculating part 64 (WB is short for white balance), a color-gamut determining part 66, a color-space determining part 68, and a color correcting part 70. The image processing device 50 converts the image data based on the color space of the three principal colors of the color filters on the CCD 38 into image data based on an appropriate color space, and inputs the same to the gamma correction part 52 (details will be given later).
  • The gamma correction part 52 applies gamma correction to the input image data, and then outputs the resultant to the contour enhancing part 54. Here, for example, the gamma correction part 52 reduces the tones of pre-converted image data in which every pixel has 12 bits for each of the three principal colors so that every pixel has 8 bits for each of the three principal colors in the processed image data. The contour enhancing part 54 applies image sharpening processing to the image data, and inputs the resultant to the image-data compressing part 56.
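  • As an aside, a minimal sketch of such a 12-bit-to-8-bit gamma step is shown below; the exponent 1/2.2 is a typical value and an assumption, since the patent does not give the actual curve.

        # Map a 12-bit linear value through a gamma curve and requantize to 8 bits.
        def gamma_correct_12_to_8(value_12bit, gamma=1.0 / 2.2):
            normalized = value_12bit / 4095.0
            return round((normalized ** gamma) * 255)

        print([gamma_correct_12_to_8(v) for v in (0, 512, 2048, 4095)])
        # -> [0, 99, 186, 255]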
  • The image-data compressing part 56 applies, for example, JPEG conversion to the image data for compression. The recording part 58 receives, from the image processing device 50, color-space information indicating in what color space the image data input from the image-data compressing part 56 is rendered. The recording part 58 records the image data onto the recording medium 16 along with this color-space information.
  • FIG. 3 is a flowchart showing the operation of the image processing device 50 described above. FIG. 4 is an example of a hue calculation table for use in the processing of the color-gamut determining part 66. FIG. 5 is an example of a chroma calculation table for use in the processing of the color-gamut determining part 66. FIG. 6 is an explanatory diagram showing a way of comparing the color gamut of a subject with the color gamuts of respective color spaces stored in advance by the color-space determining part 68. Hereinafter, the operation of the image processing device 50 will be described in the order of step numbers shown in FIG. 3, with reference to FIGS. 4 to 6. It should be appreciated that arithmetic expressions and numeric values to be seen below are given by way of example for the purpose of reference, not limitations on the present invention.
  • [Step S1]
  • According to instructions from the CPU 32, the CCD 38 converts light received from a subject through the shooting lens 14 into electric charges for storage. According to instructions from the CPU 32, the signal processing part 40 reads the stored charges from the CCD 38 to create image data. For example, the image data consists of 1000 vertical×1500 horizontal, i.e., 1.5 million pixels. The signal processing part 40 inputs the created image data to the evaluation value calculation part 62. Note that this image data is not yet subjected to Debayer processing, and it consists of pixels each encoded in 12 bits for one of the three principal colors R, G, and B.
  • [Step S2]
  • The evaluation value calculation part 62 divides the image data into 8 vertical×12 horizontal, i.e., 96 regions. Hereinafter, each of the divided regions will be referred to as a small region. For each small region, the evaluation value calculation part 62 calculates averages Rav, Gav, and Bav of the values (expressed by digital data) that indicate the intensities of the three principal colors R, G, and B, respectively. Specifically, the average Rav is determined by averaging the digital data on all the pixels corresponding to R in a small region. The same operations are performed for G and B to calculate Gav and Bav. The evaluation value calculation part 62 transmits Rav, Gav, and Bav to the color-gamut determining part 66 and the WB gain calculating part 64. The WB gain calculating part 64 determines gains for white balance processing based on Rav, Gav, and Bav, and transmits the same to the white balance adjusting part 42.
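  • A minimal sketch of this step in Python follows, assuming an RGGB Bayer tile layout (the actual filter arrangement of the CCD 38 is not restated here); it divides the pre-Debayer image into 8×12 small regions and averages the R, G, and B samples in each.
```python
import numpy as np

def region_averages(bayer, rows=8, cols=12):
    """Average the R, G, and B samples of a Bayer-array image in each small region.

    Assumes an RGGB tile layout; the actual filter order is an assumption.
    Returns a (rows, cols, 3) array of (Rav, Gav, Bav).
    """
    h, w = bayer.shape
    rh, cw = h // rows, w // cols
    out = np.empty((rows, cols, 3), dtype=np.float64)
    for i in range(rows):
        for j in range(cols):
            block = bayer[i * rh:(i + 1) * rh, j * cw:(j + 1) * cw].astype(np.float64)
            r = block[0::2, 0::2]                  # R sites (assumed position)
            g = np.concatenate((block[0::2, 1::2].ravel(), block[1::2, 0::2].ravel()))
            b = block[1::2, 1::2]                  # B sites (assumed position)
            out[i, j] = (r.mean(), g.mean(), b.mean())
    return out

# 12-bit Bayer data, 1000 x 1500 pixels as in the example above.
averages = region_averages(np.random.randint(0, 4096, (1000, 1500)))
```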
  • [Step S3]
  • For each small region, the color-gamut determining part 66 determines a representative hue and a representative chroma through the following procedure. Initially, R/G and B/G defined by the following equations are determined from Rav, Gav, and Bav calculated at step S2:

  • R/G=Rav÷Gav×100   (1)

  • B/G=Bav÷Gav×100   (2)
  • Next, in the hue calculation table shown in FIG. 4, the data corresponding to the determined R/G and B/G is taken as the representative hue of the small region. Likewise, in the chroma calculation table shown in FIG. 5, the data corresponding to the determined R/G and B/G is taken as the representative chroma of the small region. Note that the hue calculation table mentioned above has values of 0 to 255 (8 bits) on both the ordinate (B/G) and the abscissa (R/G), whereas FIG. 4 shows only some representative values. The same holds for the chroma calculation table of FIG. 5. The color-gamut determining part 66 treats any B/G or R/G value exceeding 255 as 255.
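  • The ratio computation and table lookup of step S3 can be sketched as follows; the 256×256 tables below are random placeholders standing in for the hue and chroma calculation tables of FIGS. 4 and 5, whose actual contents are not reproduced here.
```python
import numpy as np

# Placeholder 256 x 256 tables standing in for FIGS. 4 and 5; the real tables
# map (B/G, R/G) to a representative hue (0..15) and a representative chroma.
hue_table = np.random.randint(0, 16, size=(256, 256))
chroma_table = np.random.randint(0, 256, size=(256, 256))

def representative_hue_chroma(rav, gav, bav):
    """Look up the representative hue and chroma of one small region (step S3)."""
    r_g = min(int(rav / gav * 100), 255)   # Equation (1), clipped to 255
    b_g = min(int(bav / gav * 100), 255)   # Equation (2), clipped to 255
    return hue_table[b_g, r_g], chroma_table[b_g, r_g]

hue, chroma = representative_hue_chroma(812.0, 950.0, 640.0)
```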
  • [Step S4]
  • The color-gamut determining part 66 classifies all the small regions according to their representative hues. Next, the small region having the maximum representative chroma is determined from among the small regions having the same representative hue. The representative chroma of the determined small region shall be the maximum chroma for that representative hue. In this way, the color-gamut determining part 66 determines the maximum chroma for each of the representative hues obtained at step S3. The color-gamut determining part 66 transmits the maximum chromas for the respective representative hues to the color-space determining part 68 as the color gamut of the subject.
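  • A short sketch of step S4, keeping the maximum representative chroma observed for each representative hue:
```python
def subject_gamut(region_hues, region_chromas):
    """Maximum chroma observed for each representative hue (step S4).

    region_hues and region_chromas are flat sequences, one entry per small region.
    Returns a dict {hue: max chroma} describing the color gamut of the subject.
    """
    gamut = {}
    for hue, chroma in zip(region_hues, region_chromas):
        if chroma > gamut.get(hue, -1):
            gamut[hue] = chroma
    return gamut

# e.g. subject_gamut([3, 3, 7, 12], [40, 90, 15, 200]) -> {3: 90, 7: 15, 12: 200}
```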
  • [Step S5]
  • As shown in FIG. 6, the color-space determining part 68 stores in advance a relationship between representative values of hue (0, 1, . . . , 15) and maximum chromas in several color spaces (such as CIE-RGB color space, NTSC color space, and sRGB color space). For reference, FIG. 6 also shows an example of the color gamut of a subject.
  • Then, the color-space determining part 68 compares the color gamut of each of the color spaces with that of the subject, and selects the smallest color space out of the color spaces that include the color gamut of the subject. Specifically, the color-space determining part 68 selects the color space having the smallest average of the maximum chromas from among the color spaces each having a maximum chroma equal to or higher than that of the color gamut of the subject in all the representative hues.
  • FIG. 6 shows an example in which NTSC color space is selected as the optimum color space. The color-space determining part 68 transmits the information as to which color space has been selected (hereinafter, referred to as color-space information) to the color correcting part 70 and the recording part 58. Here, the transmission is effected, for example, by setting four bits of digital data for indicating the names of the respective color spaces in advance and transmitting the digital data.
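  • Step S5 can be sketched as follows; the per-hue maximum chromas stored for each color space are placeholder numbers rather than the values of FIG. 6, and the selection rule is the one described above (cover the subject in every hue, then take the smallest average maximum chroma).
```python
# Illustrative per-hue maximum chromas for each stored color space (cf. FIG. 6);
# the numbers below are placeholders, not actual colorimetric data.
COLOR_SPACE_GAMUTS = {
    "sRGB":    [90] * 16,
    "NTSC":    [120] * 16,
    "CIE-RGB": [200] * 16,
}

def select_color_space(subject_gamut):
    """Smallest color space whose max chroma covers the subject in every hue (step S5)."""
    candidates = []
    for name, max_chromas in COLOR_SPACE_GAMUTS.items():
        if all(max_chromas[h] >= c for h, c in subject_gamut.items()):
            candidates.append((sum(max_chromas) / len(max_chromas), name))
    if not candidates:
        return None                    # no stored space covers the subject
    return min(candidates)[1]          # smallest average maximum chroma

print(select_color_space({3: 95, 12: 110}))   # -> "NTSC" with the placeholder data
```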
  • [Step S6]
  • The color correcting part 70 receives the image data transmitted from the Debayer processing part 44. Incidentally, this image data is rendered in the color space determined by the three principal colors of the color filters on the CCD 38. The color correcting part 70 stores in advance therein matrix factors Ma, Mb, Mc, Md, Me, Mf, Mg, Mh, and Mi for each color space which are used for converting the transmitted image data into such image data that it is rendered in CIE-RGB color space, NTSC color space, sRGB color space, and the like. Note that the matrix factors Ma to Mi are intended not only for color-space conversion but also for color correction ascribable to the fact that neither the shooting lens 14 nor the CCD 38 has ideal spectral characteristics.
  • The color correcting part 70 selects matrix factors Ma to Mi corresponding to the color space selected at step S5. The color correcting part 70 performs color-space conversion on the transmitted image data by using the following three equations (collectively referred to as Equation (3)):

  • Rm=Rc×Ma+Gc×Mb+Bc×Mc

  • Gm=Rc×Md+Gc×Me+Bc×Mf

  • Bm=Rc×Mg+Gc×Mh+Bc×Mi   (3)
  • In the foregoing equation, Rc, Gc, and Bc are pieces of digital data corresponding to the three principal colors of the image data transmitted from the Debayer processing part 44. Rm, Gm, and Bm are pieces of digital data corresponding to the three principal colors of the converted image data. The color correcting part 70 then transmits the converted image data to the gamma correction part 52.
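  • Equation (3) amounts to applying a 3×3 matrix to every pixel; a sketch follows with illustrative matrix factors Ma to Mi (real factors would also fold in the lens and sensor corrections mentioned above).
```python
import numpy as np

def convert_color_space(rgb, matrix):
    """Apply Equation (3): a 3x3 matrix to every pixel of an (H, W, 3) image."""
    h, w, _ = rgb.shape
    flat = rgb.reshape(-1, 3).astype(np.float64)
    return (flat @ matrix.T).reshape(h, w, 3)

# Illustrative factors Ma..Mi for one destination color space (near-identity here).
M = np.array([[1.05, -0.03, -0.02],
              [-0.04, 1.08, -0.04],
              [-0.01, -0.05, 1.06]])

converted = convert_color_space(np.random.randint(0, 4096, (4, 6, 3)), M)
```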
  • The description so far has been made on the operation of the image processing device 50 of the present embodiment. The converted image data, which is rendered in an appropriate color space in this way, is subjected to the above-mentioned processings in the gamma correction part 52, the contour enhancing part 54, and the image-data compressing part 56 before being recorded onto the recording medium 16 along with the color-space information.
  • As described above, the image processing device 50 of the present embodiment uses the table data shown in FIGS. 4 and 5 to determine representative hues and representative chromas in the respective small regions of the image data. Then, with the maximum chromas for the representative hues determined as the evaluation reference, the image processing device 50 determines the color gamut of the subject that is expressed by the image data based on the color space of the color filters on the CCD 38. Consequently, the color gamut of the subject can be obtained efficiently with fewer operations. This results in simplifying the configuration of the image processing device 50. Moreover, as shown in FIG. 6, whether or not the individual color spaces cover the color gamut of the subject can be determined easily by simply comparing the maximum chromas for the representative hues.
  • Then, the smallest color space is selected from among the color spaces that cover the color gamut of the subject. More specifically, it is possible to automatically select a color space that covers the color gamut of the subject and has a minimum color difference per tone, for the image data obtained immediately after photographing and consisting of pixels whose colors are encoded in a predetermined number of bits. This holds true even if the image data is reduced in the number of bits by subsequent processings (gamma correction part 52).
  • In addition, the image data is converted into such image data that it is rendered in an appropriate color space selected, and thereafter it is recorded onto the recording medium 16 along with this color-space information (step S6). Consequently, reproducing the image data based on the color-space information enables the colors of the captured subject to be reproduced accurately in favorable tones.
  • Moreover, the user need not have expertise on color spaces to select a color space, and can thus focus on taking photographs. Also, allowing the image processing device 50 to select an appropriate color space depending on the color gamut of the subject makes it possible to create better pictures. As a result, usability for the user improves greatly.
  • The evaluation value calculation part 62 calculates the averages Rav, Gav, and Bav of R, G, and B for each small region, and transmits the calculation results to the color-gamut determining part 66 and the WB gain calculating part 64. It is therefore possible to use the calculation results of the evaluation value calculation part 62 both for the processing of determining the color gamut of the subject and for the white balance processing. This results in simplifying the image processing configuration of the electronic camera 12A.
  • Second Embodiment
  • Next, description will be made on a second embodiment of the present invention. The present embodiment differs from the first embodiment only in that the calculations of the WB gain calculating part are also used for the processing in the color-gamut determining part (corresponding to the part shown by the broken-lined arrow in FIG. 2). Thus, in the present embodiment, the image processing device shall be designated distinctively as 50 b, the WB gain calculating part as 64 b, and the color-gamut determining part as 66 b while the block diagram is omitted.
  • FIG. 7 is a flowchart showing the operation of the image processing device 50 b of the present embodiment. Hereinafter, the operation of the image processing device 50 b will be described in the order of step numbers shown in FIG. 7. It should be appreciated that arithmetic expressions and numeric values to be seen below are given by way of example for the purpose of reference, not limitations on the present invention.
  • [Step S11]
  • As in step S1 of the first embodiment, image data is created and input to the evaluation value calculation part 62.
  • [Step S12]
  • As in step S2 of the first embodiment, the evaluation value calculation part 62 divides the image data into a plurality of small regions, and determines the averages Rav, Gav, and Bav of R, G, and B in each small region. The evaluation value calculation part 62 transmits Rav, Gav, and Bav to the color-gamut determining part 66 b and the WB gain calculating part 64 b. The WB gain calculating part 64 b determines gains Wr, Wg, and Wb for white balance processing based on Rav, Gav, and Bav, and transmits the same to the white balance adjusting part 42 and the color-gamut determining part 66 b.
  • [Step S13]
  • Based on the gains Wr, Wg, and Wb for white balance processing, the color-gamut determining part 66 b converts Rav, Gav, and Bav into values Rav′, Gav′, and Bav′ that are adjusted in white balance. This conversion method is the same as what the white balance adjusting part 42 applies to image data, being expressed by, e.g., the following three equations (collectively referred to as Equation (4)):

  • Rav′=Rav×Wr

  • Gav′=Gav×Wg

  • Bav′=Bav×Wb   (4)
  • As in the first embodiment, the color-gamut determining part 66 b determines R/G and B/G in each small region by the following equations, and determines a representative hue and a representative chroma in each small region by using the hue calculation table of FIG. 4 and the chroma calculation table of FIG. 5.

  • R/G=Rav′÷Gav′×100   (5)

  • B/G=Bav′÷Gav′×100   (6)
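  • A sketch of Equations (4) to (6), with illustrative white-balance gains:
```python
def wb_adjusted_ratios(rav, gav, bav, wr, wg, wb):
    """Equations (4) to (6): apply the white-balance gains to the region averages,
    then recompute the R/G and B/G ratios used for the table lookups."""
    rav2, gav2, bav2 = rav * wr, gav * wg, bav * wb       # Equation (4)
    r_g = min(int(rav2 / gav2 * 100), 255)                # Equation (5), clipped
    b_g = min(int(bav2 / gav2 * 100), 255)                # Equation (6), clipped
    return r_g, b_g

# e.g. daylight-like gains (illustrative values only)
print(wb_adjusted_ratios(812.0, 950.0, 640.0, wr=1.4, wg=1.0, wb=1.8))
```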
  • The processing of the subsequent steps S14, S15, and S16 is the same as that of steps S4, S5, and S6 of the first embodiment, respectively. Description thereof will thus be omitted.
  • As above, the second embodiment can provide the same effects as those of the foregoing first embodiment. Besides, in the present embodiment, the color-gamut determining part 66 b converts the averages Rav, Gav, and Bav of R, G, and B determined for each small region into the values Rav′, Gav′, and Bav′ that are adjusted in white balance, and then determines representative hues and representative chromas in the respective small regions. That is, the processing of the color-gamut determining part 66 b is equivalent to predicting how the image data is converted by the white balance adjusting part 42 and determining the color gamut of the subject to be expressed by the image data adjusted in white balance. As a result, it is possible to determine the color gamut of the subject more accurately regardless of the color temperature of the light source that has illuminated the subject at the time of shooting.
  • Third Embodiment
  • FIG. 8 shows a third embodiment of the present invention. The same parts as those of the first embodiment will be designated by identical reference numbers. Description thereof will be omitted. In the diagram, a photographing device 10C is made up of an electronic camera 12C of the present invention, equipped with a shooting lens 14 and a recording medium 16.
  • The electronic camera 12C includes the release button 30, a CPU 32 c, the memory 34, the focal-plane shutter 36, the CCD 38, the signal processing part 40, an evaluation value calculation part 62 c, the WB gain calculating part 64, a white balance adjusting part 42 c, the Debayer processing part 44, an image processing device 50 c of the present invention, the gamma correction part 52, the contour enhancing part 54, the image-data compressing part 56, and the recording part 58.
  • The CPU 32 c controls each part of the electronic camera 12C.
  • The evaluation value calculation part 62 c is identical to the evaluation value calculation part 62 of the first embodiment except that Rav, Gav, and Bav calculated for each small region are transmitted only to the WB gain calculating part 64.
  • The white balance adjusting part 42 c is identical to the white balance adjusting part 42 of the first embodiment except that the image data adjusted in white balance is also input to the image processing device 50 c.
  • The image processing device 50 c includes a color-gamut determining part 66 c, a color-space determining part 68 c, and a color correcting part 70. The image processing device 50 c converts image data based on the color space of the three principal colors of the color filters on the CCD 38 into image data based on an appropriate color space, and inputs the same to the gamma correction part 52.
  • FIG. 9 is a flowchart showing the operation of the image processing device 50 c described above. FIGS. 10 (A), (B) are diagrams for explaining the processing of determining the color gamut of the subject and comparing it with the color gamuts of respective color spaces stored in advance by the image processing device 50 c. Hereinafter, the operation of the image processing device 50 c will be described in the order of step numbers shown in FIG. 9, with reference to FIG. 10.
  • [Step S31]
  • The signal processing part 40 reads the stored charges from the CCD 38 to create image data, and inputs the same to the evaluation value calculation part 62 c and the white balance adjusting part 42 c. As in the first embodiment, the evaluation value calculation part 62 c divides the image data into a plurality of small regions, and determines the averages Rav, Gav, and Bav of R, G, and B, respectively, in each small region. Based on Rav, Gav, and Bav transmitted from the evaluation value calculation part 62 c, the WB gain calculating part 64 determines gains for white balance processing, and transmits the same to the white balance adjusting part 42 c. The white balance adjusting part 42 c applies white balance processing to the image data, and then inputs the resultant to the color-gamut determining part 66 c and the Debayer processing part 44.
  • [Step S32]
  • The color-gamut determining part 66 c maps the input image data (based on the color space determined by the color filters on the CCD 38) onto an xy chromaticity diagram, for example. This mapping is performed pixel by pixel, and table data is created at the same time. For example, when the image data contains three pixels that show the color corresponding to an x-coordinate of 0.3 and a y-coordinate of 0.4, a row of the table data is expressed as (0.3, 0.4, 3). Such table data is created for all the coordinates within the visible region.
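  • Step S32 can be sketched as follows; the RGB-to-XYZ matrix used below is the sRGB matrix and serves only as a placeholder, since the actual transform depends on the spectral characteristics of the color filters on the CCD 38.
```python
import numpy as np
from collections import Counter

# Placeholder RGB -> XYZ matrix; the real transform depends on the CCD's primaries.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def xy_histogram(rgb, step=0.01):
    """Map every pixel onto the xy chromaticity diagram and count pixels per
    quantized (x, y) coordinate, i.e. build the (x, y, count) table of step S32."""
    xyz = rgb.reshape(-1, 3).astype(np.float64) @ RGB_TO_XYZ.T
    s = xyz.sum(axis=1)
    s[s == 0] = 1.0                                  # avoid division by zero
    x = np.round(xyz[:, 0] / s / step) * step
    y = np.round(xyz[:, 1] / s / step) * step
    return Counter(zip(x.round(2), y.round(2)))      # {(x, y): pixel count}

table = xy_histogram(np.random.rand(100, 150, 3))
```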
  • [Step S33]
  • As shown in FIG. 10(A), the color-gamut determining part 66 c divides the visible region on the xy chromaticity diagram into N regions based on MacAdam ellipses, for example. Hereinafter, each of the N divided regions will be referred to as a region of comparable colors. The color-gamut determining part 66 c classifies the individual rows of the table data created at step S32 according to the regions of comparable colors. From among the regions of comparable colors, the color-gamut determining part 66 c selects ones that include T or more pixels of the mapped image data. In FIG. 10(A), the hatched area is an example of the regions of comparable colors selected here. Note that a single region including T pixels of exactly the same color can also be selected. The color-gamut determining part 66 c informs the color-space determining part 68 c of which regions of comparable colors have been selected, as the color gamut of the subject.
  • Incidentally, the value of T mentioned above may be determined according to the value of N and the total number of pixels of the image data so that a difference between the actual color gamut of the subject and the color gamut of the subject determined by the color-gamut determining part 66 c falls to or below an acceptable value. The smaller the value of T, the smaller the difference.
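  • A sketch of the region selection of step S33 follows; for simplicity it uses an arbitrary segmentation of the chromaticity diagram expressed as predicates, rather than a MacAdam-ellipse-based division.
```python
def subject_gamut_regions(xy_counts, regions, t):
    """Step S33, sketched with an arbitrary segmentation instead of MacAdam ellipses.

    xy_counts: {(x, y): pixel count} from the mapping step.
    regions:   list of predicates, one per region of comparable colors.
    t:         minimum number of pixels a region must contain to be selected.
    """
    totals = [0] * len(regions)
    for (x, y), count in xy_counts.items():
        for idx, contains in enumerate(regions):
            if contains(x, y):
                totals[idx] += count
                break
    return {idx for idx, total in enumerate(totals) if total >= t}

# Two toy "regions": left and right halves of the diagram (purely illustrative).
regions = [lambda x, y: x < 0.35, lambda x, y: x >= 0.35]
selected = subject_gamut_regions({(0.30, 0.40): 3, (0.50, 0.30): 1}, regions, t=2)
# -> {0}
```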
  • [Step S34]
  • As shown in FIG. 10(B), the color-space determining part 68 c stores in advance the ranges of distribution of several color spaces (such as NTSC color space and sRGB color space) on the xy chromaticity diagram. Then, the color-space determining part 68 c selects the smallest color space out of the color spaces that cover the color gamut of the subject on the xy chromaticity diagram. Here, a small color space refers to a color space that occupies a small area on the chromaticity diagram. In the example shown in FIG. 10(B), NTSC color space, which covers the hatched color gamut of the subject, is selected as the optimum color space. As in the first embodiment, the color-space determining part 68 c transmits color-space information on which color space is selected to the color correcting part 70 and the recording part 58.
  • If there is no color space that fully covers the color gamut of the subject, the smallest color space is selected from among color spaces that cover the color gamut of the subject on the xy chromaticity diagram at or above a predetermined area ratio. Here, the predetermined area ratio may be set to a value which allows the ratio of the region not covered by the selected color space to the color gamut of the subject determined by the color-gamut determining part 66 c to fall to or below an acceptable value.
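  • Step S34, including the fallback to a coverage ratio, can be sketched as follows; the color-space footprints and areas used here are assumed placeholder data.
```python
def select_space_on_diagram(subject_regions, space_regions, space_area, min_ratio=0.98):
    """Step S34: pick the smallest color space covering the subject's regions.

    subject_regions: set of region indices occupied by the subject.
    space_regions:   {name: set of region indices the space covers} (assumed data).
    space_area:      {name: area of the space on the chromaticity diagram} (assumed data).
    Falls back to spaces covering at least min_ratio of the subject's regions.
    """
    full = [n for n, regs in space_regions.items() if subject_regions <= regs]
    if full:
        return min(full, key=space_area.get)
    def ratio(name):
        return len(subject_regions & space_regions[name]) / len(subject_regions)
    partial = [n for n in space_regions if ratio(n) >= min_ratio]
    return min(partial, key=space_area.get) if partial else None

spaces = {"sRGB": {0, 1, 2}, "NTSC": {0, 1, 2, 3, 4}}
areas = {"sRGB": 0.11, "NTSC": 0.16}
print(select_space_on_diagram({1, 3}, spaces, areas))   # -> "NTSC"
```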
  • [Step S35]
  • As in step S6 of the first embodiment, the color correcting part 70 converts the image data transmitted from the Debayer processing part 44 into such image data that it is rendered in the color space selected at step S34. The color correcting part 70 then transmits the converted image data to the gamma correction part 52.
  • The description so far has been made on the operation of the image processing device 50 c of the present embodiment.
  • As above, the third embodiment can provide the same effects as those of the first and second embodiments described above.
  • <Supplemental Remarks on the Present Invention>
  • [1] The foregoing first and second embodiments have dealt with the cases where the image data is divided into 8 vertical×12 horizontal, i.e., 96 regions. However, the present invention is not limited to such embodiments. If the color gamut of the subject must be determined more precisely, the image data may be divided more finely. As a functional guideline, the image data should be divided at such a fineness that the difference between the actual color gamut of the subject and the color gamut of the subject determined by the image processing device 50 falls to or below an acceptable value (such as 1%).
  • [2] The first and second embodiments have dealt with the cases where the evaluation value calculation part 62 calculates, at step S2 (step S12), the averages Rav, Gav, and Bav of the three principal colors R, G, and B for each small region. However, the present invention is not limited to such embodiments. For example, values that occur with highest frequency may be determined from the digital data on all the pixels corresponding to R in the respective small regions. The values corresponding to G and B may also be determined similarly. The values occurring with highest frequency can be used in subsequent processing instead of the averages. Alternatively, maximum values Rmax, Gmax, and Bmax in the digital data on all the pixels corresponding to R, G, and B in the small regions, respectively, may be determined for use instead of the averages.
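  • The alternative statistics mentioned here (the most frequent value or the maximum instead of the average) can be sketched as follows:
```python
import numpy as np

def region_statistic(samples, mode="average"):
    """Alternative per-region statistics mentioned in remark [2]:
    the average, the most frequent value, or the maximum of the samples."""
    samples = np.asarray(samples)
    if mode == "average":
        return float(samples.mean())
    if mode == "mode":
        values, counts = np.unique(samples, return_counts=True)
        return float(values[counts.argmax()])
    if mode == "max":
        return float(samples.max())
    raise ValueError(mode)

print(region_statistic([100, 100, 120, 4000], "mode"))   # -> 100.0
```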
  • [3] The third embodiment has dealt with the case where the image data is mapped onto the xy chromaticity diagram. However, the present invention is not limited to such an embodiment. With human visual sensitivity taken into account, for example, a uv chromaticity diagram may be used instead of the xy chromaticity diagram.
  • [4] The first to third embodiments have dealt with the cases where the image processing device (50, 50 b, 50 c) performs color-space conversion on the image data before gamma processing. However, the present invention is not limited to such embodiments. Following the Debayer processing by the Debayer processing part 44, the gamma correction part 52 may perform the gamma correction before the image data is input to the color correcting part 70.
  • [5] The first to third embodiments have dealt with the cases where one color space is selected from among a plurality of color spaces stored in advance. However, the present invention is not limited to such embodiments. For example, it is possible to determine the smallest triangle from among triangles covering the color gamut of the subject on the chromaticity diagram, and establish a new color space having the vertexes of the determined triangle as the color coordinates of the three principal colors.
  • [6] The first to third embodiments have dealt with the cases where the image sensor (CCD 38) has a color filter array of principal colors R, G, and B. However, the present invention is not limited to such embodiments. For example, the present invention is also applicable to a color filter array of complementary colors, cyan, magenta, and yellow.
  • [7] The first to third embodiments have dealt with the cases where the image processing device of the present invention is used for an electronic camera. However, the present invention is not limited to such embodiments. For example, the image processing device of the present invention may be used for a scanner and the like.
  • [8] The processing of steps S1 to S6, steps S11 to S16, or steps S31 to S35 described above may be coded into an image processing program. In this case, the same effects as those of the first to third embodiments can be obtained if the image processing program is executed by the CPU of an electronic camera, for example.
  • The invention is not limited to the above embodiments and various modifications may be made without departing from the spirit and scope of the invention. Any improvement may be made in part or all of the components.

Claims (8)

1. An image processing device comprising:
a color-gamut determining part for determining a color gamut as a range of color distribution from input image data input from an input part;
a color-space determining part for determining a color space substantially containing the color gamut determined by said color-gamut determining part; and
a color-space conversion part for converting the input image data into image data which is rendered in the determined color space and for transmitting a converted image data and information on the color space determined by said color-space determining part as information corresponding to the converted image data.
2. The image processing device according to claim 1, wherein:
said color-gamut determining part divides the input image data into a plurality of image regions and calculates a hue and a chroma for each of the image regions to determine a maximum chroma for each of hues calculated; and
said color-space determining part selects a smallest color space from color spaces each having a maximum chroma equal to or higher than that of the input image data in all of the hues calculated by said color-gamut determining part.
3. The image processing device according to claim 1, wherein:
said color-gamut determining part maps the input image data onto a chromaticity diagram; and
said color-space determining part selects a smallest color space from color spaces each containing a predetermined percentage or more of the color gamut of the input image data on said chromaticity diagram.
4. An electronic camera comprising:
an image-capturing part for capturing an optical image formed with a shooting lens to create image data; and
the image processing device according to claim 1, for determining a range of color distribution of the created image data to determine a color space, and converting the created image data into image data which is rendered in the determined color space.
5. A computer readable recording medium recording an image processing program for causing a computer to function as said color-gamut determining part, said color-space determining part, and said color-space conversion part according to claim 1.
6. A computer readable recording medium recording an image processing program for causing a computer to function as said color-gamut determining part, said color-space determining part, and said color-space conversion part according to claim 2.
7. A computer readable recording medium recording an image processing program for causing a computer to function as said color-gamut determining part, said color-space determining part, and said color-space conversion part according to claim 3.
8. The electronic camera according to claim 4, wherein
said color-space conversion part performs a color correction according to an imaging condition when converting the input image data into image data which is rendered in the determined color space.
US12/153,518 2002-12-17 2008-05-20 Image processing device, electronic camera, and image processing program Abandoned US20080246855A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/153,518 US20080246855A1 (en) 2002-12-17 2008-05-20 Image processing device, electronic camera, and image processing program
US12/929,869 US8520097B2 (en) 2002-12-17 2011-02-22 Image processing device, electronic camera, and image processing program

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2002365476A JP2004200902A (en) 2002-12-17 2002-12-17 Image processor, electronic camera, and image processing program
JP2002-365476 2002-12-17
US10/730,057 US20040119843A1 (en) 2002-12-17 2003-12-09 Image processing device, electronic camera, and image processing program
US12/153,518 US20080246855A1 (en) 2002-12-17 2008-05-20 Image processing device, electronic camera, and image processing program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/730,057 Continuation US20040119843A1 (en) 2002-12-17 2003-12-09 Image processing device, electronic camera, and image processing program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/929,869 Continuation US8520097B2 (en) 2002-12-17 2011-02-22 Image processing device, electronic camera, and image processing program

Publications (1)

Publication Number Publication Date
US20080246855A1 true US20080246855A1 (en) 2008-10-09

Family

ID=32376246

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/730,057 Abandoned US20040119843A1 (en) 2002-12-17 2003-12-09 Image processing device, electronic camera, and image processing program
US12/153,518 Abandoned US20080246855A1 (en) 2002-12-17 2008-05-20 Image processing device, electronic camera, and image processing program
US12/929,869 Expired - Lifetime US8520097B2 (en) 2002-12-17 2011-02-22 Image processing device, electronic camera, and image processing program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/730,057 Abandoned US20040119843A1 (en) 2002-12-17 2003-12-09 Image processing device, electronic camera, and image processing program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/929,869 Expired - Lifetime US8520097B2 (en) 2002-12-17 2011-02-22 Image processing device, electronic camera, and image processing program

Country Status (5)

Country Link
US (3) US20040119843A1 (en)
EP (1) EP1432237B1 (en)
JP (1) JP2004200902A (en)
AT (1) ATE340480T1 (en)
DE (1) DE60308472T2 (en)


Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2384007B1 (en) * 2003-02-27 2014-09-10 Seiko Epson Corporation Image reproduction using a particular color space
US6937253B2 (en) * 2003-12-11 2005-08-30 Xerox Corporation Method for determining color space of an image
US7948501B2 (en) * 2004-03-09 2011-05-24 Olympus Corporation Display control apparatus and method under plural different color spaces
US7593147B2 (en) * 2004-05-26 2009-09-22 Fujifilm Corporation Output apparatus, color conversion method, and machine readable medium storing program
WO2006022028A1 (en) * 2004-08-27 2006-03-02 Seiko Epson Corporation Image reproduction using particular color space
JP4684030B2 (en) 2005-07-06 2011-05-18 株式会社リコー Image processing apparatus and image processing method
JP4815267B2 (en) * 2006-05-11 2011-11-16 オリンパスイメージング株式会社 White balance control method, imaging apparatus, and white balance control program
JP4874752B2 (en) * 2006-09-27 2012-02-15 Hoya株式会社 Digital camera
KR20080049360A (en) * 2006-11-30 2008-06-04 삼성전자주식회사 The method of transmitting color gamut and the image device thereof
TWI332352B (en) * 2007-03-30 2010-10-21 Au Optronics Corp Hue segmentation system and method thereof
JP4957676B2 (en) * 2008-03-19 2012-06-20 セイコーエプソン株式会社 Image data analysis apparatus, image data analysis method, and program
JP2010041636A (en) 2008-08-08 2010-02-18 Sony Corp Information processing device, method and program
JP4970419B2 (en) * 2008-12-22 2012-07-04 株式会社沖データ Image processing device
KR101650451B1 (en) * 2009-09-21 2016-09-06 삼성전자주식회사 System and method for generating rgb primary for wide ganut, and color encoding system using rgb primary
JP5957813B2 (en) * 2011-07-07 2016-07-27 株式会社ニコン Imaging apparatus, program, and recording medium
JP2013077879A (en) * 2011-09-29 2013-04-25 Sony Corp Imaging device, imaging method and program
JP5556823B2 (en) 2012-01-13 2014-07-23 株式会社ニコン Solid-state imaging device and electronic camera
EP2835965B1 (en) 2012-03-30 2017-05-03 Nikon Corporation Imaging device and image sensor
JP5598769B2 (en) * 2012-04-16 2014-10-01 コニカミノルタ株式会社 Color adjustment method and color adjustment program
US8836716B1 (en) * 2013-09-20 2014-09-16 Spinella Ip Holdings, Inc. System and method for reducing visible artifacts in the display of compressed and decompressed digital images and video
JP5958497B2 (en) * 2014-06-06 2016-08-02 株式会社ニコン Solid-state imaging device and electronic camera
US10440355B2 (en) * 2015-11-06 2019-10-08 Facebook Technologies, Llc Depth mapping with a head mounted display using stereo cameras and structured light
US10863158B2 (en) * 2016-05-17 2020-12-08 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
CN113129815B (en) * 2021-04-23 2022-03-01 利亚德光电股份有限公司 Color gamut coefficient processing method and device, LED display screen and computer equipment

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5450216A (en) * 1994-08-12 1995-09-12 International Business Machines Corporation Color image gamut-mapping system with chroma enhancement at human-insensitive spatial frequencies
US5606632A (en) * 1992-05-28 1997-02-25 Fujitsu Limited Device and method for reducing the size of a color image to display separate color images simultaneously on a screen
US5963670A (en) * 1996-02-12 1999-10-05 Massachusetts Institute Of Technology Method and apparatus for classifying and identifying images
US6075563A (en) * 1996-06-14 2000-06-13 Konica Corporation Electronic camera capable of adjusting color tone under different light sources
US20010016064A1 (en) * 2000-02-22 2001-08-23 Olympus Optical Co., Ltd. Image processing apparatus
US20020060688A1 (en) * 2000-09-28 2002-05-23 Kenji Mizumoto Imaging apparatus
US6437792B1 (en) * 1999-01-22 2002-08-20 Sony Corporation Image processing apparatus and method, color gamut conversion table creating apparatus and method, storage medium having image processing program recorded therein, and storage medium having recorded therein color gamut conversion table creating program
US6453072B1 (en) * 1997-10-09 2002-09-17 Olympus Optical Co., Ltd. Image coding system
US20030012433A1 (en) * 2001-07-06 2003-01-16 Jasc Software, Inc. Automatic saturation adjustment
US20030063299A1 (en) * 2001-10-02 2003-04-03 Cowan Philip B. Color calibration method and apparatus
US20030151758A1 (en) * 2002-02-13 2003-08-14 Nikon Corporation Image processing unit, image processing method, and image processing program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05244443A (en) 1992-02-27 1993-09-21 Canon Inc Picture communication system and communication equipment
JP3284829B2 (en) 1995-06-15 2002-05-20 ミノルタ株式会社 Image processing device
JPH0951443A (en) 1995-08-02 1997-02-18 Fuji Xerox Co Ltd Image processing unit
JP3608533B2 (en) * 2001-02-09 2005-01-12 セイコーエプソン株式会社 Image processing via network


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100135634A1 (en) * 2006-04-26 2010-06-03 Takeshi Ito Video processing device, recording medium, video signal processing method, video signal processing program, and integrated circuit
US8339412B2 (en) 2006-04-26 2012-12-25 Panasonic Corporation Video processing device, recording medium, video signal processing method, video signal processing program, and integrated circuit
US8837907B2 (en) 2006-11-29 2014-09-16 Sony Corporation Recording apparatus, recording method, image pickup apparatus, reproducing apparatus and video system

Also Published As

Publication number Publication date
EP1432237B1 (en) 2006-09-20
JP2004200902A (en) 2004-07-15
DE60308472T2 (en) 2007-03-29
US8520097B2 (en) 2013-08-27
US20110141302A1 (en) 2011-06-16
ATE340480T1 (en) 2006-10-15
EP1432237A3 (en) 2004-07-07
US20040119843A1 (en) 2004-06-24
DE60308472D1 (en) 2006-11-02
EP1432237A2 (en) 2004-06-23

Similar Documents

Publication Publication Date Title
US8520097B2 (en) Image processing device, electronic camera, and image processing program
US7076119B2 (en) Method, apparatus, and program for image processing
EP0757473B1 (en) Image processing apparatus and method
US7636473B2 (en) Image color adjustment
JP4040625B2 (en) Image processing apparatus, printer apparatus, photographing apparatus, and television receiver
US7750948B2 (en) Image signal processing device, digital camera and computer program product for processing image signal
US8270712B2 (en) Image processing device, image processing program, and image processing method
US7312824B2 (en) Image-capturing apparatus, image processing apparatus and image recording apparatus
US20060012808A1 (en) Image processing device, image processing method, and image processing device manufacturing method
US7409082B2 (en) Method, apparatus, and recording medium for processing image data to obtain color-balance adjusted image data based on white-balance adjusted image data
JP4874752B2 (en) Digital camera
JP2005354372A (en) Apparatus and method for image recording device, method and system for image processing
US6507667B1 (en) Color digital imaging apparatus having a rule-based hue-shift processor
US7324702B2 (en) Image processing method, image processing apparatus, image recording apparatus, program, and recording medium
JP2004172745A (en) Automatic adjustment of image quality in response to size of object
Lukac Single-sensor digital color imaging fundamentals
JP2002109523A (en) Image pickup device, optical filter group, and image data converter
JP3863773B2 (en) Image photographing method and apparatus
JP2003179939A (en) Color chart, image data generating device, profile creating method utilizing color chart, and recording medium for recording profile creating program
US20100165149A1 (en) Opponent Color Detail Enhancement for Saturated Colors
JP4240257B2 (en) Electronic camera
JP3539883B2 (en) Image processing method and apparatus, recording medium, imaging apparatus, and image reproducing apparatus
JP2005260693A (en) Image reproducing method with coordinate transformation according to lighting optical source
JP2004096444A (en) Image processor and method thereof
JP4047154B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION