US20070285530A1 - Automatic white balancing method, medium, and system - Google Patents

Automatic white balancing method, medium, and system

Info

Publication number
US20070285530A1
US20070285530A1 (Application No. US 11/802,225)
Authority
US
United States
Prior art keywords
illuminant
detection region
value
image
luminance value
Prior art date
Legal status
Granted
Application number
US11/802,225
Other versions
US8624995B2
Inventor
Sung-su Kim
Byoung-Ho Kang
Seong-deok Lee
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. Assignors: KANG, BYOUNG-HO; KIM, SUNG-SU; LEE, SEONG-DEOK (assignment of assignors interest; see document for details)
Publication of US20070285530A1
Application granted
Publication of US8624995B2
Legal status: Active
Anticipated expiration: adjusted

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Definitions

  • the stabilization unit 130 may calculate a weighted average point between the average chromatic value of data input into the illuminant detection region in the color gamut of the image and the chromatic value of the initial illuminant, in operation S 1110 .
  • the weighted average may be expressed by the below Equation 7, for example.
  • Wavg = N·Wi + (1 − N)·Chavg   (Equation 7)
  • Here, Wavg is the weighted average point to be calculated, Wi is the chromatic value of the initial illuminant, e.g., as determined by the illuminant estimation unit 730, and Chavg is the average of the chromatic values of the data contained in the illuminant detection region (the same Chavg as used in Equation 2). In addition, N may be a weighting value between 0 and 1, for example.
  • a new illuminant locus 1230 which contains a weighted average point 1210 and is parallel with the reference illuminant locus 1220 may then be set, in operation S 1120 .
  • a point 1250 at which the initial illuminant 1240 is projected perpendicularly onto the new illuminant locus 1230 may be determined, in operation S 1130.
  • the term “unit” indicating a respective component of the white balancing system 100 may be constructed as a module, for example.
  • the term “module”, as used herein, means, but is not limited to, a software and/or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks.
  • a module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors.
  • a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • embodiments of the present invention can also be implemented through computer readable code/instructions in/on a medium, e.g., a computer readable medium, to control at least one processing element to implement any above described embodiment.
  • the medium can correspond to any medium/media permitting the storing and/or transmission of the computer readable code.
  • the computer readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as carrier waves, as well as through the Internet, for example.
  • the medium may further be a signal, such as a resultant signal or bitstream, according to embodiments of the present invention.
  • the media may also be a distributed network, so that the computer readable code is stored/transferred and executed in a distributed fashion.
  • the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
  • one or more embodiments of the present invention include a white balancing method, medium, and system, where the color reproducing performance can be improved through more stabilized illuminant estimation.

Abstract

An automatic white balancing method, medium, and system. The white balancing method includes setting an illuminant detection region of an input image in accordance with an exposure integration time indicative of a collected amount of light when the image is taken, and detecting an illuminant by using data contained in the illuminant detection region in a color gamut of the image.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based on and claims priority from Korean Patent Application No. 10-2006-0047751, filed on May 26, 2006 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • One or more embodiments of the present invention relate to a color reproduction technology, and more particularly, to a white balancing method, medium, and system.
  • 2. Description of the Related Art
  • Though natural light is typically thought of as being white, in actuality the light may have an abundance of one or more wavelengths, giving the overall light a particular color cast characterized by its color temperature, expressed in Kelvin (K). In general, since human vision automatically adjusts for such differences, people perceive little difference in color even when light of a particular color temperature illuminates a scene. However, image pick-up devices, such as a camera or a camcorder, sense colors, in which color temperatures are reflected, as they are. Accordingly, if the illuminant is changed, images taken by the image pick-up device are tinged with different colors.
  • For example, since the color temperature of sunlight around noon on a sunny day is considered to be high, the image taken by an image pick-up device will appear bluish on the whole. By contrast, since the color temperature of the sunlight just after sunrise or just before sunset is considered to be low, the image taken by the image pick-up device will appear reddish on the whole.
  • An auto white balancing (AWB) technique proposed to solve this problem compensates for distortions of the color tone of the image when the image is shifted toward any one of the red (R), green (G), and blue (B) components depending upon its color temperature.
  • In one example, an image pick-up device discussed in Japanese Patent Unexamined Publication No. 2002-290988, divides an object into a plurality of regions, detects chromaticity in every region having a luminance higher than a threshold value, and calculates a gain value to perform white balancing based on the detected chromaticity.
  • However, such white balancing techniques have the problem that it is difficult to achieve consistent color reproduction, since the result varies with the color or size of objects in the image even when the image is taken under the same light source or illuminant.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention has been made to solve the above-mentioned problems, with an aspect of one or more embodiments of the present invention being to improve the performance of color reproduction through a more stable illuminant estimation.
  • Additional aspects and/or advantages of the invention will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the invention.
  • To achieve the above and/or other aspects and advantages, embodiments of the present invention include a method with white balancing, including setting an illuminant detection region of an image based on an exposure integration time indicative of an amount of light collected for the image when the image was captured, and detecting an illuminant of the image by using data relative to the illuminant detection region in a color gamut of the image.
  • To achieve the above and/or other aspects and advantages, embodiments of the present invention include at least one medium including computer readable code to control at least one processing element to implement one or more embodiments of the present invention.
  • To achieve the above and/or other aspects and advantages, embodiments of the present invention include a system, including a setting unit to set an illuminant detection region of an image based on an exposure integration time indicative of an amount of light collected for the image when the image was captured, and a detection unit to detect an illuminant of the image by using data relative to the illuminant detection region in a color gamut of the image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects and advantages of the invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 illustrates a white balancing system, according to an embodiment of the present invention;
  • FIG. 2 illustrates a white balancing method, according to an embodiment of the present invention;
  • FIG. 3 illustrates an illuminant detection region, according to an embodiment of the present invention;
  • FIG. 4 illustrates a candidate region that can be set as an illuminant detection region, according to an embodiment of the present invention;
  • FIGS. 5A through 5C are illustrations explaining a setting of an illuminant detection region, according to an embodiment of the present invention;
  • FIGS. 6A through 6E are illustrations explaining an obtaining of variation of color gamut and a central point, according to an embodiment of the present invention;
  • FIG. 7 illustrates a detection unit, according to an embodiment of the present invention;
  • FIG. 8 illustrates an operation of a division unit, according to an embodiment of the present invention;
  • FIG. 9 illustrates an operation of a comparative value determination unit, according to an embodiment of the present invention;
  • FIG. 10 is an illustration explaining the inconsistency between the illuminant distribution probability and a reference illuminant locus axis;
  • FIG. 11 illustrates a correcting of an illuminant, according to an embodiment of the present invention; and
  • FIGS. 12A and 12B are illustrations explaining a correcting of an illuminant, according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. Embodiments are described below to explain the present invention by referring to the figures.
  • FIG. 1 illustrates a white balancing system 100, according to an embodiment of the present invention. The white balancing system 100 may include a setting unit 110, a detection unit 120, a stabilization unit 130, and a white balancing unit 140, for example. In differing embodiments, the white balancing system may be an image processing system such as a digital still camera, a digital video camcorder, and others, for example.
  • In addition, white balancing operations illustrated in FIG. 2 will be described with reference to the white balancing system 100, noting that the reference to the white balancing system 100 is merely used as an example for instructive purposes. The setting unit 110 may set an illuminant detection region for an input image, for example, in operation S210, e.g., by selectively using an exposure integration time (EIT) as a reference to determine the illuminant detection region. The EIT is the time spent collecting light when photographing/taking an image. However, the EIT is not limited to a temporal quantity, and may be determined from other information that can predict the amount of light collected when the image is photographed/taken. More specifically, for example, the EIT may be determined from exposure information such as a shutter speed or an aperture value.
  • The EIT may also be provided together with an input image, either at the time the image is photographed or stored with or for the image for subsequent correction. For example, just as a digital still camera attaches the exposure information from the time of photographing, such as shutter speed or aperture value, to the photographed image as additional data, the EIT may be attached to the image file, or embedded in the image, as such additional data.
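As a rough illustration of how such exposure information could be reduced to a single EIT-like quantity, the sketch below combines shutter speed and aperture into an exposure proxy. The function name eit_proxy and the assumption that the collected light scales with exposure time divided by the square of the f-number are illustrative choices, not details given in this description.

```python
def eit_proxy(shutter_time_s: float, f_number: float) -> float:
    """Rough stand-in for the exposure integration time (EIT).

    Assumes the amount of light collected grows with the exposure time
    and falls with the square of the f-number (the usual exposure-value
    relationship); the description only requires some quantity that
    predicts the amount of light collected when the image is taken.
    """
    return shutter_time_s / (f_number ** 2)

# Example: a bright scene (short exposure, narrow aperture) versus a dim one.
print(eit_proxy(1 / 500, 5.6))  # small value -> little light collected
print(eit_proxy(1 / 30, 2.8))   # larger value -> more light collected
```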
  • Meanwhile, the illuminant detection region represents a range of data to be used to detect an illuminant of the image, an example of which is shown in FIG. 3. The detection unit 120 may detect the illuminant by using data to be contained in an illuminant detection region 320, which may be set by the setting unit 110, in a color gamut 310 of the image, for example, in operation S220.
  • The illuminant detected by the detection unit 120 may reflect distorted information, e.g., in accordance with a deviation between devices or an amount of data sampled for detecting the illuminant, such that the stabilization unit 130 may further correct the detected illuminant so as to correct the distorted information, for example, in operation S230.
  • The white balancing unit 140 may further perform white balancing on the image by use of the corrected illuminant, for example, in operation S240. Since there are diverse known techniques for performing the white balancing on the image, the detailed description thereof will be omitted herein.
  • The example operation of setting the illuminant detection region for the input image can be performed by the setting unit 110 in FIG. 1, and can correspond to the operation S210 of FIG. 2, for example, noting that alternative operations and units for accomplishing the same are equally available.
  • As described above, the setting unit 110 may set the illuminant detection region associated with the EIT. In an embodiment, by statistically analyzing the relationship between the EIT consumed when an image is taken and the point at which the illuminant exists in the color gamut of the taken image, candidate regions 410 through 440 in which the illuminant is highly likely to exist may be set in advance in accordance with the EIT, as shown in FIG. 4. For example, if the EIT of the input image belongs to a range of 0 to N1, the setting unit 110 may set the first candidate region 410 as the illuminant detection region. Meanwhile, if the EIT of the input image belongs to a range of N1 to N2, the setting unit 110 may set the second candidate region 420 as the illuminant detection region. Other candidate regions 430 and 440 may likewise be set as the illuminant detection region in accordance with the EIT of the input image.
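A minimal sketch of this table-lookup style of region selection follows. The EIT breakpoints N1 through N3 and the rectangular (Cb, Cr) bounds of the candidate regions are placeholder values, since the actual regions would be set in advance from the statistical analysis described above.

```python
# Hypothetical candidate regions 410-440 as (cb_min, cb_max, cr_min, cr_max)
# rectangles on the chromaticity coordinates; real bounds and EIT breakpoints
# would come from prior statistical analysis of illuminant positions.
CANDIDATE_REGIONS = [
    (0.30, 0.45, 0.55, 0.70),  # region 410: shortest EIT range (0..N1)
    (0.35, 0.50, 0.50, 0.65),  # region 420: N1..N2
    (0.40, 0.55, 0.45, 0.60),  # region 430: N2..N3
    (0.45, 0.60, 0.40, 0.55),  # region 440: N3 and above
]
EIT_BREAKPOINTS = [0.002, 0.01, 0.05]  # N1, N2, N3 (placeholder values)

def select_candidate_region(eit):
    """Return the fixed candidate region associated with the EIT range."""
    for breakpoint, region in zip(EIT_BREAKPOINTS, CANDIDATE_REGIONS):
        if eit < breakpoint:
            return region
    return CANDIDATE_REGIONS[-1]

print(select_candidate_region(0.004))  # falls in the N1..N2 range -> region 420
```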
  • According to the embodiment shown in FIG. 4, the candidate regions 410 through 440 may be fixed on chromaticity coordinates, and any one of the fixed candidate regions determined as the illuminant detection region. However, embodiments of the present invention are not limited thereto. For example, an alternative embodiment may be implemented which performs modeling of the point, in which the illuminant of the image exists, on the chromaticity coordinates in accordance with the EIT, and variably sets the illuminant detection region in accordance with the EIT of the input image. Such an embodiment will now be described in detail with reference to FIGS. 5A through 5C.
  • FIG. 5A illustrates chromatic values of illuminants of respective images associated with respective EITs. Such information may be obtained through previous experiments or during operation of a respective camera device or white balancing system.
  • From such information, a modeling of a median chromaticity locus of the illuminant associated with the EIT can be performed. To this end, an average chromatic value of the illuminants corresponding to each EIT can be calculated, and a trend line 510 of the points representing each average chromatic value can be obtained. The obtained trend line can further be projected on the chromaticity coordinates to obtain the median chromaticity locus 520 of the illuminant, as shown in FIG. 5B.
  • The illustrated median chromaticity locus 520 of the illuminant associated with any particular EIT may not be previously set, and the setting unit 110 can obtain the central illuminant point corresponding to the EIT of the image to be input from the median chromaticity locus 520 of the illuminant.
  • In an embodiment, if the central illuminant point is obtained, the setting unit 110 may then set a given range as the illuminant detection region 540 based on the central illuminant point 530, as shown in FIG. 5C. In one embodiment, the illuminant detection region 540 can be set from the central illuminant point 530 to the first threshold distance 550 in the median chromaticity locus 520 of the illuminant, and to the second threshold distance 560 in a direction perpendicular to the median chromaticity locus 520 of the illuminant. Herein, the first threshold distance and the second threshold distance may be set in accordance with the tendency of the chromaticity distribution of the illuminant associated with the EIT, which again may be previously determined by experiment or previous operation.
  • The first threshold distance and the second threshold distance may be determined dynamically in accordance with the EIT. For example, if it is assumed that when the EIT is low, the chromaticity distributed range of the illuminant is narrow, while when the EIT is high, the chromaticity distributed range of the illuminant is wide, at least one of the first threshold distance and the second threshold distance may be altered in accordance with the EIT, so as to reflect this observed tendency. Here, alternate tendencies may also be observed depending on embodiment.
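The membership test for such a variably placed region can be sketched as follows: given the central illuminant point for the current EIT and the local direction of the median chromaticity locus, a (Cb, Cr) sample lies inside the region when its offset along the locus is within the first threshold distance and its perpendicular offset is within the second. The locus direction and the two threshold distances below are placeholders standing in for whatever is fitted experimentally.

```python
import numpy as np

def in_detection_region(sample, center, locus_dir, d_along, d_perp):
    """Test whether a chromaticity sample falls in the illuminant detection region.

    sample, center : (cb, cr) points on the chromaticity coordinates
    locus_dir      : unit vector along the median chromaticity locus at `center`
    d_along        : first threshold distance, measured along the locus
    d_perp         : second threshold distance, measured perpendicular to the locus
    """
    offset = np.asarray(sample, float) - np.asarray(center, float)
    along = offset @ locus_dir                # signed offset along the locus
    perp = offset - along * locus_dir         # residual perpendicular component
    return abs(along) <= d_along and np.linalg.norm(perp) <= d_perp

# Placeholder locus direction and thresholds; d_along and d_perp could also be
# widened or narrowed as a function of the EIT, as discussed above.
locus_dir = np.array([1.0, -0.5])
locus_dir /= np.linalg.norm(locus_dir)
print(in_detection_region((0.46, 0.52), center=(0.45, 0.55),
                          locus_dir=locus_dir, d_along=0.05, d_perp=0.03))
```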
  • According to one embodiment, the setting unit 110 may additionally use a variance of the color gamut of the image to be input and the central point of the color gamut, so as to set the illuminant detection region, as shown in FIG. 6A.
  • In order to obtain the variance of the color gamut, the setting unit 110 may select the illustrated threshold number of data 610-1 through 610-4 in order of nearness to four reference points O, P, Q, and R on the chromaticity coordinates in the color gamut of the image to be input.
  • The four reference points, according to an embodiment of the present invention, include, as shown in FIG. 6B, an origin O (0, 0) of the chromaticity coordinates, a point P (Cr-max, 0) indicative of the maximum Cr value, Cr-max, which can be possessed by a general image on a Cr-axis, a point Q (Cb-max, 0) indicative of the maximum Cb value, Cb-max, which can be possessed by a general image on a Cb-axis, and a coordinate point R (Cr-max, Cb-max) indicative of the maximum Cr value and the maximum Cb value.
  • FIG. 6C illustrates four such reference points, according to an alternative embodiment of the present invention, with two reference points O (0, 0) and R (Cr-max, Cb-max) among the four reference points being similar to those of the embodiment in FIG. 6B. However, the other two reference points P and Q are cross points formed by the axes of the two chromaticity coordinates and an extension of the reference illuminant locus 620. The reference illuminant locus 620 is a trend line based on the chromaticity of diverse types of standard illuminants (e.g., D65, D50, CWF (Cool White Fluorescent), A, Horizon, and others) appropriate to the characteristics of the device (e.g., a digital still camera including the white balancing system 100) capturing the image.
  • As illustrated in FIG. 6D, once the threshold number of data has been selected in order of nearness to each reference point, the setting unit 110 may determine edge points 630-1, 630-2, 630-3, and 630-4, each having the average chromatic value of the data selected for the corresponding reference point.
  • After that, as illustrated in FIG. 6E, the setting unit 110 may calculate a distance between the edge points derived from the diagonal reference points among four reference points. That is, the setting unit 110 may calculate a distance 640 (referred to as a color gamut height) between the edge point 630-1 derived from the reference point O and the edge point 630-4 derived from the reference point R, and a distance 650 (referred to as a color gamut width) between the edge point 630-2 derived from the reference point P and the edge point 630-3 derived from the reference point Q.
  • Then, the setting unit 110 may determine whether the color gamut height 640 and the color gamut width 650 each fall within a given threshold range. If the color gamut height 640 and the color gamut width 650 satisfy the given threshold range, the setting unit 110 may use the illuminant detection region determined in accordance with the EIT as it is, since it may be considered that the input image has a normal color distribution. However, if the color gamut height 640 and the color gamut width 650 are found to be outside of the threshold range, it can be understood that the input image has an abnormal color distribution, since the color gamut of the input image is excessively wider or narrower than the normal case. In this instance, where the color gamut height 640 and the color gamut width 650 are outside of the threshold range, if a portion of the color gamut of the input image were to be determined to be the illuminant detection region in accordance with the EIT, there is a high possibility that the correct illuminant would not be detected. Accordingly, the setting unit 110 can set the color gamut of the input image as the illuminant detection region irrespective of the EIT, such as through conventional techniques, after detecting that the color gamut height or the color gamut width is out of the threshold range.
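The sketch below makes this check concrete: for each of the four reference points it averages the nearest samples into an edge point, measures the two diagonal distances (the color gamut height between the O- and R-edge points and the width between the P- and Q-edge points), and keeps the EIT-based region only when both distances fall inside an acceptance band. The reference-point coordinates, the sample count, and the acceptance band are placeholder assumptions.

```python
import numpy as np

def edge_point(samples, ref, k):
    """Average chromaticity of the k samples nearest the given reference point."""
    dist = np.linalg.norm(samples - ref, axis=1)
    return samples[np.argsort(dist)[:k]].mean(axis=0)

def gamut_height_width(samples, cr_max=1.0, cb_max=1.0, k=8):
    """Diagonal extents of the color gamut, following the edge-point construction.

    `samples` holds (cr, cb) chromaticities. O, P, Q and R are taken as the
    origin, the far point on the Cr axis, the far point on the Cb axis and the
    far corner, which is one reading of the reference points described above.
    """
    samples = np.asarray(samples, float)
    refs = [np.array([0.0, 0.0]),        # O
            np.array([cr_max, 0.0]),     # P
            np.array([0.0, cb_max]),     # Q
            np.array([cr_max, cb_max])]  # R
    e_o, e_p, e_q, e_r = (edge_point(samples, r, k) for r in refs)
    height = np.linalg.norm(e_r - e_o)   # distance 640 (color gamut height)
    width = np.linalg.norm(e_q - e_p)    # distance 650 (color gamut width)
    return height, width

def keep_eit_region(height, width, band=(0.15, 0.80)):
    """Keep the EIT-based detection region only if both extents fall in the band."""
    lo, hi = band
    return lo <= height <= hi and lo <= width <= hi

samples = np.random.default_rng(0).uniform(0.2, 0.7, size=(200, 2))  # stand-in data
h, w = gamut_height_width(samples)
print(h, w, keep_eit_region(h, w))
```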
  • Since the variance of the color gamut indicates the uniformity of the color gamut, embodiments of the present invention are not limited to the above described methods of calculating the variance of the color gamut. For example, the setting unit 110 may select the threshold number of data in the color gamut of the input image in near order from four reference points, and predict the variance of the color gamut by use of the distance between the points having the mean chromaticity of the selected data. Alternate methods are also available.
  • Returning to FIG. 6E, in another embodiment, the central point of the color gamut can be determined as a cross point 660 of a segment representing the color gamut height and the color gamut width. The setting unit 110 may determine whether the use of the illuminant detection region determined in accordance with the EIT is appropriate through the central point 660 of the color gamut.
  • For example, modeling can be performed of the point, on which the illuminant of the image exists, on the chromaticity coordinates in accordance with the central point 660 of the color gamut, similar to the method of modeling the point, on which the illuminant of the image exists, on the chromaticity coordinates in accordance with the EIT. That is, the point, on which the illuminant of the image can exist, may be set as a desired number of regions on the chromaticity coordinates in accordance with the central point 660 of the color gamut.
  • Then, it can be determined whether the use of the illuminant detection region determined by the EIT is appropriate, through whether the region on the chromaticity coordinates corresponding to the central point 660 of the color gamut of the input image overlaps the illuminant detection region determined in accordance with the EIT. If the region on the chromaticity coordinates corresponding to the central point 660 of the color gamut is identical or sufficiently similar to the illuminant detection region determined by the EIT, the illuminant detection region determined by the EIT can be used as it is. However, if the region on the chromaticity coordinates corresponding to the central point 660 of the color gamut is not identical or sufficiently similar to the illuminant detection region determined by the EIT, the color gamut of the input image can be set as the illuminant detection region irrespective of the EIT.
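One possible form of this consistency check is sketched below, with the region implied by the gamut central point and the EIT-based region both treated as axis-aligned rectangles on the chromaticity coordinates; the rectangle bounds are purely illustrative.

```python
def rects_overlap(a, b):
    """Overlap test for regions given as (cb_min, cb_max, cr_min, cr_max)."""
    return a[0] <= b[1] and b[0] <= a[1] and a[2] <= b[3] and b[2] <= a[3]

def choose_detection_region(eit_region, center_region, full_gamut_region):
    """Keep the EIT-based region when it agrees with the region implied by the
    gamut central point 660; otherwise fall back to the whole color gamut."""
    return eit_region if rects_overlap(eit_region, center_region) else full_gamut_region

# Placeholder rectangles purely for illustration.
eit_region = (0.35, 0.50, 0.50, 0.65)      # region chosen from the EIT
center_region = (0.30, 0.40, 0.55, 0.70)   # region modelled from the gamut centre
full_gamut = (0.10, 0.90, 0.10, 0.90)      # whole color gamut of the input image
print(choose_detection_region(eit_region, center_region, full_gamut))
```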
  • An operation of detecting the illuminant may, thus, be performed by the detection unit 120 in FIG. 1, for example, and may correspond to operation S220 in FIG. 2, also for example, noting that alternative operations and units for accomplishing the same are equally available. The detection unit 120 may include, as shown in FIG. 7, a division unit 710, a comparative value determination unit 720, and an illuminant estimation unit 730, for example.
  • The division unit 710 may divide the data contained in the illuminant detection region, e.g., as set by the setting unit 110 in the color gamut of an input image, into two groups on the basis of luminance. FIG. 8 illustrates an example operation of the division unit 710, noting that alternative operations and units for accomplishing the same are equally available.
  • In reference to FIG. 8, a median luminance value and a mean luminance value of the data contained in the illuminant detection region may be calculated, in operation S810. A threshold number of data (referred to as upper luminance data) may be selected in order of decreasing luminance from the luminance distribution of the data contained in the illuminant detection region, and the same number of data (referred to as lower luminance data) may be selected in order of increasing luminance. In an embodiment, the median luminance value Ymedian may be calculated as the mean of the average luminance value of the upper luminance data and the average luminance value of the lower luminance data. In addition, the mean luminance value Yavg may be calculated as the average of the luminance values of all data contained in the illuminant detection region.
  • Then, a threshold luminance value, to be used to divide the data in the illuminant detection region into two groups, may be calculated by using the median luminance value and the mean luminance value, in operation S820. The threshold luminance value may be determined by a weighted sum of the median luminance value of the data contained in the illuminant detection region and the mean luminance value thereof, which may be expressed by the below Equation 1, for example.
    Ythresh = k·Ymedian + (1 − k)·Yavg   (Equation 1)
  • Here, Ythresh is the threshold luminance value to be calculated, Ymedian is the median luminance value, and Yavg is the mean luminance value. In addition, k may be a weighting value between 0 and 1, for example.
  • If the threshold luminance value is calculated, the data in the illuminant detection region may be divided into two groups on the basis of the threshold luminance value, in operation S830. For example, the data having a luminance more than a threshold luminance value among the data in the illuminant detection region may be classified into the first group, and the data having a luminance less than a threshold luminance value may be classified into the second group.
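A compact sketch of operations S810 through S830 follows, using Equation 1 with a placeholder weight k and a placeholder count for the upper and lower luminance samples.

```python
import numpy as np

def split_by_luminance(luma, k=0.5, n_extreme=16):
    """Divide detection-region data into bright and dark groups (S810-S830).

    luma      : 1-D array of luminance values of the data in the detection region
    k         : weight used in Equation 1 (placeholder value)
    n_extreme : number of upper/lower luminance samples averaged for Ymedian
                (placeholder value)
    """
    luma = np.asarray(luma, float)
    ordered = np.sort(luma)
    y_median = 0.5 * (ordered[-n_extreme:].mean() + ordered[:n_extreme].mean())
    y_avg = luma.mean()
    y_thresh = k * y_median + (1 - k) * y_avg       # Equation 1
    bright = luma >= y_thresh                       # first group
    return bright, ~bright, y_thresh

rng = np.random.default_rng(1)
luma = np.concatenate([rng.normal(60, 10, 300), rng.normal(180, 15, 100)])
first, second, thresh = split_by_luminance(luma)
print(round(thresh, 1), int(first.sum()), int(second.sum()))
```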
  • The comparative value determination unit 720 in FIG. 7 may set a comparative illuminant to be a standard of determining the illuminant. FIG. 9 illustrates an example operation of the comparative value determination unit 720, noting that alternative operations and units for accomplishing the same are equally available.
  • In reference to FIG. 9, an average chromatic value of the data contained in the illuminant detection region and a median chromatic value thereof may be calculated, in operation S910. Here, for example, the average chromatic value may be calculated as the average of the chromatic values of the data contained in the illuminant detection region, and the median chromatic value may be taken as the chromatic value of the center point of the illuminant detection region.
  • Then, a weighted average may be calculated of the average chromatic value and the median chromatic value, in operation S920, as expressed by the below Equation 2, for example.
    Chw = m·Chavg + (1 − m)·Chmedian   (Equation 2)
  • Here, Chw is the weighted average to be calculated, Chavg is the average chromatic value, and Chmedian is the median chromatic value. In addition, m may be a weighting value between 0 and 1, for example.
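In code, Equation 2 amounts to the small sketch below, with the median chromatic value taken from the centre point of the detection region and m a placeholder weight.

```python
import numpy as np

def weighted_chroma(samples, region_center, m=0.5):
    """Weighted average Chw of Equation 2 for (cb, cr) samples in the region."""
    ch_avg = np.asarray(samples, float).mean(axis=0)   # average chromatic value
    ch_median = np.asarray(region_center, float)       # chromatic value of the region centre
    return m * ch_avg + (1 - m) * ch_median            # Equation 2

print(weighted_chroma([(0.44, 0.56), (0.47, 0.53), (0.45, 0.58)],
                      region_center=(0.45, 0.55)))
```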
  • An average may be calculated of chromatic values of the data contained in each of two divided groups, e.g., as divided by the division unit 710, in operation S930. Hereinafter, the average of the chromatic values of the data contained in the first group will be referred to as the first average, and the average of the chromatic values of the data contained in the second group will be referred to as the second average.
  • A difference value may further be calculated between the first average and the second average, in operation S940. Next, a comparative illuminant may be set by using the weighted average Chw, e.g., calculated in the operation S920, the difference value, e.g., calculated in the operation S940, and a standard illuminant (e.g., D65, D50, CWF, A, Horizon, and others) of a device providing a corresponding image frame (e.g., a digital still camera comprising the white balancing system 100) as an input value, in operation S950. In order to obtain the comparative illuminant, the below Equation 3 may be used, for example.
    Wref(r,b) = F2(F1(Chw, DEVw), Chdist)   (Equation 3)
  • Here, Wref(r,b) is a chromatic value of the comparative illuminant, Chw is a weighted average calculated in operation S920, for example, DEVw is a standard illuminant, and Chdist is a difference value between the first average and the second average calculated in the operation S940, for example. In addition, F1 may be a quadratic correlation function, and F2 may be a linear correlation function.
  • The correlation function F1 may be a function reflecting the correlation between the standard illuminants and the values of Chw observed under those illuminants, and may serve as the basic comparative estimating function for estimating the point of the illuminant in the image, for example. The correlation function F2 may be a modeling function that takes Chdist into account in the standard illuminant locus function, and may compensate for the comparative illuminant estimation performance of the correlation function F1, for example.
  • The orders of the functions F1 and F2 can be varied; setting F1 as a quadratic function and F2 as a linear function, respectively, is one example of optimizing the complexity. A concrete embodiment of F1 and F2 can be understood through the below Equations 4 through 6, for example.
    Σ(|DEVw − α·Chw² − β·Chw − γ|) ≅ 0   Equation 4:
    F1 = α·Chw² + β·Chw + γ   Equation 5:
    F2 = θ·(F1(Chw, DEVw)·Chdist) + ζ   Equation 6:
  • Here, in Equations 4 through 6, α, β, γ, θ, and ζ are certain real numbers, and may be determined as proper values based on experimental results. For example, α, β, and γ may preferably satisfy the relation in Equation 4, noting that alternative embodiments are equally available.
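  • One possible illustration of Equations 3 through 6 is sketched below; treating the standard illuminant information DEVw as entering only through coefficients fitted per the relation in Equation 4 (here via a least-squares polynomial fit over reference captures), as well as the helper function names, are assumptions of this sketch rather than requirements of the embodiments.

    import numpy as np

    def fit_f1_coefficients(chw_samples, devw_samples):
        # Equation 4 as a least-squares fit: choose alpha, beta, gamma so that
        # |DEVw - alpha*Chw^2 - beta*Chw - gamma| is approximately 0 over
        # reference captures taken under the standard illuminants.
        return np.polyfit(np.asarray(chw_samples, dtype=float),
                          np.asarray(devw_samples, dtype=float), deg=2)

    def comparative_illuminant(chw, chdist, f1_coeffs, theta, zeta):
        alpha, beta, gamma = f1_coeffs
        f1 = alpha * chw ** 2 + beta * chw + gamma   # Equation 5 (quadratic correlation)
        return theta * (f1 * chdist) + zeta          # Equations 3 and 6 (linear correlation)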
  • Referring again to FIG. 7, in identifying the initial illuminant, the illuminant estimation unit 730 may determine, as the initial illuminant, whichever of the first average and the second average, e.g., as calculated in the above operation S930, has the smaller chromatic difference from the comparative illuminant, e.g., as determined in the above operation S950. That is, the illuminant estimation unit 730 may calculate a chromatic difference (referred to as the first chromatic difference) between the first average and the comparative illuminant, and a chromatic difference (referred to as the second chromatic difference) between the second average and the comparative illuminant, and compare the first chromatic difference and the second chromatic difference. If the first chromatic difference is smaller than the second chromatic difference, the first average may be determined to be the initial illuminant, while if the second chromatic difference is smaller than the first chromatic difference, the second average may be determined to be the initial illuminant.
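  • A minimal sketch of this selection step, assuming chromaticities are represented as (r, b) pairs and the chromatic difference is measured as a Euclidean distance in that plane:

    import numpy as np

    def select_initial_illuminant(first_avg, second_avg, w_ref):
        # Pick whichever group average lies closer to the comparative illuminant.
        first_avg = np.asarray(first_avg, dtype=float)    # (r, b) of the first group
        second_avg = np.asarray(second_avg, dtype=float)  # (r, b) of the second group
        w_ref = np.asarray(w_ref, dtype=float)            # comparative illuminant Wref(r,b)

        first_diff = np.linalg.norm(first_avg - w_ref)    # first chromatic difference
        second_diff = np.linalg.norm(second_avg - w_ref)  # second chromatic difference
        return first_avg if first_diff < second_diff else second_avg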
  • With reference to FIG. 10, distorted information may be reflected in the illuminant detected by the detection unit 120, in accordance with the deviation between devices providing the image or the amount of data sampled to detect the illuminant. For example, if the illuminant is detected according to the data contained in the illuminant detection region, the position of the illuminant detection region may be determined to be in the specified range in accordance with the EIT, but the amount of data or the chromaticity information input into the illuminant detection region may vary depending upon the chromaticity variation of the device providing the image. In this instance, since the illustrated probability distribution 1020, in which the illuminant information exists in the illuminant detection region 1010, may not conform to the reference illuminant locus axis 1030, it becomes desirable to provide a new illuminant locus axis 1040 by adjusting the reference illuminant locus axis 1030.
  • In order to compensate for this distortion phenomenon, the stabilization unit 130, for example, may stabilize the initial illuminant detected by the detection unit 120 based on the reference illuminant locus and the average chromatic value of the data input into the illuminant detection region in the color gamut of the input image. FIG. 11 illustrates an example operation of the stabilization unit 130, noting that alternative operations and units for accomplishing the same are equally available.
  • In reference to FIG. 11, the stabilization unit 130 may calculate a weighted average point between the average chromatic value of data input into the illuminant detection region in the color gamut of the image and the chromatic value of the initial illuminant, in operation S1110. The weighted average may be expressed by the below Equation 7, for example.
    Wavg = N·Wi + (1 − N)·Chavg   Equation 7:
  • Here, Wavg is the weighted average point to be calculated, Wi is the chromatic value of the initial illuminant, e.g., as determined by the illuminant estimation unit 730, and Chavg is the average of the chromatic values of the data contained in the illuminant detection region, as also used in Equation 2. In addition, N may be a weighting value between 0 and 1, for example.
  • As shown in FIG. 12A, a new illuminant locus 1230 which contains a weighted average point 1210 and is parallel with the reference illuminant locus 1220 may then be set, in operation S1120.
  • Further, as shown in FIG. 12B, a point 1250 in which the initial illuminant 1240 is projected on the new illuminant locus 1230 in a vertical direction may be determined, in operation S1130.
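  • The stabilization of operations S1110 through S1130 may be sketched as follows; modeling the reference illuminant locus 1220 as a straight line of known slope in the (r, b) chromaticity plane, using an orthogonal projection onto the shifted locus for operation S1130, and the example weight N are assumptions of this illustration.

    import numpy as np

    def stabilize_illuminant(w_initial, ch_avg, locus_slope, n=0.5):
        w_i = np.asarray(w_initial, dtype=float)   # chromatic value of the initial illuminant (Wi)
        ch_avg = np.asarray(ch_avg, dtype=float)   # average chromatic value of the region data (Chavg)

        # Equation 7 (operation S1110): weighted average point between the two chromaticities.
        w_avg = n * w_i + (1.0 - n) * ch_avg

        # Operation S1120: new illuminant locus through w_avg, parallel to the reference locus.
        direction = np.array([1.0, locus_slope], dtype=float)
        direction /= np.linalg.norm(direction)

        # Operation S1130: projection of the initial illuminant onto the new locus.
        return w_avg + np.dot(w_i - w_avg, direction) * direction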
  • In embodiments of the present invention, the term “unit” indicating a respective component of the white balancing system 100, for example, may be constructed as a module, for example. Here, the term “module”, as used herein, means, but is not limited to, a software and/or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks. A module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors. Thus, a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The operation provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules.
  • In addition to the above described embodiments, embodiments of the present invention can also be implemented through computer readable code/instructions in/on a medium, e.g., a computer readable medium, to control at least one processing element to implement any above described embodiment. The medium can correspond to any medium/media permitting the storing and/or transmission of the computer readable code.
  • The computer readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as carrier waves, as well as through the Internet, for example. Thus, the medium may further be a signal, such as a resultant signal or bitstream, according to embodiments of the present invention. The media may also be a distributed network, so that the computer readable code is stored/transferred and executed in a distributed fashion. Still further, as only an example, the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
  • In accordance with the above description, one or more embodiments of the present invention include a white balancing method, medium, and system, where the color reproducing performance can be improved through more stabilized illuminant estimation.
  • Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (31)

1. A method with white balancing, comprising:
setting an illuminant detection region of an image based on an exposure integration time indicative of an amount of light collected for the image when the image was captured; and
detecting an illuminant of the image by using data relative to the illuminant detection region in a color gamut of the image.
2. The method of claim 1, wherein the setting of the illuminant detection region comprises determining a candidate region which corresponds to the exposure integration time, from among a plurality of predefined candidate regions set in accordance with respective exposure integration times, as the illuminant detection region.
3. The method of claim 1, wherein the setting of the illuminant detection region comprises determining a given range, as the illuminant detection region, from a point corresponding to the exposure integration time in a specified locus on predefined chromaticity coordinates set based on a degree of illuminant distribution according to respective exposure integration times.
4. The method of claim 3, wherein the given range is variable depending upon the exposure integration time.
5. The method of claim 1, wherein the setting of the illuminant detection region further comprises determining the illuminant detection region by using at least one of a variance of the color gamut of the image and a center point.
6. The method of claim 5, wherein the determining of the illuminant detection region comprises adjusting the illuminant detection region based on whether the variance of the color gamut satisfies a specified threshold range.
7. The method of claim 1, wherein the detecting of the illuminant comprises:
dividing the data relative to the illuminant detection region into a first group and a second group in accordance with luminance distribution of the data relative to the illuminant detection region in the color gamut of the image; and
determining at least one of a first average luminance value of data relative to the first group and a second average luminance value of data relative to the second group as the illuminant.
8. The method of claim 7, wherein the dividing comprises dividing the data relative to the illuminant detection region into the first group and the second group based on a threshold luminance value derived from a weighted sum of an average luminance value of the data relative to the illuminant detection region and a median luminance value of the illuminant detection region.
9. The method of claim 7, wherein the determining of the at least one of the first average luminance value and the second average luminance value comprises determining any one of the first average luminance value and the second average luminance value as the illuminant, in which when a chromatic difference between the first average luminance value and a given comparative illuminant and a chromatic difference between the second average luminance value and the given comparative illuminant are compared, based on a respective lowest chromatic difference.
10. The method of claim 9, wherein the comparative illuminant is obtained from a correlation function using at least one among a weighted average value between an average chromatic value of the data relative to the illuminant detection region and a median chromatic value of the illuminant detection region, a difference value between the first average luminance value and the second average luminance value, and standard illuminant information of a system which captured the image, as an input value.
11. The method of claim 10, wherein the correlation function is a linear correlation function using a resultant value of a sub-correlation function and the difference value as an input value, and the sub-correlation function is a quadratic correlation function using the weighted average value and the standard illuminant information as an input value.
12. The method of claim 1, further comprising compensating the determined illuminant.
13. The method of claim 12, wherein the compensating comprises modifying the determined illuminant by setting a point, in which the determined illuminant is perpendicularly reflected on a given illuminant locus, as the determined illuminant.
14. The method of claim 13, wherein the given illuminant locus is parallel to a trend line of a plurality of standard illuminants associated with a characteristic of a system which captured the image, and has a weighted average point of a chromatic value of the determined illuminant and an average chromatic value of the data relative to the illuminant detection region.
15. The method of claim 1, further comprising performing white balancing on the image based on the determined illuminant.
16. At least one medium comprising computer readable code to control at least one processing element to implement the method of claim 1.
17. A system, comprising:
a setting unit to set an illuminant detection region of an image based on an exposure integration time indicative of an amount of light collected for the image when the image was captured; and
a detection unit to detect an illuminant of the image by using data relative to the illuminant detection region in a color gamut of the image.
18. The system of claim 17, wherein the setting unit determines a candidate region which corresponds to the exposure integration time, from among a plurality of predefined candidate regions set in accordance with respective exposure integration times, as the illuminant detection region.
19. The system of claim 17, wherein the setting unit determines a given range, as the illuminant detection region, from a point corresponding to the exposure integration time in a specified locus on predefined chromaticity coordinates set based on a degree of illuminant distribution according to respective exposure integration times.
20. The system of claim 19, wherein the given range is variable depending upon the exposure integration time.
21. The system of claim 17, wherein the setting unit determines the illuminant detection region by using at least one of a variance of the color gamut of the image and a center point.
22. The system of claim 21, wherein the setting unit adjusts the illuminant detection region according to whether the variance of the color gamut satisfies a specified threshold range.
23. The system of claim 17, wherein the setting unit comprises:
a division unit to divide the data relative to the illuminant detection region into a first group and a second group in accordance with luminance distribution of the data relative to the illuminant detection region in the color gamut of the image; and
an illuminant determination unit to determine at least one of a first average luminance value of data relative to the first group and a second average luminance value of the data relative to the second group as the illuminant.
24. The system of claim 23, wherein the division unit divides the data relative to the illuminant detection region into the first group and the second group based on a threshold luminance value derived from a weighted sum of an average luminance value of the data relative to the illuminant detection region and a median luminance value of the illuminant detection region.
25. The system of claim 23, wherein the illuminant determination unit determines any one of the first average luminance value and the second average luminance value as the illuminant, in which when a chromatic difference between the first average luminance value and a given comparative illuminant and a chromatic difference between the second average luminance value and the given comparative illuminant are compared, based on a respective lowest chromatic difference.
26. The system of claim 25, wherein the comparative illuminant is obtained from a correlation function using at least one among a weighted average value between an average chromatic value of the data relative to the illuminant detection region and a median chromatic value of the illuminant detection region, a difference value between the first average luminance value and the second average luminance value, and standard illuminant information of a system which captured the image, as an input value.
27. The system of claim 26, wherein the correlation function is a linear correlation function using a resultant value of a sub-correlation function and the difference value as an input value, and the sub-correlation function is a quadratic correlation function using the weighted average value and the standard illuminant information as an input value.
28. The system of claim 17, further comprising a stabilization unit compensating the determined illuminant.
29. The system of claim 28, wherein the stabilization unit modifies the determined illuminant by setting a point, in which the determined illuminant is perpendicularly reflected on a given illuminant locus, as the determined illuminant.
30. The system of claim 29, wherein the given illuminant locus is parallel to a trend line of a plurality of standard illuminants associated with a characteristic of a system which captured the image, and has a weighted average point of a chromatic value of the determined illuminant and an average chromatic value of the data relative to the illuminant detection region.
31. The system of claim 17, further comprising a white balancing unit which performs white balancing on the image based on the determined illuminant.
US11/802,225 2006-05-26 2007-05-21 Automatic white balancing method, medium, and system Active 2031-05-20 US8624995B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020060047751A KR100809344B1 (en) 2006-05-26 2006-05-26 Method and apparatus for auto white balancing
KR10-2006-0047751 2006-05-26

Publications (2)

Publication Number Publication Date
US20070285530A1 2007-12-13
US8624995B2 (en) 2014-01-07

Family

ID=38821509

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/802,225 Active 2031-05-20 US8624995B2 (en) 2006-05-26 2007-05-21 Automatic white balancing method, medium, and system

Country Status (3)

Country Link
US (1) US8624995B2 (en)
JP (1) JP4472722B2 (en)
KR (1) KR100809344B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101024067B1 (en) * 2008-12-30 2011-03-22 엠텍비젼 주식회사 Apparatus for auto white balancing, method for auto white balancing considering auto-exposure time and recorded medium for performing method for auto white balancing
KR101509992B1 (en) * 2013-11-28 2015-04-08 창원대학교 산학협력단 A method and apparatus for illuminant compensation based on highlight region selection

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2506584B2 (en) 1991-06-20 1996-06-12 松下電器産業株式会社 Image judgment device
JP3184573B2 (en) 1991-09-19 2001-07-09 キヤノン株式会社 Imaging device
JP2000102030A (en) 1998-09-25 2000-04-07 Ricoh Co Ltd White balance controller
JP4029206B2 (en) 2001-01-15 2008-01-09 株式会社ニコン Imaging device
US7158174B2 (en) 2002-04-04 2007-01-02 Eastman Kodak Company Method for automatic white balance of digital images
JP4352730B2 (en) 2003-03-12 2009-10-28 セイコーエプソン株式会社 Auto white balance processing apparatus and method, and image signal processing apparatus
JP2005033609A (en) 2003-07-08 2005-02-03 Fuji Film Microdevices Co Ltd Solid-state image-taking device and digital camera
JP2005109930A (en) 2003-09-30 2005-04-21 Fuji Photo Film Co Ltd Image processor, image processing program, recording medium and image processing method
JP2005223898A (en) * 2004-01-09 2005-08-18 Sony Corp Image processing method and imaging apparatus
JP2005236375A (en) 2004-02-17 2005-09-02 Seiko Epson Corp Digital camera, control method and control program thereof
JP2006033158A (en) 2004-07-13 2006-02-02 Canon Inc Imaging apparatus

Patent Citations (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5319758A (en) * 1989-02-01 1994-06-07 Hitachi, Ltd. Method for managing multiple virtual storages divided into families
US5448502A (en) * 1991-06-20 1995-09-05 Matsushita Electric Industrial Co., Ltd. Devices for judging image on the basis of gray world assumption, discriminating color chart, deducing light source and regulating color
US5530474A (en) * 1991-09-05 1996-06-25 Canon Kabushiki Kaisha White balance correction device with correction signal limiting device
US5684359A (en) * 1994-06-06 1997-11-04 Matsushita Electric Industrial Co., Ltd. Discharge lamp and illumination instrument for general illumination
US6421047B1 (en) * 1996-09-09 2002-07-16 De Groot Marc Multi-user virtual reality system for simulating a three-dimensional environment
US6069632A (en) * 1997-07-03 2000-05-30 International Business Machines Corporation Passageway properties: customizable protocols for entry and exit of places
US6912565B1 (en) * 1997-10-22 2005-06-28 British Telecommunications Public Limited Company Distributed virtual environment
US6504952B1 (en) * 1998-03-17 2003-01-07 Fuji Photo Film Co. Ltd. Image processing method and apparatus
US6572662B2 (en) * 1998-05-15 2003-06-03 International Business Machines Corporation Dynamic customized web tours
US6359651B1 (en) * 1998-10-06 2002-03-19 Nikon Corporation Electronic camera using flash for exposure control
US6629112B1 (en) * 1998-12-31 2003-09-30 Nortel Networks Limited Resource management for CORBA-based applications
US6665434B1 (en) * 1999-03-11 2003-12-16 Fuji Photo Film Co., Ltd. Device, method, and recordium for correcting color imbalance of an image
US7240067B2 (en) * 2000-02-08 2007-07-03 Sybase, Inc. System and methodology for extraction and aggregation of data from dynamic content
US20090024636A1 (en) * 2000-03-23 2009-01-22 Dekel Shiloh Method and system for securing user identities and creating virtual users to enhance privacy on a communication network
US7412422B2 (en) * 2000-03-23 2008-08-12 Dekel Shiloh Method and system for securing user identities and creating virtual users to enhance privacy on a communication network
US20010037316A1 (en) * 2000-03-23 2001-11-01 Virtunality, Inc. Method and system for securing user identities and creating virtual users to enhance privacy on a communication network
US20030115132A1 (en) * 2000-07-17 2003-06-19 Iggland Benny Ingemar Method for virtual trading
US20030177195A1 (en) * 2000-08-08 2003-09-18 Kyungsook Han Multimedia communication method using virtual world interface in mobile personal computers
US6788813B2 (en) * 2000-10-27 2004-09-07 Sony Corporation System and method for effectively performing a white balance operation
US6798407B1 (en) * 2000-11-28 2004-09-28 William J. Benman System and method for providing a functional virtual environment with real time extracted and transplanted images
US7155680B2 (en) * 2000-12-27 2006-12-26 Fujitsu Limited Apparatus and method for providing virtual world customized for user
US20020129106A1 (en) * 2001-03-12 2002-09-12 Surgency, Inc. User-extensible system for manipulating information in a collaborative environment
US20020152268A1 (en) * 2001-03-19 2002-10-17 Microsoft Corporation System and method for communications management and data exchange
US7269632B2 (en) * 2001-06-05 2007-09-11 Xdyne, Inc. Networked computer system for communicating and operating in a virtual reality environment
US20030020826A1 (en) * 2001-06-25 2003-01-30 Nasser Kehtarnavaz Automatic white balancing via illuminant scoring autoexposure by neural network mapping
US20030014524A1 (en) * 2001-07-11 2003-01-16 Alexander Tormasov Balancing shared servers in virtual environments
US6677976B2 (en) * 2001-10-16 2004-01-13 Sprint Communications Company, LP Integration of video telephony with chat and instant messaging environments
US20040054740A1 (en) * 2002-09-17 2004-03-18 Daigle Brian K. Extending functionality of instant messaging (IM) systems
US20040068518A1 (en) * 2002-10-03 2004-04-08 International Business Machines Corporation Layered virtual identity system and method
US20060181535A1 (en) * 2003-07-22 2006-08-17 Antics Technologies Limited Apparatus for controlling a virtual environment
US7352895B2 (en) * 2003-09-30 2008-04-01 Sharp Laboratories Of America, Inc. Systems and methods for illuminant model estimation
US20060078182A1 (en) * 2004-01-07 2006-04-13 Gil Zwirn Methods and apparatus for analyzing ultrasound images
US20050216558A1 (en) * 2004-03-12 2005-09-29 Prototerra, Inc. System and method for client side managed data prioritization and connections
US7728880B2 (en) * 2004-06-25 2010-06-01 Qualcomm Incorporated Automatic white balance method and apparatus
US20060123127A1 (en) * 2004-12-08 2006-06-08 Evil Twin Studios, Inc. System and method for organizing online communities and virtual dwellings within a virtual environment
US20070203828A1 (en) * 2005-02-04 2007-08-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Real-world incentives offered to virtual world participants
US20060176379A1 (en) * 2005-02-09 2006-08-10 Fuji Photo Film Co., Ltd. White balance control method, white balance control apparatus and image-taking apparatus
US20070025718A1 (en) * 2005-07-29 2007-02-01 Keiichi Mori Digital camera, image capture method, and image capture control program
US20070130001A1 (en) * 2005-11-18 2007-06-07 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Real-world profile data for making virtual world contacts
US20070126733A1 (en) * 2005-12-02 2007-06-07 Electronics And Telecommunications Research Institute Apparatus and method for immediately creating and controlling virtual reality interactive human body model for user-centric interface
US20070294171A1 (en) * 2006-06-06 2007-12-20 Eric Sprunk Method and apparatus for providing a virtual universe
US20080005237A1 (en) * 2006-06-28 2008-01-03 The Boeing Company. System and method of communications within a virtual environment
US20080059570A1 (en) * 2006-09-05 2008-03-06 Aol Llc Enabling an im user to navigate a virtual world
US7996818B1 (en) * 2006-12-29 2011-08-09 Amazon Technologies, Inc. Method for testing using client specified references
US20090089684A1 (en) * 2007-10-01 2009-04-02 Boss Gregory J Systems, methods, and media for temporal teleport in a virtual world environment
US20090106347A1 (en) * 2007-10-17 2009-04-23 Citrix Systems, Inc. Methods and systems for providing access, from within a virtual world, to an external resource

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8471852B1 (en) 2003-05-30 2013-06-25 Nvidia Corporation Method and system for tessellation of subdivision surfaces
US8571346B2 (en) 2005-10-26 2013-10-29 Nvidia Corporation Methods and devices for defective pixel detection
US8456549B2 (en) 2005-11-09 2013-06-04 Nvidia Corporation Using a graphics processing unit to correct video and audio data
US8456547B2 (en) 2005-11-09 2013-06-04 Nvidia Corporation Using a graphics processing unit to correct video and audio data
US8456548B2 (en) 2005-11-09 2013-06-04 Nvidia Corporation Using a graphics processing unit to correct video and audio data
US8588542B1 (en) 2005-12-13 2013-11-19 Nvidia Corporation Configurable and compact pixel processing apparatus
US8768160B2 (en) 2006-02-10 2014-07-01 Nvidia Corporation Flicker band automated detection system and method
US8737832B1 (en) 2006-02-10 2014-05-27 Nvidia Corporation Flicker band automated detection system and method
US8594441B1 (en) 2006-09-12 2013-11-26 Nvidia Corporation Compressing image-based data using luminance
US7688358B2 (en) * 2006-12-14 2010-03-30 Eastman Kodak Company Image capturing apparatus and white balance processing apparatus
US20080143845A1 (en) * 2006-12-14 2008-06-19 Takanori Miki Image capturing apparatus and white balance processing apparatus
US8723969B2 (en) 2007-03-20 2014-05-13 Nvidia Corporation Compensating for undesirable camera shakes during video capture
US8106961B2 (en) * 2007-06-29 2012-01-31 Fujifilm Corporation Image processing method, apparatus and computer program product, and imaging apparatus, method and computer program product
US8462228B2 (en) 2007-06-29 2013-06-11 Fujifilm Corporation Image processing method, apparatus and computer program product, and imaging apparatus, method and computer program product
US20090002519A1 (en) * 2007-06-29 2009-01-01 Tomokazu Nakamura Image processing method, apparatus and computer program product, and imaging apparatus, method and computer program product
US8724895B2 (en) 2007-07-23 2014-05-13 Nvidia Corporation Techniques for reducing color artifacts in digital images
US8570634B2 (en) 2007-10-11 2013-10-29 Nvidia Corporation Image processing of an incoming light field using a spatial light modulator
US8620074B2 (en) * 2007-12-03 2013-12-31 Omnivision Technologies, Inc. Image sensor apparatus and method for scene illuminant estimation
US20120224084A1 (en) * 2007-12-03 2012-09-06 Zhaojian Li Image Sensor Apparatus And Method For Scene Illuminant Estimation
US8768055B2 (en) 2007-12-03 2014-07-01 Omnivision Technologies, Inc. Image sensor apparatus and method for scene illuminant estimation
US9177368B2 (en) 2007-12-17 2015-11-03 Nvidia Corporation Image distortion correction
US8780128B2 (en) 2007-12-17 2014-07-15 Nvidia Corporation Contiguously packed data
US8698908B2 (en) 2008-02-11 2014-04-15 Nvidia Corporation Efficient method for reducing noise and blur in a composite still image from a rolling shutter camera
US9379156B2 (en) 2008-04-10 2016-06-28 Nvidia Corporation Per-channel image intensity correction
US8373718B2 (en) 2008-12-10 2013-02-12 Nvidia Corporation Method and system for color enhancement with color volume adjustment and variable shift along luminance axis
US8712183B2 (en) 2009-04-16 2014-04-29 Nvidia Corporation System and method for performing image correction
US9414052B2 (en) 2009-04-16 2016-08-09 Nvidia Corporation Method of calibrating an image signal processor to overcome lens effects
US8749662B2 (en) 2009-04-16 2014-06-10 Nvidia Corporation System and method for lens shading image correction
US8698918B2 (en) * 2009-10-27 2014-04-15 Nvidia Corporation Automatic white balancing for photography
US20110096190A1 (en) * 2009-10-27 2011-04-28 Nvidia Corporation Automatic white balancing for photography
US20150287366A1 (en) * 2012-11-30 2015-10-08 Nec Corporation Image display device and image display method
US9548030B2 (en) * 2012-11-30 2017-01-17 Nec Corporation Image display device and image display method
US8854709B1 (en) * 2013-05-08 2014-10-07 Omnivision Technologies, Inc. Automatic white balance based on dynamic mapping
US9418400B2 (en) 2013-06-18 2016-08-16 Nvidia Corporation Method and system for rendering simulated depth-of-field visual effect
US10319276B2 (en) 2016-10-24 2019-06-11 Samsung Electronics Co., Ltd. Display apparatus and calibration method thereof
US10593249B2 (en) 2016-10-24 2020-03-17 Samsung Electronics Co., Ltd. Display apparatus and calibration method thereof

Also Published As

Publication number Publication date
US8624995B2 (en) 2014-01-07
KR20070113890A (en) 2007-11-29
JP4472722B2 (en) 2010-06-02
JP2007318747A (en) 2007-12-06
KR100809344B1 (en) 2008-03-05

Similar Documents

Publication Publication Date Title
US8624995B2 (en) Automatic white balancing method, medium, and system
US7362356B2 (en) White balance correction including indicative white color determination based on regions in a divided image
US8089525B2 (en) White balance control device and white balance control method
US7286703B2 (en) Image correction apparatus, method and program
US8184181B2 (en) Image capturing system and computer readable recording medium for recording image processing program
US7362895B2 (en) Image processing apparatus, image-taking system and image processing method
US20120106870A1 (en) Registration of Separations
US20080266417A1 (en) White balance adjusting device, imaging apparatus, and recording medium storing white balance adjusting program
US9036045B2 (en) Image processing apparatus and image processing method
US8890973B2 (en) Image processing apparatus, image processing method, and storage medium
US8154630B2 (en) Image processing apparatus, image processing method, and computer readable storage medium which stores image processing program
CN114827565A (en) Color correction matrix determining method, color correction device and storage medium
US8155440B2 (en) Image processing apparatus and image processing method
US8374426B2 (en) Apparatus and method for adjusting auto white balance using effective area
JP4038976B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium storing image processing program
US8115834B2 (en) Image processing device, image processing program and image processing method
JP4718113B2 (en) White balance adjustment method and image processing apparatus
JP4960597B2 (en) White balance correction apparatus and method, and imaging apparatus
KR102315200B1 (en) Image processing apparatus for auto white balance and processing method therefor
US20100177210A1 (en) Method for adjusting white balance
KR101227082B1 (en) Apparatus and method for color balancing of multi image stitch
KR20110047540A (en) Digital camera and controlling method thereof
JP2003259206A (en) Device and method for processing signal, recording medium and program
JP2012161046A (en) Image processing apparatus and control method of image processing apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, SUNG-SU;KANG, BYOUNG-HO;LEE, SEONG-DEOK;REEL/FRAME:019387/0054

Effective date: 20070516

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8