US6873729B2 - Method, apparatus and computer program product for processing image data - Google Patents

Method, apparatus and computer program product for processing image data

Info

Publication number
US6873729B2
Authority
US
United States
Prior art keywords
image data
condition
histogram
image
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime, expires
Application number
US09/903,577
Other versions
US20020024609A1 (en)
Inventor
Yuki Matsushima
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd
Assigned to RICOH COMPANY, LTD. Assignment of assignors interest (see document for details). Assignors: MATSUSHIMA, YUKI
Publication of US20020024609A1
Priority to US11/064,233 (US7006692B2)
Application granted
Publication of US6873729B2
Adjusted expiration
Current status: Expired - Lifetime

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • H04N 1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/40: Picture signal circuits
    • H04N 1/407: Control or modification of tonal gradation or of extreme levels, e.g. background level
    • H04N 1/4072: Control or modification of tonal gradation or of extreme levels, e.g. background level dependent on the contents of the original
    • H04N 1/4074: Control or modification of tonal gradation or of extreme levels, e.g. background level dependent on the contents of the original using histograms


Abstract

An image processing apparatus, method, and computer program product in which a luminance histogram of input image data is generated by a histogram generating section and a polarization degree of the luminance histogram is evaluated by a polarization degree evaluating section to determine whether the input image data is in a true backlight condition or in a halation condition. An optimum dynamic range correction and tone curve correction are then performed on the input image data based on the result of this determination.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image data processing system, and more particularly, to a method, apparatus and computer program product for processing image data acquired by an image capturing device, such as a digital camera, etc.
2. Discussion of the Background
Japanese Patent Laid-Open Publication No. 4-168879 discloses an image forming apparatus in which the printing operation is performed based on a video signal. In the image forming apparatus, the video signal is sampled, and an image is determined to be overexposed when the number of samples having a value equal to or greater than a specified threshold value TH is equal to or greater than a predetermined value NH. A tone conversion of the video signal is then performed using a suitable tone conversion curve. The image is determined to have been photographed with flash light when the number of samples having a value equal to or lower than a specified threshold value TL (<TH) is equal to or greater than a predetermined value NL. In this case as well, the tone conversion of the video signal is performed using a suitable tone conversion curve.
Further, Japanese Patent Laid-Open Publication No. 63-184473 discloses an image forming apparatus in which the luminance histogram of image data is used to switch the tone correction table based on the luminance condition of the image data. Two histograms are generated, i.e., one for a high luminance region and the other for a low luminance region, to reduce the memory capacity required for generating the histogram. An image capturing device, such as a digital camera, generally includes an automatic exposure control mechanism to obtain an optimum exposure. Generally, three systems are employed as the automatic exposure control mechanism, namely, average, center-weighted, and spot metering systems. In the average metering system, the amount of light is measured by dividing the image screen into multiple regions, and the exposure is controlled based on a weighted average value of the amount of light of the divided regions. In the center-weighted metering system, the amount of light in the center region of the image screen is mainly measured. In the spot metering system, the exposure is controlled by measuring the amount of light in a local spot of the image screen.
A proper exposure adjustment may not be easily made even in an image capturing device having the above-described automatic exposure control mechanism when an image is photographed under backlight or partly under backlight conditions.
For example, in a true backlight condition, in which the sun is located just behind a subject, the subject is darkened (i.e., underexposed) in the average or center-weighted metering systems because the luminance difference between the background and the subject is substantial. The background part, e.g., the sky, is whitened (i.e., overexposed). In the spot metering system, the exposure is controlled such that the subject is not underexposed in the true backlight condition. However, the background, e.g., the sky, tends to be overexposed. Further, even in the spot metering system, the correct exposure for the subject is not always obtained under various photographing conditions because the position and the size of the photometry frame in the image screen are fixed.
A halation phenomenon, in which light from the light source enters into the photographic lens, also often occurs. Halation tends to occur frequently when a landscape is photographed in the morning or in the evening, when the sun is low in the sky. In the halation condition, the subject itself is correctly exposed, although a part of the image is bleached out, because the light source is not located immediately behind the subject.
SUMMARY OF THE INVENTION
The present invention has been made in view of the above-mentioned and other problems and addresses the above-discussed and other problems.
The present invention advantageously provides a novel image processing apparatus, method, and computer program product, wherein it is properly determined whether image data acquired with an image capturing device, such as a digital camera, is in a true backlight condition or in a halation condition, and an appropriate process is performed on the image data in either condition to improve the quality of the image.
According to an example of the present invention, an image processing apparatus includes an image input device configured to input image data, an image condition determining device configured to determine whether the input image data input by the image input device is in a true backlight condition or in a halation condition, and a processing device configured to perform a specific process on the input image data based on the condition of the input image data determined by the image condition determining device.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the present invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
FIG. 1 is a block diagram illustrating an example of an image processing apparatus;
FIG. 2 is a flowchart illustrating the overall operation of the image processing apparatus;
FIG. 3 is a flowchart illustrating an example of a process step for evaluating the polarization degree of a luminance histogram;
FIG. 4 illustrates a typical luminance histogram of image data in a true backlight condition;
FIG. 5 illustrates a typical luminance histogram of image data in a halation condition;
FIG. 6 is a diagram explaining the evaluation of the polarization degree of the luminance histogram;
FIG. 7 is a diagram explaining the evaluation of the polarization degree of the luminance histogram;
FIG. 8 is a diagram explaining the dynamic range correction performed on image data in the true backlight condition;
FIG. 9 is a diagram explaining the dynamic range correction performed on image data in the halation condition;
FIG. 10 is a diagram showing examples of tone curves;
FIG. 11 is a block diagram illustrating a construction of a printer into which the image processing apparatus according to the present invention is incorporated; and
FIG. 12 is a block diagram illustrating the construction of an image capturing device into which the image processing apparatus according to the present invention is incorporated.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, FIG. 1 is a block diagram illustrating an example of an image processing apparatus. The image processing apparatus includes an image input section 101, an image memory section 102, an image condition determining section 103, and a tone processing section 106. The image input section 101 inputs image data. The image memory section 102 temporarily stores the input image data. The image condition determining section 103 receives the input image data from the image memory section 102 and determines the condition of the image. The tone processing section 106 receives the input image data from the image memory section 102 and performs a tone process on the image data based on the condition of the image data determined by the image condition determining section 103. The image condition determining section 103 includes a histogram generating section 104 and a polarization degree evaluating section 105.
FIG. 2 is a flowchart illustrating an overall operation of an image processing apparatus 100. FIG. 3 is a flowchart illustrating an example of a process step performed by the polarization degree evaluating section 105 of the image condition determining section 103.
The overall operation of the image processing apparatus 100 is now described below referring to FIG. 2. The image input section 101 inputs image data and stores the input image data in the image memory section 102 at step 200. Herein, the input image data is assumed to be monochrome image data of 256 tones. More specifically, the image input section 101 inputs image data from, for example, (1) a digital camera, a personal computer, etc., via a universal serial bus (USB) cable, (2) a memory card or other recording media in which image data is stored, or (3) via a wire or radio network.
When image data is input, the histogram generating section 104 of the image condition determining section 103 reads the input image data from the image memory section 102 and generates a luminance histogram, which shows the brightness distribution of the image data, at step 201. It is not necessarily required to use the information of all pixels of the input image data for generating the luminance histogram. The luminance histogram can be generated by sampling the input image data at a set sampling interval and using only a part of the discrete pixel information.
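As an informal illustration of this histogram step, the following sketch (in Python, assuming an 8-bit grayscale image held in a NumPy array; the function name and library choice are not part of the patent) builds a 256-bin luminance histogram from every n-th pixel:

    import numpy as np

    def luminance_histogram(image, sampling_interval=4):
        # Build a 256-bin luminance histogram f(i) for an 8-bit grayscale image,
        # using only every `sampling_interval`-th pixel in each direction
        # (the full pixel set is not required, as noted above).
        sampled = image[::sampling_interval, ::sampling_interval]
        return np.bincount(sampled.ravel(), minlength=256)[:256]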
The polarization degree evaluating section 105 evaluates the degree of polarization of the luminance histogram generated by the histogram generating section 104 to determine the condition of the image of the input image data at step 202. FIG. 4 illustrates a typical luminance histogram of image data in a true backlight condition. FIG. 5 illustrates a typical luminance histogram of image data in a halation condition.
As is observed in FIGS. 4 and 5, the luminance histogram is polarized between the high luminance region and the low luminance region both in the halation condition and in the true backlight condition. However, in the true backlight condition, the luminance histogram is perfectly polarized such that the low luminance region is separated from the high luminance region. The main information about the subject is included in the low luminance region. As is shown in FIG. 5, contrarily to the true backlight condition, in the halation condition the luminance histogram is not perfectly polarized, i.e., the high luminance region and the low luminance region are not completely separated. As described above, the degree of polarization is remarkably high in the true backlight condition, while it is relatively low in the halation condition.
The polarization degree evaluating section 105 evaluates the degree of polarization of the luminance histogram and determines that the image data is in the true backlight condition when the luminance histogram is perfectly polarized as shown in FIG. 4. The polarization degree evaluating section 105 determines that the image data is in the halation condition when the luminance histogram is not perfectly polarized. When such a polarization of the luminance histogram is not recognized, the polarization degree evaluating section 105 determines that the image data is in the non-backlight condition (i.e., in the orderly light condition). More specific steps for evaluating the degree of the polarization will be described below referring to FIGS. 3 through 7.
The tone processing section 106 reads the input image data from the image memory section 102 and performs, at step 203, a tone process on the input image data suited to the condition of the image data determined by the image condition determining section 103. The processed image data is then output. An example of the tone process performed on image data in the true backlight condition and in the halation condition is described below referring to FIGS. 8 through 10. A conventional tone process is performed on image data in an orderly light condition.
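For orientation, steps 200 through 203 can be sketched as a single driver routine. This is only a hypothetical arrangement: the helper functions (luminance_histogram, classify_image_condition, dynamic_range_correction, select_tone_curve, apply_tone_curve) and the X and Y polarization points passed in as plain arguments are illustrative assumptions, with possible bodies sketched in the sections that follow.

    import numpy as np

    def process_image(image, x_point=200, y_point=150):
        # Steps 200-203: histogram, condition decision, condition-dependent tone process.
        hist = luminance_histogram(image)                             # step 201
        condition = classify_image_condition(hist, int(hist.sum()))   # step 202
        if condition == "orderly":
            return image        # a conventional tone process would be applied here
        # Step 203: dynamic range correction followed by tone curve correction.
        nonzero = np.flatnonzero(hist)
        hist_min, hist_max = int(nonzero[0]), int(nonzero[-1])        # MIN and MAX
        corrected, alpha, beta = dynamic_range_correction(
            image, condition, hist_min, hist_max, x=x_point, y=y_point)
        low = image[image <= y_point]                                 # data between Min and Y
        mod = float(np.median(low)) if low.size else 128.0
        return apply_tone_curve(corrected, select_tone_curve(mod, alpha, beta))

Passing hist.sum() as N keeps the absolute frequency threshold consistent with a histogram built from sampled pixels rather than from the full image.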
Specific steps for evaluating the degree of polarization are described below referring to the flowchart in FIG. 3, and FIGS. 6 and 7. In the steps, the degree of polarization is evaluated by using the frequency and the gradient of the luminance histogram.
Herein, the number of pixels and the level of luminance of an image to be determined by the image condition determining section 103 are denoted by "N" and "i" (=0, 1, 2, . . . 255), respectively. Further, the frequency and the gradient at the luminance level "i" are denoted by "f(i)" and "h(i)", respectively. The gradient h(i) is calculated by the following equation.
h(i)=(f(i+δ)−f(i))/δ (i=0, 1, 2, . . . 255−δ, δ>0)  (1)
Threshold ratios for the frequency and for the gradient are set as C (0<C<1) and D (0<D<1), respectively. The absolute frequency threshold value and the gradient threshold value are then set as described below.
The absolute frequency threshold value
Th1=C×N  (2)
The gradient threshold value
Th2=f(i)×D  (3)
The absolute frequency threshold value Th1 is indicated by the dotted line in FIGS. 6 and 7.
A luminance level “i”, which is lower than the highest level of the luminance histogram MAX by “δ”, is set at step 210.
At the luminance level "i", whether or not condition 1 is satisfied is determined at step 211. Condition 1 means that h(i) is more than zero (i.e., h(i)>0), namely, that the gradient h(i) is positive. When condition 1 is satisfied, the luminance level "i" is decremented by 1 at step 217. The process returns to step 211 to determine again whether or not condition 1 is satisfied, as long as the decremented luminance level "i" is equal to or higher than the lowest level MIN of the luminance histogram (i.e., No at step 218).
When condition 1 is not satisfied at step 211, whether or not condition 2 is satisfied is determined at step 212. Condition 2 means that h(i) is more than −Th2 (i.e., h(i)>−Th2) and f(i) is equal to or less than Th1 (i.e., f(i)≦Th1). When condition 2 is satisfied, the luminance level "i" is decremented at step 217. The process then returns to step 211.
When condition 2 is not satisfied at step 212, whether or not condition 3 is satisfied is determined at step 213. Condition 3 means that f(i) is more than Th1 (i.e., f(i)>Th1), namely, that the frequency exceeds the absolute frequency threshold value Th1. When condition 3 is satisfied, the image is determined to be in the halation condition at step 215. The process is then finished.
When condition 3 is not satisfied at step 213, whether or not condition 4 is satisfied is determined at step 214. Condition 4 means that h(i) is equal to or less than −Th2 (i.e., h(i)≦−Th2). When condition 4 is satisfied, the image is determined to be in the true backlight condition at step 216. The process is then finished.
When condition 4 is not satisfied at step 214, the luminance level "i" is decremented at step 217. The process then returns to step 211.
When neither condition 3 nor condition 4 is satisfied even after the process is repeated down to the lowest luminance level (i.e., Yes at step 218), the image is determined to be in neither the true backlight condition nor the halation condition, i.e., to be in the orderly light condition, at step 219. The process is then finished.
The above-described steps are described below referring to FIGS. 6 and 7. In the areas indicated by ① in FIGS. 6 and 7, the process proceeds to lower luminance levels because the gradient is positive and condition 1 is satisfied.
At steps 212, 213, and 214, the relationship between the gradient threshold value and the absolute frequency threshold value after the first polarizing point (i.e., the point indicated by "X" in FIGS. 6 and 7) is examined.
In the areas indicated by ② in FIGS. 6 and 7, where the frequency is low and hardly changes, condition 2 is satisfied. The process then proceeds to a lower luminance level.
At the point Y indicated in FIG. 7, where the frequency is higher than the absolute frequency threshold value Th1, condition 3 is satisfied and the image is determined to be in the halation condition.
Condition 4 is satisfied at the point Y indicated in FIG. 6 (i.e., a luminance level at which the frequency changes remarkably), and the image is determined to be in the true backlight condition.
According to an experiment, the image condition is satisfactorily determined when the values of δ, C, and D are selected as δ=15, C=0.001, and D=0.15. However, these values are merely one example among many. Since the evaluation of the quality of the image, which is printed or displayed after the image data is processed, differs according to the personal point of view of the observer, no absolute optimum values are available.
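One possible reading of this evaluation loop is sketched below, using the example parameters δ=15, C=0.001, and D=0.15 quoted above; the step numbers in the comments refer to the flowchart of FIG. 3, and treating MIN as level 0 is a simplifying assumption.

    def classify_image_condition(hist, n_pixels, delta=15, c=0.001, d=0.15):
        # Scan the luminance histogram downward from just below MAX and return
        # "true_backlight", "halation", or "orderly" per conditions 1-4.
        th1 = c * n_pixels                            # absolute frequency threshold, Eq. (2)
        i = 255 - delta                               # step 210
        while i >= 0:                                 # step 218 (MIN taken as 0 here)
            h = (hist[i + delta] - hist[i]) / delta   # gradient h(i), Eq. (1)
            th2 = hist[i] * d                         # gradient threshold, Eq. (3)
            if h > 0:                                 # condition 1 (step 211): rising slope
                i -= 1                                # step 217
                continue
            if h > -th2 and hist[i] <= th1:           # condition 2 (step 212): flat and sparse
                i -= 1
                continue
            if hist[i] > th1:                         # condition 3 (step 213)
                return "halation"                     # step 215
            if h <= -th2:                             # condition 4 (step 214): sharp drop
                return "true_backlight"               # step 216
            i -= 1                                    # otherwise keep scanning (step 217)
        return "orderly"                              # step 219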
Next, the tone process, which is performed by the tone processing section 106 on image data determined to be in the true backlight condition or in the halation condition, is now described referring to FIGS. 8 through 10. In FIGS. 8 and 9, MAX and MIN represent the highest and the lowest luminance level of the luminance histogram.
In the tone process, the dynamic range correction is performed based on the image condition. A tone curve correction is then performed, based on the image condition, on the image data on which the dynamic range correction has been performed. In the dynamic range correction process, the appropriate range to be corrected is set for each image according to its condition, i.e., the true backlight condition or the halation condition. In the tone curve correction process, the value of the parameter for determining the proper exposure is obtained using data of a region other than the high luminance white light region of the luminance histogram. The optimum tone curve is determined based on the obtained value.
The dynamic range correction is described below. As described above, in the true backlight condition, the luminance histogram is perfectly polarized such that the low luminance region separates from the high luminance region. Main information about the subject is included in the low luminance region. Therefore, information about the high luminance region is not required. The maximum value of the luminance Max for the dynamic range correction is set such that the value satisfies the following condition instead of being set to the highest luminance level MAX of the luminance histogram (see FIG. 8).
Y≦Max≦X  (4)
Contrarily to the true backlight condition, in the halation condition the luminance histogram is not perfectly polarized, i.e., the high luminance region and the low luminance region are not completely separated. Therefore, the maximum value of the luminance Max for the dynamic range correction is set to the highest luminance level MAX of the luminance histogram (see FIG. 9). That is,
Max=MAX  (5)
The minimum value of the luminance Min for the dynamic range correction is set to the lowest luminance level MIN of the luminance histogram both in the true backlight condition and in the halation condition (see FIGS. 8 and 9). That is,
 Min=MIN  (6)
The value of the input image data, i.e., an input value (0 to 255), is converted by the following equation using the Max and Min values set as described above.
Output value=α×Input value+β
α=255/(Max−Min)
β=−(255×Min)/(Max−Min)  (7)
Thus, the dynamic range correction suitable for the image data either in the true backlight condition or in the halation condition is performed.
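A minimal sketch of this linear stretch follows, assuming Max is taken as the midpoint of [Y, X] for the true backlight case (any value in that interval satisfies condition (4)) and that X and Y are supplied by the caller:

    import numpy as np

    def dynamic_range_correction(image, condition, hist_min, hist_max, x=None, y=None):
        # Choose Max/Min per Eqs. (4)-(6), then apply the linear stretch of Eq. (7).
        if condition == "true_backlight":
            vmax = (x + y) / 2.0      # any Max with Y <= Max <= X would satisfy Eq. (4)
        else:
            vmax = hist_max           # halation: Max = MAX, Eq. (5)
        vmin = hist_min               # Min = MIN, Eq. (6)
        alpha = 255.0 / (vmax - vmin)
        beta = -(255.0 * vmin) / (vmax - vmin)
        out = alpha * image.astype(np.float32) + beta
        return np.clip(out, 0, 255).astype(np.uint8), alpha, beta

Returning alpha and beta lets the later tone curve step reuse the same mapping when computing Mod_after.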
The tone curve correction is described below. It is assumed that the tone curve correction is performed by selecting an optimum tone curve from the prearranged plural types of tone curves as shown in FIG. 10, for example.
Since the image in the backlight condition includes high luminance white light as described above, the white light part ought to be removed when judging the exposure of the image. It is therefore necessary to calculate the tone curve correction parameter based on the data of the low luminance region, for an image either in the true backlight condition or in the halation condition. That is, whether or not the exposure is correct is determined using the data between Min and Y in FIGS. 8 and 9. For example, the median value Mod is used as the determining parameter. However, since the dynamic range correction has been performed based on equation (7), the value Mod_after, obtained by mapping Mod through the following equation, is used.
Mod_after=α×Mod+β  (8)
When Mod_after is equal to or less than 30 (i.e., Mod_after≦30), the tone curve 3 in FIG. 10 is selected. When Mod_after is more than 30 and equal to or less than 60 (i.e., 30<Mod_after≦60), the tone curve 2 is selected. When Mod_after is more than 60 (i.e., 60<Mod_after), in which case little tone curve correction is required, the tone curve 1 is selected. The tone curve correction is performed, using the tone curve selected as described above, on the image data on which the dynamic range correction has been performed.
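The selection rule can be sketched as follows. Only the thresholds 30 and 60 and the mapping Mod_after = α×Mod + β come from the description; the gamma-style lookup tables standing in for tone curves 1 to 3 of FIG. 10 are placeholders, since the exact curve shapes are not given in the text.

    import numpy as np

    def select_tone_curve(mod_value, alpha, beta):
        # Map Mod through the dynamic range correction (Eq. (8)) and pick a curve index.
        mod_after = alpha * mod_value + beta
        if mod_after <= 30:
            return 3     # strongest correction (tone curve 3)
        if mod_after <= 60:
            return 2     # moderate correction (tone curve 2)
        return 1         # little correction required (tone curve 1)

    def apply_tone_curve(image, curve_index):
        # Placeholder gamma-like curves standing in for the prearranged curves of FIG. 10.
        gamma = {1: 1.0, 2: 0.75, 3: 0.55}[curve_index]
        lut = (255.0 * (np.arange(256) / 255.0) ** gamma).astype(np.uint8)
        return lut[image]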
The parameter for determining the correct exposure is not limited to the median value. The tone curve correction may also be performed by generating a proper tone curve, or by modifying a standard tone curve, based on the value of the parameter for determining the correct exposure, instead of selecting one of the prearranged tone curves.
Although the description is made assuming that monochrome image data is input, a similar process can be performed on color image data. For example, when RGB (Red, Green, Blue) color image data is input, the histogram generating section 104 calculates the luminance component from the RGB color image data and generates the luminance histogram based on the calculated luminance component. More simply, the luminance histogram may be generated using the G (green) component of the RGB color image data, or using, as luminance information, the component having the maximum value among the three components, i.e., R (Red), G (Green), and B (Blue).
When the dynamic range correction is performed in the tone processing section 106, it is performed, for example, based on the above-described equation (7) with respect to the G component. The dynamic range corrections on the R and B components are performed by multiplying the values of the R and B components by the ratio of the G component value after the dynamic range correction to its value before the correction. In the tone curve correction process, the tone curve correction is performed, for example, on the G component using the selected tone curve. The tone curve corrections on the R and B components are performed by multiplying the values of the R and B components by the ratio of the G component value after the tone curve correction to its value before the correction.
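One way to realize this ratio-based handling of color data is sketched below: only the G channel is passed through the correction pipeline, and R and B are scaled per pixel by the ratio of the corrected G value to the original G value. The function name and the guard against division by zero are assumptions for illustration.

    import numpy as np

    def correct_rgb(rgb, g_corrected):
        # rgb: H x W x 3 uint8 image; g_corrected: the G channel after the
        # dynamic range and tone curve corrections have been applied to it.
        rgb = rgb.astype(np.float32)
        g_orig = np.maximum(rgb[..., 1], 1.0)        # avoid division by zero
        ratio = g_corrected.astype(np.float32) / g_orig
        out = rgb.copy()
        out[..., 0] = rgb[..., 0] * ratio            # scale R by the G ratio
        out[..., 1] = g_corrected                    # G is already corrected
        out[..., 2] = rgb[..., 2] * ratio            # scale B by the G ratio
        return np.clip(out, 0, 255).astype(np.uint8)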
The above-described image processing apparatus 100 can be accomplished using the hardware of a common computer, which includes a processor, a memory, and so forth. In other words, the processing of the image processing apparatus 100 can be carried out on a computer. A program for carrying out the processing is read into the computer from various types of recording media, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory element, and the like. The program may also be read into the computer via a network. The read program is executed by a processor.
The above-described various types of recording media, in which the program is recorded, are also included in the present invention. The image condition determining section 103, which performs the method for determining the condition of an image according to the present invention, is included in the present invention. Further, various types of recording media, in which the program for causing a computer to perform the process of the method is recorded, are included in the present invention.
The image processing apparatus according to the present invention can be incorporated into an image forming apparatus, such as a printer, an image display apparatus, and an image capturing device, such as a digital camera. Those apparatuses and devices which include this image processing apparatus are included in the present invention. An example of an image forming apparatus is described below.
FIG. 11 is a block diagram illustrating the construction of a printer into which an image processing apparatus according to the present invention is incorporated. The printer includes an image processing block 300, a print process/control section 306, and an image forming engine 307.
The image processing block 300 performs a process similar to that performed by the image processing apparatus 100 shown in FIG. 1. The image processing block 300 includes a USB interface 301, a memory card reader 302, an image memory section 303, an image condition determining section 304, and a tone processing section 305. The USB interface 301 and the image memory section 303 correspond to the image input section 101 and the image memory section 102 in FIG. 1, respectively. The image condition determining section 304 and the tone processing section 305 correspond to the image condition determining section 103 and the tone processing section 106 in FIG. 1, respectively.
Image data to be printed is input from a device, such as a digital camera, a personal computer etc., connected with the USB interface 301 via a USB cable, or is input from a memory card set in the memory card reader 302. The image data is temporarily stored in the image memory section 303. The input image data is read into the image condition determining section 304 in which the above-described image condition determining process is performed.
When the image condition determining process is finished, the input image data is read into the tone processing section 305, in which the above-described tone process according to the image condition is performed. The image data is then transmitted to the print process/control section 306. In the print process/control section 306, a color conversion (for example, a conversion from RGB data to CMYK (Cyan, Magenta, Yellow and Black) data) and a halftone process (for example, a dither process or an error diffusion process) are performed on the image data according to the imaging system and the characteristics of the image forming engine 307. A driving signal is then supplied to the image forming engine 307 to form the image.
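The conversion and halftoning performed in the print process/control section 306 are engine-specific; as a rough stand-in, the sketch below shows a naive RGB-to-CMYK conversion and a 2×2 ordered dither. Neither formula is taken from the patent; real printers use device profiles and more elaborate halftoning.

    import numpy as np

    def rgb_to_cmyk(rgb):
        # Naive RGB (0-255) to CMYK (0.0-1.0) conversion, used here only as a placeholder.
        rgb = rgb.astype(np.float32) / 255.0
        k = 1.0 - rgb.max(axis=-1)
        denom = np.maximum(1.0 - k, 1e-6)
        c = (1.0 - rgb[..., 0] - k) / denom
        m = (1.0 - rgb[..., 1] - k) / denom
        y = (1.0 - rgb[..., 2] - k) / denom
        return np.stack([c, m, y, k], axis=-1)

    def ordered_dither(channel):
        # 2x2 Bayer dither of a single 0.0-1.0 channel, standing in for the
        # halftone (dither / error diffusion) process named above.
        bayer = np.array([[0, 2], [3, 1]], dtype=np.float32) / 4.0 + 1.0 / 8.0
        h, w = channel.shape
        threshold = np.tile(bayer, (h // 2 + 1, w // 2 + 1))[:h, :w]
        return (channel > threshold).astype(np.uint8)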
With this printer, quality of the image formed by the image forming engine 307 is improved even when image data in a true backlight condition or in a halation condition is input, because an appropriate dynamic range correction and tone curve correction are performed by the tone processing section 305 based on the condition of the image.
An image display apparatus, into which the image processing apparatus according to the present invention is incorporated, is accomplished, for example, by replacing the image forming engine 307 and the print process/control section 306 in FIG. 11 with the display and the display control section, respectively.
FIG. 12 is a block diagram illustrating the construction of an image capturing device, such as a digital camera, into which an image processing apparatus according to the present invention is incorporated. The image capturing device includes an imaging optical system 401, a CCD (charge-coupled device) 402, and an analog signal processing section 403. The imaging optical system 401 includes an imaging lens and an aperture mechanism. The CCD 402 converts an optical image formed by the imaging optical system 401 into an analog image signal. The analog signal processing section 403 eliminates noise in the analog image signal and adjusts the signal level of the analog image signal. The analog signal processing section 403 also converts the analog image signal into a digital signal. The image capturing device further includes an image processing section 404, a buffer memory 405, a monitor 406, a USB interface (I/F) 407, and a memory card interface (I/F) 408. The image processing section 404 includes a DSP (digital signal processor) that operates according to a program. The buffer memory 405 is used by the image processing section 404. The monitor 406 includes a liquid crystal display panel. The USB interface 407 is used for a connection with an external personal computer, etc. The memory card interface 408 interfaces with a memory card 409 for writing and reading operations. The image capturing device also includes an operation unit 411, with which the user inputs various instructions, and a control section 410, which controls the imaging optical system 401, the CCD 402, and the image processing section 404.
The image processing section 404 performs processes such as: (1) generating monitor display data (so-called through image data) from the digital image data input from the analog signal processing section 403 and displaying it on the monitor 406; (2) compressing the digital image data input from the analog signal processing section 403 in response to depression of the release button of the operation unit 411 and writing the compressed image data into the memory card 409; and (3) reading the compressed image data from the memory card 409 and decompressing it for display on the monitor 406.
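A minimal sketch of item (1), generating a reduced-size through image by decimation, is given below. The output size and the use of simple striding rather than a filtered resize are assumptions made only for illustration.

```python
import numpy as np

def make_through_image(frame, out_h=240, out_w=320):
    """Downsample a full-resolution frame to a small 'through image' for the
    monitor by simple striding; real firmware would low-pass filter before
    decimating, and the output size here is only an assumption."""
    h, w = frame.shape[:2]
    ys = np.linspace(0, h - 1, out_h).astype(int)   # row indices to keep
    xs = np.linspace(0, w - 1, out_w).astype(int)   # column indices to keep
    return frame[ys][:, xs]
```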
When an instruction to perform the backlight correction process is given through the operation unit 411, the image processing section 404 reads the compressed image data from the memory card 409 and decompresses it into the buffer memory 405. The image processing section 404 then performs, on the decompressed image data, an image condition determining process similar to that performed by the image condition determining section 103 in FIG. 1. The image processing section 404 further performs a tone process similar to that performed by the tone processing section 106 in FIG. 1, according to the result of the determination.
Further, the image processing section 404 displays the processed image data on the monitor 406 and, when an instruction is given from the operation unit 411, compresses the image data and writes it into the memory card 409. That is, when the instruction to perform the backlight correction process is provided, the image processing section 404 acts as the image input section 101, the image condition determining section 103, and the tone processing section 106 in FIG. 1. In other words, the processes performed by the image input section 101, the image condition determining section 103, and the tone processing section 106 are performed by the image processing section 404 under program control. In this case, the buffer memory 405 corresponds to the image memory section 102 in FIG. 1.
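Putting the camera-side steps together, the following hedged sketch mimics the read, decompress, determine-condition, tone-correct, and recompress flow. It uses Pillow as a stand-in for the camera's JPEG codec; the crude histogram check, the gamma correction, the file-path interface, and the JPEG quality setting are all illustrative assumptions rather than the patent's method.

```python
import numpy as np
from PIL import Image  # stands in for the camera's JPEG codec

def crude_condition(lum):
    """Stand-in condition check: call the image 'true_backlight' when the
    64-bin luminance histogram has an almost empty middle band.  The real
    determination uses frequency and gradient values of the histogram."""
    hist, _ = np.histogram(lum, bins=64, range=(0.0, 255.0))
    freq = hist / max(hist.sum(), 1)
    return "true_backlight" if freq[20:44].sum() < 0.02 else "normal"

def backlight_correction_pass(src_path, dst_path, gamma=0.7, quality=90):
    """Decompress -> determine condition -> tone-correct -> recompress.
    File paths and JPEG quality are illustrative; the device itself reads
    from and writes to a memory card rather than a filesystem."""
    img = Image.open(src_path).convert("RGB")
    rgb = np.asarray(img, dtype=np.float64)
    lum = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]  # Rec. 601 luma

    if crude_condition(lum) == "true_backlight":
        corrected = ((np.clip(lum, 0, 255) / 255.0) ** gamma) * 255.0       # brighten shadows
        gain = np.where(lum > 1e-6, corrected / np.maximum(lum, 1e-6), 1.0)
        rgb = np.clip(rgb * gain[..., None], 0, 255)

    Image.fromarray(rgb.astype(np.uint8)).save(dst_path, "JPEG", quality=quality)
```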
Numerous additional modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the present invention may be practiced otherwise than as specifically described herein.
This document claims priority to and contains subject matter related to Japanese Patent Application No. 2000-214105, filed on Jul. 14, 2000, the entire contents of which are herein incorporated by reference.

Claims (16)

1. An image processing apparatus, comprising:
an image input device configured to input image data;
an image condition determining device configured to determine whether the input image data input by said image input device is in a true backlight condition or in a halation condition; and
a processing device configured to perform a specific process on the input image data based on the condition of the input image data determined by said image condition determining device,
wherein the image condition determining device is configured to generate a histogram showing a luminance of the image data based on the input image data and to determine whether the histogram is perfectly polarized, indicating the true backlight condition.
2. The image processing apparatus according to claim 1, wherein said image condition determining device is configured to determine whether the histogram is perfectly polarized by determining whether the histogram includes a high luminance region that is completely separated from a low luminance region.
3. The image processing apparatus according to claim 1, wherein said image condition determining device is configured to determine whether the histogram is perfectly polarized using frequency and gradient values of the histogram.
4. The image processing apparatus according to claim 1, wherein the specific process performed by said processing device includes a dynamic range correction and a tone curve correction to be performed on the input image data based on the determination made by said image condition determining device.
5. A method for processing image data, comprising:
inputting image data;
generating a histogram showing a luminance of image data based on the input image data;
determining whether the input image data is in a true backlight condition or in a halation condition by determining whether the histogram is perfectly polarized, indicating the true backlight condition; and
performing a specific process on the input image data based on the condition of the input image data determined in the determining step.
6. The method according to claim 5, wherein the determining step comprises:
determining whether the histogram includes a high luminance region that is completely separated from a low luminance region.
7. The method according to claim 5, wherein the determining step comprises determining whether the histogram is perfectly polarized using frequency and gradient values of the histogram.
8. The method according to claim 5, wherein the specific process performed on the input image data based on the determination made in the determining step includes a dynamic range correction and a tone curve correction.
9. An image processing apparatus, comprising:
an image input means for inputting image data;
an image condition determining means for determining whether the input image data input by said image input means is in a true backlight condition or in a halation condition; and
a processing means for performing a specific process on the input image data based on the condition of the input image data determined by said image condition determining means,
wherein the image condition determining means generates a histogram showing a luminance of the image data based on the input image data and determines whether the histogram is perfectly polarized, indicating the true backlight condition.
10. The image processing apparatus according to claim 9, wherein said image condition determining means determines whether the histogram is perfectly polarized by determining whether the histogram includes a high luminance region that is completely separated from a low luminance region.
11. The image processing apparatus according to claim 9, wherein said image condition determining means determines whether the histogram is perfectly polarized using frequency and gradient values of the histogram.
12. The image processing apparatus according to claim 9, wherein the specific process performed by said processing means includes a dynamic range correction and a tone curve correction to be performed on the input image data based on the determination made by said image condition determining means.
13. A computer program product embodied in a computer readable medium for processing image data, comprising:
a first computer code for determining whether input image data is in a true backlight condition or in a halation condition; and
a second computer code for performing a specific process on the input image data based on the condition of the input image data determined by the first computer code,
wherein the first computer code generates a histogram showing a luminance of the image data based on the input image data and determines whether the histogram is perfectly polarized, indicating the backlight condition.
14. The computer program product according to claim 13, wherein the first computer code comprises:
a third computer code for determining whether the histogram is perfectly polarized by determining whether the histogram includes a high luminance region that is completely separated from a low luminance region.
15. The computer program product according to claim 13, wherein the first computer code comprises:
a fourth computer code for determining whether the histogram is perfectly polarized using frequency and gradient values of the histogram.
16. The computer program product according to claim 13, wherein the second computer code performs a dynamic range correction and a tone curve correction on the input image data based on the determination made by the first computer code.
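To make the histogram test of claims 1 to 3 concrete, here is one hedged way a "perfectly polarized" histogram could be detected: a populated low-luminance cluster, a populated high-luminance cluster, and an essentially empty band between them, with the frequency gradient falling into the gap and rising out of it. The bin count, thresholds, and gradient test below are illustrative assumptions, not the values or exact procedure used in the patent.

```python
import numpy as np

def is_perfectly_polarized(lum, bins=64, empty_thresh=0.001,
                           side_mass=0.05, min_gap=3):
    """Return True when the luminance histogram splits into a populated
    low-luminance cluster and a populated high-luminance cluster separated
    by an essentially empty band."""
    hist, _ = np.histogram(lum, bins=bins, range=(0.0, 255.0))
    freq = hist / max(hist.sum(), 1)
    grad = np.diff(freq)                       # gradient of the frequency curve

    empty = freq < empty_thresh
    runs, start = [], None
    for i, e in enumerate(empty):              # collect runs of empty bins
        if e and start is None:
            start = i
        elif not e and start is not None:
            runs.append((start, i))
            start = None

    for s, e in runs:
        wide_enough = (e - s) >= min_gap
        populated_both_sides = freq[:s].sum() > side_mass and freq[e:].sum() > side_mass
        # The frequency should fall into the gap and rise out of it,
        # i.e. two clearly separated peaks.
        if wide_enough and populated_both_sides and grad[s - 1] < 0 and grad[e - 1] > 0:
            return True
    return False
```

Applied to the luminance array computed from the input image, `is_perfectly_polarized(lum)` would stand in for the determination step recited in the claims.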
US09/903,577 2000-07-14 2001-07-13 Method, apparatus and computer program product for processing image data Expired - Lifetime US6873729B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/064,233 US7006692B2 (en) 2000-07-14 2005-02-24 Method, apparatus and computer program product for processing image data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2000214105 2000-07-14
JP2000-214105 2000-07-14

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/064,233 Continuation US7006692B2 (en) 2000-07-14 2005-02-24 Method, apparatus and computer program product for processing image data

Publications (2)

Publication Number Publication Date
US20020024609A1 US20020024609A1 (en) 2002-02-28
US6873729B2 true US6873729B2 (en) 2005-03-29

Family

ID=18709747

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/903,577 Expired - Lifetime US6873729B2 (en) 2000-07-14 2001-07-13 Method, apparatus and computer program product for processing image data
US11/064,233 Expired - Fee Related US7006692B2 (en) 2000-07-14 2005-02-24 Method, apparatus and computer program product for processing image data

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/064,233 Expired - Fee Related US7006692B2 (en) 2000-07-14 2005-02-24 Method, apparatus and computer program product for processing image data

Country Status (1)

Country Link
US (2) US6873729B2 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020974A1 (en) * 2001-06-11 2003-01-30 Yuki Matsushima Image processing apparatus, image processing method and information recording medium
US20030099407A1 (en) * 2001-11-29 2003-05-29 Yuki Matsushima Image processing apparatus, image processing method, computer program and storage medium
US20030231347A1 (en) * 2002-02-26 2003-12-18 Toshie Imai Image processing apparatus, image processing method, image processing program, and medium recording the image processing program
US20050036033A1 (en) * 2003-05-16 2005-02-17 Toshie Imai Brightness correction for image
US20050140640A1 (en) * 2003-12-29 2005-06-30 Lg.Philips Lcd Co., Ltd. Liquid crystal display device and controlling method thereof
US20050140631A1 (en) * 2003-12-29 2005-06-30 Lg.Philips Lcd Co., Ltd. Method and apparatus for driving liquid crystal display device
US20060119713A1 (en) * 2002-09-10 2006-06-08 Tatsuya Deguchi Digital still camera and image correction method
US20060182338A1 (en) * 2003-12-12 2006-08-17 Fujitsu Limited Method and device for color balance correction, and computer product
US20070064250A1 (en) * 2005-09-16 2007-03-22 Yuki Matsushima Image forming apparatus and image forming method
US20080211923A1 (en) * 2007-01-12 2008-09-04 Canon Kabushiki Kaisha Imaging apparatus, image processing method, and computer program
US20090115907A1 (en) * 2007-10-31 2009-05-07 Masahiro Baba Image display apparatus and image display method
US20090295937A1 (en) * 2008-05-30 2009-12-03 Daisuke Sato Dynamic range compression apparatus, dynamic range compression method, computer-readable recording medium, integrated circuit, and imaging apparatus
US20100265359A1 (en) * 2002-07-12 2010-10-21 Seiko Epson Corporation Output image adjustment of image data
CN102693706A (en) * 2007-04-24 2012-09-26 瑞萨电子株式会社 Display device, and display driver
US8330970B2 (en) 2007-12-26 2012-12-11 Ricoh Company, Ltd. Image processing device, image processing method, and recording medium

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003337944A (en) * 2002-03-14 2003-11-28 Ricoh Co Ltd Image processor, host unit for image processing, image processing method, and computer-readable storage medium
JP4341295B2 (en) * 2003-05-16 2009-10-07 セイコーエプソン株式会社 Judging backlit human images
FI116327B (en) * 2003-09-24 2005-10-31 Nokia Corp Method and system for automatically adjusting color balance in a digital image processing chain, corresponding hardware and software means for implementing the method
US7986351B2 (en) 2005-01-27 2011-07-26 Qualcomm Incorporated Luma adaptation for digital image processing
JP4240023B2 (en) * 2005-08-31 2009-03-18 ソニー株式会社 Imaging apparatus, imaging method and imaging program, and image processing apparatus, image processing method and image processing program
US7715623B2 (en) * 2005-11-14 2010-05-11 Siemens Medical Solutions Usa, Inc. Diffusion distance for histogram comparison
JP2007178576A (en) * 2005-12-27 2007-07-12 Casio Comput Co Ltd Imaging apparatus and program therefor
US7813545B2 (en) * 2006-08-31 2010-10-12 Aptina Imaging Corporation Backlit subject detection in an image
US8207931B2 (en) * 2007-05-31 2012-06-26 Hong Kong Applied Science and Technology Research Institute Company Limited Method of displaying a low dynamic range image in a high dynamic range
CN101582991B (en) * 2008-05-13 2011-02-09 华为终端有限公司 Method and device for processing image
JP5424921B2 (en) * 2009-08-31 2014-02-26 キヤノン株式会社 Image processing apparatus and control method thereof
US9253375B2 (en) * 2013-04-02 2016-02-02 Google Inc. Camera obstruction detection
US11240439B2 (en) * 2018-12-20 2022-02-01 Canon Kabushiki Kaisha Electronic apparatus and image capture apparatus capable of detecting halation, method of controlling electronic apparatus, method of controlling image capture apparatus, and storage medium
US11818320B2 (en) 2020-04-17 2023-11-14 Ricoh Company, Ltd. Convert a dot area ratio of a target process color using a fluorescent color for higher brightness and saturation
JP2022040818A (en) 2020-08-31 2022-03-11 株式会社リコー Image processing device, image processing method, and program
JP2023080991A (en) 2021-11-30 2023-06-09 株式会社リコー Information processing apparatus, information processing system, information processing method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6888529B2 (en) * 2000-12-12 2005-05-03 Koninklijke Philips Electronics N.V. Control and drive circuit arrangement for illumination performance enhancement with LED light sources

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63184473A (en) 1987-01-27 1988-07-29 Canon Inc Gradation correction device
JPH04168879A (en) 1990-11-01 1992-06-17 Canon Inc Image forming device
US6577751B2 (en) * 1998-06-11 2003-06-10 Fuji Photo Film Co., Ltd. Image processing method capable of correcting red eye problem
US20040022434A1 (en) * 1998-06-24 2004-02-05 Canon Kabushiki Kaisha Image processing method and apparatus and storage medium
US6694051B1 (en) * 1998-06-24 2004-02-17 Canon Kabushiki Kaisha Image processing method, image processing apparatus and recording medium
US20020171852A1 (en) * 2001-04-20 2002-11-21 Xuemei Zhang System and method for digital image tone mapping using an adaptive sigmoidal function based on perceptual preference guidelines
US20030179398A1 (en) * 2002-03-20 2003-09-25 Hiroaki Takano Image forming method and image forming apparatus

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020974A1 (en) * 2001-06-11 2003-01-30 Yuki Matsushima Image processing apparatus, image processing method and information recording medium
US7525688B2 (en) 2001-06-11 2009-04-28 Ricoh Company, Ltd. Image processing apparatus, image processing method and information recording medium
US20080002216A1 (en) * 2001-06-11 2008-01-03 Yuki Matsushima Image processing apparatus, image processing method and information recording medium
US7251056B2 (en) 2001-06-11 2007-07-31 Ricoh Company, Ltd. Image processing apparatus, image processing method and information recording medium
US7167597B2 (en) 2001-11-29 2007-01-23 Ricoh Company, Ltd. Image processing apparatus, image processing method, computer program and storage medium
US20030099407A1 (en) * 2001-11-29 2003-05-29 Yuki Matsushima Image processing apparatus, image processing method, computer program and storage medium
US20030231347A1 (en) * 2002-02-26 2003-12-18 Toshie Imai Image processing apparatus, image processing method, image processing program, and medium recording the image processing program
US7200265B2 (en) * 2002-02-26 2007-04-03 Seiko Epson Corporation Image processing apparatus, image processing method, image processing program, and medium recording the image processing program
US20100265359A1 (en) * 2002-07-12 2010-10-21 Seiko Epson Corporation Output image adjustment of image data
US7580064B2 (en) * 2002-09-10 2009-08-25 Sony Corporation Digital still camera and image correction method
US8358355B2 (en) * 2002-09-10 2013-01-22 Sony Corporation Digital still camera and image correction method
US20060119713A1 (en) * 2002-09-10 2006-06-08 Tatsuya Deguchi Digital still camera and image correction method
US20090303345A1 (en) * 2002-09-10 2009-12-10 Tatsuya Deguchi Digital still camera and image correction method
US7486312B2 (en) * 2003-05-16 2009-02-03 Seiko Epson Corporation Brightness correction for image
US7932930B2 (en) 2003-05-16 2011-04-26 Seiko Epson Corporation Brightness correction for image
US20050036033A1 (en) * 2003-05-16 2005-02-17 Toshie Imai Brightness correction for image
US7664319B2 (en) * 2003-12-12 2010-02-16 Fujitsu Limited Method and device for color balance correction, and computer product
US20060182338A1 (en) * 2003-12-12 2006-08-17 Fujitsu Limited Method and device for color balance correction, and computer product
US7352352B2 (en) * 2003-12-29 2008-04-01 Lg.Philips Lcd Co., Ltd. Liquid crystal display device and controlling method thereof
US20050140631A1 (en) * 2003-12-29 2005-06-30 Lg.Philips Lcd Co., Ltd. Method and apparatus for driving liquid crystal display device
US8149195B2 (en) 2003-12-29 2012-04-03 Lg Display Co., Ltd. Method and apparatus for driving liquid crystal display device
US7782281B2 (en) * 2003-12-29 2010-08-24 Lg Display Co., Ltd. Method and apparatus for driving liquid crystal display device
US20050140640A1 (en) * 2003-12-29 2005-06-30 Lg.Philips Lcd Co., Ltd. Liquid crystal display device and controlling method thereof
US20100277518A1 (en) * 2003-12-29 2010-11-04 Eui Yeol Oh Method and apparatus for driving liquid crystal display device
US7510275B2 (en) 2005-09-16 2009-03-31 Ricoh Company, Ltd. Image forming apparatus and image forming method
US20070064250A1 (en) * 2005-09-16 2007-03-22 Yuki Matsushima Image forming apparatus and image forming method
US8094205B2 (en) 2007-01-12 2012-01-10 Canon Kabushiki Kaisha Imaging apparatus, image processing method and computer program for smoothing dynamic range of luminance of an image signal, color conversion process
US20080211923A1 (en) * 2007-01-12 2008-09-04 Canon Kabushiki Kaisha Imaging apparatus, image processing method, and computer program
US8957991B2 (en) 2007-01-12 2015-02-17 Canon Kabushiki Kaisha Imaging apparatus, image processing method and computer program for smoothing dynamic range of luminance of an image signal, color conversion process
CN102693706A (en) * 2007-04-24 2012-09-26 瑞萨电子株式会社 Display device, and display driver
US8134532B2 (en) * 2007-10-31 2012-03-13 Kabushiki Kaisha Toshiba Image display apparatus and image display method
US20090115907A1 (en) * 2007-10-31 2009-05-07 Masahiro Baba Image display apparatus and image display method
US8330970B2 (en) 2007-12-26 2012-12-11 Ricoh Company, Ltd. Image processing device, image processing method, and recording medium
US8169500B2 (en) * 2008-05-30 2012-05-01 Panasonic Corporation Dynamic range compression apparatus, dynamic range compression method, computer-readable recording medium, integrated circuit, and imaging apparatus
US20090295937A1 (en) * 2008-05-30 2009-12-03 Daisuke Sato Dynamic range compression apparatus, dynamic range compression method, computer-readable recording medium, integrated circuit, and imaging apparatus

Also Published As

Publication number Publication date
US20020024609A1 (en) 2002-02-28
US7006692B2 (en) 2006-02-28
US20050141763A1 (en) 2005-06-30

Similar Documents

Publication Publication Date Title
US7006692B2 (en) Method, apparatus and computer program product for processing image data
US7932930B2 (en) Brightness correction for image
US7598990B2 (en) Image signal processing system and electronic imaging device
US7057653B1 (en) Apparatus capable of image capturing
US6583820B1 (en) Controlling method and apparatus for an electronic camera
JPH0355078B2 (en)
US20040246350A1 (en) Image pickup apparatus capable of reducing noise in image signal and method for reducing noise in image signal
US7324702B2 (en) Image processing method, image processing apparatus, image recording apparatus, program, and recording medium
US6256414B1 (en) Digital photography apparatus with an image-processing unit
EP0826285B1 (en) Optimal tone scale mapping in electronic cameras
US20040105107A1 (en) Image sensing device and image processing method
JP2002092607A (en) Image processor, image state determining method and recording medium
US8102446B2 (en) Image capturing system and image processing method for applying grayscale conversion to a video signal, and computer-readable recording medium having recorded thereon an image processing program for applying grayscale conversion to a video signal
US20050052559A1 (en) Image processing apparatus
JP4099366B2 (en) Image processing apparatus, image reading apparatus, image forming apparatus, and image processing method
JP3995524B2 (en) Knee correction circuit and imaging apparatus
US8665351B2 (en) Image processing device, electronic camera, and storage medium storing image processing program
JP2000078437A (en) Video signal processor
JPH09102874A (en) Image processor
JP3460927B2 (en) Image conversion method and recording medium
JP2000261719A (en) Image processing method, its system and recording medium
JP2005142952A (en) Imaging unit
JP2005086772A (en) Image processing method, image processing apparatus, image forming device, image pickup device, and computer program
US8106977B2 (en) Image capturing system and image processing method for applying grayscale conversion to a video signal, and computer-readable recording medium having recorded thereon an image processing program for applying grayscale conversion to a video signal
JP2005303481A (en) Gray scale correcting device and method, electronic information apparatus

Legal Events

Date Code Title Description
AS Assignment; Owner name: RICOH COMPANY, LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MATSUSHIMA, YUKI;REEL/FRAME:012288/0692; Effective date: 20010816
STCF Information on status: patent grant; Free format text: PATENTED CASE
FPAY Fee payment; Year of fee payment: 4
FEPP Fee payment procedure; Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
FPAY Fee payment; Year of fee payment: 8
FPAY Fee payment; Year of fee payment: 12