US20050018923A1 - Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject - Google Patents


Info

Publication number
US20050018923A1
US20050018923A1 (Application No. US10/848,815)
Authority
US
United States
Prior art keywords
image
skin
taken
subject
processing method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/848,815
Inventor
Giuseppe Messina
Sebastiano Battiato
Alfio Castorina
Laurent Plaza
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
STMicroelectronics SA
STMicroelectronics SRL
Original Assignee
STMicroelectronics SA
STMicroelectronics SRL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by STMicroelectronics SA, STMicroelectronics SRL filed Critical STMicroelectronics SA
Publication of US20050018923A1 publication Critical patent/US20050018923A1/en
Assigned to STMICROELECTRONICS S.R.L., STMICROELECTRONICS SA reassignment STMICROELECTRONICS S.R.L. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: STMICROELECTRONICS S.R.L.
Priority to US11/949,709 priority Critical patent/US7778483B2/en
Assigned to STMICROELECTRONICS S.R.L. reassignment STMICROELECTRONICS S.R.L. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BATTIATO, SEBASTIANO, CASTORINA, ALFIO, MESSINA, GIUSEPPE, PLAZA, LAURENT
Abandoned legal-status Critical Current


Classifications

    • G06T5/90
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration by the use of histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/162Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6027Correction or control of colour gradation or colour contrast


Abstract

A digital image processing method includes extracting chromatic information of an image taken by an image taking device and related to a human subject; detecting visually interesting regions; and correcting exposure of the taken image by normalizing a grey scale of the taken image based on the visually interesting regions. Advantageously, the method includes recognizing areas corresponding to the skin of the subject, these areas being used as the visually interesting regions for the exposure correction step.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a digital image processing method. The invention relates particularly, but not exclusively, to a method for processing images of human subjects photographed by portable image taking devices, particularly backlit subjects, and the following description refers to this field of application for convenience of illustration only.
  • 2. Description of the Related Art
  • As is well known, one of the main problems limiting photographic image quality is the generation of sub-optimal photographs due to incorrect exposure of the photographed subject to light.
  • This problem is particularly acute in portable devices such as mobile phones, in which several factors combine to produce wrongly exposed photographs: the small size of the available optics, the lack of a flash device, and the like. Moreover, the nature of portable devices and the typical use of the photographs they produce, particularly in connection with the so-called multimedia messaging services (MMS), lead to the acquisition of photographs of the type shown in FIG. 3.
  • Although it is impossible to provide a precise definition of a correct exposure, since the exposure depends on the photographed subject as well as on the personal taste of the person looking at the photograph, it is however possible to state that, for “normal” subjects (and thus not considering extreme cases, like a snow-covered landscape whose correct acquisition would involve an intentional photograph overexposure), a correct exposure is obtained when the main features of the photographic image are reproduced by using an intermediate grey level.
  • In the image processing field several techniques for improving the tone quality of photographic images are well known, such as histogram equalization, grey-level slicing, and histogram stretching.
  • Although advantageous under many aspects, these prior art techniques have several drawbacks mainly linked to the fact of being independent from the visual content of the photographed images.
  • The article entitled “Automated Global Enhancement of Digitized Photographs” by Bhukhanwale et al., published in IEEE Transactions on Consumer Electronics, vol. 40, no. 1, 1994, which is hereby incorporated by reference in its entirety, instead describes an algorithm capable of identifying visually important regions in a photographic image and adjusting the image exposure so that these regions occupy intermediate tone levels.
  • Moreover, the European patent application no. EP 01830803.1 filed in the name of STMicroelectronics, the assignee of the present application, which is hereby incorporated by reference in its entirety, describes an algorithm similarly capable of identifying visually important regions in a photographic image in order to place them at intermediate tone levels. This algorithm directly processes images of the Bayer Pattern type and simplifies the statistical measures used to detect regions of the image having a high information content, i.e., visually important regions.
  • The algorithms provided in this document directly operate on the image in the Bayer Pattern format and they comprise the following steps:
      • extraction of the Bayer Pattern green plane or channel G: this plane provides a good approximation of the luminance Y.
        • visual analysis: once the channel G has been extracted, the visually interesting regions are identified on this channel. For this purpose, the green plane is split into N blocks having the same size and the following statistical values are calculated for each block:
        • focus: it characterizes the block sharpness and it is used for identifying the regions comprising high-frequency components, corresponding to details of the photographed image;
        • contrast: it is related to the image tone range—the higher the contrast, the higher the insulation of the so-called clusters of points in the block, i.e., the higher the block visual impact.
  • In order to obtain important visual features independently from the lighting conditions of the photographed image, the visual analysis is performed on an image having an intermediate luminosity, produced by making a temporary correction based only on the average value of the channel G calculated on the whole plane. A rough sketch of the block statistics used in this visual analysis is given below.
  • The algorithms further perform exposure adjustment: once the visually interesting regions have been detected, the exposure adjustment is performed by using the average grey levels of these regions as reference values. In greater detail, the photographed image is changed so as to bring the average value of these regions to a target value T by changing all the pixels belonging to the Bayer Pattern. This target value T should be a value around 128 and it should take into account a possible correction range applied after the color reconstruction of the corrected Bayer Pattern. This means that, in certain cases, the target value T could be substantially lower than 128.
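  • A rough illustrative sketch of the block-based visual analysis described above follows, assuming an 8×8 grid of blocks, a mean absolute gradient as the focus measure and the grey-level standard deviation as the contrast measure; these specific choices are assumptions for illustration, not the exact statistics of the cited algorithm.

```python
import numpy as np

def block_statistics(green_plane, blocks_per_side=8):
    """Split the green plane into equal-size blocks and compute, for each block,
    a focus measure (mean absolute gradient, a proxy for high-frequency detail)
    and a contrast measure (grey-level standard deviation). Illustrative
    stand-ins for the statistics of the cited algorithm."""
    h, w = green_plane.shape
    bh, bw = h // blocks_per_side, w // blocks_per_side
    stats = []
    for by in range(blocks_per_side):
        for bx in range(blocks_per_side):
            block = green_plane[by * bh:(by + 1) * bh,
                                bx * bw:(bx + 1) * bw].astype(np.float64)
            focus = (np.abs(np.diff(block, axis=0)).mean()
                     + np.abs(np.diff(block, axis=1)).mean())
            contrast = block.std()
            stats.append({"block": (by, bx), "focus": focus, "contrast": contrast})
    return stats
```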
  • To this aim, a simulated response curve of a digital image taking device or camera is used, schematically shown in FIG. 1.
  • This curve gives an evaluation of how the light values picked up by the camera are turned into pixel values, i.e., it represents the function:
    ƒ(q)=I  (1)
    q being the light amount and I the final pixel value.
  • This simulated response function (1) of a camera can be expressed in a parametric way:
    ƒ(q)=255/(1+e^(−Aq))^C  (2)
  • A and C being the control parameters of the curve shape and the value q being expressed in base 2 logarithmic units (also known as “stops”). It is possible to evaluate these control parameters A and C by using the information comprised in the article by Mann et al. entitled “Comparametric Equations with Practical Applications in Quantigraphic Image Processing”, IEEE Transactions on Image Processing, Vol. 9, no. 8, 2000, which is hereby incorporated by reference in its entirety.
  • It is also possible to obtain experimentally the values of these parameters A and C or to set them in order to realize a particular final effect (for example, a more or less marked improvement of the contrast). In particular, FIG. 1 shows the trend of the simulated response curve expressed by the formula (2) with A=7 and C=0.13.
  • By using this simulated response curve f and an average grey level avg for the visually important regions, the distance Δ from an ideal exposure situation is expressed as:
    Δ=ƒ−1(128)−ƒ−1(avg)  (3)
    and the grey value I(x, y) of a pixel with position (x, y) is thus changed into:
    I′(x,y)=ƒ(ƒ−1(I(x,y))+Δ)  (4)
    It is worth noting that all the grey values of the pixels are corrected.
  • In particular, the above-mentioned changes are substantially a look-up table (LUT) transformation (i.e., they can be put in a table in order to be referred to later), and FIGS. 2A and 2B show two different transformations (the curves LUT1 and LUT2) generated from a first simulated response curve f1 with values A=7 and C=0.13 and a second simulated response curve f2 with values A=0.85 and C=1.
  • It is worth noting that the distance or offset from the value 128 is 1.24 for f1 and 0.62 for f2 respectively (starting from the same input value equal to 72).
  • From the FIGS. 2A and 2B it is evident that the first curve LUT1 has a more linear trend, while the second curve LUT2 has a so-called range trend.
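  • The following sketch illustrates the LUT-based correction of formulas (1) to (4), assuming the parametric response reconstructed above, ƒ(q)=255/(1+e^(−Aq))^C, with the parameters A=7 and C=0.13 taken from the text; the clipping of out-of-range values is an added assumption.

```python
import numpy as np

def f(q, A=7.0, C=0.13):
    """Simulated camera response of formula (2): light amount q (in stops) -> pixel value."""
    return 255.0 / (1.0 + np.exp(-A * q)) ** C

def f_inv(i, A=7.0, C=0.13):
    """Inverse response: pixel value -> light amount q (inputs clipped to stay invertible)."""
    i = np.clip(i, 1e-3, 254.999)
    return -np.log((255.0 / i) ** (1.0 / C) - 1.0) / A

def exposure_lut(avg, target=128.0, A=7.0, C=0.13):
    """Build the look-up table of formula (4): every grey level is shifted by
    the offset Delta = f^-1(target) - f^-1(avg) of formula (3)."""
    delta = f_inv(target, A, C) - f_inv(avg, A, C)
    levels = np.arange(256, dtype=np.float64)
    return np.clip(f(f_inv(levels, A, C) + delta, A, C), 0, 255).astype(np.uint8)

# Usage: corrected = exposure_lut(avg=72)[grey_image]  (grey_image as a uint8 array)
```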
  • Although advantageous under several aspects, these prior art techniques are not very effective in the case of portable devices like mobile phones, for which the photographic images are often backlit and mainly focused on human figures, as when the user transmits images for videophony, as shown in FIG. 3.
  • BRIEF SUMMARY OF THE INVENTION
  • One embodiment of the present invention provides an image processing method having such features as to overcome the limits still affecting prior art techniques.
  • One embodiment of the present invention detects the skin of the subject being photographed in the photograph in order to select suitable regions of interest on whose basis an exposure adjustment/correction is applied.
  • One embodiment of the present invention is directed to a digital image processing method that includes: extracting chromatic information of an image taken by an image taking device and related to a human subject; detecting visually interesting regions in the taken image by recognizing areas corresponding to skin of the subject, wherein the recognized areas are the visually interesting regions; and correcting exposure of the taken image by normalizing a grey scale of the taken image based on the visually interesting regions.
  • The features and advantages of the method according to the invention will be apparent from the following description of an embodiment thereof given by way of non-limiting example with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the drawings:
  • FIG. 1 shows the trend of a simulated response curve of a known image taking device;
  • FIGS. 2A and 2B show LUT transformations related to different curves like the one in FIG. 1;
  • FIG. 3 shows an image of a backlit subject taken by a known image taking device;
  • FIG. 4 shows an illustrative diagram of a step of the image processing method according to one embodiment of the invention;
  • FIGS. 5A-5C and 6A-6C show successive image processing results for detecting important areas used in a step of the image processing method according to one embodiment of the invention;
  • FIGS. 7A-7D schematically show the method according to one embodiment of the invention by means of successive image processing results;
  • FIG. 8A shows an image of a subject; FIG. 8B shows an image of the subject of FIG. 8A with areas highlighted corresponding to the skin of the subject; and FIG. 8C shows a detection histogram of the image of FIG. 8A;
  • FIGS. 9A-12B show processed images obtained by the method according to alternate embodiments of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • An image processing method according to one embodiment of the invention performs an exposure correction of a digital photographed image taken by an image taking device on the basis of a recognition algorithm of the skin of the photographed subject, thus improving the final photographic image quality, in a decisive way in the case of backlit subjects as in FIG. 3.
  • In particular, the method comprises the following steps:
      • 1) a first extraction step of chromatic information from the photographic image;
      • 2) a second visual analysis step using a recognition method of the areas corresponding to the skin of the subject photographed in the photographic image; and
      • 3) a third exposure adjustment step of the obtained photographic image.
  • 1) Extraction Step
  • As with the known image processing methods discussed above, the method extracts the green channel G of the taken image when the image is in the Bayer format.
  • Otherwise, it extracts the luminance channel Y for images of the YCbCr type obtained from RGB images.
  • 2) Visual Analysis Step
  • This analysis can be performed on:
      • 1. images in the RGB format;
      • 2. images in the Bayer Pattern format, generating a suitably sub-sampled RGB copy from the initial image.
  • In particular, by using this skin recognition method, a plurality of visually interesting regions corresponding to the skin of the photographed subject are detected.
  • The chromatic information obtained during the first extraction step is thus used.
  • In particular, using Bayer data, it is possible to operate on three color planes and on sub-samples having a size corresponding to a quarter of the initial data, as schematically shown in FIG. 4, thus considerably reducing the calculation efforts of the method.
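  • A minimal sketch of this Bayer sub-sampling is given below, assuming a GRBG-like 2×2 cell layout (G1 R / B G2); the actual sensor layout may differ and the indexing would have to be adapted accordingly. Each 2×2 cell yields one RGB pixel, with the two green samples averaged as in formula (14) further below.

```python
import numpy as np

def bayer_to_quarter_rgb(bayer):
    """Collapse each 2x2 Bayer cell into one RGB pixel, producing a color
    image with a quarter of the initial data. A GRBG cell layout
    (G1 R / B G2) is assumed here; adapt the indexing to the real pattern."""
    g1 = bayer[0::2, 0::2].astype(np.float32)
    r = bayer[0::2, 1::2].astype(np.float32)
    b = bayer[1::2, 0::2].astype(np.float32)
    g2 = bayer[1::2, 1::2].astype(np.float32)
    g_star = (g1 + g2) / 2.0                    # G* = (G1 + G2) / 2
    return np.stack([r, g_star, b], axis=-1)    # quarter-size R, G*, B planes
```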
  • 3) Third Exposure Adjustment Step
  • This adjustment can be performed in two ways:
      • 1. correction of images in the RGB format;
      • 2. correction of images in the Bayer Pattern format before a following color interpolation algorithm.
  • In the case of the correction of images in the RGB format, once the visually important pixels have been detected as mentioned above (i.e., the pixels belonging to the area corresponding to the skin of the photographed subject), a known exposure correction algorithm is used, wherein the average grey level of the pixel clusters known to belong to the skin of the photographed subject is taken as the reference value.
  • In other words, the pixels belonging to the subject skin are brought to the intermediate level of the image grey scale and all the remaining image pixels are re-mapped based on this average level.
  • In particular, once the luminance value has been corrected from an original value Y to a revised value Y′ that reflects the average grey level of the known pixel clusters corresponding to the skin of the subject, according to the above-mentioned steps (2), (3) and thus using the information comprised in the article by Sakaue et al. entitled “Adaptive Gamma Processing of the Video Cameras for the Expansion of the Dynamic Range”, IEEE Transactions on Consumer Electronics, Vol. 41, no. 3, August 1995, which is hereby incorporated by reference in its entirety, starting from a curve of the type shown in FIG. 1, the pixel chromatic values can be reconstructed according to the formulas:
    R′=0.5·((Y′/Y)·(R+Y)+R−Y)  (5)
    G′=0.5·((Y′/Y)·(G+Y)+G−Y)  (6)
    B′=0.5·((Y′/Y)·(B+Y)+B−Y)  (7)
    R, G, B being the color values of the input pixels and R′, G′, B′ the corrected values.
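  • A brief sketch of this reconstruction, applying formulas (5) to (7) to every pixel, is given below; the Rec. 601 weights used to derive the input luminance Y from the RGB values are an assumption for illustration, as the text does not specify how Y is obtained at this point.

```python
import numpy as np

def reconstruct_rgb(rgb, y_corrected):
    """Rebuild R', G', B' from the corrected luminance Y' using formulas (5)-(7):
    C' = 0.5 * ((Y'/Y) * (C + Y) + C - Y) for each channel C in {R, G, B}."""
    rgb = rgb.astype(np.float64)
    # Input luminance Y; Rec. 601 weights assumed for illustration.
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    y = np.maximum(y, 1e-6)
    ratio = (y_corrected / y)[..., None]
    y3 = y[..., None]
    out = 0.5 * (ratio * (rgb + y3) + rgb - y3)
    return np.clip(out, 0, 255).astype(np.uint8)
```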
  • In the case of the correction of images in the Bayer format the formulas (5), (6) and (7) cannot be used and the output product will be obtained by simply applying the relation (4) to all the pixels of the pattern.
  • The recognition method of the areas corresponding to the skin of the subject photographed in the photographic image will now be described in greater detail.
  • Several recognition methods of the color of the skin of the photographed subject are known, substantially based on the application of a threshold to a color probability measure for the skin.
  • In fact, the colors of the human skin belong to a particular color category, different from the colors of most natural objects. In particular, in the article by Zarit et al. entitled “Comparison of Five Color Models in Skin Pixel Classification”, Proc. of Int. Workshop on Recognition, Analysis and Tracking of Faces and Gestures in Real-Time Systems, IEEE Computer Society, Corfu, Greece, pages 58-63, 1999, which is hereby incorporated by reference in its entirety, it has been shown that the colors of the human skin are clustered and that the skin color differences between subjects are substantially due to differences in intensity, so that they can be reduced by using only the chrominance component of the subject image.
  • Moreover, in the article by Yang et al. entitled “Skin-Color Modeling and Adaptation”, Technical Report CMU-CS-97-146, School of Computer Science, Carnegie Mellon University, 1997, which is hereby incorporated by reference in its entirety, it has been shown that the human skin color slicing can be represented by a bidimensional Gaussian function in the chrominance plane. The center of this slicing is determined by the mean vector μ and the amplitude of its bell is determined by the covariance matrix Σ, these two values being estimated starting from a convenient group of test data.
  • The conditional probability p(x|s) that a pixel block belongs to a human skin color class, given its chrominance vector x, is thus given by:
    p(x|s) = (2π)^(−1)·|Σ|^(−1/2)·exp{−[d(x)]²/2}  (8)
    d(x) being the so-called Mahalanobis distance of the chrominance vector x from the mean vector μ, defined as:
    [d(x)]² = (x−μ)′·Σ^(−1)·(x−μ)  (9)
    In other words, the value of the Mahalanobis distance d(x) of a pixel block with chrominance vector x determines the probability that this block belongs to a predetermined human skin color class: the higher the Mahalanobis distance d(x), the lower the probability that the block belongs to this class.
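  • A minimal sketch of formulas (8) and (9) follows, assuming the mean vector and covariance matrix have already been estimated from training skin samples; the numeric values shown are placeholders, not the parameters actually used by the method.

```python
import numpy as np

def skin_probability(chroma, mu, sigma):
    """p(x|s) of formula (8) for an array of chrominance vectors `chroma` (N x 2),
    using the squared Mahalanobis distance of formula (9)."""
    sigma_inv = np.linalg.inv(sigma)
    diff = chroma - mu                                     # x - mu
    d2 = np.einsum('ni,ij,nj->n', diff, sigma_inv, diff)   # [d(x)]^2
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(sigma)))
    return norm * np.exp(-0.5 * d2)

# Placeholder parameters (illustrative only, e.g., in the Cb-Cr plane):
mu = np.array([110.0, 150.0])
sigma = np.array([[60.0, 10.0],
                  [10.0, 40.0]])
```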
  • Given the large number of color spaces, distance measures and bidimensional slicings, a considerable variety of image processing algorithms can be considered. Moreover, the lighting conditions and the color models can change according to the image taking modes.
  • Advantageously, the method comprises a step of recognizing a portion of the photographic image corresponding to the subject skin.
  • In a first embodiment, this recognition step is substantially based on a probabilistic function.
  • In particular, for each pixel of an image taken in the YCbCr format, a probabilistic slicing is prepared to evaluate whether the pixel must be classified as belonging to the subject skin. Based on this slicing, a new image with a normalized grey scale is then produced, wherein the subject skin is highlighted, as shown in FIGS. 5A-6C on two different photographic images depicting human subjects.
  • Based on this first embodiment of the recognition step of the photographed subject skin, the image pixels with higher grey values are considered as belonging to the skin of the photographed subject.
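  • A small sketch of this step is given below; it can take as input the per-pixel output of the skin_probability helper sketched above. The fixed-quantile cut-off used to decide which “higher grey values” count as skin is an assumption, since the exact selection rule is not specified at this level of detail.

```python
import numpy as np

def skin_map_from_probability(prob, keep_quantile=0.9):
    """Scale per-pixel skin probabilities to a 0-255 grey map (the normalized
    grey-scale image) and keep the brightest pixels as skin. The quantile
    cut-off is an illustrative assumption."""
    span = np.ptp(prob) + 1e-9
    grey = (255.0 * (prob - prob.min()) / span).astype(np.uint8)
    threshold = np.quantile(grey, keep_quantile)
    return grey, grey >= threshold
```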
  • The areas detected by using this first embodiment of the recognition step of the skin of the photographed subject, applied to the image shown in FIG. 5A, are shown in FIG. 5C. Similarly, FIG. 6C highlights areas of the image of FIG. 6A corresponding to the skin of another photographed subject, according to the first embodiment of the recognition step.
  • In a second embodiment, the recognition step of the image areas corresponding to the skin of the photographed subject is substantially based on a single threshold area and it provides an RGB-format image processing in order to produce a chrominance slicing histogram starting from normalized channels r and g as described in the article by Soriano et al. entitled “Skin Color Modeling Under Varying Illumination Conditions Using the Skin Locus for Selecting Training Pixels”, Real-time Image Sequence Analysis (RISA2000, August 31-September 1, Finland), which is hereby incorporated by reference in its entirety. In particular, the normalized channels r and g are defined as:
    r=R/(R+G+B)  (10)
    g=G/(R+G+B)  (11)
    The resulting bidimensional histogram shows the chrominance slicing in the image and the areas having the right human skin chrominance slicing are thus detected by applying a single threshold area. In particular, the pixels of the processed image belonging to the threshold area are classified as belonging to the subject skin.
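  • The following sketch illustrates this second embodiment: the image is mapped to normalized (r, g) chromaticities with formulas (10) and (11), a bidimensional histogram is built, and pixels whose (r, g) pair falls inside a single rectangular threshold area are classified as skin. The threshold bounds below are placeholders; the skin locus of the cited Soriano et al. article is more elaborate than a rectangle.

```python
import numpy as np

def skin_mask_rg(rgb, r_range=(0.36, 0.47), g_range=(0.28, 0.36), bins=64):
    """Classify pixels as skin via the normalized channels r and g of
    formulas (10)-(11) and a single threshold area (here a rectangle).
    Returns the boolean skin mask and the bidimensional r-g histogram."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1) + 1e-6
    r = rgb[..., 0] / total    # r = R / (R + G + B)
    g = rgb[..., 1] / total    # g = G / (R + G + B)
    hist, _, _ = np.histogram2d(r.ravel(), g.ravel(), bins=bins, range=[[0, 1], [0, 1]])
    mask = ((r >= r_range[0]) & (r <= r_range[1]) &
            (g >= g_range[0]) & (g <= g_range[1]))
    return mask, hist
```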
  • Similarly, FIG. 5B shows the areas, detected by using this second embodiment of the recognition step, corresponding to the skin of the photographed subject from an image shown in FIG. 5A of a human subject. Also, FIG. 6B highlights areas of the image of FIG. 6A corresponding to the skin of another photographed subject, according to the second embodiment of the recognition step.
  • FIG. 7A-D schematically show the following processings of an image concerning a human subject, particularly backlit, after the different steps of the method according to alternate embodiments of the invention.
  • In particular, on a taken image (FIG. 7A) the recognition step of the areas corresponding to the photographed subject skin is performed with a probabilistic (FIG. 7B) or threshold (FIG. 7C) method.
  • An exposure correction step can thus be performed by using the areas detected as belonging to the skin of the photographed subject in order to normalize the grey levels, obtaining a final processed image (FIG. 7D); the improved image quality is immediately evident by comparing it with the starting image (FIG. 7A).
  • Advantageously, the regions detected in the recognition step as belonging to the subject's skin are used as visually important regions for the following exposure adjustment step of the photographic image.
  • In a preferred embodiment of the image processing method according to the invention, the recognition step of the areas belonging to the photographed subject's skin processes an 8-bit image of the Bayer type, constructing a sub-sampled color image with a size corresponding to a quarter of the initial data, obtained as previously described and schematically shown in FIG. 4.
  • Starting from this sub-sampled color image, a recognition step of the areas belonging to the photographed subject's skin is performed, using a chrominance slicing histogram according to the first, probabilistic embodiment or the normalized channels r and g according to the second, threshold-based embodiment.
  • However, in this second case, the normalized channels r and g are defined as:
    r=R/(R+G*+B)  (12)
    g=G*/(R+G*+B)  (13)
    being
    G*=(G1+G2)/2  (14)
    The resulting bidimensional histogram shows the chrominance slicing of the processed image, therefore the areas corresponding to the photographed subject skin, as schematically shown in FIGS. 8A-C, showing in series a taken image of the Bayer type (FIG. 8A), the image (FIG. 8B) processed to detect the areas corresponding to the photographed subject skin and a detection histogram (FIG. 8C) r-g of these areas.
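  • On the sub-sampled Bayer-derived planes, the chromaticity computation reduces to the following sketch of formulas (12) to (14); the surrounding histogram and threshold logic are as in the previous sketch.

```python
import numpy as np

def normalized_rg_from_bayer_planes(r_plane, g1_plane, g2_plane, b_plane):
    """Normalized chromaticities of formulas (12)-(13), computed on the
    quarter-size planes extracted from the Bayer pattern, with
    G* = (G1 + G2) / 2 as in formula (14)."""
    g_star = (g1_plane.astype(np.float64) + g2_plane.astype(np.float64)) / 2.0
    total = r_plane + g_star + b_plane + 1e-6
    return r_plane / total, g_star / total    # r, g
```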
  • The method finally comprises a reconstruction step of the color of the image taken according to the relations (5) to (7), already shown with reference to the prior art, R, G, B and R′, G′, B′ being the red, green and blue values of the images being respectively taken and processed.
  • The step sequence described above can be easily modified to allow the correction to be performed directly on images in the Bayer Pattern format, in favor of a further computational simplification. In fact, once the image for the skin detection has been constructed according to the diagram of FIG. 4, the average value calculated for the regions concerned can be used to directly perform the Bayer Pattern correction, using for example the modes described in the above-mentioned European patent application no. 01830803.1.
  • It is however worth noting that the color reconstruction formulas described in the equations (5), (6), (7) cannot be used in this case and the output of the corrected Bayer Pattern will be obtained by simply applying the relation (4) to all the pattern pixels.
  • In other words, the grey value I(x, y) of a pixel with position (x, y) is modified in:
    I′(x,y)=ƒ(ƒ−1(I(x,y))+Δ),  (15)
    where Δ is the distance from the ideal exposure situation as expressed in relation (3).
  • The image processing of a backlit subject, performed by using a CMOS-VGA sensor and an evaluation kit on the Windows® platform, is shown in FIGS. 9A-9B, wherein in panel V the areas detected as belonging to the photographed subject's skin have been indicated on a black background.
  • Similarly, FIGS. 10A-12B show the results of a simulation of the method performed by the inventors, starting from images taken by a common VGA sensor in the compressed JPEG format (FIGS. 10A, 11A) and by a 4.1 Mpixel CCD sensor of a traditional mid-range DSC (Digital Still Camera) (FIG. 12A); the corresponding images processed with the method are shown in FIGS. 10B, 11B and 12B respectively, wherein the qualitative improvement of the images is completely evident.
  • All of the above U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety.
  • From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the invention. Accordingly, the invention is not limited except as by the appended claims.

Claims (23)

1. A digital image processing method, comprising the steps of:
extracting chromatic information of an image taken by an image taking device and related to a human subject;
detecting visually interesting regions in the taken image by recognizing areas corresponding to skin of the subject, wherein the recognized areas are the visually interesting regions; and
correcting an exposure of said taken image by normalizing a grey scale of said taken image based on said visually interesting regions.
2. A digital image processing method according to claim 1, wherein said recognizing step comprises constructing a probabilistic slicing of said image taken in a YCbCr format to evaluate if pixels of said image must be classified as belonging to said areas corresponding to the skin of said subject.
3. A digital image processing method according to claim 2, wherein pixels with higher grey values are classified as belonging to said areas corresponding to the skin of said photographed subject.
4. A digital image processing method according to claim 1, wherein said recognizing step comprises applying a threshold area of said image taken in an RGB format to evaluate if pixels of said image must be classified as belonging to said areas corresponding to the skin of said subject.
5. A digital image processing method according to claim 4, wherein applying said threshold area comprises constructing a chrominance slicing histogram of said taken image.
6. A digital image processing method according to claim 5, wherein constructing said chrominance slicing histogram uses normalized channels r and g of the type:

r=R/(R+G+B)
g=G/(R+G+B)
R, G and B being red, green and blue values of each pixel of said taken RGB image.
7. A digital image processing method according to claim 5, wherein said recognizing step uses said chrominance slicing histogram to detect said areas corresponding to the skin of said subject formed by the pixels of said taken image belonging in said chrominance slicing histogram to said threshold area.
8. A digital image processing method according to claim 5, wherein said taken image is of a Bayer type, the method further comprising:
sub-sampling the image according to G*=(G1+G2)/2, wherein G1 and G2 are first and second green channels of the image, and said step of constructing said chrominance slicing histogram uses normalized channels r and g of the type:

r=R/(R+G*+B)
g=G*/(R+G*+B),
R, G and B being red, green and blue values of each pixel of said taken RGB image.
9. A digital image processing method according to claim 1, wherein said taken image is of a Bayer type.
10. A digital image processing method according to claim 9, wherein said exposure correction of said taken image uses:
a simulated response function of a type:
ƒ(q) = 255/(1 + e^(−Aq))^C
A and C being predetermined control parameters and q being a light quantity value expressed in base 2 logarithmic units; and
a grey average level (avg) calculated on said visually interesting regions, in order to calculate a distance Δ of an ideal exposure situation using:

Δ=ƒ−1(128)−ƒ−1(avg)
and in order to change a luminance value Y(x, y) of a pixel with position (x, y) in:

Y′(x,y)=ƒ(ƒ−1(Y(x,y))+Δ).
11. A digital image processing method according to claim 1, further comprising a final color reconstruction step.
12. A digital image processing method according to claim 11, said image being taken in an RGB format, wherein said final color reconstruction step comprises the relations:
R′ = 0.5·((Y′/Y)·(R+Y) + R − Y)
G′ = 0.5·((Y′/Y)·(G+Y) + G − Y)
B′ = 0.5·((Y′/Y)·(B+Y) + B − Y)
R, G, B, and Y being respective red, green, blue, and luminance values of said taken image, Y′ being a desired luminance value, and R′, G′, and B′ being respective red, green, and blue values of the image after said final color reconstruction step.
13. A digital image processing method according to claim 11, said image being taken in a Bayer Pattern format, wherein said final color reconstruction step provides that a grey value I(x, y) of a pixel with position (x, y) is changed in:

I′(x,y)=ƒ(ƒ−1(I(x,y))+Δ),
where Δ is a distance of an ideal exposure situation.
14. A digital image processing method, comprising:
extracting chromatic information from an image of a human subject;
detecting, based on the extracted chromatic information, which areas of the image correspond to skin of the subject; and
normalizing grey scale values of the image based on the areas of the image that are detected as corresponding to the skin of the subject.
15. The method of claim 14, wherein the detecting step comprises constructing a probabilistic slicing of the image taken in a YCbCr format to evaluate if pixels of the image belong to the areas corresponding to the skin of the subject.
16. The method of claim 14, wherein the detecting step comprises applying a threshold area of the image taken in an RGB format to evaluate if pixels of the image belong to the areas corresponding to the skin of the subject.
17. The method of claim 16, wherein applying the threshold area comprises constructing a chrominance slicing histogram of the image and using the chrominance slicing histogram to detect the areas corresponding to the skin of the subject formed by the pixels of the image belonging in the chrominance slicing histogram to the threshold area.
18. The method of claim 16, wherein applying the threshold area comprises constructing a chrominance slicing histogram of the image using normalized channels r and g of the type:

r=R/(R+G+B)
g=G/(R+G+B)
R, G and B being red, green and blue values of each pixel of the image.
19. The method of claim 16, wherein applying the threshold area comprises constructing a chrominance slicing histogram of the image and the image is of a Bayer type, the method further comprising:
sub-sampling the image according to G*=(G1+G2)/2, wherein G1 and G2 are first and second green channels of the image, and said step of constructing said chrominance slicing histogram uses normalized channels r and g of the type:

r=R/(R+G*+B)
g=G*/(R+G*+B),
R, G and B being red, green and blue values of each pixel of said taken RGB image.
20. The method of claim 14, wherein the normalizing step performs exposure correction of the image that includes:
using a simulated response function of a type:
ƒ(q) = 255/(1 + e^(−Aq))^C
A and C being predetermined control parameters and q being a light quantity value expressed in base 2 logarithmic units; and
calculating a grey average level (avg) of the areas corresponding to the skin;
calculating a distance Δ of an ideal exposure situation using:

Δ=ƒ−1(128)−ƒ−1(avg); and
changing a luminance value Y(x, y) of a pixel with position (x, y) in:

Y′(x, y)=ƒ(ƒ−1(Y(x, y))+Δ).
21. The method of claim 14, further comprising a final color reconstruction step using the relations:
R′ = 0.5·((Y′/Y)·(R+Y) + R − Y)
G′ = 0.5·((Y′/Y)·(G+Y) + G − Y)
B′ = 0.5·((Y′/Y)·(B+Y) + B − Y)
R, G, B, and Y being respective red, green, blue, and luminance values of the image, Y′ being a desired luminance value, and R′, G′, and B′ being respective red, green, and blue values of the image after the final color reconstruction step.
22. The method of claim 14, further comprising a final color reconstruction step that changes a grey value I(x, y) of a pixel with position (x, y) using:

I′(x,y)=ƒ(ƒ−1(I(x,y))+Δ),
where Δ is a distance of an ideal exposure situation.
23. A digital image processor, comprising:
means for extracting chromatic information from an image of a human subject;
means for detecting, based on the extracted chromatic information, which areas of the image correspond to skin of the subject; and
means for normalizing grey scale values of the image based on the areas of the image that are detected as corresponding to the skin of the subject.
US10/848,815 2003-05-19 2004-05-18 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject Abandoned US20050018923A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/949,709 US7778483B2 (en) 2003-05-19 2007-12-03 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP03291155A EP1482724B1 (en) 2003-05-19 2003-05-19 Image processing method for digital images with exposure correction by recognition of skin areas of the subject.
EP03291155.4 2003-05-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/949,709 Division US7778483B2 (en) 2003-05-19 2007-12-03 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject

Publications (1)

Publication Number Publication Date
US20050018923A1 true US20050018923A1 (en) 2005-01-27

Family

ID=33104202

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/848,815 Abandoned US20050018923A1 (en) 2003-05-19 2004-05-18 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject
US11/949,709 Active 2025-05-11 US7778483B2 (en) 2003-05-19 2007-12-03 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/949,709 Active 2025-05-11 US7778483B2 (en) 2003-05-19 2007-12-03 Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject

Country Status (4)

Country Link
US (2) US20050018923A1 (en)
EP (1) EP1482724B1 (en)
JP (1) JP2004357277A (en)
DE (1) DE60314851D1 (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050270948A1 (en) * 2004-06-02 2005-12-08 Funai Electric Co., Ltd. DVD recorder and recording and reproducing device
US20060204034A1 (en) * 2003-06-26 2006-09-14 Eran Steinberg Modification of viewing parameters for digital images using face detection information
US20070110305A1 (en) * 2003-06-26 2007-05-17 Fotonation Vision Limited Digital Image Processing Using Face Detection and Skin Tone Information
US20080037838A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-Time Face Tracking in a Digital Image Acquisition Device
WO2008017343A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US20080043122A1 (en) * 2003-06-26 2008-02-21 Fotonation Vision Limited Perfecting the Effect of Flash within an Image Acquisition Devices Using Face Detection
US20080144946A1 (en) * 2006-12-19 2008-06-19 Stmicroelectronics S.R.L. Method of chromatic classification of pixels and method of adaptive enhancement of a color image
US20080143854A1 (en) * 2003-06-26 2008-06-19 Fotonation Vision Limited Perfecting the optics within a digital image acquisition device using face detection
US20080175481A1 (en) * 2007-01-18 2008-07-24 Stefan Petrescu Color Segmentation
US20080317378A1 (en) * 2006-02-14 2008-12-25 Fotonation Ireland Limited Digital image enhancement with reference images
US20080317357A1 (en) * 2003-08-05 2008-12-25 Fotonation Ireland Limited Method of gathering visual meta data using a reference image
US20080317379A1 (en) * 2007-06-21 2008-12-25 Fotonation Ireland Limited Digital image enhancement with reference images
US20080316328A1 (en) * 2005-12-27 2008-12-25 Fotonation Ireland Limited Foreground/background separation using reference images
US20090003652A1 (en) * 2006-08-11 2009-01-01 Fotonation Ireland Limited Real-time face tracking with reference images
US20090003708A1 (en) * 2003-06-26 2009-01-01 Fotonation Ireland Limited Modification of post-viewing parameters for digital images using image region or feature information
US20090052750A1 (en) * 2003-06-26 2009-02-26 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US20090080713A1 (en) * 2007-09-26 2009-03-26 Fotonation Vision Limited Face tracking in a camera processor
US20090141144A1 (en) * 2003-06-26 2009-06-04 Fotonation Vision Limited Digital Image Adjustable Compression and Resolution Using Face Detection Information
US20100026831A1 (en) * 2008-07-30 2010-02-04 Fotonation Ireland Limited Automatic face and skin beautification using face detection
US20100039525A1 (en) * 2003-06-26 2010-02-18 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US20100054533A1 (en) * 2003-06-26 2010-03-04 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US20100054549A1 (en) * 2003-06-26 2010-03-04 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US7844135B2 (en) 2003-06-26 2010-11-30 Tessera Technologies Ireland Limited Detecting orientation of digital images using face detection information
US20100321520A1 (en) * 2006-05-11 2010-12-23 Texas Instruments Incorporated Digital camera and method
US20110026780A1 (en) * 2006-08-11 2011-02-03 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US20110060836A1 (en) * 2005-06-17 2011-03-10 Tessera Technologies Ireland Limited Method for Establishing a Paired Connection Between Media Devices
US7912245B2 (en) 2003-06-26 2011-03-22 Tessera Technologies Ireland Limited Method of improving orientation and color balance of digital images using face detection information
US20110081052A1 (en) * 2009-10-02 2011-04-07 Fotonation Ireland Limited Face recognition performance using additional image features
US7953251B1 (en) 2004-10-28 2011-05-31 Tessera Technologies Ireland Limited Method and apparatus for detection and correction of flash-induced eye defects within digital images using preview or other reference images
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US20120105675A1 (en) * 2006-03-17 2012-05-03 Qualcomm Incorporated Systems, methods, and apparatus for exposure control
US20160148345A1 (en) * 2013-04-25 2016-05-26 Mediatek Inc. Methods of processing mosaicked images
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US20170214848A1 (en) * 2012-04-09 2017-07-27 Sony Corporation Image processing device and associated methodology for determining a main subject in an image

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4325552B2 (en) * 2004-12-24 2009-09-02 セイコーエプソン株式会社 Image processing apparatus, image processing method, and image processing program
US7986351B2 (en) 2005-01-27 2011-07-26 Qualcomm Incorporated Luma adaptation for digital image processing
US20080255840A1 (en) * 2007-04-16 2008-10-16 Microsoft Corporation Video Nametags
US8526632B2 (en) * 2007-06-28 2013-09-03 Microsoft Corporation Microphone array for a camera speakerphone
US8165416B2 (en) * 2007-06-29 2012-04-24 Microsoft Corporation Automatic gain and exposure control using region of interest detection
US8330787B2 (en) * 2007-06-29 2012-12-11 Microsoft Corporation Capture device movement compensation for speaker indexing
US8570359B2 (en) * 2008-08-04 2013-10-29 Microsoft Corporation Video region of interest features
US20100245610A1 (en) * 2009-03-31 2010-09-30 Electronics And Telecommunications Research Institute Method and apparatus for processing digital image
US20110044552A1 (en) * 2009-08-24 2011-02-24 Jonathan Yen System and method for enhancement of images in a selected region of interest of a captured image
US10169339B2 (en) 2011-10-31 2019-01-01 Elwha Llc Context-sensitive query enrichment
EP2602692A1 (en) * 2011-12-05 2013-06-12 Alcatel Lucent Method for recognizing gestures and gesture detector
US10552581B2 (en) 2011-12-30 2020-02-04 Elwha Llc Evidence-based healthcare information management protocols
US10475142B2 (en) 2011-12-30 2019-11-12 Elwha Llc Evidence-based healthcare information management protocols
US10559380B2 (en) 2011-12-30 2020-02-11 Elwha Llc Evidence-based healthcare information management protocols
US10340034B2 (en) 2011-12-30 2019-07-02 Elwha Llc Evidence-based healthcare information management protocols
US10402927B2 (en) 2011-12-30 2019-09-03 Elwha Llc Evidence-based healthcare information management protocols
US10679309B2 (en) 2011-12-30 2020-06-09 Elwha Llc Evidence-based healthcare information management protocols
US10528913B2 (en) 2011-12-30 2020-01-07 Elwha Llc Evidence-based healthcare information management protocols
US9659237B2 (en) 2012-10-05 2017-05-23 Micro Usa, Inc. Imaging through aerosol obscurants
DE102017000908A1 (en) * 2017-02-01 2018-09-13 Carl Zeiss Industrielle Messtechnik Gmbh Method for determining the exposure time for a 3D image
CN107491755B (en) * 2017-08-16 2021-04-27 BOE Technology Group Co., Ltd. Method and device for gesture recognition
US10951859B2 (en) 2018-05-30 2021-03-16 Microsoft Technology Licensing, Llc Videoconferencing device and method
EP3934751A1 (en) 2019-03-08 2022-01-12 Mevion Medical Systems, Inc. Collimator and energy degrader for a particle therapy system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5585860A (en) * 1994-04-15 1996-12-17 Matsushita Electric Industrial Co., Ltd. Reproduction circuit for skin color in video signals

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS52156624A (en) * 1976-06-22 1977-12-27 Fuji Photo Film Co Ltd Detection of skin color of color film
US5130935A (en) * 1986-03-31 1992-07-14 Canon Kabushiki Kaisha Color image processing apparatus for extracting image data having predetermined color information from among inputted image data and for correcting inputted image data in response to the extracted image data
US6249317B1 (en) * 1990-08-01 2001-06-19 Minolta Co., Ltd. Automatic exposure control apparatus
JP2844894B2 (en) 1990-10-15 1999-01-13 Minolta Co., Ltd. Automatic exposure control device
JP2878855B2 (en) * 1991-02-21 1999-04-05 Fuji Photo Film Co., Ltd. Image processing device
US5781276A (en) * 1992-07-27 1998-07-14 Agfa-Gevaert Ag Printing of color film
US5715377A (en) * 1994-07-21 1998-02-03 Matsushita Electric Industrial Co. Ltd. Gray level correction apparatus
US5528339A (en) * 1994-08-26 1996-06-18 Eastman Kodak Company Color image reproduction of scenes with color enhancement and preferential tone mapping
JP3208324B2 (en) * 1996-05-31 2001-09-10 Sanyo Electric Co., Ltd. Digital still camera
US7057653B1 (en) * 1997-06-19 2006-06-06 Minolta Co., Ltd. Apparatus capable of image capturing
US6292574B1 (en) * 1997-08-29 2001-09-18 Eastman Kodak Company Computer program product for redeye detection
US6738510B2 (en) * 2000-02-22 2004-05-18 Olympus Optical Co., Ltd. Image processing apparatus
US20020063899A1 (en) * 2000-11-29 2002-05-30 Tinku Acharya Imaging device connected to processor-based system using high-bandwidth bus
US7184080B2 (en) * 2001-06-25 2007-02-27 Texas Instruments Incorporated Automatic white balancing via illuminant scoring
US6845181B2 (en) * 2001-07-12 2005-01-18 Eastman Kodak Company Method for processing a digital image to adjust brightness
US7050636B2 (en) * 2001-12-07 2006-05-23 Eastman Kodak Company Method and system for improving an image characteristic based on image content
EP1326209B1 (en) 2001-12-24 2007-09-26 STMicroelectronics S.r.l. Method for contrast enhancement in colour digital image
US6975759B2 (en) * 2002-06-25 2005-12-13 Koninklijke Philips Electronics N.V. Method and system for white balancing images using facial color as a reference signal
US7609908B2 (en) * 2003-04-30 2009-10-27 Eastman Kodak Company Method for adjusting the brightness of a digital image utilizing belief values
US7616233B2 (en) * 2003-06-26 2009-11-10 Fotonation Vision Limited Perfecting of digital image capture parameters within acquisition devices using face detection
US7376270B2 (en) * 2003-12-29 2008-05-20 Canon Kabushiki Kaisha Detecting human faces and detecting red eyes
JP4396387B2 (en) * 2004-05-13 2010-01-13 Omron Corporation Image correction device
US7542600B2 (en) * 2004-10-21 2009-06-02 Microsoft Corporation Video image quality
JP4803178B2 (en) * 2005-06-14 2011-10-26 Nikon Corporation Image processing apparatus, computer program product, and image processing method
US20080158396A1 (en) * 2006-08-07 2008-07-03 Transchip, Inc. Image Signal Processor For CMOS Image Sensors

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5585860A (en) * 1994-04-15 1996-12-17 Matsushita Electric Industrial Co., Ltd. Reproduction circuit for skin color in video signals

Cited By (106)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7844076B2 (en) 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US20090102949A1 (en) * 2003-06-26 2009-04-23 Fotonation Vision Limited Perfecting the Effect of Flash within an Image Acquisition Devices using Face Detection
US20070110305A1 (en) * 2003-06-26 2007-05-17 Fotonation Vision Limited Digital Image Processing Using Face Detection and Skin Tone Information
US20070160307A1 (en) * 2003-06-26 2007-07-12 Fotonation Vision Limited Modification of Viewing Parameters for Digital Images Using Face Detection Information
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US9129381B2 (en) 2003-06-26 2015-09-08 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US8055090B2 (en) 2003-06-26 2011-11-08 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8989453B2 (en) 2003-06-26 2015-03-24 Fotonation Limited Digital image processing using face detection information
US20080043122A1 (en) * 2003-06-26 2008-02-21 Fotonation Vision Limited Perfecting the Effect of Flash within an Image Acquisition Devices Using Face Detection
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US20080143854A1 (en) * 2003-06-26 2008-06-19 Fotonation Vision Limited Perfecting the optics within a digital image acquisition device using face detection
US8126208B2 (en) 2003-06-26 2012-02-28 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8005265B2 (en) 2003-06-26 2011-08-23 Tessera Technologies Ireland Limited Digital image processing using face detection information
US8675991B2 (en) 2003-06-26 2014-03-18 DigitalOptics Corporation Europe Limited Modification of post-viewing parameters for digital images using region or feature information
US20060204034A1 (en) * 2003-06-26 2006-09-14 Eran Steinberg Modification of viewing parameters for digital images using face detection information
US8498452B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US7860274B2 (en) 2003-06-26 2010-12-28 Fotonation Vision Limited Digital image processing using face detection information
US7912245B2 (en) 2003-06-26 2011-03-22 Tessera Technologies Ireland Limited Method of improving orientation and color balance of digital images using face detection information
US7853043B2 (en) 2003-06-26 2010-12-14 Tessera Technologies Ireland Limited Digital image processing using face detection information
US20100039525A1 (en) * 2003-06-26 2010-02-18 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US8326066B2 (en) 2003-06-26 2012-12-04 DigitalOptics Corporation Europe Limited Digital image adjustable compression and resolution using face detection information
US20090003708A1 (en) * 2003-06-26 2009-01-01 Fotonation Ireland Limited Modification of post-viewing parameters for digital images using image region or feature information
US20090052750A1 (en) * 2003-06-26 2009-02-26 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US20090052749A1 (en) * 2003-06-26 2009-02-26 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US7848549B2 (en) 2003-06-26 2010-12-07 Fotonation Vision Limited Digital image processing using face detection information
US7844135B2 (en) 2003-06-26 2010-11-30 Tessera Technologies Ireland Limited Detecting orientation of digital images using face detection information
US20090141144A1 (en) * 2003-06-26 2009-06-04 Fotonation Vision Limited Digital Image Adjustable Compression and Resolution Using Face Detection Information
US8224108B2 (en) 2003-06-26 2012-07-17 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US20100271499A1 (en) * 2003-06-26 2010-10-28 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US9053545B2 (en) 2003-06-26 2015-06-09 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US20110075894A1 (en) * 2003-06-26 2011-03-31 Tessera Technologies Ireland Limited Digital Image Processing Using Face Detection Information
US20100054533A1 (en) * 2003-06-26 2010-03-04 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US20100054549A1 (en) * 2003-06-26 2010-03-04 Fotonation Vision Limited Digital Image Processing Using Face Detection Information
US8131016B2 (en) 2003-06-26 2012-03-06 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US7684630B2 (en) 2003-06-26 2010-03-23 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US7693311B2 (en) 2003-06-26 2010-04-06 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US20100092039A1 (en) * 2003-06-26 2010-04-15 Eran Steinberg Digital Image Processing Using Face Detection Information
US7702136B2 (en) 2003-06-26 2010-04-20 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US20100165140A1 (en) * 2003-06-26 2010-07-01 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US7809162B2 (en) 2003-06-26 2010-10-05 Fotonation Vision Limited Digital image processing using face detection information
US8330831B2 (en) 2003-08-05 2012-12-11 DigitalOptics Corporation Europe Limited Method of gathering visual meta data using a reference image
US20080317357A1 (en) * 2003-08-05 2008-12-25 Fotonation Ireland Limited Method of gathering visual meta data using a reference image
US20050270948A1 (en) * 2004-06-02 2005-12-08 Funai Electric Co., Ltd. DVD recorder and recording and reproducing device
US20110221936A1 (en) * 2004-10-28 2011-09-15 Tessera Technologies Ireland Limited Method and Apparatus for Detection and Correction of Multiple Image Defects Within Digital Images Using Preview or Other Reference Images
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US7953251B1 (en) 2004-10-28 2011-05-31 Tessera Technologies Ireland Limited Method and apparatus for detection and correction of flash-induced eye defects within digital images using preview or other reference images
US8135184B2 (en) 2004-10-28 2012-03-13 DigitalOptics Corporation Europe Limited Method and apparatus for detection and correction of multiple image defects within digital images using preview or other reference images
US20110060836A1 (en) * 2005-06-17 2011-03-10 Tessera Technologies Ireland Limited Method for Establishing a Paired Connection Between Media Devices
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US20080316328A1 (en) * 2005-12-27 2008-12-25 Fotonation Ireland Limited Foreground/background separation using reference images
US20080317378A1 (en) * 2006-02-14 2008-12-25 Fotonation Ireland Limited Digital image enhancement with reference images
US8682097B2 (en) 2006-02-14 2014-03-25 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US20120105675A1 (en) * 2006-03-17 2012-05-03 Qualcomm Incorporated Systems, methods, and apparatus for exposure control
US8824827B2 (en) * 2006-03-17 2014-09-02 Qualcomm Incorporated Systems, methods, and apparatus for exposure control
US20100321520A1 (en) * 2006-05-11 2010-12-23 Texas Instruments Incorporated Digital camera and method
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US7916897B2 (en) 2006-08-11 2011-03-29 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US7469055B2 (en) 2006-08-11 2008-12-23 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US20080037838A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-Time Face Tracking in a Digital Image Acquisition Device
US8050465B2 (en) 2006-08-11 2011-11-01 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US20110026780A1 (en) * 2006-08-11 2011-02-03 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
WO2008017343A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8055029B2 (en) 2006-08-11 2011-11-08 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US7864990B2 (en) 2006-08-11 2011-01-04 Tessera Technologies Ireland Limited Real-time face tracking in a digital image acquisition device
US20100060727A1 (en) * 2006-08-11 2010-03-11 Eran Steinberg Real-time face tracking with reference images
US20080037839A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-Time Face Tracking in a Digital Image Acquisition Device
US20080037840A1 (en) * 2006-08-11 2008-02-14 Fotonation Vision Limited Real-Time Face Tracking in a Digital Image Acquisition Device
US7403643B2 (en) 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8744145B2 (en) 2006-08-11 2014-06-03 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US20090208056A1 (en) * 2006-08-11 2009-08-20 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8270674B2 (en) 2006-08-11 2012-09-18 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US7460694B2 (en) 2006-08-11 2008-12-02 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US20090003652A1 (en) * 2006-08-11 2009-01-01 Fotonation Ireland Limited Real-time face tracking with reference images
US8666125B2 (en) 2006-08-11 2014-03-04 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US8666124B2 (en) 2006-08-11 2014-03-04 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US7460695B2 (en) 2006-08-11 2008-12-02 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US8509498B2 (en) 2006-08-11 2013-08-13 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US8509496B2 (en) 2006-08-11 2013-08-13 DigitalOptics Corporation Europe Limited Real-time face tracking with reference images
US8385610B2 (en) 2006-08-11 2013-02-26 DigitalOptics Corporation Europe Limited Face tracking for controlling imaging parameters
US8422739B2 (en) 2006-08-11 2013-04-16 DigitalOptics Corporation Europe Limited Real-time face tracking in a digital image acquisition device
US20110129121A1 (en) * 2006-08-11 2011-06-02 Tessera Technologies Ireland Limited Real-time face tracking in a digital image acquisition device
US8374425B2 (en) * 2006-12-19 2013-02-12 Stmicroelectronics, S.R.L. Method of chromatic classification of pixels and method of adaptive enhancement of a color image
US20130136352A1 (en) * 2006-12-19 2013-05-30 Stmicroelectronics S.R.L. Method of chromatic classification of pixels and method of adaptive enhancement of a color image
US8811733B2 (en) * 2006-12-19 2014-08-19 Stmicroelectronics S.R.L. Method of chromatic classification of pixels and method of adaptive enhancement of a color image
US20080144946A1 (en) * 2006-12-19 2008-06-19 Stmicroelectronics S.R.L. Method of chromatic classification of pixels and method of adaptive enhancement of a color image
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US20080175481A1 (en) * 2007-01-18 2008-07-24 Stefan Petrescu Color Segmentation
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US10733472B2 (en) 2007-06-21 2020-08-04 Fotonation Limited Image capture device with contemporaneous image correction mechanism
US9767539B2 (en) 2007-06-21 2017-09-19 Fotonation Limited Image capture device with contemporaneous image correction mechanism
US20080317379A1 (en) * 2007-06-21 2008-12-25 Fotonation Ireland Limited Digital image enhancement with reference images
US8213737B2 (en) 2007-06-21 2012-07-03 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US20090080713A1 (en) * 2007-09-26 2009-03-26 Fotonation Vision Limited Face tracking in a camera processor
US8155397B2 (en) 2007-09-26 2012-04-10 DigitalOptics Corporation Europe Limited Face tracking in a camera processor
US20100026831A1 (en) * 2008-07-30 2010-02-04 Fotonation Ireland Limited Automatic face and skin beautification using face detection
US9007480B2 (en) 2008-07-30 2015-04-14 Fotonation Limited Automatic face and skin beautification using face detection
US20100026832A1 (en) * 2008-07-30 2010-02-04 Mihai Ciuc Automatic face and skin beautification using face detection
US8345114B2 (en) 2008-07-30 2013-01-01 DigitalOptics Corporation Europe Limited Automatic face and skin beautification using face detection
US8384793B2 (en) 2008-07-30 2013-02-26 DigitalOptics Corporation Europe Limited Automatic face and skin beautification using face detection
US20110081052A1 (en) * 2009-10-02 2011-04-07 Fotonation Ireland Limited Face recognition performance using additional image features
US8379917B2 (en) 2009-10-02 2013-02-19 DigitalOptics Corporation Europe Limited Face recognition performance using additional image features
US20170214848A1 (en) * 2012-04-09 2017-07-27 Sony Corporation Image processing device and associated methodology for determining a main subject in an image
US10848662B2 (en) * 2012-04-09 2020-11-24 Sony Corporation Image processing device and associated methodology for determining a main subject in an image
US20160148345A1 (en) * 2013-04-25 2016-05-26 Mediatek Inc. Methods of processing mosaicked images
US9818172B2 (en) * 2013-04-25 2017-11-14 Mediatek Inc. Methods of processing mosaicked images

Also Published As

Publication number Publication date
EP1482724A1 (en) 2004-12-01
US20080089583A1 (en) 2008-04-17
JP2004357277A (en) 2004-12-16
US7778483B2 (en) 2010-08-17
DE60314851D1 (en) 2007-08-23
EP1482724B1 (en) 2007-07-11

Similar Documents

Publication Publication Date Title
US7778483B2 (en) Digital image processing method having an exposure correction based on recognition of areas corresponding to the skin of the photographed subject
Battiato et al. Exposure correction for imaging devices: an overview
US7542600B2 (en) Video image quality
US8811733B2 (en) Method of chromatic classification of pixels and method of adaptive enhancement of a color image
EP1800259B1 (en) Image segmentation method and system
US7639940B2 (en) Photography apparatus, photography method, and photography program using lighting condition parameters
US20110268359A1 (en) Foreground/Background Segmentation in Digital Images
US7386181B2 (en) Image display apparatus
US20040190789A1 (en) Automatic analysis and adjustment of digital images with exposure problems
US20060203311A1 (en) Automatic white balance method adaptive to digital color images
US7907786B2 (en) Red-eye detection and correction
WO2005114577A1 (en) Method for determining image quality
AU2015201623A1 (en) Choosing optimal images with preference distributions
Battiato et al. Automatic image enhancement by content dependent exposure correction
US20130004070A1 (en) Skin Color Detection And Adjustment In An Image
US7305124B2 (en) Method for adjusting image acquisition parameters to optimize object extraction
US7283667B2 (en) Image processing unit, image processing method, and image processing program
Messina et al. Image quality improvement by adaptive exposure correction techniques
JP5327766B2 (en) Memory color correction in digital images
Zini et al. Shallow camera pipeline for night photography rendering
Battiato et al. Automatic global image enhancement by skin dependent exposure correction
US7362910B2 (en) Color image characterization, enhancement and balancing process
Ferradans et al. A multi-modal approach to perceptual tone mapping

Legal Events

Date Code Title Description
AS Assignment

Owner name: STMICROELECTRONICS S.R.L., ITALY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STMICROELECTRONICS S.R.L.;REEL/FRAME:015924/0310

Effective date: 20041115

Owner name: STMICROELECTRONICS SA, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STMICROELECTRONICS S.R.L.;REEL/FRAME:015924/0310

Effective date: 20041115

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: STMICROELECTRONICS S.R.L., ITALY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MESSINA, GIUSEPPE;BATTIATO, SEBASTIANO;CASTORINA, ALFIO;AND OTHERS;REEL/FRAME:028710/0574

Effective date: 20040907