WO2001046753A1 - Scene recognition method and system using brightness and ranging mapping - Google Patents

Scene recognition method and system using brightness and ranging mapping

Info

Publication number
WO2001046753A1
WO2001046753A1 (application PCT/US2000/034878); also published as WO0146753A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
brightness
range
sensing
regions
Prior art date
Application number
PCT/US2000/034878
Other languages
French (fr)
Inventor
George D. Whiteside
Original Assignee
Polaroid Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=22629595&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO2001046753(A1) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
Application filed by Polaroid Corporation
Publication of WO2001046753A1

Links

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091 Digital circuits
    • G03B7/097 Digital circuits for control of both exposure time and aperture
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/16 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly in accordance with both the intensity of the flash source and the distance of the flash source from the object, e.g. in accordance with the "guide number" of the flash bulb and the focusing of the camera


Abstract

A camera (101) includes a control unit (117) for controlling exposure of an image and at least one sensor assembly (105) for sensing image data, including scene brightness, for a first set and a second set of regions. The camera (101) further includes a filter element (107) and an electronic strobe unit (103) for illuminating the scene.

Description

SCENE RECOGNITION METHOD AND SYSTEM USING BRIGHTNESS AND RANGING MAPPING
CROSS-REFERENCE TO RELATED APPLICATION
The present application claims the benefit of copending U.S. provisional patent application, Serial No. 60/172,883 filed in the U.S. Patent and Trademark Office on December 20, 1999.
BACKGROUND OF THE INVENTION
The present invention pertains to an automated exposure control unit and method for image recording devices and is, particularly, adapted for use in conjunction with cameras employing strobes for exposing photographic film of the self-developing type.
Capturing a properly exposed scene on film is often difficult, and a wide variety of techniques have been put forth in the prior art to achieve that end in an economical manner. Proper exposure is especially difficult when a photographer is confronted with a wide variety of scene lighting and subject conditions. Correct exposure is even more difficult when the exposure employs an electronic strobe for illuminating the subject and scene. The foregoing difficulties are compounded when exposing self-developing film in low-cost mass-market cameras, wherein conventional techniques of improving film picture quality during film processing cannot be utilized and sophisticated electronic systems do not make economic sense.
Despite the problems associated with obtaining correct exposures, there are several known successful techniques for dealing with the above issues by providing enhanced exposure for a variety of scene lighting and subject conditions. For instance, commonly-assigned U.S. Patent No. 4,192,587 describes a proportional fill flash system for varying the time at which a source of artificial illumination is energized relative to an exposure interval by determining the range of a subject and thereafter utilizing the ranging information to vary the time at which the flash is fired so that it fires at an aperture corresponding to subject range. Commonly assigned U.S. Patent No. 4,255,030 describes a proportional fill flash system that quenches a quench strobe at an appropriate time, without determining subject range, based on the integration of both ambient and flash light. Commonly assigned U.S. Patent No. 4,285,584 describes a proportional fill flash system utilizing a photometer having three zones in its overall field of view for distinguishing between background, central, and foreground portions of the scene. The outputs of the different segments of the scene are compared to increase or decrease the exposure depending on whether the scene is more heavily illuminated in the background or foreground. Commonly assigned U.S. Patent
No. 4,423,936 describes an exposure control unit utilizing a photometer having a multi-sensor array that detects both subject range and ambient light intensity. A comparison of range measurements identifies the scene area having the nearest object and relates it to the ambient light intensity measurements of subject and non-subject areas in order to classify scene lighting conditions and then select a corresponding program which controls the operation of the system to vary the ratio of ambient to artificial light contributions to exposure.
Despite the success of the above approaches in addressing the foregoing concerns, particularly in regard to self-developing instant cameras, there is nevertheless a continuing desire to improve upon the efficiency and costs of obtaining proper exposure, not only with cameras of the foregoing kind but also with digital cameras. In regard to digital cameras, it is highly desirable to effect high quality printing utilizing low cost methodologies. For instance, in a camera with a digital capture mode, the use of a multi-sensor array to minimize the problem adds to the overall cost of the system. Moreover, it is highly desirable to achieve the foregoing in a variety of scene lighting conditions wherein artificial illumination makes a contribution.
SUMMARY OF THE INVENTION
An object of the invention is to control the photographic exposure automatically such that the subject and non-subject areas of a scene are correctly exposed in an economical and efficient manner. It is another object of the invention to provide a digital capture system which can utilize the material already in the system.
To achieve this and other objects, the invention is directed to a method of controlling exposure to a photosensitive element by controlling illumination of the scene through a comparison of brightness and range mapping. The method comprises: (a) sensing image data from the scene including scene brightness from a first set of a plurality of regions in the scene; (b) forming a brightness map of the scene in accordance with the brightness data corresponding to the first set of regions; (c) sensing range data from a second set of regions in the scene; (d) forming a range map to determine a subject in the scene; (e) comparing the range map with the scene brightness map for determining a relationship between scene brightness and the subject brightness; and (f) controlling the exposure by controlling artificial illumination upon the scene, whereby a relationship of ambient and artificial illumination is generally maintained. In another embodiment, provision is made
The invention is further directed to a system for controlling exposure by controlling the relationship between ambient and artificial illumination during exposure. The system comprises a sensor for sensing image data including scene brightness from a first set of a plurality of regions in a scene; a sensor for sensing range data from a second set of regions in the scene; and processing means for (i) defining a brightness map of the scene in accordance with the brightness data corresponding to the first set of regions, (ii) defining a range map of the scene in accordance with the second set of regions to determine a subject in the scene, (iii) comparing the range map with the scene brightness map for determining a relationship between scene brightness and the subject range, and (iv) controlling the exposure during an exposure interval by controlling artificial illumination upon the scene, whereby a relationship of ambient and artificial illumination is generally maintained.
As a result of the foregoing system and method, when the ambient brightness readings of both the scene background and scene subject are measured, the ambient exposure of the background is controlled by the ambient background readings, and the exposure of the subject is controlled by comparing the subject brightness to the scene brightness and by controlling the amount of artificial illumination directed at the subject by the source of artificial illumination to make up the difference in brightness values, in order to give correct exposure for both the scene background and the subject. Accordingly, both the scene subject and scene background are well-exposed.
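The make-up relationship described above can be pictured with a brief sketch. The patent describes no software, so the Python below, the function name plan_fill_flash, and the normalized brightness scale are illustrative assumptions only: ambient exposure follows the background reading, and the strobe is asked to supply just the shortfall between background and subject brightness.

```python
def plan_fill_flash(background_brightness, subject_brightness, max_flash_contribution=1.0):
    """Hypothetical sketch of the fill-flash balance described above.

    Brightness values are normalized relative exposures (0..1); the real
    device would work in photometric units via its exposure tables.
    """
    # Ambient exposure is driven by the background reading alone.
    ambient_exposure = background_brightness

    # The strobe only makes up the difference when the subject is dimmer
    # than the background (backlit scene); a frontlit subject needs no flash.
    shortfall = max(0.0, background_brightness - subject_brightness)
    flash_contribution = min(shortfall, max_flash_contribution)

    return ambient_exposure, flash_contribution


if __name__ == "__main__":
    # Backlit subject (cf. Fig. 8): flash fills in the difference.
    print(plan_fill_flash(background_brightness=0.8, subject_brightness=0.3))
    # Frontlit subject (cf. Fig. 9): no flash needed.
    print(plan_fill_flash(background_brightness=0.3, subject_brightness=0.8))
```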
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be set forth in detail with reference to the drawings, in which:
Fig. 1 shows a block diagram of a camera according to the present invention;
Fig. 2 shows a block diagram of operation of the camera of Fig. 1;
Fig. 3 shows a scene of which a photograph is to be taken;
Fig. 4 shows a plurality of regions used in range mapping;
Fig. 5 shows the scene of Fig. 3 superimposed on the plurality of regions of Fig. 4;
Fig. 6 shows a ranging map;
Fig. 7A shows a macroscopic view of a plurality of regions used in brightness mapping;
Fig. 7B shows an enlarged portion of the macroscopic view of Fig. 7A and shows discrete regions;
Fig. 8 shows the scene in which the subject is illuminated more dimly than the background;
Fig. 9 shows a scene in which the subject is illuminated more brightly than the background;
Fig. 10 is an elevation view of another embodiment of an exposure control device with, however, portions removed for clarity in illustrating its construction and one mode of operation;
Fig. 11 is a view similar to Fig. 10, but illustrating the device in yet another mode of operation; and,
Fig. 12 is a cross-sectional view illustrating the system of Fig. 10.
DETAILED DESCRIPTION
Fig. 1 shows a camera 101 according to the present invention. The camera 101 includes a quenchable electronic strobe unit or other suitable strobe unit 103, and a photoresponsive area sensor 105, such as a CCD. A filter element 107, such as a filter sold under the model number CM 500, is disposed in front of the CCD 105 and is moved into one of two positions by motor 111 such that light directed toward the CCD 105 is intercepted by either a visible-passing filter 109V or an infrared-passing filter 109IR. Alternatively, filter element 107 can be disposed in front of the strobe 103. An auto-focus device 113, such as any auto-focus device known in the art, moves a lens 115 into the appropriate focus position. The components are connected to a microprocessor 117 or other suitable control device to control the operations of the camera.
The camera of Fig. 1 operates as shown in the flow chart of Fig. 2. The operation starts in step 201. An IR wink or strobe pulse is emitted in step 203 by the strobe 103 (or another suitable IR source which may be separately provided) to allow an auto-focus operation in step 205 by the autofocus device 113. It will be appreciated that an auto-focus operation need not be performed; instead, a preset lens can be used which, if desired, could be set manually. The CCD is set to high resolution in step 207 so that a brightness map of the scene (subject and background) to be photographed can be formed in step 209. In the exemplary embodiment, the visible-passing filter 109V is located in front of the CCD 105 so that the brightness map is formed with visible light. In step 211, the CCD is set to low resolution, and another IR wink or strobe pulse is emitted in step 213 so that a ranging map of the scene can be formed in step 215. In the exemplary embodiment, the infrared-passing filter 109IR is located in front of the CCD 105. The brightness and ranging maps are compared in step 217, and the flash is set in step 219 to provide the appropriate level of backfill for the subject and the background. With this flash setting, the picture is taken in step 221, and the operation ends in step 223.
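This flow can be summarized in a short sketch. The camera object and every method name below (emit_ir_wink, select_filter, and so on) are hypothetical conveniences invented for illustration; only the step numbers and the order of operations come from the flow chart of Fig. 2.

```python
def capture_sequence(camera):
    """Illustrative walk-through of the Fig. 2 flow (steps 203-223).

    `camera` is a hypothetical object exposing the operations named in the
    text; none of these method names come from the patent itself.
    """
    camera.emit_ir_wink()                    # step 203: IR wink/strobe pulse
    camera.autofocus()                       # step 205: focus lens 115

    camera.set_resolution("high")            # step 207
    camera.select_filter("visible")          # filter 109V in front of CCD 105
    brightness_map = camera.read_brightness_map()    # step 209

    camera.set_resolution("low")             # step 211
    camera.select_filter("infrared")         # filter 109IR in front of CCD 105
    camera.emit_ir_wink()                    # step 213
    range_map = camera.read_range_map()      # step 215

    flash_setting = camera.compare_maps(brightness_map, range_map)  # steps 217-219
    return camera.take_picture(flash_setting)                       # step 221
```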
The operation of forming the ranging map in step 215 will be explained with reference to Figs. 3-6. Fig. 3 shows a scene 301 that includes a subject 303 and a background 305. Fig. 4 shows a view of CCD 105 in low resolution divided into sixteen exemplary regions R1-R16, although, as noted above, the number of regions capable of being formed by the CCD in low resolution typically ranges from ten to 500. Fig. 5 shows the same scene as in Fig. 3 except that it is divided into the sixteen regions corresponding to regions R1-R16 of Fig. 4 and thus shows how CCD 105 in low resolution divides scene 301 into the regions. In each of the regions R1-R16 of Fig. 5, the range is determined, and a near or far determination is made for each of the regions. As a result, the ranging map 601 of Fig. 6 is obtained. In this ranging map 601, regions R1-R5, R8, R9, R12, R13, and R16, corresponding mostly to the background 305 of the scene 301, are determined to be far regions, while regions R6, R7, R10, R11, R14, and R15, corresponding mostly to the subject 303, are determined to be near regions. Of course, it is not necessary to use a binary near/far distinction; instead, varying distance ranges could be used.
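A minimal sketch of the near/far classification over the sixteen regions follows. The patent itself gives no code; the dictionary layout, the 2.5 m threshold, and the example distances are assumptions chosen only to mirror the ranging map 601 of Fig. 6.

```python
def build_range_map(region_ranges, near_threshold_m=2.5):
    """Classify each region of the low-resolution ranging readout as
    'near' (subject) or 'far' (background).

    `region_ranges` maps a region label (e.g. 'R6') to an estimated
    distance in meters; the 2.5 m threshold is an arbitrary example,
    not a value taken from the patent.
    """
    return {region: ("near" if distance < near_threshold_m else "far")
            for region, distance in region_ranges.items()}


# Example roughly matching the ranging map 601 of Fig. 6:
ranges = {f"R{i}": 1.5 for i in (6, 7, 10, 11, 14, 15)}                   # subject 303
ranges.update({f"R{i}": 6.0 for i in (1, 2, 3, 4, 5, 8, 9, 12, 13, 16)})  # background 305
print(build_range_map(ranges))
```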
While the exemplary embodiment forms the ranging map with the CCD in low resolution, it forms the brightness map with the CCD in high resolution, which will be explained with reference to Figs. 7A and 7B. In high resolution, each pixel discernible by the CCD can be a separate region; alternatively, a resolution intermediate between the low resolution and the maximum resolution of the CCD can be used. If the CCD is capable of VGA resolution (640 pixels across by 480 pixels down), the regions range from R(0,0) in the upper left corner to R(639,0) in the upper right corner to R(0,479) in the lower left corner to R(639,479) in the lower right corner, as indicated in Fig. 7A. The portion of the CCD framed in dotted lines in Fig. 7A is shown enlarged in Fig. 7B, which shows discrete regions ranging from R(631,200) to R(631,211) to R(639,200) to R(639,211).
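Because the brightness map is formed at a finer resolution than the sixteen-region ranging map, the two must be brought onto a common grid before they can be compared. The sketch below simply averages the per-pixel brightness values into a coarse 4x4 grid; the use of NumPy, the 4x4 layout, and the region naming are illustrative assumptions rather than anything prescribed by the patent.

```python
import numpy as np

def brightness_per_region(brightness_vga, rows=4, cols=4):
    """Average a high-resolution brightness map (e.g. 480x640 for VGA)
    down to the coarse grid used by the ranging map.

    The 4x4 layout mirrors the sixteen regions R1-R16 of Fig. 4; the
    patent does not prescribe this particular aggregation.
    """
    h, w = brightness_vga.shape
    region_means = {}
    for r in range(rows):
        for c in range(cols):
            block = brightness_vga[r * h // rows:(r + 1) * h // rows,
                                   c * w // cols:(c + 1) * w // cols]
            region_means[f"R{r * cols + c + 1}"] = float(block.mean())
    return region_means


# Example with a synthetic 480x640 brightness map (pixel values 0..255).
demo = brightness_per_region(np.random.randint(0, 256, size=(480, 640)))
print(demo["R1"], demo["R16"])
```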
The operation of comparison in step 217 will now be explained with reference to Figs. 6, 8, and 9. As explained above, Fig. 8 shows a situation in which the subject is illuminated more dimly than the background, while Fig. 9 shows the opposite situation. The brightness levels in the brightness map obtained in step 209 are compared with ranging map 601. If the near regions are darker than the far regions, the situation in Fig. 8 is recognized, while if the opposite is true, the situation in Fig. 9 is recognized. The flash is set accordingly to contribute to the exposure. For instance, in some circumstances, such as when the subject is significantly frontlit, the flash need not be fired.
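One way to read this comparison is sketched below: the average brightness of the near regions is weighed against that of the far regions, and a fill-flash contribution is suggested only when the subject is the darker of the two. The labels, the tolerance, and the simple difference rule are illustrative assumptions, not values or logic taken from the patent.

```python
def compare_maps(brightness_by_region, range_map, tolerance=0.05):
    """Classify the scene as in Figs. 8/9 and suggest a flash contribution.

    `brightness_by_region` holds normalized brightness per region and
    `range_map` holds 'near'/'far' per region (same keys). The tolerance
    and the returned flash fraction are example values, not patent values.
    """
    near = [b for r, b in brightness_by_region.items() if range_map[r] == "near"]
    far = [b for r, b in brightness_by_region.items() if range_map[r] == "far"]
    if not near or not far:
        return "unknown", 0.0

    near_mean = sum(near) / len(near)
    far_mean = sum(far) / len(far)

    if near_mean < far_mean - tolerance:
        # Fig. 8: subject dimmer than background -> fill flash makes up the gap.
        return "backlit", min(1.0, far_mean - near_mean)
    if near_mean > far_mean + tolerance:
        # Fig. 9: subject brighter than background -> little or no flash.
        return "frontlit", 0.0
    return "balanced", 0.0
```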
Figs. 10-12 illustrate portions of an exposure control unit 400. The exposure control unit 400 is similar to that described in commonly assigned U.S. Patent Application Serial No. 09/133,661, filed August 2, 1998. Hence, only those portions of the exposure control unit 400 which are necessary to understand the present invention will be described, since a detailed description thereof is incorporated herein and made a part hereof. The unit 400 includes an aperture/lens disc 420 that is rotatably mounted in the housing assembly 402. The aperture/lens disc 420 is indexed under the control of a stepper motor 466 by means of a spur gear 430 that meshes with a gear 468. A set of different sized apertures 422, 424 is selectively rotated into and out of coincidence with the CCD image sensor 442. Although not shown, the apertures 422, 424 can be provided with lenses of a suitable variety such as close-up, wide angle, or telephoto.
A filter assembly 460 is rotationally mounted on a shaft 461 within the housing 402 before the lens assembly 410 and is biased by a torsion spring 470 to rest against a segment 412 of the housing assembly 402, whereby a visible light pass filter element 462 is positioned to be in overlying relationship to an aperture 424. During a pre-exposure to obtain the scene brightness mapping, the image sensor 442 is enabled and the visible pass filter element 462 allows visible light from the scene to be passed to the sensor, whereby scene brightness measurements for each image sensing region of the sensor can be achieved. These image sensing regions of the sensor, of course, correspond to scene portions that are to be sensed for establishing the scene brightness map. The signals from the sensor are forwarded to a system controller (not shown).
To effect a range determining function while still in the pre-exposure mode, the aperture/lens disc 420 is rotated in a counterclockwise direction, whereby a tab 423 on the disc drives the filter assembly 460 against the bias of the spring so that an infrared pass filter element 464 is placed in overlying relationship to the CCD image sensor 442, while the aperture 424 is now in overlying relationship to the image sensor. In this step, the image sensor 442 can be operated in a low resolution mode for determining a range map distinguishing subject areas relative to the nonsubject areas.
The flash is operated to illuminate the scene, and the resulting reflected scene brightness will pass through the IR pass filter 464 to the sensor 442, whereby range information for each sensor region can be determined consistent with the wink IR ranging technique described in commonly-assigned U.S. Patent No. 4,785,322. Also, the present invention envisions the use of differently configured image acquisition modules with a variety of techniques for presenting an IR filter over such a module, such as by moving an IR filter in front of the acquisition module by means of a stepper motor or solenoid.
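The wink ranging idea, reading distance from the strength of the reflected IR pulse, can be reduced to a toy model for intuition: assuming roughly inverse-square fall-off of the returned signal and uniform subject reflectance, a region's range scales with the reciprocal square root of its IR reading. The function, its calibration constant, and the fall-off assumption are illustrative only; the actual technique is that of U.S. Patent No. 4,785,322 and is not reproduced here.

```python
import math

def estimate_range(ir_reading, calibration=4.0):
    """Toy wink-ranging estimate for one sensor region.

    Assumes the returned IR signal falls off roughly with the square of
    distance and that reflectance is uniform, so range ~ k / sqrt(signal).
    `calibration` is an arbitrary constant for illustration only.
    """
    if ir_reading <= 0:
        return float("inf")          # no return: treat the region as far
    return calibration / math.sqrt(ir_reading)


# A strong IR return suggests a near (subject) region, a weak one a far region.
print(estimate_range(ir_reading=16.0))   # ~1.0 (near)
print(estimate_range(ir_reading=0.25))   # ~8.0 (far)
```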
A system controller (not shown) includes a logic circuit board with a micro-controller circuit that receives electrical signals from the various camera elements and, in turn, controls operation of the stepper motor and the CCD, as well as the strobe and camera shutter mechanism. The logic circuit board includes a microprocessor that is operable for decoding signals from, for instance, the sensor for the scene brightness and range determining steps during different modes of operation. The logic circuit includes a conventional, electronically erasable memory section which includes appropriate numbers of look-up tables, each of which employs combinations of the exposure parameters of subject range, flash mode selection, and pre-exposure scene brightness information to define the desired memory cell address. The present invention contemplates establishing the scene brightness and ranging maps, as well as comparing the maps to provide a relationship between the two that controls the strobe firing intervals for each and every combination of the scene brightness and ranging maps. The logic circuit will control when the strobe will be fired and quenched during an exposure cycle so that a desired fill flash ratio between ambient and flash is maintained despite the wide variances in scene lighting and subject range that exist.
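The look-up-table arrangement can be pictured as a small indexing sketch: quantized subject range, flash mode, and pre-exposure scene brightness together form the address of a cell holding the strobe fire and quench timings. The bin names, table contents, and timing values below are invented placeholders, not data from the patent.

```python
# Hypothetical quantization of the three exposure parameters.
RANGE_BINS = ["near", "mid", "far"]
FLASH_MODES = ["auto", "fill", "off"]
BRIGHTNESS_BINS = ["dim", "normal", "bright"]

# One entry per (range, mode, brightness) combination; the (fire_ms, quench_ms)
# pairs are placeholders, not values from the patent's memory section.
LOOKUP = {
    (r, m, b): (5 + 5 * RANGE_BINS.index(r), 1 + BRIGHTNESS_BINS.index(b))
    for r in RANGE_BINS for m in FLASH_MODES for b in BRIGHTNESS_BINS
}

def strobe_timing(range_bin, flash_mode, brightness_bin):
    """Return the (fire, quench) timing stored for one parameter combination."""
    return LOOKUP[(range_bin, flash_mode, brightness_bin)]


print(strobe_timing("near", "fill", "dim"))   # e.g. (5, 1)
```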
Although illustrative embodiments of the invention have been set forth, those skilled in the art will recognize that other embodiments can be realized within the scope of the invention. For example, the image sensing CCD can be operated in only one resolution, so that the ranging and brightness information can be taken simultaneously. In other words, a single set of data can be used to determine brightness and ranging. Also, the system and method described above do not have to be implemented in a camera, but find utility in any art in which correct illumination of subjects under a variety of lighting conditions is an issue.

Claims

WHAT IS CLAIMED IS:
1. A method of controlling exposure of a scene image comprising the steps of: (a) sensing a scene for image data including scene brightness data from at least a first set of a plurality of regions of the scene including a subject region; (b) deriving values representative of a brightness map of the scene in accordance with scene brightness data values corresponding to each of the first set of regions; (c) sensing the scene for image data including range data from at least a second set of regions in the scene; (d) deriving values representative of a range map in accordance with the range data for each of the second set of regions to determine a subject in the scene; (e) comparing the range map with the scene brightness map for determining a relationship between scene brightness and the subject brightness; and (f) controlling the exposure by controlling artificial illumination upon the scene, whereby a relationship of ambient and artificial illumination is generally obtained based on the relationship between scene brightness and the subject brightness.
2. The method of claim 1 wherein said step of controlling the artificial illumination controls firing intervals of a strobe for each and every combination of the scene brightness and ranging maps.
3. The method of claim 1 wherein one of the brightness sensing or range sensing is performed in a first sensor resolution mode and the other of the range sensing or brightness sensing is performed in a second sensor resolution mode.
4. The method of claim 1 wherein the brightness and range sensing are performed in one resolution mode.
5. The method of claim 1 wherein said range and brightness sensing steps are performed generally simultaneously.
6. A system of controlling exposure of a scene image, the system comprising: at least one sensor assembly for sensing image data including scene brightness from a first set of a plurality of regions in a scene, said sensor assembly being operable for sensing range data from a second set of generally independent regions in the scene; a source of artificial illumination; and processing means for (i) defining a brightness map of the scene in accordance with the brightness data corresponding to each of the regions in the first set of regions, (ii) defining a range map of the scene in accordance with the second set of regions to determine a subject in the scene, (iii) comparing the range map with the scene brightness map for determining a relationship between scene brightness and the subject range, and (iv) controlling the exposure by controlling strobe artificial illumination upon the scene, whereby a relationship of ambient and artificial illumination is generally obtained based on the relationship between scene brightness and the subject brightness.
7. The system of claim 6 wherein said sensor assembly includes a sensor for sensing the scene brightness and for sensing the range.
8. The system of claim 7 wherein said sensor senses one of the scene brightness and the scene ranging in a first resolution mode and said sensor senses the other of the subject range and scene brightness in a second resolution mode.
9. The system of claim 7 wherein said sensor is operable for sensing the scene brightness and ranging in a single resolution mode.
10. The system of claim 6 wherein said sensor assembly includes an infrared pass filter assembly that is operable in one condition to allow ambient and artificial illumination from a scene to impinge on said sensor, and in another condition allows infrared from the scene to impinge on said sensor.
11. The system of claim 6 wherein said processing means is operable for controlling the artificial illumination by controlling a strobe for each and every combination of the scene brightness and ranging maps.
PCT/US2000/034878 1999-12-20 2000-12-20 Scene recognition method and system using brightness and ranging mapping WO2001046753A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17288399P 1999-12-20 1999-12-20
US60/172,883 1999-12-20

Publications (1)

Publication Number Publication Date
WO2001046753A1 true WO2001046753A1 (en) 2001-06-28

Family

ID=22629595

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/034878 WO2001046753A1 (en) 1999-12-20 2000-12-20 Scene recognition method and system using brightness and ranging mapping

Country Status (2)

Country Link
US (1) US6516147B2 (en)
WO (1) WO2001046753A1 (en)

Families Citing this family (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7738015B2 (en) 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US7042505B1 (en) 1997-10-09 2006-05-09 Fotonation Ireland Ltd. Red-eye filter method and apparatus
US7630006B2 (en) 1997-10-09 2009-12-08 Fotonation Ireland Limited Detecting red eye filter and apparatus using meta-data
US8155397B2 (en) 2007-09-26 2012-04-10 DigitalOptics Corporation Europe Limited Face tracking in a camera processor
US9129381B2 (en) 2003-06-26 2015-09-08 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7269292B2 (en) 2003-06-26 2007-09-11 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US8363951B2 (en) 2007-03-05 2013-01-29 DigitalOptics Corporation Europe Limited Face recognition training method and apparatus
US7317815B2 (en) * 2003-06-26 2008-01-08 Fotonation Vision Limited Digital image processing composition using face detection information
US8494286B2 (en) 2008-02-05 2013-07-23 DigitalOptics Corporation Europe Limited Face detection in mid-shot digital images
US8330831B2 (en) 2003-08-05 2012-12-11 DigitalOptics Corporation Europe Limited Method of gathering visual meta data using a reference image
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8498452B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8254674B2 (en) 2004-10-28 2012-08-28 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US7844076B2 (en) 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US8553949B2 (en) 2004-01-22 2013-10-08 DigitalOptics Corporation Europe Limited Classification and organization of consumer digital images using workflow, and face detection and recognition
US8682097B2 (en) 2006-02-14 2014-03-25 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US7536036B2 (en) 2004-10-28 2009-05-19 Fotonation Vision Limited Method and apparatus for red-eye detection in an acquired digital image
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US7574016B2 (en) 2003-06-26 2009-08-11 Fotonation Vision Limited Digital image processing using face detection information
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7565030B2 (en) 2003-06-26 2009-07-21 Fotonation Vision Limited Detecting orientation of digital images using face detection information
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US7792970B2 (en) 2005-06-17 2010-09-07 Fotonation Vision Limited Method for establishing a paired connection between media devices
US8989453B2 (en) 2003-06-26 2015-03-24 Fotonation Limited Digital image processing using face detection information
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7471846B2 (en) 2003-06-26 2008-12-30 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US7440593B1 (en) 2003-06-26 2008-10-21 Fotonation Vision Limited Method of improving orientation and color balance of digital images using face detection information
US7792335B2 (en) 2006-02-24 2010-09-07 Fotonation Vision Limited Method and apparatus for selective disqualification of digital images
US7315630B2 (en) 2003-06-26 2008-01-01 Fotonation Vision Limited Perfecting of digital image rendering parameters within rendering devices using face detection
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US7616233B2 (en) * 2003-06-26 2009-11-10 Fotonation Vision Limited Perfecting of digital image capture parameters within acquisition devices using face detection
US7680342B2 (en) 2004-08-16 2010-03-16 Fotonation Vision Limited Indoor/outdoor classification in digital images
US7620218B2 (en) 2006-08-11 2009-11-17 Fotonation Ireland Limited Real-time face tracking with reference images
US7587068B1 (en) 2004-01-22 2009-09-08 Fotonation Vision Limited Classification database for consumer digital images
US8036458B2 (en) 2007-11-08 2011-10-11 DigitalOptics Corporation Europe Limited Detecting redeye defects in digital images
US7362368B2 (en) * 2003-06-26 2008-04-22 Fotonation Vision Limited Perfecting the optics within a digital image acquisition device using face detection
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US7978260B2 (en) * 2003-09-15 2011-07-12 Senshin Capital, Llc Electronic camera and method with fill flash function
US8326084B1 (en) * 2003-11-05 2012-12-04 Cognex Technology And Investment Corporation System and method of auto-exposure control for image acquisition hardware using three dimensional information
US6859618B1 (en) * 2003-11-15 2005-02-22 Hewlett-Packard Development Company, L.P. Exposure compensation method and system employing meter matrix and flash
US7555148B1 (en) 2004-01-22 2009-06-30 Fotonation Vision Limited Classification system for consumer digital images using workflow, face detection, normalization, and face recognition
US7551755B1 (en) 2004-01-22 2009-06-23 Fotonation Vision Limited Classification and organization of consumer digital images using workflow, and face detection and recognition
US7558408B1 (en) 2004-01-22 2009-07-07 Fotonation Vision Limited Classification system for consumer digital images using workflow and user interface modules, and face detection and recognition
US7564994B1 (en) 2004-01-22 2009-07-21 Fotonation Vision Limited Classification system for consumer digital images using automatic workflow and face detection and recognition
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US7715597B2 (en) 2004-12-29 2010-05-11 Fotonation Ireland Limited Method and component for image recognition
US8503800B2 (en) 2007-03-05 2013-08-06 DigitalOptics Corporation Europe Limited Illumination detection using classifier chains
US7315631B1 (en) 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US7388690B2 (en) * 2005-05-27 2008-06-17 Khageshwar Thakur Method for calibrating an imaging apparatus configured for scanning a document
US7599577B2 (en) 2005-11-18 2009-10-06 Fotonation Vision Limited Method and apparatus of correcting hybrid flash artifacts in digital images
WO2007095553A2 (en) 2006-02-14 2007-08-23 Fotonation Vision Limited Automatic detection and correction of non-red eye flash defects
US7804983B2 (en) 2006-02-24 2010-09-28 Fotonation Vision Limited Digital image acquisition control and correction method and apparatus
DE602007012246D1 (en) 2006-06-12 2011-03-10 Tessera Tech Ireland Ltd PROGRESS IN EXTENDING THE AAM TECHNIQUES FROM GRAY CALENDAR TO COLOR PICTURES
EP2050043A2 (en) 2006-08-02 2009-04-22 Fotonation Vision Limited Face recognition with combined pca-based datasets
US7403643B2 (en) 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US7916897B2 (en) 2006-08-11 2011-03-29 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
JP5049356B2 (en) 2007-02-28 2012-10-17 デジタルオプティックス・コーポレイション・ヨーロッパ・リミテッド Separation of directional lighting variability in statistical face modeling based on texture space decomposition
EP2123008A4 (en) 2007-03-05 2011-03-16 Tessera Tech Ireland Ltd Face categorization and annotation of a mobile phone contact list
WO2008107002A1 (en) 2007-03-05 2008-09-12 Fotonation Vision Limited Face searching and detection in a digital image acquisition device
JP2010520567A (en) 2007-03-05 2010-06-10 フォトネーション ビジョン リミテッド Red-eye false detection filtering using face position and orientation
US7916971B2 (en) 2007-05-24 2011-03-29 Tessera Technologies Ireland Limited Image processing method and apparatus
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8750578B2 (en) 2008-01-29 2014-06-10 DigitalOptics Corporation Europe Limited Detecting facial expressions in digital images
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US7855737B2 (en) 2008-03-26 2010-12-21 Fotonation Ireland Limited Method of making a digital camera image of a scene including the camera user
CN103475837B (en) 2008-05-19 2017-06-23 日立麦克赛尔株式会社 Record reproducing device and method
CN102027505A (en) 2008-07-30 2011-04-20 泰塞拉技术爱尔兰公司 Automatic face and skin beautification using face detection
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
WO2010063463A2 (en) 2008-12-05 2010-06-10 Fotonation Ireland Limited Face recognition using face tracker classifier data
US8379917B2 (en) 2009-10-02 2013-02-19 DigitalOptics Corporation Europe Limited Face recognition performance using additional image features
US20110216157A1 (en) 2010-03-05 2011-09-08 Tessera Technologies Ireland Limited Object Detection and Rendering for Wide Field of View (WFOV) Image Acquisition Systems
JP2012088685A (en) * 2010-09-22 2012-05-10 Panasonic Corp Camera device
US8836777B2 (en) 2011-02-25 2014-09-16 DigitalOptics Corporation Europe Limited Automatic detection of vertical gaze using an embedded imaging device
US8896703B2 (en) 2011-03-31 2014-11-25 Fotonation Limited Superresolution enhancment of peripheral regions in nonlinear lens geometries
US8860816B2 (en) 2011-03-31 2014-10-14 Fotonation Limited Scene enhancements in off-center peripheral regions for nonlinear lens geometries
US9218667B2 (en) * 2013-11-25 2015-12-22 International Business Machines Corporation Spherical lighting device with backlighting coronal ring


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4304476A (en) * 1979-10-01 1981-12-08 Eastman Kodak Company Partial infrared filter exposure compensation apparatus
US6167200A (en) * 1998-08-03 2000-12-26 Minolta Co., Ltd. Exposure operation mechanism of camera

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5420635A (en) * 1991-08-30 1995-05-30 Fuji Photo Film Co., Ltd. Video camera, imaging method using video camera, method of operating video camera, image processing apparatus and method, and solid-state electronic imaging device
US5606392A (en) * 1996-06-28 1997-02-25 Eastman Kodak Company Camera using calibrated aperture settings for exposure control

Also Published As

Publication number Publication date
US6516147B2 (en) 2003-02-04
US20010031142A1 (en) 2001-10-18

Similar Documents

Publication Publication Date Title
US6516147B2 (en) Scene recognition method and system using brightness and ranging mapping
US5128711A (en) Apparatus for recording position information of principal image and method of detecting principal image
US6816676B2 (en) Adaptive control of LCD display utilizing imaging sensor measurements
US7345702B2 (en) Image sensing apparatus, control method for illumination device, flash photographing method, and computer program product
JPH0427530B2 (en)
US5289227A (en) Method of automatically controlling taking exposure and focusing in a camera and a method of controlling printing exposure
US20040008274A1 (en) Imaging device and illuminating device
US5023656A (en) Photographic printing method
JP2934712B2 (en) Camera backlight detection device
US5305051A (en) Camera having a variable photographing aperture and control method thereof
JPH05110912A (en) Camera
JP3927655B2 (en) camera
EP0654699A1 (en) Control system for camera with autofocus
US6314243B1 (en) Electronic flash light-emission controlling method and apparatus and camera
US6760545B1 (en) Method and camera for image capture
JPH05100288A (en) Camera with electric view finder
JPH08136971A (en) Photographing device and method for controlling exposure
JP2949766B2 (en) Camera exposure control device
JPH02251941A (en) Electronic camera
JP4169524B2 (en) camera
JP3887144B2 (en) Light source selection device for electronic still camera
KR100338439B1 (en) Method and apparatus for automatic compensation for camera backlight
GB2227850A (en) Automatic exposure camera
JP2604913B2 (en) Camera with built-in strobe
US4486087A (en) Method of and apparatus for altering sensitivity of photometer to different scene portions

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP