WO2002005549A1 - Camera having a through the lens pixel illuminator - Google Patents


Info

Publication number
WO2002005549A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
scene
photosurface
region
camera according
Prior art date
Application number
PCT/IL2000/000404
Other languages
French (fr)
Inventor
Ori J. Braun
Giora Yahav
Original Assignee
3Dv Systems, Ltd.
Priority date
Filing date
Publication date
Application filed by 3Dv Systems, Ltd. filed Critical 3Dv Systems, Ltd.
Priority to US10/332,643 (US7355648B1)
Priority to JP2002509283A (JP2004503188A)
Priority to EP00942347A (EP1302066A1)
Priority to PCT/IL2001/000627 (WO2002004247A1)
Priority to US10/332,646 (US6993255B2)
Priority to AU2001269414A (AU2001269414A1)
Priority to IL15385701A (IL153857A0)
Publication of WO2002005549A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/008 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21S NON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/10 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source
    • F21S41/14 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source characterised by the type of light source
    • F21S41/141 Light emitting diodes [LED]
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21S NON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/10 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source
    • F21S41/14 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by the light source characterised by the type of light source
    • F21S41/141 Light emitting diodes [LED]
    • F21S41/147 Light emitting diodes [LED] the main emission direction of the LED being angled to the optical axis of the illuminating device
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21S NON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • F21S41/60 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution
    • F21S41/65 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on light sources
    • F21S41/663 Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps characterised by a variable light distribution by acting on light sources by switching light sources
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/30 Systems for automatic generation of focusing signals using parallactic triangle with a base line
    • G02B7/32 Systems for automatic generation of focusing signals using parallactic triangle with a base line using active means, e.g. light emitter
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00 Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50 Projected symbol or information, e.g. onto the road or car body
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/18 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4812 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver transmitted and received beams following a coaxial path
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/51 Display arrangements

Definitions

  • the present invention relates to imaging a scene with light and in particular to methods and apparatus for illuminating the imaged scene.
  • Satisfactory and effective imaging of a scene with light requires proper illumination of the scene. In many instances however, achieving proper illumination is a relatively complicated and arduous task. Often it is desirable and/or necessary to illuminate different regions of a scene with different intensity and/or color light. Sometimes a dynamic range of a camera limits contrast in an image of a scene and brightest regions and dimmest regions of the scene cannot be properly imaged at the same time. Sometimes lighting and color of illumination must be adjusted to provide desired effects in an image.
  • lighting usually has to be adjusted to increase illumination of certain regions of the product while decreasing illumination of other regions of the product.
  • color of light used to illuminate the product may have to be adjusted and substantial effort invested in arranging direction and intensity of indirect lighting of the product.
  • In imaging items for visual inspection, lighting generally has to be adjusted to assure that inspected features of the items are appropriately visible to enable proper inspection. Often, to distinguish features of an inspected item, color of light illuminating a first region of the item is optionally adjusted to be different from color of light illuminating a second region of the item.
  • Distance to a region is determined from timing of the gates with respect to emission times of the light pulses and a total amount of light from the region that is received and imaged during the gates.
  • differences in reflectivity of different regions of the scene and/or differences in distances of different regions in the scene from the camera result in substantial differences in intensity of reflected light reaching the camera.
  • intensity of light received from a region in a scene is a function of a solid angle subtended by the region at the camera. Since the solid angle is inversely proportional to the square of the distance from the camera, small differences in distances from the camera of regions in the scene can generate large differences in intensity of light reaching the camera from the regions. The range of these intensity differences can sometimes approach or exceed the dynamic range of the camera. As a result, accuracy of determination of distances to regions in the scene can vary substantially. In some instances, as a result of intensity differences it may not be possible to determine distances to all regions of the scene at a same time.
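  • To make the inverse-square point concrete, here is a minimal sketch (illustrative values only, not from the patent) of the intensity spread created by a modest spread in distance, and the per-zone illumination gain that would flatten it:

```python
def received_intensity(reflectivity: float, distance_m: float) -> float:
    """Light returned to the camera falls off with the square of the distance."""
    return reflectivity / distance_m ** 2

near = received_intensity(0.5, 1.0)    # region 1 m from the camera
far = received_intensity(0.5, 3.0)     # equally reflective region 3 m away
print(near / far)                      # 9.0: a 3x distance spread becomes a 9x intensity spread

gain_far = near / far                  # illuminating the far zone 9x more strongly flattens it
```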
  • An aspect of some embodiments of the present invention relates to providing a camera, comprising an improved illumination system for illuminating a scene imaged with the camera.
  • the illumination system illuminates regions of the scene that are imaged on contiguous regions of the photosurface independently of each other.
  • regions of a scene that are independently illuminated in accordance with an embodiment of the present invention are referred to as "illumination zones”.
  • the size of an illumination zone is such that the illumination zone is imaged on a single pixel of the photosurface.
  • the size of an illumination zone is such that the illumination zone is imaged on a group of pixels on the photosurface.
  • all illumination zones of a scene are imaged on regions of the photosurface having a same size and shape.
  • different illumination zones are imaged on different size and/or shape regions of the photosurface.
  • An aspect of some embodiments of the present invention relates to adjusting intensity of light that illuminates each illumination zone of a scene independently of other illumination zones in the scene.
  • An aspect of some embodiments of the present invention relates to adjusting color of light that illuminates each illumination zone of a scene independently of other illumination zones in the scene.
  • a camera in accordance with an embodiment of the present invention comprises a photosurface on which light from a scene imaged with the camera is focused and an illumination system having a spatially modulated light source boresighted with the photosurface.
  • the light source is hereinafter referred to as a "pixelated illuminator".
  • the size, location and orientation of the pixelated illuminator and optics used to boresight the pixelated illuminator with the photosurface are such that a virtual image of the pixelated illuminator is substantially coincident with the photosurface.
  • a virtual image of each luxel of the pixelated illuminator is therefore located in a different corresponding region of the photosurface and light from virtual images of any two luxels does not appear to pass through or emanate from a same region of the photosurface.
  • the pixelated illuminator is boresighted with the photosurface using methods and devices described in US Patent Application, number 09/250,322, the disclosure of which is incorporated herein by reference.
  • the region of the photosurface on which a virtual image of a luxel is located images a region of the scene that is an illumination zone of the scene corresponding to the luxel.
  • Light from the luxel illuminates only the illumination zone that is imaged on the region of the photosurface in which the virtual image of the luxel is located.
  • the camera comprises a controller that controls intensity of light provided by a luxel so as to control intensity of illumination of the illumination zone corresponding to the luxel.
  • the controller controls the luxel to adjust hue and/or saturation of light illuminating the illumination zone.
  • the controller receives signals generated by a pixel or pixels of the photosurface on which an illumination zone of a scene is imaged responsive to light incident on the pixel or pixels from the illumination zone.
  • the controller controls light provided by the luxel that illuminates the illumination zone, responsive to the received signals.
  • the controller receives instructions from a user that define a desired hue, and/or saturation and/or intensity for light reaching the pixel or pixels of the photosurface from the illumination zone.
  • the controller controls light provided by the luxel responsive to the instructions so that light that reaches the pixel or pixels has the desired characteristics defined by the user.
  • a person uses a camera in accordance with an embodiment of the present invention to image a metal object that has highly reflecting areas.
  • the person might prefer to reduce light reaching the camera from the highly reflecting areas to reduce glare and improve perceived quality of the image of the object.
  • the person can accomplish this by instructing the controller to reduce, by a desired amount, intensity of illumination of illumination zones corresponding to pixels in the image that register light intensity greater than a maximum light intensity defined by the person.
  • the person can accomplish this by indicating, on a display of a preview of the image, a region or regions for which the user wants to reduce brightness.
  • the user may indicate an amount by which the user wishes to reduce illumination of the region or regions on an appropriate control panel that is displayed with the preview.
  • after the controller reduces illumination of illumination zones corresponding to the indicated region or regions, a new preview may be presented to the user, which, if the user desires, he or she can adjust again.
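  • The following is an illustrative sketch, not taken from the patent, of the brightness-clamping loop described above: luxels behind pixels that register more than a user-defined maximum intensity are dimmed, the scene is re-imaged, and the loop repeats. The array names, the one-luxel-per-pixel assumption and the multiplicative update rule are all hypothetical choices.

```python
import numpy as np

def reduce_glare(pixel_values: np.ndarray,
                 luxel_levels: np.ndarray,
                 max_intensity: float,
                 step: float = 0.1) -> np.ndarray:
    """Return adjusted luxel drive levels, dimming luxels behind glaring pixels."""
    too_bright = pixel_values > max_intensity       # pixels over the user's maximum
    adjusted = luxel_levels.copy()
    adjusted[too_bright] *= (1.0 - step)            # back off illumination there
    return np.clip(adjusted, 0.0, 1.0)

# The camera would re-image the scene with the adjusted levels, present a new
# preview, and repeat until no pixel exceeds max_intensity.
```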
  • hue and saturation of light illuminating a scene being imaged using the camera are adjusted similarly.
  • the pixelated illuminator illuminates illumination zones with R, G or B light
  • the user can adjust color of the image by adjusting intensity, using for example methods similar to those described above, of R, G and/or B light illuminating illumination zones of the scene.
  • the virtual image of a luxel on the photosurface is substantially coincident with a single pixel on the photosurface.
  • the luxel therefore illuminates a region, i.e. an illumination zone, of a scene imaged on the photosurface that is imaged on the single pixel on the photosurface.
  • the virtual image of a luxel is substantially coincident with a group of pixels on the photosurface and the luxel illuminates an illumination zone of a scene that is imaged on the group of pixels.
  • luxels are not all the same size and/or shape and two different luxels may illuminate different size and/or shape illumination zones in a scene.
  • a camera comprises an illumination system having a pixelated illuminator that is not boresighted with a photosurface of the camera. Instead, the illumination system comprises its own optics that focuses light from the pixelated illuminator on a scene being imaged by the camera.
  • the camera may comprise a controller that controls the illumination system optics so that the light from the pixelated illuminator is focused on the scene being imaged by the camera.
  • Light providing systems and devices suitable for use as a pixelated illuminator for practice of the present invention are available in the market.
  • certain types of light projectors used for projection display of images are suitable for use in practice of the present invention.
  • These light projectors comprise luxels that are independently controllable to provide light at desired intensities and colors.
  • the luxels can be made as small as pixels in a typical CCD or CMOS camera.
  • US Patent 4,680,579 to E. O. Granville describes a light projector that uses a digital mirror device (DMD) in which micro mirrors function as luxels and reflect light of desired intensity and color provided by a suitable light source.
  • US Patent 5,844,588 to Anderson describes a DMD device used in an illumination system for a xerographic printer.
  • Texas Instruments markets DMD light projectors and image projection systems based on DMDs.
  • JVC markets a D-ILA (Direct Drive Image Light Amplifier) light projector and image projection systems in which liquid crystal (LC) pixels are controlled to project light of desired intensity and color.
  • An aspect of some embodiments of the invention is concerned with the acquisition of high gray level resolution images utilizing a camera having a lower gray level resolution.
  • a first image of a scene is acquired utilizing uniform lighting.
  • the lighting, which can be controlled to illuminate the various regions of the field of view differently, is adjusted to give a relatively flat (contrast-wise) image.
  • the illumination may have up to N' illumination levels spaced at intervals such that for each gray level N there exists an illumination level N' for which N*N' = C, where C is a constant that is the same for all of the pixels.
  • when the illumination is matched to the intensity measured on the initial image acquisition and a second image is acquired, the intensity variations are less than ±0.5 of the intensity distance between the initial gray levels.
  • the acquisition range is set to cover this range of gray levels with substantially the entire range of acquisition levels. If M acquisition levels and P illumination levels are available, each pixel (i, j) is characterized by two numbers m_ij and p_ij. The gray levels of a uniformly illuminated image can thus be derived to a resolution given by M×P gray levels.
  • the illumination is controlled only over regions of greater than one pixel.
  • the second image will have a variability that is larger than the variability obtained when the individual pixels are separately illuminated.
  • the field of view is divided into Q regions. A uniformly lit image is acquired. The acquired intensity over the regions is averaged and the illumination is adjusted such that the acquired intensity (for a subsequent image) over all of the regions (based on the averages) is uniform to within the quantization accuracies of the illumination and the acquisition.
  • each pixel is characterized by two numbers, one for the illumination and one for the acquisition.
  • the number of levels for illumination and acquisition need not be the same. Thus, for example, if there are half as many illumination levels as there are acquisition levels, the illumination levels are matched to every second acquisition level, such that the total range of illuminations for the second acquisition is ±1 original gray level. The second acquisition is set to cover this range with the entire range of acquisition values.
  • a very high resolution gray level (uniformly illuminated) image can be constructed utilizing these embodiments by generating an image in which each pixel has a brightness derived from the illumination level and the acquired brightness level.
  • the first P bits of the image gray level would be given by the illumination values and the next M bits of the image gray level would be given by the acquisition levels of the second acquired image.
  • two matrices are provided, one of which contains information on the illumination and the other information on the second acquisition.
  • an A/D is provided that allows for adjusting the zero level for the acquisition (so that the acquisition starts at the gray level of the pixel with the lowest illumination) and the A/D gain, so that the highest acquisition gray level is matched with the brightest portion of the image.
  • An iris may be provided to at least partially adjust the gain.
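  • As an illustration of combining the two stored numbers per pixel, here is a minimal sketch, under the assumption (consistent with the bit description above) that P and M are powers of two: the illumination level supplies the high-order part of the reconstructed gray value and the second acquisition supplies the low-order part. The level counts are hypothetical.

```python
# Hypothetical level counts; the patent does not fix P or M.
P, M = 8, 256                      # 8 illumination levels, 256 acquisition levels

def combined_gray(p: int, m: int) -> int:
    """High bits from the illumination level p, low bits from the acquisition m."""
    assert 0 <= p < P and 0 <= m < M
    return p * M + m               # 0 .. P*M - 1, i.e. 2048 effective gray levels

print(combined_gray(5, 200))       # 1480 of 2047
```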
  • the illumination elements illuminate pixels or groups of pixels without overlap: illumination of a pixel for individual pixel illumination, or of a region for regional pixel illumination.
  • the reconstruction optionally takes into account the effects of illumination of adjacent pixels when determining the illumination to be used and in reconstructing the image.
  • the overlap is adjusted such that, for uniform brightness of the sources, each pixel is uniformly illuminated. This can be achieved, for example, by providing for Gaussian roll off of the illumination, with the half illumination point at the edge of the portion being illuminated.
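  • A brief sketch, under assumed values, of the Gaussian roll-off condition described above: spot spacing equals the width of the illuminated portion, and each spot falls to half intensity at the portion's edge, so neighbouring half contributions sum near the seams. The grid spacing and spot count below are arbitrary illustration choices.

```python
import numpy as np

pitch = 1.0                                          # spacing between spot centres
# Gaussian whose intensity is 1/2 at +/- pitch/2 (FWHM equal to the pitch)
sigma = pitch / (2.0 * np.sqrt(2.0 * np.log(2.0)))

x = np.linspace(0.0, 4.0 * pitch, 400)               # interior of the field
centres = np.arange(-3, 9) * pitch
total = sum(np.exp(-(x - c) ** 2 / (2.0 * sigma ** 2)) for c in centres)
print(total.max() / total.min())                     # ~1.1: roughly uniform overall
```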
  • a camera comprising: a photosurface comprising light sensitive pixels that generate signals responsive to light incident thereon and optics that focus an image of a scene onto the photosurface; and an illumination system for illuminating the scene, the illumination system comprising: an illuminator having a plurality of substantially contiguous, independently controllable light providing regions; optics that focuses an image of the illuminator on the scene so that light from the illuminator illuminates substantially all of and substantially only the field of view of the camera; and a controller that controls light provided by each light providing region.
  • the illuminator is boresighted with the photosurface so that a virtual image of the illuminator is located at the photosurface and optics that image the scene on the photosurface image the illuminator on the scene.
  • a virtual image of the illuminator is substantially coincident with the photosurface.
  • light from each of the light providing regions that is reflected from the scene is imaged on a different group of contiguous pixels in the photosurface.
  • light from each of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a region of the photosurface having a size substantially equal to the size of a pixel in the photosurface.
  • the region on which the light is imaged is located substantially within the area of a pixel on the photosurface.
  • the regions of the photosurface on which light from at least two adjacent light providing regions of the illuminator is imaged overlap.
  • light from a plurality of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a same region of the photosurface having a size substantially equal to the size of a pixel in the photosurface.
  • the region on which the light from the plurality of light providing regions is imaged is located substantially within the area of a pixel on the photosurface.
  • each light providing region provides white light.
  • each light providing region is controllable to provide light in at least two different wavelength bands of light.
  • each light providing region is controllable to provide R, G and B light.
  • the plurality of light providing regions comprises three light providing regions.
  • each light providing region provides a different one of R, G or B light.
  • the controller controls intensity of R, G or B light provided by each light providing region so as to control hue, saturation or intensity of light illuminating a region of the scene which is illuminated by light from the light providing region independently of hue, saturation and intensity of light illuminating other regions of the scene.
  • the controller controls light provided by at least one light providing region to control an intensity distribution of R, G or B light reflected by the scene that is incident on pixels of the photosurface.
  • In some embodiments of the present invention the controller controls intensity of light provided by each light providing region so as to control intensity of light illuminating a region of the scene illuminated by light from the light providing region independently of intensity of light illuminating other regions of the scene.
  • In some embodiments of the present invention the controller controls light provided by at least one light providing region to control an intensity distribution of light reflected by the scene that is incident on pixels of the photosurface.
  • the controller receives signals generated by each pixel responsive to light incident on the pixel and controls light provided by at least one of the light providing regions responsive to the signals to adjust illumination of the scene.
  • the controller controls light provided by at least one light providing region to decrease intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is greater than a predetermined maximum intensity.
  • the controller controls light provided by at least one light providing region to increase intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is less than a predetermined minimum intensity.
  • the camera comprises a user operated input terminal to transmit instructions to the controller and wherein the controller controls light from at least one of the light providing regions responsive to the instructions to adjust illumination of the scene.
  • the input terminal comprises a display screen.
  • the controller generates a preview of an image of the scene being imaged by the camera on the screen, responsive to signals that it receives from the pixels.
  • the screen is a touch sensitive screen on which a border can be delineated surrounding an object or region in the preview of the scene to select the object or region by touching the screen.
  • the camera comprises a pointer controllable to select an object or region in the preview.
  • the controller controls light provided by light providing regions responsive to the selected region or object.
  • the controller displays a control panel having control icons on the screen and wherein manipulation of the control icons transmits instructions to the controller.
  • the camera comprises a shutter that the controller opens and closes to gate the photosurface on and off.
  • pixels in the photosurface are controllable to be gated on and off and wherein the controller gates the photosurface on and off by gating the pixels.
  • the controller may simultaneously turn on and turn off light providing regions so as to radiate at least one pulse of light having a pulse width that illuminates the scene.
  • the controller gates the photosurface on and off at times responsive to a time at which the at least one light pulse is radiated.
  • the controller gates the photosurface off when the illumination system is providing light to illuminate the scene.
  • the at least one light pulse comprises a train of light pulses.
  • the controller gates the photosurface on for a first gate period after a first time lapse following each radiated light pulse of a first plurality of radiated light pulses in the train of light pulses; and gates the photosurface on for a second gate period after a second time lapse following each radiated light pulse of a second plurality of radiated light pulses.
  • the midpoints of the first and second gate periods are delayed with respect to the radiated light pulses that they respectively follow by the same amount of time.
  • the duration of the first gate is greater than or equal to three times the pulse width.
  • the duration of the second gate period is substantially equal to the pulse width of the radiated light pulses.
  • the controller determines intensity of light reaching the camera from a region of the scene from an amount of light incident on a pixel that images the region. Optionally, the controller controls light provided by the light providing regions so as to minimize a difference between a predetermined light intensity and intensity of light reaching each pixel in the photosurface from the scene.
  • the controller may determine a distance to a region of the scene imaged on a pixel of the photosurface responsive to amounts of light incident on the pixel during the first and second gates.
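  • A hedged sketch of how such a two-gate measurement can yield distance: the long (first) gate, three or more pulse widths wide, collects the whole returned pulse and normalizes away reflectivity, while the short (second) gate, one pulse width wide, collects a fraction of the pulse that varies linearly with distance. The gate/pulse alignment convention below is one common choice for gated ranging, not a formula quoted from the patent.

```python
C = 3.0e8                 # speed of light, m/s

def gated_distance(q_long: float, q_short: float,
                   gate_delay_s: float, pulse_width_s: float) -> float:
    """Distance from charges collected during the long and short gates."""
    fraction = q_short / q_long                      # 0..1, assumes q_long > 0
    round_trip = gate_delay_s + pulse_width_s * (1.0 - fraction)
    return 0.5 * C * round_trip                      # half the round-trip path

# Example: 30 ns gate delay, 10 ns pulses, short gate caught 40% of the pulse:
print(gated_distance(1.0, 0.4, 30e-9, 10e-9))        # ~5.4 m
```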
  • the light providing regions are light sources that emit light.
  • Each light providing region comprises a laser.
  • each light providing region comprises a light emitting diode.
  • the camera comprises a light source and wherein the light providing regions provide light by reflecting and modulating light from the light source that is incident on the light providing regions.
  • each light providing region comprises a micromirror controllable to reflect light in a direction towards the optics that focuses an image of the illuminator on the scene.
  • each light providing region may comprise a liquid crystal having a controllable transmittance and a mirror, wherein light from the light source enters the liquid crystal and is reflected back out of the liquid crystal by the mirror in a direction towards the optics that focuses an image of the illuminator on the scene.
  • a method for illuminating a scene being imaged by a camera comprising a photosurface having pixels and optics for imaging the scene on the photosurface, the method comprising: illuminating the scene with an illuminator comprising a plurality of light providers so that each region of the scene that is imaged on a region of the photosurface comprising at least one pixel but less than all the pixels of the photosurface is illuminated with light provided by at least one light provider that provides light substantially only for the region; and controlling light provided by the at least one light provider for a region of the scene to adjust illumination of the scene.
  • the at least one pixel may be a single pixel.
  • the method comprises boresighting the illuminator so that a virtual image of the illuminator is located substantially at the photosurface.
  • the virtual image of the illuminator is substantially coincident with the photosurface.
  • a virtual image of each light provider is located within a different group of contiguous pixels on the photosurface.
  • a virtual image of each light provider is located substantially within a corresponding single pixel.
  • a virtual image of a plurality of light providers is located substantially within a corresponding single pixel.
  • the light providers are independently controllable and wherein controlling each of the light providing regions comprises controlling each of the light providing regions independently of the other light providing regions.
  • controlling light comprises controlling the light to control contrast in an image of the scene provided by the camera. In some embodiments of the present invention controlling contrast comprises controlling contrast in only a localized region of the image.
  • controlling light comprises controlling the light to adjust an intensity distribution of light from the scene that is incident on pixels in the photosurface.
  • adjusting an intensity distribution comprises controlling a parameter of the intensity distribution.
  • controlling light comprises controlling the light to perform a predetermined image processing procedure.
  • the image processing procedure comprises unsharp masking.
  • the image processing procedure comprises sharpening.
  • the image processing procedure comprises smoothing.
  • the image processing procedure comprises histogramic equalization.
  • a method of imaging comprising: illuminating at least a portion of a scene with substantially uniform lighting; determining a spatially varying illumination that reduces the variability of brightness values of the at least portion of the scene; illuminating the at least portion with said varying illumination; acquiring a second image of the at least portion, said second image having brightness values for areas thereof; and determining the brightness values of the scene under uniform illumination from the brightness values and the varying values of the illumination.
  • illuminating comprises illuminating utilizing the method of any of claims 50-57.
  • acquiring the second image comprises acquiring the images using a brightness acquisition range limited to the range of brightness values in the image.
  • Fig. 1 schematically shows a camera comprising an illumination system having a pixelated illuminator boresighted with a photosurface of the camera, in accordance with an embodiment of the present invention
  • Fig. 2 schematically shows another camera comprising an illumination system having a light projector and a "boresighted" pixelated illuminator, in accordance with an embodiment of the present invention
  • Fig. 3 schematically shows a camera comprising an illumination system in which a pixelated illuminator of the illumination system is not boresighted with a photosurface of the camera, in accordance with an embodiment of the present invention
  • Fig. 4 is a flow chart of a method for providing a high gray level resolution image, in accordance with an exemplary embodiment of the invention.
  • FIG. 1 schematically shows a camera 20 comprising an illumination system 21, in accordance with an embodiment of the present invention.
  • Camera 20 comprises a photosurface 22, such as a CCD or CMOS light sensitive surface, having pixels 24 and optics represented by a lens 26 for focusing an image of a scene on photosurface 22.
  • a single lens 26 represents the optics
  • the optics may comprise any suitable optical system, which may have a plurality of lenses and optical elements, as known in the art, for focusing images on photosurface 22.
  • Illumination system 21 comprises a pixelated illuminator 30 having a plurality of light providing luxels 32.
  • Pixelated illuminator 30 is boresighted with photosurface 22 using an appropriate beam splitter 34 and optical elements (not shown) as might be required, so that optimally, a virtual image of pixelated illuminator 30 is substantially coincident with photosurface 22. It is to be noted that because lens 26 is used both to focus an image of a scene being imaged with camera 20 on photosurface 22 and to focus light from pixelated illuminator 30 on the scene, the virtual image of pixelated illuminator 30 remains automatically registered to photosurface 22 during zooming operations of the camera.
  • In general the size of pixelated illuminator 30 does not have to be the same as the size of photosurface 22. However, in order for a virtual image of pixelated illuminator 30 to be substantially coincident with photosurface 22, pixelated illuminator 30 or a real or virtual image of the pixelated illuminator that is boresighted with the photosurface must have a same size and shape as the photosurface. In Fig. 1 and figures that follow, for convenience of exposition, pixelated illuminator 30 is shown, by way of example, as having a same size as photosurface 22.
  • Pixelated illuminator 30 is therefore shown boresighted with photosurface 22 without an optical system that provides an appropriate real or virtual image of the pixelated illuminator that has substantially a same size as photosurface 22.
  • Pixelated illuminator 30 may be boresighted with photosurface 22 using methods similar to those described in US Patent Application 09/250,322 referenced above.
  • luxels 32 of pixelated illuminator 30 are assumed to have a size and shape so that a virtual image of each luxel 32 is substantially coincident with a single pixel 24 of photosurface 22. There is therefore (since a virtual image of pixelated illuminator 30 is substantially coincident with photosurface 22) a one-to-one correspondence between luxels 32 and pixels 24.
  • Mesh squares 40 are homologous with pixels 24 such that light that is collected by lens 26 that appears to come from any particular mesh square 40 is focused by the lens onto a single pixel 24 of photosurface 22 that corresponds to the particular mesh square.
  • light provided by each luxel 32 is reflected by beam splitter 34 and illuminates substantially only, and all of, a single mesh square 40.
  • the mesh square illuminated by the luxel 32 corresponds to the pixel 24 with which the virtual image of the luxel is coincident.
  • Light from luxel 32 illuminates substantially only, and substantially all of, a single mesh square 40 as a result of the assumption, by way of example, that a virtual image of each luxel 32 is substantially coincident with a single pixel 24.
  • size and shape of luxels 32 and position of pixelated illuminator 30 are such that light from adjacent luxels 32 overlaps adjacent mesh squares 40.
  • Illumination zones of the scene being imaged are those regions of the scene that can be illuminated by light from a single luxel 32. Boundaries of illumination zones of the scene are generally either coincident with or close to projections of boundaries of mesh squares 40 on the scene.
  • Dashed lines 41 in Fig. 1 represent selected light rays between corresponding points at peripheries of photosurface 22, pixelated illuminator 30 and field of view 38. Arrowheads on dashed lines represent direction of travel of light. For lines 41 that have arrows pointing in both directions along the lines, arrowheads pointing towards field of view 38 indicate direction of light from pixelated illuminator 30 that illuminates the field of view. Arrowheads pointing away from field of view 38 indicate light reflected from field of view 38 that is focused on photosurface 22.
  • the illumination zones of letter 51 would not of course be squares but would be defined by generally curvilinear boundaries determined by the shape of the surface of letter 51 (and the shape of luxels 32). If letter 51 were displaced away from object plane 36 in a direction perpendicular to object plane 36, adjacent illumination zones would overlap along their mutual borders, with the amount of overlap varying with the displacement. For example if the displacement of letter 51 were rectilinear and did not comprise a rotation, the overlap would increase linearly with magnitude of the displacement.
  • the widths of the vertical "back" and horizontal "arms" of letter 51 are shown equal to a side of a mesh square 40 for convenience of presentation.
  • the back and arms of letter 51 can of course be many mesh squares 40 wide, and/or the widths can be equal to a fractional number of mesh squares, or they can be offset from edges of the mesh squares.
  • Luxels 32 in pixelated illuminator 30 that illuminate illumination zones of letter 51 are shaded.
  • the shaded luxels 32 form an illumination pattern "E" 52, inverted top to bottom and reversed left to right, which corresponds to letter 51 in field of view 38.
  • shaded pixels 24 in photosurface 22 indicate pixels 24 on which letter 51 in field of view 38 is imaged by camera 20, and form an image letter "E", 53 on the photosurface.
  • Letter 53 in photosurface 22 is a reflection, in beam splitter 34, of letter 52 in pixelated illuminator 30.
  • each shaded luxel 32 in letter 52 illuminates a single corresponding illumination zone 40 of letter 51 in field of view 38.
  • the illumination zones are referred to by the numeral 40 which identifies mesh squares 40.
  • light provided by the luxel 32 at the tip of the short middle arm of letter 52 on pixelated illuminator 30 illuminates only that illumination zone 40 located at the tip of the short middle arm in letter 51.
  • luxels 32 are controllable to provide desired patterns of illumination for a scene that is imaged with camera 20.
  • luxels 32 are "color" luxels, for which light intensity as well as a spectrum of light provided by the pixels can be controlled to provide different color light.
  • luxels 32 are RGB luxels controllable to provide different combinations of R, G and B light so as to control hue, saturation and intensity of light illuminating illumination zones of the scene.
  • luxels 32 are "gray level” luxels that provide light having a fixed spectrum and for which only intensity of light provided by the pixels is varied.
  • light provided by luxels 32 is not limited to the visible spectrum but may, for example, comprise infrared or ultraviolet light.
  • Luxels 32 may provide and control light by emission, reflection or transmission, or a combination of these processes.
  • luxels 32 may comprise one or a combination of light sources, such as for example semiconductor lasers, that provide light by emission, micro- mirrors that provide light by reflection and/or liquid crystal cells that provide light by transmission.
  • the luxels may provide a multi-level controlled illumination.
  • Illumination system 21 may comprise a controller 60 for controlling luxels 32 and a shutter 62 for shuttering photosurface 22.
  • Controller 60 shutters light provided by pixelated illuminator 30 so as to provide at least one light pulse to illuminate a scene being imaged with camera 20.
  • controller 60 shutters light from pixelated illuminator 30 by turning luxels 32 on and off.
  • camera 20 comprises a shutter, which is controlled by controller 60, to transmit and not transmit light in order to shutter light from pixelated illuminator 30, and provide at least one light pulse that illuminates a scene imaged by the camera.
  • the shutter may, for example, be located between pixelated illuminator 30 and beam splitter 34. If pixelated illuminator 30 is an illuminator that provides light to illuminate a scene by reflecting light from a light source, such as the illuminator shown in Fig. 2, the shutter may be located between the light source and the illuminator.
  • controller 60 controls shutter 62 so that during a time that luxels of luxels 32 are "on” to provide a pulse of light, shutter 62 does not transmit light.
  • the intensity of the light may be varied in one or more of the following ways:
  • the number of pulses emitted from each of the luxels may be varied.
  • for example, 256 pulses of light may be available for each frame (image) and the light level for each luxel adjusted by varying the number of pulses (see the sketch after this list).
  • a spatial attenuator having a controllable number of attenuation levels is used to adjust the illumination level.
  • the length of light pulses is adjusted.
  • the intensity of the light pulses is adjusted, as for example, by adjusting the current to laser luxel sources.
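  • A minimal sketch of the first option in the list above, assuming a budget of 256 pulses per frame so that an 8-bit luxel level maps directly onto a pulse count (the mapping below is an assumed convention, not specified by the patent):

```python
PULSES_PER_FRAME = 256

def pulses_for_level(level_8bit: int) -> int:
    """Map an 8-bit luxel intensity level (0-255) onto a per-frame pulse count."""
    if not 0 <= level_8bit < PULSES_PER_FRAME:
        raise ValueError("level must be in 0..255")
    return level_8bit                  # one pulse per level step; 0 pulses = dark

print(pulses_for_level(128))           # a mid-gray luxel fires 128 of the 256 pulses
```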
  • shutter 62 is gated open for a limited accurately determined "gate" period of time, during which shutter 62 transmits light, following an accurately determined delay from the time that each light pulse of the at least one light pulse is radiated.
  • Light pulse width, repetition rate, gate width and relative timing of gates and light pulses are determined so that light from the light pulse that reaches photosurface 22 is reflected only by objects in a limited desired range of distances from camera 20.
  • the range of distances defines a scene having a desired depth of field and distance (i.e. range) from the camera.
  • range parameters are determined and the determined range parameters define a range and depth of field for a scene
  • distance to a scene is determined and range parameters are determined responsive to the distance.
  • the distance may be determined, for example, using devices and methods for determining range by focus or determining range by any of various time of flight techniques, techniques used in gated 3D cameras or geometrical techniques. Techniques for determining distances to regions in a scene are described in PCT applications PCT/IL98/00476 and PCT/IL98/00611 the disclosures of which are incorporated herein by reference.
  • camera 20 is a 3D camera and illumination parameters are determined to provide a 3D map of an imaged scene.
  • the parameters may be determined using methods described in PCT Publications WO 97/01111, WO 97/01112 and WO 97/01113.
  • photosurface 22 is a "self shuttering" photosurface for which shuttering of the photosurface is accomplished by controlling circuits comprised in the photosurface.
  • PCT publication WO 00/19705 the disclosure of which is incorporated herein by reference, describes self-shuttering photosurfaces in which each pixel in the photosurface comprises a circuit controllable to gate the pixel on or off.
  • pixels 24 in photosurface 22 may have different types of spectral sensitivities.
  • photosurface 22 is a gray level "colorblind" photosurface. All pixels 24 in the photosurface are gray level pixels that are sensitive to substantially only total intensity of light, in a same limited spectrum, incident on the pixels.
  • photosurface 22 is a "color photosurface" 22, that provides a color image of a scene imaged by camera 20 and pixels 24 are color sensitive pixels, such as RGB pixels.
  • pixels 24 are sensitive to infrared or ultraviolet light.
  • pixelated illuminator 30 is an RGB color illuminator and photosurface 22 is an RGB color photosurface.
  • the surface of letter 51 that is imaged by camera 20 is white and the letter is set against a black background.
  • color of an image of letter 51, i.e. letter 53 formed on photosurface 22, provided by camera 20 can be changed.
  • luxels 32 in illumination pattern 52 illuminate letter 51 with one of R, G or B light
  • image 53 on photosurface 22 will be respectively red, green or blue.
  • camera 20 can provide a "rainbow" colored image of letter 51 by controlling different luxels on pixelated illuminator 30 to provide light of different hues.
  • Light from luxels 32 can also be controlled to compensate for variations in reflectivity between different surface regions of letter 51. For example, assume that surface regions of the middle arm of letter 51 are faded and are characterized by relatively low reflectivity compared to other surface regions of letter 51. If intensity of illumination of the surface of letter 51 is substantially uniform, the middle short arm in an image of letter 51 provided by camera 20 will appear faded. To compensate, in accordance with an embodiment of the present invention, for the fading, intensity of light provided by luxels 32 in the middle arm of illumination pattern 52 (i.e. "E" 52) is increased relative to intensity of light provided by the other luxels 32 in illumination pattern 52.
  • Extreme lighting effects can also be provided by an illumination system in accordance with an embodiment of the present invention. For example, if luxels 32 in the middle and lower arms of illumination pattern 52 are turned off so that they do not provide any light, only the back and bottom arm of letter 51 will be illuminated and an image of letter 51 will show only the illuminated back and bottom arm.
  • if, for example, the face of a person being imaged appears pale, controller 60 can control light provided by pixelated illuminator 30 to "tint" her face so that it appears more vital. If her earrings appear too bright, controller 60 can control luxels 32 in pixelated illuminator 30 to decrease intensity of illumination of the earrings.
  • an illumination system in accordance with an embodiment of the present invention can control illumination of a scene being imaged by a camera so as to control both color and contrast of the image.
  • camera 20 comprises a visual display screen 64 that a person using the camera 20 employs to transmit information to and receive information from controller 60.
  • controller 60 may, for example, display a preview of the scene on screen 64.
  • controller 60 also displays a control panel 66 having appropriate control buttons 68, such as toggle and slide buttons, and/or a keyboard (not shown) for communicating instructions to controller 60.
  • the user can indicate a region of the scene for which the user wishes to adjust illumination by selecting the region in the preview.
  • Various methods known in the art may be used for selecting the region. For example, the user can touch the region with a finger or draw a contour around the region using a stylus.
  • camera 20 comprises a mouse that can be used to draw a suitable border around a region to be selected or to point to an object, for which a region is automatically delineated using image processing methods well known in the art. Once a region is selected, the user indicates desired changes of illumination for the region by pressing or sliding appropriate control buttons shown in control panel 66.
  • In Fig. 1, screen 64 shows a preview letter E, 70, of letter 51 as it will be imaged by camera 20 under illumination conditions that obtain at the time that preview 70 is shown.
  • the bottom arm of the letter is faded, which is indicated by lighter shading of the bottom arm of "preview E" 70 in comparison to shading of other regions of the preview E.
  • the user selects the bottom arm of preview letter 70.
  • the user then instructs controller 60 to increase illumination of the region of letter 51 corresponding to the selected region of preview 70 by moving an appropriate slider displayed in control panel 66 by a desired amount.
  • Controller 60 controls luxels 32 corresponding to the selected region responsive to the amount by which the user moved the slider and displays an adjusted preview of letter 51 on the output screen. If the user determines, responsive to the adjusted preview, that illumination of letter 51 requires further adjustment, the user again selects a region of the letter in the preview to be adjusted and indicates a desired adjustment of the illumination using displayed control panel 66. If the user determines that further adjustment is not required, the user instructs camera 20 to image letter 51.
  • controller 60 receives signals from pixels 24 in photosurface 22 responsive to light that is incident on the pixels. In some embodiments of the present invention, controller 60 uses signals from pixels 24 to calibrate light output of luxels 32 that correspond to the pixels. The user controls camera 20 to image a suitable surface having uniform reflectivity that fills a field of view of camera 20. Controller 60 uses the signals to determine correspondence between control settings of luxels 32 and light output of the luxels.
  • such measurement can also be used to determine overlap between luxels. For example, where individual pixels are selectively illuminated, the pixels are sequentially illuminated and the intensity of illumination acquired by the camera at adjacent pixels is determined. Similarly, when areas of the field of view are illuminated together, both edge fall-off and overlap can be determined in the same way. These measurements can be used to correct certain of the results in some embodiments of the invention.
  • controller 60 controls light provided by luxels 32 that correspond to the pixels, responsive to the signals. For example, assume that the user instructs controller 60 to increase brightness of a region of a scene being imaged by a desired amount. Controller 60 increases output intensity of the luxels until brightness of the region, as determined from signals from pixels 24, is of the desired magnitude.
  • controller 60 automatically adjusts illumination of a scene imaged by camera 20 responsive to a particular algorithm so as to provide a desired contrast of an imaged scene. For example, in accordance with an embodiment of the present invention, controller 60 may automatically reduce illumination of illumination zones in the scene corresponding to pixels 24 that generate signals indicating that intensity of light incident on the pixels is greater than a predetermined intensity. Similarly controller 60 can be programmed to increase illumination of illumination zones for which corresponding pixels 24 indicate that intensity of incident light is low.
  • controller 60 uses signals from pixels 24 to perform relatively more complicated processing of an image of a scene being imaged by camera 20. For example, controller 60 may control illumination of the scene to provide smoothing of the image or sharpening of the image. In some embodiments of the present invention controller 60 uses signals from pixels 24 to determine parameters, such as for example mean brightness and/or brightness range, of a brightness histogram of the imaged scene. Controller 60 then adjusts illumination of the scene so that the parameters in the image of the scene assume desired values. In some embodiments of the present invention a "histogram adjustment", such as a histogramic equalization of the image or a region of the image, is performed by controller 60 for a single histogram of total integrated light intensity reaching pixels 24.
  • controller 60 performs brightness histogram adjustments for each color light provided by pixelated illuminator 30. For example, if the pixelated illuminator provides RGB light, histogram adjustments are made for each of R, G and B light.
  • camera 20, in accordance with an embodiment of the present invention, provides accurate determination of reflectance of regions in a scene being imaged by the camera and, as a result, high resolution processing of the imaged scene.
  • An aspect of some embodiments of the invention is concerned with the acquisition of high gray level resolution images utilizing a camera having a lower gray level resolution.
  • An exemplary method for acquiring such an image is shown in Fig. 4.
  • a first image is acquired with uniform lighting (200).
  • the brightness values in the image are used to determine illumination values for the pixels (202) such that the lighting, which can illuminate the various regions of the field of view differently, gives a relatively flat (contrast-wise) image.
  • These illumination values are stored (204). If the acquisition has a resolution of N gray levels (from zero to some maximum), the illumination may have up to N' levels spaced at intervals such that for each gray level N there exists an illumination N' for which N·N' = C, where C is a constant that is the same for all of the pixels. Under this flattening illumination, a second image is acquired (208).
  • the intensity variations in this image are less than ±0.5 of the intensity distance between the initial gray levels.
  • the acquisition range is set (206) to cover this range of gray levels with substantially the entire range of acquisition levels.
  • the acquisition values are preferably stored (210). If M acquisition levels and P illumination levels are available, each pixel (i, j) is characterized by two numbers m_ij and p_ij. The gray levels of a uniformly illuminated image can thus (ideally) be derived to a resolution given by M×P gray levels.
  • the illumination is controlled only over regions of greater than one pixel.
  • the second image will have a variability that is larger than the variability obtained when the individual pixels are separately illuminated.
  • the field of view is divided into Q regions. A uniformly lit image is acquired. The acquired intensity over each region is averaged and the illumination is adjusted such that the acquired intensity (for a subsequent image) over all of the regions (based on the averages) is uniform to within the quantization accuracies of the illumination and the acquisition.
  • each pixel is characterized by two numbers, one for the illumination and one for the acquisition.
  • the number of levels for illumination and acquisition need not be the same. Thus, for example, if there are half as many illumination levels as there are acquisition levels, the illumination levels are matched to every second acquisition level, such that the total range of illuminations for the second acquisition is ±1 original gray level. The second acquisition is set to cover this range with the entire range of acquisition values.
  • a very high resolution gray level (uniformly illuminated) image can be constructed (212) by generating an image in which each pixel has a brightness derived from the illumination level and the acquired brightness level.
  • the first P bits of the image gray level would be given by the illumination values and the next M bits of the image gray level would be given by the acquisition levels of the second acquired image.
  • two matrices are provided, for example in a memory in controller 60, one of which contains information on the illumination and the other information on the second acquisition.
  • an A/D may be used that allows for adjusting the zero level for the acquisition (so that the acquisition starts at the gray level of the pixel with the lowest illumination) and the A/D gain, so that the highest acquisition gray level is matched with the brightest portion of the image.
  • An iris may be provided to at least partially adjust the gain.
  • the illumination elements may illuminate pixels or groups of pixels without overlap. In general, however, illumination of a pixel (for individual pixel illumination) or of a region (for regional pixel illumination) will overlap at least partly into one or more neighboring pixels.
  • the reconstruction optionally takes into account the effects of illumination of adjacent pixels when determining the illumination to be used and in reconstructing the image.
  • the overlap is adjusted such that, for uniform brightness of the sources, each pixel is uniformly illuminated. This can be achieved, for example, by providing for Gaussian roll off of the illumination, with the half illumination point at the edge of the portion being illuminated.
  • the method may be used for acquiring only a portion of a scene or for acquiring a scene in piecewise portions.
  • Fig. 2 schematically shows a camera 71 similar to camera 20 comprising an illumination system 72, in accordance with an embodiment of the present invention.
  • Illumination system 72 comprises a pixelated illuminator 74 having luxels 76, a beam splitter 78 and a light source 80 that provides light that is incident on beam splitter 78.
  • whereas light source 80 is schematically shown, by way of example, as an incandescent lamp, light source 80 may comprise any suitable light source, such as, for example, a laser, an array of lasers, a flash lamp, an arc source or an integrating sphere.
  • Light source 80 may also comprise optical elements required to collimate and/or direct light from the light source so that the light is appropriately incident on beam splitter 78.
  • Beam splitter 78 reflects a portion of the light so that it is incident on pixelated illuminator 74.
  • dashed lines 41 and associated arrowheads indicate direction of travel of light along selected light rays.
  • each luxel 76 in pixelated illuminator 74 comprises a liquid crystal (LC) cell (not shown) having a front surface that faces beam splitter 78 and its own reflecting electrode (not shown) at a back surface of the cell.
  • Light from light source 80 that enters a luxel 76 enters the LC cell of the luxel and is reflected back out the luxel by the electrode.
  • Voltage applied to the luxel's electrode controls transmittance of the luxel's LC cell and therefore intensity of light that exits the luxel.
  • Light that exits the luxel 76 is incident on beam splitter 78, which transmits some of the light to beam splitter 34.
  • Beam splitter 34 in turn reflects a portion of the light to illuminate an illumination zone of a scene being imaged with camera 71 that corresponds to the luxel.
  • Controller 60 controls voltages applied to the pixel electrodes of luxels 76 and controls thereby light from the luxels that illuminates a scene being imaged with camera 71.
  • light source 80 provides white light and each pixel electrode reflects substantially only R, G or B light.
  • a luxel 76 of pixelated illuminator 74 therefore provides either R, G or B light.
  • R, G and B luxels 76 may be grouped into groups of three luxels, each group comprising an R, G and B luxel and each such group corresponding to a single (color) pixel 24 in photosurface 22. Assuming that each pixel 24 comprises an R, G and B photosensor, preferably, a configuration of the photosensors in the pixel is homologous with a configuration of the R, G and B luxels in the group of luxels corresponding to the pixel.
  • virtual images of R, G and B luxels are slightly defocused so that each luxel 76 in a group of luxels illuminates the appropriate photosensor in the corresponding pixel 24.
  • Controller 60 controls intensity, hue and saturation of illumination of the region by controlling intensity of light from each of the R, G and B luxels 76 that illuminate the illumination zones.
  • light source 80 provides R, G and B light, using for example a color wheel, and pixel electrodes of luxels 76 are "white" reflectors characterized by a substantially same reflectivity for R, G and B light.
  • Each luxel 76 optionally corresponds to a single pixel 24 and a region of a scene imaged on the pixel 24 corresponds to the illumination zone of the scene illuminated by the luxel 76.
  • Controller 60 may control light source 80 to illuminate pixelated illuminator sequentially with R, G and B light. Controller 60 controls transmittance of the LC cells in luxels 76 in synchrony with R, G and B light from light source 80 to control hue, saturation and intensity of illumination of illumination zones in the imaged scene.
  • the pixelated illuminator is a DMD device and each luxel 76 comprises a micromirror (not shown). Controller 60 controls the micromirrors, using methods known in the art, to control intensity of light from light source 80 that is provided by each luxel 76.
  • Fig. 3 schematically shows a camera 100 having a photosurface 22 and comprising an illumination system 102 having a pixelated illuminator 74 comprising luxels 76, a light source 80 and a beam splitter 78, in accordance with an embodiment of the present invention.
  • Light from a scene imaged by camera 100 is focused by optics represented by a lens 26 onto photosurface 22.
  • a field of view 38 of camera 100 in an object plane 36 of the camera is located at an appropriate position in the scene.
  • Pixelated illuminator 74 is not boresighted with photosurface 22 and illumination system 102 comprises optics, represented by a lens 104, that focuses light from luxels 76 onto a scene imaged by camera 100.
  • dashed lines 41 and associated arrowheads indicate direction of travel of light along selected light rays.
  • light from light source 80 is directed to pixelated illuminator 74 by beam splitter 78.
  • Each of luxels 76 reflects a controlled amount of light incident on the luxel from light source 80 towards optics 104.
  • Optics 104 focuses the light from the luxel to an illumination zone in field of view 38.
  • Optimally, optics 104 forms an image of pixelated illuminator 74 at object plane 36, which is substantially coincident with field of view 38.
  • illumination system 102 comprises a controller 60 that controls illumination system optics 104. Controller 60 adjusts optics 104 so that as camera 100 images scenes at different distances from the camera and distance of field of view 38 from the camera changes, light from pixelated illuminator 74 is appropriately focused to the field of view.
  • illumination system 102 is similar to and functions similarly to illumination systems 21 and 72. Variations of illumination system 102, in accordance with embodiments of the present invention, can be used to provide gray level and/or color illumination of a scene imaged by camera 100 and/or perform any of the illumination adjustments described with respect to illumination systems 21 and 72 and variations thereof. Similarly, whereas illumination system 102 is shown comprising light source 80 and a "reflecting type" pixelated illuminator, other types of non-boresighted illumination systems, in accordance with an embodiment of the present invention, are possible and can be advantageous.
  • a non-boresighted illumination system in accordance with an embodiment of the present invention, may be without a light source 80 and comprise a pixelated illuminator that emits light instead of one that reflects light.
  • Any of the boresighted illumination systems and variations thereof described above can, in accordance with an embodiment of the present invention, be configured as a non-boresighted illumination system.
  • illumination system 102 is shown, by way of example, comprising a visual display 64 for communicating with a user.
  • controller 60 controls pixelated illuminator 74 to illuminate an imaged scene with a particular spatial and/or temporal "calibration" pattern of light. Controller 60 correlates response of pixels 24 in photosurface 22 with the calibration light pattern to determine alignment of pixelated illuminator 74 relative to photosurface 22 and determine which pixels 24 in photosurface 22 image each of the illumination zones of a scene illuminated by the illumination system. By way of example, controller 60 may turn on and turn off a pattern of luxels 76 in pixelated illuminator 74 so that the luxels in the pattern "flicker" and illuminate illumination zones of a scene imaged by camera 100 with flickering light.
  • Controller 60 determines which pixels 24 in photosurface 22 generate signals indicating that light incident on the pixels is flickering in cadence with the turning on and turning off of luxels 76 in the pattern. Controller 60 associates "flickering" photosurface pixels 24 with their corresponding luxels 76 and illumination zones in the scene. By determining which flickering luxels 76 correspond to which "flickering" photosurface pixels 24, controller 60 determines alignment of pixelated illuminator 74 relative to photosurface 22 and generates a "map" of which pixels in photosurface 22 image each of the illumination zones of a scene illuminated by the illumination system. Controller 60 uses the map to determine which luxels 76 to control to provide desired lighting of the scene (a sketch of this correlation procedure follows this list).
  • each of the verbs "comprise", "include" and "have", and conjugates thereof, are used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.
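The flicker-based calibration described in the two items above amounts to correlating each luxel's on/off cadence with the signals of every photosurface pixel. The following Python/NumPy sketch shows one way such a mapping might be computed; the `set_luxel` and `grab_frame` callables, the frame count and the correlation scoring are illustrative assumptions rather than details taken from this disclosure.

```python
import numpy as np

def map_luxels_to_pixels(set_luxel, grab_frame, num_luxels, num_frames=8):
    """Associate each luxel with the photosurface pixel whose signal varies
    in cadence with that luxel's flicker (illustrative sketch only).

    set_luxel(i, on) -- hypothetical driver call turning luxel i on or off
    grab_frame()     -- hypothetical capture returning a 2D intensity array
    """
    pattern = np.resize([1.0, 0.0], num_frames)       # the flicker cadence
    mapping = {}
    for i in range(num_luxels):
        frames = []
        for on in pattern:
            set_luxel(i, bool(on))
            frames.append(grab_frame().astype(float))
        stack = np.stack(frames)                      # (num_frames, rows, cols)
        # Score each pixel by correlating its time series with the cadence;
        # the best-scoring pixel is taken to image that luxel's zone.
        score = np.tensordot(pattern - pattern.mean(),
                             stack - stack.mean(axis=0), axes=(0, 0))
        mapping[i] = np.unravel_index(np.argmax(score), score.shape)
    return mapping
```

Toggling one luxel at a time is the simplest variant; the spatial patterns mentioned in the text would allow many luxels to be identified per frame sequence, at the cost of a more elaborate decoding step.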

Abstract

A camera comprising: a photosurface comprising light sensitive pixels that generate signals responsive to light incident thereon and optics that focus an image of a scene onto the photosurface; and an illumination system for illuminating the scene, the illumination system comprising: an illuminator having a plurality of substantially contiguous, independently controllable light providing regions; optics that focuses an image of the illuminator on the scene so that light from the illuminator illuminates substantially all of and substantially only the field of view of the camera; and a controller that controls light provided by each light providing region.

Description

CAMERA HAVING A THROUGH THE LENS PIXEL ILLUMINATOR
The present invention relates to imaging a scene with light and in particular to methods and apparatus for illuminating the imaged scene. BACKGROUND OF THE INVENTION
Satisfactory and effective imaging of a scene with light requires proper illumination of the scene. In many instances, however, achieving proper illumination is a relatively complicated and arduous task. Often it is desirable and/or necessary to illuminate different regions of a scene with different intensity and/or color light. Sometimes the dynamic range of a camera limits contrast in an image of a scene, and the brightest and dimmest regions of the scene cannot be properly imaged at the same time. Sometimes lighting and color of illumination must be adjusted to provide desired effects in an image.
For example, in imaging a product for a catalogue, lighting usually has to be adjusted to increase illumination of certain regions of the product while decreasing illumination of other regions of the product. To enhance images of the product so that they are appropriately flattering, color of light used to illuminate the product may have to be adjusted and substantial effort invested in arranging direction and intensity of indirect lighting of the product.
In imaging items for visual inspection, lighting generally has to be adjusted to assure that inspected features of the items are appropriately visible to enable proper inspection. Often, to distinguish features of an inspected item, color of light illuminating a first region of the item is optionally adjusted to be different from color of light illuminating a second region of the item.
In some applications it may be desirable to adjust illumination of a scene so that light reaching a camera from all regions of the scene has substantially a same intensity or that intensities from different regions in the scene vary within a relatively narrow intensity range. For example, PCT Publications WO97/01111, WO97/01112, and WO97/01113, the disclosures of which are incorporated herein by reference, describe gated 3D cameras that measure distances to regions in a scene. In a described 3D camera, a train of light pulses is radiated by a light source to illuminate the scene. Following each light pulse the camera is gated open to receive light reflected by regions in the scene from the light pulse and image the received light on a photosurface comprised in the camera. Distance to a region is determined from timing of the gates with respect to emission times of the light pulses and a total amount of light from the region that is received and imaged during the gates. In some instances, differences in reflectivity of different regions of the scene and/or differences in distances of different regions in the scene from the camera result in substantial differences in intensity of reflected light reaching the camera. For example, intensity of light received from a region in a scene is a function of a solid angle subtended by the region at the camera. Since the solid angle is inversely proportional to the square of the distance from the camera, small differences in distances from the camera of regions in the scene can generate large differences in intensity of light reaching the camera from the regions. The range of these intensity differences can sometimes approach or exceed the dynamic range of the camera. As a result, accuracy of determination of distances to regions in the scene can vary substantially. In some instances, as a result of intensity differences it may not be possible to determine distances to all regions of the scene at a same time.
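To give a feeling for the magnitudes involved, a short illustrative calculation (the distances are assumed values, not figures from this document):

```python
# Received intensity scales as 1/d**2, via the solid angle the region
# subtends at the camera. Two equally reflective regions at assumed
# distances of 1 m and 4 m:
d_near, d_far = 1.0, 4.0
ratio = (d_far / d_near) ** 2
print(f"intensity ratio: {ratio:.0f}x")  # 16x, i.e. roughly 4 bits of dynamic range
```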
SUMMARY OF THE INVENTION An aspect of some embodiments of the present invention relates to providing a camera, comprising an improved illumination system for illuminating a scene imaged with the camera. The illumination system illuminates regions of the scene that are imaged on contiguous regions of the photosurface independently of each other. Hereinafter, regions of a scene that are independently illuminated in accordance with an embodiment of the present invention are referred to as "illumination zones".
According to an aspect of some embodiments of the present invention the size of an illumination zone is such that the illumination zone is imaged on a single pixel of the photosurface.
According to an aspect of some embodiments of the present invention the size of an illumination zone is such that the illumination zone is imaged on a group of pixels on the photosurface. In some embodiments of the present invention all illumination zones of a scene are imaged on regions of the photosurface having a same size and shape. In some embodiments of the present invention different illumination zones are imaged on different size and/or shape regions of the photosurface.
An aspect of some embodiments of the present invention relates to adjusting intensity of light that illuminates each illumination zone of a scene independently of other illumination zones in the scene. An aspect of some embodiments of the present invention relates to adjusting color of light that illuminates each illumination zone of a scene independently of other illumination zones in the scene.
A camera in accordance with an embodiment of the present invention comprises a photosurface on which light from a scene imaged with the camera is focused and an illumination system having a spatially modulated light source boresighted with the photosurface. The light source, hereinafter referred to as a "pixelated illuminator", comprises a plurality of independently controllable light providing regions, hereinafter referred to as
"luxels". The size, location and orientation of the pixelated illuminator and optics used to boresight the pixelated illuminator with the photosurface are such that a virtual image of the pixelated illuminator is substantially coincident with the photosurface. A virtual image of each luxel of the pixelated illuminator is therefore located in a different corresponding region of the photosurface and light from virtual images of any two luxels does not appear to pass through or emanate from a same region of the photosurface. In an embodiment of the invention, the pixelated illuminator is boresighted with the photosurface using methods and devices described in US Patent Application, number 09/250,322, the disclosure of which is incorporated herein by reference.
The region of the photosurface on which a virtual image of a luxel is located images a region of the scene that is an illumination zone of the scene corresponding to the luxel. Light from the luxel illuminates only the illumination zone that is imaged on the region of the photosurface in which the virtual image of the luxel is located.
In some embodiments of the present invention, the camera comprises a controller that controls intensity of light provided by a luxel so as to control intensity of illumination of the illumination zone corresponding to the luxel. In some embodiments of the present invention the controller controls the luxel to adjust hue and/or saturation of light illuminating the illumination zone.
According to an aspect of some embodiments of the present invention, the controller receives signals generated by a pixel or pixels of the photosurface on which an illumination zone of a scene is imaged responsive to light incident on the pixel or pixels from the illumination zone. The controller controls light provided by the luxel that illuminates the illumination zone, responsive to the received signals. According to an aspect of some embodiments of the present invention the controller receives instructions from a user that define a desired hue, and/or saturation and/or intensity for light reaching the pixel or pixels of the photosurface from the illumination zone. The controller controls light provided by the luxel responsive to the instructions so that light that reaches the pixel or pixels has the desired characteristics defined by the user.
For example, assume a person uses a camera in accordance with an embodiment of the present invention to image a metal object that has highly reflecting areas. The person might prefer to reduce light reaching the camera from the highly reflecting areas to reduce glare and improve perceived quality of the image of the object. In accordance with an embodiment of the present invention, the person can accomplish this by instructing the controller to reduce, by a desired amount, intensity of illumination of illumination zones corresponding to pixels in the image that register light intensity greater than a maximum light intensity defined by the person. In some embodiments of the present invention the person can accomplish this by indicating on a display of a preview of the image, a region or regions for which the user wants to reduce brightness. After indicating the region or regions, in some embodiments, the user may indicate an amount by which the user wishes to reduce illumination of the region or regions on an appropriate control panel that is displayed with the preview. After the controller reduces illumination of illumination zones corresponding to the indicated region or regions, a new preview may be presented to the user, which if the user desires he or she can adjust again.
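A minimal sketch of one such feedback iteration, assuming a one-to-one correspondence between luxels and pixels and normalized luxel drive levels; the names, the user threshold and the 10% step are all illustrative:

```python
import numpy as np

def reduce_glare(frame, luxel_levels, max_intensity, step=0.1):
    """Dim the luxels whose corresponding pixels register more than the
    user-set maximum intensity (one iteration of the scheme sketched above)."""
    levels = luxel_levels.astype(float).copy()
    levels[frame > max_intensity] *= (1.0 - step)   # back off the hot zones
    return np.clip(levels, 0.0, 1.0)
```

Repeating this while re-imaging between iterations drives the over-bright illumination zones below the user's maximum, after which a new preview can be shown.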
In some embodiments of the present invention hue and saturation of light illuminating a scene being imaged using the camera are adjusted similarly. For example, assuming the pixelated illuminator illuminates illumination zones with R, G or B light, the user can adjust color of the image by adjusting intensity, using for example methods similar to those described above, of R, G and/or B light illuminating illumination zones of the scene.
In some embodiments of the present invention, the virtual image of a luxel on the photosurface is substantially coincident with a single pixel on the photosurface. The luxel therefore illuminates a region, i.e. an illumination zone, of a scene imaged on the photosurface that is imaged on the single pixel on the photosurface. In some embodiments of the present invention, the virtual image of a luxel is substantially coincident with a group of pixels on the photosurface and the luxel illuminates an illumination zone of a scene that is imaged on the group of pixels. In some embodiments of the present invention luxels are not all the same size and/or shape and two different luxels may illuminate different size and/or shape illumination zones in a scene. In some embodiments of the present invention virtual images of luxels on the photosurface are smaller than a pixel on the photosurface and a plurality of virtual images of luxels are located within a single pixel of the photosurface. In some embodiments of the present invention, a camera comprises an illumination system having a pixelated illuminator that is not boresighted with a photosurface of the camera. Instead, the illumination system comprises its own optics that focuses light from the pixelated illuminator on a scene being imaged by the camera. The camera may comprise a controller that controls the illumination system optics so that the light from the pixelated illuminator is focused to the scene being imaged by the camera.
Light providing systems and devices suitable for use as a pixelated illuminator for practice of the present invention are available in the market. In particular, certain types of light projectors used for projection display of images are suitable for use in practice of the present invention. These light projectors comprise luxels that are independently controllable to provide light at desired intensities and colors. The luxels can be made as small as pixels in a typical CCD or CMOS camera.
US Patent 4,680,579 to E. O. Granville, the disclosure of which is incorporated herein by reference, describes a light projector that uses a digital mirror device (DMD) in which micromirrors function as luxels and reflect light of desired intensity and color provided by a suitable light source. US Patent 5,844,588 to Anderson, the disclosure of which is incorporated herein by reference, describes a DMD device used in an illumination system for a xerographic printer. Texas Instruments markets DMD light projectors and image projection systems based on DMDs. JVC markets a D-ILA (Direct, Direct Drive or Digital, Image Light Amplifier) light projector and image projection systems in which liquid crystal (LC) pixels are controlled to project light of desired intensity and color. Various projection light sources are described in a technical article entitled "Display Technologies" available during April 2000 from the URL site "www.extron.com", the disclosure of which is incorporated herein by reference. An aspect of some embodiments of the invention is concerned with the acquisition of high gray level resolution images utilizing a camera having a lower gray level resolution.
In an exemplary embodiment of the invention, a first image of a scene is acquired utilizing uniform lighting. The lighting, which can be controlled to illuminate the various regions of the field of view differently, is adjusted to give a relatively flat (contrast-wise) image. If the acquisition has a resolution of N gray levels (from zero to some maximum), the illumination may have up to N' levels spaced at intervals such that for each gray level N there exists an illumination N' such that N·N' = C, where C is a constant that is the same for all of the pixels. Thus, if the illumination is matched to intensity measured on the initial image acquisition, and a second image is acquired, the intensity variations are less than ±0.5 of the intensity distance between the initial gray levels. However, in an exemplary embodiment of the invention, the acquisition range is set to cover this range of gray levels with substantially the entire range of acquisition levels. If M acquisition levels and P illumination levels are available, each pixel (i, j) is characterized by two numbers m_ij and p_ij. The gray levels of a uniformly illuminated image can thus be derived to a resolution given by M×P gray levels.
In some embodiments of the invention, the illumination is controlled only over regions of greater than one pixel. In this case, the second image will have a variability that is larger than the variability obtained when the individual pixels are separately illuminated. In one exemplary embodiment, the field of view is divided into Q regions. A uniformly lit image is acquired. The acquired intensity over each region is averaged and the illumination is adjusted such that the acquired intensity (for a subsequent image) over all of the regions (based on the averages) is uniform to within the quantization accuracies of the illumination and the acquisition. Again, each pixel is characterized by two numbers, one for the illumination and one for the acquisition.
The number of levels for illumination and acquisition need not be the same. Thus, for example, if there are half as many illumination levels as there are acquisition levels, the illumination levels are matched to every second acquisition level, such that the total range of illuminations for the second acquisition is ±1 original gray level. The second acquisition is set to cover this range with the entire range of acquisition values.
A very high resolution gray level (uniformly illuminated) image can be constructed utilizing these embodiments by generating an image in which each pixel has a brightness derived from the illumination level and the acquired brightness level. Thus, the first P bits of the image gray level would be given by the illumination values and the next M bits of the image gray level would be given by the acquisition levels of the second acquired image. Conveniently, two matrices are provided, one of which contains information on the illumination and the other information on the second acquisition.
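The scheme can be illustrated end to end with a small idealized simulation. This sketch assumes a noise-free linear sensor, per-pixel illumination control, and deliberately small level counts M = P = 16 for readability; the particular flattening rule and rescaling are plausible choices, not ones prescribed by the text:

```python
import numpy as np

rng = np.random.default_rng(0)
M, P = 16, 16                                # acquisition and illumination levels (assumed)
scene = rng.random((8, 8)) * 0.9 + 0.05      # "true" reflectances in (0, 1)

# First acquisition under full uniform illumination, quantized to M levels.
first = np.minimum((scene * M).astype(int), M - 1)

# Flattening illumination: aim every pixel at the brightness of the dimmest
# one, so that reflectance times illumination is constant up to quantization.
target = (first.min() + 1) / M
estimate = (first + 1) / M
p = np.clip(np.round(target / estimate * P).astype(int), 1, P)   # stored levels

# Second acquisition: stretch the residual range [lo, hi] over all M levels,
# mimicking the adjustable-zero, adjustable-gain A/D discussed next.
flat = scene * p / P
lo, hi = flat.min(), flat.max()
second = np.minimum(((flat - lo) / (hi - lo + 1e-12) * M).astype(int), M - 1)

# Reconstruction: undo the known illumination. The two stored numbers per
# pixel (p and second) now resolve up to M*P gray levels instead of M.
recovered = (lo + (second + 0.5) / M * (hi - lo)) * P / p
print(np.abs(recovered - scene).max())       # error well below 1/M
```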
While a fixed A/D for the acquisition may be used, in some embodiments of the invention an A/D is used that allows for adjusting the zero level for the acquisition (so that the acquisition starts at the gray level of the pixel with the lowest illumination) and the A/D gain, so that the highest acquisition gray level is matched with the brightest portion of the image. An iris may be provided to at least partially adjust the gain.
The above discussion describes a system in which the illumination elements illuminate pixels or groups of pixels without overlap. In general, illumination of a pixel (for individual pixel illumination) or illumination of a region (for regional pixel illumination) will overlap at least partly into one or more neighboring pixels. In such cases, the reconstruction optionally takes into account the effects of illumination of adjacent pixels when determining the illumination to be used and in reconstructing the image. Optimally, for regional illumination, the overlap is adjusted such that, for uniform brightness of the sources, each pixel is uniformly illuminated. This can be achieved, for example, by providing for Gaussian roll off of the illumination, with the half illumination point at the edge of the portion being illuminated.
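The half-maximum roll-off condition is easy to state in code. A sketch for a one-dimensional strip of abutting regions, with assumed region width and spacing:

```python
import numpy as np

def rolloff(x, center, half_width):
    """Gaussian illumination profile that falls to half its peak intensity
    at the edge of the region, i.e. at |x - center| == half_width."""
    sigma = half_width / np.sqrt(2.0 * np.log(2.0))
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

x = np.linspace(0.0, 10.0, 1001)
half_width = 1.0
centers = np.arange(1.0, 10.0, 2.0 * half_width)     # abutting regions
total = sum(rolloff(x, c, half_width) for c in centers)
# Away from the ends of the strip, `total` stays within a few percent of a
# constant: abutting half-maximum Gaussians sum to near-uniform illumination.
```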
There is therefore provided in accordance with an embodiment of the present invention a camera comprising: a photosurface comprising light sensitive pixels that generate signals responsive to light incident thereon and optics that focus an image of a scene onto the photosurface; and an illumination system for illuminating the scene, the illumination system comprising: an illuminator having a plurality of substantially contiguous, independently controllable light providing regions; optics that focuses an image of the illuminator on the scene so that light from the illuminator illuminates substantially all of and substantially only the field of view of the camera; and a controller that controls light provided by each light providing region.
Optionally, the illuminator is boresighted with the photosurface so that a virtual image of the illuminator is located at the photosurface and optics that image the scene on the photosurface image the illuminator on the scene. Optimally, a virtual image of the illuminator is substantially coincident with the photosurface.
In some embodiments of the present invention light from each of the light providing regions that is reflected from the scene is imaged on a different group of contiguous pixels in the photosurface.
In some embodiments of the present invention light from each of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a region of the photosurface having a size substantially equal to the size of a pixel in the photosurface. In an embodiment of the invention, the region on which the light is imaged is located substantially within the area of a pixel on the photosurface. In some embodiments of the present invention the regions of the photosurface on which light from at least two adjacent light providing regions of the illuminator is imaged overlap.
In some embodiments of the present invention there is substantially little overlap of regions of the photosurface on which light from adjacent light providing regions of the illuminator is imaged.
In some embodiments of the present invention light from a plurality of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a same region of the photosurface having a size substantially equal to the size of a pixel in the photosurface. Optimally, the region on which the light from the plurality of light providing regions is imaged is located substantially within the area of a pixel on the photosurface.
In some embodiments of the present invention all light providing regions provide light characterized by a substantially same spectrum. Optionally, each light providing region provides white light. In some embodiments of the present invention each light providing region is controllable to provide light in at least two different wavelength bands of light. In an exemplary embodiment, each light providing region is controllable to provide R, G and B light.
In some embodiments of the present invention the plurality of light providing regions comprises three light providing regions. In an exemplary embodiment, each light providing region provides a different one of R, G or B light.
In some embodiments of the present invention the controller controls intensity of R, G or B light provided by each light providing region so as to control hue, saturation or intensity of light illuminating a region of the scene which is illuminated by light from the light providing region independently of hue, saturation and intensity of light illuminating other regions of the scene.
In some embodiments of the present invention the controller controls light provided by at least one light providing region to control an intensity distribution of R, G or B light reflected by the scene that is incident on pixels of the photosurface. In some embodiments of the present invention the controller controls intensity of light provided by each light providing region so as to control intensity of light illuminating a region of the scene illuminated by light from the light providing region independently of intensity of light illuminating other regions of the scene. In some embodiments of the present invention the controller controls light provided by at least one light providing region to control an intensity distribution of light reflected by the scene that is incident on pixels of the photosurface.
In some embodiments of the present invention the controller receives signals generated by each pixel responsive to light incident on the pixel and controls light provided by at least one of the light providing regions responsive to the signals to adjust illumination of the scene.
Optionally, the controller controls light provided by at least one light providing region to decrease intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is greater than a predetermined maximum intensity. Alternatively or additionally, the controller controls light provided by at least one light providing region to increase intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is less than a predetermined minimum intensity.
In some embodiments of the present invention the camera comprises a user operated input terminal to transmit instructions to the controller and wherein the controller controls light from at least one of the light providing regions responsive to the instructions to adjust illumination of the scene. Optionally, the input terminal comprises a display screen.
Optionally, the controller generates a preview of an image of the scene being imaged by the camera on the screen, responsive to signals that it receives from the pixels. Optionally, the screen is a touch sensitive screen on which a border can be delineated surrounding an object or region in the preview of the scene to select the object or region by touching the screen.
Alternatively or additionally the camera comprises a pointer controllable to select an object or region in the preview. In some embodiments of the present invention the controller controls light provided by light providing regions responsive to the selected region or object. In some embodiments of the present invention the controller displays a control panel having control icons on the screen and wherein manipulation of the control icons transmits instructions to the controller.
In some embodiments of the present invention the camera comprises a shutter that the controller opens and closes to gate the photosurface on and off. In some embodiments of the present invention pixels in the photosurface are controllable to be gated on and off and wherein the controller gates the photosurface on and off by gating the pixels.
Alternatively or additionally the controller may simultaneously turn on and turn off light providing regions so as to radiate at least one pulse of light having a pulse width that illuminates the scene. Optionally, the controller gates the photosurface on and off at times responsive to a time at which the at least one light pulse is radiated. Optionally, the controller gates the photosurface off when the illumination system is providing light to illuminate the scene. Alternatively or additionally the at least one light pulse comprises a train of light pulses. Optionally, the controller gates the photosurface on for a first gate period after a first time lapse following each radiated light pulse of a first plurality of radiated light pulses in the train of light pulses; and gates the photosurface on for a second gate period after a second time lapse following each radiated light pulse of a second plurality of radiated light pulses. Optimally, the mid points of the first and second gate periods are delayed with respect to the radiated light pulses that they respectively follow by the same amount of time. Optionally, the duration of the first gate period is greater than or equal to three times the pulse width. Optionally, the duration of the second gate period is substantially equal to the pulse width of the radiated light pulses. Generally, the controller determines intensity of light reaching the camera from a region of the scene from an amount of light incident on a pixel that images the region. Optimally, the controller controls light provided by the light providing regions so as to minimize a difference between a predetermined light intensity and intensity of light reaching each pixel in the photosurface from the scene. The controller may determine a distance to a region of the scene imaged on a pixel of the photosurface responsive to amounts of light incident on the pixel during the first and second gate periods.
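For idealized rectangular pulses and gates, the distance recovery from the two gate periods can be sketched as below. The linear overlap relation is a common gated-imaging model and an assumption here; the exact relations of the referenced PCT publications may differ.

```python
C_LIGHT = 3.0e8   # speed of light, m/s

def gated_distance(q_long, q_short, gate_open, pulse_width):
    """Estimate distance from the two gates described above.

    q_long    -- light collected during the long gate (>= 3x pulse width),
                 taken to be substantially all of the reflected pulse
    q_short   -- light collected during the short gate (== pulse width)
    gate_open -- time after pulse emission at which the short gate opens (s)
    """
    fraction = q_short / max(q_long, 1e-12)   # overlap of pulse and short gate
    # Assuming the reflected pulse arrives at or after the gate opening, the
    # overlap fraction falls linearly with round-trip time:
    round_trip = gate_open + (1.0 - fraction) * pulse_width
    return 0.5 * C_LIGHT * round_trip
```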
In some embodiments of the present invention the light providing regions are light sources that emit light. Optionally, each light providing region comprises a laser. Alternatively, each light providing region comprises a light emitting diode.
In some embodiments of the present invention the camera comprises a light source and the light providing regions provide light by reflecting and modulating light from the light source that is incident on the light providing regions. In an embodiment of the invention, each light providing region comprises a micromirror controllable to reflect light in a direction towards the optics that focuses an image of the illuminator on the scene. Alternatively, each light providing region may comprise a liquid crystal cell having a controllable transmittance and a mirror, wherein light from the light source enters the liquid crystal cell and is reflected back out of the cell by the mirror in a direction towards the optics that focuses an image of the illuminator on the scene. There is further provided, in accordance with an embodiment of the present invention, a method for illuminating a scene being imaged by a camera, the camera comprising a photosurface having pixels and optics for imaging the scene on the photosurface, the method comprising: illuminating the scene with an illuminator comprising a plurality of light providers so that each region of the scene that is imaged on a region of the photosurface comprising at least one pixel but less than all the pixels of the photosurface is illuminated with light provided by at least one light provider that provides light substantially only for the region; and controlling light provided by the at least one light provider for a region of the scene to adjust illumination of the scene. The at least one pixel may be a single pixel. In some embodiments of the present invention the method comprises boresighting the illuminator so that a virtual image of the illuminator is located substantially at the photosurface.
Optimally, the virtual image of the illuminator is substantially coincident with the photosurface. In some embodiments of the present invention a virtual image of each light provider is located within a different group of contiguous pixels on the photosurface. In some embodiments of the present invention a virtual image of each light provider is located substantially within a corresponding single pixel. In some embodiments of the present invention a virtual image of a plurality of light providers is located substantially within a corresponding single pixel. In some embodiments of the present invention the light providers are independently controllable and wherein controlling each of the light providing regions comprises controlling each of the light providing regions independently of the other light providing regions.
In some embodiments of the present invention controlling light comprises controlling the light to control contrast in an image of the scene provided by the camera. In some embodiments of the present invention controlling contrast comprises controlling contrast in only a localized region of the image.
In some embodiments of the present invention controlling light comprises controlling the light to adjust an intensity distribution of light from the scene that is incident on pixels in the photosurface. Optionally, adjusting an intensity distribution comprises controlling a parameter of the intensity distribution.
In some embodiments of the present invention controlling light comprises controlling the light to perform a predetermined image processing procedure. In some embodiments of the present invention the image processing procedure comprises unsharp masking. In some embodiments of the present invention the image processing procedure comprises sharpening. In some embodiments of the present invention the image processing procedure comprises smoothing. In some embodiments of the present invention the image processing procedure comprises histogramic equalization.
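Taking histogramic equalization as an example, the controller would compute from one exposure a per-zone illumination correction to apply on the next. The disclosure names the goal but not a particular algorithm, so the gain mapping and clipping bounds below are assumptions:

```python
import numpy as np

def equalizing_gain(image, levels=256):
    """Per-pixel illumination gain that nudges the next exposure toward a
    histogramically equalized image (illustrative sketch only)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / hist.sum()
    target = cdf[np.clip(image.astype(int), 0, levels - 1)] * (levels - 1)
    gain = (target + 1.0) / (image.astype(float) + 1.0)
    return np.clip(gain, 0.1, 10.0)   # keep within assumed luxel drive limits
```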
There is further provided, in accordance with an embodiment of the invention, a method of imaging, comprising: illuminating at least a portion of a scene with substantially uniform lighting; determining a spatially varying illumination that reduces the variability of brightness values of the at least portion of the scene; illuminating the at least portion with said varying illumination; acquiring a second image of the at least portion, said second image having brightness values for areas thereof; and determining the brightness values of the scene under uniform illumination from the brightness values and the varying values of the illumination.
Optionally, illuminating comprises illuminating utilizing the method of any of claims 50-57.
In an embodiment of the invention, acquiring the second image comprises acquiring the images using a brightness acquisition range limited to the range of brightness values in the image.
BRIEF DESCRIPTION OF FIGURES The following description describes examples of embodiments of the present invention and should be read with reference to figures attached hereto. In the figures, identical structures, elements or parts that appear in more than one figure are generally labeled with the same numeral in all the figures in which they appear. Dimensions of components and features shown in the figures are chosen for convenience and clarity of presentation and are not necessarily shown to scale. The figures are listed below.
Fig. 1 schematically shows a camera comprising an illumination system having a pixelated illuminator boresighted with a photosurface of the camera, in accordance with an embodiment of the present invention;
Fig. 2 schematically shows another camera comprising an illumination system having a light projector and a "boresighted" pixelated illuminator, in accordance with an embodiment of the present invention; Fig. 3 schematically shows a camera comprising an illumination system in which a pixelated illuminator of the illumination system is not boresighted with a photosurface of the camera, in accordance with an embodiment of the present invention; and
Fig. 4 is a flow chart of a method for providing a high gray level resolution image, in accordance with an exemplary embodiment of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS Fig. 1 schematically shows a camera 20 comprising an illumination system 21, in accordance with an embodiment of the present invention.
Camera 20 comprises a photosurface 22, such as a CCD or CMOS light sensitive surface, having pixels 24 and optics represented by a lens 26 for focusing an image of a scene on photosurface 22. Whereas, in Fig. 1 (and in Figs. 2 and 3 discussed below) a single lens 26 represents the optics, the optics may comprise any suitable optical system, which may have a plurality of lenses and optical elements, as known in the art, for focusing images on photosurface 22. Illumination system 21 comprises a pixelated illuminator 30 having a plurality of light providing luxels 32. Pixelated illuminator 30 is boresighted with photosurface 22 using an appropriate beam splitter 34 and optical elements (not shown) as might be required, so that optimally, a virtual image of pixelated illuminator 30 is substantially coincident with photosurface 22. It is to be noted that because lens 26 is used both to focus an image of a scene being imaged with camera 20 on photosurface 22 and to focus light from pixelated illuminator 30 on the scene, the virtual image of pixelated illuminator 30 remains automatically registered to photosurface 22 during zooming operations of the camera.
In general the size of pixelated illuminator 30 does not have to be the same as the size of photosurface 22. However, in order for a virtual image of pixelated illuminator 30 to be substantially coincident with photosurface 22, pixelated illuminator 30 or a real or virtual image of the pixelated illuminator that is boresighted with the photosurface must have a same size and shape as the photosurface. In Fig. 1 and figures that follow, for convenience of exposition, pixelated illuminator 30 is shown, by way of example, as having a same size as photosurface 22. Pixelated illuminator 30 is therefore shown boresighted with photosurface 22 without an optical system that provides an appropriate real or virtual image of the pixelated illuminator that has substantially a same size as photosurface 22. Pixelated illuminator 30 may be boresighted with photosurface 22 using methods similar to those described in US Patent Application 09/250,322 referenced above. By way of example, luxels 32, of pixelated illuminator 30, are assumed to have a size and shape so that a virtual image of each luxel 32 is substantially coincident with a single pixel 24 of photosurface 22. There is therefore (since a virtual image of pixelated illuminator
30 is substantially coincident with photosurface 22) a luxel 32 for each pixel 24 of photosurface 22.
A field of view of camera 20, at an object plane 36 of the camera located in a scene being imaged by the camera, is represented by a rectilinear mesh 38 comprising mesh squares 40. Mesh squares 40 are homologous with pixels 24 such that light that is collected by lens 26 that appears to come from any particular mesh square 40 is focused by the lens onto a single pixel 24 of photosurface 22 that corresponds to the particular mesh square. In exemplary embodiments of the invention, light provided by each luxel 32 is reflected by beam splitter 34 and illuminates substantially only, and all of, a single mesh square 40. The mesh square illuminated by the luxel 32 corresponds to the pixel 24 with which the virtual image of the luxel is coincident. Light from luxel 32 illuminates substantially only, and substantially all of, a single mesh square 40 as a result of the assumption, by way of example, that a virtual image of each luxel 32 is substantially coincident with a single pixel 24. In some embodiments of the present invention, size and shape of luxels 32 and position of pixelated illuminator 30 are such that light from adjacent luxels 32 overlaps adjacent mesh squares 40. Illumination zones of the scene being imaged are those regions of the scene that can be illuminated by light from a single luxel 32. Boundaries of illumination zones of the scene are generally either coincident with or close to projections of boundaries of mesh squares 40 on the scene.
Dashed lines 41 in Fig. 1 represent selected light rays between corresponding points at peripheries of photosurface 22, pixelated illuminator 30 and field of view 38. Arrowheads on dashed lines represent direction of travel of light. For lines 41 that have arrows pointing in both directions along the lines, arrowheads pointing towards field of view 38 indicate direction of light from pixelated illuminator 30 that illuminates the field of view. Arrowheads pointing away from field of view 38 indicate light reflected from field of view 38 that is focused on photosurface 22.
To illustrate the relationship between luxels 32, pixels 24 in photosurface 22, mesh squares 40 and illumination zones of a scene being imaged by camera 20, assume that a letter "E" 51 represents the scene. Assume that letter 51 is coplanar with object plane 36 and located within field of view 38. (Letter 51 has a proper orientation as seen from camera 20 and appears reversed because, from the perspective of the reader, it is seen from the back.) In Fig. 1, because letter 51 is coplanar with object plane 36, and virtual images of luxels 32 are coincident with pixels 24, illumination zones of letter 51 are coincident with mesh squares 40 and adjacent illumination zones of letter 51 are contiguous and do not overlap. If letter 51 were not planar and parallel to object plane 36, the illumination zones of letter 51 would not of course be squares but would be defined by generally curvilinear boundaries determined by the shape of the surface of letter 51 (and the shape of luxels 32). If letter 51 were displaced away from object plane 36 in a direction perpendicular to object plane 36, adjacent illumination zones would overlap along their mutual borders, with the amount of overlap varying with the displacement. For example if the displacement of letter 51 were rectilinear and did not comprise a rotation, the overlap would increase linearly with magnitude of the displacement.
In Fig. 1, the widths of the vertical "back" and horizontal "arms" of letter 51 are shown as having a width equal to a side of a mesh square 40 for convenience of presentation. The back and arms of letter 51 can of course be many mesh squares 40 wide and/or the widths can be equal to a fractional number of mesh squares or it can be offset from edges of the mesh squares. Luxels 32 in pixelated illuminator 30 that illuminate illumination zones of letter 51 are shaded. The shaded luxels 32 form an illumination pattern "E" 52, inverted top to bottom and reversed left to right, which corresponds to letter 51 in field of view 38. Similarly, shaded pixels 24 in photosurface 22 indicate pixels 24 on which letter 51 in field of view 38 is imaged by camera 20, and form an image letter "E", 53 on the photosurface. Letter 53 in photosurface 22 is a reflection, in beam splitter 34, of letter 52 in pixelated illuminator 30.
Light from each shaded luxel 32 in letter 52 illuminates a single corresponding illumination zone 40 of letter 51 in field of view 38. (For convenience, since illumination zones of letter 51 are coincident with mesh squares 40, the illumination zones are referred to by the numeral 40, which identifies mesh squares 40.) For example, light provided by the luxel 32 at the tip of the short middle arm of letter 52 on pixelated illuminator 30 illuminates only that illumination zone 40 located at the tip of the short middle arm in letter 51.
In accordance with some embodiments of the present invention, luxels 32 are controllable to provide desired patterns of illumination for a scene that is imaged with camera 20. In some embodiments of the present invention, luxels 32 are "color" luxels, for which light intensity as well as a spectrum of light provided by the pixels can be controlled to provide different color light. For example, in some embodiments of the present invention, luxels 32 are RGB luxels controllable to provide different combinations of R, G and B light so as to control hue, saturation and intensity of light illuminating illumination zones of the scene. In some embodiments of the present invention, luxels 32 are "gray level" luxels that provide light having a fixed spectrum and for which only intensity of light provided by the pixels is varied. In some embodiments of the present invention light provided by luxels 32 is not limited to the visible spectrum but may, for example, comprise infrared or ultraviolet light.
Luxels 32 may provide and control light by emission, reflection or transmission, or a combination of these processes. For example, luxels 32 may comprise one or a combination of light sources, such as for example semiconductor lasers, that provide light by emission, micromirrors that provide light by reflection and/or liquid crystal cells that provide light by transmission. In some embodiments of the invention, the luxels may provide a multi-level controlled illumination.
Illumination system 21 may comprise a controller 60 for controlling luxels 32 and a shutter 62 for shuttering photosurface 22. Controller 60 shutters light provided by pixelated illuminator 30 so as to provide at least one light pulse to illuminate a scene being imaged with camera 20. In some embodiments of the present invention controller 60 shutters light from pixelated illuminator 30 by turning individual luxels 32 on and off. In some embodiments of the present invention, camera 20 comprises a shutter, which is controlled by controller 60, to transmit and not transmit light in order to shutter light from pixelated illuminator 30, and provide at least one light pulse that illuminates a scene imaged by the camera. The shutter may, for example, be located between pixelated illuminator 30 and beam splitter 34. If pixelated illuminator 30 is an illuminator that provides light to illuminate a scene by reflecting light from a light source, such as the illuminator shown in Fig. 2, the shutter may be located between the light source and the illuminator.
Optionally, controller 60 controls shutter 62 so that during a time that luxels 32 are "on" to provide a pulse of light, shutter 62 does not transmit light. This reduces exposure of photosurface 22 to light from pixelated illuminator 30 that may reach photosurface 22 without being reflected from a scene being imaged by camera 20, as a result for example of reflections or "haloing" of light from the illuminator by internal structures of camera 20. The intensity of the light may be varied in one or more of the following ways:
(1) The number of pulses emitted from each of the luxels may be varied. Thus, for example, if 256 levels of illumination are desired, 256 pulses of light should be available for each frame (image), and the light level for each luxel is adjusted by varying the number of pulses. (A sketch of this approach appears after this list.)
(2) A spatial attenuator, having a controllable number of attenuation levels, is used to adjust the illumination level.
(3) The length of the light pulses is adjusted.
(4) The intensity of the light pulses is adjusted, as, for example, by adjusting the current to laser luxel sources.
Other methods of varying intensity may be used.
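By way of illustration of method (1), the following is a minimal sketch of per-luxel pulse-count scheduling, assuming a frame divided into 256 pulse slots and a 2D array of desired levels; the names `PULSE_SLOTS` and `pulse_schedule` are illustrative inventions, not part of the described system.

```python
# Minimal sketch, assuming a frame divided into 256 pulse slots; a luxel
# at level n fires during n of the slots. Names are illustrative.

PULSE_SLOTS = 256  # one slot per available illumination level

def pulse_schedule(levels):
    """levels: 2D list of desired gray levels (0..255), one per luxel.
    Returns, for each pulse slot, the set of luxel coordinates that
    fire during that slot."""
    schedule = []
    for slot in range(PULSE_SLOTS):
        firing = {(i, j)
                  for i, row in enumerate(levels)
                  for j, level in enumerate(row)
                  if slot < level}
        schedule.append(firing)
    return schedule

# A 2x2 illuminator: full brightness, off, and two partial levels.
demo = pulse_schedule([[255, 0], [64, 128]])
assert sum((0, 0) in s for s in demo) == 255
assert sum((1, 1) in s for s in demo) == 128
```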
In some embodiments of the present invention, shutter 62 is gated open for a limited, accurately determined "gate" period of time, during which shutter 62 transmits light, following an accurately determined delay from the time that each light pulse of the at least one light pulse is radiated. Light pulse width, repetition rate, gate width and relative timing of gates and light pulses, hereinafter collectively referred to as "illumination range parameters", are determined so that light from the light pulse that reaches photosurface 22 is reflected only by objects in a limited desired range of distances from camera 20. The range of distances defines a scene having a desired depth of field and distance (i.e. range) from the camera.
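As a rough numerical sketch of how gate timing selects a range of distances (light reflected from distance d returns after a round-trip time 2d/c), assuming idealized rectangular pulses and gates; the function `range_gate` and its parameters are illustrative, not the patent's prescription:

```python
# Hedged sketch: gate timing that admits only returns from objects
# between d_min and d_max, assuming idealized rectangular pulses and
# gates. Names are illustrative.

C = 299_792_458.0  # speed of light in m/s

def range_gate(d_min: float, d_max: float, pulse_width: float):
    """Return (gate_delay, gate_width) in seconds: open the gate when
    the leading edge of a return from d_min can first arrive, and keep
    it open until the trailing edge of a return from d_max has passed."""
    t_open = 2.0 * d_min / C
    t_close = 2.0 * d_max / C + pulse_width
    return t_open, t_close - t_open

delay, width = range_gate(d_min=5.0, d_max=7.0, pulse_width=10e-9)
print(f"gate opens {delay * 1e9:.1f} ns after the pulse, stays open {width * 1e9:.1f} ns")
```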
Various methods known in the art may be used in the practice of the present invention to determine illumination range parameters appropriate to desired range and depth of field for a scene to be imaged by camera 20. Relationships between range parameters and range and depth of field of a scene are described in PCT Publications WO97/01111, WO97/01112, and WO97/01113 referenced above.
Whereas in some embodiments of the present invention range parameters are determined and the determined range parameters define a range and depth of field for a scene, in other embodiments of the present invention, distance to a scene is determined and range parameters are determined responsive to the distance. The distance may be determined, for example, using devices and methods for determining range by focus, by any of various time of flight techniques, by techniques used in gated 3D cameras or by geometrical techniques. Techniques for determining distances to regions in a scene are described in PCT applications PCT/IL98/00476 and PCT/IL98/00611, the disclosures of which are incorporated herein by reference. Gated 3D cameras and associated techniques are described in PCT Publications WO 97/01111, WO 97/01112, WO 97/01113 and US patent application 09/250,322. Cameras using substantially "geometrical" methods for measuring distances to regions in a scene are described in PCT application PCT/IL97/00370, the disclosure of which is incorporated herein by reference.
In some embodiments of the present invention, camera 20 is a 3D camera and illumination parameters are determined to provide a 3D map of an imaged scene. The parameters may be determined using methods described in PCT Publications WO 97/01111, WO 97/01112 and WO 97/01113.
It should be noted that whereas in camera 20 photosurface 22 is shuttered by shutter 62, in some embodiments of the present invention, photosurface 22 is a "self-shuttering" photosurface for which shuttering of the photosurface is accomplished by controlling circuits comprised in the photosurface. PCT publication WO 00/19705, the disclosure of which is incorporated herein by reference, describes self-shuttering photosurfaces in which each pixel in the photosurface comprises a circuit controllable to gate the pixel on or off.
Just as different luxels 32 may, in accordance with embodiments of the present invention, provide light characterized by different spectral distributions, pixels 24 in photosurface 22 may have different types of spectral sensitivities. For example, in some embodiments of the present invention photosurface 22 is a gray level "colorblind" photosurface. All pixels 24 in the photosurface are gray level pixels that are sensitive to substantially only total intensity of light, in a same limited spectrum, incident on the pixels. In some embodiments of the present invention photosurface 22 is a "color photosurface" that provides a color image of a scene imaged by camera 20, and pixels 24 are color sensitive pixels, such as RGB pixels. In some embodiments of the present invention, pixels 24 are sensitive to infrared or ultraviolet light.
To illustrate some lighting effects that can be provided by an illumination system in accordance with an embodiment of the present invention, assume, by way of example, that pixelated illuminator 30 is an RGB color illuminator and that photosurface 22 is an RGB color photosurface. Assume further that the surface of letter 51 that is imaged by camera 20 is white and that the letter is set against a black background. By changing color of light provided by luxels 32 of pixelated illuminator 30, color of an image of letter 51, i.e. letter 53 formed on photosurface 22, provided by camera 20 can be changed. For example, if luxels 32 in illumination pattern 52 illuminate letter 51 with one of R, G or B light, image 53 on photosurface 22 will be respectively red, green or blue. It is of course also possible, in accordance with an embodiment of the present invention, for camera 20 to provide a "rainbow" colored image of letter 51 by controlling different luxels on pixelated illuminator 30 to provide light of different hues.
Light from luxels 32 can also be controlled to compensate for variations in reflectivity between different surface regions of letter 51. For example, assume that surface regions of the middle arm of letter 51 are faded and are characterized by relatively low reflectivity compared to other surface regions of letter 51. If intensity of illumination of the surface of letter 51 is substantially uniform, the middle short arm in an image of letter 51 provided by camera 20 will appear faded. To compensate, in accordance with an embodiment of the present invention, for the fading, intensity of light provided by luxels 32 in the middle arm of illumination pattern 52 (i.e. "E" 52) is increased relative to intensity of light provided by the other luxels 32 in illumination pattern 52.
"Extreme" lighting effects can also be provided by an illumination system in accordance with an embodiment of the present invention. For example, if luxels 32 in the middle and lower arms of illumination pattern 52 are turned off so that they do not provide any light, only the back and bottom arm of letter 51 will be illuminated and an image of letter
51 will appear to be an "L".
The above example of a scene comprising only an "E" is of course a very simplified scene, provided to facilitate clarity of exposition. Substantially more complicated and intricate scenes can be imaged with camera 20, and illumination system 21 can be used to provide substantially more subtle and complicated lighting effects than those described above.
For example, assume that a portrait of a young woman wearing large highly reflective gold earrings is being imaged with camera 20 and that the woman's face is surrounded by an arrangement of brightly colored flowers. If the woman's face is relatively pale, controller 60 can control light provided by pixelated illuminator 30 to "tint" her face so that it appears more vital. If the earrings appear too bright, controller 60 can control luxels 32 in pixelated illuminator 30 to decrease intensity of illumination of the earrings.
It is therefore seen that an illumination system, in accordance with an embodiment of the present invention, can control illumination of a scene being imaged by a camera so as to control both color and contrast of the image.
Optionally, camera 20 comprises a visual display screen 64 that a person using camera 20 employs to transmit information to and receive information from controller 60. When a scene is being imaged by camera 20, controller 60 may, for example, display a preview of the scene on screen 64. In some embodiments of the present invention, controller 60 also displays a control panel 66 having appropriate control buttons 68, such as toggle and slide buttons, and/or a keyboard (not shown) for communicating instructions to controller 60. In some embodiments of the invention, the user can indicate a region of the scene for which the user wishes to adjust illumination by selecting the region in the preview. Various methods known in the art may be used for selecting the region. For example, the user can touch the region with a finger or draw a contour around the region using a stylus. In some embodiments of the present invention camera 20 comprises a mouse that can be used to draw a suitable border around a region to be selected or to point to an object, for which object a region is automatically delineated using image processing methods well known in the art. Once a region is selected, the user indicates desired changes of illumination for the region by pressing or sliding appropriate control buttons shown in control panel 66.
In Fig. 1 screen 64 shows a preview letter E, 70 of letter 51 as it will be imaged by camera 20 under illumination conditions that obtain at the time that preview 70 is shown. By way of example, in preview 70, the bottom arm of the letter is faded, which is indicated by lighter shading of the bottom arm of "preview E" 70 in comparison to shading of other regions of the preview E. To improve an image of letter 51 provided by camera 20, the user selects the bottom arm of preview letter 70. In some embodiments of the present invention, the user then instructs controller 60 to increase illumination of the region of letter 51 corresponding to the selected region of preview 70 by moving an appropriate slider displayed in control panel 66 by a desired amount. Controller 60 controls luxels 32 corresponding to the selected region responsive to the amount by which the user moved the slider and displays an adjusted preview of letter 51 on the output screen. If the user determines, responsive to the adjusted preview, that illumination of letter 51 requires further adjustment, the user again selects a region of the letter in the preview to be adjusted and indicates a desired adjustment of the illumination using displayed control panel 66. If the user determines that further adjustment is not required, the user instructs camera 20 to image letter 51.
In some embodiments of the present invention, controller 60 receives signals from pixels 24 in photosurface 22 responsive to light that is incident on the pixels. In some embodiments of the present invention, controller 60 uses signals from pixels 24 to calibrate light output of luxels 32 that correspond to the pixels. The user controls camera 20 to image a suitable surface having uniform reflectivity that fills a field of view of camera 20. Controller 60 uses the signals to determine correspondence between control settings of luxels 32 and light output of the luxels.
In some embodiments of the invention, such measurements can also be used to determine overlap between luxels. For example, where individual pixels are selectively illuminated, the pixels are sequentially illuminated and the intensity of illumination acquired by the camera at adjacent pixels is determined. Similarly, when areas of the field of view are illuminated together, both edge fall-off and overlap can be determined in the same way. These measurements can be used to correct certain of the results in some embodiments of the invention. In some embodiments of the present invention controller 60 controls light provided by luxels 32 that correspond to the pixels, responsive to the signals. For example, assume that the user instructs controller 60 to increase brightness of a region of a scene being imaged by a desired amount. Controller 60 increases output intensity of the luxels until brightness of the region, as determined from signals from pixels 24, indicates that brightness of the region of the scene is of the desired magnitude.
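A minimal sketch of one such closed-loop adjustment follows; `set_luxel` and `read_pixel` are stand-ins for hardware access, and the 0-255 drive scale and step size are assumptions for illustration only.

```python
# Minimal sketch of closed-loop brightness adjustment. "set_luxel" and
# "read_pixel" stand in for hardware access; the drive scale is assumed.

def adjust_region(set_luxel, read_pixel, target, tol=2, max_steps=256):
    """Step a luxel's drive setting until the corresponding pixel
    reports brightness within `tol` of `target`."""
    setting = 128  # start mid-scale
    for _ in range(max_steps):
        set_luxel(setting)
        error = target - read_pixel()
        if abs(error) <= tol:
            break
        setting = max(0, min(255, setting + (1 if error > 0 else -1)))
    return setting

# Toy usage: a fake region whose measured brightness is 0.7 * setting.
state = {"setting": 0}
final = adjust_region(lambda s: state.update(setting=s),
                      lambda: 0.7 * state["setting"], target=90)
assert abs(0.7 * final - 90) <= 2
```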
In some embodiments of the present invention, controller 60 automatically adjusts illumination of a scene imaged by camera 20 responsive to a particular algorithm so as to provide a desired contrast of an imaged scene. For example, in accordance with an embodiment of the present invention, controller 60 may automatically reduce illumination of illumination zones in the scene corresponding to pixels 24 that generate signals indicating that intensity of light incident on the pixels is greater than a predetermined intensity. Similarly, controller 60 can be programmed to increase illumination of illumination zones for which corresponding pixels 24 indicate that intensity of incident light is low.
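Such a rule could be expressed, purely illustratively (thresholds, step size and the per-zone array layout are assumptions, not the patent's algorithm), as:

```python
# Illustrative sketch of the automatic rule: dim zones whose pixels read
# above a maximum, brighten zones that read below a minimum.

def auto_adjust(illum, measured, lo=30, hi=220, step=8):
    """Return a new per-zone illumination map nudged toward [lo, hi]."""
    out = []
    for drive_row, meas_row in zip(illum, measured):
        out.append([min(255, drive + step) if meas < lo
                    else max(0, drive - step) if meas > hi
                    else drive
                    for drive, meas in zip(drive_row, meas_row)])
    return out

print(auto_adjust([[128, 128]], [[10, 240]]))  # [[136, 120]]
```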
In some embodiments of the present invention, controller 60 uses signals from pixels 24 to perform relatively more complicated processing of an image of a scene being imaged by camera 20. For example, controller 60 may control illumination of the scene to provide smoothing of the image or sharpening of the image. In some embodiments of the present invention controller 60 uses signals from pixels 24 to determine parameters, such as for example mean brightness and/or brightness range, of a brightness histogram of the imaged scene. Controller 60 then adjusts illumination of the scene so that the parameters in the image of the scene assume desired values. In some embodiments of the present invention a "histogram adjustment", such as a histogramic equalization of the image or a region of the image, is performed by controller 60 for a single histogram of total integrated light intensity reaching pixels 24. In some embodiments of the present invention, controller 60 performs brightness histogram adjustments for each color light provided by pixelated illuminator 30. For example, if pixelated illuminator 30 provides RGB light, histogram adjustments are made for each of R, G and B light. It is to be noted that camera 20, in accordance with an embodiment of the present invention, provides accurate determination of reflectance of regions in a scene being imaged by the camera and, as a result, enables high resolution processing of a scene being imaged by the camera.
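As a hedged sketch of the histogram side of such processing, the routine below computes a classical histogram equalization mapping for a flat list of measured brightness levels; per the text, a system could apply it separately to R, G and B measurements. The function name and the 256-level scale are assumptions.

```python
# Hedged sketch: classical histogram equalization of measured levels.

def equalize(levels, n_levels=256):
    """Map each level through the cumulative distribution of the
    measured histogram, spreading the levels toward uniformity."""
    hist = [0] * n_levels
    for v in levels:
        hist[v] += 1
    total, cum, cdf = len(levels), 0, []
    for count in hist:
        cum += count
        cdf.append(cum / total)
    return [round(cdf[v] * (n_levels - 1)) for v in levels]

# Per-channel use: equalize R, G and B measurements independently.
r_eq = equalize([10, 10, 200, 250])
```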
An aspect of some embodiments of the invention is concerned with the acquisition of high gray level resolution images utilizing a camera having a lower gray level resolution. An exemplary method for acquiring such an image is shown in Fig. 4.
In accordance with the embodiment outlined in Fig. 4, a first image with uniform lighting is first acquired (200). The brightness values in the image are used to determine illumination values for the pixels (202) such that the lighting, which differently illuminates the various regions of the field of view, is adjusted to give a relatively flat (contrast-wise) image. These illumination values are stored (204). If the acquisition has a resolution of N gray levels (from zero to some maximum), the illumination may have up to N' illumination levels spaced at intervals such that for each gray level there exists a combination of N and N' such that N*N' = C, where C is a constant that is the same for all of the pixels. Under this flattening illumination, a second image is acquired (208). Ideally, the intensity variations in this image are less than ±0.5 of the intensity distance between the initial gray levels. In an exemplary embodiment of the invention, the acquisition range is set (206) to cover this range of gray levels with substantially the entire range of acquisition levels. The acquisition values are preferably stored (210). If M acquisition levels and P illumination levels are available, each pixel (i, j) is characterized by two numbers m_ij and p_ij. The gray levels of a uniformly illuminated image can thus (ideally) be derived to a resolution given by M×P gray levels.
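A minimal sketch of the flattening step (202), under the assumption that measured brightness scales with the product of reflectance and illumination level, so that the flattening illumination index is roughly C divided by the first-image gray level; the function name and quantization are illustrative:

```python
# Minimal sketch of step (202): choose, per pixel, the illumination
# index p_ij ~ C / g_ij from the uniformly lit first image, clamped to
# the available levels. Names and quantization are illustrative.

def flattening_levels(first_image, n_illum=256, C=255.0):
    """Return integer illumination indices approximating C / gray."""
    return [[max(1, min(n_illum - 1, round(C / max(g, 1))))
             for g in row]
            for row in first_image]

# A bright pixel gets weak illumination, a dark pixel strong.
print(flattening_levels([[200, 20]]))  # [[1, 13]]
```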
In some embodiments of the invention, the illumination is controlled only over regions of greater than one pixel. In this case, the second image will have a variability that is larger than the variability when the individual pixels are separately illuminated. In one exemplary embodiment, the field of view is divided into Q regions. A uniformly lit image is acquired. The acquired intensity over the regions is averaged and the illumination is adjusted such that the acquired intensity (for a subsequent image) over all of the regions (based on the averages) is uniform to within the quantization accuracies of the illumination and the acquisition. Again, each pixel is characterized by two numbers, one for the illumination and one for the acquisition.
The number of levels for illumination and acquisition need not be the same. Thus, for example, if there are half as many illumination levels as there are acquisition levels, the illumination levels are matched to every second acquisition level, such that the total range of illuminations for the second acquisition is ±1 original gray level. The second acquisition is set to cover this range with the entire range of acquisition values.
A very high resolution gray level (uniformly illuminated) image can be constructed (212) by generating an image in which each pixel has a brightness derived from the illumination level and the acquired brightness level. Thus, in a simple embodiment, the first P bits of the image gray level would be given by the illumination values and the next M bits of the image gray level would be given by the acquisition levels of the second acquired image. Conveniently, two matrices are provided, for example in a memory in controller 60, one of which contains information on the illumination and the other information on the second acquisition.
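A minimal sketch of this combining step (212), assuming, as the text suggests, that the illumination index supplies the high-order bits and the second acquisition the low-order bits of the reconstructed gray level; `combine` is an illustrative name:

```python
# Minimal sketch of step (212): illumination index in the top bits,
# second-acquisition level in the bottom m_bits.

def combine(p_ij: int, m_ij: int, m_bits: int) -> int:
    """Reconstructed gray level: p_ij in the top bits, m_ij in the
    bottom m_bits, giving M*P distinguishable levels overall."""
    return (p_ij << m_bits) | m_ij

# With 8 illumination bits and 8 acquisition bits, a 16-bit gray value:
assert combine(p_ij=3, m_ij=200, m_bits=8) == 3 * 256 + 200
```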
While a fixed A/D for the acquisition may be used, in some embodiments of the invention an A/D is used that allows for adjusting the zero level for the acquisition (so that the acquisition starts at the gray level of the pixel with the lowest illumination) and the A/D gain, so that the highest acquisition gray level is matched with the brightest portion of the image. An iris may be provided to at least partially adjust the gain. These methodologies, while they give higher gray level resolution, result in higher
The above discussion describes a system in which the illumination elements illuminate pixels or groups of pixels without overlap. In general, illumination of a pixel (for individual pixel illumination) or illumination of a region (for regional pixel illumination) will overlap at least partly into one or more neighboring pixels. In such cases, the reconstruction optionally takes into account the effects of illumination of adjacent pixels when determining the illumination to be used and in reconstructing the image. Optimally, for regional illumination, the overlap is adjusted such that, for uniform brightness of the sources, each pixel is uniformly illuminated. This can be achieved, for example, by providing for Gaussian roll off of the illumination, with the half illumination point at the edge of the portion being illuminated. It should be understood that the method may be used for acquiring only a portion of a scene or for acquiring a scene in piecewise portions.
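A short check of the stated roll-off condition, assuming a Gaussian profile that falls to half its peak at the region edge, so that two adjacent regions sum to full illumination at their shared boundary; the function name and geometry are illustrative:

```python
import math

# Small check of the half-illumination roll-off condition.

def profile(x, center, half_width):
    """Gaussian falloff whose value is 0.5 at center +/- half_width."""
    sigma = half_width / math.sqrt(2.0 * math.log(2.0))
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

# Regions of width 2 centered at 0 and 2 share the boundary x = 1,
# where each contributes one half of full illumination.
total = profile(1.0, 0.0, 1.0) + profile(1.0, 2.0, 1.0)
assert abs(total - 1.0) < 1e-12
```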
Fig. 2 schematically shows a camera 71 similar to camera 20 comprising an illumination system 72, in accordance with an embodiment of the present invention. Illumination system 72 comprises a pixelated illuminator 74 having luxels 76, a beam splitter 78 and a light source 80 that provides light that is incident on beam splitter 78. Whereas light source 80 is schematically shown, by way of example, as an incandescent lamp, light source 80 may comprise any suitable light source, such as for example a laser, an array of lasers, a flash lamp, an arc source or an integrating sphere. Light source 80 may also comprise optical elements required to collimate and/or direct light from the light source so that the light is appropriately incident on beam splitter 78. Beam splitter 78 reflects a portion of the light so that it is incident on pixelated illuminator 74. In Fig. 2 as in Fig. 1, dashed lines 41 and associated arrowheads indicate direction of travel of light along selected light rays.
In some embodiments of the present invention, each luxel 76 in pixelated illuminator 74 comprises a liquid crystal (LC) cell (not shown) having a front surface that faces beam splitter 78 and its own reflecting electrode (not shown) at a back surface of the cell. Light from light source 80 that enters a luxel 76 enters the LC cell of the luxel and is reflected back out of the luxel by the electrode. Voltage applied to the luxel's electrode controls transmittance of the luxel's LC cell and therefore intensity of light that exits the luxel. Light that exits the luxel 76 is incident on beam splitter 78, which transmits some of the light to beam splitter 34. Beam splitter 34 in turn reflects a portion of the light to illuminate an illumination zone of a scene being imaged with camera 71 that corresponds to the luxel. Controller 60 controls voltages applied to the pixel electrodes of luxels 76 and thereby controls light from the luxels that illuminates a scene being imaged with camera 71. In some embodiments of the present invention, light source 80 provides white light and each pixel electrode reflects substantially only R, G or B light. A luxel 76 of pixelated illuminator 74 therefore provides either R, G or B light. R, G and B luxels 76 may be grouped into groups of three luxels, each group comprising an R, G and B luxel and each such group corresponding to a single (color) pixel 24 in photosurface 22. (Assuming that each pixel 24 comprises an R, G and B photosensor, preferably, a configuration of the photosensors in the pixel is homologous with a configuration of the R, G and B luxels in the group of luxels corresponding to the pixel. In some embodiments of the present invention, virtual images of R, G and B luxels are slightly defocused so that each luxel 76 in a group of luxels illuminates the appropriate photosensor in the corresponding pixel 24.) Three illumination zones, an R, G and B illumination zone, therefore correspond to each region of a scene imaged by a pixel 24. Controller 60 controls intensity, hue and saturation of illumination of the region by controlling intensity of light from each of the R, G and B luxels 76 that illuminate the illumination zones.
In some embodiments of the present invention, light source 80 provides R, G and B light, using for example a color wheel, and pixel electrodes of luxels 76 are "white" reflectors characterized by a substantially same reflectivity for R, G and B light. Each luxel 76 optionally corresponds to a single pixel 24 and a region of a scene imaged on the pixel 24 corresponds to the illumination zone of the scene illuminated by the luxel 76.
Controller 60 may control light source 80 to illuminate pixelated illuminator 74 sequentially with R, G and B light. Controller 60 controls transmittance of the LC cells in luxels 76 in synchrony with R, G and B light from light source 80 to control hue, saturation and intensity of illumination of illumination zones in the imaged scene. In some embodiments of the present invention pixelated illuminator 74 is a DMD device and each luxel 76 comprises a micromirror (not shown). Controller 60 controls the micromirrors, using methods known in the art, to control intensity of light from light source 80 that is provided by each luxel 76.
Fig. 3 schematically shows a camera 100 having a photosurface 22 and comprising an illumination system 102 having a pixelated illuminator 74 comprising luxels 76, a light source 80 and a beam splitter 78, in accordance with an embodiment of the present invention. Light from a scene imaged by camera 100 is focused by optics represented by a lens 26 onto photosurface 22. A field of view 38 of camera 100 in an object plane 36 of the camera is located at an appropriate position in the scene. Pixelated illuminator 74 is not boresighted with photosurface 22, and illumination system 102 comprises optics, represented by a lens 104, that focuses light from luxels 76 onto a scene imaged by camera 100. In Fig. 3, as in Figs. 1 and 2, dashed lines 41 and associated arrowheads indicate direction of travel of light along selected light rays.
As in illumination system 72 shown in Fig. 2, light from light source 80 is directed to pixelated illuminator 74 by beam splitter 78. Each of luxels 76 reflects a controlled amount of light incident on the luxel from light source 80 towards optics 104. Optics 104 focuses the light from the luxel to an illumination zone in field of view 38. Optimally, optics 104 forms an image of pixelated illuminator 74 at object plane 36, which is substantially coincident with field of view 38.
In some embodiments of the present invention, illumination system 102 comprises a controller 60 that controls illumination system optics 104. Controller 60 adjusts optics 104 so that as camera 100 images scenes at different distances from the camera and distance of field of view 38 from the camera changes, light from pixelated illuminator 74 is appropriately focused to the field of view.
Except for pixelated illuminator 74 not being boresighted with photosurface 22 and features of illumination system 102 resulting therefrom, illumination system 102 is similar to and functions similarly to illumination systems 21 and 72. Variations of illumination system 102, in accordance with embodiments of the present invention, can be used to provide gray level and/or color illumination of a scene imaged by camera 100 and/or perform any of the illumination adjustments described with respect to illumination systems 21 and 72 and variations thereof. Similarly, whereas illumination system 102 is shown comprising light source 80 and a "reflecting type" pixelated illuminator, other types of non-boresighted illumination systems, in accordance with an embodiment of the present invention, are possible and can be advantageous. For example, a non-boresighted illumination system, in accordance with an embodiment of the present invention, may be without a light source 80 and comprise a pixelated illuminator that emits light instead of one that reflects light. Any of the boresighted illumination systems and variations thereof described above can, in accordance with an embodiment of the present invention, be configured as a non-boresighted illumination system. As in illumination systems 21 and 72, illumination system 102 is shown, by way of example, comprising a visual display 64 for communicating with a user.
In some embodiments of the present invention, controller 60 controls pixelated illuminator 74 to illuminate an imaged scene with a particular spatial and/or temporal "calibration" pattern of light. Controller 60 correlates response of pixels 24 in photosurface 22 with the calibration light pattern to determine alignment of pixelated illuminator 74 relative to photosurface 22 and determine which pixels 24 in photosurface 22 image each of the illumination zones of a scene illuminated by the illumination system. By way of example, controller 60 may turn on and turn off a pattern of luxels 76 in pixelated illuminator 74 so that the pixels in the pattern "flicker" and illuminate illumination zones of a scene imaged by camera 100 with flickering light. Controller 60 determines which pixels 24 in photosurface 22 generate signals indicating that light incident on the pixels is flickering in cadence with the turning on and turning off of luxels 76 in the pattern. Controller 60 associates "flickering" photosurface pixels 24 with their corresponding luxels 76 and illumination zones in the scene. By determining which flickering luxels 76 correspond to which "flickering" photosurface pixels 24, controller 60 determines alignment of pixelated illuminator 74 relative to photosurface 22 and generates a "map" of which pixels in photosurface 22 image each of the illumination zones of a scene illuminated by the illumination system. Controller 60 uses the map to determine which luxels 76 to control to provide desired lighting of the scene.
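A hedged sketch of this correlation follows, with simulated frame readout standing in for photosurface signals; the threshold, the function name and the frame format are illustrative assumptions:

```python
# Hedged sketch of the flicker correlation: flash a known pattern over
# several time slots and keep the pixels that toggle in cadence with it.

def map_flicker(frames, on_slots, threshold=50):
    """frames: one 2D brightness array per time slot. on_slots: slots
    during which the luxel pattern was lit. Returns the pixels whose
    readings follow the pattern's cadence."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return {(i, j)
            for i in range(rows)
            for j in range(cols)
            if all((frames[t][i][j] > threshold) == (t in on_slots)
                   for t in range(len(frames)))}

# Two frames, pattern lit only in slot 0; pixel (0, 0) follows it.
frames = [[[200, 10]], [[10, 10]]]
assert map_flicker(frames, on_slots={0}) == {(0, 0)}
```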
It is to be noted that whereas calibration of an illumination system and "mapping" of luxels and photosurface pixels is described for non-boresighted illumination system 102, calibration and mapping, in accordance with an embodiment of the present invention, may be similarly performed by boresighted illumination systems.
In the description and claims of the present application, each of the verbs "comprise", "include" and "have", and conjugates thereof, are used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.
The present invention has been described using detailed descriptions of embodiments thereof that are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the present invention utilize only some of the features or possible combinations of the features. Variations of embodiments of the present invention that are described and embodiments of the present invention comprising different combinations of features noted in the described embodiments will occur to persons of the art. The scope of the invention is limited only by the following claims.

Claims

1. A camera comprising: a photosurface comprising light sensitive pixels that generate signals responsive to light incident thereon and optics that focus an image of a scene onto the photosurface; and an illumination system for illuminating the scene, the illumination system comprising: an illuminator having a plurality of substantially contiguous, independently controllable light providing regions; optics that focuses an image of the illuminator on the scene so that light from the illuminator illuminates substantially all of and substantially only the field of view of the camera; and a controller that controls light provided by each light providing region.
2. A camera according to claim 1 wherein the illuminator is boresighted with the photosurface so that a virtual image of the illuminator is located at the photosurface and optics that image the scene on the photosurface image the illuminator on the scene.
3. A camera according to claim 2 wherein a virtual image of the illuminator is substantially coincident with the photosurface.
4. A camera according to any of claims 1-3 wherein light from each of the light providing regions that is reflected from the scene is imaged on a different group of contiguous pixels in the photosurface.
5. A camera according to any of claims 1-3 wherein light from each of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a region of the photosurface having a size substantially equal to the size of a pixel in the photosurface.
6. A camera according to claim 5 wherein the region on which the light is imaged is located substantially within the area of a pixel on the photosurface.
7. A camera according to any of claims 4-6 wherein the regions of the photosurface on which light from at least two adjacent light providing regions of the illuminator is imaged overlap.
8. A camera according to any of claims 4-6 wherein there is substantially little overlap of regions of the photosurface on which light from adjacent light providing regions of the illuminator is imaged.
9. A camera according to any of claims 1-3 wherein light from a plurality of the light providing regions that is reflected from the scene and imaged by the camera is imaged on a same region of the photosurface having a size substantially equal to the size of a pixel in the photosurface.
10. A camera according to claim 9 wherein the region on which the light from the plurality of light providing regions is imaged is located substantially within the area of a pixel on the photosurface.
11. A camera according to any of claims 1-9 wherein all light providing regions provide light characterized by a substantially same spectrum.
12. A camera according to claim 11 wherein each light providing region provides white light.
13. A camera according to any of claims 1-9 wherein each light providing region is controllable to provide light in at least two different wavelength bands of light.
14. A camera according to claim 13 wherein each light providing region is controllable to provide R, G and B light.
15. A camera according to claim 9 or claim 10 wherein the plurality of light providing regions comprises three light providing regions.
16. A camera according to claim 15 wherein each light providing region provides a different one of R, G or B light.
17. A camera according to claim 14 or claim 16 wherein the controller controls intensity of R, G or B light provided by each light providing region so as to control hue, saturation or intensity of light illuminating a region of the scene which is illuminated by light from the light providing region independently of hue, saturation and intensity of light illuminating other regions of the scene.
18. A camera according to claim 14 or claim 16 wherein the controller controls light provided by at least one light providing region to control an intensity distribution of R, G or B light reflected by the scene that is incident on pixels of the photosurface.
19. A camera according to any of the preceding claims wherein the controller controls intensity of light provided by each light providing region so as to control intensity of light illuminating a region of the scene illuminated by light from the light providing region independently of intensity of light illuminating other regions of the scene.
20. A camera according to any of the preceding claims wherein the controller controls light provided by at least one light providing region to control an intensity distribution of light reflected by the scene that is incident on pixels of the photosurface.
21. A camera according to any of the preceding claims wherein the controller receives signals generated by each pixel responsive to light incident on the pixel and controls light provided by at least one of the light providing regions responsive to the signals to adjust illumination of the scene.
22. A camera according to claim 21 wherein the controller controls light provided by at least one light providing region to decrease intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is greater than a predetermined maximum intensity.
23. A camera according to claim 21 or claim 22 wherein the controller controls light provided by at least one light providing region to increase intensity of light reaching the camera from a region of the scene for which intensity of light incident on a pixel that images the region is less than a predetermined minimum intensity.
24. A camera according to any of claims 21-23 wherein the camera comprises a user operated input terminal to transmit instructions to the controller and wherein the controller controls light from at least one of the light providing regions responsive to the instructions to adjust illumination of the scene.
25. A camera according to claim 24 wherein the input terminal comprises a display screen.
26. A camera according to claim 25 wherein the controller generates a preview of an image of the scene being imaged by the camera on the screen, responsive to signals that it receives from the pixels.
27. A camera according to claim 26 wherein the screen is a touch sensitive screen on which a border can be delineated surrounding an object or region in the preview of the scene to select the object or region by touching the screen.
28. A camera according to claim 26 or claim 27 and comprising a pointer controllable to select an object or region in the preview.
29. A camera according to claim 27 or claim 28 wherein the controller controls light provided by light providing regions responsive to the selected region or object.
30. A camera according to any of claims 25 - 29 wherein the controller displays a control panel having control icons on the screen and wherein manipulation of the control icons transmits instructions to the controller.
31. A camera according to any of the preceding claims and comprising a shutter that the controller opens and closes to gate the photosurface on and off.
32. A camera according to any of the preceding claims wherein pixels in the photosurface are controllable to be gated on and off and wherein the controller gates the photosurface on and off by gating the pixels.
33. A camera according to claim 31 or claim 32 wherein the controller simultaneously turns on and turns off light providing regions so as to radiate at least one pulse of light having a pulse width that illuminates the scene.
34. A camera according to claim 33 wherein the controller gates the photosurface on and off at times responsive to a time at which the at least one light pulse is radiated.
35. A camera according to claim 34 wherein the controller gates the photosurface off when the illumination system is providing light to illuminate the scene.
36. A camera according to claim 34 or claim 35 wherein the at least one light pulse comprises a train of light pulses.
37. A camera according to claim 36 wherein the controller gates the photosurface on for a first gate period after a first time lapse following each radiated light pulse of a first plurality of radiated light pulses in the train of light pulses; and gates the photosurface on for a second gate period after a second time lapse following each radiated light pulse of a second plurality of radiated light pulses.
38. A camera according to claim 37 wherein the mid points of first and second gate periods are delayed with respect to the radiated light pulses that they respectively follow by the same amount of time.
39. A camera according to claim 38 wherein the duration of the first gate is greater than or equal to three times the pulse width.
40. A camera according to claim 39 wherein the duration of the second gate period is substantially equal to the pulse width of the radiated light pulses.
41. A camera according to claim 40 wherein the controller determines intensity of light reaching the camera from a region of the scene from an amount of light incident on a pixel that images the region.
42. A camera according to claim 41 wherein the controller controls light provided by the light providing regions so as to minimize a difference between a predetermined light intensity and intensity of light reaching each pixel in the photosurface from the scene.
43. A camera according to claim 42 wherein the controller determines a distance to a region of the scene imaged on a pixel of the photosurface responsive to amounts of light incident on the pixel during the first and second gates.
44. A camera according to any of the preceding claims wherein the light providing regions are light sources that emit light.
45. A camera according to claim 44 wherein each light providing region comprises a laser.
46. A camera according to claim 44 wherein each light providing region comprises a light emitting diode.
47. A camera according to any of claims 1-43 and comprising a light source and wherein the light providing regions provide light by reflecting and modulating light from the light source that is incident on the light providing regions.
48. A camera according to claim 47 wherein each light providing region comprises a micromirror controllable to reflect light in a direction towards the optics that focuses an image of the illuminator on the scene.
49. A camera according to claim 47 wherein each light providing region comprises a liquid crystal having a controllable transmittance and a mirror and wherein light from the light source enters the liquid crystal and is reflected back out the liquid crystal cells by the mirror in a direction towards the optics that focuses an image of the illuminator on the scene.
50. A method for illuminating a scene being imaged by a camera, the camera comprising a photosurface having pixels and optics for imaging the scene on the photosurface, the method comprising: illuminating the scene with an illuminator comprising a plurality of light providers so that each region of the scene that is imaged on a region of the photosurface comprising at least one pixel but less than all the pixels of the photosurface is illuminated with light provided by at least one light provider that provides light substantially only for the region; and controlling light provided by the at least one light provider for a region of the scene to adjust illumination of the scene.
51. A method according to claim 50 wherein the at least one pixel is a single pixel.
52. A method according to claim 50 and comprising boresighting the illuminator so that a virtual image of the illuminator is located substantially at the photosurface.
53. A method according to claim 52 wherein the virtual image of the illuminator is substantially coincident with the photosurface.
54. A method according to claim 53 wherein a virtual image of each light provider is located within a different group of contiguous pixels on the photosurface.
55. A method according to claim 53 wherein a virtual image of each light provider is located substantially within a corresponding single pixel.
56. A method according to claim 53 wherein a virtual image of a plurality of light providers is located substantially within a corresponding single pixel.
57. A method according to any of claims 50-56 wherein the light providers are independently controllable and wherein controlling each of the light providing regions comprises controlling each of the light providing regions independently of the other light providing regions.
58. A method according to any of claims 50-57 wherein controlling light comprises controlling the light to control contrast in an image of the scene provided by the camera.
59. A method according to claim 58 wherein controlling contrast comprises controlling contrast in only a localized region of the image.
60. A method according to any of claims 50-59 wherein controlling light comprises controlling the light to adjust an intensity distribution of light from the scene that is incident on pixels in the photosurface.
61. A method according to claim 60 wherein adjusting an intensity distribution comprises controlling a parameter of the intensity distribution.
62. A method according to any of claims 50-61 wherein controlling light comprises controlling the light to perform a predetermined image processing procedure.
63. A method according to claim 62 wherein the image processing procedure comprises unsharp masking.
64. A method according to claim 62 wherein the image processing procedure comprises sharpening.
65. A method according to claim 62 wherein the image processing procedure comprises smoothing.
66. A method according to claim 62 wherein the image processing procedure comprises histogramic equalization.
67. A method of imaging, comprising: illuminating at least a portion of a scene with substantially uniform lighting; determining a spatially varying illumination that reduces the variability of brightness values of the at least portion of the scene; illuminating the at least portion with said varying illumination; acquiring a second image of the at least portion, said second image having brightness values for areas thereof; and determining the brightness values of the scene under uniform illumination from the brightness values and the varying values of the illumination.
68. A method according to claim 67 wherein illuminating comprises illuminating utilizing the method of any of claims 50-57.
69. A method according to claim 67 or claim 68 wherein acquiring the second image comprises acquiring the images using a brightness acquisition range limited to the range of brightness values in the image.
70. A camera according to any of claims 1-49 utilizing the method of any of claims 50-69.
PCT/IL2000/000404 1999-02-16 2000-07-09 Camera having a through the lens pixel illuminator WO2002005549A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US10/332,643 US7355648B1 (en) 1999-02-16 2000-07-09 Camera having a through the lens pixel illuminator
JP2002509283A JP2004503188A (en) 2000-07-09 2000-07-09 Camera with through-the-lens illuminator
EP00942347A EP1302066A1 (en) 2000-07-09 2000-07-09 Camera having a through the lens pixel illuminator
PCT/IL2001/000627 WO2002004247A1 (en) 2000-07-09 2001-07-09 Method and apparatus for providing adaptive illumination
US10/332,646 US6993255B2 (en) 1999-02-16 2001-07-09 Method and apparatus for providing adaptive illumination
AU2001269414A AU2001269414A1 (en) 2000-07-09 2001-07-09 Method and apparatus for providing adaptive illumination
IL15385701A IL153857A0 (en) 2000-07-09 2001-07-09 Method and apparatus for providing adaptive illumination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/250,322 US6445884B1 (en) 1995-06-22 1999-02-16 Camera with through-the-lens lighting

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/250,322 Continuation-In-Part US6445884B1 (en) 1995-06-22 1999-02-16 Camera with through-the-lens lighting

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10332646 Continuation-In-Part 2001-07-09

Publications (1)

Publication Number Publication Date
WO2002005549A1 true WO2002005549A1 (en) 2002-01-17

Family

ID=22947259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2000/000404 WO2002005549A1 (en) 1999-02-16 2000-07-09 Camera having a through the lens pixel illuminator

Country Status (2)

Country Link
US (4) US6445884B1 (en)
WO (1) WO2002005549A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005073776A1 (en) * 2004-01-28 2005-08-11 Leica Microsystems Cms Gmbh Microscope system and method for shading correction of lenses present in the microscope system
WO2006085834A1 (en) * 2005-01-28 2006-08-17 Microvision, Inc. Method and apparatus for illuminating a field-of-view and capturing an image
EP1793588A1 (en) * 2004-09-21 2007-06-06 Nikon Corporation Mobile information device
US7428997B2 (en) 2003-07-29 2008-09-30 Microvision, Inc. Method and apparatus for illuminating a field-of-view and capturing an image
EP1978725A3 (en) * 2007-04-03 2009-06-03 Delphi Technologies, Inc. Synchronous imaging using segmented illumination
JP2011519430A (en) * 2008-03-18 2011-07-07 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Calibration camera for spectral depth
EP2461574A1 (en) * 2010-12-03 2012-06-06 Research In Motion Limited Dynamic lighting control in hybrid camera-projector device
US8285133B2 (en) 2010-12-03 2012-10-09 Research In Motion Limited Dynamic lighting control in hybrid camera-projector device

Families Citing this family (118)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6445884B1 (en) * 1995-06-22 2002-09-03 3Dv Systems, Ltd. Camera with through-the-lens lighting
US20040114035A1 (en) * 1998-03-24 2004-06-17 Timothy White Focusing panel illumination method and apparatus
EP1515541A3 (en) * 1998-09-28 2008-07-23 3DV Systems Ltd. Distance measurement with a camera
JP3346337B2 (en) * 1999-05-07 2002-11-18 ミノルタ株式会社 Digital camera
AU5646299A (en) * 1999-09-08 2001-04-10 3Dv Systems Ltd. 3d imaging system
US6856355B1 (en) * 1999-11-30 2005-02-15 Eastman Kodak Company Method and apparatus for a color scannerless range image system
EP1160725A3 (en) * 2000-05-09 2002-04-03 DaimlerChrysler AG Method and apparatus for image acquisition in particular for three-dimensional detection of objects or scenes
US6963437B2 (en) * 2000-10-03 2005-11-08 Gentex Corporation Devices incorporating electrochromic elements and optical sensors
JP3907397B2 (en) * 2000-10-11 2007-04-18 富士通株式会社 Video surveillance device
AU2002305780A1 (en) 2001-05-29 2002-12-09 Transchip, Inc. Patent application cmos imager for cellular applications and methods of using such
US7738013B2 (en) * 2001-05-29 2010-06-15 Samsung Electronics Co., Ltd. Systems and methods for power conservation in a CMOS imager
US7160258B2 (en) 2001-06-26 2007-01-09 Entrack, Inc. Capsule and method for treating or diagnosing the intestinal tract
US20030147002A1 (en) * 2002-02-06 2003-08-07 Eastman Kodak Company Method and apparatus for a color sequential scannerless range imaging system
JP3960336B2 (en) * 2002-07-19 2007-08-15 セイコーエプソン株式会社 Image quality adjustment
US7068446B2 (en) * 2003-05-05 2006-06-27 Illumitech Inc. Compact non-imaging light collector
US20050046739A1 (en) * 2003-08-29 2005-03-03 Voss James S. System and method using light emitting diodes with an image capture device
JP2005208333A (en) * 2004-01-22 2005-08-04 Sharp Corp Flash device, camera equipped with flash device, and semiconductor laser device and manufacturing method therefor
US9464767B2 (en) * 2004-03-18 2016-10-11 Cliplight Holdings, Ltd. LED work light
EP1771749B1 (en) * 2004-07-30 2011-08-24 Panasonic Electric Works Co., Ltd. Image processing device
US7227611B2 (en) * 2004-08-23 2007-06-05 The Boeing Company Adaptive and interactive scene illumination
US7684007B2 (en) * 2004-08-23 2010-03-23 The Boeing Company Adaptive and interactive scene illumination
JP4688130B2 (en) * 2004-10-20 2011-05-25 株式会社リコー Optical system, color information display method, light deflection apparatus, and image projection display apparatus
JP4496964B2 (en) * 2005-01-14 2010-07-07 株式会社デンソー Tunnel detection device for vehicle and light control device for vehicle
US7134707B2 (en) * 2005-02-10 2006-11-14 Motorola, Inc. Selective light attenuation system
JP2006235254A (en) * 2005-02-25 2006-09-07 Fuji Photo Film Co Ltd Imaging apparatus
KR20060114890A (en) * 2005-05-03 2006-11-08 주식회사 팬택앤큐리텔 Wireless telecommunication terminal and method for controlling flash lightness automatically
JP4115467B2 (en) * 2005-06-01 2008-07-09 富士フイルム株式会社 Imaging device
DE102005032848A1 (en) * 2005-07-14 2007-01-25 Robert Bosch Gmbh Method and device for driver assistance
US20070030348A1 (en) * 2005-08-04 2007-02-08 Sony Ericsson Mobile Communications Ab Wireless communication device with range finding functions
US7592615B2 (en) * 2005-10-11 2009-09-22 Hong Kong Applied Science And Technology Research Institute Co., Ltd. Optical receiver with a modulated photo-detector
US20070086762A1 (en) * 2005-10-13 2007-04-19 3M Innovative Properties Company Front end for 3D imaging camera
IL181030A (en) * 2006-01-29 2012-04-30 Rafael Advanced Defense Sys Time-space multiplexed ladar
US20090279316A1 (en) * 2006-04-21 2009-11-12 Koninklijke Philips Electronics N.V. Lamp unit for an adaptive front lighting system for a vehicle
US20080151052A1 (en) * 2006-11-01 2008-06-26 Videolarm, Inc. Infrared illuminator with variable beam angle
US20080198372A1 (en) * 2007-02-21 2008-08-21 Spatial Photonics, Inc. Vehicle headlight with image display
US7611266B2 (en) * 2007-03-27 2009-11-03 Visteon Global Technologies, Inc. Single path road geometry predictive adaptive front lighting algorithm using vehicle positioning and map data
US7568822B2 (en) * 2007-03-28 2009-08-04 Visteon Global Technologies, Inc. Predictive adaptive front lighting algorithm for branching road geometry
DE102007038899B4 (en) * 2007-08-13 2021-10-07 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Method and device for controlling the contrast range of the image recording light falling on an image recording medium
JP4458141B2 (en) * 2007-09-27 2010-04-28 株式会社デンソー Light control device
DE102007053138A1 (en) * 2007-11-08 2009-01-29 Adc Automotive Distance Control Systems Gmbh Relevant surrounding object e.g. pedestrian, displaying device for vehicle, has controller connected with sensor system for surrounding detection over uni-directional data exchange device
WO2009079498A2 (en) * 2007-12-17 2009-06-25 Omnivision Technologies, Inc. Reflowable camera module with integrated flash
US8149210B2 (en) * 2007-12-31 2012-04-03 Microsoft International Holdings B.V. Pointing device and method
DE102009015824B4 (en) 2008-04-02 2022-03-24 Denso Corporation Glare-free zone mapping product and system using it to determine if a person is being blinded
DE102008025947A1 (en) 2008-05-30 2009-12-03 Hella Kgaa Hueck & Co. Method and device for controlling the light output of a headlight of a vehicle
US8187097B1 (en) 2008-06-04 2012-05-29 Zhang Evan Y W Measurement and segment of participant's motion in game play
JP5647118B2 (en) * 2008-07-29 2014-12-24 マイクロソフト インターナショナル ホールディングス ビイ.ヴイ. Imaging system
DE102008052064B4 (en) * 2008-10-17 2010-09-09 Diehl Bgt Defence Gmbh & Co. Kg Device for taking pictures of an object scene
TW201025249A (en) * 2008-12-16 2010-07-01 Chunghwa Picture Tubes Ltd Depth-fused 3D display, driving method thereof and driving circuit thereof
JP2010145735A (en) * 2008-12-18 2010-07-01 Mitsubishi Heavy Ind Ltd Imaging apparatus and imaging method
US8681321B2 (en) 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US8935055B2 (en) * 2009-01-23 2015-01-13 Robert Bosch Gmbh Method and apparatus for vehicle with adaptive lighting system
US8033697B2 (en) * 2009-02-18 2011-10-11 National Kaohsiung First University Of Science And Technology Automotive headlight system and adaptive automotive headlight system with instant control and compensation
US8531590B2 (en) * 2009-07-22 2013-09-10 University Of Southern California Camera with precise visual indicator to subject when within camera view
US8803967B2 (en) * 2009-07-31 2014-08-12 Mesa Imaging Ag Time of flight camera with rectangular field of illumination
CA2773696A1 (en) * 2009-09-11 2011-03-17 Koninklijke Philips Electronics N.V. Illumination system for enhancing the appearance of an object and method thereof
US9179106B2 (en) * 2009-12-28 2015-11-03 Canon Kabushiki Kaisha Measurement system, image correction method, and computer program
US8459811B2 (en) * 2009-12-30 2013-06-11 Nokia Corporation Method and apparatus for illumination
US8917395B2 (en) * 2010-04-19 2014-12-23 Florida Atlantic University MEMS microdisplay optical imaging and sensor systems for underwater scattering environments
US9019503B2 (en) * 2010-04-19 2015-04-28 The United States Of America, As Represented By The Secretary Of The Navy MEMS microdisplay optical imaging and sensor systems for underwater and other scattering environments
DE102010028949A1 (en) * 2010-05-12 2011-11-17 Osram Gesellschaft mit beschränkter Haftung headlight module
CN102065215B (en) * 2010-06-13 2013-05-29 深圳市捷高电子科技有限公司 Infrared night vision system and control method thereof
TWI540312B (en) * 2010-06-15 2016-07-01 原相科技股份有限公司 Time of flight system capable of increasing measurement accuracy, saving power and/or increasing motion detection rate and method thereof
CN102298149B (en) * 2010-06-25 2016-04-27 原相科技股份有限公司 The time-difference distance measuring system of raising degree of accuracy, moveable detection efficiency, power saving and method
US20130105670A1 (en) * 2010-07-02 2013-05-02 Marko Borosak Pulsed laser signal disrupting device incorporating led illuminator
DE102010032761A1 (en) * 2010-07-29 2012-02-02 E:Cue Control Gmbh Method for controlling controller for lighting system, involves detecting position or state of motion of person by using depth sensor camera
KR101220063B1 (en) * 2010-11-19 2013-01-08 주식회사 에스엘라이팅 Intelligent head lamp assembly of vehicle
KR101798063B1 (en) 2010-12-14 2017-11-15 삼성전자주식회사 Illumination optical system and 3D image acquisition apparatus including the same
US9823339B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Plural anode time-of-flight sensor
US8947527B1 (en) 2011-04-01 2015-02-03 Valdis Postovalov Zoom illumination system
DE102012002922A1 (en) * 2012-02-14 2013-08-14 Audi Ag Time-of-flight camera for a motor vehicle, motor vehicle and method for operating a time-of-flight camera
US9462255B1 (en) * 2012-04-18 2016-10-04 Amazon Technologies, Inc. Projection and camera system for augmented reality environment
JP6011040B2 (en) * 2012-06-07 2016-10-19 オムロン株式会社 Photoelectric sensor
JP6108150B2 (en) * 2012-07-10 2017-04-05 東芝ライテック株式会社 Lighting control system
US20140055252A1 (en) * 2012-08-24 2014-02-27 Ford Motor Company Vehicle with safety projector
US9161019B2 (en) * 2012-09-10 2015-10-13 Aemass, Inc. Multi-dimensional data capture of an environment using plural devices
US9338849B2 (en) * 2013-02-04 2016-05-10 Infineon Technologies Austria Ag Spatial intensity distribution controlled flash
US9255699B2 (en) * 2013-02-08 2016-02-09 Rite-Hite Holding Corporation Motion sensing dock lighting systems
US8761594B1 (en) * 2013-02-28 2014-06-24 Apple Inc. Spatially dynamic illumination for camera systems
FR3006421B1 (en) * 2013-05-30 2017-08-11 Valeo Vision Lighting module for a motor vehicle headlamp, headlamp equipped with such modules, and headlamp assembly
AU2014202744B2 (en) 2014-05-20 2016-10-20 Canon Kabushiki Kaisha System and method for re-configuring a lighting arrangement
DE102013226624A1 (en) * 2013-12-19 2015-06-25 Osram Gmbh Lighting device
JP6331383B2 (en) * 2013-12-26 2018-05-30 セイコーエプソン株式会社 Image display device and method for controlling image display device
US20150285458A1 (en) * 2014-04-08 2015-10-08 Ford Global Technologies, Llc Vehicle headlamp system
US10066799B2 (en) 2014-06-26 2018-09-04 Texas Instruments Incorporated Pixelated projection for automotive headlamp
GB2530564A (en) * 2014-09-26 2016-03-30 IBM Danger zone warning system
US9433065B2 (en) * 2014-11-05 2016-08-30 Stmicroelectronics (Research & Development) Limited Lighting system including time of flight ranging system
US9918073B2 (en) * 2014-12-22 2018-03-13 Google Llc Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with movable illuminated region of interest
DE102014019344A1 (en) * 2014-12-22 2016-06-23 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Motor vehicle headlight, vehicle headlight system, motor vehicle and method for operating a motor vehicle
US9674415B2 (en) * 2014-12-22 2017-06-06 Google Inc. Time-of-flight camera system with scanning illuminator
US9896022B1 (en) 2015-04-20 2018-02-20 Ambarella, Inc. Automatic beam-shaping using an on-car camera system
DE102015008729A1 (en) * 2015-07-07 2017-01-12 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Motor vehicle headlight assembly, vehicle headlight system, motor vehicle and method for operating a motor vehicle
US9663025B2 (en) * 2015-09-18 2017-05-30 Clearpath Robotics, Inc. Lighting control system and method for autonomous vehicles
DE102016104381A1 (en) * 2016-03-10 2017-09-14 Osram Opto Semiconductors Gmbh Optoelectronic lighting device, method for illuminating a scene, camera and mobile terminal
US9809152B2 (en) * 2016-03-18 2017-11-07 Ford Global Technologies, Llc Smart light assembly and smart lighting system for a motor vehicle
WO2018035484A1 (en) * 2016-08-18 2018-02-22 Apple Inc. System and method for interactive scene projection
US20180186278A1 (en) * 2016-08-30 2018-07-05 Faraday&Future Inc. Smart beam lights for driving and environment assistance
US10502830B2 (en) * 2016-10-13 2019-12-10 Waymo Llc Limitation of noise on light detectors using an aperture
US10917626B2 (en) 2016-11-23 2021-02-09 Microsoft Technology Licensing, Llc Active illumination 3D imaging system
US10554881B2 (en) 2016-12-06 2020-02-04 Microsoft Technology Licensing, Llc Passive and active stereo vision 3D sensors with variable focal length lenses
US10469758B2 (en) 2016-12-06 2019-11-05 Microsoft Technology Licensing, Llc Structured light 3D sensors with variable focal length lenses and illuminators
CN110402399B (en) * 2017-01-03 2023-07-18 应诺维思科技有限公司 Lidar system and method for detecting and classifying objects
DE102017103660B4 2017-02-22 2021-11-11 OSRAM Opto Semiconductors Gesellschaft mit beschränkter Haftung Method of operating a light source for a camera, light source, camera
DE102017103884A1 (en) 2017-02-24 2018-08-30 Osram Opto Semiconductors Gmbh Lighting device, electronic device with a lighting device and use of a lighting device
US10412286B2 (en) * 2017-03-31 2019-09-10 Westboro Photonics Inc. Multicamera imaging system and method for measuring illumination
FR3065818B1 (en) * 2017-04-28 2019-04-26 Valeo Vision Light module for a motor vehicle, configured to project a light beam forming a pixelated image
US10901073B2 (en) 2017-07-11 2021-01-26 Microsoft Technology Licensing, Llc Illumination for zoned time-of-flight imaging
US10430958B2 (en) 2017-07-11 2019-10-01 Microsoft Technology Licensing, Llc Active illumination 3D zonal imaging system
DE102017212872A1 (en) * 2017-07-26 2019-01-31 Robert Bosch Gmbh Method and device for adapting vehicle lighting for a vehicle
US10904449B2 (en) * 2017-07-31 2021-01-26 Disney Enterprises, Inc. Intrinsic color camera
US10302295B2 (en) 2017-10-18 2019-05-28 Rite-Hite Holding Corporation Illuminating fans for loading docks
US11514789B2 (en) * 2018-01-05 2022-11-29 Arriver Software Llc Illumination-based object tracking within a vehicle
DE102018119312A1 (en) * 2018-08-08 2020-02-13 Osram Opto Semiconductors Gmbh Lighting device
PL3671034T3 (en) * 2018-12-21 2022-09-12 a1 Mobile Light Technology GmbH Adjustable lamp
JP7021647B2 (en) * 2019-02-14 2022-02-17 株式会社デンソー Optical range measuring device
CA201546S (en) 2019-03-08 2022-01-28 Rite-Hite Holding Corp Mounting device for fan and light
USD933283S1 (en) 2019-08-28 2021-10-12 Rite-Hite Holding Corporation Fan and light mounting system
JP2020153798A (en) * 2019-03-19 2020-09-24 株式会社リコー Optical device, distance measuring optical unit, distance measuring device, and distance measuring system
US20200344405A1 (en) * 2019-04-25 2020-10-29 Canon Kabushiki Kaisha Image pickup apparatus for measuring the distance from a subject to the image pickup surface of the image pickup device, and method for controlling the same

Family Cites Families (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3634725A (en) 1967-09-28 1972-01-11 Polaroid Corp Modulated electronic flash control
US3571493A (en) 1967-10-20 1971-03-16 Texas Instruments Inc Intensity modulated laser imagery display
US3629796A (en) 1968-12-11 1971-12-21 Atlantic Richfield Co Seismic holography
US3734625A (en) 1969-09-12 1973-05-22 Honeywell Inc Readout system for a magneto-optic memory
US3834816A (en) 1970-12-10 1974-09-10 Hughes Aircraft Co Colinear heterodyne frequency modulator
DE2453077B2 1974-11-08 1976-09-02 Precitronic Gesellschaft für Feinmechanik und Electronic mbH, 2000 Hamburg Receiving and transmitting device for the transmission of information by means of concentrated, modulated light beams
US4185191A (en) 1978-06-05 1980-01-22 Honeywell Inc. Range determination system
JPS5596475A (en) 1979-01-19 1980-07-22 Nissan Motor Co Ltd Obstacle detector for vehicle
DE3021448A1 (en) 1980-06-06 1981-12-24 Siemens AG, 1000 Berlin und 8000 München Spatial deviation detection of surfaces from smooth planes - using optical grid and video image signal analysis
US4769700A (en) 1981-11-20 1988-09-06 Diffracto Ltd. Robot tractors
JPS57193183A (en) 1981-05-25 1982-11-27 Mitsubishi Electric Corp Image pickup device
US4410804A (en) 1981-07-13 1983-10-18 Honeywell Inc. Two dimensional image panel with range measurement capability
US4408263A (en) 1981-12-14 1983-10-04 Wonder Corporation Of America Disposable flashlight
US4680579A (en) 1983-09-08 1987-07-14 Texas Instruments Incorporated Optical system for projection display using spatial light modulator device
US4734735A (en) 1985-08-23 1988-03-29 Konishiroku Photo Industry Co., Ltd. Image apparatus having a color separation function
US4687326A (en) 1985-11-12 1987-08-18 General Electric Company Integrated range and luminance camera
US5255087A (en) 1986-11-29 1993-10-19 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus using the same
US4971413A (en) 1987-05-13 1990-11-20 Nikon Corporation Laser beam depicting apparatus
US5081530A (en) 1987-06-26 1992-01-14 Antonio Medina Three dimensional camera and range finder
US4734733A (en) 1987-09-21 1988-03-29 Polaroid Corporation Camera with two position strobe
JPH01100492A (en) 1987-10-14 1989-04-18 Matsushita Electric Ind Co Ltd Laser vision sensor
CH674675A5 (en) 1987-10-23 1990-06-29 Kern & Co Ag
US4959726A (en) 1988-03-10 1990-09-25 Fuji Photo Film Co., Ltd. Automatic focusing adjusting device
US4780732A (en) 1988-03-21 1988-10-25 Xerox Corporation Dual interaction TIR modulator
JPH01244934A (en) 1988-03-28 1989-09-29 Nissan Motor Co Ltd Headlamp for vehicle
EP0424409A4 (en) 1988-06-20 1992-01-15 Kemal Ajay Range finding device
US5013917A (en) 1988-07-07 1991-05-07 Kaman Aerospace Corporation Imaging lidar system using non-visible light
US4991953A (en) 1989-02-09 1991-02-12 Eye Research Institute Of Retina Foundation Scanning laser vitreous camera
US5009502A (en) 1989-04-20 1991-04-23 Hughes Aircraft Company System of holographic optical elements for testing laser range finders
US4935616A (en) 1989-08-14 1990-06-19 The United States Of America As Represented By The Department Of Energy Range imaging laser radar
EP0413333A3 (en) 1989-08-18 1991-07-24 Hitachi, Ltd. A superconductized semiconductor device
JP2976242B2 (en) 1989-09-23 1999-11-10 ヴィエルエスアイ ヴィジョン リミテッド Integrated circuit, camera using the integrated circuit, and method for detecting light incident on an image sensor manufactured using the integrated circuit technology
JPH03258200A (en) 1990-03-08 1991-11-18 Fujitsu Ten Ltd Acoustic reproducing device
US5343391A (en) 1990-04-10 1994-08-30 Mushabac David R Device for obtaining three dimensional contour data and for operating on a patient and related method
US5056914A (en) 1990-07-12 1991-10-15 Ball Corporation Charge integration range detector
US5090803A (en) 1990-09-21 1992-02-25 Lockheed Missiles & Space Company, Inc. Optical coordinate transfer assembly
US5198877A (en) 1990-10-15 1993-03-30 Pixsys, Inc. Method and apparatus for three-dimensional non-contact shape sensing
EP0449337A2 (en) 1990-10-24 1991-10-02 Kaman Aerospace Corporation Range finding array camera
US5200793A (en) 1990-10-24 1993-04-06 Kaman Aerospace Corporation Range finding array camera
US5253033A (en) 1990-12-03 1993-10-12 Raytheon Company Laser radar system with phased-array beam steerer
US5164823A (en) 1990-12-21 1992-11-17 Kaman Aerospace Corporation Imaging lidar system employing multipulse single and multiple gating for single and stacked frames
US5283671A (en) * 1991-02-20 1994-02-01 Stewart John R Method and apparatus for converting RGB digital data to optimized CMYK digital data
US5157451A (en) 1991-04-01 1992-10-20 John Taboada Laser imaging and ranging system using two cameras
DE69204886T2 (en) 1991-04-23 1996-04-04 Nec Corp Measuring device for moving bodies.
JP3217386B2 (en) 1991-04-24 2001-10-09 オリンパス光学工業株式会社 Diagnostic system
US5257085A (en) 1991-04-24 1993-10-26 Kaman Aerospace Corporation Spectrally dispersive imaging lidar system
US5216259A (en) 1991-05-10 1993-06-01 Robotic Vision System, Inc. Apparatus and method for improved determination of the spatial location of object surface points
US5200931A (en) 1991-06-18 1993-04-06 Alliant Techsystems Inc. Volumetric and terrain imaging sonar
US5448330A (en) 1991-06-20 1995-09-05 Nikon Corporation Divided radiation type flashlight system
US5243553A (en) 1991-07-02 1993-09-07 Loral Vought Systems Corporation Gate array pulse capture device
US5110203A (en) 1991-08-28 1992-05-05 The United States Of America As Represented By The Secretary Of The Navy Three dimensional range imaging system
US5265327A (en) 1991-09-13 1993-11-30 Faris Sadeg M Microchannel plate technology
US5220164A (en) 1992-02-05 1993-06-15 General Atomics Integrated imaging and ranging lidar receiver with ranging information pickoff circuit
JPH05289123A (en) 1992-04-14 1993-11-05 Ricoh Co Ltd Surface optical modulator
US5408263A (en) 1992-06-16 1995-04-18 Olympus Optical Co., Ltd. Electronic endoscope apparatus
US5434612A (en) 1992-09-25 1995-07-18 The United States Of America As Represented By The Secretary Of The Army Duo-frame normalization technique
US5334848A (en) 1993-04-09 1994-08-02 Trw Inc. Spacecraft docking sensor system
JPH07110381A (en) 1993-10-07 1995-04-25 Wacom Co Ltd Distance camera device
US5587832A (en) 1993-10-20 1996-12-24 Biophysica Technologies, Inc. Spatially light modulated confocal microscope and method
DE4341409C2 (en) 1993-12-04 2002-07-11 Bosch Gmbh Robert Device for regulating the headlight range of motor vehicle headlights
US5844588A (en) 1995-01-11 1998-12-01 Texas Instruments Incorporated DMD modulated continuous wave light source for xerographic printer
IL114278A (en) 1995-06-22 2010-06-16 Microsoft Internat Holdings B Camera and method
US6445884B1 (en) 1995-06-22 2002-09-03 3Dv Systems, Ltd. Camera with through-the-lens lighting
CN100524015C (en) 1995-06-22 2009-08-05 3Dv系统有限公司 Method and apparatus for generating a range image indicating distance to a subject
DE19530008B4 (en) 1995-08-16 2005-02-03 Automotive Lighting Reutlingen Gmbh Illumination device for vehicles with a reflective deflection device
EP2416198B1 (en) 1998-05-25 2013-05-01 Panasonic Corporation Range finder device and camera
US6023365A (en) 1998-07-16 2000-02-08 Siros Technologies, Inc. DMD illumination coupler
US20020112005A1 (en) * 1998-08-25 2002-08-15 Charles Namias Video e-mail kiosk
US7202898B1 (en) 1998-12-16 2007-04-10 3Dv Systems Ltd. Self gating photosurface
TW493054B (en) 1999-06-25 2002-07-01 Koninkl Philips Electronics Nv Vehicle headlamp and a vehicle
US6941323B1 (en) * 1999-08-09 2005-09-06 Almen Laboratories, Inc. System and method for image comparison and retrieval by enhancing, defining, and parameterizing objects in images
AU5646299A (en) 1999-09-08 2001-04-10 3Dv Systems Ltd. 3d imaging system
US6379022B1 (en) * 2000-04-25 2002-04-30 Hewlett-Packard Company Auxiliary illuminating device having adjustable color temperature
EP1302066A1 (en) 2000-07-09 2003-04-16 3DV Systems Ltd. Camera having a through the lens pixel illuminator
US20020015103A1 (en) 2000-07-25 2002-02-07 Zhimin Shi System and method of capturing and processing digital images with depth channel
AU2001218821A1 (en) 2000-12-14 2002-06-24 3Dv Systems Ltd. 3d camera

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4501961A (en) * 1982-09-01 1985-02-26 Honeywell Inc. Vision illumination system for range finder
US4591918A (en) * 1983-04-18 1986-05-27 Omron Tateisi Electronics Co. Image sensor system
EP0436267A1 (en) * 1988-03-16 1991-07-10 Digivision, Inc. Apparatus and method for enhancement of image viewing by modulated illumination of a transparency
DE4026956A1 (en) * 1990-08-25 1992-03-05 Daimler Benz Ag Video camera for work intensity image - has beam splitter between objective lens and image converter, and phototransmitter elements
US5218485A (en) * 1991-04-05 1993-06-08 X-Ray Scanner Corporation Expanded dynamic range image digitizing apparatus
DE4420637A1 (en) * 1994-06-14 1995-12-21 Bertram Dr Rapp Optical imaging apparatus for photographic or video camera
US5828485A (en) * 1996-02-07 1998-10-27 Light & Sound Design Ltd. Programmable light beam shape altering device using programmable micromirrors
WO1998026583A1 (en) * 1996-12-09 1998-06-18 Zeman Herbert D Contrast enhancing illuminator
WO2000019705A1 (en) * 1998-09-28 2000-04-06 3Dv Systems, Ltd. Distance measurement with a camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAWAKITA M ET AL: "Axi-vision camera: a three-dimension camera", Proceedings of the SPIE, 2000, XP000987367 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7428997B2 (en) 2003-07-29 2008-09-30 Microvision, Inc. Method and apparatus for illuminating a field-of-view and capturing an image
DE102004004115B4 (en) * 2004-01-28 2016-08-18 Leica Microsystems Cms Gmbh Microscope system and method for shading correction of the existing optics in the microscope system
WO2005073776A1 (en) * 2004-01-28 2005-08-11 Leica Microsystems Cms Gmbh Microscope system and method for shading correction of lenses present in the microscope system
JP2007519962A (en) * 2004-01-28 2007-07-19 ライカ マイクロシステムス ツェーエムエス ゲーエムベーハー Microscope system and method for shading correction of lenses present in a microscope system
US8147066B2 (en) 2004-09-21 2012-04-03 Nikon Corporation Portable information device having a projector and an imaging device
EP1793588A4 (en) * 2004-09-21 2011-04-27 Nikon Corp Mobile information device
EP1793588A1 (en) * 2004-09-21 2007-06-06 Nikon Corporation Mobile information device
WO2006085834A1 (en) * 2005-01-28 2006-08-17 Microvision, Inc. Method and apparatus for illuminating a field-of-view and capturing an image
EP1978725A3 (en) * 2007-04-03 2009-06-03 Delphi Technologies, Inc. Synchronous imaging using segmented illumination
US7745771B2 (en) 2007-04-03 2010-06-29 Delphi Technologies, Inc. Synchronous imaging using segmented illumination
JP2011519430A (en) * 2008-03-18 2011-07-07 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Calibration camera for spectral depth
EP2461574A1 (en) * 2010-12-03 2012-06-06 Research In Motion Limited Dynamic lighting control in hybrid camera-projector device
US8285133B2 (en) 2010-12-03 2012-10-09 Research In Motion Limited Dynamic lighting control in hybrid camera-projector device

Also Published As

Publication number Publication date
US6654556B2 (en) 2003-11-25
US6445884B1 (en) 2002-09-03
US7355648B1 (en) 2008-04-08
US20040114921A1 (en) 2004-06-17
US20010055482A1 (en) 2001-12-27
US6993255B2 (en) 2006-01-31

Similar Documents

Publication Publication Date Title
US7355648B1 (en) Camera having a through the lens pixel illuminator
EP1302066A1 (en) Camera having a through the lens pixel illuminator
US8736710B2 (en) Automatic exposure control for flash photography
US7661828B2 (en) Adjusting light intensity
US8004766B2 (en) Illuminating device, illuminating method, image signal processing device, image signal processing method, and image projecting apparatus
US7224384B1 (en) 3D imaging system
US7605828B2 (en) Method and system for reducing gray scale discontinuities in contrast enhancing screens affected by ambient light
US6556706B1 (en) Three-dimensional surface profile imaging method and apparatus using single spectral light condition
US10359277B2 (en) Imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor
KR20160007361A (en) Image capturing method using projecting light source and image capturing device using the method
JP2004503188A5 (en)
EP4053499A1 (en) Three-dimensional-measurement device
JP2004212385A (en) Photographic device, photographing method and control method for the photographic device
TWI801637B (en) Infrared pre-flash for camera
CN110702031B (en) Three-dimensional scanning device and method suitable for dark surface
US20080151194A1 (en) Method and System for Illumination Adjustment
JP7401013B2 (en) Information processing device, control device, information processing method and program
KR20200028406A (en) Method and apparatus for optically measuring the surface of a measurement object
JPS62130473A (en) Drawing electrical signal generator
US20220034651A1 (en) Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor
US20110298844A1 (en) Power Calibration of Multiple Light Sources in a Display Screen
JP7005175B2 (en) Distance measuring device, distance measuring method and imaging device
GB2482562A (en) Light control device
TW201621451A (en) Projection apparatus and method for projecting an image pixel by pixel
CN110710201B (en) Operating method and control unit for a laser projection unit and laser projection unit

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): BR CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2000942347

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2000942347

Country of ref document: EP

ENP Entry into the national phase

Country of ref document: RU

Kind code of ref document: A

Format of ref document f/p: F

WWW Wipo information: withdrawn in national office

Ref document number: 2000942347

Country of ref document: EP