WO2002049366A1 - 3d camera - Google Patents


Info

Publication number
WO2002049366A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
photosurface
light
pixel
scene
Prior art date
Application number
PCT/IL2000/000838
Other languages
French (fr)
Inventor
Yacov Malinovich
Original Assignee
3Dv Systems, Ltd.
Priority date
Filing date
Publication date
Application filed by 3Dv Systems, Ltd. filed Critical 3Dv Systems, Ltd.
Priority to AU2001218821A priority Critical patent/AU2001218821A1/en
Priority to PCT/IL2000/000838 priority patent/WO2002049366A1/en
Priority to PCT/IL2001/001159 priority patent/WO2002049367A2/en
Priority to AU2002222487A priority patent/AU2002222487A1/en
Publication of WO2002049366A1 publication Critical patent/WO2002049366A1/en

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/1462 Coatings
    • H01L27/14621 Colour filter arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14625 Optical elements or arrangements associated with the device
    • H01L27/14627 Microlenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/131 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • the invention relates to photosurfaces used for imaging a scene and in particular to light collection in photosurfaces that are used to provide both an image of a scene and a depth map of the scene.
  • Gated 3D cameras that provide distance measurements to regions of a scene that they image are well known in the art.
  • Gated 3D cameras comprise a photosurface, such as a CCD or CMOS photosurface and a gating means for gating the photosurface on and off, such as an electro-optical shutter or a gated image intensifier.
  • the scene is generally illuminated with a train of light pulses radiated from an appropriate light source.
  • the radiated light pulses are infrared (IR) light pulses.
  • For each radiated light pulse in the train, following an accurately determined delay from the time that the light pulse is radiated, the photosurface is gated on for a period of time, hereinafter referred to as a "gate".
  • Light from the light pulse that is reflected from a region in the scene is imaged on the photosurface if it reaches the camera during the gate. Since the time elapsed between radiating a light pulse and the gate that follows it is known, the time it took imaged light to travel from the light source to the reflecting region in the scene and back to the camera can be determined. The time elapsed is used to determine the distance to the reflecting region.
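The timing relation described above amounts to a round-trip calculation, which can be sketched as follows (a minimal illustration in Python; the function name is invented here, not taken from the patent):

```python
# The gating scheme above turns an elapsed time into a distance: light
# travels from the source to the reflecting region and back, so the
# one-way distance is half the round-trip time multiplied by the speed
# of light.

C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(elapsed_s: float) -> float:
    """Distance to a reflecting region given the round-trip travel time."""
    # The light covers the camera-to-region path twice, hence the halving.
    return C * elapsed_s / 2.0

# A reflection arriving 100 ns after its pulse was radiated implies a
# region roughly 15 m from the camera.
print(distance_from_round_trip(100e-9))  # ≈ 14.99
```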
  • 3D-picture cameras provide a picture of a scene that they image as well as a depth map of the scene.
  • the picture is a black and white picture, while in others the picture is a color picture.
  • PCT application PCT/IL99/00490 describes various configurations of 3D-picture cameras.
  • the described cameras comprise different photosurfaces for different imaging functions of the camera.
  • some of the described cameras comprise an IR sensitive photosurface for registering IR light used to provide a depth map of a scene and separate R, G, and B photosurfaces for providing a color image of the scene.
  • An aspect of some embodiments of the present invention relates to providing a 3D- picture camera comprising a 3D-picture photosurface in which some pixels have sizes and/or shapes that are different from other pixels in the photosurface.
  • An aspect of some embodiments of the present invention relates to providing a 3D- picture camera comprising a 3D-picture photosurface in which some pixels have photosensitive regions that have sizes different from photosensitive regions of other pixels in the photosurface.
  • Distance pixels generally require processing circuitry that is larger and more complex than circuitry comprised in picture pixels in a photosurface. Furthermore, because of the relative complexity of processing circuitry in a distance pixel, a distance pixel is usually more sensitive to crosstalk between a region of the pixel in which its circuitry is located and a photosensitive region of the pixel. Examples of processing circuitry comprised in distance pixels in a photosurface of a 3D camera are described in PCT publication WO 00/19705 referenced above. In addition, there is generally less light available for imaging a scene to provide a depth map of the scene than there is available for imaging the scene to provide a picture of the scene.
  • a 3D- picture camera comprises a 3D-picture photosurface having distance pixels that are substantially larger than picture pixels.
  • the larger size of distance pixels compared to picture pixels provides more space in the distance pixels for processing circuitry.
  • the distance pixels also have photosensitive regions that are larger than photosensitive regions of picture pixels.
  • the larger size photosensitive and circuit regions of the distance pixels tend to reduce cross-talk between circuit regions of the pixels and photosensitive regions of the pixels.
  • the larger photosensitive regions of the distance pixels also enhance their photosensitivity.
  • pixels in the photosurface have different shapes.
  • An aspect of some embodiments of the present invention relates to providing a 3D- picture camera comprising a 3D-picture photosurface, which is compensated for differences in size and/or shape of pixels comprised in the photosurface and for differences in size of their respective photosensitive regions.
  • Algorithms for processing imaging data acquired with the photosurface may thereby be simplified and currently available image processing algorithms may be applied to processing the data.
  • a 3D-picture camera in accordance with some embodiments of the present invention comprises a 3D-picture photosurface having microlenses coupled to pixels in the photosurface that compensate for differences in sizes and/or shapes of the pixels and their respective photosensitive regions.
  • all the microlenses in the photosurface have a same size and shape and are distributed in a symmetric pattern over the photosurface.
  • the shape and size of a microlens refers to the shape and size of the aperture of the microlens.
  • the photosurface thus collects light from a symmetric, uniform grid of equal size and shape regions of a scene imaged with the photosurface. Therefore, processing imaging data acquired with the photosurface to provide depth images and pictures of scenes imaged with the photosurface is simplified and currently available image processing algorithms may be applied to processing the data.
  • microlenses are often used in prior photosurfaces for increasing light gathering efficiency of pixels in the photosurfaces.
  • microlenses are generally the same size as the pixels to which they are coupled and are not used to compensate for differences in sizes of the pixels.
  • distance pixels in a 3D-picture photosurface are IR pixels that register IR light and IR light is used to image a scene to determine distances to the scene
  • a picture of the scene is a color picture
  • picture pixels of the 3D-picture photosurface are RGB pixels.
  • distance pixels are IR pixels and picture pixels are RGB pixels.
  • aspects of the present invention are not limited to IR-RGB photosurfaces and are applicable to 3D-picture photosurfaces having other spectral sensitivities.
  • a 3D-picture photosurface comprising IR and RGB pixels, in accordance with an embodiment of the present invention, is referred to as an IR-RGB photosurface.
  • microlenses are used to adjust relative photosensitivity of pixels in a photosurface.
  • IR pixels in the photosurface are coupled to microlenses that are larger than microlenses to which RGB pixels are coupled.
  • the R pixels are coupled to microlenses that are larger than microlenses coupled to the G and B pixels.
  • a 3D-picture camera that comprises a 3D-picture photosurface, for which distance pixels and picture pixels image a scene with light having wavelengths in different wavelength bands
  • exposure of the photosurface to light used by distance pixels can degrade quality of a picture of the scene provided by the picture pixels.
  • exposure of the photosurface to light used by the picture pixels can degrade accuracy of a depth map of the scene provided by the camera.
  • IR light registered by the RGB pixels can degrade quality of the picture provided by the camera.
  • visible light registered by the IR pixels can adversely affect accuracy of the depth map provided by the camera.
  • An aspect of some embodiments of the present invention relates to providing a 3D- picture camera comprising an IR-RGB photosurface that provides a color picture and a depth map of a scene that are substantially uncontaminated by exposure of the photosurface to IR and visible light respectively.
  • the photosurface in the camera comprises a "blanket filter" that protects substantially the entire area of the photosurface.
  • the blanket filter transmits visible RGB light but transmits IR light substantially only in a narrow band of wavelengths centered on a wavelength of IR light used with the camera to provide depth maps of a scene.
  • a "local" black filter that is substantially opaque to visible light, but at least relatively transparent to IR light in at least the bandpass of the blanket filter protects each IR pixel.
  • the blanket filter reduces sensitivity of the RGB pixels to IR light and the black filters reduce sensitivity of the IR pixels to visible light.
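As a rough illustration of the two-filter scheme described above, the blanket filter and a local black filter can be modeled as transmission functions. The band edges and the 850 nm source wavelength below are invented for illustration only; the patent does not specify them:

```python
# Hedged sketch: the blanket filter passes visible light plus a narrow
# IR band around the light source's wavelength; the local black filter
# over each IR pixel blocks visible light. Values are illustrative.

def blanket_filter(wavelength_nm: float) -> float:
    """Approximate transmission of the blanket filter (0..1)."""
    if 400 <= wavelength_nm <= 700:    # visible light: transmitted
        return 1.0
    if 845 <= wavelength_nm <= 855:    # narrow band around an assumed 850 nm source
        return 1.0
    return 0.0                         # other IR blocked

def black_filter(wavelength_nm: float) -> float:
    """Approximate transmission of the local black filter over an IR pixel."""
    return 1.0 if wavelength_nm >= 750 else 0.0  # opaque to visible light

# Light reaching an IR pixel passes both filters, so effectively only
# the narrow band around the source wavelength survives.
ir_pixel_transmission = lambda w: blanket_filter(w) * black_filter(w)
print(ir_pixel_transmission(850.0), ir_pixel_transmission(550.0))  # 1.0 0.0
```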
  • IR contamination of responses of the RGB pixels can be accurately estimated from responses to IR light by IR pixels. Estimates of the IR contamination are useable, in accordance with embodiments of the present invention, to correct responses of the RGB pixels for contamination by IR light.
  • the camera provides color pictures of a scene having reduced sensitivity to IR light and depth maps of the scene having reduced sensitivity to visible light.
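The correction described above can be sketched as subtracting a scaled IR-pixel reading from each RGB response. The leakage coefficients below are hypothetical calibration values, not figures from the patent:

```python
# Hedged sketch of IR-contamination correction: the response of an IR
# pixel estimates how much IR light leaked through the blanket filter
# to neighbouring RGB pixels, and that estimate is subtracted from
# their responses. The leak coefficients are invented for illustration.

def correct_rgb_for_ir(rgb_response, ir_response, leak=(0.9, 0.9, 0.9)):
    """Return RGB responses with estimated IR contamination removed."""
    corrected = []
    for value, k in zip(rgb_response, leak):
        # k scales the IR pixel's reading to the IR reaching this pixel;
        # clamp at zero so noise cannot produce negative intensities.
        corrected.append(max(0.0, value - k * ir_response))
    return corrected

print(correct_rgb_for_ir([120.0, 80.0, 60.0], 10.0))  # [111.0, 71.0, 51.0]
```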
  • filter configurations in accordance with an embodiment of the present invention, described for uncoupling spectral sensitivities of IR, R, G and B pixels are also applicable, with obvious modifications, to photosurfaces comprising pixels having other spectral sensitivities.
  • photosurfaces tiled with different size and/or shape pixels have been described for 3D-picture photosurfaces used in 3D-picture cameras, some aspects of the invention are not limited to such photosurfaces, nor to photosurfaces having different size and/or shape pixels.
  • Some methods and apparatus, in accordance with embodiments of the present invention are applicable quite generally to photosurfaces, irrespective of the applications for which the photosurfaces are used and spectral sensitivities of their pixels.
  • a photosurface for imaging a scene comprising: a plurality of pixels, each having a photosensitive region, wherein at least two of the pixels have different size photosensitive regions; and a plurality of microlenses, each of which collects light and directs the collected light to the photosensitive region of a different one of the pixels, wherein pixels having different size photosensitive regions have same size and shape microlenses.
  • a photosurface for imaging a scene comprising: a plurality of pixels each having a photosensitive region; and a different microlens for each pixel that collects light and directs the collected light to the photosensitive region of the pixel, wherein at least two of the microlenses have different size apertures.
  • the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures have a same size.
  • the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures have different sizes.
  • at least one of the microlenses has an aperture that shadows at least three pixels of the plurality of pixels.
  • a photosurface for imaging a scene comprising: a plurality of pixels having photosensitive regions; and a microlens having an aperture that covers at least a portion of three pixels of the plurality of pixels, which microlens collects light and directs the collected light to the photosensitive region of one of the three pixels.
  • portions of two pixels that are covered by the microlens do not include photosensitive regions of the two pixels.
  • the plurality of pixels comprises R, G and B pixels. In some embodiments of the present invention the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths.
  • the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths, wherein each IR pixel is adjacent to an R, G and B pixel and wherein the IR pixel and the three adjacent pixels form a square.
  • the IR pixel is larger than any of the adjacent R, G or B pixels.
  • IR pixels comprise circuitry for determining distances to regions of a scene imaged with the photosurface.
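The square arrangement described above, with each IR pixel adjacent to one R, one G and one B pixel, can be sketched as a repeating 2x2 supercell. The particular cell layout below is an assumption consistent with the text, not the patent's exact mask:

```python
# Illustrative sketch of the tiling described above: a 2x2 cell holding
# one IR, one R, one G and one B pixel, repeated over the photosurface.

def irgb_mosaic(rows: int, cols: int):
    """Return a rows x cols grid of pixel labels tiled with 2x2 IR/R/G/B cells."""
    cell = [["IR", "R"],
            ["G",  "B"]]
    return [[cell[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in irgb_mosaic(4, 4):
    print(row)
```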
  • a photosurface for providing a picture of a scene and a depth map of the scene comprising; a first plurality of first pixels that generate signals usable to provide a picture of the scene responsive to light from the scene that is incident on the pixels; and a second plurality of second pixels each of which comprises circuitry controllable to generate a signal usable to determine a distance to the scene responsive to light from the scene imaged on the pixel and wherein the second pixels are larger than the first pixels.
  • the first plurality of pixels comprises RGB pixels.
  • the second pixels are optionally IR pixels sensitive to light in a band of IR wavelengths.
  • the second pixels are IR pixels sensitive to light in a band of IR wavelengths, wherein each IR pixel is adjacent to an R, G and B pixel and the IR pixel and the three adjacent pixels form a square.
  • the photosurface comprises a filter for each IR pixel that is substantially opaque to visible light.
  • the photosurface comprises a filter that shields all pixels in the photosurface, which filter is substantially transparent to visible light and is substantially opaque to IR light, except for IR light in a portion of the bandpass of the IR pixels.
  • a 3D camera comprising a photosurface in accordance with an embodiment of the present invention and a lens that receives light and focuses it on the photosurface.
  • a photosurface for imaging a scene comprising: a first plurality of pixels sensitive to light in first and second bands of wavelengths; a second plurality of pixels sensitive to light in the first and second bands of wavelengths; a filter substantially transparent to light in the first band of wavelengths that shields all the pixels in the first and second pluralities of pixels and transmits light only in a portion of the second band of wavelengths; and a filter for each pixel in the second plurality of pixels that is substantially opaque to light in the first band of wavelengths.
  • Fig. 1 schematically shows an IR-RGB photosurface tiled with pixels having different sizes and shapes, in accordance with an embodiment of the present invention
  • FIG. 2 schematically shows the IR-RGB photosurface shown in Fig. 1 with the addition of microlenses, in accordance with an embodiment of the present invention
  • Fig. 3 schematically shows an IR-RGB photosurface similar to that shown in Fig. 2 but comprising a different configuration of microlenses, in accordance with an embodiment of the present invention
  • Fig. 4 schematically shows an IR-RGB photosurface in which different size microlenses are used to adjust spectral sensitivity of the photosurface, in accordance with an embodiment of the present invention
  • Fig. 5 schematically shows a cross section view of an IR-RGB photosurface having filters that are used to decouple spectral sensitivities of the pixels, in accordance with an embodiment of the present invention
  • Fig. 6 schematically shows a 3D-picture camera, comprising a photosurface, in accordance with an embodiment of the present invention.
  • Fig. 1 schematically shows a portion of an IR-RGB photosurface 20 used in a 3D-picture camera (not shown) having a tiling configuration of IR pixels 21, R pixels 22, G pixels 23 and B pixels 24, in accordance with an embodiment of the present invention.
  • pixels 21-24 are also labeled with their respective spectral sensitivities.
  • Each pixel 21-24 has a shaded area 26 and an unshaded area 28.
  • Unshaded areas 28 represent photosensitive regions of pixels 21-24 and shaded areas 26 represent regions of the pixels used for circuitry such as capacitors, amplifiers, switches etc.
  • IR pixels 21 have a shape and size that is different from the shapes and sizes of RGB pixels 22, 23 and 24.
  • G pixels 23 have a shape and size that is different from the shapes and sizes of R and B pixels 22 and 24.
  • Photosensitive regions 28 of pixels 22-24 with different color sensitivity have substantially same sizes and shapes.
  • IR pixels 21 have photosensitive regions 28 substantially larger than photosensitive regions 28 of RGB pixels 22-24 and in addition have substantially more processing circuitry than the RGB pixels. IR pixels 21 therefore are substantially larger than RGB pixels 22-24.
  • processing circuitry of the IR pixels are similar to the processing circuitry described in PCT Publication WO 00/19705 referenced above.
  • RGB pixels 22-24 and IR pixels 21 image different size regions of a scene imaged with photosurface 20 and have different photosensitivities.
  • Algorithms for processing imaging data acquired with photosurface 20 are therefore relatively complex.
  • imaging data acquired using photosurface 20 generally requires, inter alia, normalization of intensities of light registered by pixels 21-24 to different sizes of their respective photosensitive regions 28.
  • many common algorithms used to generate an image of a scene from light intensities registered by pixels in a photosurface used to image the scene assume that the pixels image same size regions of the scene and have substantially same photosensitivities. Because of the different sizes and sensitivities of IR pixels 21 and RGB pixels 22-24 these algorithms may not readily be useable to process light intensities registered by pixels 21-24 in photosurface 20.
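The normalization step implied above can be sketched as scaling each registered intensity to a common reference area before standard image-processing algorithms are applied. The areas below are hypothetical values, not taken from the patent:

```python
# Sketch of intensity normalization: pixels with larger photosensitive
# regions register proportionally more light from an equally bright
# region, so raw intensities are scaled to a common reference area.

def normalize_by_area(raw, areas, reference_area=1.0):
    """Scale each registered intensity to what a reference-area pixel would register."""
    return [value * reference_area / area for value, area in zip(raw, areas)]

# An IR pixel with 4x the photosensitive area of an RGB pixel registers
# 4x the light from an equally bright region; normalization removes that.
print(normalize_by_area([400.0, 100.0], [4.0, 1.0]))  # [100.0, 100.0]
```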
  • Fig. 2 schematically shows a portion of an IR-RGB photosurface 30 having a tiling pattern of pixels 21-24 identical to the tiling pattern of pixels 21-24 in photosurface 20 shown in Fig. 1 but comprising in addition, an array of circular microlenses 32.
  • Each microlens 32 is coupled to a different one of pixels 21-24 and collects light and directs the collected light onto a photosensitive region 28 of the pixel to which it is coupled.
  • All microlenses 32 have, by way of example, a same radius.
  • Each microlens 32 coupled to an R, G or B pixel 22-24 overlies portions of circuit regions 26 of at least three adjacent pixels and collects light that would be incident on those portions of the adjacent pixels in the absence of the microlens.
  • each microlens 32 coupled to an R, G or B pixel 22-24 overlays and collects light that would be incident on a portion of circuit region 26 of an IR pixel 21 adjacent to the R, G or B pixel.
  • each pixel 21-24 acquires light from a same size and shape region of a scene imaged using photosurface 30 despite differences in their sizes and sizes of their respective photosensitive regions.
  • the sensitivities of pixels 21-24 are substantially the same.
  • microlenses 32 also increase the effective area of photosurface 30 that is used to collect light and increase the photosensitivity of each pixel 21-24 in the photosurface.
  • Photosurface 30 thus collects light from a highly symmetric and uniform grid of surface regions in a scene imaged with the photosurface. Data acquired with photosurface 30 is therefore substantially less complex to process than data acquired with photosurface 20 shown in Fig. 1 and may be processed using available image processing algorithms.
  • microlenses 32 are centered over photosensitive regions 28, in some embodiments of the present invention a microlens 32 may be positioned so that its optic axis is not centered on the photosensitive region of the pixel to which it is coupled. In such instances light collected by the microlens may be directed to the photosensitive region using an optical wedge.
  • Fig. 3 schematically shows a photosurface 40 comprising, by way of example, square microlenses 42 having filleted corners 43, in accordance with an embodiment of the present invention. Except for the shape and size of microlenses 42, photosurface 40 is similar to photosurface 30 shown in Fig. 2. As a result of microlenses 42, as in the case of photosurface 30, photosurface 40 collects light from a highly symmetric and uniform grid of surface regions in a scene imaged with the photosurface.
  • sensitivity of photosurfaces manufactured at the fab is enhanced by forming the photosurfaces with microlenses coupled to IR and R pixels in the photosurfaces that are larger than microlenses coupled to G or B pixels in the photosurfaces.
  • FIG. 4 schematically shows an IR-RGB photosurface 60 comprising pixels 21-24 having a same tiling pattern as pixels 21-24 in photosurfaces shown in Figs. 1-3, in which different size microlenses are used to adjust relative sensitivities of the pixels, in accordance with an embodiment of the present invention.
  • G pixels 23 and B pixels 24 are each coupled to a circular microlens 122 having a same radius (non-circular microlenses can also be used, e.g. rectangular microlenses).
  • Each R pixel 22, on the other hand, is coupled to a microlens 124 that is substantially larger than microlenses 122, and each IR pixel 21 is coupled to a microlens 126 larger than microlens 124.
  • photosurface 60 has enhanced sensitivity to IR and R light in comparison to G or B light.
  • Fig. 5 shows a schematic cross section of a portion of an IR-RGB photosurface 50 that may be used with an IR light source (not shown), which illuminates scenes imaged by the photosurface. IR distance images of a scene provided by photosurface 50 are substantially uncontaminated by exposure of the photosurface to visible light.
  • RGB picture images of the scene are substantially uncontaminated by exposure of the photosurface to ambient IR light and IR light provided by the IR light source.
  • IR, R, G and B pixels are shown having a same size by way of example, and a photosurface, in accordance with an embodiment of the present invention, similar to photosurface 50 may have IR, R, G and B pixels having different sizes.
  • Photosurface 50 comprises R, G, B and IR pixels 51, 52, 53 and 54 respectively.
  • Each R pixel 51 comprises a photosensitive pixel 61 and an R filter 71.
  • each G pixel 52 comprises a photosensitive pixel 62 and a G filter 72 and each B pixel 53 comprises a photosensitive pixel 63 and a B filter 73.
  • Each IR pixel 54 comprises a light sensitive pixel 64 shielded by a "black filter" 74 that is substantially opaque to visible light.
  • a "blanket" IR filter 80 covers all pixels 51-54 in photosurface 50.
  • IR blanket filter 80 may optionally be formed on a glass cover plate 82 that protects pixels 51-54 in photosurface 50.
  • IR blanket filter 80 is substantially transparent to visible light but transmits IR light substantially only in a narrow band of wavelengths centered on a wavelength of IR light radiated by the light source.
  • Blanket IR filter 80 reduces sensitivity of RGB pixels 51, 52 and 53 to IR light. However blanket filter 80 does allow some IR light incident on photosurface 50 to reach RGB pixels 51-53. Amounts of IR light incident on RGB pixels 51-53 can be estimated from signals generated by IR pixels 54 responsive to IR light incident on photosurface 50.
  • photosurface 50 is used with a processor (not shown) that receives signals from RGB and IR pixels 51-54 responsive to light that they receive.
  • the processor corrects signals from RGB pixels 51-53 for contamination thereof resulting from IR light incident on the pixels, responsive to signals generated by IR pixels 54.
  • Fig. 6 schematically shows a 3D-picture camera 90, in accordance with an embodiment of the present invention, being used to provide a picture of elephants 92 and distances to surface regions of the elephants. Only those elements and components of 3D- picture camera 90 germane to the discussion are shown in Fig. 6.
  • 3D-picture camera 90 comprises an IR-RGB photosurface 94 similar to photosurface
  • Photosurface 94 is, optionally, tiled with IR and RGB pixels 21-24 in a tiling pattern similar to the tiling patterns shown in Figs. 2 and 3 and comprises circular microlenses 32 that compensate the photosurface for differences in size of the IR and RGB pixels.
  • IR pixels 21 are shielded by black filters (not shown) similar to black filters 74 shown in Fig. 5 and photosurface 94 comprises a narrow band blanket IR filter (not shown) similar to blanket filter 80, also shown in Fig. 5.
  • IR pixels 21 are used to provide a depth map of elephants 92 and RGB pixels 22-24 are used to provide a picture of the elephants.
  • IR pixels 21 are gated pixels and each IR pixel 21 comprises circuitry for gating the pixel on and off similar, optionally, to circuitry described in PCT publication WO 00/19705.
  • an IR light source 96 illuminates elephants 92 with a train of light pulses 98.
  • a controller 100 controls circuitry in IR pixels 21 to gate the pixels on and off following each pulse of light 98, preferably using methods and gating sequences similar to those described in WO 00/19705.
  • Intensities of pulses of IR light 102 reflected from the train of light pulses 98 by elephants 92, which are registered by IR pixels 21, are used to determine distances to the elephants.
  • Intensities of light registered by IR pixels 21 are optionally processed to determine distances to elephants 92 using methods described in PCT Publication WO 00/19705 and US Patents 6,057,909, 6,091,905 and 6,100,517 referenced above.
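As a highly simplified sketch of how registered gated intensities can encode distance (the actual methods of WO 00/19705 and the cited US patents are more elaborate and are not reproduced here): assume a rectangular light pulse of width T and a gate of the same width opening at delay d after the pulse is radiated. The fraction of the reflected pulse falling inside the gate then varies linearly with the round-trip time and can be inverted to a distance. All parameters below are illustrative:

```python
# Hedged sketch of gated range estimation. "fraction" is the intensity a
# pixel registers during the gate divided by the intensity it registers
# in a long "all light" reference gate. The linear inversion below is
# valid only for reflections that start arriving at or after the gate
# opens, i.e. round-trip times in [gate_delay, gate_delay + pulse_width].

C = 299_792_458.0  # speed of light, m/s

def distance_from_gated_fraction(fraction: float, gate_delay_s: float,
                                 pulse_width_s: float) -> float:
    """Invert a registered-intensity fraction (0..1) to a distance in metres."""
    # fraction == 1 means the whole pulse fell in the gate (round trip == delay);
    # smaller fractions mean the pulse arrived later and was partly cut off.
    t_round_trip = gate_delay_s + (1.0 - fraction) * pulse_width_s
    return C * t_round_trip / 2.0

# Half the pulse energy registered, with a 100 ns gate delay and a 20 ns
# pulse, implies a round trip of 110 ns.
print(distance_from_gated_fraction(0.5, 100e-9, 20e-9))  # ≈ 16.49 m
```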

Abstract

A photosurface for imaging a scene comprising: a plurality of pixels, each having a photosensitive region, wherein at least two of the pixels have different size photosensitive regions; and a plurality of microlenses, each of which collects light and directs the collected light to the photosensitive region of a different one of the pixels, wherein pixels having different size photosensitive regions have same size and shape microlenses.

Description

3D CAMERA
FIELD OF THE INVENTION
The invention relates to photosurfaces used for imaging a scene and in particular to light collection in photosurfaces that are used to provide both an image of a scene and a depth map of the scene.
BACKGROUND OF THE INVENTION
3D cameras that provide distance measurements to regions of a scene that they image are well known in the art. Gated 3D cameras comprise a photosurface, such as a CCD or CMOS photosurface, and a gating means for gating the photosurface on and off, such as an electro-optical shutter or a gated image intensifier. To image a scene and determine distances from the camera to regions in the scene, the scene is generally illuminated with a train of light pulses radiated from an appropriate light source. Generally, the radiated light pulses are infrared (IR) light pulses. For each radiated light pulse in the train, following an accurately determined delay from the time that the light pulse is radiated, the photosurface is gated on for a period of time, hereinafter referred to as a "gate". Light from the light pulse that is reflected from a region in the scene is imaged on the photosurface if it reaches the camera during the gate. Since the time elapsed between radiating a light pulse and the gate that follows it is known, the time it took imaged light to travel from the light source to the reflecting region in the scene and back to the camera can be determined. The time elapsed is used to determine the distance to the reflecting region. In some of these 3D cameras, only the timing between light pulses and gates is used to determine the distance from the 3D camera to a point in the scene imaged on a pixel of the photosurface of the 3D camera. In others, the amount of light registered by the pixel during the time that the camera is gated open is also used to determine the distance. Gated 3D cameras and examples of their uses are found in US Patents 6,057,909,
6,091,905 and 6,100,517, which are incorporated herein by reference.
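The timing relationship described above can be sketched numerically. The following is an illustrative calculation only, not the gating circuitry of any referenced patent, and the pulse and gate times are hypothetical.

```python
# Time-of-flight distance sketch: the known interval between radiating a
# light pulse and registering its reflection is the round-trip travel
# time of the light, so the distance is half that time multiplied by the
# speed of light.

C = 299_792_458.0  # speed of light in m/s

def distance_from_timing(emit_time_s, register_time_s):
    """Distance (meters) to a reflecting region, given the time a pulse
    was radiated and the time its reflection reached the photosurface."""
    round_trip = register_time_s - emit_time_s
    return C * round_trip / 2.0

# A reflection arriving 100 ns after the pulse was radiated corresponds
# to a region roughly 15 m from the camera.
d = distance_from_timing(0.0, 100e-9)
```

Cameras that also use the amount of light registered during the gate refine this estimate, but the timing relation above is the core of the method.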
Some 3D cameras, hereinafter referred to as "3D-picture cameras", provide a picture of a scene that they image as well as a depth map of the scene. In some 3D-picture cameras the picture is a black and white picture, while in others the picture is a color picture. PCT application PCT/IL99/00490, the disclosure of which is incorporated herein by reference, describes various configurations of 3D-picture cameras. The described cameras comprise different photosurfaces for different imaging functions of the camera. For example, some of the described cameras comprise an IR sensitive photosurface for registering IR light used to provide a depth map of a scene and separate R, G, and B photosurfaces for providing a color image of the scene.
There are substantial advantages to be achieved in simplicity, robustness and accuracy of registration of a depth map of a scene to a picture of the scene, by combining image-sensing functions of a 3D-picture camera in a single photosurface. PCT Publication WO 00/19705, the disclosure of which is incorporated herein by reference, describes 3D-picture cameras comprising a single photosurface that provides both a depth map of a scene being imaged with the camera and a picture of the scene. The photosurface comprises "distance" pixels that are used for determining distances to regions of a scene imaged with the camera and "picture" pixels that are used to provide a picture of the imaged scene. Hereinafter, a photosurface comprising both distance pixels and picture pixels is referred to as a "3D-picture photosurface".
SUMMARY OF THE INVENTION
An aspect of some embodiments of the present invention relates to providing a 3D-picture camera comprising a 3D-picture photosurface in which some pixels have sizes and/or shapes that are different from other pixels in the photosurface.
An aspect of some embodiments of the present invention relates to providing a 3D-picture camera comprising a 3D-picture photosurface in which some pixels have photosensitive regions that have sizes different from photosensitive regions of other pixels in the photosurface.
Distance pixels generally require processing circuitry that is larger and more complex than circuitry comprised in picture pixels in a photosurface. Furthermore, because of the relative complexity of processing circuitry in a distance pixel, a distance pixel is usually more sensitive to crosstalk between a region of the pixel in which its circuitry is located and a photosensitive region of the pixel. Examples of processing circuitry comprised in distance pixels in a photosurface of a 3D camera are described in PCT publication WO 00/19705 referenced above. In addition, there is generally less light available for imaging a scene to provide a depth map of the scene than there is available for imaging the scene to provide a picture of the scene. Therefore, in accordance with some embodiments of the present invention, a 3D-picture camera comprises a 3D-picture photosurface having distance pixels that are substantially larger than picture pixels. The larger size of distance pixels compared to picture pixels provides more space in the distance pixels for processing circuitry. In some embodiments of the present invention the distance pixels also have photosensitive regions that are larger than photosensitive regions of picture pixels. The larger size photosensitive and circuit regions of the distance pixels tend to reduce crosstalk between circuit regions of the pixels and photosensitive regions of the pixels. The larger photosensitive regions of the distance pixels also enhance their photosensitivity.
According to an aspect of some embodiments of the present invention, in order to tile a 3D-picture photosurface having different size pixels, pixels in the photosurface have different shapes.
As a result of the different sizes of pixels in a 3D-picture photosurface, in accordance with some embodiments of the present invention, conventional tiling patterns for pixels in a photosurface often cannot provide an appropriate pattern for allocating space for both distance and picture pixels in the 3D-picture photosurface.
However, "unconventional" tiling patterns that might provide a suitable partition of a 3D-picture photosurface for accommodating both distance and picture pixels in the photosurface may increase complexity of algorithms used for processing imaging data acquired with the photosurface. In particular, conventional image processing algorithms used to generate images from light intensities registered by pixels in a photosurface generally assume that pixels acquire light from a highly symmetric grid of equal size and shape regions imaged with the photosurface. These algorithms may not be usable to process images acquired with pixels tiled in an unconventional tiling pattern.
An aspect of some embodiments of the present invention relates to providing a 3D-picture camera comprising a 3D-picture photosurface, which is compensated for differences in size and/or shape of pixels comprised in the photosurface and for differences in size of their respective photosensitive regions. Algorithms for processing imaging data acquired with the photosurface may thereby be simplified and currently available image processing algorithms may be applied to processing the data.
A 3D-picture camera, in accordance with some embodiments of the present invention comprises a 3D-picture photosurface having microlenses coupled to pixels in the photosurface that compensate for differences in sizes and/or shapes of the pixels and their respective photosensitive regions. In accordance with some embodiments of the present invention, all the microlenses in the photosurface have a same size and shape and are distributed in a symmetric pattern over the photosurface. (As used herein, the shape and size of a microlens refers to the shape and size of the aperture of the microlens.) As a result, in spite of differences in sizes and shapes of photosensitive regions of pixels in the photosurface, each pixel in the photosurface images a same size and shape region of a scene imaged with the photosurface. The photosurface thus collects light from a symmetric, uniform grid of equal size and shape regions of a scene imaged with the photosurface. Therefore, processing imaging data acquired with the photosurface to provide depth images and pictures of scenes imaged with the photosurface is simplified and currently available image processing algorithms may be applied to processing the data.
It is noted that microlenses are often used in prior photosurfaces for increasing light gathering efficiency of pixels in the photosurfaces. However, in prior art, microlenses are generally the same size as the pixels to which they are coupled and are not used to compensate for differences in sizes of the pixels.
In some embodiments of the present invention, distance pixels in a 3D-picture photosurface are IR pixels that register IR light and IR light is used to image a scene to determine distances to the scene. In some embodiments of the present invention a picture of the scene is a color picture and picture pixels of the 3D-picture photosurface are RGB pixels. Hereinafter, for convenience of presentation it is assumed that distance pixels are IR pixels and picture pixels are RGB pixels. However, it should be noted that aspects of the present invention are not limited to IR-RGB photosurfaces and are applicable to 3D-picture photosurfaces having other spectral sensitivities. A 3D-picture photosurface comprising IR and RGB pixels, in accordance with an embodiment of the present invention, is referred to as an IR-RGB photosurface.
In some embodiments of the present invention, microlenses are used to adjust relative photosensitivity of pixels in a photosurface. For example, in an IR-RGB photosurface for which it is desired to enhance IR sensitivity relative to RGB sensitivity, IR pixels in the photosurface are coupled to microlenses that are larger than microlenses to which RGB pixels are coupled. If, in addition, it is desired to enhance R sensitivity of the photosurface relative to G and B sensitivity, the R pixels are coupled to microlenses that are larger than microlenses coupled to the G and B pixels.
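Since a microlens directs all light incident on its aperture onto a single photosensitive region, relative sensitivity of a pixel scales, to first order, with the area of its microlens aperture. A minimal sketch of this scaling, with hypothetical radii:

```python
import math

def aperture_gain(radius, reference_radius):
    """Relative light collection of two circular microlens apertures;
    to first order, collection scales with aperture area."""
    return (math.pi * radius ** 2) / (math.pi * reference_radius ** 2)

# Hypothetical example: if IR microlenses have twice the radius of the
# G and B microlenses, IR pixels collect about four times the light.
ir_gain = aperture_gain(2.0, 1.0)
```

Real gains would also depend on microlens optical quality and the spectral response of the photosensitive material, which this first-order sketch ignores.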
In a 3D-picture camera that comprises a 3D-picture photosurface, for which distance pixels and picture pixels image a scene with light having wavelengths in different wavelength bands, exposure of the photosurface to light used by distance pixels can degrade quality of a picture of the scene provided by the picture pixels. Similarly, exposure of the photosurface to light used by the picture pixels can degrade accuracy of a depth map of the scene provided by the camera. For example, in a 3D-picture camera comprising a 3D-picture photosurface, in which distance pixels image a scene with IR light and picture pixels are RGB color pixels, IR light registered by the RGB pixels can degrade quality of the picture provided by the camera. Similarly, visible light registered by the IR pixels can adversely affect accuracy of the depth map provided by the camera.
An aspect of some embodiments of the present invention relates to providing a 3D-picture camera comprising an IR-RGB photosurface that provides a color picture and a depth map of a scene that are substantially uncontaminated by exposure of the photosurface to IR and visible light respectively. In accordance with an embodiment of the present invention, the photosurface in the camera comprises a "blanket filter" that protects substantially the entire area of the photosurface. The blanket filter transmits visible RGB light but transmits IR light substantially only in a narrow band of wavelengths centered on a wavelength of IR light used with the camera to provide depth maps of a scene. In addition, a "local" black filter that is substantially opaque to visible light, but at least relatively transparent to IR light in at least the bandpass of the blanket filter, protects each IR pixel. The blanket filter reduces sensitivity of the RGB pixels to IR light and the black filters reduce sensitivity of the IR pixels to visible light. In addition, since the IR pixels are protected from visible light and the IR and RGB pixels are protected by a same IR filter (i.e. the blanket filter), IR contamination of responses of the RGB pixels can be accurately estimated from responses to IR light by IR pixels. Estimates of the IR contamination are usable, in accordance with embodiments of the present invention, to correct responses of the RGB pixels for contamination by IR light.
As a result of the blanket and black filters, in accordance with an embodiment of the present invention, the camera provides color pictures of a scene having reduced sensitivity to IR light and depth maps of the scene having reduced sensitivity to visible light.
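The combined effect of the blanket and black filters can be modeled schematically. The pass bands below are illustrative assumptions (an 850 nm illumination wavelength and idealized 0/1 transmissions are not values given in the text):

```python
# Toy transmission model of the two-filter scheme: a blanket filter over
# the whole photosurface plus a local black filter over each IR pixel.
# Wavelengths are in nanometers.

def blanket_filter(wavelength_nm):
    """Transmits visible light, and IR only in a narrow band around the
    camera's (assumed) 850 nm illumination wavelength."""
    visible = 400 <= wavelength_nm <= 700
    narrow_ir = 840 <= wavelength_nm <= 860
    return 1.0 if (visible or narrow_ir) else 0.0

def black_filter(wavelength_nm):
    """Substantially opaque to visible light, transparent to IR."""
    return 1.0 if wavelength_nm > 700 else 0.0

def ir_pixel_exposure(wavelength_nm):
    # An IR pixel sits under both filters, so transmissions multiply:
    # only the narrow IR band gets through.
    return blanket_filter(wavelength_nm) * black_filter(wavelength_nm)

def rgb_pixel_exposure(wavelength_nm):
    # An RGB pixel sits under the blanket filter only (its color filter
    # is omitted here), so narrow-band IR still leaks through.
    return blanket_filter(wavelength_nm)
```

Note that `rgb_pixel_exposure(850)` is nonzero in this model; that residual narrow-band leak is exactly the contamination the IR-pixel-based correction is meant to remove.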
It should be noted that filter configurations, in accordance with an embodiment of the present invention, described for uncoupling spectral sensitivities of IR, R, G and B pixels are also applicable, with obvious modifications, to photosurfaces comprising pixels having other spectral sensitivities. It should also be noted that whereas photosurfaces tiled with different size and/or shape pixels have been described for 3D-picture photosurfaces used in 3D-picture cameras, some aspects of the invention are not limited to such photosurfaces, nor to photosurfaces having different size and/or shape pixels. Some methods and apparatus, in accordance with embodiments of the present invention, are applicable quite generally to photosurfaces, irrespective of the applications for which the photosurfaces are used and spectral sensitivities of their pixels. For example, in a photosurface for which all the pixels have a same size and shape, relative sensitivities of the pixels can be adjusted, in accordance with an embodiment of the present invention, by coupling pixels in the photosurface to different size microlenses. There is therefore provided in accordance with an embodiment of the present invention a photosurface for imaging a scene comprising: a plurality of pixels, each having a photosensitive region, wherein at least two of the pixels have different size photosensitive regions; and a plurality of microlenses, each of which collects light and directs the collected light to the photosensitive region of a different one of the pixels, wherein pixels having different size photosensitive regions have same size and shape microlenses.
There is further provided in accordance with an embodiment of the present invention a photosurface for imaging a scene comprising: a plurality of pixels each having a photosensitive region; and a different microlens for each pixel that collects light and directs the collected light to the photosensitive region of the pixel, wherein at least two of the microlenses have different size apertures. Optionally the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures, have a same size. Alternatively, the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures, have different sizes. In some embodiments of the present invention, at least one of the microlenses has an aperture that shadows at least three pixels of the plurality of pixels.
There is also provided in accordance with an embodiment of the present invention, a photosurface for imaging a scene comprising: a plurality of pixels having photosensitive regions; and a microlens having an aperture that covers at least a portion of three pixels of the plurality of pixels, which microlens collects light and directs the collected light to the photosensitive region of one of the three pixels. Optionally portions of two pixels that are covered by the microlens do not include photosensitive regions of the two pixels.
In some embodiments of the present invention the plurality of pixels comprises R, G and B pixels. In some embodiments of the present invention the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths.
In some embodiments of the present invention the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths, wherein each IR pixel is adjacent to an R, G and B pixel and wherein the IR pixel and the three adjacent pixels form a square. Optionally, the IR pixel is larger than any of the adjacent R, G or B pixels.
In some embodiments of the present invention IR pixels comprise circuitry for determining distances to regions of a scene imaged with the photosurface. There is further provided in accordance with an embodiment of the present invention a photosurface for providing a picture of a scene and a depth map of the scene comprising: a first plurality of first pixels that generate signals usable to provide a picture of the scene responsive to light from the scene that is incident on the pixels; and a second plurality of second pixels each of which comprises circuitry controllable to generate a signal usable to determine a distance to the scene responsive to light from the scene imaged on the pixel and wherein the second pixels are larger than the first pixels.
Optionally, the first plurality of pixels comprises RGB pixels. Additionally or alternatively, the second pixels are optionally IR pixels sensitive to light in a band of IR wavelengths. Optionally, the second pixels are IR pixels sensitive to light in a band of IR wavelengths, wherein each IR pixel is adjacent to an R, G and B pixel and the IR pixel and the three adjacent pixels form a square.
In some embodiments of the present invention, the second pixels are IR pixels sensitive to light in a band of IR wavelengths and each IR pixel is adjacent to an R, G and B pixel and the IR pixel and the three adjacent pixels form a square.
In some embodiments of the present invention, the photosurface comprises a filter for each IR pixel that is substantially opaque to visible light. In some embodiments of the present invention, the photosurface comprises a filter that shields all pixels in the photosurface, which filter is substantially transparent to visible light and is substantially opaque to IR light, except for IR light in a portion of the bandpass of the IR pixels.
There is further provided in accordance with an embodiment of the present invention, a 3D camera comprising a photosurface in accordance with an embodiment of the present invention and a lens that receives light and focuses it on the photosurface.
There is further provided in accordance with an embodiment of the present invention, a photosurface for imaging a scene comprising: a first plurality of pixels sensitive to light in first and second bands of wavelengths; a second plurality of pixels sensitive to light in the first and second bands of wavelengths; a filter substantially transparent to light in the first band of wavelengths that shields all the pixels in the first and second pluralities of pixels and transmits light only in a portion of the second band of wavelengths; and a filter for each pixel in the second plurality of pixels that is substantially opaque to light in the first band of wavelengths.
BRIEF DESCRIPTION OF FIGURES
Non-limiting examples of embodiments of the present invention are described below with reference to figures attached hereto. In the figures, identical structures, elements or parts that appear in more than one figure are generally labeled with the same numeral in all the figures in which they appear. Dimensions of components and features shown in the figures are chosen for convenience and clarity of presentation and are not necessarily shown to scale. The figures are listed below.
Fig. 1 schematically shows an IR-RGB photosurface tiled with pixels having different sizes and shapes, in accordance with an embodiment of the present invention;
Fig. 2 schematically shows the IR-RGB photosurface shown in Fig. 1 with the addition of microlenses, in accordance with an embodiment of the present invention;
Fig. 3 schematically shows an IR-RGB photosurface similar to that shown in Fig. 2 but comprising a different configuration of microlenses, in accordance with an embodiment of the present invention;
Fig. 4 schematically shows an IR-RGB photosurface in which different size microlenses are used to adjust spectral sensitivity of the photosurface, in accordance with an embodiment of the present invention;
Fig. 5 schematically shows a cross section view of an IR-RGB photosurface having filters that are used to decouple spectral sensitivities of the pixels, in accordance with an embodiment of the present invention; and
Fig. 6 schematically shows a 3D-picture camera, comprising a photosurface, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
Fig. 1 schematically shows a portion of an IR-RGB photosurface 20 used in a 3D-picture camera (not shown) having a tiling configuration of IR pixels 21, R pixels 22, G pixels 23 and B pixels 24, in accordance with an embodiment of the present invention. For convenience pixels 21-24 are also labeled with their respective spectral sensitivities. Each pixel 21-24 has a shaded area 26 and an unshaded area 28. Unshaded areas 28 represent photosensitive regions of pixels 21-24 and shaded areas 26 represent regions of the pixels used for circuitry such as capacitors, amplifiers, switches etc. By way of example, IR pixels 21 have a shape and size that is different from the shapes and sizes of RGB pixels 22, 23 and 24. G pixels 23 have a shape and size that is different from the shapes and sizes of R and B pixels 22 and 24. Photosensitive regions 28 of RGB pixels 22-24 with different color sensitivity have substantially same sizes and shapes. However, IR pixels 21 have photosensitive regions 28 substantially larger than photosensitive regions 28 of RGB pixels 22-24 and in addition have substantially more processing circuitry than the RGB pixels. IR pixels 21 therefore are substantially larger than RGB pixels 22-24. In some embodiments of the present invention, processing circuitry of the IR pixels is similar to the processing circuitry described in PCT Publication WO 00/19705 referenced above. As a result of the different sizes of the photosensitive regions of IR pixels 21 and
RGB pixels 22-24, IR pixels 21 image different size regions of a scene imaged with photosurface 20 and have different photosensitivities. Algorithms for processing imaging data acquired with photosurface 20 are therefore relatively complex. For example, imaging data acquired using photosurface 20 generally requires, inter alia, normalization of intensities of light registered by pixels 21-24 to different sizes of their respective photosensitive regions 28. Furthermore, many common algorithms used to generate an image of a scene from light intensities registered by pixels in a photosurface used to image the scene assume that the pixels image same size regions of the scene and have substantially same photosensitivities. Because of the different sizes and sensitivities of IR pixels 21 and RGB pixels 22-24 these algorithms may not readily be usable to process light intensities registered by pixels 21-24 in photosurface 20.
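The normalization step mentioned above can be sketched as follows; the areas and counts are hypothetical, and a real calibration would also fold in differences in spectral response:

```python
# Sketch of area normalization for a photosurface without microlenses:
# a pixel with a larger photosensitive region registers proportionally
# more light, so raw counts are divided by photosensitive area before
# pixel responses are compared.

def normalize_counts(raw_counts, photosensitive_areas):
    """Intensity per unit photosensitive area for each pixel."""
    return [count / area
            for count, area in zip(raw_counts, photosensitive_areas)]

# Hypothetical example: an IR pixel with four times the photosensitive
# area of an R pixel that registers four times the counts has the same
# normalized intensity as the R pixel.
normalized = normalize_counts([400.0, 100.0], [4.0, 1.0])
```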
Fig. 2 schematically shows a portion of an IR-RGB photosurface 30 having a tiling pattern of pixels 21-24 identical to the tiling pattern of pixels 21-24 in photosurface 20 shown in Fig. 1 but comprising, in addition, an array of circular microlenses 32. Each microlens 32 is coupled to a different one of pixels 21-24 and collects light and directs the collected light onto a photosensitive region 28 of the pixel to which it is coupled.
All microlenses 32 have, by way of example, a same radius. Each microlens 32 coupled to an R, G or B pixel 22-24 overlies portions of circuit regions 26 of at least three adjacent pixels and collects light that would be incident on those portions of the adjacent pixels in the absence of the microlens. In particular, each microlens 32 coupled to an R, G or B pixel 22-24 overlays and collects light that would be incident on a portion of circuit region 26 of an IR pixel 21 adjacent to the R, G or B pixel. As a result of microlenses 32, each pixel 21-24 acquires light from a same size and shape region of a scene imaged using photosurface 30 despite differences in their sizes and sizes of their respective photosensitive regions. Furthermore, to within differences resulting from differences in spectral sensitivity of the material from which photosensitive regions 28 of the pixels are fabricated, e.g. the material may be more sensitive to R light than B light, the sensitivities of pixels 21-24 are substantially the same. As in prior art, microlenses 32 also increase the effective area of photosurface 30 that is used to collect light and increase the photosensitivity of each pixel 21-24 in the photosurface. Photosurface 30 thus collects light from a highly symmetric and uniform grid of surface regions in a scene imaged with the photosurface. Data acquired with photosurface 30 is therefore substantially less complex to process than data acquired with photosurface 20 shown in Fig. 1 and may be processed using available image processing algorithms.
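As an illustration of why a uniform sampling grid simplifies processing, a standard nearest-neighbor interpolation can be applied directly to the mosaic. The 2x2 IR/R/G/B tile below follows the square layout described for the photosurface; the pixel values and the averaging kernel are hypothetical, not an algorithm from the text.

```python
# Because every microlens has the same aperture, each pixel samples an
# equal scene region, and ordinary grid interpolation applies. Sketch:
# recover a channel's value at any pixel site by averaging the nearest
# samples of that channel (or the sample itself if the site carries it).

TILE = [["IR", "R"],
        ["G",  "B"]]  # 2x2 tile repeated across the photosurface

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def channel_at(mosaic, row, col, channel):
    """Estimate `channel` at (row, col) from the mosaic of raw samples."""
    if TILE[row % 2][col % 2] == channel:
        return mosaic[row][col]
    samples = [mosaic[r][c]
               for dr, dc in OFFSETS
               for r, c in [(row + dr, col + dc)]
               if 0 <= r < len(mosaic) and 0 <= c < len(mosaic[0])
               and TILE[r % 2][c % 2] == channel]
    return sum(samples) / len(samples)

# Example: a 4x4 mosaic whose R sites hold 10.0 and all other sites 0.0.
mosaic = [[10.0 if TILE[r % 2][c % 2] == "R" else 0.0 for c in range(4)]
          for r in range(4)]
r_at_ir_site = channel_at(mosaic, 0, 0, "R")
```

With non-uniform sampling regions the same interpolation would first require per-pixel resampling weights, which is the added complexity the microlens arrangement avoids.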
It should be noted that whereas in Fig. 2 microlenses 32 are centered over photosensitive regions 28, in some embodiments of the present invention a microlens 32 may be positioned so that its optic axis is not centered on the photosensitive region of the pixel to which it is coupled. In such instances light collected by the microlens may be directed to the photosensitive region using an optical wedge.
Fig. 3 schematically shows a photosurface 40 comprising, by way of example, square microlenses 42 having filleted corners 43, in accordance with an embodiment of the present invention. Except for the shape and size of microlenses 42, photosurface 40 is similar to photosurface 30 shown in Fig. 2. As a result of microlenses 42, as in the case of photosurface 30, photosurface 40 collects light from a highly symmetric and uniform grid of surface regions in a scene imaged with the photosurface.
In some situations it is advantageous to tailor spectral sensitivity of photosurfaces fabricated using a same lithographic mask set to different applications. For example, assume that a fab is configured to fabricate IR-RGB photosurfaces similar to photosurface 20 shown in Fig. 1, in which RGB pixels have same size photosensitive regions. Assume that there is a need to inspect structural projects for rust and that it is advantageous to have IR-RGB photosurfaces that have enhanced IR and enhanced R sensitivities. In accordance with an embodiment of the present invention, sensitivity of photosurfaces manufactured at the fab is enhanced by forming the photosurfaces with microlenses coupled to IR and R pixels in the photosurfaces that are larger than microlenses coupled to G or B pixels in the photosurfaces. Fig. 4 schematically shows an IR-RGB photosurface 60 comprising pixels 21-24 having a same tiling pattern as pixels 21-24 in photosurfaces shown in Figs. 1-3, in which different size microlenses are used to adjust relative sensitivities of the pixels, in accordance with an embodiment of the present invention. In photosurface 60, G pixels 23 and B pixels 24 are each coupled to a circular microlens 122 having a same radius (non-circular microlenses can also be used, e.g. rectangular microlenses). Each R pixel 22, on the other hand, is coupled to a microlens 124 that is substantially larger than microlenses 122 and each IR pixel 21 is coupled to a microlens 126 larger than microlens 124. As a result, photosurface 60 has enhanced sensitivity to IR and R light in comparison to G or B light.
Fig. 5 shows a schematic cross section of a portion of an IR-RGB photosurface 50 that may be used with an IR light source (not shown), which illuminates scenes imaged by the photosurface. IR distance images of a scene provided by photosurface 50 are substantially uncontaminated by exposure of the photosurface to visible light.
RGB picture images of the scene are substantially uncontaminated by exposure of the photosurface to ambient IR light and IR light provided by the IR light source. In Fig. 5, IR, R, G and B pixels are shown having a same size by way of example, and a photosurface, in accordance with an embodiment of the present invention, similar to photosurface 50 may have IR, R, G and B pixels having different sizes.
Photosurface 50 comprises R, G, B and IR pixels 51, 52, 53 and 54 respectively. Each R pixel 51 comprises a photosensitive pixel 61 and an R filter 71. Similarly, each G pixel 52 comprises a photosensitive pixel 62 and a G filter 72 and each B pixel 53 comprises a photosensitive pixel 63 and a B filter 73. Each IR pixel 54 comprises a light sensitive pixel 64 shielded by a "black filter" 74 that is substantially opaque to visible light.
A "blanket" IR filter 80 covers all pixels 51-54 in photosurface 50. IR blanket filter 80 may optionally be formed on a glass cover plate 82 that protects pixels 51-54 in photosurface 50. IR blanket filter 80 is substantially transparent to visible light but transmits IR light substantially only in a narrow band of wavelengths centered on a wavelength of IR light radiated by the light source.
As a result of black filters 74, signals generated by IR pixels 54 are substantially independent of visible light incident on photosurface 50. Blanket IR filter 80 reduces sensitivity of RGB pixels 51, 52 and 53 to IR light. However, blanket filter 80 does allow some IR light incident on photosurface 50 to reach RGB pixels 51-53. Amounts of IR light incident on RGB pixels 51-53 can be estimated from signals generated by IR pixels 54 responsive to IR light incident on photosurface 50. Preferably, photosurface 50 is used with a processor (not shown) that receives signals from RGB and IR pixels 51-54 responsive to light that they receive. Preferably, the processor corrects signals from RGB pixels 51-53 for contamination thereof resulting from IR light incident on the pixels, responsive to signals generated by IR pixels 54.
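The correction step described above might be sketched as follows. The coupling coefficient `k` (the fraction of the locally estimated IR level to which an RGB pixel responds) is a hypothetical calibration constant, not a value given in the text.

```python
# Sketch of IR-contamination correction: because IR pixels register
# essentially only IR light, their signals estimate the IR level that
# leaked through the blanket filter onto neighboring RGB pixels, and
# that estimate is subtracted from the raw RGB signals.

def correct_rgb_signal(raw_rgb, ir_estimate, k=0.2):
    """Visible-light component of an RGB pixel's signal.

    raw_rgb     -- signal registered by the RGB pixel (visible + IR leak)
    ir_estimate -- IR level at the pixel site, e.g. interpolated from
                   the nearest IR pixels
    k           -- hypothetical calibration constant: fraction of the
                   IR level the RGB pixel responds to
    """
    return max(raw_rgb - k * ir_estimate, 0.0)

# An R pixel reading 120 units, with nearby IR pixels indicating an IR
# level of 100 units, is corrected to 120 - 0.2 * 100 = 100 units.
visible = correct_rgb_signal(120.0, 100.0)
```

In practice `k` would be calibrated per channel, since the R, G and B color filters pass different amounts of near-IR light.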
Fig. 6 schematically shows a 3D-picture camera 90, in accordance with an embodiment of the present invention, being used to provide a picture of elephants 92 and distances to surface regions of the elephants. Only those elements and components of 3D-picture camera 90 germane to the discussion are shown in Fig. 6. 3D-picture camera 90 comprises an IR-RGB photosurface 94 similar to photosurface
30 shown in Fig. 2. Photosurface 94 is, optionally, tiled with IR and RGB pixels 21-24 in a tiling pattern similar to the tiling patterns shown in Figs. 2 and 3 and comprises circular microlenses 32 that compensate the photosurface for differences in size of the IR and RGB pixels. Optionally, IR pixels 21 are shielded by black filters (not shown) similar to black filters 74 shown in Fig. 5 and photosurface 94 comprises a narrow band blanket IR filter (not shown) similar to blanket filter 80 also shown in Fig. 5. IR pixels 21 are used to provide a depth map of elephants 92 and RGB pixels 22-24 are used to provide a picture of the elephants.
In some embodiments of the present invention, IR pixels 21 are gated pixels and each IR pixel 21 comprises circuitry for gating the pixel on and off similar, optionally, to circuitry described in PCT publication WO 00/19705. To determine distances to surface regions of elephants 92 an IR light source 96 illuminates elephants 92 with a train of light pulses 98. A controller 100 controls circuitry in IR pixels 21 to gate the pixels on and off following each pulse of light 98, preferably using methods and gating sequences similar to those described in WO 00/19705. Intensities of pulses of IR light 102 reflected from the train of light pulses 98 by elephants 92, which are registered by IR pixels 98 are used to determine distances to the elephants. Intensities of light registered by IR pixels 21 are optionally processed to determine distances to elephants 92 using methods described in PCT Publication WO 00/19705 and US Patents 6,057,909, 6,091,905 and 6,100,517 referenced above. In the description and claims of the present application, each of the verbs, "comprise"
"include" and "have", and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.

The present invention has been described using detailed descriptions of embodiments thereof that are provided by way of example and are not intended to limit the scope of the invention. The described embodiments comprise different features, not all of which are required in all embodiments of the invention. Some embodiments of the present invention utilize only some of the features or possible combinations of the features. Variations of the described embodiments of the present invention, and embodiments of the present invention comprising different combinations of the features noted in the described embodiments, will occur to persons skilled in the art. The scope of the invention is limited only by the following claims.
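The gated ranging scheme described for camera 90 — intensities registered during gates that follow each emitted pulse encode round-trip travel time — can be sketched numerically. This is a minimal illustrative model under assumed gate timing, not the gating sequences of WO 00/19705:

```python
# Minimal sketch (an assumption, not the method of WO 00/19705): a
# light pulse of width T is emitted at t = 0 and the pixel is gated on
# during [T, 2T]. Light reflected from a surface at distance d returns
# delayed by tau = 2*d/c, so the gated pixel registers only the part
# of the reflected pulse inside the gate window, and the gated/ungated
# intensity ratio encodes tau.

C = 299_792_458.0  # speed of light, m/s

def gated_fraction(d: float, pulse_width: float) -> float:
    """Fraction of the reflected pulse energy falling inside the gate
    [T, 2T], for a surface at distance d (valid for 0 <= tau <= T)."""
    tau = 2.0 * d / C  # round-trip delay
    # Reflected pulse occupies [tau, tau + T]; gate occupies [T, 2T].
    overlap = min(tau + pulse_width, 2.0 * pulse_width) - max(tau, pulse_width)
    return max(0.0, overlap) / pulse_width

def distance_from_intensities(gated: float, ungated: float,
                              pulse_width: float) -> float:
    """Invert the gated/ungated intensity ratio to a distance; the
    ungated intensity normalizes out surface reflectivity."""
    tau = pulse_width * (gated / ungated)
    return C * tau / 2.0

# Example: a surface 15 m away imaged with a 200 ns pulse.
frac = gated_fraction(15.0, 200e-9)
print(distance_from_intensities(frac, 1.0, 200e-9))  # ~15.0 (meters)
```

Dividing by the ungated intensity is what makes the scheme insensitive to how bright or dark a surface region is; only the timing of the reflected pulse, and hence the distance, remains.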

Claims

1. A photosurface for imaging a scene comprising: a plurality of pixels, each having a photosensitive region, wherein at least two of the pixels have different size photosensitive regions; and a plurality of microlenses, each of which collects light and directs the collected light to the photosensitive region of a different one of the pixels, wherein pixels having different size photosensitive regions have microlenses with a same size and shape aperture.
2. A photosurface for imaging a scene comprising: a plurality of pixels each having a photosensitive region; and a different microlens for each pixel that collects light and directs the collected light to the photosensitive region of the pixel, wherein at least two of the microlenses have different size apertures.
3. A photosurface according to claim 2 wherein the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures, have a same size.
4. A photosurface according to claim 2 wherein the photosensitive regions of the pixels, which are coupled to microlenses having different size apertures, have different sizes.
5. A photosurface according to any of claims 1-4 wherein at least one of the microlenses has an aperture that shadows at least three pixels of the plurality of pixels.
6. A photosurface for imaging a scene comprising: a plurality of pixels having photosensitive regions; and a microlens having an aperture that covers at least a portion of three pixels of the plurality of pixels, which microlens collects light and directs the collected light to the photosensitive region of one of the three pixels.
7. A photosurface according to claim 6 wherein portions of two pixels that are covered by the microlens do not include photosensitive regions of the two pixels.
8. A photosurface according to any of the preceding claims wherein the plurality of pixels comprises R, G and B pixels.
9. A photosurface according to any of the preceding claims wherein the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths.
10. A photosurface according to claim 8 wherein the plurality of pixels comprises IR pixels sensitive to light in a band of IR wavelengths, wherein each IR pixel is adjacent to R, G and B pixels and wherein the IR pixel and the three adjacent pixels form a square.
11. A photosurface according to claim 10 wherein the IR pixel is larger than any of the adjacent R, G or B pixels.
12. A photosurface according to any of claims 9-11 wherein the IR pixels comprise circuitry for determining distances to regions of a scene imaged with the photosurface.
13. A photosurface for providing a picture of a scene and a depth map of the scene comprising: a first plurality of first pixels that generate signals usable to provide a picture of the scene responsive to light from the scene that is incident on the pixels; and a second plurality of second pixels each of which comprises circuitry controllable to generate a signal usable to determine a distance to the scene responsive to light from the scene imaged on the pixel and wherein the second pixels are larger than the first pixels.
14. A photosurface according to claim 13 wherein the first plurality of pixels comprises RGB pixels.
15. A photosurface according to claim 13 or claim 14 wherein the second pixels are IR pixels sensitive to light in a band of IR wavelengths.
16. A photosurface according to claim 14 wherein the second pixels are IR pixels sensitive to light in a band of IR wavelengths and each IR pixel is adjacent to an R, G and B pixel and the IR pixel and the three adjacent pixels form a square.
17. A photosurface according to any of claims 9-12, 15 or claim 16 and comprising a filter for each IR pixel that is substantially opaque to visible light.
18. A photosurface according to any of claims 9-12 or claims 15-17 and comprising a filter that shields all pixels in the photosurface, which filter is substantially transparent to visible light and is substantially opaque to IR light except for IR light in a portion of the bandpass of the IR pixels.
19. A 3D camera comprising a photosurface according to any of claims 1-18 and a lens that receives light and focuses it on the photosurface.
20. A photosurface for imaging a scene comprising: a first plurality of pixels sensitive to light in first and second bands of wavelengths; a second plurality of pixels sensitive to light in the first and second bands of wavelengths; a filter substantially transparent to light in the first band of wavelengths that shields all the pixels in the first and second pluralities of pixels and transmits light only in a portion of the second band of wavelengths; and a filter for each pixel in the second plurality of pixels that is substantially opaque to light in the first band of wavelengths.
21. A method of directing light to four pixels comprised in a photosurface, wherein each pixel has a photosensitive region and is adjacent to the other three pixels, the method comprising: coupling a different microlens having an aperture to each of three of the pixels, which microlens collects light and directs the collected light to the photosensitive region of the pixel; and coupling a microlens having an aperture to the fourth pixel, which microlens collects light and directs the collected light to the photosensitive region of the fourth pixel and wherein the aperture of the microlens coupled to the fourth pixel covers portions of at least two of the pixels adjacent to the fourth pixel.
22. A method according to claim 21 wherein the aperture of the microlens coupled to the fourth pixel covers portions of all three adjacent pixels.
23. A method according to claim 21 or claim 22 wherein the portions of the adjacent pixels covered by the microlens do not include photosensitive regions of the pixels.
24. A method for changing the relative sensitivities of first and second pixels in a photosurface, which first pixel comprises a first photosensitive region having a first area and which second pixel comprises a second photosensitive region having a second area, the method comprising: coupling a first microlens having a first aperture to the first pixel; and coupling a second microlens having a second aperture to the second pixel, wherein the apertures of the first and second microlenses are different.
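Claims 8-11 describe a photosurface tiled so that each IR pixel is adjacent to an R, a G and a B pixel, the four pixels forming a square. One possible realization of that unit cell can be sketched as follows; the placement of the channels within the square is an assumption, since the patent's figures (not reproduced here) fix the actual pattern:

```python
# One possible 2x2 unit cell for the IR/R/G/B tiling of claims 8-11;
# the orientation of the channels within the square is an assumption.
TILE = [["R", "G"],
        ["B", "IR"]]

def mosaic(rows, cols):
    """Tile a rows x cols photosurface with the 2x2 unit cell, so that
    every IR pixel is adjacent to an R, a G and a B pixel."""
    return [[TILE[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in mosaic(4, 4):
    print(" ".join(f"{p:>2}" for p in row))
```

In such a layout every 2x2 square contains exactly one pixel of each channel, which is consistent with claim 10's requirement that an IR pixel and its three adjacent R, G and B pixels form a square.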



Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1757087A4 (en) * 2004-04-16 2009-08-19 James A Aman Automatic event videoing, tracking and content generation system
US8319846B2 (en) 2007-01-11 2012-11-27 Raytheon Company Video camera system using multiple image sensors
KR101467509B1 (en) 2008-07-25 2014-12-01 삼성전자주식회사 Image sensor and operating method for image sensor
JP5512675B2 * 2008-08-03 2014-06-04 Microsoft International Holdings B.V. Rolling camera system
KR101643376B1 (en) * 2010-04-02 2016-07-28 삼성전자주식회사 Remote touch panel using light sensor and remote touch screen apparatus having the same
US20120154535A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Capturing gated and ungated light in the same frame on the same photosurface
KR101951318B1 (en) 2012-08-27 2019-04-25 삼성전자주식회사 3D image acquisition apparatus and method of obtaining color and depth images simultaneously
US9985063B2 (en) * 2014-04-22 2018-05-29 Optiz, Inc. Imaging device with photo detectors and color filters arranged by color transmission characteristics and absorption coefficients
US10002893B2 (en) 2014-05-19 2018-06-19 Samsung Electronics Co., Ltd. Image sensor including hybrid pixel structure
US9369681B1 (en) * 2014-11-25 2016-06-14 Omnivision Technologies, Inc. RGBC color filter array patterns to minimize color aliasing
US10942274B2 (en) 2018-04-11 2021-03-09 Microsoft Technology Licensing, Llc Time of flight and picture camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5592223A (en) * 1992-11-11 1997-01-07 Sony Corporation Charge-coupled device having on-chip lens
JPH09116127A (en) * 1995-10-24 1997-05-02 Sony Corp Solid-state image sensor
US6137100A (en) * 1998-06-08 2000-10-24 Photobit Corporation CMOS image sensor with different pixel sizes for different colors

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5453611A (en) * 1993-01-01 1995-09-26 Canon Kabushiki Kaisha Solid-state image pickup device with a plurality of photoelectric conversion elements on a common semiconductor chip
JP4398562B2 * 2000-03-07 2010-01-13 Hoya Corporation Focus adjustment mechanism of 3D image detector
US6456793B1 (en) * 2000-08-03 2002-09-24 Eastman Kodak Company Method and apparatus for a color scannerless range imaging system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1997, no. 09, 30 September 1997 (1997-09-30) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6993255B2 (en) 1999-02-16 2006-01-31 3Dv Systems, Ltd. Method and apparatus for providing adaptive illumination
US7355648B1 (en) 1999-02-16 2008-04-08 3Dv Systems Ltd. Camera having a through the lens pixel illuminator
WO2006130517A1 (en) * 2005-06-01 2006-12-07 Eastman Kodak Company Asymmetrical microlenses on pixel arrays
US7456380B2 (en) 2005-06-01 2008-11-25 Eastman Kodak Company Asymmetrical microlenses on pixel arrays
US8780257B2 2011-04-28 2014-07-15 Commissariat à l'énergie atomique et aux énergies alternatives Imager device for evaluating distances of elements in an image
JP2016529491A * 2013-12-24 2016-09-23 Softkinetic Sensors NV Time-of-flight camera system
JP2015206634A * 2014-04-18 2015-11-19 Hamamatsu Photonics K.K. Distance image sensor
US10436908B2 (en) 2014-04-18 2019-10-08 Hamamatsu Photonics K.K. Range image sensor

Also Published As

Publication number Publication date
WO2002049367A3 (en) 2003-03-06
WO2002049367A2 (en) 2002-06-20
AU2001218821A1 (en) 2002-06-24

Similar Documents

Publication Publication Date Title
WO2002049366A1 (en) 3d camera
TWI605297B (en) Image sensor and imaging system with symmetric multi-pixel phase-difference detectors, and associated detecting methods
US9681057B2 (en) Exposure timing manipulation in a multi-lens camera
EP1214609B1 (en) 3d imaging system
CN101682692B (en) Compound eye camera module
US7119842B2 (en) Image capturing device including a spectrally-selectively transmissive diaphragm
US20080165257A1 (en) Configurable pixel array system and method
EP1178333A2 (en) Method and apparatus for a color scannerless range imaging system
KR20010072091A (en) Color imaging system with infrared correction
JP2011176715A (en) Back-illuminated image sensor and imaging apparatus
CN102203655A (en) Image capturing apparatus
US6885400B1 (en) CCD imaging device and method for high speed profiling
JP2013157442A (en) Image pickup element and focal point detection device
CN102484723A (en) Solid-state image capturing element, image capturing device and signal processing method
JP2009164654A (en) Compound eye camera module
JP2003092392A (en) Image pickup unit
US20050151863A1 (en) Arrangement in a measuring system
US20040051806A1 (en) Integrated-circuit technology photosensitive sensor
US6535249B1 (en) Digital camera optical system with field lens
CN107221544B (en) Multispectral image pickup device and image pickup method thereof
CN212628099U (en) Image pickup apparatus
JP2661037B2 (en) Optical device for focus detection
JPH0277001A (en) Prism for video camera
WO2021210060A1 (en) Solid-state imaging element, imaging device, endoscope device, and operation microscope system
JPH0617394Y2 (en) Endoscope using image sensor

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP