WO1999034592A1 - Wide dynamic range optical sensor - Google Patents

Wide dynamic range optical sensor

Info

Publication number
WO1999034592A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
signal
group
pixel cell
optical sensor
Prior art date
Application number
PCT/US1998/027821
Other languages
French (fr)
Inventor
Joseph S. Stam
Jon H. Bechtel
Eric R. Fossum
Sabrina E. Kemeny
Original Assignee
Gentex Corporation
Priority date
Filing date
Publication date
Application filed by Gentex Corporation
Priority to AU19495/99A (AU1949599A)
Priority to CA002315145A (CA2315145A1)
Priority to DE69816126T (DE69816126T2)
Priority to JP2000527085A (JP2002500476A)
Priority to EP98964332A (EP1044561B1)
Publication of WO1999034592A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • H04N25/58Control of the dynamic range involving two or more exposures
    • H04N25/587Control of the dynamic range involving two or more exposures acquired sequentially, e.g. using the combination of odd and even image fields
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/76Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/77Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components

Abstract

A system and method is described for increasing effective integration time of an optical sensor including holding a first signal within each pixel cell, proportional to light integrated by the pixel cell over the previous frame period, generating a second signal within each pixel cell proportional to light integrated by the pixel cell over the current frame period, and summing the first signal and the second signal from each pixel, thereby producing an output signal representing the light integrated by each pixel over two frame periods. If saturation of pixel cells is possible, a further method of extending dynamic range is described including generating and storing a first signal in each pixel cell indicative of light integrated by the pixel cell over a long period, generating a second signal in each pixel cell indicative of light integrated by the pixel cell over a short period, and determining an output for each pixel as the first signal whenever the first signal is less than a threshold value, otherwise determining the output as the second signal. Also included are correlated double sampling for noise reduction, interlacing for increased integration time, and individual pixel reset for additional gains in dynamic range.

Description

WIDE DYNAMIC RANGE OPTICAL SENSOR
Technical Field
The present invention relates in general to optical sensors and in particular to CMOS photogate active pixel sensor array optical sensors with wide dynamic range.
Background Art
Optical sensors find a variety of uses, including satellite attitude sensors, video cameras and security systems. Many of these applications, such as a vehicle viewing system, are required to operate over an extensive range of conditions. For example, a wide intra-scene brightness range allows for viewing dimly lit night scenes even in the presence of glare from headlamps. A wide inter-scene brightness range allows for viewing scenes illuminated by bright sunlight as well as moonlight. Still further, frame rates must allow for displayed scenes to appear real-time.
Charge-coupled devices (CCDs) have often been the technology of choice. A CCD optical sensor operates as an analog shift register, passing charge developed in proportion to light incident on a pixel across adjacent pixels until the charge reaches an end pixel where it is processed. However, cost, read-out rate limitations, requirements for high and multiple voltage levels, and support electronics integration incompatibilities have prohibited large-scale adoption of CCDs in certain applications such as vehicle viewing systems. Unlike CCDs, active pixel sensors (APSs) utilize at least one active element within each pixel to accomplish amplification, pixel selection, charge storage or a similar benefit. As such, APS devices have many of the benefits of CCDs including high sensitivity, high signal fidelity and large array formats. Because APS cells are accessed in a row-wise manner, the problems arising from transferring charge across pixel cells, as is done in CCD sensors, are alleviated. Additional comparisons between APS cells and other devices are presented in, for example, "Active Pixel Sensors: Are CCDs Dinosaurs?" in Proceedings of SPIE: Charge-Coupled Devices and Solid State Optical Sensors III, Vol. 1900, pp. 2-14 (1993) by E. R. Fossum, which is hereby incorporated by reference.
One form of APS utilizes a photodiode p-n junction and a source-follower buffer in each pixel. However, photodiode devices typically suffer from high kTC, 1/f and fixed pattern noise, thereby limiting dynamic range.
An alternative APS design uses a metal-on-silicon (MOS) photogate to accumulate charge proportional to light incident during an integration period. The charge can be shifted to a sensing region for readout. The sensing region can also be reset, allowing a reference output indicative of noise levels. The reference can be subtracted from the integrated light value to implement correlated double sampling.
A photogate device presents several benefits. A first benefit is that photogates have a very low noise level compared to other devices such as photodiodes. This results in the need for less integration time to achieve a desired light sensitivity. A second benefit is that the photogate APS is compatible with standard CMOS manufacturing methods. This allows an APS array together with control and processing circuitry to be built on the same integrated circuit chip.
In order to accommodate wide intra-scene and inter-scene brightness levels, an increased dynamic range is required. This can be accomplished by increasing the integration time used by each pixel cell in the optical sensor. Traditionally, increasing integration time has meant a corresponding increase in frame time. Since frame time determines the rate at which the output image is updated, increasing frame time may result in output images that no longer appear real-time.
Summary Of The Invention
It is a primary object of the present invention to provide a system and method for increasing the dynamic range of optical sensors while maintaining a near real-time frame rate.
Another object of the present invention is to provide a system and method for viewing details in scenes that may be obscured due to dim lighting or masked by bright light sources.
Still another object of the present invention is to provide a system and method for viewing scenes with wide inter-scene brightness levels.
A further object of the present invention is to provide a system and method for viewing scenes with wide intra-scene brightness levels. A still further object is to describe a collection of possibly coexisting architectures for increased dynamic range optical sensors.
In carrying out the above objects and other objects and features of the present invention, a method is provided for increasing the effective integration time without a corresponding increase in the frame time.
In one embodiment, each pixel cell holds the charge value corresponding to light accumulated during a previous frame period while integration of incident light is carried out in the current frame period. At the end of the current frame period, both values are read out and summed, and the current value is stored. This double integration method produces in each frame period an output value representative of the incident light over two frame periods, effectively doubling the dynamic range.
In another embodiment, charge corresponding to light incident over a long period is held in each pixel cell. The cell then integrates incident light over a short period. Both values are read out and compared. If the long integration value is at or near saturation, the short integration value is used. This dual integration method increases the dynamic range of the pixel cell by a factor roughly equivalent to the ratio of the long integration time to the short integration time.
In still another embodiment, integration time is increased by reading a subset of pixel cells each frame period while cells not read during the current frame period continue to integrate. The values for cells not read in a given frame period can be interpolated. This interlacing method provides a tradeoff between integration time and spatial resolution.
In a further embodiment, individual or groups of pixel cells can be reset at any time, shortening their integration period. This increases dynamic range by allowing a pixel cell sensing a generally dim scene to have a longer integration time than a pixel cell sensing a smaller, brightly lit region.
In the preferred embodiment, double integration, dual integration, interlacing and individual pixel reset are all available and may be selectively used together or separately to provide increased dynamic range over wide inter-scene and intra-scene brightness conditions.
A system is also provided in accordance with the present invention for a wide dynamic range optical sensor. The system includes an array of active pixel sensor cells, one or more decoders for selecting groups of cells, output circuits that accept output from pixel cells and may perform dynamic range increase operations, noise reduction operations and analog-to-digital conversion, and control circuitry.
The above objects and other objects, features, and advantages of the present invention are readily apparent from the following detailed description of the best mode for carrying out the invention when taken in connection with the accompanying drawings.
Brief Description Of The Drawings
FIGURE 1 is a block diagram of an exemplary optical sensor according to the present invention;
FIGURE 2 is a schematic diagram illustrating the operation of a CMOS photogate active pixel sensor;
FIGURE 3 is a schematic diagram of an optical sensor architecture implementing double integration time according to the present invention;
FIGURE 4 is a schematic diagram of an optical sensor architecture implementing dual integration times according to the present invention;
FIGURE 5a is a block diagram of an optical sensor architecture implementing row-wise interlacing according to the present invention;
FIGURE 5b is a block diagram of an optical sensor architecture implementing group-wise interlacing according to the present invention; and
FIGURE 6 is a schematic diagram of an optical sensor architecture implementing individual pixel reset according to the present invention.
Best Modes For Carrying Out The Invention
Referring now to Figure 1, a block diagram of an exemplary optical sensor is shown. The optical sensor combines an array of pixel sensors with control and output circuitry. A plurality of optical pixel sensors, preferably APS cells, are arranged in rows and columns. A typical pixel sensor cell is indicated by 20. Each row of pixel sensors is selected for output by row decoder 22.
Each pixel sensor in a selected row delivers an output signal to a corresponding output circuit 24. Output circuits may condition and combine pixel sensor signals as will be described below with reference to Figures 3 and 4. Output circuits may also contain analog-to-digital converters (ADCs) for digitizing output circuit results. ADCs for optical sensors are well known as described in, for example, "Low-Light-Level Image Sensor with On-Chip Signal Processing," Proceedings of SPIE, Vol. 1952, pp. 23-33 (1993) by Mendis, Pain, Nixon and Fossum, which is hereby incorporated by reference.
Output circuits 24 may deliver digitized values to buffer 28 so as to effect a serial or parallel stream of output data. An alternative to placing an ADC in each output circuit 24 is to use a single ADC in buffer 28.
Control of the optical sensor is accomplished through timing controller 26, which sends signals to pixel sensors 20, row decoder 22 and output circuits 24. Timing controller 26 may also receive external control signals 30 indicating, for example, integration times, reset times and types of image sensor architectures in use as are described with regards to Figures 3 through 6.
CMOS Photogate APS
Referring now to Figure 2, a CMOS photogate active pixel sensor is shown. A brief description of the APS operation follows. A more detailed description is in U.S. Patent 5,471,515 entitled "Active Pixel Sensor with Intra-Pixel Charge Transfer" to Fossum, Mendis and Kemeny, which is hereby incorporated by reference.
A pixel sensor cell is shown generally by 20. Photogate electrode 40 overlays silicon substrate 42. Photogate signal PG is held at a positive voltage to form potential well 44 in substrate 42. Light incident on photogate 40 generates charge, which is accumulated in well 44 during an integration period. Transfer gate electrode 46 is initially held at a less positive voltage than photogate signal PG, so as to form potential barrier 48 adjacent to well 44. Floating diffusion 50 is connected to the gate of source follower FET 52, whose drain is connected to drain voltage VDD. Reset electrode 54 is initially held by reset signal RST at a voltage corresponding to transfer gate voltage TX to form transfer barrier 56 thereunder. Supply voltage VDD connected to drain diffusion 58 creates a constant potential well 60 beneath diffusion 58. Row select FET 62 presents a voltage at node OUT, referenced as 68, proportional to charge stored under floating diffusion 50 when signal ROW is asserted. A constant voltage is applied by signal VLN at the gate of load FET 64. Load FET 64 may be implemented in each APS cell or one transistor may be used for a column of APS cells. Transistors 52, 62 and 64 form a switched buffer circuit. The region beneath floating diffusion 50 is reset by temporarily bringing reset electrode 54 to a voltage near VDD so as to create reset potential level 66. Asserting signal ROW causes a reference voltage to appear at OUT 68. This reference voltage can be subtracted from integrated light readings to limit the effect of kTC and threshold-voltage non-uniformity induced fixed pattern noise, a technique known as correlated double sampling.
At the end of the integration period, electrons in well 44 are transferred by decreasing the photogate voltage PG such that charge is moved from well 44 to beneath floating diffusion 50. This changes the potential level of the floating gate from reset level 66 to level 70 indicative of the charge amount accumulated during the integration period. A voltage proportional to the integrated charge is delivered to node OUT 68 by asserting signal ROW on FET 62.
It should be noted that floating diffusion 50 acts as a charge storage device, and can hold the charge from a previous integration while photogate 40 integrates new charge.
In an alternative embodiment, floating diffusion 50 is replaced by a floating gate shown schematically in Figure 2 by a simplified dashed line floating gate electrode 72. The basic operation of floating gate electrode 72 is similar to that of floating diffusion 50.
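The readout just described reduces to a simple subtraction. The following sketch, in Python purely for illustration, shows the correlated double sampling arithmetic with assumed voltage values; it is not part of the patent's circuitry.
```python
# Correlated double sampling (CDS) sketch: the reset reference sampled at node OUT is
# subtracted from the integrated-light reading sampled at the same node, so kTC and
# fixed-pattern offsets common to both samples cancel. Voltage values are assumed.

def correlated_double_sample(signal_reading_v: float, reset_reference_v: float) -> float:
    """Return the noise-reduced pixel value: integrated reading minus reset reference."""
    return signal_reading_v - reset_reference_v

# Example with assumed levels (volts):
print(correlated_double_sample(signal_reading_v=1.25, reset_reference_v=0.25))  # 1.0
```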
Double Integration Period Architecture
In order to achieve an output image that appears real-time, at least 30 frames must be generated each second. This implies that, if each pixel cell is to be read each frame, an integration period not greater than 33 milliseconds is required. Double integration sums the charge integrated over two frame periods. For optical sensors that may be used to view dimly lit scenes such as, for example, vehicle viewing systems, an effective integration time of approximately twice the frame period is achieved while still maintaining a near real-time output.
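The arithmetic above can be checked directly; a short sketch using the figures quoted in the text (30 frames per second, integration summed over two frame periods):
```python
# Frame-rate arithmetic from the text: 30 frames per second gives a ~33 ms frame
# period, and double integration yields roughly twice that as effective integration time.
frame_rate_hz = 30
frame_period_ms = 1000 / frame_rate_hz            # ~33.3 ms
effective_integration_ms = 2 * frame_period_ms    # ~66.7 ms with double integration
print(round(frame_period_ms, 1), round(effective_integration_ms, 1))  # 33.3 66.7
```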
Referring now to Figure 3, a schematic diagram for the output circuit for a double integration image sensor is shown. Photogate APS pixel sensors 20 in a given column have output nodes 68 tied together to form column bus 80. The operation of an APS pixel sensor cell is described with regards to Figure 2 above. The switched output buffer of Figure 2 has been replaced by row switch 82 for clarity. Since only one row is selected at any given time, at most one pixel sensor in the column has an output that does not appear as high impedance. Column bus 80 is also connected to double integration output circuit 84.
Within output circuit 84, column bus 80 is connected to the inputs of sample-and-hold signal 1 (SHS1) switch 86, sample-and-hold reset (SHR) switch 88, and sample-and-hold signal 2 (SHS2) switch 90. As is well known in the art, switch 86, 88 or 90 may be implemented with an n-channel FET. When signal SHS1 is asserted, the voltage on bus 80 appears across capacitor 92 connected to the output of switch 86. When signal SHR is asserted, the voltage on bus 80 appears across capacitor 94 connected to the output of switch 88. When signal SHS2 is asserted, the voltage on bus 80 appears across capacitor 96 connected to the output of switch 90. Each of capacitors 92, 94 and 96 is connected to a switched buffer circuit shown as 98, 100 and 102 respectively. Each switched buffer passes the voltage on the capacitor connected to its input when signal CS is asserted. The design of switched buffers is well known in the art.
In operation, each row is selected once per frame period by asserting corresponding signal ROW. When its row is first selected, node output 68 is at a voltage level corresponding to the charge integrated by pixel sensor 20 during the previous frame period. Signal SHS1 is asserted, charging capacitor 92 to the first integration voltage level. Floating diffusion 50 in pixel sensor 20 is reset, and the reset voltage level is placed on node 68. Capacitor 94 is charged to the reset voltage by asserting signal SHR. The charge accumulated by photogate 40 during the current frame period is then transferred to floating diffusion 50 with the corresponding second integration voltage appearing at node 68. Signal SHS2 is then asserted, charging capacitor 96 to the second integration voltage.
When signal CS is asserted, the first integration voltage and the second integration voltage are read by summing circuit 104, producing integrating signal 106 representing the charge integrated in pixel sensor 20 over two frame periods. Simultaneously, the reset voltage is fed into doubling circuit 108, producing reference signal 110 approximating the kTC noise produced by two samplings of pixel sensor 20. The difference between integrating signal 106 and reference signal 110 is obtained in differencing circuit 112. The output of differencing circuit 112, analog intensity signal 114, is a voltage representative of the light incident on pixel sensor 20 over the previous two frame periods less an estimate of noise over the same time. Analog signal 114 is read by ADC 116 to produce digital intensity signal 118.
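A compact sketch of the arithmetic performed by output circuit 84, with assumed sample-and-hold voltages; the circuit element numbers in the comments refer to Figure 3 as described above.
```python
# Double-integration output arithmetic: sum the two integration samples, double the
# reset sample as a noise reference, and take the difference. Values are illustrative.

def double_integration_output(v_shs1: float, v_shs2: float, v_shr: float) -> float:
    """Combine two frame periods of integration and remove an estimate of reset (kTC) noise."""
    integrating_signal = v_shs1 + v_shs2           # summing circuit 104
    reference_signal = 2.0 * v_shr                 # doubling circuit 108
    return integrating_signal - reference_signal   # differencing circuit 112 -> signal 114

# Example with assumed voltages (volts):
print(double_integration_output(v_shs1=1.25, v_shs2=1.0, v_shr=0.25))  # 1.75
```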
Output circuit 84 produces, in each frame period and for each pixel sensor 20 in the corresponding column, a value representing the light incident on pixel sensor 20 over two frame periods, effectively doubling the dynamic range of pixel sensor 20. By adding additional floating diffusions and select logic to each pixel sensor 20, as well as capacitors and support circuitry to output circuits 84, further dynamic range may be gained.
Dual Integration Time Architecture
An optical sensor sensing a scene with a wide dynamic range may have to accept saturation of pixel sensors viewing brightly lit regions in order to achieve sufficient detail in dimly lit regions. Such problems occur in, for example, vehicle rear viewing systems where a well-lit street may wash out details from an adjacent sidewalk. This problem is minimized by integrating each pixel cell over two periods of different lengths, and using the short integration value if the long integration value is saturated. Dual integration may also reduce the number of bits required in analog-to-digital conversion.
Referring now to Figure 4, a schematic diagram for implementing dual integration times is shown. Pixel sensors 20 in a given column have output nodes 68 tied together to form column bus 80. The operation of an APS pixel sensor cell is described with regards to Figure 2 above. Since only one row is selected at any given time, at most one pixel sensor in the column has an output that does not appear as high impedance. Column bus 80 is also connected to dual integration output circuit 120.
Within output circuit 120, column bus 80 is connected to sample-and-hold signal long (SHSL) switch 122, sample-and-hold signal short (SHSS) switch 124, and sample-and-hold reset (SHR) switch 126. When signal SHSL is asserted, the voltage on bus 80 appears across capacitor 128 connected to the output of switch 122. When signal SHSS is asserted, the voltage on bus 80 appears across capacitor 130 connected to the output of switch 124. When signal SHR is asserted, the voltage on bus 80 appears across capacitor 132 connected to the output of switch 126. Each of capacitors 128, 130, and 132 is connected to its own switched buffer circuit, shown as 134, 136 and 138 respectively. Each switched buffer passes the voltage on the capacitor connected to its input when signal CS is asserted. The design of switched buffers is well known in the art.
During each frame period, photogate 40 accumulates charge for a long period then transfers the charge to floating diffusion 50 where it is stored. Photogate 40 then integrates for a short period. In one embodiment, the long and short periods are such that their sum is not greater than the frame period. At the end of the frame period, ROW is asserted at switch buffer 82 and the voltage at node output 68 becomes a value corresponding to the charge integrated by pixel sensor 20 during the long integration period. Signal SHSL is asserted, charging capacitor 128 to the long integration voltage level. Floating diffusion 50 in pixel sensor 20 is reset, and the reset voltage level is placed on node 68. Capacitor 132 is charged to the reset voltage by asserting signal SHR. The charge accumulated by photogate 40 during the short integration period is then transferred to floating diffusion 50 with the corresponding short integration voltage appearing at node 68. Signal SHSS is then asserted, charging capacitor 130 to the short integration voltage.
Signal CS is then asserted. The long integration voltage, on capacitor 128, is read by threshold detector 140. Threshold detector output 142 is asserted when the long integration voltage is at or near a level indicating saturation of pixel sensor 20 during the long integration period. Threshold detector output 142 controls select switch 144, which routes to its output either the long integration voltage signal or the short integration voltage signal. In this way, if pixel sensor 20 is saturated during a long integration period, the value over a shorter period is used. The reset voltage, representative of kTC noise, is subtracted from the selected long or short integration signal in difference circuit 146, producing analog signal 148. Analog signal 148 is converted to a digital signal by ADC 150. Threshold bit 142 may be included with the digital signal to produce output 152.
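The selection logic can be summarized in a few lines. The sketch below mirrors threshold detector 140, select switch 144 and difference circuit 146; the threshold level and voltages are assumed values, not figures from the patent.
```python
# Dual-integration selection sketch: use the short-integration sample whenever the
# long-integration sample is at or near saturation, then subtract the reset sample.

SATURATION_THRESHOLD_V = 1.75  # assumed level indicating near-saturation of the long sample

def dual_integration_output(v_long: float, v_short: float, v_reset: float):
    """Return (analog value, threshold bit) for one pixel readout."""
    saturated = v_long >= SATURATION_THRESHOLD_V   # threshold detector output 142
    selected = v_short if saturated else v_long    # select switch 144
    return selected - v_reset, saturated           # difference circuit 146 plus flag

print(dual_integration_output(v_long=1.875, v_short=0.5, v_reset=0.125))  # (0.375, True)
print(dual_integration_output(v_long=1.0, v_short=0.5, v_reset=0.125))    # (0.875, False)
```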
An alternative embodiment for implementing dual integration times uses 2m output circuits. As in a conventional system, the m pixel sensors in each row are read then reset, once per frame period, into m outputs. Each row is also read at some time prior to the next frame period into a second set of m outputs to implement the short integration time. Control is simplified by reading the row completing the long integration time and the row completing the short integration time simultaneously into two sets of output circuits. This embodiment is further described in "Readout Schemes to Increase Dynamic Ranges of Image Sensors," NASA Tech Briefs, pp. 32-33 (January 1997) by Yadid-Pecht and Fossum, which is hereby incorporated by reference.
An ideal ratio between the long and short integration times is about 10-to-1. This would extend the dynamic range by about a factor of ten, providing good resolution for dimly lit scenes as well as increasing the brightness range before pixel sensor saturation. Ratios of 8-to-1 or 16-to-1 are desirable due to the ease of implementing powers of two in digital circuits.
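One hedged illustration of how the threshold bit saves ADC bits: if downstream logic knows the ratio (assumed here to be 16-to-1), a short-integration code can be rescaled onto the same intensity scale as a long-integration code, so the ADC itself never needs to span the full widened range. The function name and scaling rule below are hypothetical.
```python
# Hypothetical digital reconstruction using the threshold bit and an assumed 16-to-1
# long-to-short integration ratio (a power of two, as suggested in the text).

RATIO = 16

def reconstruct_intensity(adc_code: int, short_used: bool) -> int:
    """Place long- and short-integration codes on one common intensity scale."""
    return adc_code * RATIO if short_used else adc_code

print(reconstruct_intensity(200, short_used=False))  # 200  -> dim pixel, long integration
print(reconstruct_intensity(200, short_used=True))   # 3200 -> bright pixel, short integration
```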
Interlacing Integration Architecture
Another technique for extending the effective integration time beyond the frame time is through interlacing. A subset of pixel cells is read and reset each frame period, while the remaining cells continue to integrate light-induced charge. Image values corresponding to pixel cells not read during a particular frame may be interpolated.
Referring now to Figure 5a, an interlacing technique for extending dynamic range is shown. An m by n array of pixel sensors 20 is arranged into successive rows R1, R2, R3, ..., Rn, where n is the number of rows.
Even numbered rows are placed in a first set and odd numbered rows in a second set. Row decoder 22 is used to select the row of pixel sensors providing signals to m output circuits 24. Row decoder 22 sequentially selects all rows in the first set within one frame period followed by all rows in the second set within the next frame period, thereby implementing the sequence R1, R3, ..., Rn-1, R2, R4, ..., Rn over two frame periods. This allows each pixel sensor to integrate incident light for two frame periods, effectively doubling the integration time per frame period.
Using the above method with a frame period of 33 milliseconds produces a new frame each 33 milliseconds with half of the pixels (all pixels in half of the rows) of the output image being updated each frame. The values in the remaining half of the pixels in each frame may be interpolated from the new values.
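A small sketch of this row-interlaced schedule and the interpolation step; which half of the rows is read first, and the simple two-neighbour average used for the missing rows, are choices made here for illustration only.
```python
# Row-interlaced readout sketch: half of the rows are read each frame period while the
# other half keep integrating; unread rows are estimated from their read neighbours.

def rows_to_read(frame_index: int, n_rows: int) -> list[int]:
    """Rows 1, 3, 5, ... on even-numbered frames, rows 2, 4, 6, ... on odd-numbered frames."""
    start = 1 if frame_index % 2 == 0 else 2
    return list(range(start, n_rows + 1, 2))

def interpolate_row(row_above: list[float], row_below: list[float]) -> list[float]:
    """Estimate an unread row as the average of the rows read above and below it."""
    return [(a + b) / 2.0 for a, b in zip(row_above, row_below)]

print(rows_to_read(0, 8))                           # [1, 3, 5, 7]
print(rows_to_read(1, 8))                           # [2, 4, 6, 8]
print(interpolate_row([10.0, 12.0], [14.0, 16.0]))  # [12.0, 14.0]
```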
Referring now to Figure 5b, a more general embodiment of the interlacing architecture is shown.
Pixel sensors 20 are further arranged into groups G1, G2, G3, ..., Gn, where n is the number of groups and the product of m and n is the total number of pixel sensors.
In Figure 5b, each group consists of alternate pixels in two adjacent rows, so as to form a checkerboard-like pattern within the two rows. Group decoder 160 is used to simultaneously select all pixel sensors in a group. For example, select line 162 activates each of the shaded pixel sensors in Figure 5b. Pixel sensors 20 are connected to output circuits 24 such that, regardless of which group is selected, at most one activated pixel cell is connected to each output circuit. Groups may be placed into one or more sets. The groups in each set may be alternately selected during one frame period, then the groups in a different set during the next frame period, and so on until all groups are selected. The process is then repeated. For example, if four sets are used, each group Gi can be placed into one of four sets according to i modulo 4. The sequence for asserting the groups would then be G1, G5, ..., Gn-3, G2, G6, ..., Gn-2, G3, G7, ..., Gn-1, G4, G8, ..., Gn over a span of four frame periods. This results in an integration period of approximately four times the frame period, giving four times the dynamic range if appropriate output electronics are used.
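A sketch of the four-set group schedule just described; the concrete rule used here, assigning group Gi to set (i - 1) mod 4 so that the first frame period reads G1, G5, and so on, is one convention chosen to reproduce the sequence in the text.
```python
# Group-interlaced schedule sketch: each group is assigned to one of four sets and one
# set is read out per frame period, so every group integrates for about four frames.

def group_schedule(n_groups: int, n_sets: int = 4) -> list[list[int]]:
    """Return, for each of n_sets consecutive frame periods, the group numbers read out."""
    sets: list[list[int]] = [[] for _ in range(n_sets)]
    for i in range(1, n_groups + 1):
        sets[(i - 1) % n_sets].append(i)   # Gi -> set (i - 1) mod 4
    return sets

for frame_period, groups in enumerate(group_schedule(n_groups=12)):
    print(frame_period, groups)
# 0 [1, 5, 9]
# 1 [2, 6, 10]
# 2 [3, 7, 11]
# 3 [4, 8, 12]
```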
As with row groupings, each set can be read out in one frame period. Prior to display, image pixel values not updated from pixel sensors in a given frame can be interpolated from previous values, from other pixel values, or both. For example, with the arrangement shown in Figure 5b, even-numbered groups may be placed in one set and odd-numbered groups in another set. Image pixel values not updated from pixel sensors may be obtained by averaging adjacent neighbor pixel values.
It is appreciated that many different grouping patterns and numbers of sets may be used within the spirit of the present invention.
Individual Pixel Reset Architecture
Optical sensors in some applications view scenes with small regions of intense brightness compared to the remainder of the scene. This may occur, for example, from headlamps sensed by vehicle vision systems. Once these regions are detected, the integration time for the corresponding pixel cells may be reduced by resetting the accumulated charge.
Referring now to Figure 6, an individual pixel reset architecture for extending dynamic range of an optical sensor is shown. Individual or groups of pixel sensors can be reset during integration time, thereby providing a shorter integration period. Areas of the image which are dimly lit receive longer integration periods than areas which are brightly lit. A method of controlling the reset period for a pixel sensor is described in "Image Sensors With Individual Pixel Reset," pg. 34 of NASA Tech Brief NPO-1973 of November 1996 by Pecht, Pain and Fossum and hereby incorporated by reference.
Figure 6 shows an APS pixel cell, described with reference to Figure 2 above, with the addition of reset FET 170. In each pixel sensor cell 20, floating diffusion 50 is set to a reference level when reset electrode 54 is asserted as described with respect to Figure 2. Reset FET 170 has its source connected to reset electrode 54, gate connected to row reset signal (RRST) 172, and drain connected to column reset signal (CRST) 174. When both RRST and CRST are asserted, floating diffusion 50 is reset.
In one embodiment, shown in Figure 6, all pixel sensor cells in each row are connected to a common RRST line and all pixel sensor cells in one column are connected to a common CRST line. A row reset decoder 176 selectively asserts a row based on signals from timing controller 26. A column reset decoder 178 selectively asserts a column based on signals from timing controller 26.
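The timing side of individual pixel reset can be illustrated as follows; the frame period, the brightness-driven integration target and the helper names are assumptions for the sketch, not values from the patent.
```python
# Individual pixel reset sketch: a pixel flagged as bright is reset partway through the
# frame so that only the desired shortened integration time remains before readout.
# Both its RRST (row) and CRST (column) lines must be asserted for the reset to occur.

FRAME_PERIOD_MS = 33.0  # assumed near real-time frame period

def reset_time_ms(desired_integration_ms: float) -> float:
    """Time into the frame at which the pixel's RRST and CRST lines should be asserted."""
    return max(0.0, FRAME_PERIOD_MS - desired_integration_ms)

def reset_pixel(row: int, col: int) -> tuple[int, int]:
    """Row reset decoder 176 drives RRST for `row`; column reset decoder 178 drives CRST for `col`."""
    return row, col

# A pixel viewing a headlamp might be limited to ~2 ms of integration:
print(reset_time_ms(2.0))    # 31.0 -> reset asserted 31 ms into the 33 ms frame
print(reset_pixel(17, 42))   # (17, 42) -> RRST line 17 and CRST line 42 asserted together
```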
In an alternative embodiment, collections of pixel sensor cells are reset simultaneously by the same combination of row and column reset signals. This exchanges reset spatial resolution for simplified control. Possible arrangements for collections of pixels include, but are not limited to, n-by-n arrays for n = 2, 3, 4, ...
An optical sensor system capable of providing an image of a scene with increased dynamic range while still maintaining a near real-time frame rate and while operating under wide inter-scene and intra-scene brightness has been presented. While the best modes for carrying out the invention have been described in detail, those familiar with the art to which this invention relates will recognize various alternative designs and embodiments for practicing the invention as defined by the following claims.

Claims

What Is Claimed Is:
1. A method for increasing dynamic range for an optical sensor, the optical sensor having a frame period and the optical sensor including a plurality of pixel cells, the method comprising: storing a first signal within each pixel cell, the first signal proportional to light integrated by the pixel cell over the previous frame period; generating a second signal within each pixel cell, the second signal proportional to light integrated by the pixel cell over the current frame period; generating an output signal as the sum of signals from each pixel cell; and storing within each pixel cell the second signal, whereby in the next application of the method the stored second signal is operative as the first signal and a new second signal is generated.
2. A method for increasing dynamic range of an optical sensor as in Claim 1, further comprising: obtaining a noise signal from each pixel cell; doubling the noise signal; and subtracting the doubled noise signal from the third summed signal, thereby producing an output signal with reduced noise.
3. A method for increasing dynamic range of an optical sensor as in Claim 1, wherein the output signal for each pixel cell is determined once each frame period.
4. A method for increasing dynamic range of an optical sensor as in Claim 1, wherein each pixel cell belongs to at least one of a plurality of groups and each group belongs to at least one of a number of sets, the method further comprising: determining a group for each pixel cell and a set for each group; selecting a first set; selecting a first group within the first set; determining the output signal for each pixel cell in the first group in the first set; determining the output signal for each pixel cell in each remaining group in the first set within a frame period; proceeding through the groups in each of the number of sets in a sequence; and repeating the sequence of sets such that the output signal for each pixel cell is produced once in a time period found as the product of the frame time and the number of sets.
5. A method for increasing dynamic range of an optical sensor as in Claim 4, wherein each pixel cell is in a sequentially numbered row, the step of determining a group for each pixel cell and a set for each group comprising: assigning each pixel cell to a group corresponding to a sequentially numbered row; assigning each group corresponding to an even numbered row to a first set; and assigning each group corresponding to an odd numbered row to a second set.
6. A method for increasing dynamic range of an optical sensor as in Claim 4, wherein each pixel cell is in a sequentially numbered row, the step of determining a group for each pixel cell and a set for each group comprising: assigning pixel cells in two adjacent rows to an even numbered group such that the pattern of pixels forms a "checkerboard", the group of pixel cells including the left-most pixel cell of the top row of the row pair; assigning pixel cells in two adjacent rows to an odd numbered group such that the pattern of pixels forms a "checkerboard", the group of pixel cells including the second left-most pixel cell of the top row of the row pair; assigning each even numbered group to a first set; and assigning each odd numbered group to a second set.
7. A method for increasing dynamic range of an optical sensor as in Claim 1, wherein the optical sensor has an integration time generally applied to all pixel cells and wherein each pixel cell can be individually reset, the method further comprising: determining if each pixel cell requires less integration time than the general integration time; determining a shortened integration time for each pixel cell requiring less than the general integration time; and resetting each pixel cell at a time so that the time remaining prior to producing an output signal approximates the desired shortened integration time.
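Claim 7 shortens the exposure of an individual pixel by delaying its reset rather than by reading it out early. A minimal sketch of that timing calculation follows; it assumes time is measured in arbitrary units from the start of the frame, and the function name reset_time is invented for illustration.

```python
# Claim 7 sketch: a pixel that needs less than the general integration time is
# reset partway through the frame, so the time remaining until readout equals
# the shortened integration time. Times are in arbitrary units; illustrative only.

def reset_time(readout_time, general_integration, desired_integration):
    """Return the time at which to reset the pixel (None = use the general reset)."""
    if desired_integration >= general_integration:
        return None                               # no shortened exposure needed
    return readout_time - desired_integration     # remaining time ~= desired exposure

if __name__ == "__main__":
    # Pixel read out at t = 100 with a general integration time of 100 units.
    print(reset_time(100.0, 100.0, 25.0))         # reset at t = 75 -> 25 units of exposure
    print(reset_time(100.0, 100.0, 100.0))        # None: pixel keeps the full exposure
```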
8. A method for increasing dynamic range of an optical sensor, the optical sensor including a plurality of pixel cells, the method comprising: generating a first signal in each pixel cell indicative of light integrated by the pixel cell over a first integration period; storing the first signal in the pixel cell; generating a second signal in each pixel cell indicative of light integrated by the pixel cell over a second integration period shorter than the first integration period; comparing the first signal to a predetermined threshold value; generating an output signal corresponding to the first signal if the first signal does not exceed the threshold value; and generating an output signal corresponding to the second signal if the first signal exceeds the threshold value.
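Claim 8 can be modelled as a simple per-pixel selection between a long and a short exposure. The sketch below assumes both signals are available as NumPy arrays; the optional rescaling of the short-exposure signal by the ratio of integration periods is an assumption added for illustration and is not recited in the claim.

```python
# Claim 8 sketch: use the long-integration signal unless it exceeds a threshold
# (the pixel is near saturation), in which case fall back to the signal from
# the shorter integration period. Illustrative only.
import numpy as np

def select_output(first, second, threshold, ratio=None):
    """first/second: long- and short-integration signals; ratio: optional
    long/short integration-time ratio used to rescale the short signal
    (an illustrative assumption, not part of the claim)."""
    short = second * ratio if ratio is not None else second
    return np.where(first > threshold, short, first)

if __name__ == "__main__":
    long_sig = np.array([0.2, 0.9, 1.0])      # 1.0 ~ saturated
    short_sig = np.array([0.05, 0.3, 0.4])    # integrated for 1/4 of the time
    print(select_output(long_sig, short_sig, threshold=0.8, ratio=4.0))
```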
9. A method for increasing dynamic range of an optical sensor as in Claim 8, the optical sensor having a frame period, wherein the sum of the first and second integration periods is not greater than the frame period.
10. A method for increasing dynamic range of an optical sensor as in Claim 8 further comprising: obtaining a noise signal from each pixel cell; and subtracting the noise signal from the output signal, thereby producing a reduced noise output signal.
11. A method for increasing dynamic range of an optical sensor as in Claim 8, wherein each pixel cell belongs to at least one of a plurality of groups and each group belongs to at least one of a number of sets, the method further comprising: determining a group for each pixel cell and a set for each group; selecting a first set; selecting a first group within the first set; determining the output signal for each pixel cell in the first group in the first set; determining the output signal for each pixel cell in each remaining group in the first set within a frame period; proceeding through the groups in each of the number of sets in a sequence; and repeating the sequence of sets such that the output signal for each pixel cell is produced once in a time period found as the product of the frame time and the number of sets.
12. A method for increasing dynamic range of an optical sensor as in Claim 11, wherein each pixel cell is in a sequentially numbered row, the step of determining a group for each pixel cell and a set for each group comprising: assigning each pixel cell to a group corresponding to a sequentially numbered row; assigning each group corresponding to an even numbered row to a first set; and assigning each group corresponding to an odd numbered row to a second set.
13. A method for increasing dynamic range of an optical sensor as in Claim 11, wherein each pixel cell is in a sequentially numbered row, the step of determining a group for each pixel cell and a set for each group comprising: assigning pixel cells in two adjacent rows to an even numbered group such that the pattern of pixels forms a "checkerboard", the group of pixel cells including the left-most pixel cell of the top row of the row pair; assigning pixel cells in two adjacent rows to an odd numbered group such that the pattern of pixels forms a "checkerboard", the group of pixel cells including the second left-most pixel cell of the top row of the row pair; assigning each even numbered group to a first set; and assigning each odd numbered group to a second set.
14. A method for increasing dynamic range of an optical sensor as in Claim 8, wherein the optical sensor has an integration time generally applied to all pixel cells and wherein each pixel cell can be individually reset, the method further comprising: determining if each pixel cell requires less integration time than the general integration time; determining a shortened integration time for each pixel cell requiring less than the general integration time; and resetting each pixel cell at a time so that the time remaining prior to producing an output signal approximates the desired shortened integration time.
15. An optical sensor producing an image of a scene once per frame period, the optical sensor comprising: a plurality of pixel sensors disposed in an array of rows and columns, each of the pixel sensors operable to generate a signal in proportion to incident light, store the signal, reset the signal to a reference level, and buffer the signal from other signals generated external to the pixel sensor; a plurality of output circuits, each output circuit in communication with the pixel sensors in a column and in communication with the optical sensor output; a row decoder having a plurality of control lines, each of the control lines in communication with the pixel sensors in a respective row; and a timing controller in communication with the plurality of pixel sensors, the row decoder and the plurality of output circuits; wherein each output circuit is operable to sum the buffered signal representing light incident on a pixel sensor during a previous frame period and the buffered signal representing light incident on the same pixel sensor during the current frame period.
16. An optical sensor as in Claim 15, wherein the timing controller is operative to select each row and for each pixel in a selected row to (a) route the signal representing the light incident during the previous frame period to a respective output circuit, (b) store the signal representing the light integrated during the current frame period, (c) route the signal representing the light incident during the current frame period to the respective output circuit, (d) reset each pixel sensor so as to achieve the proper incident light time up to one frame period, and (e) sequentially route the results of each output circuit to the corresponding optical sensor output.
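The per-row sequence (a)-(e) of Claim 16 amounts to a small state machine driven by the timing controller. The sketch below is a hypothetical software analogue only: the Pixel and OutputCircuit classes and their attribute names are invented for illustration and simply show the order of operations for one selected row.

```python
# Hypothetical software analogue of the Claim 16 per-row sequence (a)-(e);
# class, method, and attribute names are invented for illustration only.

class Pixel:
    def __init__(self, stored_signal, current_signal):
        self.stored_signal = stored_signal       # light integrated last frame
        self.current_signal = current_signal     # light integrated this frame
    def reset(self):
        self.current_signal = 0.0                # start a new integration

class OutputCircuit:
    def __init__(self):
        self.total = 0.0
    def accumulate(self, value):
        self.total += value
    def result(self):
        value, self.total = self.total, 0.0
        return value

def read_row(row_pixels, output_circuits, sensor_output):
    for pixel, out in zip(row_pixels, output_circuits):
        out.accumulate(pixel.stored_signal)      # (a) route previous-frame signal
        new_stored = pixel.current_signal        # (b) store current-frame signal
        out.accumulate(pixel.current_signal)     # (c) route current-frame signal
        pixel.stored_signal = new_stored
        pixel.reset()                            # (d) reset the pixel
    for out in output_circuits:
        sensor_output.append(out.result())       # (e) route results sequentially

if __name__ == "__main__":
    row = [Pixel(0.4, 0.5), Pixel(0.1, 0.2)]
    outputs, stream = [OutputCircuit(), OutputCircuit()], []
    read_row(row, outputs, stream)
    print(stream)                                # per-column sums of (a) + (c)
```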
17. An optical sensor comprising: a plurality of pixel sensors, each pixel sensor disposed in one or more of a collection of groups, and each group disposed in one or more of a collection of sets, each pixel sensor operable to generate a signal in proportion to incident light, reset the signal to a reference level, and buffer the signal from other signals generated external to the pixel sensor; a plurality of output circuits, each output circuit in communication with a unique pixel sensor in each group and in communication with the optical sensor output, wherein each output circuit is operable to deliver the buffered signal representing light incident on a pixel sensor; a group decoder having a plurality of control lines, each of the control lines in communication with the pixel sensors in a respective group; and a timing controller in communication with the plurality of pixel sensors, the group decoder and the plurality of output circuits.
18. An optical sensor as in Claim 17, wherein each group is a row and wherein the plurality of sets of groups comprises: a first set of even numbered rows; and a second set of odd numbered rows.
19. An optical sensor comprising: a plurality of active pixel sensors arranged into an array of rows and columns; a row decoder having a plurality of control lines connected to the pixel sensor array, each control line being connected to the pixel sensors in a respective row, wherein the row decoder is operable to activate the pixel sensors in a row; a plurality of doubling circuits, each doubling circuit in communication with the respective pixel sensors in a column; a plurality of summing output circuits, each summing circuit in communication with the respective pixel sensors in a column; a plurality of differencing circuits, each differencing circuit in communication with the respective doubling circuit in a column and the respective summing circuit in the column; and a timing controller in communication with each pixel sensor, the row decoder, each doubling circuit, each summing circuit, and each differencing circuit.
20. An optical sensor as in Claim 19, wherein each of the plurality of active pixel sensors is operable to integrate charge in proportion to incident light, transfer the integrated charge to storage, reset the charge in storage to a reference level indicative of pixel sensor noise, and present a voltage representative of the charge in storage.
21. An optical sensor as in Claim 19, wherein each doubling circuit is operable to store a reference voltage representative of the reference charge, and produce a reference signal proportional to twice the reference voltage.
22. An optical sensor as in Claim 19, wherein each summing circuit is operable to store a first integration voltage representative of charge integrated previous to a reset, store a second integration voltage representative of charge generated subsequent to a reset, and produce an integrating signal proportional to the sum of the first integration voltage and the second integration voltage.
23. An optical sensor as in Claim 19, wherein each differencing circuit is operable to produce the difference between the integrating signal and the reference signal.
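Claims 19-23 describe a column circuit that sums two integration voltages, doubles a stored reference (noise) voltage, and outputs their difference. Functionally this is the arithmetic sketched below; the scalar voltages and the function name column_output are assumptions made for illustration only.

```python
# Functional sketch of the Claim 19-23 column arithmetic: the summing circuit
# adds the two integration voltages, the doubling circuit doubles the stored
# reference (noise) voltage, and the differencing circuit subtracts the doubled
# reference from the sum. Names and values are illustrative only.

def column_output(v_int_prev, v_int_curr, v_reference):
    summed = v_int_prev + v_int_curr         # summing circuit (Claim 22)
    doubled_ref = 2.0 * v_reference          # doubling circuit (Claim 21)
    return summed - doubled_ref              # differencing circuit (Claim 23)

if __name__ == "__main__":
    # Each integration voltage carries the same 0.1 V reset offset, which the
    # doubled reference removes: (0.6 + 0.1) + (0.8 + 0.1) - 2 * 0.1 = 1.4
    print(column_output(0.7, 0.9, 0.1))
```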
PCT/US1998/027821 1997-12-31 1998-12-30 Wide dynamic range optical sensor WO1999034592A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
AU19495/99A AU1949599A (en) 1997-12-31 1998-12-30 Wide dynamic range optical sensor
CA002315145A CA2315145A1 (en) 1997-12-31 1998-12-30 Wide dynamic range optical sensor
DE69816126T DE69816126T2 (en) 1997-12-31 1998-12-30 OPTICAL SENSOR WITH A WIDE DYNAMIC RANGE
JP2000527085A JP2002500476A (en) 1997-12-31 1998-12-30 Optical sensor with wide dynamic range
EP98964332A EP1044561B1 (en) 1997-12-31 1998-12-30 Wide dynamic range optical sensor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/002,400 1997-12-31
US09/002,400 US6008486A (en) 1997-12-31 1997-12-31 Wide dynamic range optical sensor

Publications (1)

Publication Number Publication Date
WO1999034592A1 true WO1999034592A1 (en) 1999-07-08

Family

ID=21700583

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/027821 WO1999034592A1 (en) 1997-12-31 1998-12-30 Wide dynamic range optical sensor

Country Status (7)

Country Link
US (1) US6008486A (en)
EP (1) EP1044561B1 (en)
JP (1) JP2002500476A (en)
AU (1) AU1949599A (en)
CA (1) CA2315145A1 (en)
DE (1) DE69816126T2 (en)
WO (1) WO1999034592A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1096790A2 (en) * 1999-10-26 2001-05-02 Eastman Kodak Company Variable collection of blooming charge to extend dynamic range
EP1096789A2 (en) * 1999-10-26 2001-05-02 Eastman Kodak Company Cmos image sensor with extended dynamic range
WO2006022163A1 (en) * 2004-08-26 2006-03-02 Hamamatsu Photonics K.K. Photodetector
EP1770987A1 (en) * 2005-09-30 2007-04-04 STMicroelectronics (Research & Development) Limited Improvements in or relating to image sensor artifact elimination
EP1463306A3 (en) * 2003-03-25 2008-01-23 Matsushita Electric Industrial Co., Ltd. Imaging device that prevents loss of shadow detail
WO2010066850A1 (en) * 2008-12-12 2010-06-17 E2V Semiconductors Image sensor with double charge transfer for large dynamic range and method of reading
US7956914B2 (en) 2007-08-07 2011-06-07 Micron Technology, Inc. Imager methods, apparatuses, and systems providing a skip mode with a wide dynamic range operation
WO2013024938A1 (en) 2011-08-16 2013-02-21 Lg Innotek Co., Ltd. Pixel, pixel array, image sensor including the same and method for operating the image sensor
WO2014008946A1 (en) 2012-07-13 2014-01-16 Teledyne Dalsa B.V. Method of reading out a cmos image sensor and a cmos image sensor configured for carrying out such method

Families Citing this family (195)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5910854A (en) 1993-02-26 1999-06-08 Donnelly Corporation Electrochromic polymeric solid films, manufacturing electrochromic devices using such solid films, and processes for making such solid films and devices
US5668663A (en) 1994-05-05 1997-09-16 Donnelly Corporation Electrochromic mirrors and devices
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US6049171A (en) 1998-09-18 2000-04-11 Gentex Corporation Continuously variable headlamp control
US6611610B1 (en) * 1997-04-02 2003-08-26 Gentex Corporation Vehicle lamp control
US6130421A (en) 1998-06-09 2000-10-10 Gentex Corporation Imaging system for vehicle headlamp control
US6631316B2 (en) 2001-03-05 2003-10-07 Gentex Corporation Image processing system to control vehicle headlamps or other vehicle equipment
US6681163B2 (en) * 2001-10-04 2004-01-20 Gentex Corporation Moisture sensor and windshield fog detector
US6861809B2 (en) 1998-09-18 2005-03-01 Gentex Corporation Headlamp control to prevent glare
US7019275B2 (en) * 1997-09-16 2006-03-28 Gentex Corporation Moisture sensor and windshield fog detector
US5923027A (en) * 1997-09-16 1999-07-13 Gentex Corporation Moisture sensor and windshield fog detector using an image sensor
US8120652B2 (en) * 1997-04-02 2012-02-21 Gentex Corporation System for controlling vehicle equipment
US7653215B2 (en) 1997-04-02 2010-01-26 Gentex Corporation System for controlling exterior vehicle lights
US5837994C1 (en) * 1997-04-02 2001-10-16 Gentex Corp Control system to automatically dim vehicle head lamps
US6774988B2 (en) * 2002-07-30 2004-08-10 Gentex Corporation Light source detection and categorization system for automatic vehicle exterior light control and method of manufacturing
US6587573B1 (en) 2000-03-20 2003-07-01 Gentex Corporation System for controlling exterior vehicle lights
US6141050A (en) * 1997-06-20 2000-10-31 Lucent Technologies Inc. MOS image sensor
US6124886A (en) 1997-08-25 2000-09-26 Donnelly Corporation Modular rearview mirror assembly
US8294975B2 (en) 1997-08-25 2012-10-23 Donnelly Corporation Automotive rearview mirror assembly
US6172613B1 (en) 1998-02-18 2001-01-09 Donnelly Corporation Rearview mirror assembly incorporating vehicle information display
US6326613B1 (en) 1998-01-07 2001-12-04 Donnelly Corporation Vehicle interior mirror assembly adapted for containing a rain sensor
EP0928103A3 (en) * 1997-12-31 2000-08-02 Texas Instruments Incorporated CMOS imaging sensors
US8288711B2 (en) 1998-01-07 2012-10-16 Donnelly Corporation Interior rearview mirror system with forwardly-viewing camera and a control
US6445287B1 (en) 2000-02-28 2002-09-03 Donnelly Corporation Tire inflation assistance monitoring system
US6690268B2 (en) 2000-03-02 2004-02-10 Donnelly Corporation Video mirror systems incorporating an accessory module
US6667768B1 (en) 1998-02-17 2003-12-23 Micron Technology, Inc. Photodiode-type pixel for global electronic shutter and reduced lag
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6693517B2 (en) 2000-04-21 2004-02-17 Donnelly Corporation Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants
US6329925B1 (en) 1999-11-24 2001-12-11 Donnelly Corporation Rearview mirror assembly with added feature modular display
US6906745B1 (en) * 1998-04-23 2005-06-14 Micron Technology, Inc. Digital exposure circuit for an image sensor
JP3871439B2 (en) * 1998-06-05 2007-01-24 松下電器産業株式会社 Solid-state imaging device and driving method thereof
JP4200545B2 (en) * 1998-06-08 2008-12-24 ソニー株式会社 Solid-state imaging device, driving method thereof, and camera system
US7233346B1 (en) * 1998-07-14 2007-06-19 The United States Of America As Represented By The Secretary Of The Navy Differential imaging method and system
US6246043B1 (en) * 1998-09-22 2001-06-12 Foveon, Inc. Method and apparatus for biasing a CMOS active pixel sensor above the nominal voltage maximums for an IC process
US7139025B1 (en) * 1998-10-29 2006-11-21 Micron Technology, Inc. Active pixel sensor with mixed analog and digital signal integration
US6646245B2 (en) * 1999-01-22 2003-11-11 Intel Corporation Focal plane averaging implementation for CMOS imaging arrays using a split photodiode architecture
US6313457B1 (en) 1999-01-25 2001-11-06 Gentex Corporation Moisture detecting system using semiconductor light sensor with integral charge collection
JP2003524545A (en) 1999-01-25 2003-08-19 ジェンテクス・コーポレーション Vehicle device control using semiconductor optical sensor
US6679608B2 (en) * 1999-01-25 2004-01-20 Gentex Corporation Sensor device having an integral anamorphic lens
US6188433B1 (en) * 1999-02-02 2001-02-13 Ball Aerospace & Technologies Corp. Method and apparatus for enhancing the dynamic range of a CCD sensor
US6873363B1 (en) * 1999-02-16 2005-03-29 Micron Technology Inc. Technique for flagging oversaturated pixels
JP2000253315A (en) * 1999-03-01 2000-09-14 Kawasaki Steel Corp Cmos image sensor
US6803958B1 (en) * 1999-03-09 2004-10-12 Micron Technology, Inc. Apparatus and method for eliminating artifacts in active pixel sensor (APS) imagers
JP2000287130A (en) * 1999-03-31 2000-10-13 Sharp Corp Amplifier type solid-state image pickup device
US7092021B2 (en) * 2000-02-22 2006-08-15 Micron Technology, Inc. Frame shuttering scheme for increased frame rate
US7167796B2 (en) 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
US7004593B2 (en) 2002-06-06 2006-02-28 Donnelly Corporation Interior rearview mirror system with compass
US7370983B2 (en) 2000-03-02 2008-05-13 Donnelly Corporation Interior mirror assembly with display
AU4244701A (en) * 2000-03-03 2001-09-12 Xion Gmbh Method and device for the stroboscopic recording and reproduction of repetitive processes
DE10017162C5 (en) * 2000-03-03 2004-04-01 Xion Gmbh Method and device for stroboscopic recording and playback of repetitive processes
US6403942B1 (en) 2000-03-20 2002-06-11 Gentex Corporation Automatic headlamp control system utilizing radar and an optical sensor
US6396408B2 (en) 2000-03-31 2002-05-28 Donnelly Corporation Digital electrochromic circuit with a vehicle network
US6348681B1 (en) * 2000-06-05 2002-02-19 National Semiconductor Corporation Method and circuit for setting breakpoints for active pixel sensor cell to achieve piecewise linear transfer function
US6365926B1 (en) 2000-09-20 2002-04-02 Eastman Kodak Company CMOS active pixel with scavenging diode
US6504195B2 (en) 2000-12-29 2003-01-07 Eastman Kodak Company Alternate method for photodiode formation in CMOS image sensors
US7581859B2 (en) 2005-09-14 2009-09-01 Donnelly Corp. Display device for exterior rearview mirror
ES2287266T3 (en) 2001-01-23 2007-12-16 Donnelly Corporation IMPROVED VEHICLE LIGHTING SYSTEM.
US7255451B2 (en) 2002-09-20 2007-08-14 Donnelly Corporation Electro-optic mirror cell
US7079178B2 (en) * 2001-02-20 2006-07-18 Jaroslav Hynecek High dynamic range active pixel CMOS image sensor and data processing system incorporating adaptive pixel reset
DE10110108A1 (en) * 2001-03-02 2002-09-19 Reimar Lenz Digital camera with CMOS image sensor with improved dynamics and method for controlling a CMOS image sensor
US6816154B2 (en) * 2001-05-30 2004-11-09 Palmone, Inc. Optical sensor based user interface for a portable electronic device
JP3437843B2 (en) 2001-07-06 2003-08-18 沖電気工業株式会社 Method of forming insulating film and method of manufacturing integrated circuit
JP2003057113A (en) * 2001-08-13 2003-02-26 Canon Inc Photoelectric transducer, photometry sensor and imaging device
US6963370B2 (en) * 2001-09-24 2005-11-08 The Board Of Trustees Of The Leland Stanford Junior University Method for improving SNR in low illumination conditions in a CMOS video sensor system using a self-resetting digital pixel
US6617564B2 (en) 2001-10-04 2003-09-09 Gentex Corporation Moisture sensor utilizing stereo imaging with an image sensor
US8054357B2 (en) * 2001-11-06 2011-11-08 Candela Microsystems, Inc. Image sensor with time overlapping image output
US6795117B2 (en) 2001-11-06 2004-09-21 Candela Microsystems, Inc. CMOS image sensor with noise cancellation
US7233350B2 (en) * 2002-01-05 2007-06-19 Candela Microsystems, Inc. Image sensor with interleaved image output
US7006080B2 (en) * 2002-02-19 2006-02-28 Palm, Inc. Display system
US6982403B2 (en) * 2002-03-27 2006-01-03 Omnivision Technologies, Inc. Method and apparatus kTC noise cancelling in a linear CMOS image sensor
US20030193594A1 (en) * 2002-04-16 2003-10-16 Tay Hiok Nam Image sensor with processor controlled integration time
US6918674B2 (en) 2002-05-03 2005-07-19 Donnelly Corporation Vehicle rearview mirror system
US7329013B2 (en) 2002-06-06 2008-02-12 Donnelly Corporation Interior rearview mirror system with compass
US7683326B2 (en) 2002-07-09 2010-03-23 Gentex Corporation Vehicle vision system with high dynamic range
MXPA05001880A (en) 2002-08-21 2005-06-03 Gentex Corp Image acquisition and processing methods for automatic vehicular exterior lighting control.
US7382407B2 (en) * 2002-08-29 2008-06-03 Micron Technology, Inc. High intrascene dynamic range NTSC and PAL imager
WO2004103772A2 (en) 2003-05-19 2004-12-02 Donnelly Corporation Mirror assembly for vehicle
US7310177B2 (en) 2002-09-20 2007-12-18 Donnelly Corporation Electro-optic reflective element assembly
EP1543358A2 (en) 2002-09-20 2005-06-22 Donnelly Corporation Mirror reflective element assembly
US6894264B2 (en) * 2002-10-15 2005-05-17 Applera Corporation System and methods for dynamic range extension using variable length integration time sampling
US7489352B2 (en) * 2002-11-15 2009-02-10 Micron Technology, Inc. Wide dynamic range pinned photodiode active pixel sensor (APS)
KR100484278B1 (en) * 2003-02-07 2005-04-20 (주)실리콘화일 Optical image receiving device with wide operating range
EP1602117B2 (en) 2003-02-21 2015-11-18 Gentex Corporation Automatic vehicle exterior light control system assemblies
CN100420592C (en) * 2003-02-21 2008-09-24 金泰克斯公司 Automatic vehicle exterior light control system
US8326483B2 (en) * 2003-02-21 2012-12-04 Gentex Corporation Monitoring and automatic equipment control systems
US8045760B2 (en) * 2003-02-21 2011-10-25 Gentex Corporation Automatic vehicle exterior light control systems
US6897519B1 (en) 2003-02-26 2005-05-24 Dialog Semiconductor Tunneling floating gate APS pixel
US7015960B2 (en) * 2003-03-18 2006-03-21 Candela Microsystems, Inc. Image sensor that uses a temperature sensor to compensate for dark current
EP1460838A1 (en) * 2003-03-18 2004-09-22 Thomson Licensing S.A. Image sensing device, process for driving such a device and electrical signal generated in a such device
JP4979376B2 (en) 2003-05-06 2012-07-18 ジェンテックス コーポレイション Vehicle rearview mirror elements and assemblies incorporating these elements
KR100928056B1 (en) 2003-05-19 2009-11-24 젠텍스 코포레이션 Rearview mirror assembly with hands-free phone component
US7859581B2 (en) * 2003-07-15 2010-12-28 Eastman Kodak Company Image sensor with charge binning and dual channel readout
US7456884B2 (en) * 2003-08-05 2008-11-25 Aptina Imaging Corporation Method and circuit for determining the response curve knee point in active pixel image sensors with extended dynamic range
US7408195B2 (en) * 2003-09-04 2008-08-05 Cypress Semiconductor Corporation (Belgium) Bvba Semiconductor pixel arrays with reduced sensitivity to defects
US7446924B2 (en) 2003-10-02 2008-11-04 Donnelly Corporation Mirror reflective element assembly including electronic component
US7308341B2 (en) 2003-10-14 2007-12-11 Donnelly Corporation Vehicle communication system
US7446812B2 (en) 2004-01-13 2008-11-04 Micron Technology, Inc. Wide dynamic range operations for imaging
US20050212936A1 (en) * 2004-03-25 2005-09-29 Eastman Kodak Company Extended dynamic range image sensor with fixed pattern noise reduction
US7583305B2 (en) * 2004-07-07 2009-09-01 Eastman Kodak Company Extended dynamic range imaging system
US7397509B2 (en) * 2004-08-27 2008-07-08 Micron Technology, Inc. High dynamic range imager with a rolling shutter
JP4093220B2 (en) * 2004-10-05 2008-06-04 コニカミノルタホールディングス株式会社 Solid-state imaging device and imaging device including the solid-state imaging device
US20060082670A1 (en) * 2004-10-14 2006-04-20 Eastman Kodak Company Interline CCD for still and video photography with extended dynamic range
JP4838261B2 (en) * 2004-11-18 2011-12-14 ジェンテックス コーポレイション Image collection and processing system for vehicle equipment control
US8924078B2 (en) 2004-11-18 2014-12-30 Gentex Corporation Image acquisition and processing system for vehicle equipment control
EP1883855B1 (en) 2005-05-16 2011-07-20 Donnelly Corporation Vehicle mirror assembly with indicia at reflective element
US7417221B2 (en) * 2005-09-08 2008-08-26 Gentex Corporation Automotive vehicle image sensor
EP1949666B1 (en) 2005-11-01 2013-07-17 Magna Mirrors of America, Inc. Interior rearview mirror with display
KR100738551B1 (en) * 2006-01-17 2007-07-11 삼성전자주식회사 Heater roller of the image forming apparatus
EP2426552A1 (en) 2006-03-03 2012-03-07 Gentex Corporation Electro-optic elements incorporating improved thin-film coatings
EP2378350B1 (en) 2006-03-09 2013-12-11 Gentex Corporation Vehicle rearview assembly including a high intensity display
WO2008120292A1 (en) * 2007-02-28 2008-10-09 Hamamatsu Photonics K.K. Solid-state imaging apparatus
EP2071825A1 (en) * 2007-12-13 2009-06-17 St Microelectronics S.A. Pixel read circuitry
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
CN103189983B (en) * 2008-07-17 2016-10-26 微软国际控股私有有限公司 There is the electric charge detection unit of improvement and the CMOS grating 3D camera arrangement of pixel geometry
US9487144B2 (en) 2008-10-16 2016-11-08 Magna Mirrors Of America, Inc. Interior mirror assembly with display
US8723827B2 (en) 2009-07-28 2014-05-13 Cypress Semiconductor Corporation Predictive touch surface scanning
US8218046B1 (en) * 2009-12-17 2012-07-10 Jai, Inc. USA Monochrome/color dual-slope traffic camera system
US9056584B2 (en) 2010-07-08 2015-06-16 Gentex Corporation Rearview assembly for a vehicle
US8646924B2 (en) 2011-02-28 2014-02-11 Gentex Corporation Rearview device mounting assembly with rotatable support
US8814373B2 (en) 2011-02-28 2014-08-26 Gentex Corporation Rearview device support assembly
US8620523B2 (en) 2011-06-24 2013-12-31 Gentex Corporation Rearview assembly with multiple ambient light sensors
WO2013022731A1 (en) 2011-08-05 2013-02-14 Gentex Corporation Optical assembly for a light sensor
US9316347B2 (en) 2012-01-24 2016-04-19 Gentex Corporation Rearview assembly with interchangeable rearward viewing device
US8879139B2 (en) 2012-04-24 2014-11-04 Gentex Corporation Display mirror assembly
US11330171B1 (en) * 2012-05-25 2022-05-10 Gn Audio A/S Locally adaptive luminance and chrominance blending in a multiple imager video system
US8983135B2 (en) 2012-06-01 2015-03-17 Gentex Corporation System and method for controlling vehicle equipment responsive to a multi-stage village detection
EP2859436B1 (en) 2012-06-12 2020-06-03 Gentex Corporation Vehicle imaging system providing multi-stage aiming stability indication
WO2014022630A1 (en) * 2012-08-02 2014-02-06 Gentex Corporation System and method for controlling exterior vehicle lights responsive to detection of a semi-truck
US9511708B2 (en) 2012-08-16 2016-12-06 Gentex Corporation Method and system for imaging an external scene by employing a custom image sensor
WO2014032042A1 (en) 2012-08-24 2014-02-27 Gentex Corporation Shaped rearview mirror assembly
US9327648B2 (en) 2013-01-04 2016-05-03 Gentex Corporation Rearview assembly with exposed carrier plate
US9488892B2 (en) 2013-01-09 2016-11-08 Gentex Corporation Printed appliqué and method thereof
US9207116B2 (en) 2013-02-12 2015-12-08 Gentex Corporation Light sensor
US9870753B2 (en) 2013-02-12 2018-01-16 Gentex Corporation Light sensor having partially opaque optic
US9276031B2 (en) 2013-03-04 2016-03-01 Apple Inc. Photodiode with different electric potential regions for image sensors
US9741754B2 (en) 2013-03-06 2017-08-22 Apple Inc. Charge transfer circuit with storage nodes in image sensors
US9549099B2 (en) 2013-03-12 2017-01-17 Apple Inc. Hybrid image sensor
US9973716B2 (en) * 2013-08-02 2018-05-15 Samsung Electronics Co., Ltd. Reset noise reduction for pixel readout with pseudo correlated double sampling
EP3036130B1 (en) 2013-08-19 2023-10-11 Gentex Corporation Vehicle imaging system and method for distinguishing between vehicle tail lights and flashing red stop lights
EP3036131B1 (en) 2013-08-19 2021-04-07 Gentex Corporation Imaging system and method with ego motion detection
EP3036132B1 (en) 2013-08-19 2019-02-27 Gentex Corporation Vehicle imaging system and method for distinguishing reflective objects from lights of another vehicle
US9884591B2 (en) 2013-09-04 2018-02-06 Gentex Corporation Display system for displaying images acquired by a camera system onto a rearview assembly of a vehicle
EP3049286B1 (en) 2013-09-24 2018-05-09 Gentex Corporation Display mirror assembly
WO2015050996A1 (en) 2013-10-01 2015-04-09 Gentex Corporation System and method for controlling exterior vehicle lights on motorways
CN105705353B (en) 2013-11-15 2018-05-01 金泰克斯公司 Including decaying to color into the imaging system of Mobile state compensation
US9596423B1 (en) 2013-11-21 2017-03-14 Apple Inc. Charge summing in an image sensor
US9596420B2 (en) 2013-12-05 2017-03-14 Apple Inc. Image sensor having pixels with different integration periods
US9473706B2 (en) 2013-12-09 2016-10-18 Apple Inc. Image sensor flicker detection
WO2015116915A1 (en) 2014-01-31 2015-08-06 Gentex Corporation Backlighting assembly for display for reducing cross-hatching
US10285626B1 (en) 2014-02-14 2019-05-14 Apple Inc. Activity identification using an optical heart rate monitor
US9584743B1 (en) 2014-03-13 2017-02-28 Apple Inc. Image sensor with auto-focus and pixel cross-talk compensation
EP3119643B1 (en) 2014-03-21 2018-05-23 Gentex Corporation Tri-modal display mirror assembly
WO2015153740A1 (en) 2014-04-01 2015-10-08 Gentex Corporation Automatic display mirror assembly
US9538106B2 (en) 2014-04-25 2017-01-03 Apple Inc. Image sensor having a uniform digital power signature
US9686485B2 (en) 2014-05-30 2017-06-20 Apple Inc. Pixel binning in an image sensor
WO2016044746A1 (en) 2014-09-19 2016-03-24 Gentex Corporation Rearview assembly
US9694752B2 (en) 2014-11-07 2017-07-04 Gentex Corporation Full display mirror actuator
EP3218227B1 (en) 2014-11-13 2018-10-24 Gentex Corporation Rearview mirror system with a display
EP3227143B1 (en) 2014-12-03 2019-03-13 Gentex Corporation Display mirror assembly
USD746744S1 (en) 2014-12-05 2016-01-05 Gentex Corporation Rearview device
US9744907B2 (en) 2014-12-29 2017-08-29 Gentex Corporation Vehicle vision system having adjustable displayed field of view
US9720278B2 (en) 2015-01-22 2017-08-01 Gentex Corporation Low cost optical film stack
JP6380986B2 (en) * 2015-03-12 2018-08-29 富士フイルム株式会社 Imaging apparatus and method
JP2018513810A (en) 2015-04-20 2018-05-31 ジェンテックス コーポレイション Rear view assembly with decoration
KR102050315B1 (en) 2015-05-18 2019-11-29 젠텍스 코포레이션 Front display rearview mirror
KR102135427B1 (en) 2015-06-22 2020-07-17 젠텍스 코포레이션 Systems and methods for processing streamed video images to correct flicker of amplitude-modulated light
CN113099136A (en) * 2015-09-30 2021-07-09 株式会社尼康 Imaging element
USD797627S1 (en) 2015-10-30 2017-09-19 Gentex Corporation Rearview mirror device
US9994156B2 (en) 2015-10-30 2018-06-12 Gentex Corporation Rearview device
WO2017075420A1 (en) 2015-10-30 2017-05-04 Gentex Corporation Toggle paddle
USD798207S1 (en) 2015-10-30 2017-09-26 Gentex Corporation Rearview mirror assembly
USD800618S1 (en) 2015-11-02 2017-10-24 Gentex Corporation Toggle paddle for a rear view device
US10015429B2 (en) * 2015-12-30 2018-07-03 Omnivision Technologies, Inc. Method and system for reducing noise in an image sensor using a parallel multi-ramps merged comparator analog-to-digital converter
USD845851S1 (en) 2016-03-31 2019-04-16 Gentex Corporation Rearview device
USD817238S1 (en) 2016-04-29 2018-05-08 Gentex Corporation Rearview device
US9912883B1 (en) 2016-05-10 2018-03-06 Apple Inc. Image sensor with calibrated column analog-to-digital converters
US10025138B2 (en) 2016-06-06 2018-07-17 Gentex Corporation Illuminating display with light gathering structure
EP3712945A3 (en) 2016-09-23 2020-12-02 Apple Inc. Stacked backside illuminated spad array
USD809984S1 (en) 2016-12-07 2018-02-13 Gentex Corporation Rearview assembly
USD854473S1 (en) 2016-12-16 2019-07-23 Gentex Corporation Rearview assembly
JP2020505802A (en) 2016-12-30 2020-02-20 ジェンテックス コーポレイション Full screen mirror with on-demand spotter view
US10656251B1 (en) 2017-01-25 2020-05-19 Apple Inc. Signal acquisition in a SPAD detector
CN110235024B (en) 2017-01-25 2022-10-28 苹果公司 SPAD detector with modulation sensitivity
US10962628B1 (en) 2017-01-26 2021-03-30 Apple Inc. Spatial temporal weighting in a SPAD detector
JP6944525B2 (en) 2017-01-27 2021-10-06 ジェンテックス コーポレイション Image correction for motorcycle banking
WO2018170353A1 (en) 2017-03-17 2018-09-20 Gentex Corporation Dual display reverse camera system
US10622538B2 (en) 2017-07-18 2020-04-14 Apple Inc. Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body
US10440301B2 (en) 2017-09-08 2019-10-08 Apple Inc. Image capture device, pixel, and method providing improved phase detection auto-focus performance
US11282303B2 (en) 2017-12-01 2022-03-22 Gentex Corporation System and method for identifying vehicle operation mode
US11019294B2 (en) 2018-07-18 2021-05-25 Apple Inc. Seamless readout mode transitions in image sensors
US10848693B2 (en) 2018-07-18 2020-11-24 Apple Inc. Image flare detection using asymmetric pixels
US11272132B2 (en) 2019-06-07 2022-03-08 Pacific Biosciences Of California, Inc. Temporal differential active pixel sensor
WO2021084511A1 (en) 2019-10-31 2021-05-06 Gentex Corporation Rotatable outside mirror with imager assembly
US11563910B2 (en) 2020-08-04 2023-01-24 Apple Inc. Image capture devices having phase detection auto-focus pixels
US11546532B1 (en) 2021-03-16 2023-01-03 Apple Inc. Dynamic correlated double sampling for noise rejection in image sensors

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2522231B1 (en) * 1982-02-23 1986-01-31 Thomson Csf VIDEO FREQUENCY IMAGE SENSOR APPARATUS WITH AUTOMATIC SENSITIVITY CORRECTION FOR USE ON A TELEVISION AUTODIRECTOR
JP2813667B2 (en) * 1989-05-17 1998-10-22 富士重工業株式会社 Monitor screen automatic adjustment device for in-vehicle monitor device
US5060075A (en) * 1989-12-22 1991-10-22 North American Philips Corporation CRT display device with variable light transmission panel
US5387958A (en) * 1992-06-30 1995-02-07 Sony Electronics, Inc. Electro-optical control of light attenuation in the optical path of a camera
US5670935A (en) * 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
KR950002084A (en) * 1993-06-22 1995-01-04 오가 노리오 Charge transfer device
KR950004914A (en) * 1993-07-14 1995-02-18 김광호 Brightness control circuit in the viewfinder of the camcorder
JPH0775022A (en) * 1993-08-31 1995-03-17 Matsushita Electric Ind Co Ltd Camera apparatus
US5471515A (en) * 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US5631704A (en) * 1994-10-14 1997-05-20 Lucent Technologies, Inc. Active pixel sensor and imaging system having differential mode
US5721425A (en) * 1996-03-01 1998-02-24 National Semiconductor Corporation Active pixel sensor cell that reduces the effect of 1/f noise, increases the voltage range of the cell, and reduces the size of the cell
DE69732409D1 (en) * 1996-05-06 2005-03-10 Cimatrix Canton SMART CCD CAMERA WITH PROGRESSIVE SCAN
US5898168A (en) * 1997-06-12 1999-04-27 International Business Machines Corporation Image sensor pixel circuit

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4647975A (en) * 1985-10-30 1987-03-03 Polaroid Corporation Exposure control system for an electronic imaging camera having increased dynamic range
EP0244230A2 (en) * 1986-04-29 1987-11-04 British Aerospace Public Limited Company Solid state imaging apparatus
EP0417397A1 (en) * 1989-09-13 1991-03-20 Kabushiki Kaisha Toshiba Solid-state camera
US5194957A (en) * 1990-11-02 1993-03-16 Canon Kabushiki Kaisha Image sensing method and apparatus in which the exposure time may be varied
EP0573235A1 (en) * 1992-05-30 1993-12-08 Sony Corporation Solid-state image pick-up apparatus with integration time control
WO1997017800A1 (en) * 1995-11-07 1997-05-15 California Institute Of Technology An image sensor with high dynamic range linear output
GB2313973A (en) * 1996-06-04 1997-12-10 Adam John Bexley Extended integration period video camera

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1096789A2 (en) * 1999-10-26 2001-05-02 Eastman Kodak Company Cmos image sensor with extended dynamic range
JP2001169184A (en) * 1999-10-26 2001-06-22 Eastman Kodak Co Cmos image sensor having expanded dynamic range
EP1096790A3 (en) * 1999-10-26 2003-03-26 Eastman Kodak Company Variable collection of blooming charge to extend dynamic range
EP1096789A3 (en) * 1999-10-26 2003-03-26 Eastman Kodak Company Cmos image sensor with extended dynamic range
JP4536901B2 (en) * 1999-10-26 2010-09-01 イーストマン コダック カンパニー CMOS image sensor with expanded dynamic range
EP1096790A2 (en) * 1999-10-26 2001-05-02 Eastman Kodak Company Variable collection of blooming charge to extend dynamic range
US8319875B2 (en) 2003-03-25 2012-11-27 Panasonic Corporation Imaging device that prevents loss of shadow detail
EP1463306A3 (en) * 2003-03-25 2008-01-23 Matsushita Electric Industrial Co., Ltd. Imaging device that prevents loss of shadow detail
US7528871B2 (en) 2003-03-25 2009-05-05 Panasonic Corporation Imaging device that prevents loss of shadow detail
US7898587B2 (en) 2003-03-25 2011-03-01 Panasonic Corporation Imaging device that prevents loss of shadow detail
US7679663B2 (en) 2004-08-26 2010-03-16 Hamamatsu Photonics K.K. Photodetection apparatus
WO2006022163A1 (en) * 2004-08-26 2006-03-02 Hamamatsu Photonics K.K. Photodetector
EP1770987A1 (en) * 2005-09-30 2007-04-04 STMicroelectronics (Research & Development) Limited Improvements in or relating to image sensor artifact elimination
US7880779B2 (en) 2005-09-30 2011-02-01 Stmicroelectronics (Research And Development) Limited Image sensor artifact elimination
US7956914B2 (en) 2007-08-07 2011-06-07 Micron Technology, Inc. Imager methods, apparatuses, and systems providing a skip mode with a wide dynamic range operation
US8368792B2 (en) 2007-08-07 2013-02-05 Micron Technology, Inc. Imager methods, apparatuses, and systems providing a skip mode with a wide dynamic range operation
FR2939999A1 (en) * 2008-12-12 2010-06-18 E2V Semiconductors DUAL LOAD TRANSFER IMAGE SENSOR FOR HIGH DYNAMIC AND READING METHOD
WO2010066850A1 (en) * 2008-12-12 2010-06-17 E2V Semiconductors Image sensor with double charge transfer for large dynamic range and method of reading
CN102246510B (en) * 2008-12-12 2017-05-17 E2V半导体公司 Image sensor with double charge transfer for large dynamic range and method of reading
WO2013024938A1 (en) 2011-08-16 2013-02-21 Lg Innotek Co., Ltd. Pixel, pixel array, image sensor including the same and method for operating the image sensor
EP2745502A4 (en) * 2011-08-16 2015-07-01 Lg Innotek Co Ltd Pixel, pixel array, image sensor including the same and method for operating the image sensor
WO2014008946A1 (en) 2012-07-13 2014-01-16 Teledyne Dalsa B.V. Method of reading out a cmos image sensor and a cmos image sensor configured for carrying out such method

Also Published As

Publication number Publication date
DE69816126T2 (en) 2004-04-15
JP2002500476A (en) 2002-01-08
EP1044561B1 (en) 2003-07-02
DE69816126D1 (en) 2003-08-07
AU1949599A (en) 1999-07-19
US6008486A (en) 1999-12-28
EP1044561A1 (en) 2000-10-18
CA2315145A1 (en) 1999-07-08

Similar Documents

Publication Publication Date Title
US6008486A (en) Wide dynamic range optical sensor
US11595600B2 (en) Method, apparatus and system providing a storage gate pixel with high dynamic range
EP0862829B1 (en) An image sensor with high dynamic range linear output
US6570617B2 (en) CMOS active pixel sensor type imaging system on a chip
US7369166B2 (en) Single substrate camera device with CMOS image sensor
US6606122B1 (en) Single chip camera active pixel sensor
US6947087B2 (en) Solid-state imaging device with dynamic range control
US7476836B2 (en) Multi-point correlated sampling for image sensors
KR100750778B1 (en) Photodiode active pixel sensor with shared reset signal row select
US8462248B2 (en) Active pixel sensor with mixed analog and digital signal integration
US20050128327A1 (en) Device and method for image sensing
US20080246869A1 (en) Differential readout from pixels in CMOS sensor
EP0871326B1 (en) Motion-detecting image sensor incorporating signal digitization
USRE42918E1 (en) Single substrate camera device with CMOS image sensor
US20090084937A1 (en) Solid state imaging device
Yadid-Pecht et al. Image sensor with high dynamic range linear output

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2315145

Country of ref document: CA

Ref country code: CA

Ref document number: 2315145

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1998964332

Country of ref document: EP

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 527085

Kind code of ref document: A

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: KR

WWP Wipo information: published in national office

Ref document number: 1998964332

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWG Wipo information: grant in national office

Ref document number: 1998964332

Country of ref document: EP