WO1989012837A1 - Range finding device - Google Patents

Range finding device

Info

Publication number
WO1989012837A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
light source
image data
image
range finding
Prior art date
Application number
PCT/AU1989/000263
Other languages
French (fr)
Inventor
Kemal Ajay
Original Assignee
Kemal Ajay
Priority date
Filing date
Publication date
Application filed by Kemal Ajay filed Critical Kemal Ajay
Publication of WO1989012837A1 publication Critical patent/WO1989012837A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Abstract

A pulse generator (11) provides timed trigger signals to an image intensifier driver (12) which drives the gate of the image intensifier (9). The pulse generator also triggers the laser diode array and associated drive circuitry (17) to emit light towards a scene. Reflected light from the scene enters the imaging system through a filter (7) and lens (8), which focuses the filtered light onto the intensifier (9), whose output is observed by a video camera (10). The output of the video camera (10) is connected to a frame store (14) and sync processor (13). The system controller (15) regulates a switch (16) which allows the diode array to be operated in pulsed mode (a), continuous mode (b) or off mode (c). The three outputs from the frame store (14) are processed by circuitry (18) to produce data representing range information about the scene. This information, after passing through a digital-to-analog converter (19), is combined with video sync information from the sync processor (13) and the resulting signal is displayed on a video monitor (21).

Description

RANGE FINDING DEVICE
FIELD OF THE INVENTION
This invention relates to optical radar systems, and more particularly to systems which acquire range data points in a parallel or simultaneous fashion, rather than point by point using a scanning mechanism. The invention has particular utility in robotics, where it is necessary for a robot to obtain a "picture" of its surroundings, and consequently the invention is generally applicable over only a short range.
DESCRIPTION OF PRIOR ART
Previous disclosures relating to pulsed illuminator, gated receiver viewing systems are mostly concerned with equalizing the intensity of viewed targets at different ranges while eliminating the effects of backscatter illumination. This is addressed in the following disclosures.
Chernoch, U.S. Patent No. 3,305,633, discloses a system for enhancing the contrast of images of distant targets.
Bamburg et al., U.S. Patent No. 3,899,250 discloses a system where the delay between the outgoing pulse and receiver activation is controlled on successive pulse cycles, to give a known correction for sensitivity with distance.
French, U.S. Patent No. 4,226,529, presents a viewing system whereby the contrast of an image of a target at a particular distance is enhanced with the time gating adjustable to view different ranges.
Contini et al., U.S. Patent No. 4,603,250, discloses a viewing system in which an image intensifier is used as the receiver, with the photocathode gated and the gain of the receiver adjusted by varying the microchannel plate voltage.
This overcomes problems associated with poor focus control in gain-controlled image intensifiers, where the gain is controlled by the photocathode voltage alone. The detailed operation of a gated image intensifier is also presented.
Previous approaches to the problem of obtaining representations of range data using pulsed illumination include the following.
Kleider, U.S. Patent No. 4,068,124, uses a combination of pulsed illumination techniques and a single-line CCD sensor to detect wire-like obstacles at a fixed range.
Meyerand et al., U.S. Patent No. 3,463,588, describes a pulsed illuminator viewing system whereby a scene is scanned longitudinally (along the direction of light propagation). The time delay between the outgoing pulse and the receiver activation pulse is varied by an operator, with the result that the system is sensitive only to objects at a distance corresponding to the time delay. Then, for a given delay, and hence distance, an object is visible only if it exists at that distance.
Endo, U.K. Patent No. GB2,139,036A, discloses an optical radar for vehicles wherein an array of photodiodes, onto which the scene is focused, is used to detect the return echo of a laser light pulse which illuminates the scene. Each diode is activated in turn and the time taken for light to travel from the source to the reflecting object and back to the photodiode is measured. The echo time gives the time of flight, and hence distance, to the object in the scene that the particular diode is focused upon.
DESCRIPTION OF THE INVENTION
Accordingly, it is an object of this invention to provide an improved optical radar range finding system which at least reduces the aforementioned problems of prior art systems.
Thus one broad form of the invention provides an optical radar range finding system comprising a high speed switchable light source for illuminating a scene, a high speed switchable imaging device for receiving a reflected image from said scene and providing an output to control means, a pulsing means connected to said light source and to said imaging device to provide pulses to trigger said light source and said imaging device, respectively, said control means including a store means for storing image information from said imaging device, control logic to sequence operation of said light source and logic circuitry to process data and produce range information relevant to said scene. Preferably, said control logic operates a switching device and the switching device is able to switch said light source to be continuously on, triggered by said pulsing means or in an off condition.
Preferably, said light source is a laser diode array. Preferably, the store means is a video frame store.
Another broad form of the invention provides a method of obtaining range information relating to a scene comprising the steps of:
(1) illuminating the scene with continuous light from a light source and storing first image data reflected therefrom;
(2) illuminating the scene with high speed switchable pulsed light from the light source and storing second image data reflected therefrom; and
(3) dividing the second image data by the first image data to obtain range image data relevant to the range of the scene.
Preferably, said method includes the further step of storing third image data reflected from the scene in the absence of light from the light source and subtracting the third image data from the first and second image data, respectively, prior to said division so as to remove background illumination effects.
In order that the invention may be more readily understood, one particular embodiment will now be described with reference to the accompanying drawings wherein:
Figure 1 is a simplified block diagram of apparatus constituting the system.
Figure 2 shows time profiles of reflected light pulses and sensitivity of the image sensor, of the system of Figure 1.
Figures 3 and 4 show plan view and side view, respectively, of experimental apparatus used to demonstrate the apparatus.
Figure 5 shows the range image acquired from the system when viewing the apparatus of Figures 3 and 4.
Figure 6 shows a profile of distances for a horizontal section through Figure 5.
Figure 7 is a more detailed block diagram of the apparatus of Figure 1.
Figure 8 is a timing diagram showing timing details of the system's hardware, and
Figure 9 shows graphs of the approximate output of the laser diode array, and the image intensifier sensitivity, respectively.
To extract range information about a scene, where the only light source is that of the invention, two modes of operation are necessary so that two images may be captured.
OPERATING MODE 1
Referring to Figure 1, the control hardware 1 selects switch 2 so that the high speed light source 3 is pulsed on and off by the pulse generator 4. The pulse generator 4 also drives circuitry (not shown in Figure 1) that controls the sensitivity of image sensor 5. The resulting image is stored in the control hardware 1. The reference 6 represents a video monitor.
OPERATING MODE 2
In this mode, the control hardware 1 selects switch 2 so that the light source is continuously on. The resulting image from the image sensor is again stored in the control hardware 1.
PROCESSING
Let I1 be the sensed image from mode 1.
Let I2 be the sensed image from mode 2.
I1 represents an image that is dependent on the surface reflectivity of the objects in the scene. The intensity of the image is dependent upon the distance to the objects, due to the inverse square law. The image is also dependent upon a gating effect caused by the overlap of the reflected light pulse from the scene with the 'on' time of the image sensor, again dependent on distance.
I2 is taken when the light source is continuously on and is dependent on the surface reflectivity of objects in the scene and, due to the inverse square law, on the distance to the objects.
The range is found by forming the quotient of the corresponding parts of I1 and I2, so that
R = K · I1 / I2
where R is a range image and K is a calibration factor which is constant.
This processing removes the effect of surface reflectivity and inverse square light loss. The only effect that remains is that due to the overlap, or convolution, of the pulsed light with the gated image sensor.
Idealised time profiles of the reflected light pulse "a" and the sensitivity of the image sensor "b" are shown in Figure 2. The convolution of the two is shown as "c". Each profile is a plot of relative magnitude versus time.
The convolution value depends on the delay between the onset of the reflected pulse at the receiver, and the onset of the activation of the image sensor. This delay is due to the round trip time for light to travel from the pulsed light source, to the objects in the scene and back to the image sensor 5. Thus, for any pixel in the range image, knowing the shape of the convolution function and the overall calibration factor K, the distance from the rangefinder to any point in the image may be found directly from R.
The calibration factor, K, is globally applied to all points in the image and may be found by obtaining a range image of a scene of known dimensions. This need be done only once.
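For illustration only, the idealised case of Figure 2 can be sketched in a few lines of code. Assuming, as the figure suggests but the text does not state, that the reflected pulse and the sensor gate are rectangular and of equal width T, the overlap falls linearly with the round-trip delay, so a measured quotient R can be inverted to a distance once K is known. The function names and the equal-width assumption below are illustrative, not taken from the disclosure.

```python
# Illustrative sketch only: idealised gating model assumed from Figure 2
# (rectangular laser pulse and sensor gate of equal width T, zero offset).
# Names and the linear-overlap assumption are not from the patent text.

C = 3.0e8   # speed of light, m/s
T = 30e-9   # assumed pulse / gate width, s (the ~30 nSec figure quoted later)

def overlap_fraction(distance_m: float) -> float:
    """Fraction of the reflected pulse falling inside the sensor gate."""
    round_trip = 2.0 * distance_m / C
    return max(0.0, 1.0 - round_trip / T)   # linear fall-off for equal rectangles

def distance_from_R(R: float, K: float) -> float:
    """Invert the idealised gating function, R = K * overlap_fraction(d)."""
    frac = min(max(R / K, 0.0), 1.0)
    return (1.0 - frac) * C * T / 2.0
```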
A refinement of the above mode of operation, to facilitate operation in non-ideal environments, is the extension of the control hardware to capture a third 'background' image to include in the processing. This image is taken with the light source turned off so that background illumination effects are recorded. The processing is modified so that
R = K · (I1 - I3) / (I2 - I3)
where I3 is the background image. This removes background effects from I1 and I2 which would cause errors in the range values.
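A minimal sketch of the per-pixel processing just described, assuming the three captured frames are available as arrays (the array names and the small divisor guard are our additions, not from the patent):

```python
import numpy as np

def range_image(I1, I2, I3, K, eps=1e-6):
    """Form R = K * (I1 - I3) / (I2 - I3) pixel by pixel.

    I1: frame taken with the light source pulsed (gated image)
    I2: frame taken with the light source continuously on
    I3: background frame taken with the light source off
    K : global calibration factor, found once from a scene of known dimensions
    The eps guard against division by zero in dark regions is an added
    safeguard, not part of the patent description.
    """
    num = I1.astype(np.float64) - I3.astype(np.float64)
    den = I2.astype(np.float64) - I3.astype(np.float64)
    return K * num / np.maximum(den, eps)
```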
Referring to Figures 3 and 4, it can be seen that the experiment used to demonstrate the system comprises six test cards A separated from each other by 15 cm. The receiver 5 is 115 cm from the closest test card A. The cards are also offset transversely to the direction of the light source. Figure 5 shows the range image acquired using such apparatus and Figure 6 shows a profile of distances to the various cards on a horizontal section through the image of Figure 5.
Reference is now made to the more detailed block diagram shown in Figure 7. A pulse generator 11 provides timed trigger signals to the image intensifier driver 12 which drives the gate of the image intensifier 9. The pulse generator also triggers the laser diode array and associated drive circuitry 17. The laser diodes used are the SHARP LT015 F type. Light from the laser diode array is directed toward the scene (not shown).
Light reflected from the scene enters the imaging system through the filter 7 to reduce the effect of light from extraneous sources. Primary lens 8 focuses the filtered light onto the image intensifier 9, Varo type 5772, which produces an output observed by the video camera 10, NEC TI 22C. The output of video camera 10 is connected to frame store 14 and sync processor 13. The system controller 15 selects the mode of the electronic switch 16, which allows the diode array 17 to be operated in pulsed mode (position a), continuous mode (position b) or turned off completely (position c). The controller 15 also selects one of three frame memory buffers in the frame store 14, corresponding to images obtained in the three operating modes of the laser diode array.
The three outputs from the frame store 14 are processed by digital logic circuitry 18 to produce data representing range information about the scene. This is converted to an analogue signal by a digital to analog converter 19, the output of which is combined by a circuit 20 with video sync information from the sync processor 13. The resulting signal is displayed on a video monitor 21.
The image intensifier 9 has a driving voltage which ranges from 0 V to -60 V and back to 0 V in 30 nSec. The microchannel plate is provided with -850 V (MCP out to MCP in) and the phosphor screen accelerating voltage is 5 V (Screen to MCP out).
Receiving lens optics 8 consist of a manually adjusted focus lens and a galvanometer controlled aperture. The filter is a Kodak Wratten filter #87.
Figure 8 shows the relevant timing details of the system's hardware. Timing signals (23), (24) and (25) show the select signals for operating the mode switch and governing the frame buffer memory selection in the frame store 14. These are shown in relation to the odd and even fields of the interlaced video signal 22 from the video camera 10. The time when a control signal is asserted is the "active" time and is indicated by reference 26. When active, signal 23 selects the pulsed mode of the laser array. When signal 24 is active, it selects the continuous mode of the laser array, while signal 25 being active selects the background mode, during which time the laser array is idle. The laser array is also idle when none of the selections is active (time intervals 27). The active time 26 for a select signal is two video field times in length, or 20 mSec. The images are captured during the odd field of the interlaced video, represented by the low state of signal 22. Reference 27 represents the recovery time between modes. This allows the phosphor of the image intensifier to return to a neutral state prior to the next selection. The recovery time is also two field times. These timing values are adjusted according to the persistence characteristics of the imaging system.
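The select-signal sequencing described in the preceding paragraph might be summarised in pseudocode as follows. This is a rough sketch under the stated timing (two video fields active, two fields of recovery per mode); the hardware hooks set_mode and grab_odd_field are hypothetical names, not part of the disclosure.

```python
import time

FIELD_TIME = 0.010                               # one video field ~ 10 mSec
MODES = ("pulsed", "continuous", "background")   # select signals 23, 24 and 25

def acquire_three_frames(set_mode, grab_odd_field):
    """Sketch of the controller sequencing: assert each select signal for two
    field times (active time 26), capture the image during the odd field,
    then allow two field times of recovery (interval 27) so the intensifier
    phosphor settles before the next mode."""
    frames = {}
    for mode in MODES:
        set_mode(mode)                    # assert select signal
        frames[mode] = grab_odd_field()   # image captured during the odd field
        time.sleep(2 * FIELD_TIME)        # hold for the two-field active time
        set_mode(None)                    # deselect: laser array idle
        time.sleep(2 * FIELD_TIME)        # two-field recovery between modes
    return frames
```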
Figure 8 also shows the timing of the waveforms that drive the laser diode during pulsed mode, and drive the image intensifier. The length 28 of the driving pulse is of the order of 30 nSec. The off time 29 may be varied between 30 nSec and greater than 100 nSec. The image intensifier 9 has the same pulse duration and duty cycle as the laser array. A time delay 32 is adjustable over the range of + or - 15 nSec. This is to compensate for circuit delays and the longitudinal displacement between the laser array and the image intensifier. This delay is set once, after the system is constructed, and is not thereafter adjusted unless recalibration is required.
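By way of illustration only (this arithmetic is ours, not part of the disclosure), the nanosecond figures quoted above translate to metre-scale distances through the round-trip relation d = c·t/2:

```python
C = 3.0e8   # speed of light, m/s

def round_trip_to_distance(t_seconds):
    """One-way distance corresponding to a round-trip time t."""
    return C * t_seconds / 2.0

print(round_trip_to_distance(30e-9))   # 30 nSec pulse width -> 4.5 m of range
print(round_trip_to_distance(15e-9))   # 15 nSec trim delay  -> 2.25 m offset
```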
Figure 9 shows approximate graphs of the laser optical output 33 and the image intensifier sensitivity 34. These vary from ideal square waves because of the limited response times of the driver circuitry. The effects of these imperfections on the final range readings are readily calibrated out of the system.
It will be appreciated that the invention differs from the prior art in that a complete range image is determined, in the preferred embodiment, after three image sampling operations, whereas previous systems have relied on acquiring range information by scanning the scene with repetitive samples longitudinally, along the direction of light propagation (Meyerand et al.).
As a result of the reduced number of samples, the range representation of the scene is acquired much more quickly.
Images obtained from the three modes of operation of a pulsed illuminator, gated receiver imaging system are combined through the use of the special hardware to produce a representation of range data for the viewed scene. The system captures a complete range image of the viewed scene using just three images, one from each mode.
In a simplified form of operation, which still produces quite acceptable results, the step which measures the background image may be eliminated, particularly when background light levels are low.
Since modifications within the spirit and scope of the invention may be readily effected by persons skilled in the art, it is to be understood that the invention is not limited to the particular embodiment described, by way of example, hereinabove.

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1. An optical radar range finding system comprising a high speed switchable light source for illuminating a scene, a high speed switchable imaging device for receiving a reflected image from said scene and providing an output to control means, a pulsing means connected to said light source and to said imaging device to provide pulses to trigger said light source and said imaging device, respectively, said control means including a store means for storing image information from said imaging device, control logic to sequence operation of said light source and logic circuitry to process data and produce range information relevant to said scene.
2. An optical radar range finding system according to claim 1, wherein the control logic operates a switching device and the switching device is able to switch the light source to be continuously on, triggered by the pulsing means or in an off condition.
3. An optical radar range finding system according to claim wherein the light source is a laser diode array.
4. An optical radar range finding system according to claim wherein the store means is a video frame store.
5. A method of obtaining range information relating to a scene comprising the steps of: (1) illuminating the scene with continuous light from a light source and storing first image data reflected therefrom;
(2) illuminating the scene with high speed switchable pulsed light from the light source and storing second image data reflected therefrom; and
(3) dividing the second image data by the first image data to obtain range image data relevant to the range of the scene.
6. A method according to claim 5, including the further step of (4) storing third image data reflected from the scene in the absence of light from the light source and subtracting the third image data from the first and second image data, respectively, prior to step (3).
7. A method according to claim 5 using the optical radar range finding system of any one of claims 1 to 3.
8. A method according to claim 6 using the optical radar range finding system of any one of claims 1 to 5.
9. An optical radar range finding system substantially as hereinbefore described with reference to the accompanying drawings.
10. A method of obtaining range information relating to a scene substantially as hereinbefore described with reference to the accompanying drawings.
PCT/AU1989/000263 1988-06-20 1989-06-20 Range finding device WO1989012837A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPI8876 1988-06-20
AUPI887688 1988-06-20

Publications (1)

Publication Number Publication Date
WO1989012837A1 true WO1989012837A1 (en) 1989-12-28

Family

ID=3773167

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU1989/000263 WO1989012837A1 (en) 1988-06-20 1989-06-20 Range finding device

Country Status (3)

Country Link
EP (1) EP0424409A4 (en)
JP (1) JPH03505123A (en)
WO (1) WO1989012837A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0396867A2 (en) * 1989-05-12 1990-11-14 DORNIER GmbH Navigation procedure
EP0396865A3 (en) * 1989-05-12 1991-02-27 DORNIER GmbH Optical radar
EP0835460A2 (en) * 1995-06-22 1998-04-15 3DV Systems Ltd. Improved optical ranging camera
US6445884B1 (en) 1995-06-22 2002-09-03 3Dv Systems, Ltd. Camera with through-the-lens lighting
WO2005052633A1 (en) * 2003-10-29 2005-06-09 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Distance sensor and method for distance detection

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3897150A (en) * 1972-04-03 1975-07-29 Hughes Aircraft Co Scanned laser imaging and ranging system
US4190362A (en) * 1977-05-04 1980-02-26 Societe Anonyme De Telecommunications Laser telemeter
GB2125649A (en) * 1982-08-18 1984-03-07 Eastman Kodak Co Improvements in or relating to rangefinders
US4644149A (en) * 1983-04-18 1987-02-17 Canon Kabushiki Kaisha Photoelectric transducer element
US4812035A (en) * 1986-11-03 1989-03-14 Raytheon Company AM-FM laser radar

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4298280A (en) * 1979-09-25 1981-11-03 Massachusetts Institute Of Technology Infrared radar system
US4501961A (en) * 1982-09-01 1985-02-26 Honeywell Inc. Vision illumination system for range finder
US4678323A (en) * 1984-07-20 1987-07-07 Canon Kabushiki Kaisha Distance measuring devices and light integrators therefor

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3897150A (en) * 1972-04-03 1975-07-29 Hughes Aircraft Co Scanned laser imaging and ranging system
US4190362A (en) * 1977-05-04 1980-02-26 Societe Anonyme De Telecommunications Laser telemeter
GB2125649A (en) * 1982-08-18 1984-03-07 Eastman Kodak Co Improvements in or relating to rangefinders
US4644149A (en) * 1983-04-18 1987-02-17 Canon Kabushiki Kaisha Photoelectric transducer element
US4812035A (en) * 1986-11-03 1989-03-14 Raytheon Company AM-FM laser radar

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0424409A4 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0396867A2 (en) * 1989-05-12 1990-11-14 DORNIER GmbH Navigation procedure
EP0396865A3 (en) * 1989-05-12 1991-02-27 DORNIER GmbH Optical radar
EP0396867A3 (en) * 1989-05-12 1991-08-14 DORNIER GmbH Navigation procedure
US6445884B1 (en) 1995-06-22 2002-09-03 3Dv Systems, Ltd. Camera with through-the-lens lighting
EP0886790A2 (en) * 1995-06-22 1998-12-30 3DV Systems Ltd. Telecentric 3d camera and method
EP0835460A4 (en) * 1995-06-22 1999-01-13 3Dv Systems Ltd Improved optical ranging camera
EP0835460A2 (en) * 1995-06-22 1998-04-15 3DV Systems Ltd. Improved optical ranging camera
US6654556B2 (en) 1995-06-22 2003-11-25 3Dv Systems Ltd. Camera with through-the-lens lighting
EP0886790B1 (en) * 1995-06-22 2006-03-01 3DV Systems Ltd. Telecentric 3d camera and method
US6993255B2 (en) 1999-02-16 2006-01-31 3Dv Systems, Ltd. Method and apparatus for providing adaptive illumination
US7355648B1 (en) 1999-02-16 2008-04-08 3Dv Systems Ltd. Camera having a through the lens pixel illuminator
WO2005052633A1 (en) * 2003-10-29 2005-06-09 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Distance sensor and method for distance detection
US7186965B2 2003-10-29 2007-03-06 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V Distance sensor and method for detecting a distance

Also Published As

Publication number Publication date
EP0424409A1 (en) 1991-05-02
EP0424409A4 (en) 1992-01-15
JPH03505123A (en) 1991-11-07

Similar Documents

Publication Publication Date Title
CA1332978C (en) Imaging lidar system using non-visible light
EP0462289B1 (en) Apparatus for measuring three-dimensional coordinates
US6323942B1 (en) CMOS-compatible three-dimensional image sensor IC
CN1099802C (en) Device and method for detection and demodulation of intensity modulated radiation field
US4708473A (en) Acquisition of range images
US7834985B2 (en) Surface profile measurement
US5048950A (en) Optical radar
JP2004538491A (en) Method and apparatus for recording three-dimensional range images
KR20010033549A (en) Method and device for recording three-dimensional distance-measuring images
JP2004523769A (en) Surface shape measurement
JPH03188322A (en) Method for image-forming two wavelength original position of single internal wave
US4119379A (en) Optical detection and ranging apparatus
Bretthauer et al. An electronic Cranz–Schardin camera
GB2374743A (en) Surface profile measurement
WO1989012837A1 (en) Range finding device
EP3543742B1 (en) A 3d imaging system and method of 3d imaging
AU3831189A (en) Range finding device
Christie et al. Design and development of a multi-detecting two-dimensional ranging sensor
GB2154388A (en) Image processing system
CN100417915C (en) Scanner-free imaging range finding method and its range finder
Kotake et al. Performance improvement of real-time 3D imaging ladar based on a modified array receiver
JPH0136082B2 (en)
EP0777134A1 (en) Device for observing objects
SU712662A1 (en) Method of remote automatic measuring of demensions of similar objects
GB2339355A (en) Laser rangefinder

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE

WWE Wipo information: entry into national phase

Ref document number: 1989907048

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1989907048

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1989907048

Country of ref document: EP