WO2009002327A1 - Optical rangefinder for a 3-d imaging system - Google Patents

Optical rangefinder for a 3-d imaging system

Info

Publication number
WO2009002327A1
WO2009002327A1 PCT/US2007/015302 US2007015302W
Authority
WO
WIPO (PCT)
Prior art keywords
distance
optical rangefinder
reference point
image
imaging system
Prior art date
Application number
PCT/US2007/015302
Other languages
French (fr)
Inventor
Youngshik Yoon
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Application filed by Thomson Licensing
Priority to PCT/US2007/015302
Publication of WO2009002327A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G01C3/08 Use of electric radiation detectors
    • G01C3/085 Use of electric radiation detectors with electronic parallax measurement


Abstract

A 3-D imaging system including an optical rangefinder attached to a camera is described. The optical rangefinder is used to convert un-superimposed images into fine object depth information based on a comparison with a reference point. Additionally, full 3-D image capturing is obtained using different infrared wavelengths for object differentiation.

Description

OPTICAL RANGEFINDER FOR A 3-D IMAGING SYSTEM
FIELD OF THE INVENTION
The invention relates generally to 3-D image capturing. More specifically, the invention relates to an optical rangefinder for a 3-D imaging system.
BACKGROUND OF THE INVENTION
It is well known that an image may be enhanced with an appearance of depth by converting the image into a so-called 3-D image. This is most often accomplished by optically polarizing the images which are to be viewed by a viewer's left eye differently than the images which are to be viewed by a viewer's right eye. The 3-D effect is perceived by the viewer when the viewer views the polarized images through the use of polarized filter lenses, commonly configured as '3-D viewing glasses' with a polarized filter for use with the left eye of the viewer and a differently polarized filter for use with the right eye of the viewer.
Typically, depth information for 3-D image capturing is obtained using the human eye and a laser, together with a detecting device including an infrared sensor and optics, or by using a stereo camera. However, the use of a laser has often been objected to for safety reasons; in particular, there is the possibility that the laser will shine directly into a human eye and cause irreparable damage. A stereo camera, in turn, is costly and often provides a limited view. Thus, a method of obtaining depth information for 3-D image capturing at low cost and without risk to the human eye is needed.
SUMMARY OF THE INVENTION
A 3-D imaging system includes an optical rangefinder attached to a camera.
The optical rangefinder is used to convert un-superimposed images into fine object depth information based on a comparison with a reference point. Additionally, full 3-D image capturing is obtained using different infrared wavelengths for object differentiation.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described with reference to the accompanying figures, of which:
FIG. 1 is a schematic drawing of an optical rangefinder of the present invention;
FIG. 2 shows the optical rangefinder of FIG. 1 attached to a camera of a 3-D imaging system; and
FIG. 3 is a flow chart describing the conversion of image information into depth information.
DETAILED DESCRIPTION OF THE INVENTION
The present invention is directed to a 3-D imaging system including an optical rangefinder attached to a camera. The optical rangefinder is used to convert un-superimposed images into depth information based on a comparison with a reference point. Additionally, full 3-D image capturing is obtained using different infrared wavelengths for object differentiation.
Referring now to FIG. 1, there is shown a schematic drawing of an optical rangefinder 10 of the present invention. The optical rangefinder includes a roof pentaprism 25, a beam splitter 12, an eyepiece lens 14, mirrors 17, 18, and a converging lens 16. Un-superimposed image information of an object is input to the optical rangefinder 10 at two points 21, 20.
Image information 21 from a first perspective of the object is provided directly to the human eye 11 through the beam splitter 12. Image information 20 from a second perspective of the object is provided to the human eye 11 via the roof pentaprism 25, converging lens 16 and eyepiece lens 14. The roof pentaprism 25 inverts the image 20. The converging lens 16 focuses the inverted image and shifts it to superimpose the images 20, 21 as the lens 16 is adjusted. In order to superimpose the images 20, 21, the converging lens 16 should rotate. Then, based on the pivoting angle of the lens 16, a distance relationship can be established.
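The patent does not state the exact relationship between the pivoting angle and the object distance, but classical coincidence-rangefinder geometry suggests one: with the two apertures separated by the baseline D, the deflection angle θ that brings the two images into superposition satisfies tan(θ) = D / R, where R is the range to the object. A minimal sketch in Python, assuming this relationship (the names range_from_pivot_angle, baseline_d_m and theta_rad are illustrative, not from the patent):

    import math

    def range_from_pivot_angle(baseline_d_m: float, theta_rad: float) -> float:
        """Assumed coincidence-rangefinder model: the pivoting converging lens
        deflects the second line of sight by theta_rad so that the two images
        superimpose; the range R then follows from tan(theta) = D / R."""
        if theta_rad <= 0.0:
            return float("inf")  # parallel lines of sight: object at infinity
        return baseline_d_m / math.tan(theta_rad)

    # Example: apertures 10 cm apart, deflection of 0.1 degree -> roughly 57 m
    print(range_from_pivot_angle(0.10, math.radians(0.1)))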
In operation, the optical rangefinder 10 is attached to a camera 5, as shown in FIG. 2a. Referring to FIG. 3, initial characteristics of the optical rangefinder 10 are established, as indicated in step 100. The initial characteristics of the optical rangefinder 10 include determining the distance D (FIG. 2a) between the two apertures 20, 21 (FIG. 1).
Thereafter, the distance from the viewing points 20, 21 to the reference point 35 should be determined, as indicated in step 200. The reference point 35 is the point at which perfect superposition occurs. This distance may be determined, for example, by attaching an infrared radiator to an object at the reference point and detecting the distance the infrared light travels, or by a simple triangular geometry calculation. For multiple objects, several infrared radiators, each with a different frequency, may be used to differentiate between them. Referring to step 300, random objects A 40, B 50, C 60 are placed between the optical rangefinder 10 and the reference point 35. Then, based on the distance between the two apertures 20, 21 as well as the distance from the viewing point to the reference point 35, the horizontal distance between the two un-superimposed images can be calculated to find the position of the random objects A 40, B 50, C 60, using simple geometry, i.e., parallax. This establishes the distance of each object A 40, B 50, C 60 from the viewing point.
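The patent describes this calculation only as simple geometry, i.e., parallax. One way to model it, assuming a pinhole model with an effective focal length f (a parameter the patent does not give), is that an object at distance z produces an image separation s = f · D · (1/z − 1/z_ref), which vanishes at the reference point 35 and grows as the object moves toward the rangefinder; inverting this recovers z. The sketch below is illustrative only, and its names (disparity, focal_length, z_ref) are assumptions:

    def distance_from_disparity(disparity: float,
                                baseline_d: float,
                                focal_length: float,
                                z_ref: float) -> float:
        """Invert the assumed model s = f * D * (1/z - 1/z_ref) to recover z.

        disparity     horizontal separation of the two un-superimposed images
        baseline_d    distance D between the two apertures 20, 21
        focal_length  assumed effective focal length of the viewing optics
        z_ref         calibrated distance to the reference point 35 (s = 0 there)
        """
        return 1.0 / (disparity / (focal_length * baseline_d) + 1.0 / z_ref)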
Referring to FIG. 2b, the un-superimposed images of the random objects A 40a1, A 40a2, B 50b1, B 50b2, C 60c1, C 60c2 get closer together as the objects approach the reference point 35. Thus, the horizontal distance 59 between un-superimposed images C 60c1, C 60c2 is greater than the horizontal distance 49 between un-superimposed images B 50b1, B 50b2. Similarly, both horizontal distances 59, 49 are greater than the horizontal distance 39 between un-superimposed images A 40a1, A 40a2.
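Continuing the illustrative model above, a hypothetical calibration (every number below is invented for illustration and not taken from the patent) reproduces the described ordering 59 > 49 > 39: the object farthest from the reference point 35 shows the largest image separation and the object nearest to it the smallest:

    # Hypothetical calibration: D = 0.10 m, f = 0.05 m, reference point at 10 m
    D, F, Z_REF = 0.10, 0.05, 10.0

    for name, z in (("C", 2.0), ("B", 4.0), ("A", 8.0)):
        s = F * D * (1.0 / z - 1.0 / Z_REF)            # forward model: image separation
        z_back = 1.0 / (s / (F * D) + 1.0 / Z_REF)     # inverse of the same model
        print(f"object {name}: z = {z:.1f} m, separation = {s * 1e3:.3f} mm, recovered z = {z_back:.1f} m")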
Once the distance of each object A 40, B 50, C 60 to the viewing point 35 is calculated, the acquisition of depth information across the field may be similarly calculated, as indicated in step 400 of FIG. 3. A great advantage of such depth information is that it is continuous, not discrete; in other words, defining depth levels is left to the user. Therefore, it can be applied to any 3-D display, regardless of its number of depth levels.
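Because the recovered depth is continuous, it can afterwards be quantized into however many depth levels a particular 3-D display supports. A minimal sketch of such a quantization step (the function and its parameters are illustrative, not part of the patent):

    def quantize_depth(depth: float, z_near: float, z_far: float, levels: int) -> int:
        """Map a continuous depth value onto one of `levels` discrete depth
        planes between z_near and z_far (values outside the range are clamped)."""
        t = (depth - z_near) / (z_far - z_near)
        t = min(max(t, 0.0), 1.0)
        return round(t * (levels - 1))

    # Example: an object at 4 m mapped onto an 8-level display spanning 1 m to 10 m
    print(quantize_depth(4.0, 1.0, 10.0, 8))  # -> 2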
3-D image generation of auxiliary objects, such as, for example, a hair attached to the main object, can also be calculated, as indicated in step 310. This is because the 2-D image is recorded by the camera, and once the initial depth information for the main object is calculated, the depth of an auxiliary object follows from the calculation for the main image.
For off-axis objects, the same method of determining the depth information may be used; only the different angular values across the field must be considered when calculating the depth of an object.
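The patent does not spell out how the angular values enter the calculation. One plausible, assumed reading is that the range measured along an off-axis line of sight is projected onto the optical axis using the field angle, as in this illustrative sketch:

    import math

    def axial_depth(slant_range: float, field_angle_rad: float) -> float:
        """Assumed off-axis correction: project the range measured along the
        line of sight onto the optical axis. field_angle_rad is the angle
        between the object direction and the optical axis (0 when on-axis)."""
        return slant_range * math.cos(field_angle_rad)

    # Example: 5 m measured at a 20-degree field angle -> about 4.7 m axial depth
    print(axial_depth(5.0, math.radians(20.0)))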
The foregoing illustrates only some of the possibilities for practicing the invention. Many other embodiments are possible within the scope and spirit of the invention. It is, therefore, intended that the foregoing description be regarded as illustrative rather than limiting, and that the scope of the invention is given by the appended claims together with their full range of equivalents.

Claims

1. A 3D image system, comprising:
    a camera; and
    a detecting device coupled to the camera, wherein distance to an object is determined based on a comparison of un-superimposed images with a reference point.
2. The 3D image system of claim 1 wherein the detecting device includes an infrared detector.
3. The 3D image system of claim 1 wherein the detecting device includes a roof pentaprism.
4. The 3D image system of claim 1 wherein the detecting device includes a converging lens.
5. A method of converting image information to depth information in a 3-D imaging system, comprising the steps of:
    determining a distance from a viewing point to a reference point; and
    converting a horizontal distance between un-superimposed images into depth information based on a comparison with the distance to the reference point.
PCT/US2007/015302 2007-06-28 2007-06-28 Optical rangefinder for a 3-d imaging system WO2009002327A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2007/015302 WO2009002327A1 (en) 2007-06-28 2007-06-28 Optical rangefinder for a 3-d imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2007/015302 WO2009002327A1 (en) 2007-06-28 2007-06-28 Optical rangefinder for a 3-d imaging system

Publications (1)

Publication Number Publication Date
WO2009002327A1 (en) 2008-12-31

Family

ID=39144283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/015302 WO2009002327A1 (en) 2007-06-28 2007-06-28 Optical rangefinder for a 3-d imaging system

Country Status (1)

Country Link
WO (1) WO2009002327A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3663105A (en) * 1967-06-28 1972-05-16 Alton D Anderson Method and apparatus for measuring range utilizing superimposition or alignment of images
US4341447A (en) * 1980-12-01 1982-07-27 Polaroid Corporation Infrared camera ranging system
US4469939A (en) * 1980-12-01 1984-09-04 Nippon Kogaku K.K. Distance measuring apparatus
US4533226A (en) * 1982-03-30 1985-08-06 Robert Bosch Gmbh Still or motion picture camera
US4601053A (en) * 1983-11-21 1986-07-15 Grumman Aerospace Corporation Automatic TV ranging system
US4601574A (en) * 1981-05-01 1986-07-22 Ricoh Company, Ltd. Distance measuring apparatus
GB2202104A (en) * 1987-02-13 1988-09-14 Tecnomare Spa Ranging by correlation
WO1994010535A1 (en) * 1992-10-30 1994-05-11 Vx Optronics Corp. Coincidence sensor for optical rangefinders

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07835953

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07835953

Country of ref document: EP

Kind code of ref document: A1