WO2010092533A1 - Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot - Google Patents

Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot

Info

Publication number
WO2010092533A1
Authority
WO
WIPO (PCT)
Prior art keywords
hologram
opl
coherence
source
light scattering
Prior art date
Application number
PCT/IB2010/050608
Other languages
French (fr)
Inventor
Christian Depeursinge
Yves Delacretaz
Daniel Boss
Original Assignee
Ecole Polytechnique Federale De Lausanne (Epfl)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecole Polytechnique Federale De Lausanne (Epfl) filed Critical Ecole Polytechnique Federale De Lausanne (Epfl)
Publication of WO2010092533A1 publication Critical patent/WO2010092533A1/en


Classifications

    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
                • G01B9/00 Measuring instruments characterised by the use of optical techniques
                    • G01B9/02 Interferometers
                        • G01B9/02015 Interferometers characterised by the beam path configuration
                            • G01B9/02032 Interferometers characterised by the beam path configuration generating a spatial carrier frequency, e.g. by creating lateral or angular offset between reference and object beam
                        • G01B9/02041 Interferometers characterised by particular imaging or detection techniques
                            • G01B9/02047 Interferometers characterised by particular imaging or detection techniques using digital holographic imaging, e.g. lensless phase imaging without hologram in the reference path
                        • G01B9/0209 Low-coherence interferometers
                            • G01B9/02091 Tomographic interferometers, e.g. based on optical coherence
                        • G01B9/021 Interferometers using holographic techniques
        • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
            • G03H HOLOGRAPHIC PROCESSES OR APPARATUS
                • G03H1/00 Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
                    • G03H1/0005 Adaptation of holography to specific applications
                        • G03H2001/0033 Adaptation of holography to specific applications in hologrammetry for measuring or analysing
                    • G03H1/04 Processes or apparatus for producing holograms
                        • G03H1/08 Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
                        • G03H1/0443 Digital holography, i.e. recording holograms with digital recording means
                            • G03H2001/0454 Arrangement for recovering hologram complex amplitude
                                • G03H2001/0456 Spatial heterodyne, i.e. filtering a Fourier transform of the off-axis record
                • G03H2222/00 Light sources or light beam properties
                    • G03H2222/20 Coherence of the light source
                        • G03H2222/24 Low coherence light normally not allowing valuable record or reconstruction
                • G03H2223/00 Optical components
                    • G03H2223/16 Optical waveguide, e.g. optical fibre, rod
                    • G03H2223/26 Means providing optical delay, e.g. for path length matching

Definitions

  • The invention relates to a method and apparatus for topological or shape measurement of objects of any size, using an interferometric technique comprising an optical source with selectable coherence, allowing contouring from single-shot interferograms and giving access to the absolute tri-dimensional topography of the illuminated object.
  • Incoherent light can be used with fringe-projection and Moiré-pattern techniques.
  • Photogrammetry techniques employing mainly stereo vision procedures to obtain 3D shape are also available.
  • The tri-dimensional information can also be provided by interferometric measurements. Holography and speckle interferometry can be used to evaluate the phase change induced when laser light is either transmitted through or reflected by an object.
  • Patent WO 0020929 (A1), "Method And Apparatus For Simultaneous Amplitude And Quantitative Phase Contrast Imaging by Numerical Reconstruction of Digital Holograms" by E. Cuche et al., teaches that phase can be derived from a single hologram. This approach holds for microscopic objects with sizes comparable to the wavelength. For macroscopic objects, however, surface roughness and the 2π phase ambiguity render the phase signal uninterpretable on unpolished surfaces. The phase is therefore practically never evaluated in large-object holography; only the amplitude or intensity of the reconstructed wavefront is used, which, by itself, does not provide a precise 3D shape measurement.
  • Coherence gating has permitted sectioning and profiling by evaluating the envelope of the interference signal, thereby providing a "gating" property that enables profiling.
  • The interference of two waves coming from a light source with finite temporal coherence length can be expressed as in equation (2) below.
  • Rizk et al. used the so-called white-light interferometry approach with a multimode laser diode as source, enhancing the detectability of the brightest fringe through adaptive digital filters based on a least-mean-square criterion.
  • OCT: optical coherence tomography
  • FD-OCT: Fourier-domain optical coherence tomography
  • Coherence radar and light-in-flight holography techniques are based on detecting the envelope of the interference signal in the time domain. They are indeed less sensitive to phase fluctuations and provide absolute tri-dimensional measurement. The counterpart is that they most often require scanning in time and in space, and are thus less adequate for applications where fast imaging is required.
  • The novelty of the approach used in this invention resides in the fact that it requires only one short acquisition per depth contour and a single sensor. In this sense the technique competes directly with long-range OCT when the latter is used for topological measurements.
  • The present invention relates to a method and apparatus for measuring the topography or shape of an object: convex or concave, of rounded shape, a solid object, or a hollow body or tube that can be measured from the inside, for instance with an endoscope.
  • The object can have any size, from microscopic to macroscopic.
  • The 3D shape or profile is derived from a series of depth contours, each measured from a single interferogram or hologram.
  • The retrieved data are quantitative and highly accurate. Depending on the characteristics of the surface, the accuracy and precision are in the micron range, possibly down to the nanometer range depending on the surface roughness and reflective properties, dielectric in particular.
  • The approach is based on an interferometric method whereby a source with selectable coherence length illuminates the object and feeds a reference path.
  • The interferometric setup comprises two arms arranged in an original configuration: the object path comprises a source which illuminates the object, and an optical setup comprising means to collect the light scattered by the object surface or volume and to image the object on a detector plane.
  • A second path is used as reference path, comprising a fixed or variable delay line whereby the Optical Pathlength (OPL) can be continuously varied in a computer-controlled manner, and also comprising a source which illuminates the detector plane, where an interference pattern resulting from the superposition of the object and reference beams is generated.
  • OPL: Optical Pathlength
  • The interference pattern is called a hologram.
  • The term hologram will be preferred to the term interference pattern, because a reconstruction of the wavefront to form an image of the object is always an option.
  • In one configuration, the interference pattern provides an in-focus image of the object and can be interpreted as a hologram yielding an image of the object by wavefront reconstruction without wavefront propagation.
  • In another configuration, the interference pattern cannot provide an in-focus image of all parts of the object; it can then advantageously be considered as a hologram formed on the detector plane, which can be acquired digitally and processed in order to focus the images of the different parts of the object digitally, according to the techniques of digital holography: the wavefield is first reconstructed in the detector or hologram plane by multiplication by a digital reference wave and then propagated to the different parts of the object.
  • The electronic focusing provided by the wavefront reconstruction is, however, not efficient enough to provide a full 3D image of the object from its various focused parts.
  • The large depth of field is beneficial to the observation of all parts of the object in focus, but detrimental to the lateral resolution of the object image.
  • This compromise appears favorable to the observation of objects in some applications such as endoscopy, where the imaging optics is designed to have a very low numerical aperture and therefore provides a large depth of field, so that the user does not have to manage focusing. In return, the 3D perception of the object is lost.
  • A low-coherence optical source provides a wavefield whose autocorrelation is limited in time by the so-called coherence time, which corresponds to a propagation distance of the wave called the coherence length.
  • The condition to obtain interference at a detector point is that the OPL from the illuminating source to the object light-scattering point, and from that point to the point in the detector plane, has a well-defined value corresponding to a mismatch between the object and reference paths smaller than the coherence length.
  • The coherence zone appears as a layer in the depth of the object. This discrimination over the depth of the object thus compensates for the loss of depth discrimination of low-N.A. imaging optics.
  • The interest of using a source with selectable coherence is that it permits control of the depth of the coherence zone, which can be adapted to the shape of the object as well as to the needed accuracy of the measurement: short or low coherence lengths provide high accuracy but reduced mutual coherence signals. Conversely, long coherence lengths provide high-intensity signals but low accuracy. A tradeoff between these two situations yields the optimal signal-to-noise ratio and measuring accuracy: this optimum is achieved for what is called the "reduced coherence length" and must be adjusted and selected for each object measurement.
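The accuracy/signal tradeoff above can be made concrete with the standard textbook rule of thumb relating coherence length to source bandwidth (a sketch with assumed numbers, not part of the patent; the approximation Lc ≈ λ0²/Δλ omits spectral-shape constants of order one):

```python
def coherence_length(center_wavelength_m: float, bandwidth_m: float) -> float:
    """Rule-of-thumb coherence length Lc ~ lambda0**2 / d_lambda.

    Widening the source spectrum shortens Lc, which narrows the coherence
    zone and sharpens depth discrimination, at the cost of a weaker
    mutual-coherence signal.
    """
    return center_wavelength_m ** 2 / bandwidth_m

# Assumed example: a superluminescent diode at 830 nm with 20 nm bandwidth
# gives Lc of roughly 34 micrometres; narrowing an interference filter to
# 2 nm lengthens Lc tenfold, trading depth accuracy for signal.
short_lc = coherence_length(830e-9, 20e-9)
long_lc = coherence_length(830e-9, 2e-9)
```

Selecting the "reduced coherence length" then amounts to picking the filter bandwidth that balances these two regimes for the object at hand.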
  • The disclosed method comprises the following innovative aspects: the determination of the areas of the interferogram or hologram captured on the detector plane where the mutual coherence terms do not vanish, and the computation of these so-called mutual coherence terms. The fact that said determination and computation are performed on a single interferogram or hologram, acquired in as short an acquisition time as the detector and source intensity permit, constitutes a major innovative aspect of the invention.
  • The innovative aspects also include the methods and associated computer algorithms involved in the determination of the non-vanishing mutual coherence terms. Different methods and algorithms are claimed as innovative numerical techniques performing well in these tasks.
  • The areas of non-vanishing mutual coherence are also called contour-depths.
  • A series of contour-depths can be retrieved from the interferograms or holograms obtained by changing the optical delay in the reference path.
  • The position of the apparatus can be changed relative to the object by moving the object and/or the apparatus.
  • The real 3D positions of the object light-scattering points can be determined from the contour-depth of the hologram by back-propagating the wavefield according to the teaching of Digital Holography or Digital Holographic Microscopy (DHM) techniques.
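The back-propagation of the wavefield can be sketched with the angular spectrum method, a standard tool of digital holography (a generic illustration with assumed sampling parameters, not the patent's own implementation):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, dz):
    """Propagate a sampled complex wavefield over a distance dz.

    Angular spectrum method: FFT to the spatial-frequency domain, multiply
    by the free-space transfer function exp(i*kz*dz), FFT back. Evanescent
    components (negative argument under the square root) are suppressed.
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * dz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Refocusing a hologram-plane field onto the object is a call with a
# negative dz (back-propagation); all values here are assumed examples.
hologram_field = np.ones((64, 64), dtype=complex)
refocused = angular_spectrum_propagate(hologram_field, 633e-9, 10e-6, -0.01)
```

In a DHM-style pipeline the input would be the cross term extracted from the hologram, multiplied by a digital reference wave, before propagation to each depth of interest.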
  • The invention can be applied to apparatus with small-NA imaging optics, whereby the image-forming process is a central projection of the object light-scattering points onto the detector plane.
  • The object light-scattering points can be positioned along the optical ray connecting the detector point in the detector plane and the principal points of the imaging optics.
  • The object light-scattering point is located along the optical ray at a distance determined by the delay line, according to the rule that the OPLs on the two paths of the interferometer must match within the coherence length.
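In reflection geometry this OPL-matching rule takes a simple form: a depth change dz along the ray adds 2·dz to the object-path OPL, so each delay-line setting selects a thin depth slab. A minimal sketch (hypothetical numbers; the factor 2 assumes illumination and collection along essentially the same ray):

```python
def coherence_gated_depth(opl_reference, opl_object_to_surface, lc):
    """Depth z (along the ray, beyond the nominal surface) selected by a
    given reference OPL, plus the coherence-gated window |2*dz| < lc,
    i.e. dz within +/- lc/2 of z."""
    z = (opl_reference - opl_object_to_surface) / 2.0
    return z, z - lc / 2.0, z + lc / 2.0

# Assumed example: reference arm 210.00 mm, object arm up to the nominal
# surface 200.00 mm, coherence length 40 um -> the gate sits 5 mm beyond
# the nominal surface, with a 40 um-wide accepted depth window.
z, z_min, z_max = coherence_gated_depth(0.210, 0.200, 40e-6)
```

Stepping the delay line then sweeps this slab through the object, producing one contour-depth per setting.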
  • The invention discloses means to derive the expression of a surface interpolating the cluster of object light-scattering points.
  • The disclosed method guarantees a low sensitivity to perturbations such as movements and vibrations.
  • The invention is particularly useful when vibrations or jerky movements of the measured object or of the measuring apparatus interfere with the measuring process, i.e. the detection of mutual coherence.
  • The method can therefore be used in hand-held apparatus such as endoscopes.
  • The invention relates to a feasible technique for contouring that is insensitive to perturbations and independent of the particularities of the object shape, such as sharp jumps, concavities and possibly hidden parts, through the addition of a plurality of illumination sources.
  • The technique performs well when light illuminates the sample at grazing incidence, which makes it particularly suitable for the observation of "tube-like" objects.
  • The disclosed apparatus can be realized as a standalone device or as a device that can be connected to another optical instrument. In the latter case, the apparatus enhances the functionality of the instrument in order to achieve quantitative dimensioning.
  • Short or reduced coherence interferometry or holography techniques yield much more robust measuring equipment, which can be used by non-specialists, such as physicians or surgeons in a hospital environment.
  • The apparatus provides a technique appropriate to quality and dimensional control, where the illumination can be designed and distributed over the tested objects.
  • The insensitivity of the technique to the morphological and constitutive particularities of the measured objects makes it possible to obtain depth signals and quantitative measurements even in harsh environments and on unstable samples. In particular, it allows topography changes to be recorded in real time.
  • The method can be applied by embedding the system in the tip of an endoscope. This opens new possibilities for endoscopic systems, in medical as well as industrial metrology.
  • Fig. 1 shows a schematic of the method. It consists of a source 1 with selectable coherence length, whose output is separated into two arms: 3, the arm in which the object to be measured is placed, and 2, the other arm, which forms an adjustable delay line used to select a contour line with respect to the length of the delay line. These two signals are then recombined and detected (4). The detected signal is then filtered (5). Each acquisition provides a set of points corresponding to a given depth. Finally, the whole point cloud provides a tri-dimensional representation of the object surface. Fig. 2 represents an implementation of the method with a source 1 with selectable coherence length, a detector 2, a device in charge of splitting/recombining the two beams 3, a variable reference path length 4 and a fixed object position 5.
  • Fig. 3 depicts a variant in which 1 is a source with selectable coherence length
  • 2 is a detector
  • 3 is a device in charge of splitting/recombining the two beams
  • 4 is a fixed reference path length
  • 5 is the object, with variable position. The same figure illustrates the situation where the measuring apparatus is moved and the object is fixed, or where both the object and the measuring apparatus are moved.
  • Fig. 4 depicts the coherent gating approach.
  • 1 is the optical source with selectable coherence length.
  • 2 is the device used to split the light between the object wave and the reference wave.
  • 3 is the reference path. 4 is a delay line.
  • 5 is the reference illuminating source.
  • 6 is the object path.
  • 7 is the object illuminating source.
  • 8 is the sample under study.
  • 9 is the volume defined by coherence gating.
  • 10 is an object light scattering point.
  • Fig. 5 depicts the physical principle leading to the coherent volume creation.
  • 1 is the image plane for a given object depth.
  • 2 is the detector plane.
  • 3 denotes the phase of the reference wave at the detector plane.
  • 4 is the imaging system with an adjustable aperture.
  • 5 is the illumination device.
  • 6 is the recombining device.
  • 7 depicts the reference wave arriving with an off-axis configuration.
  • 8 shows the isophase location of the reference wave in the object space for a given length of the reference path, the dashed curves represent isophase values shifted by +/- Lc/2.
  • 9 depicts the isoradius of the illumination wavefronts, the dashed curves represent isoradius values shifted by +/- Lc/2.
  • 10 depicts the volume in the object space corresponding to the coherence gating. 11 is the sample.
  • Fig. 6 depicts the relationship between the contour-lines which have to be reconstructed in 3D, and the contour-depth computed from the hologram or interferogram acquired on the detector plane (8).
  • 1 and 2 are representations of two contour-lines on the 3D object, with P and Q as particular points.
  • 3 is a sketch of the light path emanating from P, passing through the imaging optics 5 and joining the point P' of the line computed as the contour-depth on the detector plane 8.
  • 4 is a sketch of the light path emanating from Q, passing through the imaging optics 5 and joining the point Q' of the line computed as the contour-depth on the detector plane 8.
  • 6 and 7 are the contour-depths in the detector plane for the contour-lines 1 and 2, respectively.
  • the detector plane 8 is the plane where the electronic camera is placed.
  • P' and Q' are the images of the points P and Q, respectively, given by the imaging optics 5.
  • the gray area 9 features, for the object point P, the volume that is mapped to the single detector point P' by inverse convolution transform based on diffraction theory.
  • Fig. 7 shows the typical off-axis configuration, with a source 1 with selectable coherence length, a device 2 in charge of splitting/recombining the two beams, a mirror 3, and an imaging system 5 to collect the light backscattered by the object 7.
  • The reference and the object wave arrive at the detector 6 with different angles. This enables the creation of carrier fringes on the interferogram. Every position of the delay stage 4 corresponds to a different level curve on the object.
  • Fig. 9 is a flow chart depicting the measurement procedure.
  • 1 is the acquisition process
  • 2 is the numerical filtering procedure
  • 3 symbolizes the stack recording of the different interferograms
  • 4 is the process of changing the optical path difference between the two arms of the interferometer
  • 5 symbolizes the ability to observe live contour fringes during acquisition
  • 6 is the tri-dimensional rendering of the observed surface.
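The flow of Fig. 9 can be sketched as a plain acquisition loop. The callables `acquire` and `extract_contour_depth` are hypothetical stand-ins for steps 1 and 2 (their internals depend on the camera and the chosen filtering algorithm); nothing below is the patent's actual code:

```python
def measure_topography(delays, acquire, extract_contour_depth):
    """One single-shot hologram per reference delay (step 1), numerically
    filtered to its contour-depth (step 2) and stacked (step 3); the loop
    over delays is itself step 4, the change of optical path difference."""
    stack = []
    for delay in delays:
        hologram = acquire(delay)                   # step 1: acquisition
        contour = extract_contour_depth(hologram)   # step 2: filtering
        stack.append((delay, contour))              # step 3: stack recording
    return stack  # fed to the tri-dimensional rendering (step 6)

# Dummy stand-ins just to exercise the loop structure:
stack = measure_topography(
    delays=[0.0, 1.0, 2.0],
    acquire=lambda d: {"delay": d},
    extract_contour_depth=lambda h: h["delay"] * 10,
)
```

Because each contour comes from a single shot, the loop can also display the live fringes of step 5 between iterations without affecting the stack.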
  • Fig. 10 is an implementation of the method with a standalone setup. It consists of an interferometer in which the reference path has an adjustable length 3. 2 separates the light coming from the collimated laser diode into an object arm and a reference arm. An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6. A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram. 4 is used to recombine the reference wave with the object wave. 1 is a source with selectable coherence length, 5 is an optional lens used to adapt the reference beam.
  • Fig. 11 is a realization of the approach described in Fig. 2, using monomode fibers for the object path (10) and for the reference path (11).
  • 1 is a source with selectable coherence length
  • 5 is an optional lens used to adapt the reference beam.
  • 2 separates the light coming from the collimated laser diode into an object arm and a reference arm.
  • An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6.
  • A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram.
  • 4 is used to recombine the reference wave with the object wave.
  • 3 represents the delay stage used to adapt the reference path length in order to select different level curves.
  • Fig. 12 is a realization of the approach described in Fig. 3.
  • the fibered reference path (11) has a fixed length and the whole detection system (3) can be moved in three dimensions.
  • 1 is a source with selectable coherence length
  • 5 is an optional lens used to adapt the reference beam.
  • 2 separates the light coming from the collimated laser diode into an object arm and a reference arm.
  • An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6.
  • A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram. 4 is used to recombine the reference wave with the object wave.
  • 10 is a monomode fiber, whose length has to be chosen in conjunction with 11 so that the reference path and the object path have the same optical path length for the mean depth that has to be detected.
  • 12 is a position detector.
  • Fig. 13 is an implementation of the method by embedding the whole system in 2, at the tip of an endoscope 1.
  • An example of a realization of 2 is given by the elements contained in box 3 of Fig. 9.
  • Fig. 14 presents an implementation of the method in which the system is pluggable on rigid endoscopes 9. It consists of a device designed to be left on a cart 14, containing the source 1, the beam splitter 2 used to separate the light into object and reference waves, and the adjustable delay stage 3; and a device pluggable on other imaging systems 15, containing the recombining beam splitter 11, a lens or imaging system 12 to provide a real image from the virtual one coming out of the endoscope, an optional lens 10 to adjust the reference wave with respect to the object wave in order to create adequate fringes for detection, and the detector 13. 4 is a monomode fiber used as reference path; 5 is another monomode fiber, which is then split in three (6, 7, 8) in order to illuminate the sample 16.
  • Fig. 15 is the CAD design of the pluggable part described in Fig. 11.
  • Fig. 16 describes the mount for the multiple illumination fiber system.
  • 1 are the illumination fiber end ferrules (6, 7, 8 in Fig. 11), 2 is the ferrule mount, and 3 is the endoscope.
  • Fig. 17 shows some results. Left: topology of a pencil tip obtained with the setup of Fig. 7; right: topology of ex-vivo porcine bronchi obtained with the setup of Fig. 11.
  • The interferogram I(x,y) will also be considered as a hologram, because the option of restoring the object wavefield O from the cross term R*O(x,y) by multiplying it by the reference field R will be considered.
  • R*O and RO* are called the "cross terms" and represent the mutual coherence between the reference and the object beam. They vanish if no degree of mutual coherence exists between them.
  • The role of the computer, represented by 4, is to filter out the various components of the hologram I(x,y) and to keep the mutual coherence terms, which convey the spatial data used to establish the contour-depth of the object.
  • The idea is to use a finite or reduced coherence length source to isolate a depth signal.
  • The mutual coherence term is present in the acquired signal if, and only if, the optical path difference between the reference and object waves is smaller than the coherence length of the source.
  • One commonly accepted definition of the coherence length is the optical path difference for which a fifty percent reduction of the fringe contrast is observed.
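The filtering of the mutual coherence term can be illustrated on synthetic data: build an off-axis interferogram whose carrier fringes survive only inside a coherence-gated band, then isolate one cross term by selecting its sideband in the Fourier plane (spatial heterodyne filtering). All sizes and frequencies below are assumed for the illustration, not taken from the patent:

```python
import numpy as np

n = 256
x = np.arange(n)
X, Y = np.meshgrid(x, x)

# Coherence envelope: mutual coherence survives only in a horizontal band.
gamma = np.exp(-((Y - n / 2) / 20.0) ** 2)
carrier = np.cos(2 * np.pi * 0.25 * X)     # off-axis carrier fringes
I = 1.0 + gamma * carrier                  # dc term + two cross terms

# Spatial heterodyne: keep a window around one carrier sideband only.
F = np.fft.fftshift(np.fft.fft2(I))
cx, cy = n // 2 + n // 4, n // 2           # sideband centre at the carrier
mask = np.zeros_like(F)
mask[cy - 20:cy + 20, cx - 20:cx + 20] = 1.0
cross = np.fft.ifft2(np.fft.ifftshift(F * mask))

# |cross| is large only where the mutual coherence term did not vanish.
contour = np.abs(cross)
```

The non-zero region of `contour` plays the role of the contour-depth area; in the real instrument the band's position on the object is set by the delay line rather than synthesized.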
  • The interference of two waves, when considering the effect of the coherence length of the source, can be expressed as
  • I = I1 + I2 + 2·(I1·I2)^(1/2)·γ(Δr)·cos(k0·Δr)    (2)
  • λ0 is the central wavelength of the emission spectrum of the light source, and k0 = 2π/λ0 the corresponding wavenumber.
  • γ(Δr) is the coherence function.
  • The width of γ is inversely related to the bandwidth of the optical source: the wider the emission spectrum, the sharper the coherence function and the better the resolution in depth.
  • Δr is the Optical Pathlength (OPL) difference between the two waves.
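A numerical reading of equation (2), with an assumed Gaussian model for the coherence function, γ(Δr) = exp(−(Δr/Lc)²), and illustrative values, shows the fringe term switching off once the OPL mismatch exceeds the coherence length:

```python
import numpy as np

lam0 = 800e-9            # central wavelength (assumed)
k0 = 2 * np.pi / lam0    # central wavenumber
Lc = 10e-6               # selected reduced coherence length (assumed)
I1 = I2 = 1.0            # equal beam intensities

def intensity(dr):
    """Equation (2) with a Gaussian model for the coherence function."""
    gamma = np.exp(-(dr / Lc) ** 2)
    return I1 + I2 + 2.0 * np.sqrt(I1 * I2) * gamma * np.cos(k0 * dr)

i_matched = intensity(0.0)     # 4.0: full constructive interference
i_far = intensity(5 * Lc)      # ~2.0: fringes gone, only dc terms remain
```

Between these extremes the fringe contrast decays with γ, which is exactly the envelope the contour-depth detection exploits.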
  • The importance of the concept of a selectable coherence source can be understood at this point:
  • The interest of using a source with selectable coherence is that it permits control of the depth of the coherence zone, which can be adapted to the shape of the object as well as to the needed accuracy of the measurement: short or low coherence lengths provide high accuracy in the establishment of the contour-depth, and therefore in depth imaging, because the extension of the coherence function γ is diminished when the coherence length is reduced.
  • The magnitude of the interference term (third term in equation (2)) is reduced accordingly, and in some limiting cases this can cause a drop of the signal-to-noise ratio.
  • The selectable coherence source can be realized according to several techniques. Following the teaching given in the above paragraphs, the source must be a broadband source.
  • In one realization, the broadband source is a superluminescent diode coupled to an optional interference filter, whose bandwidth can be selected according to the accuracy needed and the signal intensity available.
  • In another realization, the broadband source is a semiconductor laser of the Fabry-Perot or VCSEL type, powered by a modulated electrical current source.
  • Most available semiconductor laser diodes exhibit a current-wavelength dependency. When the injection current is modulated by a periodic signal (for example a sinusoidal, square or triangular signal), an averaging effect is observed over the acquisition time and a synthetically broadened emission spectrum is created.
  • The broadband source may also comprise a plurality of single-frequency sources, i.e. at least two.
  • The single-frequency sources are semiconductor laser diodes, solid-state lasers, gas lasers or a combination of them.
  • A so-called synthetic frequency is then generated, which can be used to encode the object in depth by considering the phase of the synthetic wavelength as a parameter indicating directly the depth of the object light-scattering point.
  • The two frequencies can be filtered out of a plurality of frequencies provided by the modal decomposition of the laser source.
  • The longitudinal modes of HeNe lasers, for instance, permit a synthetic wavelength in the range of ten centimeters.
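The synthetic-wavelength idea reduces to the beat relation Λ = λ1·λ2/|λ1 − λ2|; the wavelengths below are assumed purely for illustration:

```python
def synthetic_wavelength(lam1: float, lam2: float) -> float:
    """Beat (synthetic) wavelength of two single-frequency sources."""
    return lam1 * lam2 / abs(lam1 - lam2)

# Assumed example: two diode lines 1 nm apart near 780 nm give a synthetic
# wavelength of about 0.6 mm, pushing the phase ambiguity far beyond the
# optical wavelength so that the synthetic phase can encode depth directly.
Lam = synthetic_wavelength(780e-9, 781e-9)
```

The closer the two lines, the longer Λ and the larger the unambiguous depth range, at the cost of depth sensitivity.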
  • Two or more semiconductor laser sources also constitute a broadband source usable for interferometric measurement.
  • a pulsed laser source also constitutes broadband source for a reduced coherence length interferometry. It has the advantage of permitting the acquisition of the interferogram or hologram in a single shot: depending on the duration of the pulse, the acquisition time of the hologram can vary from a few femtoseconds with mode locked lasers up to several nanoseconds for Q-switched solid state lasers: DPSS type or any other type of solid state pulsed lasers.
  • the pulsed laser can also be a semiconductor laser: it can provide an enhanced intensity for a period ranging from nanoseconds to milliseconds and reduce the acquisition time permitted by the electronic camera down to the microsecond or sub-microsecond range.
  • Fig. 2 and Fig.3 illustrate the general concepts at the basis of method of generating the full 3D shape or profile of the object.
  • the concept illustrated in fig. 1 has evidenced that the recourse to low coherence or, more precisely, to a reduced coherence optical source brings a means to isolate and range the object light scattering points as a function of depth, by detecting in the interferogram or hologram the areas where the interference between object and reference waves does not vanish.
  • the areas of non-vanishing mutual coherence are called contour-depths.
  • a series of contour-depths can therefore be retrieved from the interferograms or holograms obtained by changing the optical delay in the reference path. This principle is illustrated in fig.
  • an interferometer comprising a selectable coherence source 1
  • where a beam splitter 3 is sketched and where the delay of the beam is varied (4) in the reference path in order to form a series of contour-depths of the object (5), extracted by the computer from the interferograms or holograms collected by the detector (2).
  • the position of the object (5) can be changed relative to the apparatus by moving the object and/or the apparatus.
  • a series of contour-depths is extracted by the computer from the interferograms or holograms collected by the detector (2), during the move of the object relative to the apparatus.
  • the interferometer comprises an optical source (1) providing a selectable coherence length according to the techniques described previously.
  • the beam is divided into two beams following the two arms of the interferometer by a beam divider (2), which can be a beam splitter or a fiber coupler if the beam is injected into an optical fiber.
  • the object path (6) comprises a source called object illuminating source (7) which illuminates the object and an optical setup comprising means to collect the light scattered by the object surface or volume and to image the object on a detector plane (12) containing a detector point (11).
  • a second path is used as reference path (3), comprising a fixed or variable delay line (4) whereby the Optical Pathlength (OPL) can be continuously varied in a computer controlled manner, and also comprising a source called reference illuminating source (5) which illuminates the detector plane (12) containing a detector point (11) and where an interference pattern resulting from the superposition of the object and reference beam is generated.
  • the interference pattern is called a hologram.
  • the term hologram will be preferred to the term interference pattern because a reconstruction of the wavefront to form an image of the object is always an option.
  • Fig. 4 also features the object 8, where a so called coherence zone is represented (9) containing all the object light scattering points where the mutual coherence between object and reference beams does not vanish.
  • the coherence zone is the volume defined by coherence gating: it is the 3D region comprising the object light scattering points for which, in the interferometer, the total Optical Pathlength (OPL) on the measuring arm, i.e. the sum of the OPL from the beamsplitter (2) to the object illuminating source (7), the OPL from the object illuminating source (7) to the object light scattering point (9) and the OPL from the object light scattering point (9) to the detector point (11) of the detector plane (12), is equal, within a +/- ΔOPL margin, to the total OPL in the reference arm (3): i.e.
  • the ΔOPL margin is directly related to the selectable coherence length of the optical source, whereby the ΔOPL margin can be arbitrarily defined as the zone where the contrast factor of the interference fringes is higher than a predefined percentage of the maximum contrast factor: typically 50%. Fig. 5 discloses more details concerning the implementation of the concepts developed: the figure sketches the imaging optics (4) comprising a lens or a compound lens in the case of a complex imaging system with corrected optics or of a rigid endoscope.
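The +/- ΔOPL margin can be illustrated numerically. Assuming a Gaussian fringe-visibility envelope whose FWHM equals the coherence length (one common model for a Gaussian spectrum; names and values are illustrative), a 50% contrast threshold yields a margin of half the coherence length:

```python
import math

def visibility(delta_opl, l_c):
    """Gaussian fringe-contrast envelope whose FWHM equals the
    coherence length l_c (model assumption)."""
    return math.exp(-4.0 * math.log(2.0) * (delta_opl / l_c) ** 2)

def opl_margin(l_c, threshold=0.5):
    """Largest |delta OPL| for which the contrast stays above `threshold`."""
    return 0.5 * l_c * math.sqrt(math.log(threshold) / -math.log(2.0))

l_c = 20e-6              # e.g. a 20 um coherence length
margin = opl_margin(l_c) # 10 um for the default 50% threshold
```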
  • Fig. 5 also features the object illuminating point O (5), which in this case is distinct from the imaging optics but could, in an alternative embodiment, also be behind the imaging optics and cross the imaging optics.
  • the illuminated object (11) is intercepted by the coherence zone (8), which features a cap crossing the object (11) on a contour line (10), which is the locus of the real 3D positions of the object light scattering points, determined from the contour-depth of the hologram captured by the detector in (2) by back-propagating the wavefield from the image side of the imaging optics to the object side.
  • Fig. 5 also sketches the image plane (1) of the contour-line (10) for one point of the contour-line (10).
  • the imaging plane (1) varies accordingly.
  • the detector plane (2) where the camera is placed therefore most generally catches an unfocused image, unless a complex autofocus mechanism corrects for this defocus for each point of the contour-line.
  • a reference wave (7) is superimposed on the defocused image of the contour-line in the detector plane (2), therefore forming a hologram.
  • (3) is a representation of the wavefront of the reference wave, which is approximately a plane wave, but more generally a spherical wave.
  • the contour-line can be retrieved from a single hologram by computing the wavefield in the plane of the hologram and by computing the back- propagated complex wavefield up to the points situated inside the volume defined by coherence gating. More precisely, the reconstruction of the wavefront converging on the point of the contour-line can be assimilated to an inverse convolution transform which associates the object light scattering point to its image in the detector plane.
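As a minimal numerical sketch of such back-propagation of a sampled complex wavefield (the angular-spectrum method is one standard choice; grid size, wavelength and distance are illustrative, not values from the disclosure):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a sampled complex field over `distance` with the
    angular-spectrum method; a negative distance back-propagates."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    fxx, fyy = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - fxx**2 - fyy**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    h = np.exp(1j * kz * distance) * (arg > 0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * h)

# Round trip: forward then backward propagation restores the field
# (exactly, for a band-limited field with no evanescent content).
n = 128
x = np.linspace(-1e-3, 1e-3, n)
xx, yy = np.meshgrid(x, x)
field0 = np.exp(-(xx**2 + yy**2) / (0.2e-3) ** 2).astype(complex)
fwd = angular_spectrum_propagate(field0, 633e-9, 2e-3 / n, 5e-3)
back = angular_spectrum_propagate(fwd, 633e-9, 2e-3 / n, -5e-3)
```

The same kernel, applied with a negative distance to the wavefield computed in the hologram plane, realizes the back-propagation toward the coherence-gated volume described above.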
  • This method considers the forward image forming model as a convolution which accounts for the free field propagation followed by the wavefield diffraction by the imaging optics, usually described by the Amplitude Point Spread Function (APSF) of the imaging optics, and whereby, in the Spatial Frequency Domain, the transform is usually treated as a product by the Optical Transfer Function of the imaging device.
  • APSF Amplitude Point Spread Function
  • the disclosed method can be applied in a simpler manner to the problem of reconstructing, from a single hologram, the contour-line (10) from the contour-depth situated in the hologram plane (2).
  • the invention can be applied to the apparatus with small NA imaging optics, whereby the image forming process can be assimilated to a central projection of the object light scattering points over the detector plane.
  • Fig.6 depicts the relationship between the contour-lines which have to be reconstructed in 3D, and the contour-depth computed from the hologram or interferogram acquired on the detector plane (8).
  • (1) and (2) are representations of two contour-lines on the 3D object, with P and Q as particular points.
  • 3 is a sketch of the light path emanating from P and passing through the imaging optics 5 and joining point P' of the line computed as the contour-depth on the detector plane 8.
  • similarly, 4 is a sketch of the light path emanating from Q and passing through the imaging optics 5 and joining point Q' of the line computed as the contour-depth on the detector plane 8.
  • 6 and 7 are the contour-depths in the detector plane for the contour-lines 1 and 2, respectively.
  • 8 is the detector plane, where the electronic camera is placed.
  • P' and Q' are the images of the points P and Q, respectively, given by the imaging optics 5.
  • the gray area 9 features, for the object point P, the volume that is mapped to the single detector point P' by inverse convolution transform based on diffraction theory.
  • the image forming process is a central projection of the object light scattering points over the detector plane.
  • the object light scattering points can be positioned along the optics ray connecting the detector point in the detector plane and the principal points of the imaging optics.
  • the object light scattering point is located along the optics ray at a distance determined by the delay line according to the rule that the OPL on the two paths of the interferometer must match within the coherence length.
  • Fig. 7 gives a more intuitive and somewhat simplified implementation of the invention with the goal of facilitating the understanding of the inventive concept: it shows a typical off-axis configuration, with a source 1 with selectable coherence length, a device in charge of splitting/recombining the two beams 2, a mirror 3, and an imaging system 5 to collect the light backscattered by the object 7.
  • the reference and the object wave are arriving at the detector 6 with different angles.
  • This enables the creation of carrier fringes on the interferogram or hologram. Every position of the delay stage 4 corresponds to a different curve level on the object.
  • Contour lines can be computed, according to the method disclosed earlier, from the data concerning the contour-depth extracted from the interferogram or hologram.
  • the reference beam is directed to a delay stage 3 mounted on a motorized axis.
  • a lens 5 is placed in the object arm of the interferometer. It is used to form an image at the CCD plane 6.
  • An adjustable diaphragm can be fitted to the lens 5 and permits control of the numerical aperture of the lens 5. This control enables the formation of fringes which result from the interferences between the object and reference waves.
  • Expression (1) can be profitably developed in the Spatial Frequency Domain (SFD): let us consider the spatial Fourier transform of I_H(x,y): I_H(k_x, k_y).
  • |O|²(k_x, k_y) is more problematic: it is the spatial spectrum of the spatial autocorrelation of the object wave. It also provides the spatial spectral density of the object wave in the (k_x, k_y) domain.
  • the spatial spectrum O(k_x, k_y) is limited, in the best case, by the amplitude of the beam wavevector in the observation medium, sometimes vacuum, or, most usually, by instrumental considerations, when a limiting aperture is involved in the hologram formation. This is the case in particular where a lens or a beam splitter is introduced in an optical setup used to generate the hologram, or simply by the limited size of the detector itself.
  • a mismatch between the optical axis of the reference wave R and object wave O is deliberately introduced in order to separate the contribution of the mutual coherence terms.
  • 5 is an optional lens used to adapt the spherical reference wave in order to match the illumination geometry. 3 is placed so that at 4 there is a small angle between the reference and object waves, which in turn can be regarded as interferences of spherical waves coming from off-axis points.
  • Fig. 8: the left part shows a typical spatial frequency spectrum of a hologram or interferogram acquired on a rough metallic cylinder.
  • the insert shows the interferogram itself, where a zoom is performed on the area of the interferogram where the mutual terms give a non-vanishing contribution: a fringe pattern can be observed which contributes to the high frequency lobes observed in the spatial frequency spectrum.
  • This property of the interference pattern corresponding to the non vanishing terms of the mutual coherence can be exploited to filter out the area of the interferogram corresponding to the contour-depth.
  • the right part shows the result of the filtering, which corresponds to a contour-depth.
  • the scattered field is composed of speckle grains: the mutual interference of the wavefronts creates a complicated interference pattern, with statistical properties. Indeed, each point of the object can be considered as a point source. Thus the field in space is composed of the coherent addition of every wavelet emanating from the rough object. Due to surface roughness, there is a phase difference between all these wavelets.
  • the speckle grain exhibits a fine pattern, which is most of the time finer than the periodicity of the sinusoidal variation of the interference term. In such a case, the sinusoidal variation cannot be observed anymore.
  • detecting the areas where the interference fringes are situated is a major aspect of the invention. From the previous considerations, it has been emphasized that the contour-depth provides a depth signal for macroscopic objects. It is possible to detect the presence of these fringes with an adequate numerical filtering procedure based on only one acquisition (Fig. 8, right). This fact provides a simple but efficient method of extracting the area of the image where coherent superposition of the two waves occurs, thus extracting a contour-depth in only one acquisition. In this section, the different filtering methods that were implemented are carefully described. An attractive property of these methods is that they apply to objects with heights much larger than the coherence length, and an easy contouring procedure can be derived from this method.
  • the simplest method to extract a contour-depth from an interferogram consists in high-pass filtering the interferogram taken with an off-axis reference wave.
  • the lens comprised in the imaging optics used to generate an interferogram or a hologram is characterized by a limited spatial bandwidth. These bandwidth limitations naturally introduce a band-stop or band-rejection filter in the spatial frequency domain. The presence of this band-stop can be exploited to filter out the contribution of zero-order terms: the occupation of the SFD is therefore limited to an area, most often a disk, corresponding to the autocorrelation of the lens bandwidth. Extra-bandwidth components can therefore be filtered out and exploited to reconstruct the contour-depth by restoring the interferences by inverse Fourier transform, or simply by high-pass filtering.
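As a sketch with synthetic data (numpy; image size, carrier frequency and cutoff are illustrative): high-pass filtering in the SFD suppresses the zero-order terms while keeping the carrier-fringe lobes, so the amplitude of the filtered image localizes the contour-depth:

```python
import numpy as np

n = 256
x = np.arange(n)
xx, yy = np.meshgrid(x, x)
# Synthetic off-axis interferogram: uniform background plus carrier
# fringes present only in a horizontal band (the "coherence zone").
band = (yy > 96) & (yy < 160)
interferogram = 100.0 + 20.0 * np.cos(2 * np.pi * 0.25 * xx) * band

# High-pass filtering in the spatial frequency domain: remove the
# zero-order (low-frequency) disk, keep the carrier lobes.
spec = np.fft.fft2(interferogram)
f = np.fft.fftfreq(n)
fyy, fxx = np.meshgrid(f, f, indexing="ij")
spec[np.hypot(fxx, fyy) < 0.15] = 0.0
filtered = np.abs(np.fft.ifft2(spec))

# The filtered amplitude is large only where the fringes exist.
inside = filtered[(yy > 110) & (yy < 146)].mean()
outside = filtered[yy < 64].mean()
```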
  • One solution is to process the interferogram in the spatial frequency domain: the Fast Fourier Transform (FFT) of a neighborhood of the processed pixel is calculated, and the output of the filter is then the maximum value of the local spectrum components, after zero-order removal. This corresponds to a local measurement of the high-frequency contribution. It is also possible to evaluate the mean value or the cumulative sum of a certain bandwidth of spatial frequencies. It was observed that outputting the maximum value is sufficiently selective, provided that no sinusoidal variation of the intensity induced by the object itself falls into the frequencies calculated by the FFT. This is due to the coarse sampling of the FFT calculated on a small neighborhood.
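The local-spectrum filter can be sketched as follows (numpy; a block-wise simplification of the per-pixel neighborhood FFT described above, with an illustrative window size):

```python
import numpy as np

def local_fft_max(img, win=16):
    """For each non-overlapping win x win block, return the maximum
    spectral magnitude after zero-order removal (a coarse, block-wise
    sketch of the per-pixel neighborhood filter)."""
    n = img.shape[0] // win
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            block = img[i*win:(i+1)*win, j*win:(j+1)*win]
            spec = np.abs(np.fft.fft2(block - block.mean()))
            spec[0, 0] = 0.0  # zero-order removal
            out[i, j] = spec.max()
    return out

# Fringes only in the left half of a synthetic interferogram.
n = 128
x = np.arange(n)
xx, yy = np.meshgrid(x, x)
img = 100.0 + 20.0 * np.cos(2 * np.pi * 0.25 * xx) * (xx < 64)
resp = local_fft_max(img)
left, right = resp[:, :4].mean(), resp[:, 4:].mean()
```

The response is large only over the fringe-carrying half, which is the selectivity property exploited above.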
  • FFT Fast Fourier Transform
  • a set of matrices defining a set of N² filter masks is created by calculating the kernel H given below.
  • the output is then the sum of the squared results from the convolutions between these matrices and the input signal,
  • H(x, θ, σ, f) ∝ exp(−‖x‖² / (2σ²)) · exp(i 2π fᵀ R(θ) x),   θ ∈ [0 : π]
  • R(θ) is the rotation matrix by the angle θ,
  • f = [f_c 0]ᵀ, where f_c is the frequency of the complex sinusoidal variation to detect,
  • σ is the standard deviation of the Gaussian envelope of the filter.
  • This complex-valued filter H is composed of 2 filters: the real part with cosine modulation and the imaginary part with sine modulation. This allows the detection of every possible shift of a sinusoidal intensity variation with frequency f_c along the corresponding axis. This filter typically needs 2 s to process an interferogram of 512x512 pixels.
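A sketch of one such complex Gabor filter and its phase-insensitivity (numpy; kernel size, σ and f_c are illustrative). The magnitude of the complex response is nearly independent of the fringe phase, while a fringe-free patch gives almost no response:

```python
import numpy as np

def gabor_kernel(size, theta, sigma, fc):
    """Complex Gabor kernel: Gaussian envelope times a complex
    sinusoid of frequency fc along the direction theta."""
    r = np.arange(size) - size // 2
    xx, yy = np.meshgrid(r, r)
    u = xx * np.cos(theta) + yy * np.sin(theta)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    k = g * np.exp(1j * 2.0 * np.pi * fc * u)
    return k - k.mean()  # remove the residual DC response

kern = gabor_kernel(32, 0.0, 6.0, 0.2)
r = np.arange(32) - 16
xx, _ = np.meshgrid(r, r)

# Response magnitude is nearly the same for any fringe phase ...
resp0 = abs(np.vdot(kern, np.cos(2 * np.pi * 0.2 * xx)))
resp1 = abs(np.vdot(kern, np.cos(2 * np.pi * 0.2 * xx + np.pi / 2)))
# ... and near zero on a fringe-free (constant) patch.
resp_flat = abs(np.vdot(kern, np.full((32, 32), 100.0)))
```

Summing the squared responses of such kernels over orientations and frequencies reproduces the filterbank output described above.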
  • the calibration procedure consists of several acquisitions with a plane object perpendicular to the optical axis being shifted between acquisitions, such that interference fringes are observed on the whole image. Based on this measurement, a map matching every pixel to a known frequency and orientation is then calculated. In such a case, the output of the filter corresponds to only one convolution with the calibrated filter, and not to the sum of several convolutions. We called this approach "matched filtering". Moreover, when such calibration maps are available, the use of a filterbank is not optimal anymore, as each filter contributes with its own noise response.
  • the output of a filter chosen according to a calibration map is smaller for zones without fringes than the output of a filterbank, thus leading to a better signal-to-noise ratio.
  • the calculation time for an interferogram of 512x512 pixels is 0.65 s, for the discrete cosine filter as for the Gabor filter.
  • the first step of the procedure is to locally threshold the signal.
  • the filtered image is typically divided in blocks of 80x80 overlapping pixels.
  • the mean intensity in the block is compared to the mean intensity of the whole image; if it is below a given threshold, the whole block is set to zero. If not, the 80x80 pixels block is further divided into sub-blocks of 10x10 pixels and the mean value of each sub-block is compared to the mean value of the 80x80 pixels block. Finally, the value of each pixel in the sub-block is compared to the mean value of the sub-block.
  • This technique has the advantage of keeping the intensity information of the pixels, and minimizes the effect of local variance in intensity.
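A condensed sketch of this hierarchical thresholding (numpy; the block and sub-block sizes are scaled down from the 80x80/10x10 of the text, the final per-pixel comparison is omitted, and the threshold factor is illustrative):

```python
import numpy as np

def hierarchical_threshold(img, block=16, sub=4, factor=0.5):
    """Zero out blocks whose mean is below `factor` times the image
    mean, then zero sub-blocks below `factor` times their block mean."""
    out = img.copy()
    g_mean = img.mean()
    n = img.shape[0]
    for i in range(0, n, block):
        for j in range(0, n, block):
            b = out[i:i+block, j:j+block]
            if b.mean() < factor * g_mean:
                b[:] = 0.0  # discard the whole block
                continue
            b_mean = b.mean()
            for si in range(0, block, sub):
                for sj in range(0, block, sub):
                    s = b[si:si+sub, sj:sj+sub]
                    if s.mean() < factor * b_mean:
                        s[:] = 0.0  # discard the sub-block
    return out

img = np.zeros((64, 64))
img[16:32, 16:32] = 10.0  # bright "fringe" region survives thresholding
res = hierarchical_threshold(img)
```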
  • the curve to extract is characterized by an accumulation of high intensity data points and the points after thresholding are not necessarily interconnected.
  • the apparent width of the curve cannot be assumed to be constant, and after thresholding, often very isolated points remain that do not correspond to any data, so that the thresholded image has to be processed further.
  • a disk is moved through the points to successively extract the meaningful data; the disk radius is typically 35 pixels.
  • a starting point is found by sliding the disk along the rows of the image. Once the mean intensity in the disk reaches a given threshold, it is assumed that the points within the disk belong to the curve to extract. First, the disk center is placed on the gravity center of the non-zero pixels within the disk. The neighborhood in the sliding disk is extracted and the coordinates of the gravity center are also recorded.
  • a local estimation of the variance is then used to find the direction of the curve.
  • the covariance matrix is calculated, and expressed in its eigen basis.
  • the eigenvector associated with the maximal eigenvalue defines the direction of the maximal variance, corresponding to the direction toward which the next points belonging to the curve have to be extracted.
  • the sliding disk is translated on a distance that equals its radius in that direction.
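The direction estimate driving the sliding disk can be sketched as follows (numpy; the synthetic point set is illustrative): the eigenvector of the covariance matrix associated with the largest eigenvalue gives the direction of maximal variance:

```python
import numpy as np

def principal_direction(points):
    """Unit eigenvector of the covariance matrix associated with the
    largest eigenvalue, i.e. the direction of maximal variance."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, np.argmax(eigvals)]

# Noisy points along a 45-degree line, mimicking in-disk curve points.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
pts = np.stack([t, t], axis=1) + rng.normal(0, 0.05, (200, 2))
d = principal_direction(pts)  # close to (1,1)/sqrt(2), up to sign
```

The disk is then translated by its radius along this direction, as described above.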
  • a set of pixels belonging to the coherent area is extracted. Taking into account the properties of the imaging system, such as the magnification, these pixels are mapped to physical dimensions in the XY plane. As the depth coordinate is known with respect to the length of the reference path, at the end of the procedure a set of tri-dimensional points in true physical dimensions (for example in units of meter) is obtained. These points can then be used to add useful dimensions on a 2D image, and moreover, can be used to represent a 3D surface of the observed object.
  • Fig. 14 shows as an example two results: on the left is a measurement made on a wooden-pencil tip and on the right is a result obtained when observing an ex-vivo pig bronchi.
  • the set of tri-dimensional points is non-uniformly sampled and, most of the time, scattered.
  • an interpolation procedure has to be implemented.
  • We choose to define the surface implicitly in a Cartesian coordinate system with an equation of the type f(x,y,z) = 0.
  • off-surface points are synthetically generated. They are defined as points situated at a distance ±d from the surface, in the direction of its normal. By adding these points, we create a numerical distance function d(x,y,z). This function is then interpolated with a Radial Basis Function (RBF) defined as s(x) = p(x) + Σ_i λ_i · φ(‖x − x_i‖)
  • RBF Radial Basis Function
  • f(x_i) are the sparse and generally non-homogeneously distributed values of the distance function created from the measurement and the numerically added off-surface points.
  • the second equality corresponds to side conditions. This leads to a linear system that has to be solved in order to find the different coefficients defining the RBF,
  • A and P are matrices containing the values of the radially symmetric function φ and of the low-degree polynomial basis, respectively.
  • a smoothing factor can be added to smooth the fitted surface. For a high value of this factor, a very smooth surface is obtained, but of course in this case the fitting accuracy is lower.
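A compact sketch of the RBF fit (numpy; the biharmonic kernel φ(r) = r with a linear polynomial part is one common choice, the smoothing factor is omitted, and all names are illustrative):

```python
import numpy as np

def rbf_fit(centers, values):
    """Solve for RBF weights lam and polynomial coefficients c such that
    s(x) = sum_i lam_i * |x - x_i| + c . [1, x] interpolates `values`
    at `centers`, with the usual side conditions P^T lam = 0."""
    n, dim = centers.shape
    r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    P = np.hstack([np.ones((n, 1)), centers])  # low-degree polynomial basis
    top = np.hstack([r, P])
    bot = np.hstack([P.T, np.zeros((dim + 1, dim + 1))])
    rhs = np.concatenate([values, np.zeros(dim + 1)])
    sol = np.linalg.solve(np.vstack([top, bot]), rhs)
    return sol[:n], sol[n:]

def rbf_eval(x, centers, lam, c):
    r = np.linalg.norm(centers - x, axis=1)
    return lam @ r + c @ np.concatenate([[1.0], x])

rng = np.random.default_rng(1)
centers = rng.random((30, 3))           # scattered 3D sample points
values = centers[:, 0] + centers[:, 1] ** 2  # distance-like sample data
lam, c = rbf_fit(centers, values)
```

The fitted s(x) reproduces the sampled distance values at the measured points and interpolates smoothly between them.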
  • Fig. 9 illustrates the different steps comprised in the generation of the true 3D image of the object. It shows the flow chart depicting the whole measurement procedure.
  • 1 is the acquisition process
  • 2 is the numerical filtering procedure
  • 3 symbolizes the stack recording of the different interferograms
  • 4 is the process of changing the optical path difference between the two arms of the interferometer
  • 5 symbolizes the ability to observe live contour fringes during acquisition
  • 6 is the tri-dimensional rendering of the observed surface.
  • Curve level extraction is achieved with the use of the smooth reference wave interferometer depicted in principle on Fig. 10.
  • the source 1 is an optical source, with adjustable coherence length.
  • a beam splitter 2 separates the collimated beam generated by 1 into the reference beam (R) and the object beam (O).
  • the reference beam is directed to a delay stage 3 mounted on a motorized axis.
  • a lens 6 is placed in the object arm of the interferometer. It is used to form an image at the CCD plane 8.
  • the adjustable diaphragm 7 is an essential part of the setup: indeed, it is used to limit the aperture of the system, thus enabling the creation of interferences composed of fringes modulated by a slowly varying speckle pattern.
  • Fig.10 is an implementation of the method with a standalone setup.
  • the fibered setup of Fig. 11 presents a realization with a movable head and fibers to connect the movable head to the fixed part of the apparatus.
  • the imaging system represented by 6 can be any available lens or compound-lens.
  • the object path fiber (10) and the reference path fiber (11) are used to deliver the light of the two paths of the interferometer to the part of the system pluggable on the imaging system 6.
  • This pluggable part is composed of: a beam splitter 4, in order to recombine the light backscattered by the object 9 and the reference wave.
  • the fibered setup of Fig. 12 presents another realization. Here the reference path is composed of the monomode fiber 11 providing a fixed-length reference path.
  • the whole detection system 3 can be moved in three dimensions.
  • the element depicted by the box 12 is a position detector (for example a magnetic tracker), which recovers the position and orientation of the system under box 3. Doing so, it is possible to match every detected contour-line to an absolute referential. This removes the need for any delay stage scanning, so that the different interferograms can be acquired "on the fly" as the system is moved with respect to the object.
  • Elements contained in box 3 are: a beam splitter 4, in order to recombine the light backscattered by the object 9 and the reference wave.
  • 6 is the available imaging system used to create an image of the diffracted field.
  • 7 is a diaphragm, which permits choosing the aperture of the imaging system in order to create discernible and filterable fringe patterns, and 8 is a detector (a CCD camera).
  • 12 is, as said before, a position sensor.
  • An application is to embed these elements at the tip of an endoscope (number 2 of Fig. 10).
  • Fig. 13 illustrates the fact that all the apparatus and optical designs of fig. 10 to fig. 12 can be miniaturized and inserted inside a flexible sheath with steering means, in order to realize an endoscope yielding 3D images of organs in medicine or of industrial objects.
  • Fig. 14 shows a realization of an apparatus connectable or pluggable on rigid endoscopes 9.
  • a beam splitter 2 separates the collimated beam (B) generated by the modulated laser diode 1 into the reference beam (R) and the object beam (O), which is injected into 5 and then split in 3, before illuminating the object 16.
  • the light scattered by 16 is collected by the endoscope 9. Roughness of the observed surface induces random intensity pattern distribution, known as speckle effect.
  • the reference wave (R) is first directed to a motorized delay stage 3. This way it is possible to adjust the time delay between R and O.
  • the reference wave is then injected in another monomode fiber 4 so that the light can be transmitted to the part of the device attached to the endoscope eyepiece 16. It also provides a simple way to adjust the system to any other rigid endoscope, by simply updating the length of 4 with respect to the length of 9. 4 also serves as a spatial filter for reference beam cleaning.
  • the object wave interferes on the CCD plane with the reference wave projected on the camera 13 through a second beam splitter 11.
  • a lens 10 is placed at the end of 4 to match the divergence and the size of the reference and object beam.
  • a lens or an imaging system 12 is placed in front of the CCD to form a real image on the detector from the virtual one coming from the eyepiece of 9.
  • the end of the reference fiber is placed on an XY stage, in order to set the reference beam slightly off-axis, thus permitting the choice of a fringe frequency that satisfies the Shannon principle and is correctly sampled on the interferogram.
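The constraint on the off-axis angle can be sketched as follows (function name and numbers are illustrative): the carrier fringe period λ/sin θ must stay above two pixel pitches to satisfy the sampling (Shannon/Nyquist) criterion:

```python
import math

def max_offaxis_angle(wavelength, pixel_pitch):
    """Largest off-axis angle [rad] for which the carrier fringes are
    still sampled at two pixels per period (Nyquist limit)."""
    return math.asin(wavelength / (2.0 * pixel_pitch))

# e.g. a HeNe wavelength and a 6.45 um camera pixel pitch
theta_max = max_offaxis_angle(633e-9, 6.45e-6)  # about 0.049 rad (2.8 deg)
```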
  • Fig. 15 is the CAD design of the pluggable part described on Fig. 11.
  • Fig. 16 shows the design of the multiple fiber illumination system, with 1 the illumination fiber end ferrules (6, 7, 8 on Fig. 11), 2 the ferrule mounts, and 3 the endoscope. At the distal position of 30 mm, this gives 3 non-overlapping areas of about 7 mm diameter. Monomode fibers packed with ferrules of 1.25 mm diameter were used and inserted in a small mount designed to be adapted around the metallic housing of the endoscope. The mount has an overall diameter of 9 mm and thus can fit into the surrounding tube used as an instrument channel during usual procedures. To split the object beam in 3, a 1x3 coupler is connected to the illumination fiber of the bench module. The overall optical path length of the reference arm is adapted, with respect to the length of the endoscope and illumination fibers, with a suitable patchcord to obtain reference arm and object arm length equalization within the scan range of the delay stage.
  • Fig 17 illustrates the result obtained with the apparatus of fig. 14 -16.
  • a set of pixels belonging to the coherent area is extracted. Taking into account the properties of the imaging system, such as the magnification, these pixels are mapped to physical dimensions in the XY plane.
  • the depth coordinate is known with respect to the length of the reference path, at the end of the procedure a set of tri-dimensional points in true physical dimensions (for example in units of meter) is obtained. These points can then be used to add useful dimensions on a 2D image, and moreover, can be used to represent a 3D surface of the observed object.
  • Fig. 17 shows as an example two results: on the left is a measurement made on a wooden-pencil tip and on the right is a result obtained when observing an ex-vivo pig bronchi.

Abstract

The invention relates to a method and apparatus for topological or shape measurement of objects of any size, by using an interferometric technique comprising an optical source with selectable coherence, allowing contouring from single-shot interferograms or holograms, giving access to the absolute tri-dimensional topography of the illuminated object. A contour-depth is obtained based on only one acquisition, which makes it possible to obtain a depth signal and quantitative measurements even in harsh environments and on changing samples. This allows for real-time recording of topography changes and/or distance-to-surface measurements. The apparatus can be used independently for high precision metrology or incorporated in or connected to any imaging device like an endoscope. This opens new possibilities for industrial metrology as well as for medical applications.

Description

METHOD AND APPARATUS FOR 3D OBJECT SHAPE AND SURFACE TOPOLOGY MEASUREMENTS BY CONTOUR DEPTH EXTRACTION ACQUIRED IN A
SINGLE SHOT
FIELD OF THE INVENTION
The invention relates to a method and apparatus for topological or shape measurement of objects of any size, by using an interferometric technique comprising an optical source with selectable coherence, allowing contouring from single-shot interferograms, giving access to the absolute tri-dimensional topography of the illuminated object.
INCORPORATION BY REFERENCE
The present application claims priority from International application PCT/IB2009/050607, filed on February 13, 2009, which is incorporated by reference for all purposes.
DISCUSSION OF THE BACKGROUND OF THE INVENTION
There is a large panel of techniques available for optical 3D measurements for rough macroscopic objects with sub-millimeter accuracy. Incoherent light can be used with fringes projection and Moire pattern techniques. Photogrammetry techniques employing mainly stereo vision procedures to obtain 3D shape are also available. The tri-dimensional information can also be provided by interferometric measurements. Holography and speckle interferometry can be used to evaluate the phase change induced when a laser light is either transmitted through or reflected by an object.
Patent WO 0020929 (A1) "Method And Apparatus For Simultaneous Amplitude And Quantitative Phase Contrast Imaging by Numerical Reconstruction of Digital Holograms" by E. Cuche et al. teaches that phase can be derived from a single hologram. This approach holds for microscopic objects with sizes comparable to the wavelength. However, for macroscopic objects, surface roughness and the role of 2π ambiguity render the phase signal un-interpretable for unpolished macroscopic objects. The phase is therefore practically never evaluated in large object holography, but only the amplitude or intensity of the reconstructed wavefront, which, by itself, does not provide a precise 3D shape measurement. The introduction of low coherence interferometry or holography has made it possible to overcome the difficulties in overall shape measurement: coherence gating has permitted sectioning and profiling by evaluating the envelope of the interference signal, providing thereby a "gating" property, and thus enabling profiling. The interference of two waves coming from a light source with finite temporal coherence length can be expressed as,
I = I1 + I2 + 2 (I1 I2)^(1/2) γ(Δr) cos(k0 Δr)
Where k0 = 2π/λ0, with λ0 the central wavelength of the emission spectrum of the light source. γ(Δr) is the coherence function; its width is inversely proportional to the bandwidth of the emission spectrum. It has the effect of modulating the visibility of the interference term, providing a so-called "coherence-gating" effect. In such low coherence interferometry configurations, extracting γ from an interferogram is an efficient way of achieving optical sectioning or profiling. In this case the depth resolution is directly determined by the coherence length of the source.
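For a Gaussian emission spectrum, this coherence length can be estimated from the central wavelength and the bandwidth with the standard relation l_c = (2 ln 2 / π) · λ0² / Δλ (the numbers below are illustrative, e.g. a typical superluminescent diode):

```python
import math

def coherence_length(lambda0, delta_lambda):
    """FWHM coherence length for a Gaussian spectrum [m]:
    l_c = (2 ln 2 / pi) * lambda0**2 / delta_lambda."""
    return (2.0 * math.log(2.0) / math.pi) * lambda0**2 / delta_lambda

l_c = coherence_length(820e-9, 20e-9)  # roughly 15 um
```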
In "Adaptive filtering of white-light interferometry fringe patterns", IEEE Transactions on Instrumentation and Measurement 47(3), 782-788 (1998), Rizk et al. used the so-called white light interferometry approach with a multimode laser diode as a source, enhancing the detectability of the brightest fringe through different adaptive digital filters based on a least mean square measurement.
Optical coherence tomography (OCT) is based on this concept; a huge amount of literature exists in this field, and a starting point can be found in the review by J.M. Schmitt: Optical Coherence Tomography (OCT): a review. IEEE Journal on Selected Topics in Quantum Electronics, 1999. 5(4): p. 1205-1215. Derived from OCT, Fourier domain optical coherence tomography (FD-OCT) surpasses classical OCT in terms of both speed and signal-to-noise ratio, as shown by L. Froehly et al. in Multiplexed 3D imaging using wavelength encoded spectral interferometry: A proof of principle. Optics Communications, 2003. 222(1-6): p. 127-136, and by A.R. Tumlinson et al. in Endoscope-tip interferometer for ultrahigh resolution frequency domain optical coherence tomography in mouse colon. Optics Express, 2006. 14(5): p. 1878-1887. Yelin et al. showed in Spectral-domain spectrally-encoded endoscopy. Optics Express, 2007. 15(5): p. 2432-2444, that methods based on spectrally encoded recording offer great potential in terms of miniaturization, sensitivity and imaging speed for endoscopic applications. Zara and Lingley-Papadopoulos recently published Endoscopic OCT approaches toward cancer diagnosis. IEEE Journal on Selected Topics in Quantum Electronics, 2008. 14(1): p. 70-81, about endoscopic OCT approaches for cancer diagnosis.
Low coherence interferometry is quite well known in microscopy as a way to achieve optical sectioning, in which case it is also called coherence probe microscopy, as described by J.A. Izatt et al. in Optical coherence microscopy in scattering media. Optics Letters, 1994. 19(8): p. 590-592. It has to be mentioned that, as opposed to the other methods described in this paragraph, here the information encoded in the phase is effectively evaluated, and the short coherence length is used to filter out-of-focus signal. Oh et al. adapted this technique to endoscopy to achieve high resolution endoscopic imaging (Spectrally-modulated full-field optical coherence microscopy for ultrahigh-resolution endoscopic imaging. Optics Express, 2006. 14(19): p. 8675-8684).
When using a broadband light source in a typical out-of-plane sensitive ESPI interferometer, the process of extracting the location of the coherent superposition of two waves is referred to in the literature as the "coherence radar" technique. Dresel et al. used a simple Michelson configuration, with a piezo-electric actuator to take three phase-stepped acquisitions in order to extract a contour depth, with results published in Three-dimensional sensing of rough surfaces by coherence radar. Appl. Opt., 1992. 31(7): p. 919-925, and a patent (Blossey, S. G. and G. Hausler, Method for the non-contact rapid and accurate acquisition of the surface topology of objects. 1995: US Patent No. 5706085). It is also possible to use short light pulses; in this case the method is called "light-in-flight holography" (T.E. Carlsson et al., System for acquisition of three-dimensional shape and movement using digital Light-in-Flight holography. Optical Engineering, 2001. 40(1): p. 67-75). Balboa et al. reported 3D measurements using superluminescent diodes and multimode laser diodes in a fibered interferometer, using a five-step algorithm for fringe amplitude extraction (Low-coherence optical fibre speckle interferometry. Measurement Science and Technology, 2006. 17(4): p. 605-616).
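As a minimal sketch of how such a phase-stepped sequence recovers the fringe envelope: with three frames stepped by 0, π/2 and π (one common variant; the step sequences actually used by Dresel et al. or Balboa et al. may differ), the fringe amplitude follows from simple per-pixel arithmetic.

```python
import math

def fringe_envelope(i1, i2, i3):
    """Recover the fringe amplitude A from three interferograms phase-stepped
    by 0, pi/2 and pi: i_k = i0 + A*cos(phi + k*pi/2)."""
    cos_term = (i1 - i3) / 2.0        # equals A*cos(phi)
    sin_term = i2 - (i1 + i3) / 2.0   # equals -A*sin(phi)
    return math.hypot(cos_term, sin_term)

# Synthetic single-pixel check: background 2.0, fringe amplitude 0.5
i0, a, phi = 2.0, 0.5, 0.7
frames = [i0 + a * math.cos(phi + k * math.pi / 2) for k in range(3)]
env = fringe_envelope(*frames)  # recovers 0.5 regardless of phi
```

Applied pixel by pixel, the envelope is large where the coherence gate is open and vanishes elsewhere, which is precisely what makes the scheme sensitive to motion between the three exposures.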
Compared to holography or speckle interferometry, the so-called coherence radar and light-in-flight holography techniques are based on the detection of the envelope of the interference signal in the time domain. They are indeed less sensitive to phase fluctuations, and provide absolute three-dimensional measurements. The drawback is that they most often require scanning in time and in space, and are thus less suitable for applications where fast imaging is required.
Most of the time, several acquisitions are taken to extract the fringe amplitude of the interferogram. Another possibility is to acquire several phase-stepped images on the same sensor, or to use rather complicated optical systems with multiple detectors. In these two cases fine alignment procedures are needed to keep object and reference beams on-axis, and image registration techniques with sub-pixel performance have to be implemented. Recently, Hrebesh et al. proposed a system recording three phase-stepped interferograms and a reference image at the same time, allowing single-shot low-coherence time-domain profilometry (Profilometry with compact single-shot low-coherence time-domain interferometry. Optics Communications, 2008. 281(18): p. 4566-4571). All these techniques however require much care to perform well and are sensitive to the time evolution of the signal: they are not immune to fast movements and/or vibrations of the object or measuring apparatus. In particular, they do not perform well when the apparatus must be hand-held, as is the case for an endoscope.
The novelty of the approach used in this invention resides in the fact that it requires only one short acquisition per depth contour and one sensor. In this sense it places the technique in direct competition with long-range OCT, when the latter is used for topological measurements.
SUMMARY OF THE INVENTION
Objectives of the invention
The present invention relates to a method and apparatus for measuring the topography or shape of an object, whether convex or concave, a rounded shape, a solid object, or a hollow body or tube that can be measured from the inside, for instance with an endoscope. The object can have any size, from microscopic to macroscopic. The 3D shape or profile is derived from a series of depth contours, each measured from a single interferogram or hologram. The retrieved data are quantitative and highly accurate. Depending on the characteristics of the surface, in particular its roughness and reflective (dielectric) properties, the accuracy and precision are in the micron range, possibly down to the nanometer range.
Problematics and working principle
The approach is based on an interferometric method whereby a source with selectable coherence length illuminates the object and excites a reference path. The interferometric setup comprises two arms, arranged in an original configuration: the object path comprises a source which illuminates the object and an optical setup comprising means to collect the light scattered by the object surface or volume and to image the object on a detector plane. A second path is used as reference path, comprising a fixed or variable delay line whereby the Optical Pathlength (OPL) can be continuously varied in a computer-controlled manner, and also comprising a source which illuminates the detector plane, where an interference pattern resulting from the superposition of the object and reference beams is generated. When this interference pattern is used to reconstruct the wavefront generated by the illuminated object, the interference pattern is called a hologram. In the following, the term hologram will be preferred to the term interference pattern, because a reconstruction of the wavefront to form an image of the object is always an option. In the particular case where the object is almost flat and perpendicular to the optical axis, the interference pattern provides an in-focus image of the object and can be interpreted as a hologram yielding an image of the object by wavefront reconstruction without wavefront propagation.
In the more general case, however, where the object is a full 3D object, the interference pattern cannot provide an in-focus image of all parts of the object, and can advantageously be considered as a hologram formed on the detector plane that can be acquired digitally and processed in order to focus digitally on the different parts of the object, according to the techniques of digital holography: the wavefield is first reconstructed in the detector or hologram plane by multiplication by a digital reference wave, and then propagated to the different parts of the object. In the case of imaging optics having a low numerical aperture, the electronic focusing provided by the wavefront reconstruction is not efficient enough to provide a full 3D image of the object from its various focused parts. The large depth of field is beneficial to the observation of all parts of the object in focus, but detrimental to the lateral resolution of the object image. However, this compromise appears favorable in some applications such as endoscopy, where the imaging optics is designed to have a very low numerical aperture and therefore provides a large depth of field, so that the user does not have to manage the focusing issue. In return, the 3D perception of the object is lost.
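The numerical propagation step described above can be sketched with the angular-spectrum method, a standard tool of digital holography; the grid size, pixel pitch and wavelength below are illustrative figures, not values taken from the present disclosure.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex wavefield by `distance` (metres) with the
    angular-spectrum method; evanescent components are suppressed."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=pixel_pitch)
    fx, fy = np.meshgrid(f, f)
    k = 2.0 * np.pi / wavelength
    kz_sq = k**2 - (2 * np.pi * fx)**2 - (2 * np.pi * fy)**2
    mask = kz_sq > 0
    kz = np.sqrt(np.where(mask, kz_sq, 0.0))
    transfer = np.exp(1j * kz * distance) * mask
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Sanity check: propagating forward then backward recovers the field
rng = np.random.default_rng(0)
u0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
u1 = angular_spectrum_propagate(u0, 633e-9, 5e-6, 1e-3)
u2 = angular_spectrum_propagate(u1, 633e-9, 5e-6, -1e-3)
```

In a digital holography pipeline, the reconstructed hologram-plane field would be propagated by different distances to bring different parts of the object into focus.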
Exploiting reduced coherence sources provides a means to compensate for this loss of 3D perception. Low coherence interferometry or holography brings a remedy to this problem by providing the following means: a low coherence optical source provides a wavefield, the autocorrelation of which is limited in time by the so-called coherence time, which corresponds to a propagation distance of the wave called the coherence length. When the wavefield is split into two waves, which are ultimately recombined at a detector point after propagation along the two paths of the interferometer, these waves will interfere if, and only if, the two optical pathlengths (OPL) differ by less than the coherence length. If the wavefield is diffracted by an object scattering point, the condition to obtain an interference at the detector point is that the OPL from the object illuminating point to the object light scattering point, and from that point to the point inside the detector plane, has a well-defined value corresponding to a mismatch between the object and reference paths of less than the coherence length. This condition defines a zone, called the coherence zone, inside which the object light scattering point must sit in order to obtain an interference signal, called the mutual coherence between the object and reference signals. When the arrangement of the illuminating point and the imaging optics is such that the main scattering signal is the backscattered signal, the coherence zone appears as a layer in the depth of the object. This discrimination over the depth of the object thus compensates for the loss of discrimination in depth of low-N.A. imaging optics.
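The coherence-zone condition can be written down directly as a geometric test; the function below is an illustrative sketch assuming propagation in a medium of refractive index 1 and the mismatch criterion stated above.

```python
import math

def in_coherence_zone(src, pt, det, opl_ref, coherence_len):
    """True if the object light scattering point `pt` can produce a mutual
    coherence signal: the object-path OPL (source -> pt -> detector) must
    match the reference-path OPL to within the coherence length."""
    opl_obj = math.dist(src, pt) + math.dist(pt, det)
    return abs(opl_obj - opl_ref) < coherence_len

# Co-located source and detector; reference path set for a 100 mm round trip
src = det = (0.0, 0.0, 0.0)
inside = in_coherence_zone(src, (0.0, 0.0, 0.050), det, 0.100, 30e-6)
outside = in_coherence_zone(src, (0.0, 0.0, 0.060), det, 0.100, 30e-6)
```

Sweeping `opl_ref` moves the thin layer of points for which the test is true through the depth of the object, which is exactly the contour selection mechanism.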
The interest of using a source with selectable coherence is that it permits control of the depth of the coherence zone, which can be adapted to the shape of the object as well as to the needed accuracy of the measurement: the use of short or low coherence lengths provides high accuracy but reduced mutual coherence signals. On the contrary, long coherence lengths provide high-intensity signals but low accuracy. A tradeoff between these two situations is favorable to the achievement of an optimal signal-to-noise ratio and measuring accuracy: this optimum is achieved for what is called the "reduced coherence length" and must be adjusted and selected for each object measurement.
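This tradeoff can be quantified through the usual relation between coherence length and spectral bandwidth; the Gaussian-spectrum convention and the source figures below are illustrative assumptions, not values prescribed by the disclosure.

```python
import math

def coherence_length(center_wl, bandwidth_fwhm):
    """Coherence length for a Gaussian emission spectrum, using the common
    convention Lc = (2*ln2/pi) * lambda0**2 / dlambda."""
    return (2.0 * math.log(2.0) / math.pi) * center_wl**2 / bandwidth_fwhm

lc_short = coherence_length(830e-9, 20e-9)    # broad SLD: ~15 um depth gate
lc_long = coherence_length(830e-9, 0.1e-9)    # narrow filter: ~3 mm gate
```

Selecting the bandwidth (for instance with an interference filter, or by spectral broadening of a laser) thus directly selects the thickness of the coherence zone.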
Innovative aspects:
The disclosed method comprises the following innovative aspects: the determination of the areas of the interferogram or hologram captured on the detector plane where the mutual coherence terms do not vanish, as well as the computation of these so-called mutual coherence terms, whereby the said determination and computation are performed on a single interferogram or hologram acquired in a very short acquisition time, as short as the detector and source intensity permit, constitute a major innovative aspect of the invention. The innovative aspects also include the methods and associated computer algorithms involved in the determination of the non-vanishing mutual coherence terms. Different methods and algorithms are claimed as innovative numerical techniques performing well in these tasks. The areas of non-vanishing mutual coherence are also called contour-depths. A series of contour-depths can be retrieved from the interferograms or holograms obtained by changing the optical delay in the reference path. In an alternative embodiment of the invention, the position of the apparatus can be changed relative to the object by moving the object and/or the apparatus.
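One plausible numerical technique for locating the non-vanishing mutual coherence areas on a single off-axis hologram is Fourier filtering around the carrier frequency; the sketch below (synthetic data, illustrative parameters) is one such algorithm, not necessarily the specific one claimed.

```python
import numpy as np

def contour_depth_mask(hologram, carrier_fx, sigma, rel_thresh):
    """Isolate the R*O cross term with a Gaussian bandpass centred on the
    off-axis carrier frequency (the RO* twin and the zero-order terms fall
    outside the passband), take the envelope magnitude and threshold it to
    obtain the contour-depth mask (areas of mutual coherence)."""
    n = hologram.shape[0]
    f = np.fft.fftfreq(n)
    fx, fy = np.meshgrid(f, f)
    bandpass = np.exp(-((fx - carrier_fx)**2 + fy**2) / (2.0 * sigma**2))
    envelope = np.abs(np.fft.ifft2(np.fft.fft2(hologram) * bandpass))
    return envelope > rel_thresh * envelope.max()

# Synthetic hologram: carrier fringes visible only in a horizontal stripe,
# mimicking a coherence-gated contour on the detector plane
n, carrier = 128, 0.25
y, x = np.mgrid[0:n, 0:n]
visibility = np.exp(-((y - 40.0) / 6.0)**2)
holo = 1.0 + visibility * np.cos(2.0 * np.pi * carrier * x)
mask = contour_depth_mask(holo, carrier, sigma=0.05, rel_thresh=0.5)
```

Because everything is derived from one frame, the extraction is single-shot: object motion between frames, which plagues phase-stepping methods, cannot corrupt the result.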
In another strand of the invention, it is disclosed how to determine the real 3D position of the object light scattering points from the contour-depth located in the detector plane. Two embodiments of the invention are disclosed. For large-aperture optics, the real 3D position of the object light scattering points can be determined from the contour-depth of the hologram by back-propagating the wavefield according to the teaching of Digital Holography or Digital Holographic Microscopy (DHM) techniques. In another, preferred, embodiment, the invention can be applied to apparatus with small-NA imaging optics, whereby the image forming process is a central projection of the object light scattering points onto the detector plane. Conversely, each object light scattering point can be positioned along the optical ray connecting the detector point in the detector plane and the principal points of the imaging optics. The object light scattering point is located along this ray at a distance determined by the delay line, according to the rule that the OPLs on the two paths of the interferometer must match within the coherence length.
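The central-projection positioning rule of the preferred embodiment can be sketched as follows, under the simplifying and purely illustrative assumption that the illumination point coincides with the principal point of the optics, so that the matched OPL is a simple round trip; units are arbitrary but consistent.

```python
import math

def locate_scatterer(pixel_xy, focal_len, matched_opl):
    """Place the object light scattering point along the optical ray joining
    a detector pixel to the principal point of a pinhole model, at the range
    fixed by the delay line (round trip: one-way range = OPL / 2)."""
    px, py = pixel_xy
    norm = math.sqrt(px * px + py * py + focal_len * focal_len)
    direction = (px / norm, py / norm, focal_len / norm)
    t = matched_opl / 2.0
    return tuple(t * c for c in direction)

pt_axis = locate_scatterer((0.0, 0.0), 12.0, matched_opl=26.0)
pt_off = locate_scatterer((3.0, 4.0), 12.0, matched_opl=26.0)
```

Repeating this for every pixel of every contour-depth mask, at the corresponding delay-line setting, yields the cluster of 3D scattering points.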
Finally the invention discloses means to derive the expression of a surface interpolating the cluster of object light scattering points.
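As one possible realization of such an interpolating surface, the sketch below fits a height map through a cluster of measured points with inverse-distance weighting; the disclosure does not prescribe a particular scheme, so this choice is illustrative (splines or Delaunay-based interpolation would serve equally well).

```python
import numpy as np

def idw_surface(points, grid_x, grid_y, power=2.0):
    """Interpolate a height map z(x, y) through scattered (x, y, z) points
    by inverse-distance weighting."""
    pts = np.asarray(points, dtype=float)
    gx, gy = np.meshgrid(grid_x, grid_y)
    d2 = (gx[..., None] - pts[:, 0])**2 + (gy[..., None] - pts[:, 1])**2
    w = 1.0 / np.maximum(d2, 1e-12)**(power / 2.0)   # clamp at samples
    return (w * pts[:, 2]).sum(axis=-1) / w.sum(axis=-1)

# Points sampled from the plane z = x + y at the corners of the unit square
samples = [(x, y, x + y) for x in (0.0, 1.0) for y in (0.0, 1.0)]
z = idw_surface(samples, np.linspace(0, 1, 5), np.linspace(0, 1, 5))
```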
The disclosed method guarantees a low sensitivity to perturbations such as movements and vibrations. In particular, the invention is particularly useful when vibrations or jerky movements of the measured object or of the measuring apparatus occur during the measuring process, since they do not impair the detection of mutual coherence. The method can therefore be used in hand-held apparatus such as endoscopes.
It is also a merit of the invention to provide an original solution to the contouring problem. The invention relates to a feasible technique for contouring that is insensitive to perturbations and independent of the particularities of the object shape, such as sharp jumps, concavities and possibly hidden parts, by the addition of a plurality of illumination sources. The technique performs well when light illuminates the sample at grazing incidence, which makes it particularly suitable for the observation of "tube-like" objects. The disclosed apparatus can be realized as a standalone device or as a device that can be connected to another optical instrument. In this case, the apparatus enhances the functionality of the instrument in order to achieve quantitative dimensioning. In particular, it is a major achievement of this invention that a system could be realized and plugged onto a rigid endoscope, and more precisely, but not limited to, the ones used during ear-nose-throat (ENT) inspection procedures. Another objective is a method offering an implementation of the apparatus that can be miniaturized and fully embedded at the tip of an endoscope.
Applications
Compared to the various interferometric techniques such as Electronic Speckle Pattern Interferometry (ESPI), the short or reduced coherence interferometry or holography techniques bring much more robust measuring equipment which can be used by non-specialists, such as physicians or surgeons in a hospital environment. For more general applications in metrology, the apparatus provides a technique appropriate for quality and dimensional control, where the illumination can be designed and distributed over the tested objects. The insensitivity of the technique to the morphological and constitutive particularities of the measured objects makes it possible to obtain a depth signal and quantitative measurements even in harsh environments and on unstable samples. In particular, it allows topography changes to be recorded in real time. The method can be applied by embedding the system in the tip of an endoscope. This opens new possibilities for endoscopic systems, for medical as well as industrial metrology.
BRIEF DESCRIPTION OF THE DRAWINGS
Other objects and features of the present invention will become apparent from the following detailed description considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the invention, for which reference should be made to the appended claims.
Fig. 1 shows a schematic of the method. It consists of a source 1 with selectable coherence length, which is further separated into two arms: 3, the one in which the object to be measured is placed, and 2, the other one, which forms an adjustable delay line and is used to select a contour line with respect to the length of the delay line. These two signals are then recombined and detected (4). The detected signal is then filtered (5). Each acquisition provides a set of points corresponding to a given depth. Finally, the whole data point cloud provides a tri-dimensional representation of the object surface.
Fig. 2 represents an implementation of the method with a source 1 with selectable coherence length, a detector 2, a device in charge of splitting/recombining the two beams 3, a variable reference path length 4 and a fixed object position 5.
Fig. 3 represents an implementation of the method with a fixed reference path length, with the object scanned by moving either the whole measuring apparatus or only the object. 1 is a source with selectable coherence length, 2 is a detector, 3 is a device in charge of splitting/recombining the two beams, 4 is a fixed reference path length and 5 is the object with variable position. The same figure illustrates the situations where the measuring apparatus is moved and the object fixed, or where both the object and the measuring apparatus are moved.
Fig. 4 depicts the coherent gating approach. 1 is the optical source with selectable coherence length. 2 is the device used to split the light between the object wave and the reference wave. 3 is the reference path. 4 is a delay line. 5 is the reference illuminating source. 6 is the object path. 7 is the object illuminating source. 8 is the sample under study. 9 is the volume defined by coherence gating. 10 is an object light scattering point. 11 is the detector. 12 is the detector plane. 13 is the processing unit.
Fig. 5 depicts the physical principle leading to the coherent volume creation. 1 is the image plane for a given object depth. 2 is the detector plane. 3 denotes the phase of the reference wave at the detector plane. 4 is the imaging system with an adjustable aperture. 5 is the illumination device. 6 is the recombining device. 7 depicts the reference wave arriving with an off-axis configuration. 8 shows the isophase location of the reference wave in the object space for a given length of the reference path, the dashed curves represent isophase values shifted by +/- Lc/2. 9 depicts the isoradius of the illumination wavefronts, the dashed curves represent isoradius values shifted by +/- Lc/2. 10 depicts the volume in the object space corresponding to the coherence gating. 11 is the sample.
Fig. 6 depicts the relationship between the contour-lines which have to be reconstructed in 3D, and the contour-depth computed from the hologram or interferogram acquired on the detector plane (8). 1 and 2 are representations of two contour-lines on the 3D object, with P and Q as particular points. 3 is a sketch of the light path emanating from P and passing through the imaging optics 5 and joining point P' of the line computed as the contour-depth on the detector plane 8 . Similarly 4 is a sketch of the light path emanating from Q and passing through the imaging optics 5 and joining point Q' of the line computed as the contour- depth on the detector plane 8 . 6 and 7 are the contour-depth in the detector plane for the contour-lines 1 and 2, respectively. The detector plane 8 is the plane where the electronic camera is placed. P' and Q' are the images of the points P and Q, respectively, given by the imaging optics 5. Finally the gray area 9 features, for the object point P, the volume that is mapped to the single detector point P' by inverse convolution transform based on diffraction theory.
Fig. 7 shows the typical off-axis configuration, with a source 1 with selectable coherence length, a device in charge of splitting/recombining the two beams 2, a mirror 3, and an imaging system 5 to collect the light backscattered by the object 7. The reference and the object wave arrive at the detector 6 with different angles. This enables the creation of carrier fringes on the interferogram. Every position of the delay stage 4 corresponds to a different level curve on the object.
Fig. 8, left: typical spectrum of an acquisition on a metallic cylinder, with an inset showing the interferogram and a zoom on the fringe pattern. Right: filtering result corresponding to a contour depth.
Fig. 9 is a flow chart depicting the measurement procedure. 1 is the acquisition process, 2 is the numerical filtering procedure, 3 symbolizes the stack recording of the different interferograms, 4 is the process of changing the optical path difference between the two arms of the interferometer, 5 symbolizes the ability to observe live contour fringes during acquisition, 6 is the tri-dimensional rendering of the observed surface.
Fig. 10 is an implementation of the method with a standalone setup. It consists of an interferometer in which the reference path has an adjustable length 3. 2 separates the light coming from the collimated laser diode into an object arm and a reference arm. An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6. A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram. 4 is used to recombine the reference wave with the object wave. 1 is a source with selectable coherence length, 5 is an optional lens used to adapt the reference beam.
Fig. 11 is a realization of the approach described in Fig. 2, using monomode fibers for the object path (10) and for the reference path (11). 1 is a source with selectable coherence length, 5 is an optional lens used to adapt the reference beam. 2 separates the light coming from the collimated laser diode into an object arm and a reference arm. An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6. A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram. 4 is used to recombine the reference wave with the object wave. 3 represents the delay stage used to adapt the reference path length in order to select different level curves.
Fig. 12 is a realization of the approach described in Fig. 3. The fibered reference path (11) has a fixed length and the whole detection system (3) can be moved in three dimensions. 1 is a source with selectable coherence length, 5 is an optional lens used to adapt the reference beam. 2 separates the light coming from the collimated laser diode into an object arm and a reference arm. An image of the observed surface 9 is formed on the detector 8 with the help of an imaging system 6. A diaphragm 7 is used to limit the aperture of the imaging system, providing detectable fringes on the interferogram. 4 is used to recombine the reference wave with the object wave. 10 is a monomode fiber, whose length has to be chosen in conjunction with 11 so that the reference path and the object path have the same optical path length for the mean depth that has to be detected. 12 is a position detector.
Fig. 13 is an implementation of the method by embedding the whole system in 2, at the tip of an endoscope 1. An example of a realization of 2 is given by the elements contained in box 3 of Fig. 12.
Fig. 14 presents an implementation of the method for a system pluggable on rigid endoscopes 9. It consists of a device designed to be left on a cart 14, containing the source 1, the beam splitter 2 used to separate the light into object and reference waves, and the adjustable delay stage 3; and a device pluggable on other imaging systems 15, containing the recombining beam splitter 11, a lens or imaging system 12 to provide a real image from the virtual one coming out of the endoscope, an optional lens 10 to adjust the reference wave with respect to the object wave in order to create adequate fringes for detection, and the detector 13. 4 is a monomode fiber used as reference path, 5 is another monomode fiber, which is then split into three fibers (6, 7, 8) in order to illuminate the sample 16.
Fig. 15 is the CAD design of the pluggable part described in Fig. 14: rigid endoscope 1, spring-locked adapter 2 for easy connection of the endoscope, double XY-axis tilt adjustment 3, connector for the reference wave fiber 4, XY stages 5, lens for reference beam adjustment 6 (10 in Fig. 14), device for object and reference wave recombination 7, lens for imaging on the detector 8 (12 in Fig. 14), detector 9.
Fig. 16 describes the mount for the multiple illumination fiber system. 1 are the illumination fiber end ferrules (6, 7, 8 in Fig. 14), 2 is the ferrule mount, 3 is the endoscope.
Fig. 17 shows some results. Left: topology of a pencil tip obtained with the setup of Fig. 7; right: topology of an ex-vivo pork bronchus obtained with the setup of Fig. 11.
DETAILED DESCRIPTION OF THE INVENTION
Fig. 1 illustrates the general concept of the invention: the sketch represents diagrammatically an interferometer whereby an optical source is divided into two parts, illuminating the object on one side and exciting a reference path on the other side. After propagation and scattering of the beam by the object, the so-called object beam O and reference beam R recombine on a detector which captures the interference pattern I(x,y) on a 2D array of detectors indexed by x, y, which transmits an image of the interference pattern to a computer.

I(x,y) = (R+O)*·(R+O) = |R|^2 + |O|^2 + R*O + RO* (1)
In the following, the interferogram I(x,y) will also be considered as a hologram, because the option of restoring the object wavefield O from the cross term R*O(x,y), by multiplying it by the reference field R, will be considered. The first two terms of equation (1) represent respectively the intensity of the reference beam I1 = |R|^2 and the intensity of the object beam I2 = |O|^2. These two terms, called the zero-order terms, are always non-vanishing and do not depend on whether the two beams are coherent or incoherent. The last two terms of equation (1), R*O and RO*, are called the "cross terms" and represent the mutual coherence between the reference and the object beam. They vanish if no degree of mutual coherence exists between them. In Fig. 1, the role of the computer represented by 4 is to filter out the various components of the hologram I(x,y) and to keep the mutual coherence terms, which convey the spatial data used to establish the contour-depth of the object. The idea is to use a finite or reduced coherence length source to isolate a depth signal. The mutual coherence term is present in the acquired signal if, and only if, the optical path difference between the reference and object waves is smaller than the coherence length of the source. One commonly accepted definition of the coherence length is that it corresponds to the optical path difference for which a reduction of fifty percent of the fringe pattern visibility is observed. Mathematically, the interference of two waves, when considering the effect of the coherence length of the source, can be expressed as
I = I1 + I2 + 2(I1I2)^1/2 γ(Δr) cos(k0Δr) (2)
where the "cross term", mutual coherence term or "interference term" is now given by the third term of equation (2). λ0 is the central wavelength of the emission spectrum of the light source, and k0 = 2π/λ0 is the corresponding wavevector. γ is the coherence function. The width of γ is inversely related to the bandwidth of the optical source: the wider the emission spectrum, the sharper the coherence function and the better the resolution in depth. Δr is the Optical Pathlength (OPL) difference between the two waves. The interference term is composed of a sinusoidal variation modulated by the coherence function, which is most often taken as a Gaussian-shaped curve. Roughly, one can consider that the interference term vanishes when the optical path difference is larger than the coherence length.
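Equation (2) with a Gaussian coherence function can be simulated directly; the wavelength and coherence length used below are illustrative figures.

```python
import numpy as np

# Interference of two unit-intensity waves, equation (2), with a Gaussian
# coherence function whose FWHM equals the coherence length Lc
lam0, lc = 800e-9, 20e-6
k0 = 2.0 * np.pi / lam0
dr = np.linspace(-100e-6, 100e-6, 20001)            # OPL difference
gamma = np.exp(-4.0 * np.log(2.0) * (dr / lc)**2)   # coherence function
i1 = i2 = 1.0
intensity = i1 + i2 + 2.0 * np.sqrt(i1 * i2) * gamma * np.cos(k0 * dr)

center = np.abs(dr) < lc / 4   # inside the coherence gate: full fringes
tail = np.abs(dr) > 4 * lc     # far outside: incoherent sum i1 + i2
```

Inside the gate the intensity swings between 0 and 4; beyond the coherence length it collapses to the constant incoherent sum, which is the gating effect exploited throughout this disclosure.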
The importance of the concept of a selectable coherence source can be understood at this point: the interest of using a source with selectable coherence is that it permits control of the depth of the coherence zone, which can be adapted to the shape of the object as well as to the needed accuracy of the measurement. The use of short or low coherence lengths provides high accuracy in the establishment of the contour-depth, and therefore in depth imaging, because the extension of the coherence function γ is diminished when the coherence length is reduced. But the magnitude of the interference term (third term in equation (2)) is reduced accordingly, and in some limiting cases this can cause a drop in the signal-to-noise ratio. On the contrary, long coherence lengths provide high values of the interference signal and a good signal-to-noise ratio, but low accuracy due to the extension of the coherence function. A tradeoff between these two situations is favorable to the achievement of an optimal signal-to-noise ratio and measuring accuracy: the achievement of this optimum is covered under the concept of "reduced coherence length" and must be adapted to each particular case.
The selectable coherence source is realized according to several techniques: according to the teaching given in the above paragraphs, the source must be a broadband source.
In a first embodiment, the broadband source is a superluminescent diode coupled to an optional interference filter, the bandwidth of which can be selected according to the accuracy needed and the signal intensity available.
In a second embodiment, the broadband source is a semiconductor laser of the Fabry-Perot or VCSEL type powered by a modulated electrical current source. Most available laser diodes emit their spectrum with a current-wavelength dependency; semiconductor lasers typically have this dependency. By modulating the drive current of the laser source with a periodic signal (for example a sinusoidal, square or triangular signal) having a high frequency compared to the acquisition time of the detector, an averaging effect is observed, and a synthetically broadened emission spectrum is created. Taking into account that the broader the spectrum, the shorter the coherence length, this provides a handy way of tuning the apparent coherence length of the laser source. Moreover, it is possible to balance this coherence reduction against the emitted power by correctly choosing the DC bias and the current modulation amplitude.
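The averaging effect of current modulation can be illustrated numerically: sweeping the wavelength during one detector integration leaves matched paths at full contrast while fringes at a large OPL mismatch average out. The tuning range and centre wavelength below are assumed figures for illustration.

```python
import numpy as np

# One detector integration while the drive current sweeps the wavelength
# sinusoidally by +/-0.1 nm around 780 nm (assumed figures)
lam0, dlam = 780e-9, 0.1e-9
lams = lam0 + dlam * np.sin(np.linspace(0.0, 2.0 * np.pi, 1000))

def averaged_fringe(opl_diff):
    """Time-averaged interference term cos(2*pi*OPL/lambda) over the sweep."""
    return np.mean(np.cos(2.0 * np.pi * opl_diff / lams))

v_matched = averaged_fringe(0.0)   # matched paths: full contrast
v_far = averaged_fringe(5e-2)      # 5 cm mismatch: fringes average out
```

The residual contrast at large mismatch falls off much like the coherence function of a genuinely broadband source, which is why this synthetic broadening can stand in for one.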
It should be emphasized that the application of the method is not limited to the use of current-modulated sources. Indeed, any source emitting a broad spectrum, or a synthetically broadened spectrum, that provides a finite coherence length matching the desired resolution can be used.
In a third embodiment, the broadband source comprises a plurality of single-frequency sources, comprising at least two single-frequency sources. The single-frequency sources are semiconductor laser diodes, solid state lasers, gas lasers or a combination of them. In the limiting case where the broadband source comprises only two frequencies, a so-called synthetic frequency is generated, which can be used to encode the object in depth, by considering the phase of the synthetic wavelength as a parameter indicating directly the depth of the object light scattering point. The two frequencies can be filtered out of a plurality of frequencies provided by the modal decomposition of the laser source. As an example, it is worth mentioning the longitudinal modes of HeNe lasers, permitting a synthetic length in the range of ten centimeters. Two or a plurality of semiconductor laser sources also constitute a broadband source usable for interferometric measurement.
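The synthetic wavelength of a two-frequency source follows from Λ = λ1λ2/|λ1 − λ2| = c/Δν; the 1 GHz mode spacing below is an assumed figure (the actual spacing depends on the laser cavity length), giving a synthetic length of a few tens of centimetres in this example.

```python
def synthetic_wavelength(l1, l2):
    """Beat (synthetic) wavelength of a two-frequency source:
    Lambda = l1*l2/|l1 - l2|, equal to c divided by the frequency gap."""
    return l1 * l2 / abs(l1 - l2)

c = 299792458.0
l1 = 632.8e-9                      # HeNe line
dnu = 1.0e9                        # assumed longitudinal-mode spacing (Hz)
l2 = l1 / (1.0 + l1 * dnu / c)     # wavelength of the adjacent mode
big_lambda = synthetic_wavelength(l1, l2)
```

Since the synthetic wavelength is five to six orders of magnitude longer than the optical one, its phase can encode depth unambiguously over a correspondingly longer range.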
In a fourth embodiment, a pulsed laser source also constitutes a broadband source for reduced coherence length interferometry. It has the advantage of permitting the acquisition of the interferogram or hologram in a single shot: depending on the duration of the pulse, the acquisition time of the hologram can vary from a few femtoseconds with mode-locked lasers up to several nanoseconds for Q-switched solid state lasers (DPSS type or any other type of solid state pulsed laser). The pulsed laser can also be a semiconductor laser: it can provide an enhanced intensity for a period ranging from nanoseconds to milliseconds and reduce the acquisition time permitted by the electronic camera down to the microsecond or sub-microsecond range.
Fig. 2 and Fig. 3 illustrate the general concepts at the basis of the method of generating the full 3D shape or profile of the object. The concept illustrated in fig. 1 has evidenced that the recourse to a low coherence or, more precisely, reduced coherence optical source brings a means to isolate and range the object light scattering points as a function of depth, by detecting in the interferogram or hologram the areas where the interference between object and reference wave does not vanish. The areas of non-vanishing mutual coherence are called contour-depths. A series of contour-depths can therefore be retrieved from the interferograms or holograms obtained by changing the optical delay in the reference path. This principle is illustrated in fig. 2, where an interferometer comprising a selectable coherence source (1) and a beam splitter (3) is sketched, and where the delay of the beam is varied (4) in the reference path in order to form a series of contour-depths of the object (5), extracted by the computer from the interferograms or holograms collected by the detector (2). In an alternative embodiment of the invention sketched in fig. 3, the position of the object (5) can be changed relative to the apparatus by moving the object and/or the apparatus. Similarly, a series of contour-depths is extracted by the computer from the interferograms or holograms collected by the detector (2) during the motion of the object relative to the apparatus.
Fig. 4 describes in more detail the interferometric setup used to acquire and compute the contour-depth. The interferometer comprises an optical source (1) providing a selectable coherence length according to the techniques described previously. The beam is divided into two beams following the two arms of the interferometer by a beam divider (2), which can be a beam splitter or a fiber coupler if the beam is injected into an optical fiber. The object path (6) comprises a source called the object illuminating source (7), which illuminates the object, and an optical setup comprising means to collect the light scattered by the object surface or volume and to image the object on a detector plane (12) containing a detector point (11). A second path is used as the reference path (3), comprising a fixed or variable delay line (4) whereby the Optical Pathlength (OPL) can be continuously varied in a computer controlled manner, and also comprising a source called the reference illuminating source (5), which illuminates the detector plane (12) containing a detector point (11), where an interference pattern resulting from the superposition of the object and reference beams is generated. In the case where this interference pattern is used to reconstruct the wavefront generated by the illuminated object, the interference pattern is called a hologram. In the following, the term hologram will be preferred to the term interference pattern because a reconstruction of the wavefront to form an image of the object is always an option. Fig. 4 also features the object (8), where a so-called coherence zone is represented (9), containing all the object light scattering points where the mutual coherence between object and reference beams does not vanish.
The coherence zone is the volume defined by coherence gating: it is the 3D region comprising the object light scattering points for which the total Optical Pathlength (OPL) in the measuring arm of the interferometer, i.e. the sum of the OPL from the beamsplitter (2) to the object illuminating source (7), the OPL from the object illuminating source (7) to the object light scattering point (9), and the OPL from the object light scattering point (9) to the detector point (11) of the detector plane (12), is equal, within a +/- δOPL margin, to the total OPL in the reference arm (3), i.e. the sum of the OPL from the beamsplitter (2) to the reference illumination source (5), including an optional variable or fixed delay line (4) causing an additional fixed or variable OPL, and the OPL from the reference illumination source (5) to the detector point (11) of the detector plane (12). The δOPL margin is directly related to the selectable coherence length of the optical source; it can be arbitrarily defined as the zone where the contrast factor of the interference fringes is higher than a predefined percentage of the maximum contrast factor, typically 50%.
Fig. 5 discloses more details concerning the implementation of the concepts developed: the figure sketches the imaging optics (4), comprising a lens or a compound lens in the case of a complex imaging system with corrected optics or of a rigid endoscope. Fig. 5 also features the object illuminating point O (5), which in this case is distinct from the imaging optics but could, in an alternative embodiment, also be behind the imaging optics and cross the imaging optics.
The illuminated object (11) is intercepted by the coherence zone (8), which features a cap crossing the object (11) on a contour line (10), which is the locus of the real 3D positions of the object light scattering points. These positions are determined from the contour-depth of the hologram captured by the detector (2) by back-propagating the wavefield from the image side of the imaging optics to the object side.
Fig. 5 also sketches the image plane (1) of the contour-line (10) for one point of the contour-line (10). As the distance from each point of the contour-line (10) to the principal plane of the imaging optics (4) varies, the imaging plane (1) varies accordingly. The detector plane (2), where the camera is placed, therefore most generally captures an unfocused image, unless a complex autofocus mechanism corrects for this defocus for each point of the contour-line. In the case considered in the disclosed method and apparatus, a reference wave (7) is superimposed on the defocused image of the contour-line in the detector plane (2), therefore forming a hologram. (3) is a representation of the wavefront of the reference wave, which is approximately a plane wave but more generally a spherical wave. An off-axis geometry is also suggested in Fig. 5 but is not mandatory: a small angle is featured on the sketch. This off-axis geometry plays a central role in the disclosed method, permitting the reconstruction, from a single hologram, of the contour-line from the contour-depth in the hologram plane (2).
The following methods will permit the 3D reconstruction of one contour line from a single hologram:
In a first embodiment and according to the teaching of Digital Holography or Digital Holographic Microscopy (DHM) techniques, the contour-line can be retrieved from a single hologram by computing the wavefield in the plane of the hologram and by computing the back-propagated complex wavefield up to the points situated inside the volume defined by coherence gating. More precisely, the reconstruction of the wavefront converging on the points of the contour-line can be assimilated to an inverse convolution transform which associates the object light scattering point to its image in the detector plane. This method considers the forward image forming model as a convolution which accounts for the free field propagation followed by the wavefield diffraction by the imaging optics, usually described by the Amplitude Point Spread Function (APSF) of the imaging device, whereby, in the Spatial Frequency Domain, the transform is usually treated as a product by the Optical Transfer Function of the imaging device.
In a second, preferred, embodiment, the disclosed method can be applied in a simpler manner to the problem of reconstructing, from a single hologram, the contour-line (10) from the contour-depth situated in the hologram plane (2). The invention can be applied to an apparatus with small NA imaging optics, whereby the image forming process can be assimilated to a central projection of the object light scattering points over the detector plane. Fig. 6 depicts the relationship between the contour-lines which have to be reconstructed in 3D and the contour-depths computed from the hologram or interferogram acquired on the detector plane (8). (1) and (2) are representations of two contour-lines on the 3D object, with P and Q as particular points. (3) is a sketch of the light path emanating from P, passing through the imaging optics (5) and joining point P' of the line computed as the contour-depth on the detector plane (8). Similarly, (4) is a sketch of the light path emanating from Q, passing through the imaging optics (5) and joining point Q' of the line computed as the contour-depth on the detector plane (8). (6) and (7) are the contour-depths in the detector plane for the contour-lines (1) and (2), respectively. (8) is the detector plane, where the electronic camera is placed. P' and Q' are the images of the points P and Q, respectively, given by the imaging optics (5). Finally, the gray area (9) features, for the object point P, the volume that is mapped to the single detector point P' by the inverse convolution transform based on diffraction theory. The reconstruction process is disclosed as follows: the image forming process is a central projection of the object light scattering points over the detector plane. Conversely, the object light scattering points can be positioned along the optical ray connecting the detector point in the detector plane and the principal points of the imaging optics.
The object light scattering point is located along the optical ray at a distance determined by the delay line, according to the rule that the OPLs on the two paths of the interferometer must match within the coherence length.
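The OPL-matching rule above (and the coherence-zone definition given earlier) can be expressed as a small predicate. The geometry values below are placeholders chosen for illustration, not dimensions from the disclosure.

```python
# Sketch of the coherence-gating condition: a scattering point belongs to the
# coherence zone when the object-arm OPL (beamsplitter -> illuminating source
# -> scattering point -> detector point) matches the reference-arm OPL within
# +/- delta_OPL, the margin set by the source coherence length.

def in_coherence_zone(opl_bs_to_src, opl_src_to_point, opl_point_to_det,
                      opl_reference, delta_opl):
    opl_object = opl_bs_to_src + opl_src_to_point + opl_point_to_det
    return abs(opl_object - opl_reference) <= delta_opl

# Illustrative numbers (meters): reference arm 0.400 m, coherence margin 50 um.
assert in_coherence_zone(0.150, 0.100, 0.150, 0.400, 50e-6)      # matched
assert not in_coherence_zone(0.150, 0.101, 0.150, 0.400, 50e-6)  # 1 mm off
```

Scanning the delay line shifts `opl_reference`, and with it the set of scattering points satisfying the predicate, which is exactly how successive contour-depths are selected.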
Fig. 7 gives a more intuitive and somewhat simplified implementation of the invention, with the goal of facilitating the understanding of the inventive concept. It shows a typical off-axis configuration, with a source (1) with selectable coherence length, a device in charge of splitting/recombining the two beams (2), a mirror (3), and an imaging system (5) to collect the light backscattered by the object (7). The reference and the object wave arrive at the detector (6) with different angles. This enables the creation of carrier fringes on the interferogram or hologram. Every position of the delay stage (4) corresponds to a different curve level on the object. Contour lines can be computed, according to the method disclosed earlier, from the data concerning the contour-depth extracted from the interferogram or hologram. The reference beam is directed to the delay stage (4) mounted on a motorized axis. The lens (5) is placed in the object arm of the interferometer; it is used to form an image at the CCD plane (6). An adjustable diaphragm can be fitted to the lens (5) and permits the control of the numerical aperture of the lens (5). This control enables the formation of fringes which result from the interference between the object and reference waves.
According to the teaching of digital holography, filtering is possible in the spatial frequency domain. Expression (1) can be profitably developed in the Spatial Frequency Domain (SFD): let us consider the spatial Fourier transform of I(x,y): I_H(k_x, k_y). In the spatial frequency spectrum, the zero order term |R|²(k_x, k_y) usually appears, for a plane wave, as a localized contribution approximately described by a delta function at the origin k_x = k_y = 0, which can easily be removed from I_H(k_x, k_y) in the SFD by subtracting the average intensity. The elimination of |O|²(k_x, k_y), the spatial spectrum of the spatial autocorrelation of the object wave, is more problematic; it also provides the spatial spectral density of the object wave in the (k_x, k_y) domain. In monochromatic illumination, the spatial spectrum O(k_x, k_y) is limited, in the best case, by the amplitude of the beam wavevector in the observation medium, sometimes vacuum, or most usually by instrumental considerations, when a limiting aperture is involved in the hologram formation. This is the case in particular when a lens or a beam splitter is introduced in the optical setup used to generate the hologram, or simply because of the limited size of the detector itself. These bandwidth limitations naturally introduce a band-stop or band-rejection filter in the SFD. The presence of this band-stop can be built on to filter out the contribution of the zero order terms: the occupation of the SFD by the zero order terms is limited to an area, most often a disk, of diameter 3/2 of the bandwidth, corresponding to the autocorrelation of the optical system bandwidth. Extra bandwidth can therefore still be made available for filtering the cross terms: evaluating their value outside the SFD area occupied by the zero order terms can be achieved by taking the digitalized hologram in an off-axis geometry: the off-axis geometry translates the object spatial frequency spectrum to high frequencies, beyond the contribution of the zero order terms.
The off-axis reference wave introduces a so-called spatial carrier frequency; after filtering, demodulation, i.e. multiplication by R, translates the R*O spatial spectrum to the origin and restores the full O spatial frequency content of the wavefront. This approach presents the main advantage of having all the information for reconstructing the complex wavefield from a single hologram or interferogram. This hologram can be acquired in a very short period of time: in the microsecond range, or even much less by using pulsed lasers. The blur caused by the displacement or erratic movement of the sample or instrument can therefore be avoided. This advantage is of major concern for extracting the contour-depths.
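The off-axis filtering and demodulation chain described above can be sketched numerically. The simulation below is illustrative: the synthetic object phase, carrier frequency and mask width are arbitrary assumptions, and the sideband selection stands in for the filtering of the R*O cross term.

```python
import numpy as np

# Sketch: the tilted reference wave puts the cross term on a spatial carrier,
# so a band-pass around the sideband plus a spectral shift (demodulation)
# recovers the complex object wavefront from a single hologram.
N = 256
y, x = np.mgrid[0:N, 0:N]
phase = 2 * np.pi * ((x - N/2)**2 + (y - N/2)**2) / (40 * N)  # smooth object phase
O = np.exp(1j * phase)                       # unit-amplitude object wave
kc = 64                                      # carrier [cycles/frame], assumed
R = np.exp(1j * 2 * np.pi * kc * x / N)      # tilted plane reference wave
I = np.abs(O + R)**2                         # recorded hologram intensity

F = np.fft.fftshift(np.fft.fft2(I))
mask = np.zeros((N, N))
mask[:, N//2 - kc - 20 : N//2 - kc + 20] = 1.0   # window around the O*conj(R) sideband
rec = np.fft.ifft2(np.fft.ifftshift(np.roll(F * mask, kc, axis=1)))  # demodulate

target = np.exp(1j * phase)
corr = abs(np.vdot(rec, target)) / (np.linalg.norm(rec) * np.linalg.norm(target))
print(f"correlation with true wavefront: {corr:.3f}")
```

The recovered field correlates closely with the original object wave, illustrating why a single off-axis acquisition suffices.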
In the apparatus of fig. 7, a mismatch between the optical axes of the reference wave R and object wave O is deliberately introduced in order to separate the contribution of the mutual coherence terms. Depending on the angle between the two axes, the interference pattern is composed of fringes modulated by a slowly varying speckle pattern, which in turn can be regarded as interferences of spherical waves coming from off-axis points.
Fig. 8, left part, shows a typical spatial frequency spectrum of a hologram or interferogram acquired on a rough metallic cylinder. The insert shows the interferogram itself, where a zoom is performed on the area of the interferogram where the mutual terms give a non-vanishing contribution: a fringe pattern can be observed which contributes to the high frequency lobes observed in the spatial frequency spectrum. This property of the interference pattern corresponding to the non-vanishing terms of the mutual coherence can be exploited to filter out the area of the interferogram corresponding to the contour-depth. In Fig. 8, the right part shows the result of the filtering, which corresponds to a contour-depth.
Concerning the grainy aspect of the interferogram or hologram, the following explanation holds: when coherent light is used to illuminate a rough or diffusing surface, the scattered field is composed of speckle grains: the mutual interference of the wavefronts creates a complicated interference pattern with statistical properties. Indeed, each point of the object can be considered as a point source; the field in space is thus composed of the coherent addition of every wavelet emanating from the rough object. Due to surface roughness, there is a phase difference between all these wavelets. When an imaging system with a large aperture is used, the speckle grains exhibit a fine pattern, which is most of the time finer than the periodicity of the sinusoidal variation of the interference term. In such a case, the sinusoidal variation cannot be observed anymore: even though it is still present, it is drowned beneath the high frequency variation of the speckle pattern. But since the smaller the aperture, the larger the mean speckle grain size, there is effectively an aperture setting that allows this sinusoidal variation to be observed. As can be seen in Fig. 8 (left), this creates "carrier fringes" on the interferogram. In the Fourier spectrum, this separates the interference terms from the zero-order term, as depicted in Fig. 8 (left).
Numerical processing of the interferograms:
Extracting, from the interferogram or hologram taken in a single shot, the areas where the interference fringes are situated is a major aspect of the invention. From the previous considerations, it has been emphasized that the contour-depth provides a depth signal for macroscopic objects. It is possible to detect the presence of these fringes with an adequate numerical filtering procedure based on only one acquisition (Fig. 8, right). This provides a simple but efficient method of extracting the area of the image where coherent superposition of the two waves occurs, thus extracting a contour-depth in only one acquisition. In this section, the different filtering methods that were implemented are carefully described. An attractive property of these methods is that they apply to objects with a height much larger than the coherence length, and an easy contouring procedure can be derived from them.
1. Fourier spectrum estimation
The simplest method to extract a contour-depth from an interferogram consists in high-pass filtering the interferogram taken with an off-axis reference wave. The lens comprised in the imaging optics used to generate an interferogram or a hologram is characterized by a limited spatial bandwidth. These bandwidth limitations naturally introduce a band-stop or band-rejection filter in the spatial frequency domain. The presence of this band-stop can be built on to filter out the contribution of the zero order terms: the occupation of the SFD by these terms is limited to an area, most often a disk, corresponding to the autocorrelation of the lens bandwidth. The extra-bandwidth components can therefore be filtered out and exploited to reconstruct the contour-depth, by restoring the interferences by inverse Fourier transform, or simply by high-pass filtering.
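This high-pass extraction can be sketched on a synthetic interferogram in which carrier fringes exist only inside a band playing the role of the coherence zone. All values (image size, fringe frequency, cutoff radius) are illustrative assumptions.

```python
import numpy as np

# Sketch of contour-depth extraction by high-pass filtering (method 1):
# reject the low-frequency zero-order terms, keep the carrier-fringe band.
N = 256
y, x = np.mgrid[0:N, 0:N]
zone = np.abs(y - N / 2) < 20                   # simulated coherence zone
I = 1.0 + np.where(zone, 0.5 * np.cos(2 * np.pi * 0.25 * x), 0.0)

F = np.fft.fft2(I)
fx = np.fft.fftfreq(N)[None, :]
fy = np.fft.fftfreq(N)[:, None]
F[np.hypot(fx, fy) < 0.1] = 0.0                 # band-stop on the zero order
hp = np.abs(np.fft.ifft2(F))                    # magnitude of high-pass residue

print(f"mean response in zone:  {hp[zone].mean():.3f}")
print(f"mean response outside:  {hp[~zone].mean():.3f}")
```

The high-pass magnitude is large only where fringes exist, i.e. exactly over the contour-depth region.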
2. Local Fourier spectrum estimation
One solution is to process the interferogram in the spatial frequency domain: the Fast Fourier Transform (FFT) of a neighborhood of the processed pixel is calculated, and the output of the filter is then the maximum value of the local spectrum components, after zero-order removal. This corresponds to a local measurement of the high frequency contribution. It is also possible to evaluate the mean value or the cumulative sum of a certain bandwidth of spatial frequencies. It was observed that outputting the maximum value is sufficiently selective, provided that no sinusoidal intensity variation induced by the object itself falls into the frequencies calculated by the FFT. This is due to the coarse sampling of the FFT calculated on a small neighborhood. Indeed, if the Fourier transform window is calculated on a greater number of frequency components, more than one pixel will have a high value, and in this case it is better to output the cumulative energy in a given bandwidth. Mathematically, this filter is expressed, for an M x M neighborhood with ω₀ = 2π/M, as

h_{m,n}(p,q) = Σ_{k=0}^{M-1} Σ_{l=0}^{M-1} x(p+k, q+l) e^{-j(m ω₀ (p+k) + n ω₀ (q+l))}

y(p,q) = max_{(m,n)≠(0,0)} |h_{m,n}(p,q)|, p, q = 0, 1, ..., M-1
In the equation above, it is necessary to compute the 2D formulation of the FFT, which is very time consuming, on the order of 30 s per interferogram of 512x512 pixels. Thanks to the separability of the FFT, it is possible to calculate the different h_{m,n}, corresponding to the frequency components of the transform, first along the rows and then along the columns. Moreover, an incremental formulation of this filter is possible, in which the k-th value of the filter is calculated from the (k-1)-th value. This is possible because the FFT forms a geometric series, which leads to the sliding update

h_n(k) = [ h_n(k-1) - x(k-1) + x(k+M-1) ] e^{j ω₀ n}

With this approach, the calculation time for this filter is reduced down to 10 s.
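The local-spectrum filter can be sketched as follows; for brevity this version uses the direct (non-incremental) FFT on a coarse grid of window positions, and the synthetic test image and window size are illustrative assumptions.

```python
import numpy as np

# Sketch of the local Fourier spectrum filter (method 2): FFT of an M x M
# neighborhood, zero-order removal, output = maximum spectral magnitude.
def local_fft_max(img, M=8, step=4):
    H, W = img.shape
    out = np.zeros(((H - M) // step + 1, (W - M) // step + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = img[i*step:i*step+M, j*step:j*step+M]
            S = np.abs(np.fft.fft2(w))
            S[0, 0] = 0.0                      # zero-order removal
            out[i, j] = S.max()
    return out

y, x = np.mgrid[0:64, 0:64]
img = np.where(x < 32, 1 + 0.5*np.cos(2*np.pi*0.25*x), 1.0)  # fringes on left half
resp = local_fft_max(img)
left = resp[:, :resp.shape[1]//2].mean()
right = resp[:, resp.shape[1]//2:].mean()
print(f"fringe response left/right: {left:.2f} / {right:.2f}")
```

The response is strong only over windows containing fringes, which is the selectivity the text relies on.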
3. Discrete cosine transform filters
In the spatial domain, convolution-based window filters have been implemented. More precisely, the interferogram is convolved with a well chosen window filter. This provides a pattern matching approach for depth signal retrieval. One approach is to define a set of discrete cosines with different frequencies. Let us define a set of N discrete cosine vectors of the form

c_k(n) = cos(π k (2n + 1) / (2N)), k ∈ [0 : N-1], n ∈ [0 : N-1]
A set of N² filter masks is created by calculating the outer products

H_{k,l} = c_k c_l^T

Among these matrices, the M (in our case we chose M = 4) returning the highest results on our test samples are kept. The output is then the sum of the squared results of the convolutions between these matrices and the input signal,

y = Σ_t (H_t * x)², H_t ∈ {H_{k,l}}, k, l ∈ [0 : N-1]

The calculation time of this filter for interferograms of 512x512 pixels is 1.2 s.
4. Gabor filterbank
Instead of using cosine filters, another solution is to use a Gabor filterbank. The filters are defined as

H(x; σ, θ, f) = (1 / (2πσ²)) exp(-|x|² / (2σ²)) · exp(i 2π fᵀ R(θ) x), θ ∈ [0 : π]

where R(θ) is the rotation matrix about the angle θ, f = [f_c 0]ᵀ is the frequency of the complex sinusoidal variation to detect, and σ is the standard deviation of the Gaussian envelope of the filter. This creates a window filter composed of a sinusoidal variation with a Gaussian envelope, oriented along an axis with angle θ, which can be seen as an adaptation of the exponential basis function of the Fourier transform operation. Thanks to this Gaussian envelope, the intensity variation of the pixels of these window filters is very similar to the local intensity distribution of the modulated speckle pattern to detect. This complex-valued filter H is composed of two filters: the real part with cosine modulation and the imaginary part with sine modulation. This makes it possible to detect every possible shift of a sinusoidal intensity variation with frequency f_c along the corresponding axis. This filter typically needs 2 s to process an interferogram of 512x512 pixels.
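One such filter can be sketched as follows; the kernel size, σ, θ and f_c values are illustrative, and the phase-invariance property (complex filter detecting any fringe shift) is checked on a shifted fringe patch.

```python
import numpy as np

# Sketch of one Gabor filter from the bank: Gaussian envelope times a complex
# sinusoid along an axis rotated by theta. Normalization is illustrative.
def gabor(size, sigma, theta, fc):
    r = np.arange(size) - size // 2
    X, Y = np.meshgrid(r, r)
    Rx = X * np.cos(theta) + Y * np.sin(theta)   # coordinate along fringe axis
    env = np.exp(-(X**2 + Y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return env * np.exp(1j * 2 * np.pi * fc * Rx)

g = gabor(31, sigma=5.0, theta=0.0, fc=0.2)
r = np.arange(31) - 15
fringes = np.cos(2 * np.pi * 0.2 * r)[None, :] * np.ones((31, 1))
shifted = np.cos(2 * np.pi * 0.2 * r + 1.3)[None, :] * np.ones((31, 1))
resp_match = abs(np.sum(fringes * np.conj(g)))
resp_shift = abs(np.sum(shifted * np.conj(g)))
resp_flat = abs(np.sum(np.ones((31, 31)) * np.conj(g)))
print(f"fringes: {resp_match:.3f}, shifted: {resp_shift:.3f}, flat: {resp_flat:.3f}")
```

Matched fringes give a strong response regardless of their phase shift, while a featureless patch gives nearly none, which is the selectivity exploited by the filterbank.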
Among the above defined filters, we showed that the best filtered signal in terms of signal-to-noise ratio is obtained using the Gabor filterbank.
5. Calibration and matched filtering
The sets of filters defined above do not use any a priori knowledge of the local fringe pattern. However, knowledge of either the orientation or the frequency of the fringes to detect can be used to minimize the calculation cost of the procedure. The calibration procedure consists of several acquisitions with a plane object perpendicular to the optical axis, shifted between acquisitions, such that interference fringes are observed on the whole image. Based on this measurement, a map matching every pixel to a known frequency and orientation is then calculated. In such a case, the output of the filter corresponds to only one convolution with the calibrated filter, and not the sum of several convolutions. We called this approach "matched filtering". Moreover, when such calibration maps are available, the use of a filterbank is no longer optimal, as each filter contributes its own noise response. For this reason, the output of a filter chosen according to a calibration map is smaller for zones without fringes than the output of a filterbank, thus leading to a better signal-to-noise ratio. When matched filtering is used, the calculation time for an interferogram of 512x512 pixels is 0.65 s, for the discrete cosine filter as well as for the Gabor filter.
Cleaning filter output:
The first step of the procedure is to locally threshold the signal. The filtered image is typically divided into overlapping blocks of 80x80 pixels. The mean intensity in each block is compared to the mean intensity of the whole image; if it is below a given threshold, the whole block is set to zero. If not, the 80x80 pixels block is further divided into sub-blocks of 10x10 pixels and the mean value of each sub-block is compared to the mean value of the 80x80 pixels block. Finally, the value of each pixel in the sub-block is compared to the mean value of the sub-block. This technique has the advantage of keeping the intensity information of the pixels and minimizing the effect of local variance in intensity.
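The hierarchical thresholding can be sketched as below. Block sizes and the 0.5 threshold factor are illustrative (the text uses 80x80 and 10x10 blocks on full-size images), non-overlapping blocks are used for brevity, and the final per-pixel comparison is omitted.

```python
import numpy as np

# Sketch of the hierarchical local thresholding: zero blocks whose mean falls
# below a fraction of the global mean, then refine surviving blocks with
# sub-blocks compared against the block mean.
def local_threshold(img, block=16, sub=4, factor=0.5):
    out = img.copy()
    g_mean = img.mean()
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            blk = out[i:i+block, j:j+block]          # view into out
            if blk.mean() < factor * g_mean:
                blk[:] = 0.0
                continue
            b_mean = blk.mean()
            for u in range(0, blk.shape[0], sub):
                for v in range(0, blk.shape[1], sub):
                    sb = blk[u:u+sub, v:v+sub]
                    if sb.mean() < factor * b_mean:
                        sb[:] = 0.0
    return out

img = np.zeros((64, 64))
img[20:30, 20:40] = 1.0      # a bright curve-like region
img[50, 50] = 1.0            # an isolated speck (noise)
res = local_threshold(img)
print(res.sum(), img.sum())
```

The bright region survives intact while the isolated speck is removed, illustrating how the step suppresses noise without discarding intensity information in fringe areas.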
The curve to extract is characterized by an accumulation of high intensity data points, and the points after thresholding are not necessarily interconnected. The apparent width of the curve cannot be assumed to be constant and, after thresholding, very isolated points often remain that do not correspond to any data, so that the thresholded image has to be processed further. A disk is moved through the points to successively extract the meaningful data; the disk radius is typically 35 pixels. A starting point is found by sliding the disk along the rows of the image. Once the mean intensity in the disk reaches a given threshold, it is assumed that the points within the disk belong to the curve to extract. First, the disk center is placed on the center of gravity of the non-zero pixels within the disk. The neighborhood in the sliding disk is extracted and the coordinates of the center of gravity are recorded. A local estimation of the variance is then used to find the direction of the curve: the covariance matrix is calculated and expressed in its eigenbasis. The eigenvector associated with the maximal eigenvalue defines the direction of maximal variance, corresponding to the direction toward which the next points belonging to the curve have to be extracted. The sliding disk is then translated in that direction by a distance equal to its radius.
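The direction-finding step of this tracing procedure can be sketched as a principal-component computation on the local point cloud; the synthetic point set below (a noisy straight segment) is an illustrative assumption.

```python
import numpy as np

# Sketch of one step of the sliding-disk curve extraction: the direction of
# maximal local variance is the leading eigenvector of the covariance matrix
# of the non-zero pixel coordinates inside the disk.
def principal_direction(points):
    """points: (n, 2) array of pixel coordinates on the curve segment."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    vals, vecs = np.linalg.eigh(cov)
    return vecs[:, np.argmax(vals)]          # eigenvector of largest eigenvalue

# Illustrative data: noisy points along a line at angle 0.5 rad.
rng = np.random.default_rng(0)
t = np.linspace(0, 40, 80)
pts = np.stack([t * np.cos(0.5), t * np.sin(0.5)], axis=1)
pts += rng.normal(0, 0.3, pts.shape)
d = principal_direction(pts)
print(f"estimated direction: ({d[0]:.3f}, {d[1]:.3f})")
```

The disk is then translated by its radius along (or against) this eigenvector, which is defined only up to sign.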
For each interferogram, a set of pixels belonging to the coherent area is extracted. Taking into account the properties of the imaging system, such as the magnification, these pixels are mapped to physical dimensions in the XY plane. As the depth coordinate is known from the length of the reference path, at the end of the procedure a set of three-dimensional points in true physical dimensions (for example in meters) is obtained. These points can then be used to add useful dimensions to a 2D image and, moreover, can be used to represent a 3D surface of the observed object. Fig. 14 shows two example results: on the left, a measurement made on a wooden-pencil tip; on the right, a result obtained when observing an ex-vivo pig bronchus.
3D rendering:
The set of three-dimensional points is non-uniformly sampled and, most of the time, scattered. In order to obtain a rendering of the observed surface, an interpolation procedure has to be implemented. We chose to define the surface implicitly, in a Cartesian coordinate system, by an equation of the type f(x,y,z) = 0.
To achieve correct surface interpolation, off-surface points are synthetically generated. They are defined as points situated at a distance ±d from the surface, in the direction of its normal. By adding these points, we create a numerical distance function d = f(x,y,z). This function is then interpolated with a Radial Basis Function (RBF) defined as

s(x) = p_m(x) + Σ_{i=1}^{n} λ_i Φ(|x - x_i|)
Where p_m is a low-degree polynomial (here we typically use a 1st order polynomial) and Φ is a radially symmetric function from R+ to R. Thus the problem is to find the polynomial coefficients and the λ_i satisfying

s(x_i) = f(x_i), i = 1, 2, ..., n
Σ_{i=1}^{n} λ_i p(x_i) = 0 for every polynomial p of degree at most m
Where f(x_i) are the sparse and generally non-homogeneously distributed values of the distance function created from the measurement and the numerically added off-surface points. The second equality corresponds to the side conditions. This leads to a linear system that has to be solved in order to find the different coefficients defining the RBF,
( A - ρI   P ) (λ)   (f)
( Pᵀ       0 ) (c) = (0)

Where A and P are matrices containing the values of the radially symmetric function Φ and of the low-degree polynomial basis, respectively, and ρ is a factor added to smooth the fitted surface. For a high value of ρ, a very smooth surface is obtained, but of course in this case the fitting accuracy is lower.
The last step of the surface fitting procedure is to find points satisfying f(x,y,z)=0 by iso-surfacing.
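The RBF fitting step can be sketched by assembling and solving the augmented linear system above. The biharmonic kernel Φ(r) = r and the sample data are illustrative assumptions (the disclosure does not fix the kernel), and ρ = 0 is used so that the fit interpolates the data exactly.

```python
import numpy as np

# Sketch of the RBF fit: solve [A - rho*I, P; P^T, 0][lam; c] = [f; 0]
# with a radially symmetric Phi and a 1st-order polynomial basis.
def fit_rbf(X, f, rho=0.0):
    n, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    A = r                                        # assumed kernel Phi(r) = r
    P = np.hstack([np.ones((n, 1)), X])          # 1st-order polynomial basis
    M = np.zeros((n + d + 1, n + d + 1))
    M[:n, :n] = A - rho * np.eye(n)              # sign convention of the text
    M[:n, n:] = P
    M[n:, :n] = P.T
    sol = np.linalg.solve(M, np.concatenate([f, np.zeros(d + 1)]))
    lam, c = sol[:n], sol[n:]

    def s(x):
        return lam @ np.linalg.norm(X - x, axis=1) + c[0] + c[1:] @ x
    return s

# Fit scattered samples of a signed-distance-like function (illustrative data).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (30, 3))
f = X[:, 2] - 0.3 * np.sin(3 * X[:, 0])
s = fit_rbf(X, f)
err = max(abs(s(x) - fi) for x, fi in zip(X, f))
print(f"max interpolation error at data points: {err:.2e}")
```

With ρ = 0 the fitted surface passes through all data points; increasing ρ trades this fidelity for smoothness, as stated in the text. The zero set of s is then extracted by iso-surfacing.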
Data processing
The diagram of Fig. 9 illustrates the different steps comprised in the generation of the true 3D image of the object. It shows the flow chart depicting the whole measurement procedure: 1 is the acquisition process, 2 is the numerical filtering procedure, 3 symbolizes the stack recording of the different interferograms, 4 is the process of changing the optical path difference between the two arms of the interferometer, 5 symbolizes the ability to observe live contour fringes during acquisition, and 6 is the three-dimensional rendering of the observed surface.
Apparatus
The illustrations of fig. 10 to 16 disclose different realizations of the 3D measuring apparatus, where the different inventive concepts and methods are implemented in order to measure the 3D shape and surface topology of the object.
Curve level extraction is achieved with the smooth reference wave interferometer depicted in principle in Fig. 10. The source 1 is an optical source with adjustable coherence length. A beam splitter 2 separates the collimated beam generated by 1 into the reference beam (R) and the object beam (O). The reference beam is directed to a delay stage 3 mounted on a motorized axis. A lens 6 is placed in the object arm of the interferometer; it is used to form an image at the CCD plane 8. The adjustable diaphragm 7 is an essential part of the setup: indeed, it is used to limit the aperture of the system, thus enabling the creation of interferences composed of fringes modulated by a slowly varying speckle pattern. 5 is an optional lens used to adapt the spherical reference wave in order to match the illumination geometry. 3 is placed so that at 4 there is a small mismatch between the optical axes of R and O, which in turn can be regarded as interferences of spherical waves coming from off-axis points. Fig. 10 is an implementation of the method as a standalone setup.
The fibered setup of Fig. 11 presents a realization with a movable head and fibers connecting the movable head to the fixed part of the apparatus. Indeed, the imaging system represented by 6 can be any available lens or compound lens. The object path fiber (10) and the reference path fiber (11) are used to deliver the light of the two paths of the interferometer to the part of the system pluggable on the imaging system 6. This pluggable part is composed of: a beam splitter 4, in order to recombine the light backscattered by the object 9 with the reference wave; a diaphragm 7, which permits the selection of the aperture of the imaging system in order to create discernible and filterable fringe patterns; and a detector 8 (a CCD camera). For each position of the delay stage 3, a different contour-depth is obtained, and the whole topology of the object is recovered with a scanning procedure. With this implementation, it is necessary that the object does not move during the acquisition procedure.
The fibered setup of Fig. 12 presents another realization. Here the reference path is composed of the monomode fiber 11 providing a fixed-length reference path. The whole detection system 3 can be moved in three dimensions. The element depicted by the box 12 is a position detector (for example a magnetic tracker), which recovers the position and orientation of the system in box 3. In this way, it is possible to match every detected contour-line to an absolute referential. This avoids any delay stage scanning, so that the different interferograms can be acquired "on the fly" as the system is moved with respect to the object. The elements contained in box 3 are: a beam splitter 4, in order to recombine the light backscattered by the object 9 with the reference wave; an optional lens 5 used to adapt the spherical reference wave in order to match the illumination geometry; 6, the available imaging system used to create an image of the diffracted field; 7, a diaphragm, which permits the choice of the aperture of the imaging system in order to create discernible and filterable fringe patterns; and 8, a detector (a CCD camera). 12 is, as said before, a position sensor. An application is to embed these elements at the tip of an endoscope (number 2 of Fig. 10).
Fig. 13 illustrates the fact that all the apparatus and optical designs of Figs. 10 to 12 can be miniaturized and inserted inside a flexible sheath with steering means, in order to realize an endoscope yielding 3D images of organs in medicine or of industrial objects. Fig. 14 shows a realization of an apparatus connectable or pluggable on rigid endoscopes 9. A beam splitter 2 separates the collimated beam (B) generated by the modulated laser diode 1 into the reference beam (R) and the object beam (O), which is injected into 5 and then split in 3, before illuminating the object 16. The light scattered by 16 is collected by the endoscope 9. Roughness of the observed surface induces a random intensity distribution, known as the speckle effect. The reference wave (R) is first directed to a motorized delay stage 3. This way it is possible to adjust the time delay between R and O. The reference wave is then injected into another monomode fiber 4 so that the light can be transmitted to the part of the device attached to the endoscope eyepiece 16. It also provides a simple way to adapt the system to any other rigid endoscope, by simply updating the length of 4 with respect to the length of 9. 4 also serves as a spatial filter for reference beam cleaning. The object wave interferes on the CCD plane with the reference wave projected on the camera 13 through a second beam splitter 11. A lens 10 is placed at the end of 4 to match the divergence and the size of the reference and object beams. A lens or an imaging system 12 is placed in front of the CCD to form a real image on the detector from the virtual one coming from the eyepiece of 9. The end of the reference fiber is placed on an XY stage, in order to set the reference beam slightly off-axis, thus permitting the fringe frequency to be chosen so that it satisfies the Shannon principle and is correctly sampled on the interferogram.
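The Shannon condition on the off-axis tilt can be made concrete: a reference tilted by an angle theta produces carrier fringes of spatial frequency sin(theta)/lambda, which must stay below the Nyquist limit 1/(2 x pixel pitch) of the camera. The wavelength and pixel pitch below are illustrative values, not ones disclosed in the text.

```python
import math

def max_offaxis_angle(wavelength_m, pixel_pitch_m):
    """Largest reference-beam tilt (radians) for which the carrier fringes
    are still Shannon-sampled: fringe frequency sin(theta)/lambda must not
    exceed the Nyquist limit 1/(2*pitch) of the detector."""
    s = wavelength_m / (2.0 * pixel_pitch_m)
    return math.asin(min(s, 1.0))

# Illustrative values (assumptions): 680 nm source, 6.45 um camera pixels.
theta = max_offaxis_angle(680e-9, 6.45e-6)
# At the limit, the fringe period equals exactly two pixels:
fringe_period_um = 680e-9 / math.sin(theta) * 1e6
```

With these assumed numbers the limiting tilt is about 3 degrees, which is why only a slight off-axis displacement of the fiber end on the XY stage is needed.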
Fig. 15 shows the CAD design of the pluggable part described in Fig. 14: rigid endoscope 1, spring-locked adapter 2 for easy connection of the endoscope, double XY-axis tilt adjustment 3, connector for the reference wave fiber 4, XY stages 5, lens for reference beam adjustment 6 (10 in Fig. 14), device for object and reference wave recombination 7, lens for imaging on the detector 8 (12 in Fig. 14), and detector 9.
Fig. 16 shows the design of the multiple-fiber illumination system, with 1 the illumination fiber end ferrules (6, 7, 8 in Fig. 11), 2 the ferrule mounts, and 3 the endoscope. At a distal position of 30 mm, this gives 3 non-overlapping areas of about 7 mm diameter. Monomode fibers packed in ferrules of 1.25 mm diameter were used and inserted in a small mount designed to be adapted around the metallic housing of the endoscope. The mount has an overall diameter of 9 mm and thus can fit into the surrounding tube used as an instrument channel during usual procedures. To split the object beam in 3, a 1x3 coupler is connected to the illumination fiber of the bench module. The overall optical path length of the reference arm is adapted, with respect to the length of the endoscope and illumination fibers, with a suitable patch cable, to obtain reference and object arm length equalization within the scan range of the delay stage.
It is technically difficult and expensive to ensure that 6, 7 and 8 are exactly of the same length. However, this difficulty is bypassed by a simple calibration procedure and the application of 3 efficient digital masks, so that the algorithmic procedure used to retain the significant data points after thresholding can be applied sequentially to the extracted points corresponding to each illuminated area, for one acquisition. First, the lengths of the 3 fibers are measured. With this calibration procedure, the absolute depth value for each extracted point is known with respect to the position of the 3 fibers. Then the thresholded signal is multiplied by a digital binary mask; the resulting signal is set to zero for object points that are not illuminated by the fiber currently processed. The procedure is repeated for each fiber, each time with a different mask. For each acquisition, we obtain 3 sets of measured points, each associated with a different depth.
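The masking step above can be sketched in a few lines. The mask geometry and the fiber core index `n_core` below are illustrative assumptions (the text only states that fiber lengths are measured during calibration), not disclosed values.

```python
import numpy as np

def split_by_fiber(thresholded, masks, fiber_lengths, ref_opl, n_core=1.468):
    """Apply one binary mask per illumination fiber to the thresholded
    hologram, zeroing points outside the currently processed area, and
    attach the per-fiber depth origin deduced from the measured fiber
    lengths.  n_core (assumed silica value) converts geometric fiber
    length to optical path length."""
    results = []
    for mask, length in zip(masks, fiber_lengths):
        pts = np.argwhere((thresholded * mask) > 0)  # keep this area only
        depth_offset = ref_opl - n_core * length     # absolute depth origin
        results.append((pts, depth_offset))
    return results

# Toy example: a 2x6 coherent-area map split into 3 areas of 2x2 pixels.
coh = np.ones((2, 6), dtype=int)
masks = [np.zeros((2, 6), dtype=int) for _ in range(3)]
for i, m in enumerate(masks):
    m[:, 2 * i:2 * i + 2] = 1
sets = split_by_fiber(coh, masks, [1.000, 1.001, 1.002], ref_opl=1.480)
```

One acquisition thus yields 3 point sets, each carrying its own calibrated depth, exactly as the sequential masking procedure describes.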
The various implementations presented in Figs. 10-16 can also be used to enhance the functionality of other imaging systems.
Fig. 17 illustrates the results obtained with the apparatus of Figs. 14-16. For each interferogram, a set of pixels belonging to the coherent area is extracted. Taking into account the properties of the imaging system, such as the magnification, these pixels are mapped to physical dimensions in the XY plane. As the depth coordinate is known with respect to the length of the reference path, at the end of the procedure a set of three-dimensional points in true physical dimensions (for example in units of meters) is obtained. These points can then be used to add useful dimensions to a 2D image and, moreover, to represent a 3D surface of the observed object. Fig. 17 shows two example results: on the left, a measurement made on a wooden pencil tip; on the right, a result obtained when observing an ex-vivo pig bronchus.
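The pixel-to-metric mapping can be sketched as follows. A simple telecentric scaling model (object-space sampling = pixel pitch / magnification) is assumed here for illustration; the text only requires that the imaging properties be known.

```python
import numpy as np

def pixels_to_3d(pixel_rows, pixel_cols, pixel_pitch, magnification, depth):
    """Map extracted coherent-area pixels to 3D points in meters.
    Assumes a simple telecentric scaling model; `depth` is the contour
    depth fixed by the reference path length for this interferogram."""
    scale = pixel_pitch / magnification          # meters per pixel, object side
    x = np.asarray(pixel_cols, dtype=float) * scale
    y = np.asarray(pixel_rows, dtype=float) * scale
    z = np.full_like(x, depth)                   # same depth for the whole contour
    return np.column_stack([x, y, z])

# Example with assumed values: 6.45 um pixels, 2x magnification,
# contour at 12.3 mm depth.
pts = pixels_to_3d([0, 10], [0, 20], 6.45e-6, 2.0, 12.3e-3)
```

Stacking the point sets from all interferograms (one depth per reference-path setting) produces the 3D surfaces shown in Fig. 17.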

Claims

1. An apparatus for measuring the topography or shape of an object from a series of depth contours measured from a single shot hologram and comprising the following means:
a. At least one light source, with selectable coherence, for producing light to illuminate the object
b. At least one interferometer adapted to provide a hologram acquired in a single shot, from which a contour depth is determined
c. Imaging optics to form the hologram on the image recording device
d. An image recording device to record a series of holograms
e. First processing means for extracting from a single hologram the hologram area where the mutual coherence does not vanish
f. Second processing means for retrieving a depth contour from a single hologram
g. Third processing means for positioning the depth contour in 3D space.
2. An apparatus according to claim 1, where the light source with selectable coherence comprises a broadband source
3. An apparatus according to claim 2, where the broad band source is a superluminescent diode coupled to an optional interference filter, the bandwidth of which can be selected according to the accuracy needed and signal intensity available.
4. An apparatus according to claim 2, where the broad band source is a semiconductor laser powered by a modulated electrical current source.
5. An apparatus according to claim 2, where the broad band source comprises a plurality of single frequency sources comprising at least two single frequency sources.
6. An apparatus according to claim 5, where the broad band source comprises at least two frequencies out of a plurality of frequencies provided by the modal decomposition of one or a plurality of laser sources.
7. An apparatus according to claims 5 and 6, where the single frequency sources are semiconductor laser diodes, solid state lasers, gas lasers, or a combination thereof.
8. An apparatus according to claim 2, where the broad band source is a pulsed laser permitting the acquisition of a single shot hologram.
9. An apparatus according to claim 8, where the pulsed laser is a semiconductor laser.
10.An apparatus according to claim 8, where the pulsed laser is a solid state laser.
11.An apparatus according to claims 1-10, where the pulsed laser is a mode-locked laser, optionally producing pulses in the femtosecond range.
12.An apparatus according to claims 1-11, where the interferometer is a Michelson interferometer.
13.An apparatus according to claims 1-11, where the interferometer is a Mach-Zehnder interferometer.
14.An apparatus according to claims 1-11, where the interferometer is a combination of Michelson and Mach-Zehnder interferometers.
15.An apparatus according to claims 1-11, where part of the interferometer comprises fiber optics links.
16.An apparatus according to claims 1-15, where the measuring path has a variable length which can be measured by a distance sensor, in particular a magnetic tracker.
17.An apparatus according to claims 1-15, where the reference path has a variable Optical Pathlength which can be computer controlled.
18.An apparatus according to claims 1-17, comprising means for acquiring the hologram in a single pulsed laser shot.
19.An apparatus according to claims 1-17, comprising means for acquiring the hologram in a single shot by the detector in a short acquisition time.
20.An apparatus according to claims 1-19, where the imaging optics comprised in the apparatus is an image forming lens.
21.An apparatus according to claim 20, where the imaging optics comprised in the apparatus is a low aperture lens with a large depth of field.
22.An apparatus according to claim 20, where the imaging optics comprised in the apparatus is a rigid endoscope.
23.An apparatus according to claim 20, where the imaging optics comprised in the apparatus is a flexible endoscope.
24.An apparatus according to claim 20, where the means are any of the means according to the methods and apparatus described in claims 1-21, comprised in a flexible endoscope.
25.An apparatus according to claims 1-24, where the image recording device is an electronic camera associated with computer means to acquire the series of holograms and store them in a digital form.
26.An apparatus according to claims 1-25, where the method to extract from a single hologram the contribution of the mutual coherence to hologram intensity comprises an interferometer according to claim 1 and claims 12-17, where the reference wave is off-axis so that the contribution of the mutual coherence terms can be moved to high spatial frequency components.
27. A method for using an apparatus as defined in anyone of the previous claims 1 to 26.
28.A method according to claim 27, where the mutual coherence components are discriminated in the spatial frequency domain by high pass filtering the frequency terms, beyond the cutoff frequency of the optical imaging device.
29.A method according to claim 27 or 28, where the cutoff frequency is selected by the design of the imaging device, having the optimal numerical aperture permitting maximization of the spectral domain of the object beam.
30. A method according to claim 27 or 28, where the cutoff frequency is fixed by a diaphragm in the pupil of the imaging device.
31. A method according to claim 29 or 30, where the high pass filtering is achieved by Fourier spectrum analysis.
32.A method according to claims 29-30, where the high pass filtering is achieved by local Fourier spectrum analysis.
33.A method according to claim 29 or 30, where the high pass filtering is achieved by discrete cosine filterbank convolution.
34.A method according to claim 33, which comprises the use of an a priori knowledge of either the orientation or the frequency (or both) of the fringes to detect in order to create filters with reduced calculation cost and enhanced sensitivity.
35.A method according to claims 29-30, where the high pass filtering is achieved by Gabor filterbank convolution.
36.A method according to claims 29-30, where the filtering is achieved by multiscale wavelet decomposition.
37.A method according to claims 31-36, where the connectivity of the hologram points having non-zero mutual coherence is obtained by a method which comprises the steps of sequentially moving a disk through the filtered points, in the direction of the eigenvector corresponding to the maximum eigenvalue of the covariance matrix of a neighborhood, in order to filter out the noisy or isolated points.
38.A method according to claims 29-37, where the depth contour is retrieved from a single hologram by computing the wavefield in the plane of the hologram and by computing the back-propagated complex wavefield up to the points situated inside the volume defined by coherence gating.
39.A method according to claims 29-37, where the wavefield is computed in the hologram plane by multiplying the hologram filtered according to one of the methods according to claims 26-36 by the numerical expression of the fixed reference wave in the hologram plane.
40.A method according to claims 29-37, where the numerical expression of the fixed reference wave in the hologram plane can be derived from a calibration measurement taken on a test object: a reference plane.
41. A method according to claim 38, where the computation of the back-propagated wavefield is achieved by computing the inverse of the convolution transform of the wavefield in the hologram plane.
42.A method according to claim 41, where the convolution transform accounts for the free field propagation followed by the wavefield diffraction by the imaging optics, usually described by the Amplitude Point Spread Function (APSF) of the imaging optics, and whereby, in the Spatial Frequency Domain, the transform is usually treated as a product by the Optical Transfer Function of the imaging device.
43.A method according to claim 38, where the volume defined by coherence gating is the 3D region comprising the object light scattering points for which, in the interferometer, the total Optical Pathlength (OPL) of the measuring arm, i.e. the sum of the OPL from the beamsplitter to the illuminating source, the OPL from the illuminating source to the object light scattering point, and the OPL from the object light scattering point to the detector point of the detector plane, is equal, within a +/- OPL margin, to the total OPL of the reference arm, i.e. the sum of the OPL from the beamsplitter to the reference illumination source, including an optional variable or fixed delay line causing an additional fixed or variable OPL, and the OPL from the reference illumination source to the detector point of the detector plane.
44.A method according to claim 38, where the OPL margin is directly related to the selectable coherence length of the optical source and whereby the OPL margin can be arbitrarily defined as the zone where the contrast factor of the interference fringes is higher than a predefined percentage of the maximum contrast factor: typically 50%.
45.A method according to claims 22 to 24 and claims 38-44, where the Amplitude Point Spread Function (APSF) is an elongated spindle in the case of low Numerical Aperture imaging optics.
46.A method and apparatus according to claim 45, where the spindle length can reach the whole object dimension, corresponding to a depth of field large enough to image the whole object with sufficient resolution.
47.A method according to claims 45-46, where the hologram is found to match approximately the object image and no computation of the back-propagated field, as recited in claims 40-44, is required, and whereby the depth contour can be directly retrieved from the object image taken by the image recording device, according to the procedures of claims 28-37.
48.A method according to claims 29-37, where the various object light scattering points are precisely positioned in 3D space from the image of the points of the depth contour contained in the detector plane, measured according to claims, by
a. Computing the conjugate points of the reconstructed points of the depth contour according to claims 38-47, provided that the aperture of the imaging optics is sufficiently large to provide sharp focusing of the object light scattering points.
b. In a preferred embodiment, computing the position of the object light scattering points from the points of the depth contour by inverse central projection, i.e. positioning each object light scattering point, from the point of the depth contour contained in the detector plane, along the line passing through the center of the principal planes, at a total distance from the detector plane defined such that the sum of the OPL from the beamsplitter to the illuminating source, the OPL from the illuminating source to the object light scattering point, and the OPL from the object light scattering point to the detector point of the detector plane is equal to the OPL of the reference arm, i.e. the sum of the OPL from the reference illumination source to the detector point of the depth contour of the detector plane and an OPL which is given by the adjustable delay line, which can be computer controlled.
49. A method according to claim 48, where the delay line is scanned under computer control in order to form a cluster of object light scattering points that feature the surface of the object.
50. A method according to claim 48, where the delay line is maintained fixed at a value in the measuring range, and
a. Where the apparatus is moved with respect to the object in order to form a cluster of object light scattering points that feature the surface of the object.
b. Where the object is moved with respect to the apparatus in order to form a cluster of object light scattering points that feature the surface of the object.
51. A method according to claim 48, where, respectively:
a. The position of the apparatus head, with respect to the object, is precisely monitored by a position sensor providing the precise x,y,z position and angular orientation (2 angles) of the apparatus head, in order to position precisely the object light scattering points according to the recipe of claim 50
b. The position of the object, with respect to the apparatus head, is precisely monitored by a position sensor providing the precise x,y,z position and angular orientation (2 angles) of the object, in order to position precisely the object light scattering points according to the recipe of claim 50.
52. A method according to claim 49, where the position sensor is a magnetic tracker.
53.A method according to claims 49-52, where a method is performed to fit the object light scattering points with an interpolated surface, by
a. Least square fitting method
b. Signed volume functions
c. Radial function decomposition
d. Any other method for achieving the best fit of the point cluster.
54.A method according to claims 1-53, where the apparatus comprises
a. A fixed part containing the optical source and an optional fixed or controllable delay line
b. A movable part, called the apparatus head, connected to the fixed part by two optical fibers.
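As an illustration outside the claims themselves, the coherence-gating condition of claim 43 reduces to a simple predicate on the optical path lengths: a scattering point belongs to the gated volume when the measuring-arm OPL matches the reference-arm OPL within a margin set by the source coherence. The half-coherence-length margin used below is an assumption consistent with claim 44's contrast-based definition.

```python
def in_coherence_gate(opl_bs_to_src, opl_src_to_pt, opl_pt_to_det,
                      opl_reference, coherence_length):
    """Claim 43 as a predicate (illustrative sketch): a scattering point
    lies in the coherence-gated volume when the total measuring-arm OPL
    (beamsplitter -> source -> point -> detector) equals the reference-arm
    OPL within a +/- margin tied to the selectable coherence length."""
    total_measuring = opl_bs_to_src + opl_src_to_pt + opl_pt_to_det
    return abs(total_measuring - opl_reference) <= coherence_length / 2.0

# With a 20 um coherence length, a point whose path sums to the
# reference OPL is gated in; a 1 mm mismatch gates it out.
gated = in_coherence_gate(0.10, 0.05, 0.05, 0.20, 20e-6)
rejected = in_coherence_gate(0.10, 0.05, 0.05, 0.201, 20e-6)
```

Scanning `opl_reference` with the delay line (claim 49) sweeps this thin gated shell through the object, producing the successive depth contours.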
PCT/IB2010/050608 2009-02-13 2010-02-10 Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot WO2010092533A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IB2009050607 2009-02-13
IBPCT/IB2009/050607 2009-02-13

Publications (1)

Publication Number Publication Date
WO2010092533A1 true WO2010092533A1 (en) 2010-08-19

Family ID: 42173427

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/050608 WO2010092533A1 (en) 2009-02-13 2010-02-10 Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot

Country Status (1)

Country Link
WO (1) WO2010092533A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5706085A (en) 1995-08-03 1998-01-06 Blossey; Stefan G. Method for the non-contact rapid and accurate acquisition of the surface topology of objects
WO2000020929A1 (en) 1998-10-07 2000-04-13 Ecole Polytechnique Federale De Lausanne (Epfl) Method and apparatus for simultaneous amplitude and quantitative phase contrast imaging by numerical reconstruction of digital holograms


Non-Patent Citations (13)

* Cited by examiner, † Cited by third party
Title
"Low-coherence optical fibre speckle interferometry", MEASUREMENT SCIENCE AND TECHNOLOGY, vol. 17, no. 4, 2006, pages 605 - 616
"Profilometry with compact single-shot low-coherence time-domain interferometry", OPTICS COMMUNICATIONS, vol. 281, no. 18, 2008, pages 4566 - 4571
"Three-dimensional sensing of rough surfaces by coherence radar", APPL. OPT., vol. 31, no. 7, 1992, pages 919 - 925
A.R. TUMLINSON ET AL.: "Endoscope-tip interferometer for ultrahigh resolution frequency domain optical coherence tomography in mouse colon", OPTICS EXPRESS, vol. 14, no. 5, 2006, pages 1878 - 1887
J.A. IZATT: "Optical coherence microscopy in scattering media", OPTICS LETTERS, vol. 19, no. 8, 1994, pages 590 - 592
J.M. SCHMITT: "Optical Coherence Tomography (OCT): a review", JOURNAL ON SELECTED TOPICS IN QUANTUM ELECTRONICS, vol. 5, no. 4, 1999, pages 1205 - 1215
KÜHN, COLOMB, MONTFORT, CHARRIERE, EMERY, CUCHE, MARQUET, DEPEURSINGE: "Real-time dual-wavelength digital holographic microscopy with a single hologram acquisition", OPTICS EXPRESS, vol. 15, no. 12, 29 May 2007 (2007-05-29), pages 7231 - 7242, XP002586231 *
L. FROEHLY ET AL.: "Multiplexed 3D imaging using wavelength encoded spectral interferometry: A proof of principle", OPTICS COMMUNICATIONS, vol. 222, no. 1-6, 2003, pages 127 - 136
OH ET AL.: "Spectrally-modulated full-field optical coherence microscopy for ultrahigh-resolution endoscopic imaging", OPTICS EXPRESS, vol. 14, 2006, pages 8675 - 8684
RIZK: "Adaptive filtering of white-light interferometry fringe patterns", IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, vol. 47, no. 3, 1998, pages 782 - 788
T.E. CARLSSON ET AL.: "System for acquisition of three-dimensional shape and movement using digital Light-in-Flight holography", OPTICAL ENGINEERING, vol. 40, no. 1, 2001, pages 67 - 75
YELIN ET AL.: "Spectral-domain spectrally-encoded endoscopy", OPTICS EXPRESS., vol. 15, no. 5, 2007, pages 2432 - 2444
ZARA; LINGLEY-PAPADOPOULOS: "Endoscopic OCT approaches toward cancer diagnosis", IEEE JOURNAL ON SELECTED TOPICS IN QUANTUM ELECTRONICS, vol. 14, 2008, pages 70 - 81

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2667150A4 (en) * 2011-01-21 2017-01-18 University of Hyogo Three-dimensional shape measurement method and three-dimensional shape measurement device
WO2012136238A1 (en) * 2011-04-04 2012-10-11 Universität Stuttgart Method and arrangement for short coherence holography
GB2505106A (en) * 2011-04-04 2014-02-19 Univ Stuttgart Method and arrangement for short coherence holography
RU2608322C2 (en) * 2011-10-20 2017-01-17 Конинклейке Филипс Н.В. Holographic user interfaces for medical procedures
JP2016057379A (en) * 2014-09-08 2016-04-21 日立マクセル株式会社 Hologram data generation device, hologram data generation method and hologram data generation program
DE102017105910A1 (en) * 2017-03-20 2018-09-20 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Frequency-based projection segmentation
US10558889B2 (en) 2017-03-20 2020-02-11 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Frequency-based projection segmentation
CN112136182A (en) * 2017-11-16 2020-12-25 杨晓东 System and method for blood flow imaging based on Gabor optical coherence tomography
WO2019179687A1 (en) * 2018-03-19 2019-09-26 Medizinisches Laserzentrum Lübeck GmbH Method for photocopying a sequence of cut surfaces inside a light-scattering object with improved scanning
US11482044B2 (en) 2018-03-19 2022-10-25 Visotec Gmbh Method for photocopying a sequence of cut surfaces inside a light-scattering object with improved scanning
CN109374580A (en) * 2018-09-30 2019-02-22 北京工业大学 A kind of Terahertz lamination image probe positional error correction method
IT201900023229A1 (en) * 2019-12-06 2021-06-06 Adige Spa Procedure and system for determining the position of an element of an optical system in a machining or measuring complex of an object by means of parallel interferometric measurements
US11320254B2 (en) 2019-12-06 2022-05-03 Adige S.P.A. Method and system for determining the separation distance between a body and the surface of an object by means of low coherence optical interferometry techniques under distortion due to sub-sampling
IT201900023181A1 (en) * 2019-12-06 2021-06-06 Adige Spa Procedure and system for determining and controlling the separation distance between a processing head of a laser processing machine and the surface of a workpiece using low coherence optical interferometry techniques
EP3832251A1 (en) * 2019-12-06 2021-06-09 Adige S.p.A. Method and system for determining the separation distance between a body and the surface of an object by means of low coherence optical interferometry techniques under distortion due to sub-sampling
WO2021111399A1 (en) * 2019-12-06 2021-06-10 Adige S.P.A. Method and system for determining the position of an element of an optical system in an assembly for processing or measuring an object, as well as the position of said object relative to said assembly, by parallel interferometric measurements
WO2021111393A1 (en) * 2019-12-06 2021-06-10 Adige S.P.A. Method and system for determining the local position of at least one optical element in a machine for laser processing of a material, using low-coherence optical interferometry techniques
WO2021111423A1 (en) * 2019-12-06 2021-06-10 Adige S.P.A. Method and system for determining and controlling the separation distance between a working head of a laser processing machine and the surface of an object being processed by means of low coherence optical interferometry techniques
IT201900023202A1 (en) * 2019-12-06 2021-06-06 Adige Spa Procedure and system for determining the separation distance between a body and the surface of an object using low coherence optical interferometry techniques in subsampling distortion regime
IT201900023214A1 (en) * 2019-12-06 2021-06-06 Adige Spa Method and system for determining the local position of at least one optical element in a machine for laser processing of a material, using low coherence optical interferometry techniques
WO2022003867A1 (en) * 2020-07-01 2022-01-06 Hamamatsu Photonics K.K. Slanted optical coherence tomography imaging for high-speed inspection
JP7433467B2 (en) 2020-07-01 2024-02-19 浜松ホトニクス株式会社 Oblique optical coherence tomography imaging for high-speed inspection
CN113435455A (en) * 2021-05-12 2021-09-24 西安工程大学 Image contour extraction method based on space-time pulse coding
CN113435455B (en) * 2021-05-12 2024-03-22 深圳灵图创新科技有限公司 Image contour extraction method based on space-time pulse coding
CN114322808A (en) * 2021-12-02 2022-04-12 上海大学 Multi-dimensional speckle interference system and real-time measurement method
CN114322808B (en) * 2021-12-02 2024-03-19 上海大学 Multidimensional speckle interference system and real-time measurement method

Similar Documents

Publication Publication Date Title
WO2010092533A1 (en) Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot
Picart New techniques in digital holography
US10113961B2 (en) Apparatus and method for quantitive phase tomography through linear scanning with coherent and non-coherent detection
Schnars et al. Digital holography and wavefront sensing
JP5680826B2 (en) Data generation system using endoscopic technology for encoding one or more spectra
JP5887006B2 (en) Method and system for performing angle-resolved Fourier domain optical coherence tomography
US7602501B2 (en) Interferometric synthetic aperture microscopy
JP5748753B2 (en) Equipment for absolute measurement of two-dimensional optical path distribution by interferometry
CN108139198B (en) Method and device for exposing at least one cross section in the interior of a light scattering object
CN103733144A (en) Method for optical tomography
Paltauf et al. Photoacoustic tomography with integrating area and line detectors
Inanç et al. 3-d optical profilometry at micron scale with multi-frequency fringe projection using modified fibre optic lloyd’s mirror technique
Ballester et al. Single-shot tof sensing with sub-mm precision using conventional cmos sensors
CN113031422B (en) Holographic imaging device
Dakoff et al. Microscopic three-dimensional imaging by digital interference holography
Picart et al. Basic fundamentals of digital holography
Windecker et al. Fast coherence scanning interferometry for measuring smooth, rough and spherical surfaces
KR100871270B1 (en) Method of Generating Image Information of Sample Using Optical Interferometer And Apparatus Using The Same
KR20200040209A (en) Apparatus for generating three-dimensional shape information of an object to be measured
Khodadad Multiplexed digital holography incorporating speckle correlation
Çetin 3D optical profilometry with a double beam-splitter setup
Ballester et al. Single-shot synthetic wavelength imaging: Sub-mm precision ToF sensing with conventional CMOS sensors
İnanç 3-d optical profilometry at micron scale with modified fiber optic lloyds mirror technique
Balasubramanian Coherent Optics In Photogranmetry
Khodadad Combined Digital Holography and Speckle Correlation for Rapid Shape Evaluation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10712776

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10712776

Country of ref document: EP

Kind code of ref document: A1