US20070057211A1 - Multifocal imaging systems and method - Google Patents


Publication number
US20070057211A1
Authority
US
United States
Prior art keywords
detector
light
array
optical
region
Prior art date
Legal status
Abandoned
Application number
US11/442,702
Inventor
Karsten Bahlman
Ki-Hean Kim
Timothy Ragan
Peter So
Current Assignee
Massachusetts Institute of Technology
Original Assignee
Massachusetts Institute of Technology
Priority date
Filing date
Publication date
Application filed by Massachusetts Institute of Technology filed Critical Massachusetts Institute of Technology
Priority to US11/442,702
Assigned to MASSACHUSETTS INSTITUTE OF TECHNOLOGY. Assignors: BAHLMAN, KARSTEN; KIM, KI H.; RAGAN, TIMOTHY; SO, PETER T.C.
Publication of US20070057211A1
Confirmatory license assigned to NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT. Assignor: MASSACHUSETTS INSTITUTE OF TECHNOLOGY.
Priority to US15/815,536 (patent US10598597B2)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6452 Individual samples arranged in a regular 2D-array, e.g. multiwell plates
    • G01N21/6456 Spatially resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G01N21/6486 Measuring fluorescence of biological material, e.g. DNA, RNA, cells
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0032 Optical details of illumination, e.g. light-sources, pinholes, beam splitters, slits, fibers
    • G02B21/0052 Optical details of the image generation
    • G02B21/0076 Optical details of the image generation arrangements using fluorescence or luminescence
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes

Definitions

  • Fluorescence microscopy, for example, has been used for optical analysis, including the histological analysis of excised tissue specimens.
  • Optical coherence tomography has been used for three dimensional imaging of tissue structures; however, the limited resolution of existing systems has constrained its use for definitive pathological analysis.
  • Confocal microscopy has been used for high resolution imaging and has controllable depth of field but limited imaging speed.
  • Multiphoton microscopy is based on the nonlinear excitation of fluorophores in which fluorescence generation is localized at the focus of excitation light. Multiphoton microscopy is used for deep tissue imaging because of its subcellular three dimensional (3D) resolution, minimal phototoxicity, and tissue penetration depth of over a few hundred micrometers. It has become useful in biomedical studies such as neuronal plasticity, angiogenesis in solid tumors, transdermal drug delivery, and non-invasive optical biopsy, for example.
  • a practical limitation of multiphoton microscopy is its imaging speed, which typically is less than two frames per second. While this speed is sufficient in many cases, there remain applications that can be enhanced by improvements in imaging speed. There is a continuing need for further improvements in microscopic analysis of biological materials for numerous applications.
  • the present invention relates to systems and methods for the multifocal imaging of biological materials.
  • An optical system is provided in which a plurality of optical pathways are used in combination with focusing optics to provide a plurality of focal locations within a region of interest of a material being optically measured or imaged.
  • the detector can comprise a plurality of detector elements which are correlated with the plurality of focal locations to provide for the efficient collection of light from the material being imaged.
  • a preferred embodiment of the invention utilizes a scanning system that provides relative movement between the material and the focal locations to provide for fast imaging of the material.
  • a light source such as a laser
  • a multifocal optical element to provide an array of spatially separated optical pathways.
  • the multifocal optical element can comprise a microlens array, a diffractive optical element, or a beam splitter device, for example, such that a plurality of beams are provided that can be focused onto a plurality of focal locations within a biological material to be imaged.
  • high speed multiphoton microscopy can measure biological systems, such as kinetic processes in the cytosol of a single cell, for example, or image a volume of tissue.
  • high speed 3D imaging can map 3D propagation of a calcium wave and the associated physical contraction wave through a myocyte, or the rolling of leukocytes within the blood vessel of a solid tumor.
  • High speed 3D microscopy provides for sampling a statistically significant volume of biological specimens. Since the field of view of most microscopes is limited to about 100 microns on a side with an imaging depth of 100 microns, the measurement volume is limited to only 1×10⁻³ mm³.
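The measurement-volume figure above follows from simple arithmetic; a quick sketch using the values quoted in the text:

```python
# Back-of-the-envelope check of the measurement volume quoted above
# (values from the text: ~100 um field of view per side, ~100 um depth).
field_um = 100.0   # lateral field of view per side, in micrometers
depth_um = 100.0   # imaging depth, in micrometers

volume_um3 = field_um * field_um * depth_um   # 1e6 cubic micrometers
volume_mm3 = volume_um3 / 1e9                 # 1 mm^3 = 1e9 um^3
print(volume_mm3)  # -> 0.001, i.e. 1e-3 mm^3 as stated
```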
  • a neuron with its extensive dendritic tree can span a volume over 1 mm³, and many dermal structures such as hair follicles and sebaceous glands cannot be seen with images confined to an area of 100-200 micrometers. It is desirable, for example, to image a hierarchy of cardiac structures ranging from a single nucleus in a cardiac myocyte, to the distribution of muscle fibers and blood vessels, to the structure of chambers and heart valves with perfect registration across five orders of magnitude by imaging a whole mouse heart.
  • a first method increases the scanning speed by using a high-speed scanner such as a polygonal mirror scanner or a resonant mirror scanner instead of a galvanometer-driven mirror scanner. This achieves scanning speeds of more than 10 frames per second in the imaging of typical tissue specimens.
  • the system can operate at frequencies in a range of 1 to 500 Hz.
  • This method can be used for turbid tissue imaging since it is not sensitive to the scattering of emission photons.
  • a second method increases the imaging speed by parallelizing the multiphoton imaging process. It scans a sample with a multiple of excitation foci instead of forming only a single focus.
  • foci are raster scanned across the specimen in parallel where each focus needs to cover a smaller area.
  • the emission photons from these foci are collected simultaneously with a spatially resolved detector.
  • One advantage of this method is that the imaging speed is increased by the number of excitation foci generated, without increasing the power of excitation light per focus.
  • High speed scanning systems need higher power to compensate for the signal reduction per pixel due to the decrease in pixel dwell time. Images can be obtained by selecting the depth of focus to be positioned in a plane within the tissue or sample at a depth in a range of 10 microns to 500 microns.
  • fiber optics can be used to couple the light source to the microlens array or other beam splitting element.
  • the system can be implemented as a handheld optical probe for the diagnosis of dermal, cervical or colorectal cancer, for example.
  • the brain is an inherently three dimensional organ composed of many subregions. Accurate segmentation of brain morphology of small mammals is currently challenged by the lack of techniques which can sample the brain at high resolution over a large volume. The current method of choice, serial section reconstruction, is laborious, time consuming, and error prone.
  • the device and methods described herein can quickly image brains or thick tissue sections of brains in 3D at sufficient resolution and over a large enough volume to provide 3D images suitable for classification of brain morphology and biochemical composition.
  • the brain can be further stained by dyes, such as the nuclear dyes DAPI or Hoechst, either through intravital injection, transgenic expression, or ex vivo methods, to facilitate classification of regions. Automatic segmentation routines can also be used to improve the classification and automate portions of the process.
  • Acquisition of vasculature data is important in characterizing many vasculature-related diseases.
  • proangiogenesis therapies are useful in such areas as tissue engineering, wound healing, bone fractures and coronary heart disease.
  • Anti-angiogenesis treatments are important in processes such as cancer, blindness, and rheumatoid arthritis.
  • Unfortunately, traditional histopathological analysis of tissue sections is wholly inadequate to characterize the vasculature of a tissue or organ, as blood vessels form complex, multiscale 3D networks with features spanning from the submicron to the centimeter scale.
  • the device and methods described in the patent are capable of acquiring high quality 3D datasets over 3D tissue and organ samples suitable for characterization of the vasculature of the tissue.
  • the tissue can be stained by contrast agents which bind to the endothelial wall of the blood vessels, or fill the interior of vessels. Automatic segmentation routines can also be used to improve the classification and automate portions of the process.
  • the present histopathology methods provide limited information about the 3D spatial arrangement of cancer cells relative to the 3D vasculature of the organ. It is known that one of the critical steps in metastasis is extravasation from the vasculature into the surrounding stroma, so it is essential to be able to visualize this spatial relationship between the cancer cell and the endothelial blood vessel wall.
  • Preferred embodiments of the present invention are capable of acquiring high quality 3D datasets over 3D tissue and organ samples suitable for characterization of the metastases.
  • the cancer cell can be stained by dyes or labeled with proteins such as GFP. Automatic segmentation routines can also be used to improve the classification and automate the localization of the cancer cells and tumors.
  • ADME efficacy and toxicology effects are known to have strong spatial variations on the morphological, cellular and biochemical state of a tissue. Even within a specific tissue type, the response can be nonuniform due to variations in the transport and distribution of a drug throughout tissue, epigenetic expression, and cellular activity.
  • the devices and methods described herein can be used to provide morphological, biochemical and spectroscopic information about the state of a tissue across multiple length scales, from subcellular to whole tissue, whole organ and even entire organism, in response to the treatment of a molecular agent. Efficacy, ADME, and toxicology information can be derived which provides a fuller and more accurate description to predict the actual effect of a drug candidate at the organism level.
  • FIG. 1 is a schematic diagram of an imaging system in accordance with a preferred embodiment of the invention.
  • FIGS. 2a-2c are images of human skin acquired with the present invention, including the stratum corneum layer, the stratum granulosum, and the basal layer, respectively.
  • FIG. 3 graphically illustrates the signal decay with increasing imaging depth of conventional systems and those incorporating the present invention.
  • FIGS. 4 a - 4 d include images before and after deconvolution as well as graphical illustration of scattering and crosstalk.
  • FIGS. 5a-5i are images based on CCD, MAPMT and deconvolution thereof at the surface and at different depths of brain tissue.
  • FIG. 6 illustrates a method and apparatus for multi-focal, multi-photon microscopy (MMM) according to a preferred embodiment of the invention, showing parallelized illumination and detection device with a common focusing device.
  • FIG. 7 illustrates a method and apparatus for multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 8 illustrates another method and apparatus for multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 9 illustrates generating and detecting a 3D foci pattern in a focal region.
  • FIGS. 10(a)-(i) illustrate close-up views of the 3D focal region generated by the setup of FIG. 9 and views of the 3D scanning: (a) the focal region; (b) an array of excitation light beams; (c) x/y view at a first depth; (d) x/y view at a second depth; (e) x/z view; (f) multiple rows of excitation foci lying in different focal planes, all shown in this x/y view; (g) x/z view of the rows shown in (f); (h) x/y view of the x/y scanning configuration covering the x-y-z image (as in (f), all foci are shown, even though they lie in different planes and, in the lower part of the image, even behind each other); and (i) a view in the y/z plane illustrating scan progression.
  • FIG. 11 a illustrates a further method and apparatus for a 3D cytometer, based on multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 11d illustrates an image of the array of foci in the focus of the objective lens; the foci are 45 micrometers apart, resulting in a scanning field of 240 μm when 6×6 foci are utilized.
  • FIG. 11c illustrates the Z-profile and corresponding fit function of a 200 nm bead.
  • FIGS. 12(a)-(e) illustrate a further method and apparatus for multi-color detection MMM employing scanning and multi-anode PMTs according to a preferred embodiment of the invention: (a) the setup in the xz-plane; (b) the foci and their scanning in the focal xy-plane;
  • FIG. 13 illustrates a beam splitter configuration used in some embodiments according to the invention.
  • FIGS. 14 ( a )-( d ) illustrate preferred embodiments for providing illumination beam paths in accordance with the invention.
  • FIGS. 15(a)-(d) illustrate further preferred embodiments for detecting light from different focal locations in accordance with preferred embodiments of the invention.
  • FIG. 16 illustrates determining the optimal number of foci at a certain laser power for samples with different damage thresholds.
  • FIGS. 17 ( a ) and ( b ) illustrate a time multiplexing method.
  • FIGS. 18 ( a )-( c ) illustrate a pixellated detector collection method.
  • FIG. 19 illustrates an endoscope apparatus according to an embodiment of the invention.
  • FIGS. 20(a) and (b) illustrate scattered light detection with one PMT and one excitation focus according to an embodiment of the invention.
  • FIGS. 21(a) and (b) illustrate scattered light detection with two PMTs and one excitation focus according to an embodiment of the invention.
  • FIGS. 22(a) and (d) illustrate scattered light detection with two PMTs and two excitation foci according to an embodiment of the invention.
  • FIGS. 23(a) and (b) illustrate reducing optical cross talk by increasing the distances between the excitation foci and between the detection elements.
  • FIGS. 24(a) and (b) illustrate reducing optical cross talk according to an embodiment of the invention by increasing the distance between the excitation foci and increasing the area of the detection elements.
  • FIGS. 25 ( a )-( e ) illustrate in tabular form two alternative embodiments A and B of the invention in terms of changing optical setup.
  • FIGS. 26(a)-(b) illustrate the different conjugated areas of detection from each channel of the multi-anode PMT in the conjugated image plane for configurations A and B from FIGS. 25(a)-(e).
  • FIGS. 26(c)-(d) illustrate that an objective lens with a large field of view enables large separation of foci and thus a large reduction in optical cross talk.
  • FIGS. 27 ( a ) and ( b ) illustrate data post-processing sequences.
  • FIGS. 28 ( a ) and ( b ) illustrate a normalization method.
  • FIGS. 29 ( a )-( c ) illustrate a linear deconvolution process.
  • FIGS. 30(a)-(d) illustrate further details for a linear deconvolution process: signal distribution in a multi-channel detector.
  • the fluorophores which are excited by one pulse remain in the excited state for a few nanoseconds (depending on the fluorophore). Such fluorophores may not be excitable again by the next pulse of excitation light (12 ns later in the case of a laser with an 80 MHz pulse repetition rate). Therefore, the signal level saturates when the input power is raised beyond a limiting level.
  • the limitation on the input power level is related to the excitation probability of a single fluorophore with a single pulse, P_pulse. It is formulated in the following expression, with the condition that the excitation light is focused by an objective onto a fluorophore with an absorption coefficient (δ_a).
  • P_pulse ≈ δ_a · [π/(λhc)]² · NA⁴/(τ_p·f_p²) · [P_a(t)]²   (1)
  • MMM increases the frame rate by scanning with multiple excitation foci. Therefore, MMM can achieve the higher frame rate, while the input power for each excitation focus is kept below the saturation limit.
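Equation (1) can be made concrete with a hedged, order-of-magnitude evaluation: the per-pulse excitation probability grows quadratically with average power, which is why the per-focus power is capped near the saturation limit. In the sketch below, the two-photon cross-section, pulse width, and NA are assumed typical values, not figures from the patent; only the 800 nm wavelength, 0.95 NA objective, and 80 MHz repetition rate appear elsewhere in the text.

```python
import math

# Numeric illustration of Eq. (1): per-pulse two-photon excitation
# probability of a fluorophore at the focus.
h = 6.626e-34          # Planck constant, J*s
c = 3.0e8              # speed of light, m/s
wavelength = 800e-9    # excitation wavelength, m (from the text)
NA = 0.95              # objective numerical aperture (from the text)
delta_a = 1e-58        # two-photon cross-section, m^4*s (10 GM, assumed)
tau_p = 100e-15        # pulse width, s (assumed)
f_p = 80e6             # pulse repetition rate, Hz (80 MHz, from the text)

def p_pulse(p_avg_watts: float) -> float:
    """Eq. (1): P_pulse ~ delta_a*(pi/(lambda*h*c))^2 * NA^4/(tau_p*f_p^2) * P^2."""
    geometric = (math.pi / (wavelength * h * c)) ** 2 * NA ** 4
    return delta_a * geometric / (tau_p * f_p ** 2) * p_avg_watts ** 2

# Probability grows quadratically with average power and approaches
# order unity (saturation) for tens of mW per focus:
for p_mw in (1, 10, 50):
    print(p_mw, "mW ->", p_pulse(p_mw * 1e-3))
```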
  • the optical imaging depth in tissues is limited by photon interaction with tissue constituents.
  • Photon scattering is a dominant factor in multiphoton microscopy, whereas the effect of photon absorption is relatively negligible. Scattering of excitation photons reduces the amount of fluorescence generated at the focus, because fewer excitation photons reach the focal volume.
  • the emission photons from the focus are also scattered, so that they may not be collected by the optics in the detection path or may be spatially dispersed in the imaging plane where detectors are positioned. Since the excitation light has a longer wavelength than the emission light, the excitation light typically experiences less scattering than the emission light.
  • the effect of photon scattering is expressed by the scattering mean free path length, l_s, which is the depth constant in the exponential decay of unscattered photons, S(z) ∝ exp(−z/l_s).
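The ballistic-photon decay S(z) ∝ exp(−z/l_s) can be illustrated with a small sketch; the mean free path value below is an assumed, representative number for tissue, not one from the patent:

```python
import math

# Fraction of unscattered ("ballistic") photons surviving to depth z:
# S(z) = exp(-z / l_s).
l_s_um = 100.0   # scattering mean free path in micrometers (assumed)

def ballistic_fraction(depth_um: float) -> float:
    return math.exp(-depth_um / l_s_um)

# At a depth of two mean free paths, only ~13.5% of photons remain
# unscattered, which is why deep imaging degrades so quickly:
print(ballistic_fraction(2 * l_s_um))
```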
  • Intralipid emulsion can be used as a tissue phantom with similar optical properties as tissue.
  • Conventional multiphoton microscopy is based on the scanning of a single excitation focus and the signal is collected using a detector without spatial resolution such as a conventional photomultiplier tube (PMT).
  • the PMT has a large detection area and can collect most of the signal generated at the excitation focus including a large fraction of the scattered photons. Therefore, conventional multiphoton microscopy is relatively immune to the scattering of emission photons by the tissue.
  • the scattering of emission photons seriously degrades the SNR of the instrument for deep tissue imaging.
  • the CCD camera has relatively slow readout speed and typically integrates all the emission photons during the acquisition of each frame.
  • in a CCD camera in which each pixel covers a 0.1 μm² region in the specimen plane, scattered emission photons deflected from their original paths are not collected in the correct pixel but are distributed broadly across the imaging plane.
  • the distribution of scattered emission photons is very broad, with a FWHM of 40 μm at an imaging depth of 2·l_s^em.
  • These scattered photons result in a degradation of image SNR by more than one order of magnitude when the imaging depth exceeds 2·l_s^em, compared with conventional multiphoton microscopy.
  • A limitation of the CCD-based MMM system lies in its small pixel area.
  • a large number of CCD pixels is needed to maintain good resolution while covering a reasonably sized field of view.
  • a 100 μm size image will require about 10⁷ pixels to be imaged at full optical resolution (300 nm).
  • the situation is very different for MMM imaging: a femtosecond light source can provide at most 2-4 watts of optical power, and typically about 50-100 mW are required at each focus to generate an efficient multiphoton excitation process for deep tissue imaging.
  • An MMM system can realistically and effectively scan about 20-40 foci in parallel with tissue specimens.
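The 20-40 foci estimate follows from a simple power budget, dividing the available laser power by the per-focus requirement quoted above:

```python
# Power-budget sketch behind the 20-40 foci figure: the foci count is
# the laser output divided by the power needed per focus (both ranges
# are taken from the text; the specific endpoints used here are a choice).
laser_power_mw = 2000.0      # ~2 W femtosecond source (lower end of 2-4 W)
power_per_focus_mw = 100.0   # upper end of the 50-100 mW per-focus range

n_foci = laser_power_mw / power_per_focus_mw
print(n_foci)  # -> 20.0; with 4 W at 100 mW per focus this becomes 40
```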
  • the image resolution is determined by the excitation point spread function (PSF) of the light and is not sensitive to the detector pixelation.
  • a preferred embodiment uses an MMM system having photon detectors containing only as many elements as the number of excitation foci. The need for fewer elements allows the use of a detector with a significantly larger pixel area while maintaining a reasonable device size.
  • a multi-anode PMT (MAPMT) is a preferred detector for this purpose.
  • a preferred embodiment of the present invention uses an MAPMT instead of the CCD camera for the signal collection from multiple foci.
  • the MAPMT is similar to conventional PMTs with a good quantum efficiency (over 20% in the blue/green spectral range), negligible read noise and minimal dark noise with cooling.
  • the MAPMT has a cathode and dynode chain with a geometry that ensures that the spatial distribution of photons on the cathode is reproduced accurately as an electron distribution at the anode.
  • the anode of the multi-anode PMT is divided rectilinearly into its elements providing spatial resolution for the simultaneous collection of signals from multiple locations.
  • a MAPMT which has an array of 8×8 pixels (H7546, Hamamatsu, Bridgewater, N.J.) is used.
  • a flat panel detector having a pixel area of sufficient size can also be used.
  • a binnable CMOS or CCD imaging sensor can be operated to read out binned images at comparable frame rates with an effective pixel size corresponding to that of a MAPMT.
  • a preferred embodiment of the invention uses the imaging systems as described herein in conjunction with a system for sectioning a sample such as a tissue sample that is described in greater detail in U.S. application Ser. No. 10/642,447, by So, et al. filed Aug. 15, 2003, the entire contents of which is incorporated herein by reference.
  • the schematic of a preferred embodiment of the imaging system 10 in accordance with the invention is shown in FIG. 1 .
  • the light source 12 used is a Ti-Sapphire (Ti-Sa) laser (Tsunami, Spectra-Physics, Mountain View, Calif.) pumped by a continuous wave, diode-pumped, frequency-doubled Nd:YVO₄ laser (Millenia, Spectra-Physics, Mountain View, Calif.). It generates approximately 2 W at 800 nm wavelength, which is sufficient for most MMM applications.
  • the excitation beam from the laser is optically coupled using optical fiber 14 or a free space lens system to a beam expander 16 and then illuminates a microlens array 20 (1000-17-S-A, Adaptive Optics, Cambridge, Mass.) which, in this example, is an array of 12×12 (or 8×8) square microlenses that are 1 mm × 1 mm in size and 17 mm in focal length.
  • the degree of beam expansion can be selected such that an array of 8×8 beam-lets is produced after the microlens array.
  • the beam-lets are collimated after lens L 1 and reflected onto an x-y scanner mirror 30 (6220, Cambridge Technology, Cambridge Mass.) which is positioned in the focal plane of lens L 1 .
  • the beam-lets overlap each other on the scanner mirror surface and are reflected similarly by the rotation of the scanner mirror.
  • the beam-lets enter a coupling lens system such as a microscope (BX51, Olympus, Melville, N.Y.) via a modified side port.
  • a combination of lenses L 2 and L 3 expands the beam-lets to fill the back aperture of the objective lens 36 in order to use the full NA of the objective lens.
  • the scanning mirror is in the telecentric plane of the back aperture of an objective lens so that the beamlets are stationary on its back aperture independent of the motion of the scanner mirror.
  • the objective lens generates the 8×8 focus array of excitation light in the sample plane in the specimen 34.
  • the scanner mirror moves the array of excitation foci in the sample plane in a raster pattern to cover the whole sample plane.
  • a digital micromirror (MEMS) device can be used to control beam scanning in the sample plane.
  • a beamsplitter can also be used to split an input beam before the microlens array.
  • Another alternative embodiment employs a diffractive optical element in conjunction with a beam splitter.
  • the objective used in this system is a 20× water immersion lens with 0.95 NA (XLUMPLFL20XW, Olympus, Melville, N.Y.).
  • the excitation foci are separated from each other by 45 μm in this example so that the scanning area of each focus is 45 μm × 45 μm.
  • the frame size is 360 μm × 360 μm by scanning with the array of 8×8 foci.
  • the frame rate to generate images of 320×320 pixels becomes approximately 19 frames per second with the pixel dwell time of 33 μs.
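The ~19 frames per second figure can be checked from the parameters quoted above (an 8×8 focus array scanning a 320×320-pixel frame in parallel with a 33 μs pixel dwell time):

```python
# Frame-rate arithmetic: with an 8x8 focus array scanning a 320x320-pixel
# image in parallel, each focus only has to cover a 40x40-pixel sub-region.
image_pixels = 320 * 320
n_foci = 8 * 8
dwell_time_s = 33e-6                               # pixel dwell time

pixels_per_focus = image_pixels // n_foci          # 1600 pixels per focus
frame_time_s = pixels_per_focus * dwell_time_s     # ~0.0528 s per frame
print(1.0 / frame_time_s)  # -> ~18.9 frames per second
```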
  • Emission photons are generated at the array of excitation foci in the specimen and are collected by the same objective lens forming an array of emission beam-lets.
  • the emission beam-lets are reflected on a long-pass dichroic mirror 38 (650dcxxr, Chroma Technology, Brattleboro, Vt.) and are focused onto an optional CCD camera 28 (PentaMax, Princeton Instruments, Trenton, N.J.) with a lens (L 3).
  • the CCD camera integrates emission photons during the scanning time of each frame to generate images.
  • the emission beam-lets travel back to the scanner mirror 30 retracing the excitation paths.
  • the emission beam-lets are reflected by the scanner mirror.
  • the emission beam-lets are de-scanned and their propagation directions remain stationary irrespective of the movement of the scanner.
  • the emission beam-lets are reflected by a long-pass dichroic mirror 32 (650dcxxr, Chroma Technology, Brattleboro, Vt.) and are focused after lens (L 4 ).
  • a short-pass filter (E700SP, Chroma Technology, Brattleboro, Vt.) blocks any stray excitation light.
  • the focused emission beam-lets are collected at the center of corresponding channels of a MAPMT 22 (H7546, Hamamatsu, Bridgewater, N.J.).
  • the emission photons coming from the array of excitation foci are collected by the MAPMT.
  • An image is formed by the temporal encoding of the integrated signal with the known raster scanning pattern using image processor or computer 24 and is electronically stored in memory and/or displayed using display 26 .
  • the pair of L 2 and L 4 lenses magnifies the array of emission foci so that individual emission beamlets are focused at the center of corresponding elements of the MAPMT. Further, since the emission beam-lets are descanned, they remain stationary. Since the emission beam-lets have to go through more optical elements, loss of emission photons occurs. The transmission efficiency is approximately 0.7.
  • the signals from the MAPMT are collected by a stack of four multi-channel photon counter cards (mCPhC), which together provide 64 channels for simultaneous signal collection. Each mCPhC has 18 channels of photon counter circuits and can be housed 25 with a digital interface to the computer 24. The mCPhC is expandable so that 64 channels are readily implemented by using four cards in parallel.
  • the mCPhC has a 32-bit parallel interface with a computer for high-speed data transfer.
  • the speed is limited by the speed of the computer PCI bus.
  • Transfer rate can be more than one hundred frames (320 ⁇ 320 pixels, 16 bit images) per second.
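As a sanity check on the quoted transfer rate, the raw bandwidth of 320×320-pixel, 16-bit frames at 100 frames per second is modest; the PCI figure in the comment is the well-known theoretical maximum for a 32-bit/33 MHz bus, not a number from the patent:

```python
# Data-rate check for the quoted transfer figure: 320x320-pixel,
# 16-bit frames at 100 frames per second.
bytes_per_frame = 320 * 320 * 2       # 16-bit = 2 bytes per pixel
rate_mb_s = bytes_per_frame * 100 / 1e6
print(rate_mb_s)  # -> ~20.5 MB/s, well below the ~133 MB/s theoretical
                  #    limit of a 32-bit/33 MHz PCI bus
```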
  • each pixel can be treated as an individual detection element.
  • each pixel has an effective detector area of 1 μm×1 μm for 20× magnification. Therefore, the CCD-based MMM system cannot utilize these scattered emission photons, which are distributed uniformly throughout the image, contributing to the background noise.
  • the effective detector area of each channel is 45 μm×45 μm. Therefore, the MAPMT can collect significantly more scattered emission photons into the correct channels than the CCD camera, because its effective detector area, or detector element collection area, is comparable with, or corresponds to, the width of the scattered photon distribution from each focal area (45 μm×45 μm).
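The advantage of the larger collection area can be illustrated numerically. Modeling the scattered-photon distribution as a centered 2D Gaussian with the 40 μm FWHM quoted elsewhere in this description (the Gaussian shape is an assumption made for this sketch):

```python
import math

def fraction_in_square(half_width_um, fwhm_um):
    """Fraction of a centered 2D Gaussian photon distribution landing
    inside a square detector element of the given half-width."""
    sigma = fwhm_um / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    edge = math.erf(half_width_um / (sigma * math.sqrt(2.0)))
    return edge * edge  # separable product of the x and y integrals

mapmt_fraction = fraction_in_square(45.0 / 2, 40.0)  # 45 um x 45 um channel
ccd_fraction = fraction_in_square(1.0 / 2, 40.0)     # 1 um x 1 um pixel
```

Under this assumption a 45 μm×45 μm MAPMT channel captures roughly two thirds of the scattered distribution, while a 1 μm×1 μm effective CCD pixel captures well under a tenth of a percent, which is the qualitative point the text makes.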
  • MAPMT-based MMM system can be easily converted to the conventional multiphoton microscope which is based on single-focus scanning and signal collection with PMTs.
  • the excitation beam is not expanded and goes directly onto the scanner without the combination of the microlens array and lens L 1 .
  • the rest of the excitation beam path is the same as in the MAPMT-based MMM.
  • Specimens are scanned with a single excitation focus.
  • the emission light collected by the objective lens is reflected on a dichroic mirror.
  • the reflected emission beam is reduced in size by a pair of lenses and is collected by a detector (PMT).
  • An image is formed by the temporal encoding of the integrated signal with the known raster scanning pattern.
  • CCD-based MMM has limitations for turbid tissue imaging, as shown by measuring the effect of emission photon scattering on the PSF (scattering function).
  • Scattered emission photons form an additional intensity distribution around the PSF constructed from ballistic, unscattered photons. This intensity distribution is quite wide, with a FWHM of 40 μm at an imaging depth of 2×l s em .
  • the FWHM of the total PSF (including the intensity distribution due to scattered emission photons) is not changed by scattering down to such depths, because the wide distribution of scattered emission photons does not contribute to the FWHM.
  • signal decay in CCD-based MMM with increasing imaging depth is higher than that of SMM by an order of magnitude at 2×l s em .
  • the wide distribution of scattered emission photons contributes as noise and causes a loss of contrast by another order of magnitude at that depth.
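A small numerical illustration of why the total FWHM stays unchanged: a wide, low pedestal of scattered photons added under a narrow ballistic core leaves the half-maximum crossings essentially where they were, because the pedestal never reaches half of the peak. The Gaussian shapes and the 5% pedestal amplitude are assumptions for this sketch, not values from the text.

```python
import numpy as np

def fwhm(x, y):
    """Width between the outermost samples at or above half-maximum."""
    above = x[y >= y.max() / 2]
    return above[-1] - above[0]

x = np.linspace(-60.0, 60.0, 4001)                         # microns
core = np.exp(-4 * np.log(2) * x**2 / 0.5**2)              # ballistic core, 0.5 um FWHM
pedestal = 0.05 * np.exp(-4 * np.log(2) * x**2 / 40.0**2)  # scattered pedestal, 40 um FWHM
w_ballistic = fwhm(x, core)
w_total = fwhm(x, core + pedestal)
```

The pedestal spreads signal over tens of microns (degrading contrast) while `w_total` stays essentially equal to `w_ballistic` (preserving resolution), which is the separation of effects the bullets above describe.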
  • Imaging dermal structure based on autofluorescence has been performed using the system of the present invention. Endogenous fluorophores have low quantum yield and low extinction coefficients compared with typical exogenous fluorescent labels.
  • the dermal structure imaged using a preferred embodiment of the present invention has a layered structure with significantly different indices of refraction resulting in significant spherical aberration.
  • Multiphoton imaging of dermal structures without photodamage has a pixel rate of 15 KHz with 15 mW input power. In this example, an input power of 7 mW per focus is used at the specimen, with the excitation wavelength set at 800 nm.
  • the objective used is a 20× water immersion with 0.95 NA (XLUMPLFL20XW, Olympus, Melville, N.Y.).
  • a beam splitter, serial dichroic mirrors or a top hat holographic filter can be used to provide a more uniform array of beams delivered to the individual focal positions.
  • the signal decay can be measured as a function of scattering length. As the imaging depth increases, the signal is decreased due to scattering of both excitation photons and emission photons.
  • the signal decay is measured by imaging 4 μm diameter fluorescent latex microspheres (F8858, Molecular Probes, Eugene, Oreg.) immobilized in 3D in 2% agarose gel (UltraPure Low Melting Point Agarose, Invitrogen, Carlsbad, Calif.). Intralipid emulsion (Liposyn III, Abbott Laboratories, North Chicago, Ill.) is added to the sample as a scatterer in concentrations of 0.5 to 2%.
  • Intralipid emulsion of 2% volume concentration is known to have scattering properties similar to those of tissues: the scattering mean free path length (l s ) is 80 μm at the emission wavelength (605 nm) and 168 μm at the excitation wavelength (800 nm).
  • the scattering properties of these intralipid solutions are verified by diffusive wave measurements. The peak intensity of the sphere image serves as the signal in the measurement, and the decay of peak intensity as a function of imaging depth is measured at each concentration. The signal decay can also be measured with a conventional multiphoton microscope as a reference. Signal decays in the three systems are measured down to a depth of 180 μm, which is equivalent to 2.25×l s em ( FIG. 5 ).
  • the decay coefficient, c, is 1.22, 1.87, and 2.30 for conventional multiphoton microscopy, MAPMT-based MMM, and CCD-based MMM, respectively.
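The reported coefficients correspond to an exponential decay of signal with depth measured in scattering lengths. A log-linear fit recovers c from a depth series; the exponential form and the fitting procedure below are assumptions for illustration, since the text reports only the coefficients themselves.

```python
import numpy as np

def fit_decay_coefficient(depths_ls, intensities):
    """Recover c from I(z) = I0 * exp(-c * z / ls) by a log-linear
    least-squares fit, with depth z already in units of ls."""
    slope, _intercept = np.polyfit(depths_ls, np.log(intensities), 1)
    return -slope

z = np.linspace(0.0, 2.25, 10)       # depth in scattering lengths, as in the text
c_true = 1.87                        # MAPMT-based MMM coefficient from the text
signal = np.exp(-c_true * z)         # noiseless synthetic decay curve
c_fit = fit_decay_coefficient(z, signal)
```

With real peak-intensity data in place of the synthetic curve, the same fit yields the per-system decay coefficients compared in the text.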
  • the decay rate from the conventional multiphoton microscope is the lowest as expected.
  • the decay is the combinational effect of both excitation and emission photon scattering. Since the effect of excitation photon scattering is the same, the difference in decay coefficient is due to the effect of emission photon scattering.
  • the decay coefficient, c from MAPMT-based MMM (1.87) is lower than the one from CCD-based MMM (2.30).
  • the one from MAPMT-based MMM is still higher than the one from the conventional multiphoton microscope. This indicates that the spatial distribution of scattered emission photons is wider than the effective detector area of the MAPMT (45 μm×45 μm), so that some portion of the scattered emission photons is collected in the neighboring channels.
  • the ratio of the intensity sum collected in the neighboring pixels of the MAPMT to the intensity in the correct pixel was approximately 2 at a depth of 2×l s em .
  • the photons acquired at each pixel are temporally encoded and are organized to form an image based on the known scanner position as a function of time. This is exactly how images are formed in a conventional multiphoton or confocal microscope. A primary image is formed by photons acquired at the correct pixels corresponding to the fluorophore distribution in that portion of the specimen. Note that the scattered photons in the neighboring pixels are also similarly temporally encoded. Therefore, secondary “ghost” images are formed in the areas of the image covered by the neighboring pixels. As an example, FIG. 4 a is an image of spheres 150 μm deep from the surface in 2% intralipid emulsion.
  • the effect of emission photon scattering on imaging can be described as follows. Generally, an image is formed as a convolution of source pixels and an emission point spread function (PSFem).
  • the convolution matrix, [C] is constructed based on the simplified PSF em , EPSF em in which PSF em is spatially integrated over the effective detector area of the individual pixels of the MAPMT.
  • Since EPSF em has a very coarse spatial resolution of 45 μm due to the spatial integration, deconvolution with EPSF em becomes simple and is less sensitive to noise.
  • the study of emission photon scattering on PSF em shows that the scattered emission photons form an additional intensity distribution around the PSF em , which is formed with ballistic, unscattered emission photons. This distribution is broad, with a FWHM in the 40 μm range at an imaging depth of 2×l s em .
  • the change of PSF em due to scattering ( FIG. 4 c ) affects EPSF em by increasing intensity in the neighboring pixel areas ( FIG. 4 d ).
  • EPSF em is roughly estimated by measuring the intensity ratio of the real image to the ghost images as a function of imaging depth.
  • the convolution matrix, [C] est is constructed based on the estimated EPSF em .
  • the restored image is presented in FIG. 4 ( b ).
  • the signal decay of a depth sequence of restored images is measured, and the decay coefficient, c, is significantly reduced to 1.58 after the deconvolution algorithm because the scattered emission photons can now be corrected and reassigned.
  • the ghost images are almost completely eliminated as a result.
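The channel-level deconvolution can be sketched as a small linear inverse problem. The 1D chain of channels and the 0.3 neighbor-leakage weight below are invented for illustration; in the described system, EPSF em is estimated from the real-to-ghost intensity ratio at each depth and the matrix spans the full 8×8 channel grid.

```python
import numpy as np

def mixing_matrix(n_channels, neighbor_weight):
    """Convolution matrix [C] for a 1D chain of MAPMT channels: each
    focus leaks a fixed fraction of its photons into adjacent channels."""
    C = np.eye(n_channels)
    for i in range(n_channels - 1):
        C[i, i + 1] = neighbor_weight
        C[i + 1, i] = neighbor_weight
    return C

true_signal = np.array([0.0, 5.0, 0.0, 2.0])
C = mixing_matrix(4, 0.3)
measured = C @ true_signal                     # ghosts appear in neighbor channels
restored, *_ = np.linalg.lstsq(C, measured, rcond=None)
```

Solving the linear system reassigns the scattered photons to their channel of origin, which is why the ghost images largely disappear after restoration.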
  • Restoration algorithms can be further refined, such as by adding maximum likelihood estimation to minimize image structural overlap between neighboring pixels. This simple deconvolution approach very effectively improves the performance of MAPMT-based MMM and allows this system to perform within a factor of two of the conventional multiphoton microscope.
  • the performance comparison of the two MMM systems can also be evaluated for the imaging of biological tissues.
  • the specimen is an ex-vivo brain tissue section with neurons expressing green fluorescent protein (GFP).
  • GFP green fluorescent protein
  • Thy1-GFP transgenic mice are deeply anesthetized with 2.5% Avertin (0.025 ml/g i.p.) and transcardially perfused with PBS, followed by 4% paraformaldehyde. Brains are dissected and placed overnight in cold 4% paraformaldehyde. 1-mm thick coronal sections are taken by vibratome, mounted and coverslipped on microscope slides using adhesive silicone isolators (JTR20-A2-1.0, Grace Bio-Labs, Bend, Oreg.).
  • the specimen is imaged in 3D with both CCD-based MMM and a MAPMT-based MMM.
  • the objective used is a 20× water immersion with NA 0.95 (XLUMPLFL20XW, Olympus, Melville, N.Y.).
  • the input laser power is 300 mW at 890 nm wavelength.
  • the frame rate is 0.3 frames per second with 320×320 pixels.
  • the slow frame rate is set in order to collect enough emission photons down to 120 μm deep.
  • the total imaging depth is 120 μm with a 1.5 μm depth increment. Representative images are shown in FIGS. 5 a - 5 i.
  • the first column of images is from CCD-based MMM at the surface, 30 μm, and 75 μm deep.
  • the second column of images contains the raw images from MAPMT-based MMM, and the third column contains the same images after deconvolution processing.
  • the dendritic structures of neurons are visible in all images.
  • the image from CCD-based MMM does not provide as good contrast of neurons as MAPMT-based MMM. This is because some of the emission photons that are initially forward propagating into the tissue are eventually backscattered. These backscattered photons are acquired in the incorrect pixels of the CCD and degrade the image SNR. Starting at about 30 μm deep, background noise increases and thin dendrite structures become invisible in CCD-based MMM images. On the other hand, in the images from MAPMT-based MMM, dendrites are still visible due to lower background noise and higher SNR.
  • since the MAPMT is positioned in the image plane, the location of each excitation focus corresponds to the center position of the matching pixel of the MAPMT.
  • the effective detector area scales quadratically with the separation of the foci. Therefore, with wider foci separation, the MAPMT has higher collection efficiency for scattered emission photons.
  • the excitation foci are separated from each other by 45 μm, so that the effective detector area for each channel of the MAPMT is 45 μm×45 μm.
  • the size of the imaging field with 8×8 foci becomes 360 μm×360 μm. As the excitation foci are separated further, the system becomes less sensitive to the scattering of emission photons.
  • the maximum separation of excitation foci is limited by either the field of view of the objective or apertures of other collection optics.
  • the 20× water immersion objective used has a field of view of 1000 μm in diameter. This allows positioning the foci as far apart as almost 100 microns in this example.
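The "almost 100 microns" figure follows from simple geometry if one assumes the square scanned field must fit, by its diagonal, inside the circular field of view. That geometric criterion is an assumption of this sketch; the text states only the field of view and the resulting spacing.

```python
import math

fov_diameter_um = 1000.0      # 20x objective field of view, from the text
n_foci_per_side = 8
# Assumption: the square imaging field fits in the circular FOV when its
# diagonal equals the FOV diameter.
max_field_side_um = fov_diameter_um / math.sqrt(2.0)
max_focus_spacing_um = max_field_side_um / n_foci_per_side   # ~88 um
```

The result, about 88 μm between foci, is consistent with the stated limit of "almost 100 microns".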
  • a limitation of the MAPMT-based MMM system compared with a CCD-based MMM design is that the signals are de-scanned.
  • emission photons pass through more optical elements, including the scanner mirror, before they are collected at the MAPMT, suffering optical loss at each reflection.
  • the de-scanned geometry also has a longer optical path, which contributes to the loss of some scattered photons due to the finite aperture of the optics.
  • the signal collection efficiency is approximately 70% in this example due to additional optical elements.
  • An MAPMT-based MMM system in a non-de-scanned geometry for example can recover this loss.
  • the MAPMT is manufactured with a current quantum efficiency of about 20%, compared to the 80% quantum efficiency of the CCD camera.
  • the MAPMT has very low noise. It has 20 dark counts per second without cooling, which can be several orders of magnitude lower with cooling. Since the MAPMT has a readout rate of approximately 20 KHz, the typical dark count per pixel is less than 1×10⁻³. In comparison, the CCD noise is dominated by both read noise and dark noise, which are a few counts per pixel. Therefore, for very low photon count situations, i.e., a dim sample or a high frame rate, the MAPMT system can have superior performance. MAPMTs with higher sensitivity cathode materials such as GaAsP can provide a system with a quantum efficiency up to about 40-50%.
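The dark-count-per-pixel figure is simply the ratio of the dark rate to the pixel readout rate:

```python
# 20 dark counts/s spread over a ~20 kHz pixel readout rate.
dark_counts_per_s = 20
readout_rate_hz = 20_000
dark_per_pixel = dark_counts_per_s / readout_rate_hz   # 1e-3 counts per pixel
```

At 10⁻³ counts per pixel the MAPMT dark contribution is orders of magnitude below the few counts per pixel of CCD read and dark noise, which is why the comparison favors the MAPMT in low-light conditions.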
  • the photon sensitivity of each channel is not equal and can vary by up to 50%. This effect is further compounded by the Gaussian geometry of the excitation beam, which results in higher excitation efficiency at the center pixels versus the edge region. This problem has been solved previously using a multiple reflecting beam splitter to generate equal intensity beam-lets.
  • the MAPMT-based MMM system can be further improved by utilizing this type of beam splitter with an additional flat field correction algorithm to remove inherent sensitivity non-uniformity of the MAPMT.
  • a preferred embodiment provides for parallelized illumination and detection device which uses a common focusing device, such as an objective lens.
  • the device provides simultaneous measurement of intensity, lifetime, spectroscopic or other information from the focal spots (foci 151 , 152 , 153 ).
  • a common focusing device 110 such as an objective lens.
  • the focusing device 110 generates from each illumination light path 141 , 142 , 143 a separate intensity cone.
  • a first detector 121 , a second detector 122 and a third detector 123 detect light generated by the intensity cones associated with each first, second and third illumination path, respectively.
  • Light from a first illumination path 141 illuminates the focus spot 151 in the sample 105 , with the detected light following a first illumination and detection light path 161 and first detection light path 111 to reach the first detector 121 .
  • light from second and third illumination paths 142 , 143 illuminates the focal locations 152 and 153 , respectively, in the sample 105 , with the detected light following second and third illumination and detection light paths 162 and 163 and second and third detection light paths 112 and 113 to reach the second and third detectors 122 and 123 , respectively.
  • light from each path generates a 3D intensity distribution in its associated focus, according to the multi photon excitation process.
  • the detectors 121 , 122 , 123 detect all the light in the ‘detection cone’ associated with their active area.
  • This light includes light generated by the light path associated with each detector (for example, light from the first focus 151 is detected by first detector 121 ), as well as light that is generated in the first focus 151 but scatters around the first focus 151 on its way to the first detector 121 , and light that is generated in the second and/or third foci 152 , 153 and is then scattered into the detection cone of the first detection light path 111 .
  • a confocal pinhole is placed in front of the detectors, for instance, in FIG. 6 a confocal pinhole can be placed between each detector 121 , 122 and/or 123 and the associated reflectors and collimation lens 126 , 127 , 128 , 131 , 132 and/or 133 , respectively.
  • With a confocal pinhole impeding much of the scattered light, according to the confocal principle, only light from the focal spot associated with that detector is collected in each detector. For example, following the first light path, only light from the 3D light distribution in the first focus 151 is detected by detector 121 .
  • a setup could also consist of a mixture of detectors with and without a confocal pinhole.
  • the illumination light and the associated detection can be time multiplexed.
  • a device can become an imaging device by the illumination beams 141 , 142 , 143 being angle-scanned with respect to the focusing device 110 .
  • the imaging of the x-y planes is enabled by rotating the device through two perpendicular angles theta and phi around the x and y axes, respectively.
  • the intensity information is recorded along with the angular position of the device and reconstructed by an image processor.
  • Imaging of zy planes can be achieved as well by scanning the sample with respect to the imaging device in xy.
  • In an imaging mode, in which the beams are scanned, the device is capable of simultaneously generating 2D images of sub-regions of samples. By simultaneously imaging with each separate illumination and detection pathway, the speed at which images are generated can be increased by the number of illumination paths and detection channels.
  • Imaging in the z plane occurs by moving the imaging device with respect to the sample, or vice-versa.
  • the intensity information is recorded along with the z-position of the sample or device and reconstructed by an image processor.
  • FIG. 7 Another embodiment according to the invention provides for a multifocal, multiphoton microscope based on MAPMT, as illustrated in FIG. 7 , in which an expanded excitation beam 104 A comes from bottom of FIG. 7 and illuminates a square microlens array 140 A.
  • a plurality of optical pathways is generated by the micro lens array 140 A in conjunction with lens L 1 ; for instance, in the embodiment illustrated here the microlens array 140 A splits the excitation beam 104 A into 8×8 multiple beams (i.e., 64 beamlets).
  • In FIG. 7 , only two beamlets 141 A, 142 A are ray-traced.
  • a specimen 105 A is scanned with an 8×8 array of excitation foci 150 A, which includes focus spots 151 A and 152 A illuminated by beamlets 141 A and 142 A respectively.
  • the sample area that each excitation focus covers can be relatively small. In the x and y directions, the full width at half maximum (FWHM) of the focus is 200-1000 nm; in the z direction, the FWHM is 200-5000 nm.
  • each focus scans an area of the size of the distance between the foci, meaning 10-1000 microns (the scanning is accomplished by an optical scanner 180 A, such as a galvo-scanner).
  • the two lenses L 2 and L 3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110 A.
  • the detection light paths, 111 A and 112 A, respectively, resemble the illumination light path until the light paths are separated by a light reflector, which is in this case a dichroic mirror 130 A.
  • the light is then focused by a common lens L 4 onto the multi anode PMT detectors 120 A.
  • the emission beam-lets are collected at pixels 121 A and 122 A, respectively, of a multi anode PMT (MAPMT) 120 A.
  • the MAPMT 120 A, which has the same number of pixels as excitation beamlets, detects the signal of 8×8 pixels synchronized with the scanning.
  • the intensity information is recorded along with the angular position of the scanner and reconstructed by an image processor.
  • a further embodiment according to the invention provides for a multifocal, multiphoton microscope based on MAPMT, in which an expanded excitation beam 104 B comes from laser 101 B and illuminates a micro lens array 140 B.
  • a plurality of optical pathways is generated by the micro lens array 140 B in conjunction with lens L 1 ; for instance, in the embodiment illustrated here two beamlets 141 B, 142 B are ray-traced.
  • a specimen 105 B is scanned with an array of at least one excitation foci 151 B and/or 152 B which are illuminated by beamlets 141 B and 142 B respectively. The scanning is accomplished by a scanner 180 B.
  • the two lenses L 2 and L 3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110 B.
  • the light is then focused by a common lens L 4 and reflector 134 B onto two multi anode PMT detectors 120 B, 124 B.
  • the MAPMT detectors 120 B, 124 B each have the same number of pixels as there are excitation beamlets, integrating the signal of the at least one pixel synchronized with the scanning.
  • a z-piezo actuator 109 B (such as MIPOS 250 SG, micro-objective positioning system, integrated strain gauge, motion: 200 μm (closed loop), Piezo System Jena), controllable by controller 170 B, is attached to the objective lens 110 B in order to move it in the z direction for 3D image generation.
  • the sample 105 B is attached to a sample stage 115 B, which can be moved in x, y and z directions, also controllable by controller 170 B and/or computer 176 B.
  • Light reflector 134 B (such as, for example, a dichroic mirror) is positioned in the detection pathways to enable multi channel imaging by a first MAPMT detector 120 B and a second MAPMT detector 124 B.
  • An IR block filter 116 B (such as e700sp Special, Multi-Photon Blocking, Block 750-1000 nm > OD 6, Chroma Technology Corp) is positioned in the detection pathway to separate the long wavelength excitation light from the short wavelength detection light.
  • the filter 116 B is exchangeable with a variety of filters or can be removed completely for reflected light confocal imaging.
  • the filter 116 B can be mounted on a motorized mount, which allows it to be exchanged via a controller 170 B and/or computer interface 176 B.
  • a band-pass filter 117 B (from Chroma Technology Corp, such as 560DCXR for the transmission of light generated by the excitation of GFP and Rhodamine, HQ460/40 for the transmission of light generated by the excitation of DAPI, or HQ630/60 for the transmission of light generated by the excitation of Alexa 594) is positioned in front of each of the multi anode PMTs 120 B, 124 B, in order to detect certain spectra.
  • the band-pass filters 117 B, 117 B are exchangeable with other different filters and can be mounted on a motorized mount enabling changing of filters via a controller 170 B and/or computer interface 176 B. The same sample region can then be imaged with a different set of band-pass filters for more than two-color imaging.
  • a detection-part light-shield enclosure 118 B is used to shield the detection part of the apparatus from ambient light.
  • a variable iris 119 B (such as, for a manual version, D20S—Standard Iris, 20.0 mm max. Aperture; Thorlabs; motorized versions of equivalent devices are available as well) is positioned in the focal plane of the micro lens array 140 B in order to enable single spot illumination. For 8×8 foci imaging, the iris 119 B is relatively open, and for fewer or single spot imaging the iris is relatively closed, so that only a few or one micro lenses illuminate the sample.
  • the variable iris 119 B does not have to be round in shape; it can be square when only a certain array of micro lenses should be blocked, to enable illumination with a view of selected foci only.
  • the variable iris 119 B can be motorized and controlled via controller 170 B (such as, for example by connection 191 B), and/or via the computer 176 B.
  • a micro lens foci mask 125 B (such as a thin (for example 0.3 mm) aluminum sheet in which small holes (for example 0.5 mm holes) are drilled at the points where the micro lenses focus) positioned proximate the micro lens array 140 B is a pinhole mask with a large pinhole size that enables the transmission of most of the light focused by the micro lenses, but otherwise blocks ambient and stray light from the laser.
  • a first reflector 131 B generates a first laser reference beam 165 B from the incident laser beam (for monitoring the laser illumination power, wavelength and angular pointing stability).
  • the reference beam 165 B projects upon the diode or detector 160 B which generates a signal that measures the laser illumination power, wavelength and angular pointing stability.
  • a further embodiment of the invention provides for a scan reference beam 166 B from a scan reference beam illumination source 168 B to be projected via reflector 172 B and reflector 132 B onto the scan region, whereupon the returning scan reference beam returns via reflector 132 B to pass through dichroic 172 B and lens 174 B to be received by detector 164 B.
  • the scan beam is provided for monitoring the scanning accuracy.
  • Detector 164 B can be a diode or CCD detector or another type detection device.
  • an embodiment of the invention can provide for the detector 164 B to be a CCD camera, which can be used to compare images generated by CCD camera detection methods and other detection methods according to the invention that employ one or more multi anode PMTs as described above.
  • a high voltage power supply 188 B supplies power to the multi anode PMTs.
  • Multi channel photon counting cards 184 B, 186 B are connected to each element of the MAPMTs, with one photon counting device for every multi anode PMT element, such as, for example, MAPMT elements 120 B and 124 B.
  • a computer 176 B (including input devices, such as, for example, a keyboard and mouse) can be provided in one embodiment, connected to computer display 178 B. The computer 176 B can be connected to controller 170 B.
  • the computer 176 B controls numerous elements of the invention either directly and/or indirectly through controller 170 B, and one skilled in the art will appreciate that numerous alternative configurations can be implemented within the scope of the invention.
  • One embodiment provides for the computer 176 B to be programmed with a processing software and for the computer 176 B to control a number of optical elements through a variety of electronic interfaces.
  • the computer 176 B and/or the controller 170 B can be electronically interfaced with the scanner 180 B and the multi channel photon counting cards 184 B, 186 B to perform the steps of scanning and data acquisition.
  • the computer 176 B can perform imaging post-processing steps.
  • the display 178 B can be used to display the acquired images in real-time after further processing.
  • a laser power attenuator 163 B can be provided to control the laser incident power.
  • the attenuator 163 B can be controlled by the controller 170 B and/or by the computer 176 B in order to enable power adjustments for different samples and different locations in samples.
  • the laser power can be automatically adjusted, so that the laser power can be increased at higher penetration depth.
  • the attenuator 163 B is integrated in order to make laser power adjustments, such as, for example, low power at the sample surface and increased power at increased penetration depth.
  • a third reflector 133 B generates a second laser reference beam 167 B from the incident laser beam (also for monitoring the laser illumination power, wavelength and angular pointing stability).
  • This second laser reference beam 167 B projects upon a second diode or detector 161 B to generate a signal that measures the laser illumination power, wavelength and angular pointing stability.
  • a laser power attenuator 163 B controls the laser incident power and is integrated in order to make laser power adjustments, such as, for example, low power at the sample surface and increased power at increased penetration depth.
  • Laser 101 B is an illumination light source, such as a titanium sapphire laser (Mai Tai HP, Spectra Physics).
  • the pulse compressor 102 B, built from a pair of standard high reflectance mirrors and a pair of prisms (IB-21.7-59.2-LAFN28, Material: LaFN28; CVI Laser Corp., Albuquerque, N. Mex. 87123) mounted on translational and rotational stages, pre-chirps the laser pulse in order to attain a short laser pulse in the focus of the objective lens.
  • a confocal pinhole array optionally can be placed between either of the multi anode PMT arrays 120 B, 124 B and the band-pass filters 117 B, 117 B, respectively. This option enables the system to be used for confocal microscopy or for multi-photon microscopy with confocal detection.
  • a telescope 103 B expands the laser beam. With different expansion ratios, a different number of micro lenses can be illuminated. With a small beam expansion, for example, a relatively smaller array of 2×2 micro lenses can be illuminated and, thus, an array of only 2×2 foci is generated. As the beam expansion is made larger, an array of 8×8 or more micro lenses can be illuminated and, thus, an array of 8×8 or more foci is created.
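The relation between the expansion ratio and the illuminated lens count can be sketched as below; the 1 mm lens pitch and the flat-top beam profile are illustrative assumptions, since the actual microlens pitch is not given here.

```python
def lenses_illuminated(beam_diameter_mm, lens_pitch_mm=1.0):
    """Side length (in lenses) of the square microlens block covered by
    the expanded beam, assuming a flat-top beam and square lens packing."""
    return int(beam_diameter_mm // lens_pitch_mm)

small = lenses_illuminated(2.0)    # small expansion -> 2x2 foci
large = lenses_illuminated(8.0)    # larger expansion -> 8x8 foci
```

Adjusting only the telescope's expansion ratio thus trades the number of parallel foci against the power delivered per focus.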
  • a further preferred embodiment employs a set of at least two mirrors 135 B, 136 B after the telescope 103 B for precise beam alignment.
  • a mechanical micro lens holder 145 B enables the precise positioning of the micro lens array 140 B with respect to the multi anode PMTs 120 B, 124 B in the x, y and z directions.
  • the holder 145 B can be a motorized holder and can be controlled through a computer interface 176 B, or, alternatively, can be controlled via a controller 170 B, which controller in turn can be directed by computer 176 B.
  • Mechanical multianode PMT holders 125 B, 126 B enable the precise positioning of the multi anode PMTs 124 B, 120 B, respectively, with respect to the micro lens array 140 B in the x, y and z directions.
  • the holders 125 B, 126 B can be motorized holders and can be controlled through a computer interface 176 B, or, alternatively, can be controlled via a controller 170 B, which controller in turn can be directed by computer 176 B.
  • the computer 176 B, or the controller 170 B, or the computer and controller together can be configured to control automatically or to control in supervised fashion, one or more of the following elements, without limitation: the scan reference beam illumination source 168 B, the sample piezo stage 115 B, the objective z-piezo stage 109 B, the scan reference beam detector 164 B, the scanner 180 B (by connection 193 B), the IR block filter 116 B (by connection 194 B), the band-pass filters 117 B (for example, by connection 195 B), the laser source 101 B, the laser attenuator 163 B, the first laser reference beam detector 161 B, the second laser reference beam detector 160 B, the pulse compressor 102 B, the multi-photon channel counting cards 184 B, 186 B, the mechanical multi-anode PMT holders 125 B, 126 B (for example, by connection 190 B), the variable iris 119 B (such as, for example, by connection 191 B), and the mechanical micro lens array holder 145 B (for example, by connection 192
  • the focal region has a focal pattern variation in xy plane.
  • the foci can be distributed unevenly, e.g., the rows and columns do not have to be spaced uniformly.
  • a system can be built in which there are additional rows and/or columns of PMTs at the outer region of the array of detection tubes. For example, there can be more than 8×8 rows and columns in both the micro lens array and the detector. This is particularly important for detecting scattered photons of the outer foci and for using the information of the scattered photons from the outer foci for deconvolution purposes.
  • an embodiment of the invention provides for a system in which there are more detector elements than there are foci, so that a plurality of detector elements (or detection pixels) collect the photons of one optically conjugated focus.
  • a 16×16 detector array can be used as the detector device, while an array of 8×8 foci is illuminated by an 8×8 multi lens array. Smaller and larger PMT-to-foci ratios can be utilized.
  • a detector array in which one focus is optically conjugated to an uneven number of detector elements can be employed. This is important for detecting scattered photons in the channels neighboring the detector optically conjugated to the focus, and for using the information of the scattered photons for deconvolution purposes.
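For the oversampled case, for example a 16×16 detector behind an 8×8 array of foci, the optical conjugation can be written as a simple index mapping. The 2×2 block assignment below is an illustrative assumption about how the detector elements would be grouped.

```python
def detector_elements_for_focus(fi, fj, ratio=2):
    """Detector (row, col) indices conjugated to focus (fi, fj) when each
    focus maps onto a ratio x ratio block of detector elements."""
    return [(fi * ratio + di, fj * ratio + dj)
            for di in range(ratio)
            for dj in range(ratio)]

# Usage: focus (3, 5) of an 8x8 grid on a 16x16 detector.
elements = detector_elements_for_focus(3, 5)
```

Counts in elements outside a focus's block then directly measure the scattered-photon leakage available for deconvolution.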
  • the image of the sample is formed by scanning in the optical plane (xy) when the intensity signal from the detectors is correlated with the foci positions.
  • the foci scan the specimen in the x direction, then move an increment in the y direction, and then raster in the x direction again until the sample is fully covered at some desired resolution.
  • intensity light signals are recorded by the multi anode PMT. These signals are then saved along with the foci positions in the computer and can be concurrently or afterwards displayed by the computer display or other graphics outputs.
  • the foci positions are known by the scanner position (beam scan) or the sample position (stage scan). The smaller the step increments, the higher the resolution the final image will be.
  • the scanning can be performed in a raster fashion, or in many other ways, such as with time multiplexed methods, or scanning simultaneously at different depths.
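The image-formation steps in the bullets above can be sketched as a minimal raster loop. This is my own illustration with hypothetical names, assuming each focus of a 4×4 array covers its own tile of the sample:

```python
# Minimal sketch of image formation (names are hypothetical): each focus
# of a 4x4 array rasters over its own tile of the sample, and the recorded
# intensity is written into the image at the known focus position.

def raster_scan(sample, n_foci=4, tile=8):
    """Assemble an image by correlating detector readings with foci positions."""
    size = n_foci * tile
    image = [[0.0] * size for _ in range(size)]
    for fy in range(n_foci):              # rows of foci
        for fx in range(n_foci):          # columns of foci
            for y in range(tile):         # y increments within the tile
                for x in range(tile):     # x raster within the tile
                    sy, sx = fy * tile + y, fx * tile + x
                    intensity = sample[sy][sx]   # detector reading at this step
                    image[sy][sx] = intensity    # store at the focus position
    return image

sample = [[(31 * i + j) % 7 for j in range(32)] for i in range(32)]
image = raster_scan(sample)   # reproduces the sample at the scan resolution
```

Smaller step increments within each tile correspond to a finer sampling and hence a higher-resolution final image, as the text notes.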
  • an embodiment of the invention provides for generating and detecting a 3D foci pattern in focal region 154 C.
  • a source of light is directed onto a micro lens array 140 C and a plurality of optical pathways is generated by the micro lens array 140 C in conjunction with lens L 1 ; for instance, in the embodiment illustrated here, the microlens array 140 C splits the excitation beam 104 C into 8×8 multiple beams (i.e., 64 beamlets).
  • only one beamlet 141 C is ray-traced.
  • a specimen 105 C is scanned with an 8×8 array of excitation foci, which includes focus spot 151 C illuminated by beamlet 141 C. The scanning is accomplished by an optical scanner 180 C.
  • the two lenses L 2 and L 3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110 C.
  • the detection light path, 111 C resembles the illumination light path until the light paths are separated by a light reflector, which in this case is a dichroic mirror 130 C.
  • the light is then focused by a common lens L 4 onto the multi anode PMT detector pixel element 121 C.
  • Changing the focal length or the positions of the micro lenses of the microlens array 140 C with respect to each other generates collimated and non-collimated beams at the back aperture of the objective lens 110 C. These beams generate a 3D pattern of foci.
  • the 3D pattern of foci generates light which is collected by the detector array.
  • the positions of the PMT's are changed.
  • one-photon illumination such as is illustrated in the “Option I” detection region 124 C
  • no confocal pinholes are placed in front of the detectors.
  • a plurality of confocal pinholes 128 C are placed in front of the plurality of detector cells.
  • Each of the detection options 124 C, 126 C can be used for single photon and/or for multi photon imaging.
  • the MAPMT, which has the same number of pixels as excitation beamlets, integrates the signal of 8×8 pixels synchronized with the scanning, although other array dimensions can also be used.
  • FIGS. 10 ( a )-( i ) illustrate in greater detail the arrangement and progression of foci corresponding to the relative shifting in position of micro array lenses and MAPMT pixels shown in FIG. 9 .
  • FIG. 10 ( a ) shows an expanded detail of the focal region with 3D foci pattern.
  • FIG. 10 ( b ) illustrates an array of excitation light beams (in this case, an array of 2×8 beams) illuminating a focusing device 110 D, such as, for example, an objective lens, as viewed here in the x-z plane.
  • a focusing device 110 D such as, for example, an objective lens
  • focal points 151 C and 152 C are created for the two beams at certain distances, d 1 and d 2 , respectively, along the optical axis (z-axis). According to the relative angle of the illumination light beams with respect to the optical axis, the array of foci is separated in the optical plane. Controlling parameters of the beams provides selection of a variety of 3D foci distributions. For collimated light, the excitation focus 151 C is at a distance equal to the focal length of the objective, f obj , designated here as distance d 1 .
  • an excitation focus 152 C is at a distance d 2 that is not equal to the objective focal length, as depicted in FIG. 10 ( b ).
  • FIG. 10 ( c ), depicting a “static” view of an x-y plane “slice” at focal depth d 1 , illustrates a first row of 8 foci (of the 2×8 array in this example) all lying at the same focal depth d 1 , understanding that any one of these foci may correspond with the 151 D focus point in the x-z plane view of FIG. 10 ( b ).
  • FIG. 10 ( d ), depicting a “static” view of a second x-y plane at focal depth d 2 , illustrates a second row of 8 foci (of the 2×8 array) all lying at the same focal depth d 2 , any one of which foci might correspond with the 152 C focus point in the x-z plane view of FIG. 10 ( b ).
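The relationship between beamlet collimation and the depths d 1 and d 2 can be sketched with a thin-lens model. This formula is my own assumption, not given in the patent: a collimated beamlet focuses at the objective focal length, while a beamlet diverging from a virtual source at a finite distance s focuses beyond it. The 9 mm focal length is taken from the objective specification listed later for Examples A and B.

```python
# Thin-lens sketch (my assumption; the patent does not give this formula):
# a collimated beamlet focuses at the objective focal length f_obj (d1),
# while a beamlet diverging from a virtual source at finite distance s
# focuses beyond it (d2). The 9 mm focal length is from the objective
# specification in FIG. 25(a).

def focal_depth_mm(f_obj_mm, source_dist_mm=None):
    """Image distance of a beamlet; None denotes a collimated beamlet."""
    if source_dist_mm is None:
        return f_obj_mm                                   # collimated -> d1 = f_obj
    return 1.0 / (1.0 / f_obj_mm - 1.0 / source_dist_mm)  # diverging -> d2 > f_obj

f_obj = 9.0                                       # mm, objective focal length
d1 = focal_depth_mm(f_obj)                        # collimated beamlet
d2 = focal_depth_mm(f_obj, source_dist_mm=900.0)  # slightly diverging beamlet
```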
  • FIG. 10 ( f ) shows a 3D, 8×8 beam matrix (64 beams) of excitation foci that have been generated.
  • FIG. 10 ( g ) illustrates an x/y view in a scanning configuration, where each line of foci is scanned in the xy plane to cover the whole yx image in its particular z-plane. A number of xy planes are shown simultaneously, but actually each plane lies at a different focal depth on the z-axis.
  • FIG. 10 ( h ) illustrates an x/y view in a scanning configuration, where each line of foci is scanned in the xy plane to cover the whole yx image in its particular z-plane. A number of xy planes are shown simultaneously, but actually each plane lies at a different focal depth on the z-axis.
  • FIG. 10 ( i ) provides a view of a section in the yz plane, illustrating the scanning configuration while the z/y scan is performed, i.e., scanning along the y axis and through multiple depth layers in z.
  • a 3D volume can be imaged by only scanning the foci array in xy.
  • the x/y, x/z and y/z coordinates illustrate the associated planes; they can be displayed with an arrow in their positive or negative direction.
  • 3D AM-PMT MMM can be used in multi photon endoscope device in accordance with another preferred embodiment of the invention (to be added).
  • a further embodiment provides for a 3D cytometer, based on multi-focal, multi-photon microscope with a multi anode PMT detector.
  • a 10 W solid state pump laser 100 D pumps a titanium sapphire laser 101 D (Millennia X & Tsunami, Spectra Physics, Mountain View, Calif.), which generates maximum output power of 2.5 W at 800 nm, and 120 fs pulses at a repetition rate of 76 MHz.
  • the light is conducted through two reflectors 137 D, 138 D and then passes through a first telescope 103 D, two additional reflectors 139 D, 133 D, an attenuator 163 D, and a second telescope 203 D.
  • After passing through another reflector 136 D, the light is subsequently split into an array of beams by the micro lens array 140 D and is transmitted by lenses L 1 , L 2 and L 3 onto the back aperture of the objective lens 110 D, thus creating multiple foci in the focal plane.
  • the micro beams are scanned by a xy-scanner 180 D (Cambridge Technologies, Cambridge, Mass.).
  • the fluorescence is collected by the same lenses and separated from the illumination light by a dichroic filter 130 D and a two-photon block filter 116 D.
  • the fluorescence passes through lens L 4 and is then separated into two spectral channels by the dichroic filter 134 D and directed onto the multi-anode PMTs 124 D, 120 D.
  • the degree of spectral separation can be chosen depending upon the application.
  • the embodiment disclosed here uses a red/green and a green/blue filter to accomplish the spectral separation.
  • the variation of the magnification of the telescope 203 D enables the utilization of, for example, 4×4, 6×6 or 8×8 arrays of micro lenses, among other size arrays.
  • FIG. 11 ( b ) illustrates an image of the array of foci in the focus of the objective lens, such image as can be taken by a CCD camera, where here the foci are not scanning.
  • the foci are 45 µm apart, resulting in a potential scanning field of 240 µm when 6×6 foci are utilized.
  • FIG. 11 ( c ) shows a z-profile and a corresponding fit function of a 200 nm bead.
  • the system shows a resolution of 2.4 µm, which is close to the theoretical value of 2.2 µm, considering the under-filling of the back aperture of the objective lens. Acquisition speed for this scanning profile was 10 frames per second. The profile is averaged over 5 consecutive pixels, reducing the sampling from 30 nm per pixel to 150 nm per pixel.
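The 5-pixel averaging mentioned above can be written out as a simple binning step. This is an illustrative sketch with a hypothetical function name and dummy profile data:

```python
# The 5-pixel averaging written out as a binning step (illustrative sketch;
# the function name is my own). Averaging every 5 consecutive pixels
# coarsens the sampling from 30 nm to 150 nm per pixel.

def bin_profile(profile, n=5):
    """Average every n consecutive samples of a 1D intensity profile."""
    usable = len(profile) - len(profile) % n        # drop the remainder
    return [sum(profile[i:i + n]) / n for i in range(0, usable, n)]

pixel_size_nm = 30
binned_pixel_size_nm = pixel_size_nm * 5            # 150 nm per binned pixel
profile = list(range(20))                           # dummy z-profile samples
binned = bin_profile(profile)                       # 4 averaged samples
```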
  • In FIG. 12 ( a ), a further embodiment of the invention provides for multi color detection MMM in the xz-plane.
  • An array of 2×8 beams is generated by the micro lens array 140 E.
  • the setup here is illustrated with two 1×8 beam lines. The distance between the foci in each line is determined by the combination of the source beam configuration and the micro lens array 140 E.
  • Two light beams are conducted through the micro lens array 140 E and intermediate optics onto the focal plane of the microscope, in which they create two lines of 1×8 foci.
  • In FIG. 12 ( a ), only 3 of the 16 beam traces in a 2×8 setup are illustrated.
  • the full field is then scanned by the mirror oscillation of the scanning mirror 180 E, in which the scanning amplitudes need to be adapted to the distances of the foci.
  • a holographic diffraction grating 192 E is incorporated that diffracts the multiple wavelengths emitted from the sample onto the photo-multiplier arrays of two stacked multi-anode PMTs 120 E, 124 E.
  • the two multi-anode PMTs will be stacked on top of each other, each serving as a spectral detection device for one line of 1×8 foci.
  • the grating 192 E properties (pitch/inch) and the focal length of the focusing lens (L 4 ), which determines the distance between the grating and the multi-anode PMT, have to be chosen in accordance with the anticipated fluorescent probes used for staining the tissue sample.
  • a transmission grating 192 E is used.
  • FIG. 12 ( b ) illustrates the illumination foci and their scanning in the focal xy-plane. Scanning is indicated for two arrays of 8 foci each.
  • FIG. 12 ( c ) shows the detection path of two beams projected in the yz-plane through grating 192 E and lens L 4 onto the stack of two AM-PMTs 120 E, 124 E.
  • FIG. 12 ( d ) shows the detection path projected in the xz-plane, where the beams are depicted passing through grating 192 E and lens L 4 with each of eight color bands being collected by the two AM-PMTs 120 E and 124 E.
  • FIG. 12 ( e ) illustrates the anodes of the multi-anode PMTs 120 E, 124 E in the x/y plane, showing that the 8×8 anode arrays of the detectors each detects one of the two 1×8 beam lines, where each 1×8 beam line has been diffracted by the grating 192 E into eight color bands.
  • a beam splitter device 400 can be used to create a homogenous intensity profile over a plurality of beamlets. Depending upon its design, the beam splitter splits one beam into 256, 128, 64, 36 or 16 approximately equally powered beams by one or more fully reflective or semitransparent mirrors. In FIG. 13 , 50% and 100% indicate the percent reflectance of the mirrors used, where a series of fully reflective mirrors 420 with one longer semi-transparent mirror 410 splits the beam in the x-plane (BS-X). By combining two such cubes in series, it is possible to generate a 2D array of beamlets.
  • BS-X x-plane
  • the internal optics of a second beam-splitting cube for the y-plane are the same as for x-plane beam splitting cube.
  • the beams are then focused by micro lens 430 (or via other multifocal optics) through lens 432 and objective lens 434 onto the focal plane.
  • FIGS. 14 ( a )- 14 ( d ) illustrate additional preferred embodiments for providing multifocal illumination. In FIG. 14 ( a ), a micro-lens array 140 N provides the plurality of beams from expanded beam 201 N; in FIG. 14 ( b ), a diffractive optical element 205 N separates beam 201 N into a plurality of beams which are coupled to focal locations as previously described herein.
  • a plurality of optical fibers 220 N can be used to provide a plurality of beams with lens L 1 for delivery to the focal locations or spots.
  • the fibers 220 N can position beams in different directions for smaller or greater focal separation.
  • CARS coherent anti Stokes Raman scattering
  • confocal microscopy can be performed with the same instrument.
  • FIGS. 15 ( a )- 15 ( d ) illustrate further preferred embodiments for use with detectors which can be a multi anode PMT or an array of single detectors, connected via optical fiber.
  • the detectors can be PMT's or avalanche photo diodes, or the detector array can be a combined device (like a multi anode PMT), connected via optical fiber, an avalanche photon diode array, a CMOS imaging detector, or a CCD camera in which each pixel or each area of binned pixels is correlated to one focus, or a CCD camera in which more than one pixel or more than one binned pixel area is correlated to one focus.
  • the detector 210 P can be coupled directly to optical fibers 220 P which receive light from lens L 1 . As shown in FIG. 15 b, individual detectors 210 P can collect at different angles, or as seen in FIGS. 15 c and 15 d, a detector array 212 P can detect at the same or different angles respectively.
  • the optimal number of foci for a two photon excitation process at a certain laser power for samples with different damage thresholds can be determined.
  • the optimal number of foci will depend on (i) the damage threshold, (ii) the quadratic dependency of the two-photon signal to the laser power, and (iii) the limited amount of laser power.
  • the laser power is limited to 1.2 W at the sample, while the damage threshold of the sample can be, for example, 10 mW, 20 mW or 50 mW.
  • the optimal number of foci is 120, 60 and 24 respectively.
  • the appropriate power level for two-photon imaging is constrained by two basic boundary conditions: (a) the minimum accepted signal-to-noise ratio determines the minimum power that can be used, whereas (b) the damage threshold of the sample determines the maximum power.
  • the limited laser power is distributed over a large number of foci. The best signal is obtained when the number of foci is chosen in a manner such that each of the foci delivers a power level just below the damage threshold for the sample. The relationship is illustrated in FIG. 16 . It is possible to obtain less signal from the sample as more foci are used, owing to the squared dependence of signal on laser power. A judicious choice of power levels and of number of foci must be made in order to obtain optimal results. Therefore, a preferred method and system provides for a versatile system in which the number of foci can be varied with respect to the sample threshold. The threshold can be different for different penetration depths into the sample and can therefore be adjusted by the attenuator 163 B.
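The arithmetic behind the foci counts given above can be made explicit. This is a worked check using the values from the text; the helper function itself is my own:

```python
# Worked check of the numbers given above (the helper function is my own):
# with 1.2 W available at the sample, the optimal number of foci puts each
# focus just at (or below) the per-focus damage threshold.

def optimal_foci(total_power_mw, damage_threshold_mw):
    """Largest number of foci with per-focus power <= damage threshold."""
    return total_power_mw // damage_threshold_mw

total_power_mw = 1200  # 1.2 W at the sample
results = {t: optimal_foci(total_power_mw, t) for t in (10, 20, 50)}
# thresholds of 10, 20 and 50 mW give 120, 60 and 24 foci, respectively
```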
  • Time multiplexed illumination and detection enable MMM microscopy with one detector only, which is gated to the excitation light pulse.
  • the illumination light source is a pulsed laser.
  • a Ti:Sa Laser with a repetition rate of approx. 80 MHz and a pulse width of approx. 100-200 fs as an example.
  • all beams carry the same pulse distribution along time.
  • the array of excitation foci in the focal region is formed simultaneously.
  • the beam or sample is scanned on both axes perpendicular to the optical axis; here indicated by x ⁇ 0.
  • detection elements are collecting light from 18 simultaneously illuminated spots.
  • the delay between the illumination pulses 502 is alternated between foci shown at 504 .
  • This configuration can also be realized with more detection channels per simultaneously illuminated focus.
  • the beam or sample is scanned on both axes perpendicular to the optical plane; here indicated at 506 by x ⁇ 0.
  • the light in non-corresponding detection channels has an additional time delay relative to light from the foci corresponding to the detector channel 508 that is receiving light at a particular time.
  • the resulting signals may not overlap, or may overlap only minimally, and thus can be registered to the proper foci directly.
  • the temporal separation will aid numerical registration and deconvolution algorithms.
  • the response in the neighboring non-corresponding detectors can be used to generate additional information about the sample.
  • the temporal delays introduced into the illumination foci mean that this supersampling condition exists even when the number of detectors is the same as the number of foci.
  • an alternating excitation foci pattern can be detected by a multichannel detector with a smaller number of elements than the number of foci.
  • a time multiplexed MMM illumination and a detection in a single channel can be used.
  • the repetition rate of the laser is a hundred times lower than in previous examples.
  • each beam carries a pulse which is temporally separated with respect to the pulses of the other beams. In one particular case, they are separated evenly over the time period of one laser repetition, so that at evenly distributed time points, a single focus is illuminated at a time. If a fast detector is correlated with the pulse distribution and capable of detecting each pulse separately during this short time period, an MMM with only one detection element can be used.
  • This detection element has a corresponding detection area to collect light which is generated by each individual focus during its scan.
  • the beam or sample is scanned on both axes perpendicular to the optical axis.
  • optical cross-talk is completely eliminated, as the light from the different foci is excited and detected at different time points.
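The timing described above can be sketched numerically. This is my own illustration, assuming the approximately 80 MHz Ti:Sa source reduced by a factor of 100 and an 8×8 = 64-focus array; all variable names are hypothetical:

```python
# Numerical timing sketch (my own illustration, assuming an ~80 MHz Ti:Sa
# source with repetition rate reduced by a factor of 100, and an
# 8x8 = 64-focus array).

rep_rate_hz = 80e6 / 100       # effective repetition rate: 800 kHz
period_s = 1.0 / rep_rate_hz   # one laser period: 1.25 microseconds
n_foci = 64                    # 8x8 array of excitation foci

def pulse_time(k, period=period_s, n=n_foci):
    """Time offset of the pulse illuminating focus k within one period."""
    return k * period / n

slot_s = pulse_time(1)         # spacing between consecutive foci (~19.5 ns)
```

Because only one focus is illuminated in each ~19.5 ns slot, a single gated detector can attribute every detected pulse to the correct focus, which is why optical cross talk is eliminated in this scheme.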
  • excitation processes which appear instantly, like scattering effects (such as Second Harmonic Generation (SHG) or Coherent Anti-Stokes Raman Scattering (CARS)).
  • SHG Second Harmonic Generation
  • CARS Coherent Anti-Stokes Raman Scattering
  • an array of 3×3 beams illuminates the focusing device, forming an array of 3×3 foci ( FIG. 18 b ).
  • the scattering distribution of the foci is imaged by the CCD camera.
  • the wide-field image is accumulated per scanned illumination point.
  • the scattering distribution of each focus can be recorded on many CCD pixels.
  • the illumination or sample can be scanned and the wide-field data can be further processed to form an image or statistical representation from many object points. This configuration can be employed in a non-de-scanning configuration as well.
  • the above described systems and methods can be used for imaging of all semi-transparent and highly scattering materials; 2D and 3D, and in particular for imaging of human and animal tissue, cells in suspension, plants and other biological material.
  • the illumination can be achieved with visible light and alternated with the MMM scanning measurement or out of band illumination light can be used and the camera measurement can be taken simultaneously with the MMM measurement.
  • This configuration can be used for large field imaging, sample guided MMM measurements, conventional staining measurements, and online MMM measurement process control, for example, bubble formation monitoring, and laser spot diagnostics.
  • There is a large variety of fluorescent dyes that can be used with various embodiments of the invention. In general they fall into two families: dyes that have to be applied to stain the tissue “from the outside”, and dyes that are expressed by animals as proteins. Dyes most commonly used for external staining include MitoTracker Red, DAPI, Hoechst 33342, Cy2-IgG, Alexa Fluor 546, Rhodamine 123, Alexa Fluor 488, FITC-IgG, Acridine Orange, Cy3-IgG, Lissamine Rhodamine, Texas Red-Phalloidin, Texas Red-IgG, Alexa Fluor 594, Propidium Iodide.
  • GFP green fluorescent protein
  • YFP Yellow fluorescent protein
  • Enhanced YFP. Auto-fluorescence imaging does not use a particular dye, but can be used as part of an imaging technique.
  • confocal microscopy fluorescent, as well as reflected light confocal
  • these include all other multi-photon microscopy techniques, such as 2, 3, or more photon excitation microscopy; Second (SHG), Third (THG), or higher Harmonic Generation microscopy; Coherent Anti-Stokes Raman Scattering (CARS) microscopy; multi photon quantum dot imaging; surface plasmon imaging; and Stimulated Emission Depletion (STED) Microscopy.
  • SHG Second
  • THG Third
  • CARS Coherent Anti Stokes Raman Scattering
  • STED Stimulated Emission Depletion
  • FIG. 19 illustrates a probe or endoscope apparatus according to an embodiment of the invention, having a handle portion 272 F and an insertable probe portion 270 F, wherein light delivered from a light source 244 F (which can be a laser or other light source) is delivered through an optical wave guide 234 F (such as, for example, optical fiber, hollow fiber optics, photonic band gap fiber, or other wave guide) to an optical connector 224 F (such as, for example, a pigtail), whereupon an expanded beam 104 F passes through a lens or optionally through lens pair telescope 103 F and then through a micro lens array, or other optical device that creates a plurality of optical pathways, 140 F.
  • a light source 244 F which can be a laser or other light source
  • an optical wave guide 234 F such as, for example, optical fiber, hollow fiber optics, photonic band gap fiber, or other wave guide
  • an optical connector 224 F such as, for example, a pigtail
  • the illumination path can then pass through lens L 1 , dichroic 130 F and lenses L 2 and L 3 onto the rear aperture of the objective 110 F
  • the beam is made to scan by scanner 180 F which can tilt in the x and/or y directions, and the return fluorescent signal is directed by dichroic 130 F and reflector 136 F, optionally an IR block filter 116 F through lens L 4 and optionally a band pass filter 117 F onto a multi-anode PMT detector 120 F.
  • a plurality of confocal pinholes 119 F are placed in front of the plurality of detector cells.
  • the detector 120 F can be connected to a controller 170 F and to an image processing computer 176 F.
  • the scanner 180 F can also be controllably connected by electrical connector 193 F to a controller 170 F and/or computer 176 F.
  • the proportions of the endoscope have a relationship to the focal distances of the micro lens array, f m , and of the lenses L 1 -L 4 , being f L1 , f L2 , f L3 and f L4 , as well as to the offset distances d 1 and d 2 , and the size will be related to the relative size of the various elements.
  • the active area, relative proximate orientation of active detector elements such as, for example, the active area of multiple anode photomultiplier tube detector elements
  • the distance of the foci and the intermediate optics have an important relationship to the effectiveness of detecting scattered light from one or more light spots in a sample specimen, as explained in the following.
  • FIG. 20 ( a ) depicts one PMT and one excitation focus, and the direction of scattered light with respect to the detection light cone of the active detection area will control whether or not the photon will be detected.
  • FIG. 20 ( a ) provides an illustration of how the size of the active detection area relates to scattered light detection from a spot created by multi photon excitation, as follows: An illumination light beam 204 G coming from the left (parallel solid lines), generates a multi photon excitation light spot 251 G (so-called excitation point spread function) in the sample 105 G, in which the structure causes a multi-photon excitation process.
  • light is generated according to the multi-photon excitation principle and scattered on its path (such as, for example, an auto-fluorescent tissue sample).
  • the potential detection path is illustrated by the very thin bounding lines enclosing the stippled shaded region, which are geometrically determined by the side boundaries of the active PMT detection area 222 G of the detector 120 G.
  • all the photons that propagate within the detection cone, indicated by the shaded region, and that travel in the direction of the detector 120 G, are collected in the active detection area 222 G. Photons that propagate in the opposite direction, or that are scattered outside of the detection cone defined by the optics and thus outside of the active detection area 222 G of the detector 120 G, are not detected.
  • This is depicted in FIG. 20 ( b ) as well, where the detection area 224 G (dashed box) in the sample focal plane 210 G corresponds to the active detection area 222 G, while a potential scattering region 254 G (circle) extends beyond the confines of the detection area 224 G.
  • Three examples of photon paths are shown in FIG. 20 ( b ).
  • Light generated in the spot 251 G is detected by the same objective lens 110 G. It can also be detected by an opposing objective lens and collected by a detector whose detection area is conjugated to the light spot at the opposing side. In that case, photons in the detection cone of the second, opposing lens, traveling in the direction of the incident light, are collected into the opposing detector as well.
  • the gap 232 G (also marked as “g”) between the active detection area 222 G of detector element 120 G and the active area of the second element 124 G will correspond to the gap 236 G in FIG. 21 ( b ) between two detection areas 224 G, 226 G in the sample focal plane.
  • the photon 281 G that is scattered beyond the detection cone for active detector area 222 G of detector 120 G can follow photon path 282 G into the adjacent detector 124 G.
  • Scattered light detection from a spot 251 G created by multi photon excitation is performed by two large area detectors 120 G, 124 G, positioned next to each other and separated by a distance:
  • the unscattered photon 261 G and the first scattered photon 271 G are still collected by the active detection area 222 G of the first detector 120 G.
  • the second scattered photon 281 G is not lost, but is collected by the second detector 124 G. This effect of light being scattered into detectors other than the optically conjugated detectors is termed “optical cross talk”.
  • FIG. 22 ( a ) illustrates scattered light detection with two PMTs 120 G, 124 G and two excitation foci 251 G, 252 G, where again the issue of “optical cross talk” is relevant.
  • a second illumination light path 206 G at an angle θ 1 with respect to the first illumination light path 204 G creates a second focus 252 G (excitation PSF) at a distance δ from the illumination light spot 251 G.
  • In FIG. 22 ( a ), only one unscattered photon 291 G is illustrated in order to simplify the drawing.
  • This photon 291 G, originating from 252 G, follows optical path 292 G into detector 124 G. (Although the illustration includes a collimating lens between the reflector and the detector, and a refraction in path 272 G by said lens is depicted, owing to constraints in the size of the drawing and to better illustrate the features emphasized here as aspects of an embodiment of the invention, no refraction in the paths of 282 G and 292 G is depicted in this illustration.)
  • Light originating from the second light spot 252 G will be collected by the second detector 124 G, but also by the first detector 120 G, because photons from the second light spot 252 G are scattered similarly to photons from the first light spot 251 G.
  • the invention provides for reducing optical cross talk by increasing the gap distance between the excitation foci, this gap distance depicted as Δ, and simultaneously increasing the gap 232 G between the active detection areas in the detection elements.
  • the second illumination light path 206 G is separated further, by an angle θ 2 > θ 1 , from the first light path 204 G, generating an illumination light spot 252 G in a location that is a larger distance Δ > δ from the illumination light spot 251 G.
  • the detectors 120 G, 124 G are also separated from each other by increasing gap 232 G between the active detector areas to a value “G”, where G>g, such that the second unscattered photon path 282 G no longer falls into the second detector 124 G.
  • FIG. 23 ( b ) illustrates this by showing no overlap between scattering regions 254 G, 256 G and the neighboring detection areas 226 G, 224 G, respectively.
  • optical cross talk is reduced because fewer photons end up in the “wrong” channel; however, some scattered photons will not be detected because their paths will pass between the active detection areas of more widely separated detectors. Thus, detection light (signal) is lost.
  • a preferred embodiment of the invention provides for reducing optical cross talk without inducing signal loss, by increasing the distance between the excitation foci and simultaneously increasing the active detection area of the detector elements. By separating the foci and the associated detectors, the optical cross talk is reduced. By increasing the active area of the detectors, most scattered photons are collected. In FIG. 24 ( a ) this is depicted by the second scattered photon path 282 G being collected by its corresponding detector 120 G.
  • FIG. 24 ( b ) shows that the expanded detection areas 226 G can encompass the scattering regions.
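The trade-off between cross talk and signal loss can be illustrated with a one-dimensional geometric model. This is my own sketch, not the patent's calculation: scattered photons from a focus are taken to land uniformly over a region of radius r in the conjugate detection plane, each detector has active half-width w, and neighboring detectors are centered a distance delta apart.

```python
# Simple 1D geometric model (my own sketch, not the patent's calculation):
# photons from a focus land uniformly over [-r, r] in the detection plane;
# the conjugated detector's active area spans [-w, w], and the nearest
# neighboring detector spans [delta - w, delta + w].

def overlap(lo1, hi1, lo2, hi2):
    """Length of the overlap of two intervals."""
    return max(0.0, min(hi1, hi2) - max(lo1, lo2))

def fractions(r, w, delta):
    """(fraction in correct detector, fraction in nearest neighbor)."""
    total = 2.0 * r
    own = overlap(-r, r, -w, w) / total
    neighbor = overlap(-r, r, delta - w, delta + w) / total
    return own, neighbor

# Widening the detectors together with the foci spacing preserves signal
# while suppressing cross talk:
own_small, xtalk_small = fractions(r=2.0, w=1.0, delta=2.0)  # cross talk present
own_big, xtalk_big = fractions(r=2.0, w=2.0, delta=4.0)      # cross talk removed
```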
  • changing the optical configuration of the apertures and focal length of the lenses in the optical system can create the same effect.
  • Changing the aperture and the focal length of the micro lens array, increasing the area of the scan mirror, or changing the aperture and the focal length of the lenses L 1 , L 2 , L 3 and L 4 has a similar effect of reducing cross talk without loss of signal.
  • An example case is presented in tabular format in FIGS. 25 ( a )-( e ).
  • The optical configurations for two alternative embodiments of the invention, Example A and Example B, are presented in FIGS. 25 ( a )-( e ).
  • FIG. 25 ( a ) lists the objective lens specifications, which are the same for both Examples A and B (i.e., Olympus, 180 mm tube lens, XLUMPLFL 20× magnification objective; water immersion; 0.95 numerical aperture; 17.1 mm back aperture; 2 mm working distance; 9 mm focal length; 22 mm field number; and 1.1 mm corrected field).
  • FIGS. 25 ( b )-( c ) list the details of the illumination path. A different micro lens array is described for each embodiment, but in both the micro lenses are square shaped.
  • In Example A, the side aperture pitch of each micro lens is 1.06 mm and the diagonal is about 1.5 mm.
  • the 8 lenses per side create a side aperture of 8.48 mm and a diagonal aperture across the array of 11.99 mm.
  • the focal distance of each micro lens is 17 mm in Example A.
  • In Example B, the aperture or pitch of each micro lens in the array is 1.9 mm and its focal distance is 25 mm.
  • the focal lengths of the lenses L 1 and L 2 are 50 mm and 103.77 mm, respectively, in Example A, while in Example B they are 40 mm and 46.32 mm, respectively. When standard optical components are used, they can approximate this lens with a focal distance of 100 mm and 45 mm in the configurations A and B, respectively. In both the embodiments of Example A and B, the focal lengths of lenses L 3 and L 4 are 30 mm and 125 mm, respectively.
  • the diameter of illumination of the back aperture of the objective lens remains approximately constant for Examples A and B, at 13.0 mm and 12.7 mm, respectively. This results in an ‘under-illumination’ of the back aperture of the objective lens, which is 17.1 mm in diameter. This is desirable because it ensures optimal (maximal) use of the illumination light power.
  • Example A and B achieve different distances between excitation foci in the optical plane:
  • Example A has a foci distance of 46 microns, whereas Example B has a foci distance of 103 microns.
  • the total optical field, listed below the foci distance, results from the fact that in this particular case an 8×8 configuration of foci is chosen. When the foci are scanned, it has a square side of 366 microns for configuration A and 821 microns for configuration B.
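As a rough check of the listed fields, the total scanned field side is approximately the foci distance multiplied by the number of foci per side, since each focus raster-scans one cell of the array. This is a minimal sketch using the values quoted in the text; the small residuals reflect rounding in the listed figures:

```python
# Sanity check: total field side ~ foci spacing * foci per side.
# Values are taken from the text; the spacing * n rule is an assumption.
n = 8                                   # 8 x 8 foci array
spacing_a, spacing_b = 46.0, 103.0      # foci distance, microns (A, B)
field_a, field_b = 366.0, 821.0         # listed total field, microns (A, B)

est_a = spacing_a * n                   # 368 microns vs. 366 listed
est_b = spacing_b * n                   # 824 microns vs. 821 listed
assert abs(est_a - field_a) < 5
assert abs(est_b - field_b) < 5
```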
  • FIG. 25 ( e ) lists the details of the detection path for each example.
  • These alternative examples, A and B can be created according to the layouts of either FIG. 7 or FIG. 8 , according to embodiments of the invention. In both cases the size of the MAPMT can remain constant at about 2 mm ⁇ 2 mm. Employing a MAPMT with larger detection elements can increase the detection efficiency.
  • FIG. 26 ( a )and ( b ) illustrate the foci distribution in the focal plane of the objective lens for the embodiment Examples A and B, respectively.
  • the conjugated detection area of each channel of the multi anode PMT is larger by a factor of 5 in Example B than in Example A.
  • the optimal distance between the foci is influenced by three factors: (1) the optimal number of foci that are needed to generate as much light as possible (this number can be distinguished in accordance with the graph in FIG. 16 ); (2) the corrected field of the focusing device, such as an objective lens (the larger the corrected field, the further the foci can be separated from each other and the more the optical cross talk can be reduced); and (3) the numerical aperture (NA) of the objective lens for high resolution imaging (the larger the numerical aperture of the objective lens, the more photons can be collected and the better the images are). Nevertheless, there is a compromise between the NA of the lens and its effective field of view. Therefore, the objective lens used in a most preferred embodiment of the invention has a large NA of around 1.0 or greater and is capable of imaging a large effective field of view, preferably of approximately 1-6 mm.
  • a large field objective provides an advantage for certain embodiments of the invention, because the foci can be further separated.
  • An objective lens with large field of view enables large separation of foci and thus reduces optical crosstalk.
  • In FIGS. 26(c) and (d), two objective lenses with different fields of view are shown: a 600 micron objective field versus a 6000 micron objective field, respectively.
  • the conjugated detection area of each channel of the multi anode PMT associated with the focal plane within this field of view is a factor of 100 larger for the objective of FIG. 26(d) than for that of FIG. 26(c).
  • Commercially available objective lenses with a large numerical aperture (NA) of around 1 and above usually have a field of view for which they are corrected of between around 200 microns (100× objectives) and 1000 microns (20× objectives).
  • For the Olympus XLUMPLFL20× water immersion objective mentioned above and used in embodiment Examples A and B, when an array of 8×8 foci is employed, the optimal distance between the foci is 111.11 microns and the total field imaged is approximately 1000 microns.
  • the active detection area of the different detector channels in the MAPMT is approx. 2 mm and is limited by the commercially available MAPMT devices. If this area is increased, the optical cross talk is reduced and the light collection efficiency is increased.
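One plausible reading of the quoted optimum, sketched numerically; the field/(n+1) spacing rule below is an assumption used only to reproduce the quoted number, not a formula stated in the text:

```python
# Hedged sketch: with an 8 x 8 foci array in a ~1000 micron corrected
# field, spacing the foci at field / (n + 1) leaves an edge margin on
# both sides and reproduces the quoted 111.11 micron foci distance.
field_um = 1000.0
n = 8
spacing = field_um / (n + 1)   # = 111.11... microns
assert abs(spacing - 111.11) < 0.01
```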
  • Step 310 H can include initiating a computer program and/or software application automatically as part of a data acquisition step in a computer that is connected directly to the imaging apparatus and/or can include a series of human-supervised data-processing steps.
  • the data processing can be automatically initiated by the computer and proceed entirely automatically according to a data-processing control software application, and/or the program may proceed semi-automatically with opportunities for human supervision and intervention in one or more of the data processing steps.
  • An embodiment of the data-processing method follows the start step 310 H with a next step to load the image data 312 H.
  • Metadata about the data includes, inter alia and without limitation: foci number, pixel dimensions; pixel spacing; channels; instrument parameters (including, without limitation, optics, objective, illumination, wavelengths, beam-splitting, phasing, polarity, light pumping, pulse compression, chirping, upconversion, dispersion, diffraction, source-light properties, source light stability, source attenuation, reference scanning, micro lens configuration and properties, focal lengths, filter types and positioning, detection configuration, detector type, detector active area, detector sensitivity and stability, and other detector specifications and properties, inter alia); sample properties and sample information, such as, for example, for biological samples (including biological and non-biological information, such as, for example, tissue type, specimen type, size, weight, source, storage, tracking, scattering properties, stain/dye type, specimen history, and other physical properties of the specimen) or sample properties and sample information for chemical and/or physical material samples; and scanner data, including, without limitation,
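The metadata described above could be organized, for illustration only, as a simple record. All field names and values below are hypothetical assumptions, not a schema defined by the invention:

```python
# Hedged sketch: one possible (assumed) organization of the image
# metadata listed in the text, as a plain nested dictionary.
metadata = {
    "foci_number": (8, 8),                 # e.g., an 8 x 8 foci array
    "pixel_dimensions": (192, 192),
    "pixel_spacing_um": 0.5,               # hypothetical value
    "channels": 1,
    "instrument": {
        "objective": "Olympus XLUMPLFL 20x, NA 0.95, water immersion",
        "micro_lens_focal_mm": 17.0,       # Example A value from the text
        "detector_type": "MAPMT",
        "detector_active_area_mm": 2.0,
    },
    "sample": {"tissue_type": "brain", "scattering_mfp_um": 200.0},
}
assert metadata["instrument"]["detector_type"] == "MAPMT"
```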
  • The next step 314 H is deconvolution of the image data, which is described further below.
  • the data can be saved in an optional step 316 H, whereupon the post-processing can optionally be stopped 318 H.
  • An embodiment also allows the processing to continue to a next step 320 H that comprises performing an intensity normalization on the data, which normalization steps are described in more detail below, then optionally saving the data (step 322 H), and stopping the data processing sequence (step 324 H).
  • Step 332 H can include accessing metadata from a storage device, including here by reference all the description of possible metadata described above for the steps illustrated in FIG. 27 ( a ).
  • Step 334 H can include normalizing, filtering (de-noising), and blending (integrating) of multifoci subimages, and further can include registering subimages into a single image.
  • Step 336 H can include filtering and normalizing images produced from corrected subimages.
  • Step 338 H can include registering, building mosaics, and blending sets of corrected images into a larger whole. Also, optionally, at this step 338 H, an embodiment of the method of the invention provides for creating lower resolution images of the larger image to facilitate access, as well as images from different perspectives (such as, image views taken of the xy-, xz-, and/or yz-planes) and creating data-compressed versions of the data and/or results (e.g., JPEG, wavelet compression, inter alia without limitation).
  • Step 340 H can include segmenting images into objects, which segmentation step can either be manual, automated or a combination of both.
  • Step 342 H can include parameterizing the objects, samples or specimens (such as, for example, size, shape, spectral signature).
  • Step 344 H can include classifying objects into higher order structures/features (e.g. material stress or cracks, vasculature, nuclei, cell boundaries, extra-cellular matrix, and location, inter alia, without limitation).
  • Step 346 H can include statistically analyzing parameterized objects (such as, for example, by correlation methods, principal component analysis, hierarchical clustering, SVMs, neural net classification, and/or other methods).
  • Step 348 H can include presenting results to one or more persons on one or more local or distant display devices (examples include: 3D/2D images, annotated images, histograms, cluster plots, overlay images, and color coded images, inter alia).
  • the post-processing steps can include a number of substeps, including, among others, those illustrated in FIG. 27(b) and described above, without limitation.
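The substeps of FIG. 27(b) can be pictured as a chain of array transformations. The following minimal sketch uses placeholder implementations (assumptions, not the patent's algorithms) for blending subimages, intensity normalization, and creation of a lower-resolution overview:

```python
import numpy as np

def blend_subimages(subimages):
    """Blend an n x n grid of registered subimages into a single image
    (placeholder: simple tiling; real blending would also register and
    feather the seams)."""
    rows = [np.concatenate(row, axis=1) for row in subimages]
    return np.concatenate(rows, axis=0)

def normalize(image):
    """Intensity-normalize an image to its mean (placeholder)."""
    return image / image.mean()

def downsample(image, factor=2):
    """Create a lower-resolution version of a larger image to
    facilitate access (placeholder: simple decimation)."""
    return image[::factor, ::factor]

# 2 x 2 grid of 32 x 32 subimages -> 64 x 64 mosaic -> 32 x 32 overview
subs = [[np.ones((32, 32)) for _ in range(2)] for _ in range(2)]
mosaic = normalize(blend_subimages(subs))
overview = downsample(mosaic)
assert mosaic.shape == (64, 64) and overview.shape == (32, 32)
```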
  • FIGS. 28(a) and (b) relate to image normalization.
  • the normalized signal distribution resembles the normalized power map squared and shows an intensity drop of 45% toward the corner PMTs with respect to the center PMTs.
  • the laser power was attenuated to 75.3 mW in the sample and can reach a maximal value of approx. 645 mW, corresponding to a power of approx. 15 mW for the center foci in the sample.
  • A measured intensity profile can be generated by imaging a homogeneously distributed fluorescent dye under a cover slip. The intensity measurement is shaped not only by the power/intensity distribution of the foci, but also by the sensitivity of the detector array. As a result it represents the “true” measured intensity distribution.
  • the image consists of 192 ⁇ 192 pixels and was generated by an array of 6 ⁇ 6 foci which were scanned across a uniform fluorescent dye sample.
  • Case 1: The normalized inverse of this intensity image (from a uniform fluorescent dye) is multiplied with the xy images taken of the sample. The resulting images are then displayed and saved as normalized images.
  • Case 2: A large number of images from a sample at various positions (and thus with a random underlying intensity structure) is averaged. This average image is then inverted and normalized. The normalized inverse, multiplied with the original data, is then displayed and saved as a normalized image.
  • Case 3: A simplified image is generated which consists of 36 sub-images (generated by the 6×6 foci). Each of the sub-images carries the average intensity generated by the specific focus. For example, all 32×32 pixels in the top left sub-image carry the same number, 45. The image is then inverted and normalized. This image, multiplied with the original data, is then displayed and saved as a normalized image. Such an image can be generated either from the intensity image generated by the process of case 1 (fluorescent image) or case 2 (averaged over many images). 3D xyz image normalization is carried out in a similar fashion to case 2 of the xy image normalization. A z-intensity profile (an example is FIG.
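A minimal sketch of the Case 1 normalization, assuming a synthetic flat-field pattern in place of the measured uniform-dye image (the response shape and sample values here are illustrative assumptions):

```python
import numpy as np

# Case 1 flat-field normalization sketch: multiply sample images by the
# normalized inverse of an intensity image from a uniform fluorescent dye.
rng = np.random.default_rng(0)

# Stand-in for the measured foci-power / detector-sensitivity response
# (values in [1, 2] so the inverse is well defined).
flat = np.outer(np.hanning(192) + 1.0, np.hanning(192) + 1.0)
# "True" sample structure modulated by that response.
sample = flat * rng.uniform(0.5, 1.5, (192, 192))

inverse = 1.0 / flat
inverse /= inverse.max()          # normalized inverse of the dye image
normalized = sample * inverse     # Case 1: multiply with the xy image

# the response pattern is removed up to a global scale factor
recovered = normalized / normalized.mean()
truth = (sample / flat) / (sample / flat).mean()
assert np.allclose(recovered, truth)
```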
  • a computer program is fed with the imaging parameters, which are potentially dynamically adjustable (including feedback from the measurement), and controls the imaging procedure.
  • 1D Scan: Collect Data from a Region of Interest
  • the 2D scanning starts, for example, at a corner of an area, which is then raster-scanned until the area is covered according to the imaging parameters (current implementation).
  • a key consideration for improving the measurement is that the detector measures the sample at each scanning position without overlap.
  • the sample can be moved.
  • the focusing device (objective lens) is moved stepwise with respect to the fixed sample.
  • 2D imaging along the optical axis (Z) can begin at any point in the sample and end at any point of the sample within the region of interest.
  • the movement can be reversed (starting inside the sample and then moving out), or performed in a random fashion covering the whole area, as long as the z-position is known.
  • the z-position of the foci is known because the piezo position is known.
  • the 2D scanning is done either while the z-scan from one position to the next takes place or after the z-scan has completed its move to the next position.
  • Images of 2D sections can be done alone, without any 3D movement involved.
  • time-resolved measurements can greatly aid in distinguishing signals from different reporter probes and processes, such as simple scattering and non-linear scattering.
  • the additional information from time-resolved measurements can potentially increase the number of probes which can be used simultaneously, provide images of cell morphology by detection of second harmonic generation, and aid in deconvolution of images from highly scattering samples.
  • FIGS. 29(a)-29(c) relate to a deconvolution process.
  • FIG. 29(a) shows the illumination foci in the optical plane (foci f 11 -f 33 are illustrated in an enlarged view), along with an object illuminated by focus f 22.
  • FIG. 29(b) shows the scattering of the detection signals, along with the detection areas a 11 -a 33.
  • FIG. 29(c) is an example of the signal counts detected by the associated channels of the multi channel detector (signal from area a 11 is collected by detector channel c 11 ; a 12 by c 12 , and so on) at a certain time point, when the focus f 22 scans the center of the object (a).
  • the relative signal distribution between the channels depends on the mean free path of the detection photons in the medium and on the penetration depth. It is constant, however, if a homogeneous scattering distribution is assumed (for many samples this can be assumed in a first approximation). A low mean free path means high scattering, and thus a higher amount of signal in channels other than c 22. As the penetration depth into the sample increases, the chance that a photon is scattered on its way to the detector array increases, and thus the described “optical cross talk” increases as well. More scattered light is also found in the channels neighboring the outer channels c 11 , c 12 , c 13 , c 21 , c 23 , c 31 , c 32 and c 33. The signals in these detection elements are smaller, though, and are not illustrated for simplicity.
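The depth dependence described above can be illustrated with a standard scattering estimate: the fraction of emission photons that reach the detector unscattered falls off exponentially with depth over the scattering mean free path. The mean-free-path value below is an illustrative assumption:

```python
import numpy as np

# Why cross talk grows with penetration depth: the unscattered
# (ballistic) fraction of emission photons decays as exp(-z / l_s),
# so more light is redistributed to neighboring channels at depth.
l_s = 200.0                            # scattering mean free path, microns (assumed)
z = np.array([0.0, 200.0, 400.0])      # penetration depths, microns
ballistic = np.exp(-z / l_s)           # fraction reaching channel c22 unscattered
scattered = 1.0 - ballistic            # fraction redistributed to neighbors
assert ballistic[0] == 1.0
assert scattered[2] > scattered[1] > scattered[0]
```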
  • FIGS. 30 ( a )- 30 ( d ) display a 1 dimensional (1D) deconvolution exemplifying the final 2D deconvolution executed in the linear image deconvolution. For simplification only nearest neighbors are shown.
  • a linear convolution with a delta function with inverted side lobes ( FIG. 30(c) ; illustrated only along one channel-number direction) results in a linearly de-convolved image in which only the channel c 22 carries a signal. In practice, this function can either be modeled or measured. The deconvolution process is shown, for simplicity, only in the x direction; it will be carried out in both x and y directions and will result in an image in which only channel c 22 will carry a signal.
  • the linear deconvolution of cross-talk is primarily a 2D process.
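A minimal 1D sketch in the spirit of FIGS. 30(a)-(d): cross talk is modeled as a nearest-neighbor spread by a fraction `a` (an assumed value), and convolving the measurement with a delta function carrying inverted side lobes cancels the spread to first order in `a`:

```python
import numpy as np

# 1D linear deconvolution sketch. Kernel values are illustrative
# assumptions, not measured cross-talk data.
a = 0.1
signal = np.array([0.0, 0.0, 1.0, 0.0, 0.0])     # only channel c22 emits
crosstalk = np.array([a, 1.0, a])                # nearest-neighbor spread
measured = np.convolve(signal, crosstalk, mode="same")   # [0, a, 1, a, 0]

deconv_kernel = np.array([-a, 1.0, -a])          # delta with inverted side lobes
recovered = np.convolve(measured, deconv_kernel, mode="same")

# neighbor leakage of size a is suppressed to residuals of order a^2
assert abs(measured[1] - a) < 1e-12
assert abs(recovered[1]) < a * a + 1e-12
assert abs(recovered[2] - 1.0) < 2 * a * a + 1e-12
```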
  • the values of the weighting matrix depend on several factors.
  • the optical contribution to the cross talk increases with increasing penetration depth.
  • the channels have different sensitivities, there is electronic cross talk between channels that varies from channel to channel and other factors influence the amount of total cross talk between the channels.
  • the cross talk for each individual channel can be determined experimentally.
  • An example is where one focus illuminates the sample or a test object and the whole array of detectors detects the signal. At different penetration depths a cross-talk matrix is measured for each channel. This matrix is then used to carry out the deconvolution. Data from such a measurement at the sample surface and at a penetration depth of 200 microns can be used. The measurement is repeated for every channel, for example by moving the iris from transmitting light from one single micro lens to the next (in this case for channels c 11 to c 33 ). Similar alternative methods are also possible, for example illuminating with all of the foci but using a sample with large object spacing. Furthermore, models can also replace experimental determination.
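Once such a cross-talk matrix has been measured, the deconvolution amounts to solving a linear system. A minimal sketch with an assumed, illustrative matrix for the nine channels c11-c33 (flattened to a vector):

```python
import numpy as np

# Hedged sketch: deconvolve recorded counts using a measured cross-talk
# matrix. The matrix values here are illustrative assumptions.
n = 9                                   # channels c11..c33, flattened
crosstalk = np.eye(n)
for i in range(n - 1):
    crosstalk[i, i + 1] = crosstalk[i + 1, i] = 0.05   # neighbor leakage

true_counts = np.zeros(n)
true_counts[4] = 1000.0                 # only the center channel (c22) emits

recorded = crosstalk @ true_counts      # what the MAPMT channels report
recovered = np.linalg.solve(crosstalk, recorded)
assert np.allclose(recovered, true_counts)
```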
  • For a non-linear deconvolution, an entire 2D image consists of collections of ensembles of pixels, one ensemble from each detector.
  • the key point is that relationships between entire ensembles, and between certain regions of pixels in different ensembles, can be established to constrain the variation of the weighting matrix and aid convergence without assumptions, or with minimal assumptions, about the sample or the processes which cause the variation in the weighting matrix.
  • the ensembles can be considered largely independent, except due to the cross-talk introduced by the weighting matrix.
  • the ideal image can be recovered by simultaneously solving for a weighting matrix which minimizes the covariance between ensembles.
  • minimal models of the object (such as from image morphology or segmentation of the collected image, etc.) can be used to form constraints.
  • Additional model-dependent and model-independent constraints can also be applied by consideration of the planes above and below the plane under evaluation. Further constraints can also be applied to the weighting matrix from either general considerations (such as continuity, smoothness, sharpness, etc.) or model-based considerations.
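The covariance-minimization idea can be illustrated with a deliberately minimal mixing model: a single symmetric cross-talk weight (an assumption standing in for the full weighting matrix). Two pixel ensembles that are independent before cross talk become correlated by the mixing; grid-searching for the unmixing weight that minimizes their covariance recovers the true weight:

```python
import numpy as np

# Hedged sketch of covariance minimization between ensembles; the
# single-weight mixing model and all values are illustrative assumptions.
rng = np.random.default_rng(1)
x, y = rng.normal(size=10000), rng.normal(size=10000)   # independent ensembles

a = 0.20                        # true (unknown) cross-talk weight
xm, ym = x + a * y, y + a * x   # measured, cross-contaminated ensembles

candidates = np.linspace(0.0, 0.5, 51)
covs = [abs(np.cov(xm - b * ym, ym - b * xm)[0, 1]) for b in candidates]
best = candidates[int(np.argmin(covs))]

# the covariance minimum sits near the true weight
assert abs(best - a) <= 0.03
```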

Abstract

In the systems and methods of the present invention a multifocal multiphoton imaging system has a signal to noise ratio (SNR) that is reduced by over an order of magnitude at an imaging depth equal to twice the mean free path scattering length of the specimen. An MMM system based on an area detector, such as a multianode photomultiplier tube (MAPMT), is optimized for high-speed tissue imaging. The specimen is raster-scanned with an array of excitation light beams. The emission photons from the array of excitation foci are collected simultaneously by a MAPMT, and the signals from each anode are detected using high-sensitivity, low-noise single photon counting circuits. An image is formed by the temporal encoding of the integrated signal with a raster scanning pattern. A deconvolution procedure taking account of the spatial distribution and the raster temporal encoding of collected photons can be used to improve the decay coefficient. We demonstrate that MAPMT-based MMM can provide significantly better contrast than existing CCD-based systems.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the priority of U.S. Provisional Application No. 60/684,608 filed May 25, 2005 entitled, MULTI FOCAL MULTIPHOTON IMAGING SYSTEMS AND METHODS, the whole of which is hereby incorporated by reference herein.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • N/A
  • BACKGROUND OF THE INVENTION
  • Systems and methods for microscopic analysis of biological material have been used for characterization and diagnosis in many applications. Fluorescence microscopy, for example, has been used for optical analysis including the histological analysis of excised tissue specimens. Optical coherence tomography has been used for three dimensional imaging of tissue structures, however, the limited resolution of existing systems has constrained its use for definitive pathological analysis. Confocal microscopy has been used for high resolution imaging and has controllable depth of field but limited imaging speed.
  • Multiphoton microscopy is based on the nonlinear excitation of fluorophores in which fluorescence generation is localized at the focus of excitation light. Multiphoton microscopy is used for deep tissue imaging because of its subcellular three dimensional (3D) resolution, minimal phototoxicity, and tissue penetration depth of over a few hundred micrometers. It has become useful in biomedical studies such as neuronal plasticity, angiogenesis in solid tumors, transdermal drug delivery, and non-invasive optical biopsy, for example.
  • A practical limitation of multiphoton microscopy is its imaging speed, which typically lies in a range of less than two frames per second. While this speed is sufficient in many cases, there remain applications which can be enhanced by improvements in imaging speed. There is a continuing need for further improvements in microscopic analysis of biological materials for numerous applications.
  • SUMMARY OF THE INVENTION
  • The present invention relates to systems and methods for the multifocal imaging of biological materials. An optical system is provided in which a plurality of optical pathways are used in combination with focusing optics to provide a plurality of focal locations within a region of interest of a material being optically measured or imaged. The detector can comprise a plurality of detector elements which are correlated with the plurality of focal locations to provide for the efficient collection of light from the material being imaged. A preferred embodiment of the invention utilizes a scanning system that provides relative movement between the material and the focal locations to provide for fast imaging of the material.
  • In a preferred embodiment a light source, such as a laser, is used with a multifocal optical element to provide an array of spatially separated optical pathways. The multifocal optical element can comprise a micro lens array, a diffractive optical element, or a beam splitter device, for example, such that a plurality of beams are provided that can be focused onto a plurality of focal locations within a biological material to be imaged.
  • An important issue in the collection of light from discrete focal spots or locations within a turbid medium such as tissue is the cross talk that can occur due to the scattering of light. This cross talk can substantially limit the usefulness of the images of the tissue that are produced. By increasing the distance between adjacent focal spots such cross talk can be reduced or eliminated, however, this reduces the resolution of the resulting image or increases the time needed to scan the tissue. Thus it is desirable to employ focal spacing of at least 10 microns and preferably more than 25 microns.
  • In a preferred embodiment of the invention, high speed multiphoton microscopy can measure biological systems such as, for example, kinetic processes in the cytosol of a single cell, or image a volume of tissue. For example, high speed 3D imaging can map 3D propagation of a calcium wave and the associated physical contraction wave through a myocyte, or the rolling of leukocytes within the blood vessel of a solid tumor. High speed 3D microscopy provides for sampling a statistically significant volume of biological specimens. Since the field of view of most microscopes is limited to about 100 microns on a side with an imaging depth of 100 microns, the measurement volume is limited to only 1×10−3 mm3. While this volume is sufficient for cellular imaging, many tissues have physiologically relevant structures ranging from the cellular level up to several millimeters in size. For example, a neuron with its extensive dendritic tree can span a volume over 1 mm3, and many dermal structures such as hair follicles and sebaceous glands cannot be seen in images confined to an area of 100-200 micrometers. It is desirable, for example, to image a hierarchy of cardiac structures ranging from a single nucleus in a cardiac myocyte, to the distribution of muscle fibers and blood vessels, to the structure of chambers and heart valves with perfect registration across five orders of magnitude by imaging a whole mouse heart. Equally importantly, traditional 3D microscopes sample only tens to hundreds of cells and can never achieve statistical accuracy and precision in many biomedical assays comparable to techniques such as flow cytometry and image cytometry. High speed imaging can circumvent this difficulty by increasing the number of cells or the tissue volume sampled. By performing high speed multiphoton imaging, better quantitative measurements of transport pathways across the stratum corneum in transdermal drug delivery applications can be made, for example.
  • Systems and methods have been developed to enhance multiphoton imaging speed. A first method increases the scanning speed by using a high-speed scanner such as a polygonal mirror scanner or a resonant mirror scanner instead of a galvanometer-driven mirror scanner. This achieves an increase of scanning speed to more than 10 frames per second in the imaging of typical tissue specimens. In general, the system can operate at frequencies in a range of 1 to 500 Hz. This method can be used for turbid tissue imaging since it is not sensitive to the scattering of emission photons. A second method increases the imaging speed by parallelizing the multiphoton imaging process. It scans a sample with multiple excitation foci instead of forming only a single focus. These foci are raster-scanned across the specimen in parallel, where each focus needs to cover only a smaller area. The emission photons from these foci are collected simultaneously with a spatially resolved detector. One advantage of this method is that the imaging speed is increased by the number of excitation foci generated, without increasing the power of excitation light per focus. High speed scanning systems need higher power to compensate for the signal reduction per pixel due to the decrease of pixel dwell time. Images can be obtained by selecting the depth of focus to be positioned in a plane within the tissue or sample at a depth in a range of 10 microns to 500 microns.
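The parallelization argument can be made concrete with a back-of-envelope calculation; the pixel count and dwell time below are illustrative assumptions, not parameters stated in the text:

```python
# Hedged sketch: with N excitation foci scanned in parallel, each focus
# covers 1/N of the frame, so the frame rate scales by N at fixed
# per-focus power and per-pixel dwell time.
pixels = 256 * 256             # frame size (assumed)
dwell_s = 1e-6                 # per-pixel dwell time (assumed)

def frame_rate(n_foci):
    """Frames per second when n_foci foci share the frame."""
    return 1.0 / (pixels * dwell_s / n_foci)

assert abs(frame_rate(1) - 15.26) < 0.01                 # single focus: ~15 fps
assert abs(frame_rate(64) - 64 * frame_rate(1)) < 1e-9   # 8 x 8 foci: 64x faster
```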
  • In another embodiment, fiber optics can be used to couple the light source to the microlens array or other beam splitting element. The system can be implemented as a handheld optical probe for the diagnosis of dermal, cervical or colorectal cancer, for example.
  • The brain is an inherently three dimensional organ composed of many subregions. Accurate segmentation of brain morphology of small mammals is currently challenged by the lack of techniques which can sample the brain at high resolution over a large volume. The current method of choice, serial section reconstruction, is laborious, time consuming, and error prone. The device and methods described herein can quickly image brains or thick tissue sections of brains in 3D at sufficient resolution and over a large enough volume to provide 3D images suitable for classification of brain morphology and biochemical composition. The brain can be further stained by dyes, such as the nuclear dyes DAPI or Hoechst, either through intravital injection, transgenic expression, or ex vivo methods, to facilitate classification of regions. Automatic segmentation routines can also be used to improve the classification and automate portions of the process.
  • Accurate measurement of vasculature is important to characterize many biomedical processes and vasculature-related diseases. For instance, proangiogenesis therapies are useful in such areas as tissue engineering, wound healing, bone fractures and coronary heart disease. Anti-angiogenesis treatments are important in such processes as cancer, blindness, and rheumatoid arthritis. Unfortunately, traditional histopathological analysis of tissue sections is wholly inadequate to characterize the vasculature of a tissue or organ, as blood vessels form complex, multiscale 3D networks, with features spanning from the submicron to the centimeter scale. The device and methods described in this patent are capable of acquiring high quality 3D datasets over 3D tissue and organ samples suitable for characterization of the vasculature of the tissue. To aid visualization of the vasculature, the tissue can be stained by contrast agents which bind to the epithelial wall of the blood vessels, or fill the interior of vessels. Automatic segmentation routines can also be used to improve the classification and automate portions of the process.
  • A large percentage of deaths are due to metastasis. Unfortunately, the migration of cancer cells from the primary tumor to secondary sites is a multi-step process which is not well understood. Standard histopathological analysis is ill-suited to study metastasis and suffers from a number of limitations. First, it is extremely difficult to find rare metastatic cancer within a 3D bulk tissue using traditional 2D histopathology. In many instances traditional 2D histopathology is unable to find evidence of the presence of metastatic cancer cells in an organ of an animal, even though it is known that many subjects eventually develop tumors at a later time. It is clear that traditional histopathology cannot effectively detect rare cells. Another limitation is that present histopathology methods provide limited information about the 3D spatial arrangement of cancer cells within the 3D vasculature of the organ. It is known that one of the critical steps in metastasis is extravasation from the vasculature into the surrounding stroma, so it is essential to be able to visualize this spatial relationship between cancer cells and the endothelial blood vessel wall. Preferred embodiments of the present invention are capable of acquiring high quality 3D datasets over 3D tissue and organ samples suitable for characterization of the metastases. To aid visualization of the metastases, the cancer cells can be stained by dyes or labeled with proteins such as OFP. Automatic segmentation routines can also be used to improve the classification and automate the localization of the cancer cells and tumors.
  • In order to understand the effects of a drug on an organism, analysis at the tissue, whole organ, and whole organism level is vitally important. ADME, efficacy and toxicology effects are known to have strong spatial variations on the morphological, cellular and biochemical state of a tissue. Even within a specific tissue type, the response can be nonuniform due to variations in the transport and distribution of a drug throughout tissue, epigenetic expression, and cellular activity. The devices and methods described herein can be used to provide morphological, biochemical and spectroscopic information about the state of a tissue across multiple length scales, from subcellular to whole tissue, whole organ and even entire organism, in response to the treatment of a molecular agent. Efficacy, ADME, and toxicology information can be derived which provides a fuller and more accurate description to predict the actual effect of a drug candidate at the organism level.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic diagram of an imaging system in accordance with a preferred embodiment of the invention.
  • FIGS. 2 a-2 c are images of human skin acquired with the present invention including the stratum corneum layer, the stratum granular and the basal layer, respectively.
  • FIG. 3 graphically illustrates the signal decay with increasing imaging depth of conventional systems and those incorporating the present invention.
  • FIGS. 4 a-4 d include images before and after deconvolution as well as graphical illustration of scattering and crosstalk.
  • FIGS. 5 a-5 i are images based on CCD, MAPMT and deconvolution thereof at the surface and different depths of brain tissue.
  • FIG. 6 illustrates a method and apparatus for multi-focal, multi-photon microscopy (MMM) according to a preferred embodiment of the invention, showing parallelized illumination and detection device with a common focusing device.
  • FIG. 7 illustrates a method and apparatus for multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 8 illustrates another method and apparatus for multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 9 illustrates generating and detecting a 3D foci pattern in a focal region.
  • FIGS. 10(a)-(i) illustrate close-up views of the 3D focal region generated by the setup in FIG. 9 and views of the 3D scanning: (a) the focal region; (b) an array of excitation light beams; (c) x/y view at a first depth; (d) x/y view at a second depth; (e) x/z view; (f) multiple rows of excitation foci lying in different focal planes, all shown in this x/y view; (g) x/z view of the rows shown in (f); (h) x/y view of the x/y scanning configuration covering the x-y-z image (as in (f), all foci are shown, even though they lie in different planes and, in the lower part of the images, even behind each other); and (i) a view in the y/z plane illustrating scan progression.
  • FIG. 11 a illustrates a further method and apparatus for a 3D cytometer, based on multi-focal, multi-photon microscopy (MMM) employing scanning and multi anode PMT's, according to a preferred embodiment of the invention.
  • FIG. 11 d illustrates an image of the array of foci in the focus of the objective lens; the foci are 45 micrometers apart, resulting in a scanning field of 240 μm when 6×6 foci are utilized.
  • FIG. 11(c) illustrates the Z-profile and corresponding fit function of a 200 nm bead.
  • FIGS. 12(a)-(e) illustrate a further method and apparatus for multi-color detection MMM employing scanning and multi anode PMT's according to a preferred embodiment of the invention: (a) the setup in the x/z-plane; (b) the foci and their scanning in the focal x/y-plane; (c) detection path in the y/z-plane; (d) detection path projected in the x/z-plane; (e) the anodes of the multi anode PMT in the x/z and x/y plane in conjunction with the detected colors. In this case only visible light is shown; any other light spectra can be separated, though.
  • FIG. 13 illustrates a beam splitter configuration used in some embodiments according to the invention.
  • FIGS. 14(a)-(d) illustrate preferred embodiments for providing illumination beam paths in accordance with the invention.
  • FIGS. 15(a)-(d) illustrate further preferred embodiments for detecting light from different focal locations in accordance with preferred embodiments of the invention.
  • FIG. 16 illustrates determining the optimal number of foci at a certain laser power for samples with different damage thresholds.
  • FIGS. 17(a) and (b) illustrate a time multiplexing method.
  • FIGS. 18(a)-(c) illustrate a pixellated detector collection method.
  • FIG. 19 illustrates an endoscope apparatus according to an embodiment of the invention.
  • FIGS. 20(a) and (b) illustrate scattered light detection with one PMT and one excitation focus according to an embodiment of the invention.
  • FIGS. 21(a) and (b) illustrate scattered light detection with two PMTs and one excitation focus according to an embodiment of the invention.
  • FIGS. 22(a) and (d) illustrate scattered light detection with two PMT's and two excitation foci according to an embodiment of the invention.
  • FIGS. 23(a) and (b) illustrate reducing optical cross talk by increasing the distances between the excitation foci and the distances between the detection elements.
  • FIGS. 24(a) and (b) illustrate reducing optical cross talk according to an embodiment of the invention by increasing the distance between the excitation foci and increasing the area of the detection elements.
  • FIGS. 25(a)-(e) illustrate in tabular form two alternative embodiments A and B of the invention in terms of changing optical setup.
  • FIGS. 26(a)-(b) illustrate the different conjugated areas of detection from each channel of the multi anode PMT in the conjugated image plane for configurations A and B from FIGS. 25(a)-(e).
  • FIGS. 26(c)-(d) illustrate that an objective lens with a large field of view enables large separation of foci and thus enables reduction of optical cross talk.
  • FIGS. 27(a) and (b) illustrate data post-processing sequences.
  • FIGS. 28(a) and (b) illustrate a normalization method.
  • FIGS. 29(a)-(c) illustrate a linear deconvolution process.
  • FIGS. 30(a)-(d) illustrate further details for a linear deconvolution process I: signal distribution in the multi-channel detector.
  • DETAILED DESCRIPTION OF THE INVENTION
  • As the input power of excitation light increases, the signal increases in proportion to the square of the input power, S(t) ∝ [P(t)]². However, there is a limit on the input power level due to the finite excited-state lifetimes of fluorophores.
  • In the multiphoton excitation of fluorophores with a pulsed laser, the fluorophores excited by the last pulse stay in the excited state for a few nanoseconds (depending on the fluorophore). Some excited fluorophores therefore cannot be excited again by the next pulse of excitation light (12 ns later in the case of a laser with an 80 MHz pulse repetition rate). The signal level therefore saturates once the input power exceeds a limiting level. The limit on the input power level is related to the excitation probability of a single fluorophore with a single pulse, P_pulse. It is formulated in the following expression with the condition that the excitation light is focused with an objective onto a fluorophore of absorption coefficient δ_a:

    P_pulse = δ_a [λ K_a/(hc)]² · (NA⁴/(τ_p ƒ_p²)) · [P_a(t)]²   (1)
    The nominal conditions are that the excitation light has wavelength λ = 800 nm, pulse width τ_p = 200 fs, pulse repetition rate ƒ_p = 80 MHz, and average power P_a(t). The numerical aperture of the objective lens is 1 (NA = 1). The fluorophore has the absorption coefficient δ_a = 10 GM, where 1 GM is 10⁻⁵⁰ cm⁴·s/photon. In order to avoid saturation, P_pulse must in general be less than 0.1 (P_pulse < 0.1). Under these conditions, the input power P_a(t) at which P_pulse approaches the saturation limit is approximately 6 mW (P_a_sat(t) ≅ 6 mW) in this example. In the case where the concentration of the fluorophores is 10 μM, the number of emission photons collected per second is approximately 3×10⁷ photons/s, with the assumption that the collection efficiency for emission photons is approximately 0.01 (ε_col = 0.01). Assuming that each pixel needs 300 photons and each image comprises 256×256 pixels, the frame rate that can be achieved with the input power under the saturation limit is 1.5 frames/s. Although a higher frame rate can be achieved with specimens of higher fluorophore concentration, it is clear that there is a limit on the input power level due to fluorophore saturation.
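The photon-budget arithmetic above reduces to a one-line calculation; a minimal sketch using the nominal numbers from this example (3×10⁷ collected photons/s, 300 photons per pixel, 256×256 pixels):

```python
# Saturation-limited frame rate for single-focus scanning, using the
# nominal numbers from this example (not a general result).
photons_per_second = 3e7        # collected emission photons/s at ~6 mW, eps_col = 0.01
photons_per_pixel = 300         # photons required per pixel for adequate SNR
pixels_per_frame = 256 * 256

frame_rate = photons_per_second / (photons_per_pixel * pixels_per_frame)
print(f"{frame_rate:.1f} frames/s")  # ~1.5 frames/s
```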
  • MMM increases the frame rate by scanning with multiple excitation foci. Therefore, MMM can achieve a higher frame rate while the input power for each excitation focus is kept below the saturation limit. For example, an MMM system that scans with an 8×8 array of excitation foci can achieve a frame rate of 96 frames/s (= 1.5 frames/s × 64 foci). In a preferred embodiment it is desirable to collect at least 15 frames per second, and preferably 30 frames per second or more. One practical limitation in MMM is that more input power is required to generate multiple excitation foci. The power required to generate 64 foci is 384 mW (= 64 foci × 6 mW per focus). Since available laser sources can output approximately 2 W of power, enough power is available for MMM.
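The MMM scaling described above is simple arithmetic, sketched here with the example's figures:

```python
n_foci = 8 * 8                       # 8x8 array of excitation foci
fps_single = 1.5                     # saturation-limited single-focus frame rate
power_per_focus_mW = 6               # per-focus power at the saturation limit

fps_mmm = fps_single * n_foci        # parallelization multiplies the frame rate
total_power_mW = n_foci * power_per_focus_mW

print(fps_mmm, total_power_mW)       # 96.0 frames/s, 384 mW (well under ~2 W available)
```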
  • The optical imaging depth in tissues is limited by photon interaction with tissue constituents. Photon scattering is the dominant factor in multiphoton microscopy, whereas the effect of photon absorption is relatively negligible. Scattering of excitation photons reduces the amount of fluorescence generated at the focus, because fewer excitation photons reach the focal volume. The emission photons from the focus are also scattered, so that they may not be collected by the optics in the detection path, or may be spatially dispersed in the imaging plane where detectors are positioned. Since the excitation light has a longer wavelength than the emission light, the excitation light typically experiences less scattering than the emission light. The effect of photon scattering is expressed by the mean free path length of scattering, l_s, which is the depth constant in the exponential decay of unscattered photons, S(z) ∝ exp(−z/l_s).
  • Intralipid emulsion can be used as a tissue phantom with optical properties similar to tissue. The optical properties of 2% intralipid are a mean free path length at the excitation wavelength (780 nm) of 167 μm (l_s_ex ≅ 167 μm) and at the emission wavelength (515 nm) of approximately 62.5 μm (l_s_em ≅ 65 μm). Since it is known that only ballistic excitation photons contribute to multiphoton excitation at depths of a few scattering lengths, the amount of multiphoton excitation decays with a mean free path length of 84 μm (= 167 μm/2), taking into account that two-photon excitation is a quadratic process. Conventional multiphoton microscopy is based on the scanning of a single excitation focus, and the signal is collected using a detector without spatial resolution, such as a conventional photomultiplier tube (PMT). The PMT has a large detection area and can collect most of the signal generated at the excitation focus, including a large fraction of the scattered photons. Therefore, conventional multiphoton microscopy is relatively immune to the scattering of emission photons by the tissue. However, for an MMM system that utilizes a CCD detector to distinguish the signals originating from each of the foci, the scattering of emission photons seriously degrades the SNR of the instrument for deep tissue imaging. The CCD camera has a relatively slow readout speed and typically integrates all the emission photons during the acquisition of each frame. Because a CCD camera contains pixels in which each pixel covers a 0.1 μm² region in the specimen plane, scattered emission photons deflected from their original paths are not collected in the correct pixel but are distributed broadly across the imaging plane. The distribution of scattered emission photons is very broad, with a FWHM of 40 μm at a depth of 2×l_s_em. These scattered photons result in a degradation of image SNR by more than one order of magnitude when the imaging depth is over 2×l_s_em, compared with conventional multiphoton microscopy.
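The exponential scattering model above can be sketched numerically, using the 2% intralipid values from this example (l_s_ex ≅ 167 μm, l_s_em ≅ 62.5 μm):

```python
import math

l_ex = 167.0    # mean free path at the excitation wavelength, in μm
l_em = 62.5     # mean free path at the emission wavelength, in μm

# Two-photon excitation depends quadratically on the ballistic excitation
# intensity, so the effective decay length of the excitation is halved.
l_eff = l_ex / 2                           # ≈ 84 μm, as stated in the text
z = 2 * l_em                               # imaging depth of 2 x l_s_em
ballistic_emission = math.exp(-z / l_em)   # fraction of emission photons unscattered
print(round(l_eff, 1), round(ballistic_emission, 3))
```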
  • The major limitation of the CCD-based MMM system lies in its small pixel area. For conventional wide field imaging, a large number of CCD pixels are needed to maintain good resolution while covering a reasonably sized field of view. A 100 μm size image will require about 10⁷ pixels to be imaged at full optical resolution (300 nm). The situation is very different for MMM imaging. A femtosecond light source can provide at most 2-4 watts of optical power, and typically about 50-100 mW is required at each focus to generate an efficient multiphoton excitation process for deep tissue imaging. An MMM system can therefore realistically and effectively scan about 20-40 foci in parallel in tissue specimens. Since these foci are raster scanned across the specimen, the image resolution is determined by the excitation point spread function (PSF) of the light and is not sensitive to detector pixelation. In particular, a preferred embodiment uses an MMM system having photon detectors containing only as many elements as the number of excitation foci. The need for fewer elements allows the use of a detector with a significantly larger pixel area while maintaining a reasonable device size. A multi-anode PMT (MAPMT) is a preferred detector for this purpose.
  • A preferred embodiment of the present invention uses a MAPMT instead of the CCD camera for signal collection from multiple foci. The MAPMT is similar to conventional PMTs, with good quantum efficiency (over 20% in the blue/green spectral range), negligible read noise, and minimal dark noise with cooling. The MAPMT has a cathode and dynode chain with a geometry that ensures that the spatial distribution of photons on the cathode is reproduced accurately as the electron distribution at the anode. The anode of the multi-anode PMT is divided rectilinearly into its elements, providing spatial resolution for the simultaneous collection of signals from multiple locations. In one example, a MAPMT which has an array of 8×8 pixels (H7546, Hamamatsu, Bridgewater, N.J.) is used. Note that a flat panel detector having a pixel area of sufficient size can also be used. For example, a binnable CMOS or CCD imaging sensor can be operated to read out binned images at comparable frame rates with an effective pixel size corresponding to that of a MAPMT.
  • A preferred embodiment of the invention uses the imaging systems as described herein in conjunction with a system for sectioning a sample such as a tissue sample that is described in greater detail in U.S. application Ser. No. 10/642,447, by So, et al. filed Aug. 15, 2003, the entire contents of which is incorporated herein by reference.
  • The schematic of a preferred embodiment of the imaging system 10 in accordance with the invention is shown in FIG. 1. The light source 12 used is a Ti-Sapphire (Ti-Sa) laser (Tsunami, Spectra-Physics, Mountain View, Calif.) pumped by a continuous wave, diode-pumped, frequency-doubled Nd:YVO4 laser (Millenia, Spectra-Physics, Mountain View, Calif.). It generates approximately 2 W at 800 nm wavelength which is sufficient for most MMM applications. The excitation beam from the laser is optically coupled using optical fiber 14 or free space lens system to a beam expander 16 and then illuminates a microlens array 20 (1000-17-S-A, Adaptive Optics, Cambridge, Mass.) which, in this example, is an array of 12×12 (or 8×8) square microlenses that are 1 mm×1 mm in size and 17 mm in focal length. The degree of beam expansion can be selected such that an array of 8×8 beam-lets is produced after the microlens array. The beam-lets are collimated after lens L1 and reflected onto an x-y scanner mirror 30 (6220, Cambridge Technology, Cambridge Mass.) which is positioned in the focal plane of lens L1. In this configuration, the beam-lets overlap each other on the scanner mirror surface and are reflected similarly by the rotation of the scanner mirror. After the scanner, the beam-lets enter a coupling lens system such as a microscope (BX51, Olympus, Melville, N.Y.) via a modified side port. A combination of lenses L2 and L3 expands the beam-lets to fill the back aperture of the objective lens 36 in order to use the full NA of the objective lens. The scanning mirror is in the telecentric plane of the back aperture of an objective lens so that the beamlets are stationary on its back aperture independent of the motion of the scanner mirror. The objective lens generates the 8×8 focus array of excitation light in the sample plane in the specimen 34. The scanner mirror moves the array of excitation foci in the sample plane in a raster pattern to cover the whole sample plane. 
Alternatively, a digital micromirror (MEMS) device can be used to control beam scanning in the sample plane. A beamsplitter can also be used to split an input beam before the microlens array. Another alternative embodiment employs a diffractive optical element in conjunction with a beam splitter. The objective used in this system is a 20× water immersion lens with 0.95 NA (XLUMPLFL20XW, Olympus, Melville, N.Y.). The excitation foci are separated from each other by 45 μm in this example so that the scanning area of each focus is 45 μm×45 μm. The frame size is 360 μm×360 μm by scanning with the array of 8×8 foci. The frame rate to generate images of 320×320 pixels becomes approximately 19 frames per second with the pixel dwell time of 33 μs.
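The ~19 frames per second figure follows directly from the pixel dwell time and the number of parallel foci; a quick check using the values in this paragraph:

```python
dwell_s = 33e-6            # pixel dwell time, 33 μs
pixels = 320 * 320         # image size in pixels
n_foci = 8 * 8             # foci scanned in parallel

frame_time_s = (pixels / n_foci) * dwell_s   # each focus covers 40x40 pixels
fps = 1.0 / frame_time_s
print(f"{fps:.0f} frames/s")   # ~19 frames/s
```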
  • Emission photons are generated at the array of excitation foci in the specimen and are collected by the same objective lens, forming an array of emission beam-lets. In the case of a CCD-based MMM, the emission beam-lets are reflected by a long-pass dichroic mirror 38 (650dcxxr, Chroma Technology, Brattleboro, Vt.) and are focused onto an optional CCD camera 28 (PentaMax, Princeton Instruments, Trenton, N.J.) by a lens (L3). The CCD camera integrates emission photons during the scanning time of each frame to generate images. In the case of a preferred embodiment using a MAPMT (without the CCD), the emission beam-lets travel back to the scanner mirror 30, retracing the excitation paths. The emission beam-lets are reflected by the scanner mirror. The emission beam-lets are de-scanned, so their propagation directions remain stationary irrespective of the movement of the scanner. The emission beam-lets are reflected by a long-pass dichroic mirror 32 (650dcxxr, Chroma Technology, Brattleboro, Vt.) and are focused after lens (L4). A short-pass filter (E700SP, Chroma Technology, Brattleboro, Vt.) blocks any stray excitation light. The focused emission beam-lets are collected at the center of corresponding channels of a MAPMT 22 (H7546, Hamamatsu, Bridgewater, N.J.). The emission photons coming from the array of excitation foci are collected by the MAPMT. An image is formed by the temporal encoding of the integrated signal with the known raster scanning pattern using an image processor or computer 24 and is electronically stored in memory and/or displayed using display 26.
  • The pair of L2 and L4 lenses magnifies the array of emission foci so that individual emission beam-lets are focused at the center of corresponding elements of the MAPMT. Further, since the emission beam-lets are de-scanned, they remain stationary. Since the emission beam-lets have to pass through more optical elements, some loss of emission photons occurs; the transmission efficiency is approximately 0.7. The signals from the MAPMT are collected by a stack of four multi-channel photon counter cards (mCPhC), which together provide 64 channels for simultaneous signal collection. Each mCPhC has 18 channels of photon counter circuits and can be housed in an enclosure 25 with a digital interface to the computer 24. The mCPhC is expandable, so that 64 channels are readily implemented by using 4 cards in parallel. The mCPhC has a 32-bit parallel interface with a computer for high-speed data transfer. Currently, the speed is limited by the speed of the computer PCI bus. The transfer rate can be more than one hundred frames (320×320 pixels, 16-bit images) per second.
  • Since the scattered emission photons have a spatial distribution with a FWHM of 40 μm at an imaging depth of 2×l_s_em, the sensitivity of the microscope is partly determined by the effective detector area: the area in the sample plane from which a detector collects emission photons. Since microscopes are telecentric systems, the effective detector area is linearly related to the detector size in the image plane. With a magnification M and a linear dimension of the detector L_D, the linear dimension of the effective detector area (L_E) is L_E = L_D/M. In general, the larger the effective detector area, the more effectively the detector can collect scattered emission photons. In the case of a 20× magnification objective, a 10 mm diameter standard PMT has an effective detector area of 500 μm diameter, which is significantly larger than the width of the scattered emission photon distribution. Therefore, standard PMTs have good collection efficiency for scattered emission photons and allow very effective deep tissue imaging. In the case of a spatially resolved detector, each pixel can be treated as an individual detection element. For a CCD camera with 20 μm×20 μm pixels, each pixel has an effective detector area of 1 μm×1 μm at 20× magnification. Therefore, the CCD-based MMM system cannot utilize the scattered emission photons, which are distributed uniformly throughout the image, contributing to the background noise. In this example of the MAPMT-based MMM system, the effective detector area of each channel is 45 μm×45 μm. Therefore, the MAPMT can collect significantly more scattered emission photons into the correct channels than the CCD camera, because its effective detector area, or detector element collection area, is comparable with, or corresponds to, the width of the scattered photon distribution from each focal area (45 μm×45 μm).
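The relation L_E = L_D/M can be sketched directly. The PMT and CCD sizes below are the ones quoted in this comparison; the MAPMT element size is back-computed from its stated 45 μm effective area at the overall system magnification, so treat it as illustrative:

```python
def effective_size_um(detector_size_um: float, magnification: float) -> float:
    """Linear size of the effective detector area in the sample plane: L_E = L_D / M."""
    return detector_size_um / magnification

M = 20  # 20x objective
pmt = effective_size_um(10_000, M)    # 10 mm PMT -> 500 μm: easily catches scattered light
ccd = effective_size_um(20, M)        # 20 μm CCD pixel -> 1 μm: misses scattered light
mapmt = effective_size_um(900, M)     # implied 0.9 mm element -> 45 μm, ~ scatter FWHM (40 μm)
print(pmt, ccd, mapmt)
```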
  • The MAPMT-based MMM system can be easily converted into a conventional multiphoton microscope based on single-focus scanning and signal collection with PMTs. In the conventional multiphoton microscope configuration, the excitation beam is not expanded and goes directly onto the scanner without the combination of the microlens array and lens L1. The rest of the excitation beam path is the same as in MAPMT-based MMM. Specimens are scanned with a single excitation focus. The emission light collected by the objective lens is reflected by a dichroic mirror. The reflected emission beam is reduced in size by a pair of lenses and is collected by a detector (PMT). An image is formed by the temporal encoding of the integrated signal with the known raster scanning pattern.
  • CCD-based MMM has limitations for turbid tissue imaging, as shown by measuring the effect of emission photon scattering on the PSF (scattering function). Scattered emission photons form an additional intensity distribution around the PSF constructed from ballistic unscattered photons. Their intensity distribution is quite wide, with a FWHM of 40 μm at an imaging depth of 2×l_s_em. The FWHM of the total PSF (including the intensity distribution due to scattered emission photons) is not changed by scattering up to such depths, because the wide distribution of scattered emission photons does not contribute to the FWHM. In terms of contrast, the signal decay in CCD-based MMM with increasing imaging depth is higher than that of conventional single-focus multiphoton microscopy by an order of magnitude at 2×l_s_em. The wide distribution of scattered emission photons also contributes as noise and causes loss of contrast by another order of magnitude at that depth.
  • Imaging of dermal structure based on autofluorescence has been performed using the system of the present invention. Endogenous fluorophores have low quantum yield and low extinction coefficients compared with typical exogenous fluorescent labels. The dermal structure imaged using a preferred embodiment of the present invention has a layered structure with significantly different indices of refraction, resulting in significant spherical aberration. Multiphoton imaging of dermal structures without photodamage can be performed at a pixel rate of 15 kHz with 15 mW input power. In this example, an input power of 7 mW per focus is used at the specimen, with the excitation wavelength set at 800 nm. The objective used is a 20× water immersion lens with 0.95 NA (XLUMPLFL20XW, Olympus, Melville, N.Y.). With a frame rate of 2.5 fps for a 320×320 pixel image (a 4 kHz pixel rate per focus), this is 10 times faster than previous systems. The epidermis is imaged down to the basal cell layer using this MAPMT-based MMM. Representative layers from the stratum corneum, stratum granular, and the basal layer are shown in FIGS. 2 a-2 c. The signals from these layers are mostly due to the fluorescence of NAD(P)H inside the cells. Thus the MAPMT-based MMM has equivalent or improved sensitivity compared with conventional multiphoton microscopy, but with significantly increased imaging speed. The intensity of the image is not uniform: the intensity is high in the center and becomes dim in the corners of the image. This is because the intensity of the excitation beam has a Gaussian spatial distribution, so that the beam-lets made from the center part of the expanded beam have higher intensity than those from the peripheral portions of the beam. A beam splitter, serial dichroic mirrors, or a top hat holographic filter can be used to provide a more uniform array of beams delivered to the individual focal positions.
  • Using both CCD and MAPMT detectors in the MMM geometry, the signal decay can be measured as a function of scattering length. As the imaging depth increases, the signal decreases due to scattering of both excitation photons and emission photons. The signal decay is measured by imaging 4 μm diameter fluorescent latex microspheres (F8858, Molecular Probes, Eugene, Oreg.) immobilized in 3D in 2% agarose gel (UltraPure Low Melting Point Agarose, Invitrogen, Carlsbad, Calif.). Intralipid emulsion (Liposyn III, Abbott Laboratories, North Chicago, Ill.) is added to the sample as a scatterer in concentrations of 0.5 to 2%. Intralipid emulsion of 2% volume concentration is known to have scattering properties similar to those of tissues: the mean free path length (l_s) of scattering is 80 μm at the emission wavelength (605 nm) and 168 μm at the excitation wavelength (800 nm). The scattering properties of these intralipid solutions are verified by diffusive wave measurements. The peak intensity of the sphere image is the signal in the measurement, and the decay of peak intensity as a function of the imaging depth is measured at each concentration. The signal decay can also be measured with a conventional multiphoton microscope as a reference. Signal decays in the three systems are measured down to a depth of 180 μm, which is equivalent to 2.25×l_s_em (FIG. 3). The signal decay is expressed as an exponential function, S(z) = exp(−c·z/l_s_em). The decay coefficient c is 1.22, 1.87, and 2.30 in the case of conventional multiphoton microscopy, MAPMT-based MMM, and CCD-based MMM, respectively. The decay rate for the conventional multiphoton microscope is the lowest, as expected. The decay is the combined effect of both excitation and emission photon scattering. Since the effect of excitation photon scattering is the same in all three systems, the difference in decay coefficient is due to the effect of emission photon scattering. The decay coefficient c from MAPMT-based MMM (1.87) is lower than the one from CCD-based MMM (2.30). However, it is still higher than the one from the conventional multiphoton microscope. This indicates that the spatial distribution of scattered emission photons is wider than the effective detector area of the MAPMT (45 μm×45 μm), so that some portion of the scattered emission photons is collected in the neighboring channels. The ratio of the intensity sum collected in the neighboring pixels of the MAPMT to the intensity in the correct pixel was approximately 2 at a depth of 2×l_s_em.
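The measured decay coefficients can be compared by plugging them into the exponential model S(z) = exp(−c·z/l_s_em); at a depth of 2×l_s_em the CCD-based system retains roughly an order of magnitude less signal than the conventional microscope:

```python
import math

def relative_signal(c: float, depth_in_lem: float) -> float:
    """S(z)/S(0) for the exponential decay model S(z) = exp(-c * z / l_s_em)."""
    return math.exp(-c * depth_in_lem)

depth = 2.0  # imaging depth in units of l_s_em
for name, c in [("conventional", 1.22), ("MAPMT-MMM", 1.87), ("CCD-MMM", 2.30)]:
    print(f"{name}: {relative_signal(c, depth):.3f}")

# CCD-based MMM loses roughly 9x more signal than conventional scanning here.
ratio = relative_signal(1.22, depth) / relative_signal(2.30, depth)
```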
  • Although a significant portion of the scattered emission photons is still distributed outside the correct pixels in MAPMT-based MMM, these photons can be effectively restored to the correct pixels by post-acquisition image processing. Note that the photons acquired at each pixel are temporally encoded and are organized to form an image based on the known scanner position as a function of time. This is exactly how images are formed in a conventional multiphoton or confocal microscope. A primary image is formed by photons acquired at the correct pixels, corresponding to the fluorophore distribution in that portion of the specimen. Note that the scattered photons in the neighboring pixels are also similarly temporally encoded. Therefore, secondary "ghost" images are formed in the areas of the image covered by the neighboring pixels. As an example, FIG. 4 a is an image of spheres 150 μm deep from the surface in 2% intralipid emulsion. Because the primary image at one pixel is "copied" into neighboring pixels, the spatial distribution of the scattered photons provides information for the reassignment of these scattered photons back to the correct pixel. Note that this temporally encoded information is not available in a CCD-based MMM system, where the temporal information is lost during the integration process of the CCD. The effect of emission photon scattering on imaging can be described as follows. Generally, an image is formed as a convolution of source pixels and an emission point spread function (PSF_em). In the MAPMT-based MMM setup, the array of 8×8 pixels is collected together at each time point, so that a vector of pixels is acquired, {S_acq} (64×1), which is the product of a convolution matrix [C] (64×64) and the source pixels {S_s} (64×1): {S_acq} = [C]×{S_s}. The convolution matrix [C] is constructed based on the simplified PSF_em, EPSF_em, in which PSF_em is spatially integrated over the effective detector area of the individual pixels of the MAPMT. Since EPSF_em has a very coarse spatial resolution of 45 μm after the spatial integration, deconvolution with EPSF_em is simple and less sensitive to noise. The study of emission photon scattering on PSF_em shows that the scattered emission photons form an additional intensity distribution around the PSF_em formed by ballistic unscattered emission photons. This distribution is broad, with a FWHM in the 40 μm range at an imaging depth of 2×l_s_em. The change of PSF_em due to scattering (FIG. 4 c) affects EPSF_em by increasing intensity in the neighboring pixel areas (FIG. 4 d). In the process of deconvolution, EPSF_em is roughly estimated by measuring the intensity ratio of the real image to the ghost images as a function of imaging depth. The convolution matrix [C]_est is constructed based on the estimated EPSF_em. The source pixel vector {S_s}_est is obtained as the product of the inverse of [C]_est and the acquired pixel vector {S_acq}:
    {S_s}_est = [C]_est^−1 × {S_acq}   (2)
  • The restored image is presented in FIG. 4(b). The signal decay of a depth sequence of restored images is measured, and the decay coefficient c is significantly reduced to 1.58 after the deconvolution algorithm, because the scattered emission photons can now be corrected and reassigned. The ghost images are almost completely eliminated as a result. Restoration algorithms can be further refined, such as by adding maximum likelihood estimation to minimize image structural overlap between neighboring pixels. This simple deconvolution approach very effectively improves the performance of MAPMT-based MMM and allows this system to perform within a factor of two of a conventional multiphoton microscope.
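A minimal numerical sketch of the Eq. (2) restoration, with a hypothetical estimated convolution matrix in which each focus keeps a fraction of its signal in the correct MAPMT channel and leaks the rest into its nearest-neighbor channels (the leak fraction and the test data are invented for illustration):

```python
import numpy as np

n_side, n = 8, 64          # 8x8 MAPMT channels, 64-element pixel vectors
keep = 0.6                 # hypothetical fraction landing in the correct channel

# Build [C]_est from a simplified EPSF_em: column i describes where the
# signal generated at focus i is detected across the 64 channels.
C = np.zeros((n, n))
for i in range(n):
    r, c = divmod(i, n_side)
    nbrs = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    nbrs = [(rr, cc) for rr, cc in nbrs if 0 <= rr < n_side and 0 <= cc < n_side]
    C[i, i] = keep
    for rr, cc in nbrs:
        C[rr * n_side + cc, i] = (1 - keep) / len(nbrs)

rng = np.random.default_rng(0)
s_true = rng.poisson(100, size=n).astype(float)  # hypothetical source pixels {S_s}
s_acq = C @ s_true                               # acquired vector with crosstalk, {S_acq}
s_est = np.linalg.solve(C, s_acq)                # Eq. (2): {S_s}_est = [C]_est^-1 {S_acq}

print(np.allclose(s_est, s_true))                # ghosts reassigned to correct channels
```

In practice [C]_est is measured from the real-to-ghost intensity ratio at each depth rather than assumed, and the inversion is applied pixel-vector by pixel-vector during image formation.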
  • The performance comparison of the two MMM systems can also be evaluated for the imaging of biological tissues. The specimen is an ex-vivo brain tissue section with neurons expressing green fluorescent protein (GFP). Thy1-GFP transgenic mice are deeply anesthetized with 2.5% Avertin (0.025 ml/g i.p.) and transcardially perfused with PBS, followed by 4% paraformaldehyde. Brains are dissected and placed overnight in cold 4% paraformaldehyde. 1-mm thick coronal sections are taken by vibrotome, mounted and coverslipped on microscope slides using adhesive silicone isolators (JTR20-A2-1.0, Grace Bio-Labs, Bend, Oreg.). The specimen is imaged in 3D with both CCD-based MMM and a MAPMT-based MMM. The objective used is 20× water immersion with NA 0.95 (XLUMPLFL20XW, Olympus, Melville, N.Y.). The input laser power is 300 mW at 890 nm wavelength. The frame rate is 0.3 frames per second with 320×320 pixels. The slow frame rate is set in order to collect enough emission photons up to 120 μm deep. The total imaging depth is 120 μm with 1.5 μm depth increment. Representative images are shown in FIGS. 5 a-5 i. The first column of images are from CCD-based MMM at surface, 30 μm, and 75 μm deep. The second column of images are the ones from MAPMT-based MMM, raw images and the third column are after deconvolution processing. On the surface, the dendritic structures of neurons are visible in all images. However, the image from CCD-based MMM does not provide as good contrast of neurons as MAPMT-based MMM. This is because some of the emission photons that are initially forward propagating into the tissue are eventually backscattered. These backscattered photons are acquired in the incorrect pixels of the CCD and degrades the image SNR. Starting at about 30 □m, background noise increases and thin dendrite structure becomes invisible in CCD-based MMM images. On the other hand, in the images from MAPMT-based MMM, dendrites are still visible due to lower background noise and higher SNR. 
In the image at 75 μm depth from MAPMT-based MMM, ghost images of a bright cell body appear in the neighboring pixels. The ghost images are restored to the original image after the deconvolution process is applied. It is also noted that the intensity of the original image is increased.
  • However, additional improvements of this system can be made. First, since the MAPMT is positioned in the image plane, the location of each excitation focus corresponds to the center position of the matching pixel of the MAPMT. The effective detector area scales quadratically with the separation of the foci. Therefore, with wider foci separation, the MAPMT has higher collection efficiency for scattered emission photons. In the current configuration, the excitation foci are separated from each other by 45 μm, so that the effective detector area for each channel of the MAPMT is 45 μm×45 μm. The size of the imaging field with 8×8 foci becomes 360 μm×360 μm. As the excitation foci are separated further, the system becomes less sensitive to the scattering of emission photons. The maximum separation of the excitation foci is limited by either the field of view of the objective or the apertures of other collection optics. The 20× water immersion objective used has a field of view of 1000 μm in diameter. This allows positioning the foci as far apart as almost 100 microns in this example.
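The geometry above can be verified with a few lines of arithmetic, using only numbers stated in the text:

```python
# Geometry check for the configuration described in the text: 8x8 foci
# separated by 45 um give each MAPMT channel a 45 um x 45 um effective
# collection area, and the effective area grows quadratically with the
# foci separation.
n_foci = 8
sep_um = 45.0
field_um = n_foci * sep_um     # imaging field edge length
area_um2 = sep_um ** 2         # per-channel effective detector area

assert field_um == 360.0       # 360 um x 360 um field, as stated
assert area_um2 == 2025.0      # 45 um x 45 um per channel

# Doubling the separation quadruples the per-channel collection area:
assert (2 * sep_um) ** 2 == 4 * area_um2
```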
  • A limitation of the MAPMT-based MMM system compared with a CCD-based MMM design is that the signals are de-scanned. In the de-scanned configuration, emission photons pass through more optical elements, including the scanner mirror, before they are collected at the MAPMT, suffering optical loss at each reflection. Further, the de-scanned geometry also has a longer optical path that contributes to the loss of some scattered photons due to the finite aperture of the optics. The signal collection efficiency is approximately 70% in this example due to the additional optical elements. An MAPMT-based MMM system in a non-de-scanned geometry, for example, can recover this loss.
  • The MAPMT is manufactured with a current quantum efficiency of about 20% compared to the 80% quantum efficiency of the CCD camera. However, the MAPMT has very low noise. It has 20 dark counts per second without cooling, which can be several orders of magnitude lower with cooling. Since the MAPMT has a readout rate of approximately 20 kHz, the typical dark count per pixel is less than 1×10−3. In comparison, the CCD noise is dominated by both read noise and dark noise, which are a few counts per pixel. Therefore, for very low photon count situations, i.e., dim samples or high frame rates, the MAPMT system can have superior performance. MAPMTs with higher sensitivity cathode materials such as GaAsP can provide a system with a quantum efficiency up to about 40-50%.
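The dark-count-per-pixel figure follows directly from the two stated rates; a quick check:

```python
# Noise comparison from the text: 20 dark counts per second at a ~20 kHz
# readout rate gives at most 1e-3 dark counts per pixel read, orders of
# magnitude below the few counts per pixel typical of CCD read/dark noise.
dark_rate_hz = 20.0
readout_hz = 20e3
dark_per_pixel = dark_rate_hz / readout_hz

assert dark_per_pixel <= 1e-3   # consistent with the < 1e-3 figure
```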
  • The photon sensitivity of each channel is not equal and can vary by up to 50%. This effect is further compounded by the Gaussian geometry of the excitation beam, which results in higher excitation efficiency at the center pixels versus the edge region. This problem has been solved previously using a multiple-reflection beam splitter to generate equal intensity beam-lets. The MAPMT-based MMM system can be further improved by utilizing this type of beam splitter with an additional flat field correction algorithm to remove the inherent sensitivity non-uniformity of the MAPMT.
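A flat field correction of the kind mentioned can be sketched as follows. This is a minimal illustration under the assumption that a reference image of a uniform fluorescent sample is available; the function name `flat_field_correct` is hypothetical, not from the text.

```python
import numpy as np

# Hypothetical flat-field correction: divide each image by the normalized
# response measured on a uniform fluorescent sample, removing the combined
# effect of per-anode sensitivity variation and the Gaussian illumination
# profile.

def flat_field_correct(image, reference, eps=1e-12):
    gain = reference / reference.mean()   # per-pixel relative response
    return image / np.maximum(gain, eps)  # eps guards against dead pixels

# Toy example: corner pixels respond at half / 1.5x the mean sensitivity.
ref = np.array([[0.5, 1.0],
                [1.0, 1.5]])
img = ref * 200.0                         # a uniform sample seen through the gains
corrected = flat_field_correct(img, ref)  # flat 200.0 everywhere
```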
  • There is also cross talk between neighboring pixels of the MAPMT. The typical cross talk is minimal, at about 2%, when the photons are collected at the center of each pixel. However, this cross talk can be removed by post-processing of the image, similar to the ghost image removal discussed previously.
  • In MMM imaging, more excitation light power is required. Assuming that an input power of 10 mW is needed for each excitation focus, generation of 64 excitation foci requires 640 mW of input power. In the imaging of turbid tissue specimens, more input power is required to compensate for the signal loss due to excitation photon scattering. In the case of a tissue specimen whose mean free path length is 160 μm at the excitation wavelength, an input power of 2200 mW is required to image at 100 μm deep, assuming that the signal level is decreased only due to excitation photon scattering and there is no change in the collection efficiency of emission photons. Therefore, the current power of Ti-Sapphire lasers is a limiting factor for MMM imaging and for further increases in imaging speed through the use of even more foci.
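One plausible model behind these numbers (an assumption on our part, not stated explicitly in the text) is that the required input power grows exponentially with imaging depth, P(z) = P0·exp(2z/ls), where ls is the mean free path at the excitation wavelength. With P0 = 640 mW and ls = 160 μm this reproduces the ~2200 mW figure at 100 μm:

```python
import math

# Assumed power-budget model (reconstructs the text's numbers; not a
# statement of the patent's derivation):
#   P(z) = P0 * exp(2 * z / l_s)

def required_power_mw(p0_mw, depth_um, mfp_um):
    return p0_mw * math.exp(2.0 * depth_um / mfp_um)

p = required_power_mw(640.0, 100.0, 160.0)  # ~2234 mW, close to 2200 mW
```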
  • Referring to FIG. 6, a preferred embodiment provides for a parallelized illumination and detection device which uses a common focusing device, such as an objective lens. The device provides simultaneous measurement of intensity, lifetime, spectroscopic or other information from the focal spots (foci 151, 152, 153). Light from a first illumination light path 141, a second illumination light path 142 and a third illumination light path 143, which paths are oriented at relative angles to one another, enters a common focusing device 110 (such as an objective lens). The focusing device 110 generates from each illumination light path 141, 142, 143 a separate intensity cone. A first detector 121, a second detector 122 and a third detector 123 detect light generated by the intensity cones associated with the first, second and third illumination paths, respectively. Light from the first illumination path 141 illuminates the focus spot 151 in the sample 105, with the detected light following a first illumination and detection light path 161 and first detection light path 111 to reach the first detector 121. Similarly, light from the second and third illumination paths 142, 143 illuminates the focal locations 152 and 153, respectively, in the sample 105, with the detected light following second and third illumination and detection light paths 162 and 163 and second and third detection light paths 112 and 113 to reach the second and third detectors 122 and 123, respectively.
  • In the multi-photon case, light from each path generates a 3D intensity distribution in its associated focus, according to the multi-photon excitation process. The detectors 121, 122, 123 detect all the light in the ‘detection cone’ associated with their active area. This light includes light generated by the light path associated with each detector (for example, light from the first focus 151 is detected by first detector 121), as well as light that is generated in the first focus 151 but scatters around the first focus 151 on its way to the first detector 121, and light that is generated in the second and/or third foci 152, 153 and is then scattered into the detection cone of the first detection light path 111.
  • In the confocal case, a confocal pinhole is placed in front of the detectors; for instance, in FIG. 6 a confocal pinhole can be placed between each detector 121, 122 and/or 123 and the associated reflectors and collimation lenses 126, 127, 128, 131, 132 and/or 133, respectively. As a consequence of the pinhole impeding much of the scattered light, according to the confocal principle, only light from the focal spot associated with that detector is collected in each detector. For example, following the first light path, only light from the 3D light distribution in the first focus 151 is detected by detector 121. A setup could as well consist of a mixture of detectors with and without a confocal pinhole.
  • In order to reduce cross talk between the light beams due to scattering, the illumination light and the associated detection can be time multiplexed.
  • Still referring to FIG. 6, a device according to the invention can become an imaging device by the illumination beams 141, 142, 143 being angle-scanned with respect to the focusing device 110. The imaging of the x-y planes is enabled by rotating the device through two perpendicular angles theta and phi around the x and y axes, respectively. The intensity information is recorded along with the angular position of the device and reconstructed by an image processor. Imaging of zy planes can be achieved as well by scanning the sample with respect to the imaging device in xy. In an imaging mode, in which the beams are scanned, the device is capable of simultaneously generating 2D images of sub-regions of samples. By simultaneously imaging with each separate illumination and detection pathway, the speed at which images are generated can be increased by the number of illumination paths and detection channels.
  • Imaging in the z plane occurs by moving the imaging device with respect to the sample, or vice-versa. The intensity information is recorded along with the z-position of the sample or device and reconstructed by an image processor.
  • Another embodiment according to the invention provides for a multifocal, multiphoton microscope based on a MAPMT, as illustrated in FIG. 7, in which an expanded excitation beam 104A comes from the bottom of FIG. 7 and illuminates a square microlens array 140A. A plurality of optical pathways is generated by the micro lens array 140A in conjunction with lens L1; for instance, in the embodiment illustrated here the microlens array 140A splits the excitation beam 104A into 8×8 multiple beams (i.e., 64 beamlets). In FIG. 7, only two beamlets 141A, 142A are ray-traced. A specimen 105A is scanned with an 8×8 array of excitation foci 150A, which includes focus spots 151A and 152A illuminated by beamlets 141A and 142A, respectively. The sample area that each excitation focus covers can be relatively small: in the x and y directions, the full width half maximum (FWHM) of the focus is 200-1000 nm, and in the z direction the FWHM is 200-5000 nm. In an imaging configuration, each focus scans an area of the size of the foci separation, i.e., 10-1000 microns (the scanning is accomplished by an optical scanner 180A, such as a galvo-scanner). The two lenses L2 and L3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110A. The detection light paths, 111A and 112A, respectively, resemble the illumination light path until the light paths are separated by a light reflector, which is in this case a dichroic mirror 130A. The light is then focused by a common lens L4 onto the multi anode PMT detector 120A. The emission beam-lets are collected at pixels 121A and 122A, respectively, of a multi anode PMT (MAPMT) 120A. The MAPMT 120A, which has the same number of pixels as excitation beamlets, detects the signal of 8×8 pixels synchronized with the scanning. The intensity information is recorded along with the angular position of the scanner and reconstructed by an image processor.
  • As shown in FIG. 8, a further embodiment according to the invention provides for a multifocal, multiphoton microscope based on a MAPMT, in which an expanded excitation beam 104B comes from laser 101B and illuminates a micro lens array 140B. A plurality of optical pathways is generated by the micro lens array 140B in conjunction with lens L1; for instance, in the embodiment illustrated here two beamlets 141B, 142B are ray-traced. A specimen 105B is scanned with an array of at least one excitation focus 151B and/or 152B, which are illuminated by beamlets 141B and 142B, respectively. The scanning is accomplished by a scanner 180B. The two lenses L2 and L3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110B. The detection paths, 111B and 112B, respectively, depart from the illumination light path when separated by dichroic mirror 130B. The light is then focused by a common lens L4 and reflector 134B onto two multi anode PMT detectors 120B, 124B. The MAPMT detectors 120B, 124B each detect the same number of pixels as there are excitation beamlets, integrating the signal of the at least one pixel synchronized with the scanning.
  • A z-piezo actuator 109B (such as the MIPOS 250 SG micro-objective positioning system, integrated strain gauge, motion: 200 μm (closed loop), Piezo System Jena), controllable by controller 170B, is attached to the objective lens 110B in order to move it in the z direction for 3D image generation. The sample 105B is attached to a sample stage 115B, which can be moved in the x, y and z directions, also controllable by controller 170B and/or computer 176B. Light reflector 134B (such as, for example, a dichroic mirror) is positioned in the detection pathways to enable multi channel imaging by a first MAPMT detector 120B and a second MAPMT detector 124B.
  • An IR block filter 116B (such as e700sp Special, Multi-Photon Blocking, Block 750-1000 nm>OD 6, Chroma Technology Corp) is positioned in the detection pathway to separate the long wavelength excitation light from the short wavelength detection light. The filter 116B is exchangeable with a variety of filters or can be removed completely for reflected light confocal imaging. The filter 116B can be mounted on a motorized mount, which allows it to be exchanged via a controller 170B and/or computer interface 176B. A band-pass filter 117B (such as, from Chroma Technology Corp: 560DCXR for the transmission of light generated by the excitation of GFP and Rhodamine, HQ460/40 for the transmission of light generated by the excitation of DAPI, and HQ630/60 for the transmission of light generated by the excitation of Alexa 594) is positioned in front of each of the multi anode PMTs 120B, 124B, in order to detect certain spectra. The band-pass filters 117B, 117B are exchangeable with other different filters and can be mounted on motorized mounts, enabling the changing of filters via a controller 170B and/or computer interface 176B. The same sample region can then be imaged with a different set of band-pass filters for more than two-color imaging.
  • A detection-part light-shield enclosure 118B is used to shield the detection part of the apparatus from ambient light. A variable iris 119B (such as, for a manual version, the D20S Standard Iris, 20.0 mm max. aperture, Thorlabs; motorized versions of equivalent devices are available as well) is positioned in the focal plane of the micro lens array 140B in order to enable single spot illumination. For 8×8 foci imaging, the iris 119B is relatively open, and for fewer or single spot imaging the iris is relatively closed, so that only a few or one micro lenses illuminate the sample. The variable iris 119B does not have to be round in shape; it can be square in shape when only a certain array of micro lenses should be blocked, to enable illumination with a view of selected foci only. The variable iris 119B can be motorized and controlled via controller 170B (such as, for example, by connection 191B), and/or via the computer 176B.
  • A micro lens foci mask 125B (such as a thin (for example 0.3 mm) aluminum sheet in which small holes (for example 0.5 mm holes) are drilled at the points where the micro lenses focus), positioned proximate the micro lens array 140B, is a pinhole mask with a large pinhole size that enables the transmission of most of the light focused by the micro lenses, but otherwise blocks ambient and stray light from the laser.
  • A first reflector 131B generates a first laser reference beam 165B from the incident laser beam (for monitoring the laser illumination power, wavelength and angular pointing stability). The reference beam 165B projects upon the diode or detector 160B which generates a signal that measures the laser illumination power, wavelength and angular pointing stability.
  • A further embodiment of the invention provides for a scan reference beam 166B from a scan reference beam illumination source 168B to be projected via reflector 172B and reflector 132B onto the scan region, whereupon the returning scan reference beam returns via reflector 132B to pass through dichroic 172B and lens 174B to be received by detector 164B. The scan beam is provided for monitoring the scanning accuracy. Detector 164B can be a diode or CCD detector or another type of detection device. As shown in FIG. 8, an embodiment of the invention can provide for the detector 164B to be a CCD camera, which can be used to compare images generated by CCD camera detection methods and other detection methods according to the invention that employ one or more multi anode PMTs as described above.
  • A high voltage power supply 188B supplies power to the multi anode PMTs. Multi channel photon counting cards 184B, 186B are connected to each element of the MAPMTs, with one photon counting device for every multi anode PMT element, such as, for example, MAPMT elements 120B and 124B. A computer 176B (including input devices, such as, for example, a keyboard and mouse) can be provided in one embodiment, connected to computer display 178B. The computer 176B can be connected to controller 170B.
  • The computer 176B controls numerous elements of the invention either directly and/or indirectly through controller 170B, and one skilled in the art will appreciate that numerous alternative configurations can be implemented within the scope of the invention.
  • One embodiment provides for the computer 176B to be programmed with processing software and for the computer 176B to control a number of optical elements through a variety of electronic interfaces. For example, without limitation, the computer 176B and/or the controller 170B can be electronically interfaced with the scanner 180B and the multi channel photon counting cards 184B, 186B to perform the steps of scanning and data acquisition. Further, the computer 176B can perform image post-processing steps. The display 178B can be used to display the acquired images in real-time after further processing.
  • A laser power attenuator 163B can be provided to control the laser incident power. The attenuator 163B can be controlled by the controller 170B and/or by the computer 176B in order to enable power adjustments for different samples and different locations in samples. During imaging at different depths in the sample, for example, the laser power can be automatically adjusted, so that the laser power can be increased at higher penetration depth. The attenuator 163B is integrated in order to make laser power adjustments, such as, for example, low power at the sample surface and increased power at increased penetration depth.
  • A third reflector 133B generates a second laser reference beam 167B from the incident laser beam (also for monitoring the laser illumination power, wavelength and angular pointing stability). This second laser reference beam 167B projects upon a second diode or detector 161B to generate a signal that measures the laser illumination power, wavelength and angular pointing stability. A laser power attenuator 163B controls the laser incident power and is integrated in order to make laser power adjustments, such as, for example, low power at the sample surface and increased power at increased penetration depth. Laser 101B is an illumination light source, such as a titanium sapphire laser (Mai Tai HP, Spectra Physics).
  • Multi-photon microscopy works most efficiently with short laser pulses. Owing to dispersion, the optical elements in the illumination pathway broaden the initially short laser pulse. The pulse compressor 102B, built from a pair of standard high reflectance mirrors and a pair of prisms (IB-21.7-59.2-LAFN28, material: LaFN28; CVI Laser Corp., Albuquerque, N. Mex. 87123) mounted on translational and rotational stages, pre-chirps the laser pulse in order to attain a short laser pulse in the focus of the objective lens.
  • A confocal pinhole array optionally can be placed between either of the multi anode PMT arrays 120B, 124B and the band-pass filters 117B, 117B, respectively. This option enables the system to be used for confocal microscopy or for multi-photon microscopy with confocal detection.
  • A telescope 103B expands the laser beam. With different expansion ratios, a different number of micro lenses can be illuminated. With a small beam expansion, for example, a relatively smaller array of 2×2 micro lenses can be illuminated and, thus, an array of only 2×2 foci is generated. As the beam expansion is made larger, an array of 8×8 or more micro lenses can be illuminated and, thus, an array of 8×8 or more foci is created. A further preferred embodiment employs a set of at least two mirrors 135B, 136B after the telescope 103B for precise beam alignment.
  • A mechanical micro lens holder 145B enables the precise positioning of the micro lens array 140B with respect to the multi anode PMTs 120B, 124B in the x, y and z directions. The holder 145B can be a motorized holder and can be controlled through a computer interface 176B, or, alternatively, can be controlled via a controller 170B, which controller in turn can be directed by computer 176B.
  • Mechanical multi-anode PMT holders 125B, 126B enable the precise positioning of the multi anode PMTs 124B, 120B, respectively, with respect to the micro lens array 140B in the x, y and z directions. The holders 125B, 126B can be motorized holders and can be controlled through a computer interface 176B, or, alternatively, can be controlled via a controller 170B, which controller in turn can be directed by computer 176B.
  • The computer 176B, or the controller 170B, or the computer and controller together can be configured to control automatically or to control in supervised fashion, one or more of the following elements, without limitation: the scan reference beam illumination source 168B, the sample piezo stage 115B, the objective z-piezo stage 109B, the scan reference beam detector 164B, the scanner 180B (by connection 193B), the IR block filter 116B (by connection 194B), the band-pass filters 117B (for example, by connection 195B), the laser source 101B, the laser attenuator 163B, the first laser reference beam detector 161B, the second laser reference beam detector 160B, the pulse compressor 102B, the multi-photon channel counting cards 184B, 186B, the mechanical multi-anode PMT holders 125B, 126B (for example, by connection 190B), the variable iris 119B (such as, for example, by connection 191B), and the mechanical micro lens array holder 145B (for example, by connection 192B).
  • The focal region has a focal pattern variation in the xy plane. The foci can be distributed unevenly, e.g., the rows and columns do not have to be spaced uniformly.
  • Also, a system can be built in which there are additional rows and/or columns of PMTs at the outer region of the array of detection tubes. For example, there can be more than 8×8 rows and columns in both the micro lens array and the detector. This is particularly important for detecting scattered photons of the outer foci and for using the information of the scattered photons from the outer foci for deconvolution purposes.
  • Further, an embodiment of the invention provides for a system in which there are more detector elements than there are foci, so that a plurality of detector elements (or detection pixels) collect the photons of one optically conjugated focus. For example, a 16×16 detector array can be used as a detector device, while an array of 8×8 foci can be illuminated by an 8×8 multi lens array. Smaller and larger PMT-to-foci ratios can be utilized. In particular, a detector array in which one focus is optically conjugated to an uneven number of detector elements can be employed. This is important for detecting scattered photons in the channels neighboring the detector element that is optically conjugated to the focus, and for using the information of the scattered photons for deconvolution purposes.
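As a hypothetical illustration of such an odd conjugation ratio (the array sizes and function name below are ours, chosen for the example), the sketch assumes a 24×24 detector array with each of 8×8 foci centred on a 3×3 block of elements; summing the block recovers scattered photons that land on the neighbouring elements:

```python
import numpy as np

# Hypothetical layout: 24x24 detector elements conjugated to 8x8 foci,
# each focus centred on a 3x3 block.  The odd block size gives a centre
# element plus a ring of neighbours that collect scattered photons, which
# can later be used for deconvolution.

def focus_signal(det, i, j, block=3):
    """Sum the block x block detector region conjugate to focus (i, j)."""
    r0, c0 = i * block, j * block
    return det[r0:r0 + block, c0:c0 + block].sum()

det = np.zeros((24, 24))
det[1, 1] = 90.0   # centre element of the block for focus (0, 0)
det[0, 2] = 10.0   # scattered photons landing on a neighbouring element
```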
  • The image of the sample is formed by scanning in the optical plane (xy) when the intensity signal from the detectors is correlated with the foci positions. The foci scan the specimen in the x direction, then move an increment in the y direction, and then raster in the x direction again until the sample is fully covered at some desired resolution. During the rastering, intensity light signals are recorded by the multi anode PMT. These signals are then saved along with the foci positions in the computer and can be displayed concurrently or afterwards by the computer display or other graphics outputs. The foci positions are known from the scanner position (beam scan) or the sample position (stage scan). The smaller the step increments, the higher the resolution of the final image. The scanning can be performed in a raster fashion, or in many other ways, such as with time multiplexed methods, or by scanning simultaneously at different depths.
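The correlation of detector signals with foci positions can be sketched as follows. This is an assumed data layout, not taken from the text: each channel's pixel stream fills one tile of the final mosaic, and with 8×8 foci and 40×40 scan steps this yields the 320×320-pixel frame size mentioned earlier.

```python
import numpy as np

# Sketch of mosaic assembly for an 8x8-foci system (assumed record format):
# records[k] = (sx, sy, counts), where (sx, sy) is the scan position inside
# a tile and counts[i, j] is the photon count from the channel conjugate
# to focus (i, j).  Each channel fills one tile of the final image.

def assemble(records, tiles=8, tile_px=40):
    image = np.zeros((tiles * tile_px, tiles * tile_px))
    for sx, sy, counts in records:
        for i in range(tiles):
            for j in range(tiles):
                image[i * tile_px + sy, j * tile_px + sx] = counts[i, j]
    return image

# Toy scan: 40x40 positions, every channel reporting sx + sy counts.
records = [(sx, sy, np.full((8, 8), float(sx + sy)))
           for sx in range(40) for sy in range(40)]
image = assemble(records)   # 320 x 320 mosaic
```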
  • Referring to FIG. 9, an embodiment of the invention provides for generating and detecting a 3D foci pattern in focal region 154C. A source of light is directed onto a micro lens array 140C and a plurality of optical pathways is generated by the micro lens array 140C in conjunction with lens L1; for instance, in the embodiment illustrated here, the microlens array 140C splits the excitation beam 104C into 8×8 multiple beams (i.e., 64 beamlets). In FIG. 9, only one beamlet 141C is ray-traced. A specimen 105C is scanned with an 8×8 array of excitation foci, which includes focus spot 151C illuminated by beamlet 141C. The scanning is accomplished by an optical scanner 180C. The two lenses L2 and L3 guide the plurality of optical pathways onto the rear aperture of the focusing device 110C. The detection light path, 111C, resembles the illumination light path until the light paths are separated by a light reflector, which in this case is a dichroic mirror 130C. The light is then focused by a common lens L4 onto the multi anode PMT detector pixel element 121C. Changing the focal length or the positions of the micro lenses of the microlens array 140C with respect to each other generates collimated and non-collimated beams at the back aperture of the objective lens 110C. These beams generate a 3D pattern of foci. The 3D pattern of foci generates light which is collected by the detector array. According to the positions of the micro lenses along the optical axis, the positions of the PMTs are changed correspondingly. In the case of one-photon illumination, such as is illustrated in the “Option I” detection region 124C, no confocal pinholes are placed in front of the detectors. In an alternative embodiment, such as is illustrated in the “Option II” detection region 126C, a plurality of confocal pinholes 128C are placed in front of the plurality of detector cells. Each of the detection options 124C, 126C can be used for single photon and/or for multi photon imaging.
The MAPMT, which has the same number of pixels as excitation beamlets, integrates the signal of 8×8 pixels synchronized with the scanning, although other array dimensions can also be used.
  • FIGS. 10(a)-(i) illustrate in greater detail the arrangement and progression of foci corresponding to the relative shifting in position of micro array lenses and MAPMT pixels shown in FIG. 9. FIG. 10(a) shows an expanded detail of the focal region with the 3D foci pattern. FIG. 10(b) illustrates an array of excitation light beams (in this case, an array of 2×8 beams) illuminating a focusing device 110D, such as, for example, an objective lens, as viewed here in the x-z plane. In accordance with differing degrees of collimation of at least two of the light beams, focal points 151C and 152C are created for the two beams at certain distances, d1 and d2, respectively, along the optical axis (z-axis). According to the relative angle of the illumination light beams with respect to the optical axis, the array of foci is separated in the optical plane. Controlling parameters of the beams provides selection of a variety of 3D foci distributions. For collimated light, the excitation focus 151C is at a distance from the objective equal to its focal length, fobj, designated here as distance d1. For a second beam that is not collimated perfectly, an excitation focus 152C is at a distance d2 that is not equal to the focal length, as depicted in FIG. 10(b). FIG. 10(c), depicting a “static” view of an x-y plane “slice” at focal depth d1, illustrates a first row of 8 foci (of the 2×8 array in this example) all lying at the same focal depth d1, understanding that any one of these foci may correspond with the 151C focus point in the x-z plane view of FIG. 10(b). Similarly, FIG. 10(d), depicting a “static” view of a second x-y plane at focal depth d2, illustrates a second row of 8 foci (of the 2×8 array) all lying at the same focal depth d2, any one of which foci might correspond with the 152C focus point in the x-z plane view of FIG. 10(b). FIG.
10(e) illustrates the separation of the two x-y planes as viewed in the y-z plane, where it can be seen that the two rows of foci are separated by a difference in focal depth D=d1−d2. FIG. 10(f) shows a 3D, 8×8 beam matrix (64 beams) of excitation foci that have been generated. Each row of 8 foci lies in a different z plane, as depicted in a view of the same set of 64 foci as seen in the y-z plane (FIG. 10(g)); this is a graphic depiction of a “still” configuration, i.e., without the array being moved in a scanning mode. FIG. 10(h) illustrates an x/y view in a scanning configuration, where each line of foci is scanned in the xy plane to cover the whole xy image in its particular z-plane. A number of xy planes are shown simultaneously, but actually each plane lies at a different focal depth on the z-axis. FIG. 10(i) provides a view of a section in the yz plane, illustrating the scanning configuration while the z/y scan is performed, i.e., scanning along the y axis and through multiple depth layers in z. As a result, a 3D volume can be imaged by only scanning the foci array in xy. Note that the x/y, x/z and y/z coordinates illustrate the associated planes; they can be displayed with an arrow in their positive or negative direction.
  • 3D AM-PMT MMM can be used in a multi-photon endoscope device in accordance with another preferred embodiment of the invention (to be added).
  • Referring to FIG. 11(a), a further embodiment provides for a 3D cytometer, based on a multi-focal, multi-photon microscope with a multi anode PMT detector. A 10 W solid state pump laser 100D pumps a titanium sapphire laser 101D (Millennia X & Tsunami, Spectra Physics, Mountain View, Calif.), which generates a maximum output power of 2.5 W at 800 nm and 120 fs pulses at a repetition rate of 76 MHz. The light is conducted through two reflectors 137D, 138D and then passes through a first telescope 103D, two additional reflectors 139D, 133D, an attenuator 163D, and a second telescope 203D. After passing through another reflector 136D, the light is subsequently split into an array of beams by the micro lens array 140D and is transmitted by lenses L1, L2 and L3 onto the back aperture of the objective lens 110D, thus creating multiple foci in the focal plane. The micro beams are scanned by an xy-scanner 180D (Cambridge Technologies, Cambridge, Mass.). The fluorescence is collected by the same lenses and separated from the illumination light by a dichroic filter 130D and a two-photon block filter 116D. The fluorescence passes through lens L4 and is then separated into two spectral channels by the dichroic filter 134D and directed onto the multi-anode PMTs 124D, 120D. The degree of spectral separation can be chosen depending upon the application. The embodiment disclosed here uses a red/green and a green/blue filter to accomplish the spectral separation. The variation of the magnification of the telescope 203D enables the utilization of, for example, 4×4, 6×6 or 8×8 arrays of micro lenses, among other size arrays.
  • FIG. 11(b) illustrates an image of the array of foci in the focus of the objective lens, such image as can be taken by a CCD camera, where here the foci are not scanning. The foci are 45 μm apart, resulting in a potential scanning field of 240 μm when 6×6 foci are utilized.
  • FIG. 11(c) shows a z-profile and a corresponding fit function of a 200 nm bead. The system shows a resolution of 2.4 μm, which is close to the theoretical value of 2.2 μm, considering the underfilling of the back aperture of the objective lens. Acquisition speed for this scanning profile was 10 frames per second. The profile is averaged over 5 consecutive pixels, reducing the sampling from 30 nm per pixel to 150 nm per pixel.
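The five-pixel averaging step can be reproduced with a generic binning routine (the function name `bin_profile` is ours, not from the text); averaging every 5 samples coarsens a 30 nm/pixel profile to 150 nm/pixel:

```python
import numpy as np

# Bin a 1D intensity profile by averaging every n consecutive samples,
# as described for the bead z-profile (n = 5: 30 nm -> 150 nm sampling).

def bin_profile(profile, n=5):
    profile = np.asarray(profile, dtype=float)
    usable = len(profile) - len(profile) % n   # drop any ragged tail
    return profile[:usable].reshape(-1, n).mean(axis=1)

z = np.arange(30)         # 30 samples at 30 nm spacing
binned = bin_profile(z)   # 6 samples at 150 nm spacing
```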
  • Referring to FIG. 12(a), a further embodiment of the invention provides for multi-color detection MMM in the xz-plane. An array of 2×8 beams is generated by the micro lens array 140E. The setup here is illustrated with two 1×8 beam lines. The distance between the foci in each line is determined by the combination of the source beam configuration and the micro lens array 140E. Two light beams are conducted through the micro lens array 140E and intermediate optics onto the focal plane of the microscope, in which they create two lines of 1×8 foci. For simplified visualization, in FIG. 12(a) only 3 of the 16 beam traces in a 2×8 setup are illustrated. The full field is then scanned by the mirror oscillation of the scanning mirror 180E, in which the scanning amplitudes need to be adapted to the distances of the foci. On the detection side a holographic diffraction grating 192E is incorporated that diffracts the multiple wavelengths emitted from the sample onto the photo-multiplier arrays of two stacked multi-anode PMTs 120E, 124E. In this setup the two multi-anode PMTs are stacked on top of each other, each serving as a spectral detection device for one line of 1×8 foci. The grating 192E properties (pitch/inch) and the focal length of the focusing lens (L4), which determines the distance between the grating and the multi-anode PMT, have to be chosen in accordance with the anticipated fluorescent probes used for staining the tissue sample. For this embodiment, a transmission grating 192E is used. Nevertheless, comparable and/or better efficiency can be achieved in embodiments that use a reflection grating or a prism. FIG. 12(b) illustrates the illumination foci and their scanning in the focal xy-plane. Scanning is indicated for two arrays of 8 foci each. FIG. 12(c) shows the detection path of two beams projected in the yz-plane through grating 192E and lens L4 onto the stack of two multi-anode PMTs 120E, 124E. FIG. 12(d) shows the detection path projected in the xz-plane, where the beams are depicted passing through grating 192E and lens L4, with each of eight color bands being collected by the two multi-anode PMTs 120E and 124E. FIG. 12(e) illustrates the anodes of the multi-anode PMTs 120E, 124E in the x/y plane, showing that the 8×8 anode arrays of the detectors each detect one of the two 1×8 beam lines, where each 1×8 beam line has been diffracted by the grating 192E into eight color bands.
  • Referring to FIG. 13, a beam splitter device 400 can be used to create a homogeneous intensity profile over a plurality of beamlets. Depending upon its design, the beam splitter splits one beam into 256, 128, 64, 36 or 16 approximately equally powered beams by one or more fully reflective or semi-transparent mirrors. In FIG. 13, 50% and 100% indicate the percent reflectance of the mirrors used, where a series of fully reflective mirrors 420 with one longer semi-transparent mirror 410 splits the beam in the x-plane (BS-X). By combining two such cubes in series, it is possible to generate a 2D array of beamlets. The internal optics of a second beam-splitting cube for the y-plane (BS-Y) are the same as for the x-plane beam-splitting cube. The beams are then focused by micro lens 430 (or via other multifocal optics) through lens 432 and objective lens 434 onto the focal plane.
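For the power-of-two array sizes, the equal-power splitting can be modeled as an idealized lossless cascade of 50% mirrors; a hedged sketch (the 1.2 W input and the 3+3 splits are illustrative values, not taken from the specification):

```python
def beamlet_grid(total_power_w, splits_x, splits_y):
    """Idealized lossless splitter: each pass through a 50% mirror doubles
    the beam count and halves each beamlet's power; two such cubes in
    series (x then y) yield a 2D array of equally powered beamlets."""
    nx, ny = 2 ** splits_x, 2 ** splits_y
    p = total_power_w / (nx * ny)
    return [[p] * nx for _ in range(ny)]

grid = beamlet_grid(1.2, 3, 3)                 # hypothetical 1.2 W into 8x8
print(len(grid), len(grid[0]), grid[0][0])     # 8 8 0.01875
```

Non-power-of-two counts such as 36 would require a different mirror arrangement than this binary cascade.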
  • FIGS. 14(a)-14(d) illustrate additional preferred embodiments for providing multifocal illumination. In FIG. 14 a, a micro-lens array 140N creates a plurality of beams from expanded beam 201N; in FIG. 14 b, a diffractive optical element 205N separates beam 201N into a plurality of beams which are coupled to focal locations as previously described herein. In FIG. 14 c, a plurality of optical fibers 220N can be used to provide a plurality of beams with lens L1 for delivery to the focal locations or spots. As seen in FIG. 14 d, the fibers 220N can direct beams in different directions for smaller or greater focal separation.
  • In addition to the primary use of this instrument for two photon microscopy, other multi-photon sensing and imaging methods can also be used with the system described herein including:
  • 2, 3, or more photon excitation microscopy,
  • second, third or more Harmonic Generation microscopy,
  • coherent anti-Stokes Raman scattering (CARS) microscopy,
  • multi photon quantum dot imaging,
  • surface plasmon imaging, and
  • Stimulated Emission Depletion (STED) microscopy.
  • With the implementation of a confocal pinhole array, shown in FIG. 8, confocal microscopy can be performed with the same instrument.
  • FIGS. 15(a)-15(d) illustrate further preferred embodiments for use with detectors, which can be a multi-anode PMT or an array of single detectors connected via optical fiber. The detectors can be PMTs or avalanche photodiodes, or the detector array can be a combined device (like a multi-anode PMT) connected via optical fiber, an avalanche photodiode array, a CMOS imaging detector, a CCD camera in which each pixel or each area of binned pixels is correlated to one focus, or a CCD camera in which more than one pixel or more than one binned pixel area is correlated to one focus. As seen in FIG. 15 a, the detector 210P can be coupled directly to optical fibers 220P which receive light from lens L1. As shown in FIG. 15 b, individual detectors 210P can collect at different angles, or as seen in FIGS. 15 c and 15 d, a detector array 212P can detect at the same or different angles, respectively.
  • Referring to FIG. 16, the optimal number of foci for a two-photon excitation process at a certain laser power can be determined for samples with different damage thresholds. The optimal number of foci will depend on (i) the damage threshold, (ii) the quadratic dependence of the two-photon signal on the laser power, and (iii) the limited amount of laser power. In the graph shown in FIG. 16, the laser power is limited to 1.2 W at the sample, while the damage threshold of the sample is 10 mW, 20 mW or 50 mW. As a result, the optimal number of foci is 120, 60 and 24, respectively. In general, the appropriate power level for two-photon imaging is constrained by two basic boundary conditions: (a) the minimum accepted signal-to-noise ratio determines the minimum power that can be used, whereas (b) the damage threshold of the sample determines the maximum power. In an MMM system, the limited laser power is distributed over a large number of foci. The best signal is obtained when the number of foci is chosen in a manner such that each of the foci delivers a power level just below the damage threshold for the sample. The relationship is illustrated in FIG. 16. Beyond this optimum, less signal is obtained from the sample as more foci are used, owing to the squared dependence of signal on laser power. A judicious choice of power levels and of number of foci must be made in order to obtain optimal results. Therefore, a preferred method and system provides for a versatile system in which the number of foci can be varied with respect to the sample threshold. The threshold can be different for different penetration depths into the sample and can therefore be adjusted by the attenuator 163B.
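The relationship between available power, damage threshold, and focus count can be worked through in a short calculation that reproduces the values quoted for FIG. 16:

```python
def optimal_focus_count(total_power_mw, threshold_mw):
    """Largest number of foci with each focus at or below the damage
    threshold: N = floor(P_total / P_threshold)."""
    return total_power_mw // threshold_mw

def relative_signal(total_power_mw, n_foci, threshold_mw):
    """Two-photon signal scales as N * P^2, with per-focus power clamped
    at the damage threshold."""
    p = min(total_power_mw / n_foci, threshold_mw)
    return n_foci * p ** 2

# 1.2 W at the sample; damage thresholds of 10, 20 and 50 mW
print([optimal_focus_count(1200, t) for t in (10, 20, 50)])   # [120, 60, 24]

# Beyond the optimum, adding foci reduces signal (squared dependence):
print(relative_signal(1200, 120, 10) > relative_signal(1200, 240, 10))  # True
```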
  • Time multiplexed illumination and detection enable MMM microscopy with only one detector, which is gated to the excitation light pulse. In one variation of a multi-photon MMM, the illumination light source is a pulsed laser; a Ti:Sa laser with a repetition rate of approximately 80 MHz and a pulse width of approximately 100-200 fs serves as an example. In the standard illumination version of this MMM, all beams carry the same pulse distribution along time. As a consequence, the array of excitation foci in the focal region is formed simultaneously. For image formation, during or after at least one pulse has illuminated the sample, the beam or sample is scanned on both axes perpendicular to the optical axis; here indicated by x≧0.
  • In the example of multiplexed operation 500 shown in FIGS. 17 a and 17 b, 36 detection elements collect light from 18 simultaneously illuminated spots. The delay between the illumination pulses 502 is alternated between foci, shown at 504. This configuration can also be accomplished with more detection channels per simultaneously illuminated focus. For image formation, during or after at least one pulse has illuminated the sample, the beam or sample is scanned on both axes perpendicular to the optical axis; here indicated at 506 by x≧0. Depending on the particular configuration, there are several different advantages. First, the light in non-corresponding detection channels has an additional time delay relative to light from the foci corresponding to the detector channel 508 that is receiving light at a particular time. In fast processes, the resulting signals may not overlap, or may overlap only minimally, and can thus be registered to the proper foci directly. In slower processes, where the overlap may be significant, the temporal separation will aid numerical registration and deconvolution algorithms. Furthermore, when the number of detectors matches or exceeds the number of illumination foci, the response in the neighboring non-corresponding detectors can be used to generate additional information about the sample. The temporal delays introduced into the illumination foci mean that this supersampling condition exists even when the number of detectors is the same as the number of foci.
  • In another example, an alternating excitation foci pattern can be detected by a multichannel detector with a smaller number of elements than the number of foci.
  • In cases in which the repetition rate of the laser is lower than in the case of the Ti:Sa laser, time multiplexed MMM illumination and detection in a single channel can be used. In this particular case, the repetition rate of the laser is a hundred times lower than in the previous examples. In a time multiplexed version, each beam carries a pulse which is temporally separated with respect to the pulses of the other beams. In one particular case, they are separated evenly over the time period of one laser repetition, so that at evenly distributed time points a single focus is illuminated at a time. If a fast detector is correlated with the pulse distribution and capable of detecting each pulse separately during this short time period, an MMM with only one detection element can be used. This detection element has a corresponding detection area to collect light which is generated by each individual focus during its scan. For image formation, during or after at least one pulse cycle has illuminated the sample, the beam or sample is scanned on both axes perpendicular to the optical axis. As a result, optical cross-talk is completely eliminated, as the light from the different foci is excited and detected at different time points. Applications for this case are excitation processes which appear instantly, such as scattering effects (for example, Second Harmonic Generation (SHG) or Coherent Anti-Stokes Raman Scattering (CARS)). This configuration is well suited for a non-de-scanning configuration. Other repetition and detection rates are possible.
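The time-gated assignment of detected light to foci can be sketched as arrival-time arithmetic; the 1250 ns period (roughly a hundred times the 12.5 ns Ti:Sa period) and the 16-focus array are illustrative assumptions:

```python
def focus_for_arrival(t_ns, n_foci, period_ns):
    """With the pulses to the N foci staggered evenly across one laser
    period, the arrival time modulo the period identifies which focus
    emitted the light."""
    slot = period_ns / n_foci
    return int((t_ns % period_ns) // slot)

# Hypothetical slow laser: 1250 ns period, 16 time-multiplexed foci
print(focus_for_arrival(100.0, 16, 1250.0))    # slot width 78.125 ns -> focus 1
print(focus_for_arrival(1249.0, 16, 1250.0))   # last slot -> focus 15
```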
  • In the case of using a pixellated detector such as a CCD or a CMOS imager, an array of 3×3 beams, FIG. 18 a, illuminates the focusing device, forming an array of 3×3 foci (FIG. 18 b). As seen in FIG. 18 c, the image of the scattering distribution of the foci is imaged by the CCD camera. The wide-field image is accumulated per scanned illumination point. The scattering distribution of each focus can be recorded on many CCD pixels. The illumination or sample can be scanned and the wide-field data can be further processed to form an image or statistical representation from many object points. This configuration can be employed in a non-de-scanning configuration as well.
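Correlating blocks of binned pixels to foci can be illustrated with a small block-summing sketch; the 12×12 frame and the assumption that the 3×3 foci map onto equal pixel blocks are hypothetical:

```python
def integrate_foci(frame, foci_per_side=3):
    """Sum detector counts in each focus's block of pixels, assuming the
    3x3 foci map onto a 3x3 grid of equal binned-pixel blocks."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // foci_per_side, w // foci_per_side
    return [[sum(frame[i * bh + a][j * bw + b]
                 for a in range(bh) for b in range(bw))
             for j in range(foci_per_side)]
            for i in range(foci_per_side)]

frame = [[1] * 12 for _ in range(12)]   # hypothetical 12x12 detector frame
print(integrate_foci(frame))            # each 4x4 block sums to 16
```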
  • The above described systems and methods can be used for imaging of all semi-transparent and highly scattering materials; 2D and 3D, and in particular for imaging of human and animal tissue, cells in suspension, plants and other biological material.
  • The illumination can be achieved with visible light and alternated with the MMM scanning measurement, or out-of-band illumination light can be used and the camera measurement can be taken simultaneously with the MMM measurement. This configuration can be used for large field imaging, sample guided MMM measurements, conventional staining measurements, and online MMM measurement process control, for example, bubble formation monitoring and laser spot diagnostics.
  • There is a large variety of fluorescent dyes that can be used with various embodiments of the invention. In general they fall into two families: dyes that have to be applied to stain the tissue “from the outside” and dyes that are expressed by animals as proteins. Dyes most commonly applied by external staining include MitoTracker Red, DAPI, Hoechst 33342, Cy2-IgG, Alexa Fluor 546, Rhodamine 123, Alexa Fluor 488, FITC-IgG, Acridine Orange, Cy3-IgG, Lissamine Rhodamine, TexasRed-Phalloidin, TexasRed-IgG, Alexa Fluor 594, and Propidium Iodide. Dyes genetically expressed by genetically modified animals include green fluorescent protein (GFP) and other dyes in this family: Enhanced GFP (EGFP), Yellow Fluorescent Protein (YFP), and Enhanced YFP. Auto-fluorescent imaging does not use a particular dye, but can be used as part of an imaging technique.
  • Besides confocal microscopy (fluorescent, as well as reflected light confocal), these include all other multi-photon microscopy techniques, such as 2, 3, or more photon excitation microscopy; Second (SHG), Third (THG) or more Harmonic Generation microscopy; Coherent Anti-Stokes Raman Scattering (CARS) microscopy; multi-photon quantum dot imaging; surface plasmon imaging; and Stimulated Emission Depletion (STED) microscopy. These techniques can be used with or without staining methods. The scattering techniques, such as SHG, THG and CARS, are developed to be able to image without any staining involved.
  • FIG. 19 illustrates a probe or endoscope apparatus according to an embodiment of the invention, having a handle portion 272F and an insertable probe portion 270F, wherein light delivered from a light source 244F (which can be a laser or other light source) is delivered through an optical wave guide 234F (such as, for example, optical fiber, hollow fiber optics, photonic band gap fiber, or other wave guide) to an optical connector 224F (such as, for example, a pigtail), whereupon an expanded beam 104F passes through a lens, or optionally through lens pair telescope 103F, and then through a micro lens array 140F, or other optical device that creates a plurality of optical pathways. The illumination path can then pass through lens L1, dichroic 130F and lenses L2 and L3 onto the rear aperture of the objective 110F. The beam is made to scan by scanner 180F, which can tilt in the x and/or y directions, and the return fluorescent signal is directed by dichroic 130F and reflector 136F, optionally through an IR block filter 116F, through lens L4 and optionally a band pass filter 117F onto a multi-anode PMT detector 120F. In an alternative embodiment, a plurality of confocal pinholes 119F are placed in front of the plurality of detector cells. The detector 120F can be connected to a controller 170F and to an image processing computer 176F. The scanner 180F can also be controllably connected by electrical connector 193F to a controller 170F and/or computer 176F. In FIG. 19, the proportions of the endoscope bear a relationship to the focal distances of the micro lens array, fm, and of the lenses L1-L4, being fL1, fL2, fL3 and fL4, to the offset distances d1 and d2, and to the relative size of the various elements.
  • Referring to FIGS. 20(a)-(b) through FIGS. 24(a)-(b), in which common elements share the same numbering between figures, the active area, relative proximate orientation of active detector elements (such as, for example, the active area of multiple anode photomultiplier tube detector elements), and the distance of the foci and the intermediate optics have an important relationship to the effectiveness of detecting scattered light from one or more light spots in a sample specimen, as explained in the following.
  • FIG. 20(a) depicts one PMT and one excitation focus, and the direction of scattered light with respect to the detection light cone of the active detection area will control whether or not the photon will be detected. FIG. 20(a) provides an illustration of how the size of the active detection area relates to scattered light detection from a spot created by multi photon excitation, as follows: An illumination light beam 204G coming from the left (parallel solid lines), generates a multi photon excitation light spot 251G (so-called excitation point spread function) in the sample 105G, in which the structure causes a multi-photon excitation process. Within this 3D sample region, light is generated according to the multi-photon excitation principle and scattered on its path (such as, for example, an auto-fluorescent tissue sample). The potential detection path is illustrated by the very thin bounding lines enclosing the stippled shaded region, which are geometrically determined by the side boundaries of the active PMT detection area 222G of the detector 120G. In this configuration, all the photons that propagate within the detection cone, indicated by the shaded region, and that travel in the direction of the detector 120G, are collected in the active detection area 222G. Photons that propagate in the opposite direction, or that are scattered outside of the detection cone defined by the optics and thus outside of the active detection area 222G of the detector 120G, are not detected. This is depicted in FIG. 20(b) as well, where the detection area 224G (dashed box) in the sample focal plane 210G corresponds to the active detection area 222G, while a potential scattering region 254G (circle) extends beyond the confines of the detection area 224G. Three examples of photon path are shown in FIG. 
20(a), as follows: (1) An unscattered photon 261G, traveling in the opposite direction to the incident light beam, follows a path 262G (solid line), which path 262G lies within the detection light cone and travels towards the active detector area 222G and is thus collected; (2) a first scattered photon 271G that is scattered within the detection cone follows a path 272G (short-dash line) and travels in the direction of the active detector area 222G and is thus detected; and (3) a second scattered photon 281G follows a path 282G (long-dash line) which travels generally in the direction of the detector but does not fall into the detection light cone of the active PMT detection area 222G, and thus it is not detected. Light generated in the spot 251G is detected by the same objective lens 110G. It can also be detected by an opposing objective lens and collected by a detector associated with a detection area on the opposing side of the light spot; in that case, photons in the detection cone of the second, opposing lens, traveling towards the direction of the incident light, are collected by the opposing detector.
  • Referring to FIG. 21(a), when scattered light from one excitation focus 251G (single excitation point spread function (PSF)) is detected in a setup with two PMT elements 120G, 124G, the gap 232G (also marked as “g”) between the active detection area 222G of detector element 120G and the active area of the second element 124G will correspond to the gap 236G in FIG. 21(b) between two detection areas 224G, 226G in the sample focal plane. If the scattering region 254G around the excitation focus 251G extends into the detection area 226G, then the photon 281G that is scattered beyond the detection cone for active detector area 222G of detector 120G can follow photon path 282G into the adjacent detector 124G.
  • FIG. 21(a) thus depicts scattered light detection from a spot 251G created by multi-photon excitation, detected by two large-area detectors 120G, 124G positioned next to each other and separated by a gap. In this case, the unscattered photon 261G and the first scattered photon 271G are still collected by the active detection area 222G of the first detector 120G. The second scattered photon 281G is not lost, but is collected by the second detector 124G. This effect of light being scattered into detectors other than the optically conjugated detectors is termed “optical cross talk”.
  • FIG. 22(a) illustrates scattered light detection with two PMTs 120G, 124G and two excitation foci 251G, 252G, where again the issue of “optical cross talk” is relevant. Here a second illumination light path 206G, at an angle Ψ1 with respect to the first illumination light path 204G, creates a second focus 252G (excitation PSF) at a distance δ from the illumination light spot 251G. In FIG. 22(a), only one unscattered photon 291G is illustrated, in order to simplify the drawing. This photon 291G, originating from 252G, follows optical path 292G into detector 124G (the illustration includes a collimating lens between the reflector and the detector, and a refraction of path 272G by that lens is depicted; owing to constraints in the size of the drawing, and to better illustrate the features emphasized here as aspects of an embodiment of the invention, no refraction in the paths of 282G and 292G is depicted). Light originating from the second light spot 252G will be collected by the second detector 124G, but also by the first detector 120G, because photons from the second light spot 252G are scattered similarly to photons from the first light spot 251G. FIG. 22(b) illustrates this by showing scattering area 256G overlapping both the detection areas 226G and 224G (again the gap 236G between the detection areas in the focal plane will correspond with the gap 232G between the boundaries of the active detector regions). As a result, scattered light from light spot 252G falls in the detection cone of first detector 120G and will be detected by that detector 120G (i.e., optical crosstalk). With increased scattering of photons in the sample, the optical crosstalk increases. Samples with a low mean free path length (MFP) for photons will induce photons to scatter more (scattering more times at equal traveling lengths); thus, samples with low photon MFP will induce higher optical cross talk.
In addition, increasing the imaging depth raises the probability that a photon will be scattered on its way through the sample to the detector, because the traveling length in the highly scattering medium is longer. Because light with longer wavelength is scattered significantly less, applications with relatively longer detection wavelengths have often been preferred, in order to reduce the effect of optical cross talk. However, preferred embodiments of the invention provide methods for reducing optical cross talk without having to shift to longer wavelengths.
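The depth and wavelength dependence of cross talk can be illustrated with a Beer-Lambert estimate of the ballistic (unscattered) photon fraction; the depths and the 200 μm mean free path below are illustrative values, not figures from the specification:

```python
import math

def unscattered_fraction(depth_um, mfp_um):
    """Beer-Lambert estimate of the fraction of photons that traverse
    depth_um of tissue without a scattering event, given a mean free
    path mfp_um. Deeper imaging or a shorter MFP (shorter wavelength)
    leaves fewer ballistic photons and hence more cross talk."""
    return math.exp(-depth_um / mfp_um)

for depth in (50, 100, 200):   # hypothetical imaging depths in microns
    print(depth, round(unscattered_fraction(depth, mfp_um=200.0), 3))
```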
  • Referring to FIG. 23(a), for instance, the invention provides for reducing optical cross talk by increasing the gap distance between the excitation foci, this gap distance depicted as γ, and simultaneously increasing the gap 232G between the active detection areas in the detection elements. In this case, the second illumination light path 206G is separated further, by an angle Ω2>Ψ1, from the first light path 204G, generating an illumination light spot 252G at a larger distance Δ>δ from the illumination light spot 251G. The detectors 120G, 124G are also separated from each other by increasing gap 232G between the active detector areas to a value “G”, where G>g, such that the scattered photon path 282G no longer falls into the second detector 124G. FIG. 23(b) illustrates this by showing no overlap between scattering regions 254G, 256G and the neighboring detection areas 226G, 224G, respectively. With this configuration, optical cross talk is reduced because fewer photons end up in the “wrong” channel; however, some scattered photons will not be detected because their paths will pass between the active detection areas of the more widely separated detectors. Thus, detection light (signal) is lost.
  • Referring to FIG. 24(a), a preferred embodiment of the invention provides for reducing optical cross talk without inducing signal loss, by increasing the distance between the excitation foci and simultaneously increasing the active detection area of the detector elements. By separating the foci and the associated detectors, the optical cross talk is reduced. By increasing the active area of the detectors, most scattered photons are collected. In FIG. 24(a) this is depicted by the second scattered photon path 282G being collected by its corresponding detector 120G. FIG. 24(b) shows that the expanded detection areas 224G, 226G can encompass the scattering regions around their respective foci, so that scattered photons are collected by their corresponding detectors.
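The mapping between detector-plane dimensions and sample-plane detection areas is a simple division by the detection-path magnification; a sketch with hypothetical numbers (a 20× path, a 1 mm anode gap, and 2 mm vs. 4 mm active areas are assumed values, not the specification's):

```python
def focal_plane_distance(detector_mm, magnification):
    """Map a distance on the detector (e.g. an anode gap or active-area
    width) into the sample focal plane by dividing by the detection-path
    magnification; result in microns."""
    return detector_mm / magnification * 1000.0

# Assumed 20x detection path: a 1 mm anode gap ~ 50 um in the sample plane;
# widening the active area from 2 mm to 4 mm widens the conjugate detection
# area from 100 um to 200 um, capturing more of the scattered light.
print(focal_plane_distance(1.0, 20.0))   # 50.0
print(focal_plane_distance(4.0, 20.0))   # 200.0
```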
  • In a further embodiment, changing the optical configuration of the apertures and focal lengths of the lenses in the optical system can create the same effect. Changing the aperture and the focal length of the micro lens array, increasing the area of the scan mirror, and changing the apertures and the focal lengths of the lenses L1, L2, L3 and L4 have a similar effect of reducing cross talk without loss of signal. An example case is presented in tabular format in FIGS. 25(a)-(e).
  • The optical configurations for two alternative embodiments of the invention, Example A and Example B, are presented in FIGS. 25(a)-(e). FIG. 25(a) lists the objective lens specifications, which are the same for both Examples A and B (i.e., Olympus, 180 mm tube lens, XLUMPLFL 20× magnification objective; water immersion; 0.95 numerical aperture; 17.1 mm back aperture; 2 mm working distance; 9 mm focal length; 22 mm field number; and 1.1 mm corrected field).
  • FIGS. 25(b)-(c) list the details of the illumination path. A different micro lens array is described for each embodiment, but in both embodiments the micro lenses are square shaped.
  • In Example A, the side aperture pitch of each micro lens is 1.06 mm and the diagonal is about 1.5 mm. For the entire array in Example A the 8 lenses per side create a side aperture of 8.48 mm and a diagonal aperture across the array of 11.99 mm. The focal distance of each micro lens is 17 mm in Example A. In the embodiment of Example B, the aperture or pitch of each micro lens in the array is 1.9 mm and its focal distance is 25 mm.
  • The focal lengths of the lenses L1 and L4 are 50 mm and 103.77 mm, respectively, in Example A, while in Example B they are 40 mm and 46.32 mm, respectively. When standard optical components are used, L4 can be approximated by components with focal distances of 100 mm and 45 mm in configurations A and B, respectively. In both the embodiments of Examples A and B, the focal lengths of lenses L2 and L3 are 30 mm and 125 mm, respectively. The diameter of illumination of the back aperture of the objective lens for Examples A and B remains approximately constant at 13.0 mm and 12.7 mm, respectively. This results in an ‘under-illumination’ of the back aperture of the objective lens, which is 17.1 mm in diameter. This is desirable, as it warrants an optimal (maximal) employment of the illumination light power.
  • As listed in FIG. 25(d), the two embodiments, Example A and B, achieve different distances between excitation foci in the optical plane: Example A has a foci distance of 46 microns, whereas Example B has a foci distance of 103 microns. The total optical field, listed below the foci distance, results from the fact that in this particular case, an 8×8 configuration of foci is chosen. It has a square side of 366 microns for configuration A and 821 microns for configuration B, when the foci are scanned. FIG. 25(e) lists the details of the detection path for each example. These alternative examples, A and B, can be created according to the layouts of either FIG. 7 or FIG. 8, according to embodiments of the invention. In both cases the size of the MAPMT can remain constant at about 2 mm×2 mm. Employing a MAPMT with larger detection elements can increase the detection efficiency.
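The listed total fields follow approximately from the foci spacing and the 8×8 array; a rough sketch (the approximation side ≈ spacing × foci per side is an inference, and lands close to, but not exactly on, the tabulated values):

```python
def scanned_field_side(foci_distance_um, foci_per_side):
    """Rough side of the total square field when each focus scans the
    area up to its neighbor: side ~ foci distance x foci per side."""
    return foci_distance_um * foci_per_side

print(scanned_field_side(46, 8))    # ~368 um vs. 366 um listed for Example A
print(scanned_field_side(103, 8))   # ~824 um vs. 821 um listed for Example B
```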
  • FIGS. 26(a) and (b) illustrate the foci distribution in the focal plane of the objective lens for the embodiments of Examples A and B, respectively. The conjugated detection area of each channel of the multi-anode PMT is larger by a factor of 5 in Example B than in Example A.
  • The optimal distance between the foci is influenced by three factors: (1) the optimal number of foci needed to generate as much light as possible (this number can be determined in accordance with the graph in FIG. 16); (2) the corrected field of the focusing device, such as an objective lens (the larger the corrected field, the further the foci can be separated from each other and the more the optical cross talk can be reduced); and (3) the numerical aperture (NA) of the objective lens for high resolution imaging (the larger the numerical aperture of the objective lens, the more photons can be collected and the better the images are). Nevertheless, there is a compromise between the NA of the lens and its effective field of view. Therefore, the objective lens used in a most preferred embodiment of the invention has a large NA of around 1.0 or greater and is capable of imaging a large effective field of view, preferably of approximately 1-6 mm.
  • At a fixed number of foci, a large field objective provides an advantage for certain embodiments of the invention, because the foci can be further separated. An objective lens with a large field of view enables large separation of foci and thus reduces optical crosstalk. In FIGS. 26(c) and (d), two objective lenses with different fields of view are shown, a 600 micron objective field versus a 6000 micron objective field, respectively. The conjugated detection area of each channel of the multi-anode PMT associated with the focal plane within this field of view is a factor of 100 larger for the objective of FIG. 26(d) versus FIG. 26(c). Commercially available objective lenses with a large numerical aperture (NA) of around and above 1 usually have a field of view for which they are corrected of between around 200 microns (100× objectives) and 1000 microns (20× objectives). With the Olympus XLUMPLFL20× water immersion objective mentioned above and used in embodiment Examples A and B, when an array of 8×8 foci is employed, the optimal distance between the foci is 111.11 microns and the total field imaged is approximately 1000 microns.
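One reading consistent with the quoted 111.11 micron figure is that the corrected field is divided into (foci per side + 1) intervals; this formula is an inference from the numbers, not stated in the specification:

```python
def optimal_foci_spacing(corrected_field_um, foci_per_side):
    """Inferred relation: spacing = corrected field / (foci per side + 1);
    for an 8x8 array in a 1000 um corrected field this gives
    1000 / 9 = 111.11 um between foci."""
    return corrected_field_um / (foci_per_side + 1)

print(round(optimal_foci_spacing(1000.0, 8), 2))   # 111.11
```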
  • In the embodiments shown in FIGS. 7 and 8, the active detection area of the different detector channels in the MAPMT is approximately 2 mm and is limited by the commercially available MAPMT devices. If this area is increased, both the optical cross talk and the light collection efficiency are increased.
  • One embodiment of a method for data post-processing according to the invention is illustrated in FIG. 27(a) and provides for data post-processing starting at a step 310H. Step 310H can include initiating a computer program and/or software application automatically as part of a data acquisition step in a computer that is connected directly to the imaging apparatus and/or can include a series of human-supervised data-processing steps. The data processing can be automatically initiated by the computer and proceed entirely automatically according to a data-processing control software application, and/or the program may proceed semi-automatically with opportunities for human supervision and intervention in one or more of the data processing steps. An embodiment of the data-processing method follows the start step 310H with a next step to load the image data 312H. This can include accessing raw data and metadata from storage devices, where metadata (data about the data) includes, inter alia and without limitation: foci number; pixel dimensions; pixel spacing; channels; instrument parameters (including, without limitation, optics, objective, illumination, wavelengths, beam-splitting, phasing, polarity, light pumping, pulse compression, chirping, upconversion, dispersion, diffraction, source-light properties, source light stability, source attenuation, reference scanning, micro lens configuration and properties, focal lengths, filter types and positioning, detection configuration, detector type, detector active area, detector sensitivity and stability, and other detector specifications and properties, inter alia); sample properties and sample information, such as, for example, for biological samples (including biological and non-biological information, such as, for example, tissue type, specimen type, size, weight, source, storage, tracking, scattering properties, stain/dye type, specimen history, and other physical properties of the specimen) or sample properties and
sample information for chemical and/or physical material samples; and scanner data, including, without limitation, scanning type, scanning mode, scan method, tracking, frequency, certainty, precision, scan stability, resolution and other information about the scanning. Data can be stored as XML, text, binary or in any fashion, in electronic form and/or in retrievable and scannable physical formats. In one preferred embodiment of the data post-processing method, the next step 314H is deconvolution of the image data, which deconvolution is described further below. Following this, the data can be saved in an optional step 316H, whereupon the post-processing can optionally be stopped 318H. An embodiment also allows the processing to continue to a next step 320H that comprises performing an intensity normalization on the data, which normalization steps are described in more detail below, then optionally saving the data (step 322H), and stopping the data processing sequence (step 324H).
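The sequence of steps 310H through 324H can be sketched as a simple driver routine. This is an illustrative sketch only; the function names (`load_image_data`, `deconvolve`, `normalize_intensity`, `post_process`) and the synthetic data are hypothetical stand-ins for the operations described above, not the actual implementation.

```python
import numpy as np

def load_image_data():
    """Step 312H: load raw image data and metadata (synthetic stand-ins here)."""
    data = np.random.default_rng(0).poisson(50, size=(192, 192)).astype(float)
    metadata = {"foci": (6, 6), "pixels": data.shape, "channels": 1}
    return data, metadata

def deconvolve(data):
    """Step 314H: placeholder for the deconvolution described below."""
    return data.copy()

def normalize_intensity(data, flat_field):
    """Step 320H: multiply by the normalized inverse of a flat-field image."""
    inv = 1.0 / flat_field
    return data * (inv / inv.max())

def post_process(save=lambda d, tag: None):
    data, meta = load_image_data()                # step 312H
    data = deconvolve(data)                       # step 314H
    save(data, "deconvolved")                     # step 316H (optional)
    flat = np.full(meta["pixels"], data.mean())   # uniform-dye stand-in
    data = normalize_intensity(data, flat)        # step 320H
    save(data, "normalized")                      # step 322H (optional)
    return data, meta                             # step 324H: stop
```

The `save` callback is optional, mirroring the optional save steps 316H and 322H in the figure.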
  • Referring to FIG. 27(b), another embodiment of the invention provides additional and/or alternative methods for data post-processing. The post-processing steps can include a number of substeps. Step 332H can include accessing metadata from a storage device, including here by reference all the description of possible metadata described above for the steps illustrated in FIG. 27(a). Step 334H can include normalizing, filtering (de-noising), and blending (integrating) of multifoci subimages, and further can include registering subimages into a single image. Step 336H can include filtering and normalizing images produced from corrected subimages. Step 338H can include registering, building mosaics, and blending sets of corrected images into a larger whole. Also, optionally, at this step 338H, an embodiment of the method of the invention provides for creating lower resolution images of the larger image to facilitate access, as well as images from different perspectives (such as image views taken of the xy-, xz-, and/or yz-planes) and creating data-compressed versions of the data and/or results (e.g., JPEG, wavelet compression, inter alia without limitation). Step 340H can include segmenting images into objects, which segmentation step can be either manual, automated, or a combination of both. Step 342H can include parameterizing the objects, samples or specimens (such as, for example, size, shape, spectral signature). Step 344H can include classifying objects into higher order structures/features (e.g., material stress or cracks, vasculature, nuclei, cell boundaries, extra-cellular matrix, and location, inter alia, without limitation). Step 346H can include statistically analyzing parameterized objects (such as, for example, by correlation methods, principal component analysis, hierarchical clustering, SVMs, neural net classification, and/or other methods).
Step 348H can include presenting results to one or more persons on one or more local or distant display devices (examples include: 3D/2D images, annotated images, histograms, cluster plots, overlay images, and color coded images, inter alia).
  • The post-processing steps can include a number of substeps, including, among other steps, those illustrated in FIG. 27(b), without limitation:
  • i) after data access 332H, normalizing, filtering (de-noising), and blending (integrating) of multifoci subimages 334H;
  • ii) registering subimages into a single image;
  • iii) filtering and normalizing images produced from corrected subimages 336H;
  • iv) registering, building mosaics, and blending sets of corrected images into a larger whole 338H;
  • v) optionally, at this stage, creating lower resolution images of the larger image to facilitate access, as well as images from different perspectives (xy, xz, yz);
  • vi) producing data-compressed versions (e.g., JPEG, wavelet compression, inter alia without limitation);
  • vii) segmenting images into objects 340H; this segmentation can be either manual, automated, or a combination of both;
  • viii) parameterizing the objects 342H (for instance, size, shape, spectral signature);
  • ix) classifying objects into higher order structures/features 344H (e.g., material stress or cracks, vasculature, nuclei, cell boundaries, extra-cellular matrix, and location, inter alia, without limitation);
  • x) statistically analyzing parameterized objects 346H (e.g., by correlation or other methods);
  • xi) presenting results to the user on a display device 348H (examples include: 3D/2D images, annotated images, histograms, cluster plots, overlay images, and color coded images).
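Substep iv) (registering, building mosaics, and blending corrected images into a larger whole, step 338H) can be illustrated with a minimal blending sketch. It assumes registration has already produced known (row, column) offsets for each tile, which is a simplification; `build_mosaic` is a hypothetical name, and overlaps are blended here by straight averaging.

```python
import numpy as np

def build_mosaic(tiles, positions, mosaic_shape):
    """Blend a set of corrected sub-images into a larger whole.

    tiles: list of 2D arrays; positions: (row, col) top-left corner of
    each tile in the mosaic. Overlapping regions are averaged.
    """
    acc = np.zeros(mosaic_shape)
    weight = np.zeros(mosaic_shape)
    for tile, (r, c) in zip(tiles, positions):
        h, w = tile.shape
        acc[r:r+h, c:c+w] += tile
        weight[r:r+h, c:c+w] += 1.0
    weight[weight == 0] = 1.0   # avoid division by zero in uncovered gaps
    return acc / weight
```

For example, two 4×4 tiles placed at columns 0 and 2 of a 4×6 mosaic overlap in two columns, which receive the average of the two tiles.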
  • FIGS. 28(a) and 28(b) relate to image normalization and show the multifoci power map of the MMM when a micro lens array is implemented. The numbers represent the power in each focus in the sample. Due to the Gaussian beam profile in an example, 51.2 mW are inhomogeneously spread over the 36 foci. 24.1 mW contribute to foci lying beyond the 6×6 foci matrix and are thus lost.
  • The normalized signal distribution corresponds to the square of the normalized power map and shows an intensity drop of 45% toward the corner PMTs with respect to the center PMTs. The laser power was attenuated to 75.3 mW in the sample and can reach a maximal value of approx. 645 mW, corresponding to a power of approx. 15 mW for the center foci in the sample. A measured intensity profile can be generated by imaging a homogeneously distributed fluorescent dye under a cover slip. The intensity measurement is determined not only by the power/intensity distribution of the foci, but also by the sensitivity of the detector array; as a result, it represents the "true" measured intensity distribution. The image consists of 192×192 pixels and was generated by an array of 6×6 foci which were scanned across a uniform fluorescent dye sample.
  • 2D xy Image Normalization is Carried Out in Different Ways:
  • Case 1: The normalized inverse of this intensity image (from a uniform fluorescent dye) is multiplied with the xy images taken of the sample. The resulting images are then displayed and saved as normalized images.
  • Case 2: A large number of images from a sample at various positions (and thus with a random underlying intensity structure) is averaged. This averaged image is then inverted and normalized. This image, multiplied with the original data, is then displayed and saved as a normalized image.
  • Case 3: A simplified image is generated which consists of 36 sub-images (generated by the 6×6 foci). Each of the sub-images carries the average intensity generated by the specific focus. For example, all 32×32 pixels in the top left sub-image carry the same number: 45. The image is then inverted and normalized. This image, multiplied with the original data, is then displayed and saved as a normalized image. Such an image can be generated either from the intensity image generated by the process of case 1 (fluorescent image) or of case 2 (averaged over many images). 3D xyz image normalization is carried out in a similar fashion as in case 2 of the xy image normalization. A z-intensity profile (an example is FIG. 28(b)) is generated by averaging the intensity signal of the xy planes from different positions in z. As the penetration depth increases, the average intensity decreases along the z-axis. In order to get a good average intensity for the z-intensity profile, images from a sample at various positions (and thus with a random underlying intensity structure) are averaged. This z-intensity profile is then inverted and normalized. Each image plane is then multiplied with the corresponding normalization number generated by this process.
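Case 1 and the 3D z-profile normalization can be sketched in a few lines of NumPy. The function names `normalize_xy` and `normalize_z` are hypothetical, and the clipping guard against division by zero is an added assumption, not part of the described method.

```python
import numpy as np

def normalize_xy(image, flat_field):
    """Case 1: multiply the image by the normalized inverse of the
    flat-field intensity image (from a uniform fluorescent dye)."""
    inv = 1.0 / np.clip(flat_field, 1e-12, None)
    return image * (inv / inv.max())

def normalize_z(stack):
    """3D case: average each xy plane to get a z-intensity profile,
    then invert, normalize, and apply the per-plane factor."""
    profile = stack.mean(axis=(1, 2))          # one value per z plane
    inv = 1.0 / np.clip(profile, 1e-12, None)
    inv /= inv.max()
    return stack * inv[:, None, None]
```

Applying `normalize_z` to a stack whose planes dim with depth equalizes the plane means, as the z-profile normalization above describes.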
  • A method for multifocal multiphoton imaging of a specimen in accordance with a preferred embodiment of the present invention:
  • (0) Start
  • (1) Sample pre processing (optional)
  • (2) place the sample in the region of focus of the focusing device (objective lens)
  • (3) determine imaging parameters
  • (4) set imaging parameters
  • (5) image
  • (6) Process images for feedback purposes (optional)
  • (7) display the images (optional)
  • (8) save the data
  • (9) process the data (optional)
  • (10) save the processed data (optional)
  • (11) display the processed data (optional)
  • Concerning the order of the steps, (1) and (2) can be switched; (4), (5) and (6) can be switched.
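The step sequence above can be sketched as a driver that takes every step as a caller-supplied callable. All names here are hypothetical stand-ins, not the actual system; optional steps and the permitted order swaps are noted in comments.

```python
def run_imaging(sample, steps):
    """Sketch of steps (0)-(11); `steps` maps step names to callables,
    all hypothetical stand-ins for the operations described above."""
    steps["preprocess"](sample)                 # (1) optional; may swap with (2)
    steps["position"](sample)                   # (2) place sample in focus region
    params = steps["determine_params"](sample)  # (3)
    steps["set_params"](params)                 # (4); (4)-(6) order can vary
    images = steps["image"](sample, params)     # (5)
    steps["save"](images)                       # (8); (6), (7) optional, omitted
    return steps["post_process"](images)        # (9)-(11) optional
```

A usage example wires in trivial lambdas for each step; in practice each callable would drive the actual hardware or software component.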
  • In more detail, these are:
  • (1) Sample Pre Processing
  • Apply tissue staining
  • Apply optical clearing agents
  • (2) Place the Sample in the Region of Focus of the Focusing Device (Objective Lens):
  • Determine desired sample region
  • Place the sample in the region of focus of the focusing device (objective lens): or
  • Place the region of focus of the focusing device (objective lens) on to the sample
  • (3) Determine Imaging Parameters
  • Determine imaging parameters (which can include variable or multiple values for each of the parameters) by
      • measurements performed manually or in an automated fashion outside or within the imaging procedure comprising
        • the region of interest
          • Sample shape
          • Sample margins
          • Emission intensity
          • Emission wavelength
          • Emission polarization
          • . . . *
        • sample
          • type
          • photons mean free path
          • power threshold
          • sample fixation
          • sample labeling
          • . . . *
        • illumination
        • detector
        • . . . *
        • The imaging parameters comprise
          • illumination wavelength
          • illumination power
          • illumination polarization
          • scanning speed
          • maximal penetration depth
          • sampling
          • . . . *
  • (4) Set Imaging Parameters
  • A computer program is fed with the imaging parameters, which are potentially dynamically adjustable (including feedback from the measurement), and controls the imaging procedure.
  • (5) image
  • Point measurement.
  • 1D Scan: Collect Data from a Region of Interest
  • 2D Scan:
  • Image a 2D region of interest by scanning the foci in parallel across the imaging plane (XY)
  • the 2D scan starts, for example, at a corner of an area and then proceeds in a raster until the area is covered according to the imaging parameters (current implementation)
  • or it can be scanned in any other way (even a random scan is allowed), as long as the area is covered according to the imaging parameters and the position of the foci is known by the signal sent or received by the scanner.
  • A key consideration for the improvement in the measurement is that the detector measures the sample for each scanning position, without overlap.
  • 3D Scan:
  • move the focusing device (objective lens) relative to the sample along the optical axis (Z) and repeat the 2D imaging process. Either the focusing device (objective lens) or the sample can be moved. In the current implementation, the focusing device (objective lens) is moved stepwise with respect to the fixed sample.
  • 2D imaging along the optical axis (Z) can begin at any point in the sample and end at any point of the sample within the region of interest.
  • the current movement, though, depends on the application:
      • (a) begin scanning a 2D scan of the top layer of the tissue sample, or
      • begin scanning a little outside the sample, to be able to determine the top, or
      • begin scanning within the sample, to prevent beam-sample interactions (such as burning) which take place at the surface-immersion-medium boundary.
      • (b) then move the focusing device (objective lens) by means of a piezo in the direction of the sample in increments determined by the imaging parameters
      • start at (a) again
      • Stop at a point in the sample, which is determined by the imaging parameter
      • Move the piezo back to its original position
      • Move the sample to a different position and start with (a) again (area imaging)
  • For some applications it is preferable if the movement can be reversed (starting inside the sample and then moving out), or performed in a random fashion covering the whole area, as long as the z-position is known.
  • The z position of the foci is known because the piezo position is known.
  • The 2D scanning is done while the z-scan from one position to the next takes place or after the z-scan has completed its move to the next position.
  • Images of 2D sections can be done alone, without any 3D movement involved.
  • (6) Process Images for Feedback Purposes (Optional)
  • (7) Display the Images (Optional)
  • (8) Save the Data
  • Process data before saving (Optionally)
  • (9) Post Process the Data (Optional)
  • Image normalization
  • linear image deconvolution
  • nonlinear image deconvolution
  • (10) Save the Processed Data (Optional)
  • (11) Display the Processed Data (Optional)
  • In addition to mechanistic applications, time-resolved measurements, either alone or in conjunction with spectral measurements, can greatly aid in distinguishing signals from different reporter probes and processes, such as simple scattering and non-linear scattering. For cytometry applications, the additional information from time-resolved measurements can potentially increase the number of probes which can be used simultaneously, provide images of cell morphology by detection of second harmonic generation, and aid in deconvolution of images from highly scattering samples.
  • FIGS. 29(a)-29(c) relate to a deconvolution process. FIG. 29(a) shows illumination foci in the optical plane (foci f11-f33 are illustrated in an enlarged view) along with an object illuminated by focus f22. In FIG. 29(b) the detection signals are scattered, shown along with the detection areas a11-a33. FIG. 29(c) is an example of the signal counts detected by the associated channels of the multi-channel detector (the signal from area a11 is collected by detector channel c11, that from a12 by c12, and so on) at a certain time point, when focus f22 scans the center of the object of FIG. 29(a). The relative signal distribution between the channels depends on the mean free path of the detection photons in the medium and on the penetration depth. It is constant, however, if a homogeneous scattering distribution is assumed (for many samples this can be assumed in a first approximation). A low mean free path means high scattering, and thus a higher amount of signal in channels other than c22. As the penetration depth into the sample increases, the chance that a photon is scattered on its way to the detector array increases, and thus the described "optical cross talk" increases as well. More scattered light is also found in the channels neighboring the outer channels c11, c12, c13, c21, c23, c31, c32 and c33; the signals in these detection elements are smaller, though, and are not illustrated for simplicity.
  • FIGS. 30(a)-30(d) display a one-dimensional (1D) deconvolution exemplifying the final 2D deconvolution executed in the linear image deconvolution. For simplification only nearest neighbors are shown. A linear convolution with a delta function with inverted side lobes (FIG. 30(c); illustrated only along one channel-number direction) results in a linearly de-convolved image in which only channel c22 carries a signal. In practice, this function can either be modeled or measured. The deconvolution process is shown for simplicity only in the x direction; it is carried out in both x and y directions, and results in an image in which only channel c22 carries a signal.
  • Assuming a homogeneously scattering material (which can be assumed for samples in a first approximation), the relative and absolute heights of the peaks of the delta function are fixed for every channel and its neighbors at a certain imaging depth into the sample. As a result, xy images can be linearly de-convolved.
  • The linear deconvolution of cross-talk is primarily a 2D process. The values of the weighting matrix depend on several factors. The optical contribution to the cross talk increases with increasing penetration depth. Furthermore, the channels have different sensitivities, there is electronic cross talk between channels that varies from channel to channel and other factors influence the amount of total cross talk between the channels.
  • The cross talk for each individual channel can be determined experimentally. In one example, one focus illuminates the sample or a test object and the whole array of detectors detects the signal. A cross-talk matrix is measured for each channel at different penetration depths; this matrix is then used to carry out the deconvolution. Data from such a measurement at the sample surface and at a penetration depth of 200 μm can be used. The measurement is repeated for every channel, for example by moving the iris from transmitting light from one single micro lens to the next (in this case for channels c11 to c33). Similar alternative methods are also possible, for example illuminating with all of the foci but using a sample with large object spacing. Furthermore, models can also replace experimental determination.
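The measured cross-talk matrix and the resulting linear deconvolution can be sketched as follows, shown for simplicity with flattened channel vectors rather than the full 2D kernel. The function names are hypothetical; each column of the matrix is the detector-array readout measured while a single focus illuminates the sample (iris passing one micro lens at a time).

```python
import numpy as np

def measure_crosstalk_matrix(responses):
    """Build the cross-talk matrix: responses[j] is the readout across
    all channels when only focus j illuminates the sample."""
    M = np.array(responses, dtype=float).T     # channels x foci
    return M / M.sum(axis=0)                   # normalize each column

def deconvolve_channels(signals, M):
    """Linear deconvolution: recover per-focus signals from the
    cross-talk-mixed channel readings at one scan position."""
    return np.linalg.solve(M, np.asarray(signals, dtype=float))
```

In a toy two-channel example with 10% cross talk, mixing true signals [100, 50] gives readings [95, 55], and solving against the measured matrix recovers the originals; a separate matrix would be used at each penetration depth, as described above.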
  • For a non-linear deconvolution, an entire 2D image consists of collections of ensembles of pixels from each detector. The key point is that relationships between entire ensembles, and between certain regions of pixels in different ensembles, can be established to constrain the variation of the weighting matrix and aid convergence with no assumptions, or minimal assumptions, about the sample or the processes which cause the variation in the weighting matrix.
  • For example, continuity of the values across the boundaries of the ensembles can generally be required. Under the minimal assumption that the objects under observation are smaller than the region covered by an individual ensemble, the ensembles can be considered largely independent, except for the cross-talk introduced by the weighting matrix. The ideal image can then be recovered by simultaneously solving for a weighting matrix which minimizes the covariance between ensembles. In the other case, where the objects under observation are of similar size or larger than the regions covered by the ensembles, minimal models of the object (such as from image morphology or segmentation of the collected image) can be used to form constraints.
  • Additional model-dependent and model-independent constraints can also be applied by consideration of the planes above and below the plane under evaluation. Further constraints can be applied to the weighting matrix from either general considerations (such as continuation, smoothness, sharpness) or model-based considerations.
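Under the minimal assumption that ensembles are largely independent, the weighting-matrix search described above can be driven by a covariance objective. The sketch below evaluates that objective only (the optimizer itself is omitted); `ensemble_covariance_cost` is a hypothetical name.

```python
import numpy as np

def ensemble_covariance_cost(W, mixed):
    """Objective for the non-linear deconvolution: unmix the per-detector
    pixel ensembles with candidate weighting matrix W and sum the squared
    covariances between distinct ensembles. For largely independent
    ensembles, the correct W decorrelates them and minimizes this cost.

    mixed: array of shape (n_ensembles, n_pixels)."""
    unmixed = np.linalg.solve(W, mixed)
    C = np.cov(unmixed)                  # n_ensembles x n_ensembles
    off = C - np.diag(np.diag(C))        # keep only the cross-covariances
    return float((off ** 2).sum())
```

As a check, mixing two independent pixel ensembles with a known cross-talk matrix and evaluating the cost shows that the true matrix scores lower than the identity, which is the property an optimizer would exploit.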
  • While this invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.

Claims (160)

1. A multifocal imaging system comprising:
a multifocal optical device that provides a plurality of optical pathways;
a scanner that provides relative movement between the plurality of optical pathways and a material to be imaged;
an optical system that couples light from the optical device onto a region of interest of the material;
a detector array that detects light from a plurality of focal locations in the region of interest to generate image data, the detector array having a plurality of detector elements correlated with the focal locations; and
an image processor connected to the detector array.
2. The system of claim 1 wherein the scanner comprises a rotating mirror or a resonant mirror.
3. The system of claim 1 wherein each detector element has a collection area corresponding to a scattering distribution for each of the plurality of focal locations.
4. The system of claim 1 wherein the detector array detects a fluorescence signal from each focal location.
5. The system of claim 1 wherein the detector array comprises a multi-anode photomultiplier tube imaging detector having at least 64 detector elements.
6. The system of claim 1 further comprising a focusing lens system that adjusts a depth of focus within a sample in the range of 0 μm to 2000 μm.
7. The system of claim 1 wherein the detector comprises an array of photomultiplier elements.
8. The system of claim 1 further comprising a computer program that forms images including a deconvolution of pixel values with a scattering correction function.
9. The system of claim 1 further comprising a computer program that processes the image data.
10. The system of claim 9 wherein the program comprises a linear deconvolution process.
11. The system of claim 9 wherein the program comprises a non-linear deconvolution process.
12. The system of claim 9 wherein the program comprises a deconvolution process including scattering correction function.
13. The system of claim 12 wherein the image data comprises a three dimensional representation of a scanned region of interest, the representation having a plurality of pixel values, the scattering correction function including a plurality of adjacent pixel values for each pixel value of the representation.
14. The system of claim 13 wherein the plurality of adjacent pixel values comprises a weighting matrix that corrects for light scattering from tissue along adjacent optical pathways.
15. The system of claim 13 further comprising a holder for a material to be scanned.
16. The system of claim 1 wherein each of the plurality of optical pathways defines a plurality of focal locations in an imaging plane.
17. The system of claim 9 wherein the program comprises a normalization process.
18. The system of claim 1 wherein the multifocal optical device comprises a micro lens array.
19. The system of claim 16 wherein a distance between adjacent focal locations in the imaging plane is between 40 and 200 microns.
20. The system of claim 18 wherein the micro lens array has at least 64 lens elements.
21. The system of claim 1 wherein the detector array comprises a first detector array and a second detector array.
22. The system of claim 1 wherein the optical system comprises a moveable objective lens.
23. The system of claim 22 wherein the objective lens moves along an axis through the region of interest.
24. The system of claim 1 wherein the optical system comprises a first lens and a tube lens.
25. The system of claim 18 further comprising an iris defining an exit aperture of the micro lens array.
26. The system of claim 1 further comprising a confocal pinhole array adjacent to the detector array.
27. The system of claim 1 further comprising a bandpass filter adjacent to the detector array.
28. The system of claim 1 further comprising a dichroic mirror that reflects light returning from the region of interest onto the detector array.
29. The system of claim 1 further comprising a first reflector positioned along an optical path between a light source and a micro lens array, the first reflector coupling light to a first detector.
30. The system of claim 1 further comprising a beam expander positioned between a light source and a micro lens array, a second reflector and a second detector.
31. The system of claim 21 further comprising a reflector that separates light returning from the region of interest onto a first optical path towards the first detector array and onto a second optical path towards a second detector array.
32. The system of claim 21 wherein the first detector array is a first photomultiplier array and the second detector is a second photomultiplier array.
33. The system of claim 15 wherein the holder is moveable in three orthogonal directions.
34. The system of claim 1 further comprising a light source.
35. The system of claim 34 wherein the light source comprises a laser.
36. The system of claim 34 further comprising a pulse compressor optically coupled to the light source.
37. The system of claim 34 further comprising an attenuator that adjusts light intensity.
38. The system of claim 18 further comprising a moveable micro lens array holder.
39. The system of claim 38 wherein the moveable micro lens array holder scans in three orthogonal directions.
40. The system of claim 1 further comprising a detector lens that focuses light returning from each focal location onto a corresponding detector element.
41. The system of claim 1 further comprising a controller connected to the scanner that controls scanning speed and resolution.
42. The system of claim 41 wherein the focal locations are separated from each other by at least 10 microns.
43. The system of claim 41 wherein the controller receives feedback control signals from a detector that monitors a light characteristic.
44. The system of claim 43 wherein the detector detects a reference beam and generates reference signals.
45. The system of claim 1 further comprising a reflector that reflects a portion of scanning light and a third detector that measures the scanning light.
46. The system of claim 1 wherein the detector array comprises a plurality of detector elements that detect light from focal locations that are separated from each other by more than 25 microns.
47. The system of claim 1 wherein the optical pathways each have a focal location within the region of interest, adjacent focal locations being separated by a distance in a range between 0.2 and 20 times a mean free path of light illuminating a tissue or material to be imaged.
48. The system of claim 47 wherein the distance between focal locations is correlated with a material to be scanned.
49. The system of claim 1 wherein the detector array comprises detector elements positioned at different focal distances to image at different depths within the region of interest.
50. The system of claim 1 wherein the multifocal optical device provides a plurality of optical pathways having different focal depths within the region of interest.
51. The system of claim 34 wherein the light source emits light at a wavelength such that at least two photons of the light that are incident at a focal location of a material within the region of interest are necessary to induce a fluorescence emission from the material.
52. The system of claim 51 wherein the light source emits at a wavelength such that at least three photons of the light incident at a focal location are necessary to induce fluorescence of the material.
53. The system of claim 1 wherein the multifocal optical device comprises a diffractive optical element.
54. The system of claim 1 wherein the multifocal optical element comprises a plurality of optical fibers.
55. The system of claim 1 further comprising a fiber optic device that couples a light source to the multifocal optical element.
56. The system of claim 55 wherein the fiber optic device comprises a coherent fiber optic bundle.
57. The system of claim 1 further comprising a fiber optic device that transmits light along an optical path between the region of interest and the detector array.
58. The system of claim 57 wherein the fiber optic device comprises a multichannel plate.
59. The system of claim 1 further comprising a spectral dispersing element that separates light returning from the region of interest into a plurality of wavelengths that are detected by the detector array.
60. The system of claim 59 wherein the spectral dispersing element comprises a transmission grating.
61. The system of claim 1 wherein the system comprises a light source connected to a probe with a fiber optic cable.
62. The system of claim 61 wherein the probe comprises a handle and a distal probe.
63. The system of claim 62 wherein the handle houses the multifocal optical element and the scanner and the distal probe houses the optical system.
64. The system of claim 63 wherein the distal probe is rigidly attached to the handle and further comprises a rigid center endoscope body.
65. The system of claim 63 wherein the optical system includes a distal lens.
66. The system of claim 63 wherein the optical system comprises a first lens, a second lens and a distal objective lens.
67. The system of claim 1 further comprising a second light source.
68. The system of claim 67 wherein the second light source provides a stationary light beam that is optically coupled to an output lens with a reflector.
69. The system of claim 63 wherein the handle further comprises the detector array.
70. The system of claim 69 wherein the detector array comprises an array of photomultiplier tubes remotely connected to the image processor.
71. The system of claim 1 wherein the detector comprises a CMOS imaging device.
72. The system of claim 1 wherein the detector further comprises a binning charge coupled device (CCD) camera such that each binned pixel region has a light collection area corresponding to a scattering distribution from each focal location.
73. The system of claim 1 wherein the detector array comprises a plurality of avalanche photodiodes.
74. The system of claim 1 further comprising a laser light source including a picosecond laser or a femtosecond laser.
75. The system of claim 1 wherein the system has a resolution in the region of interest of about 0.1 microns to about 2.0 microns.
76. The system of claim 1 wherein the system images at least 5 frames per second, each frame having at least 256 by 256 pixels.
77. The system of claim 41 wherein the controller actuates illumination of different focal regions and controls detector readout in a time multiplexed process.
78. The system of claim 1 wherein the multifocal optical element is moveable by the controller.
79. The system of claim 1 further comprising a confocal light collection system.
80. The system of claim 79 further comprising multiphoton light excitation.
81. A method for multifocal imaging comprising:
illuminating a region of interest with light using a plurality of optical pathways;
providing relative movement between the plurality of optical pathways and the region of interest; and
detecting light from a plurality of focal locations in the region of interest to generate image data.
82. The method of claim 81 further comprising providing relative movement by scanning with a rotating mirror or a resonant mirror.
83. The method of claim 81 further comprising detecting with a detector array having a plurality of detector elements, each detector element having a collection area corresponding to a scattering distribution for each of a plurality of focal locations.
84. The method of claim 83 further comprising detecting a fluorescence signal from each focal location, the detector being connected to an image processor.
85. The method of claim 81 further comprising detecting with a multi-anode photomultiplier tube imaging detector having at least 64 detector elements.
86. The method of claim 81 further comprising providing a focusing lens system that adjusts a depth of focus within a sample in the range of 0 μm to 2000 μm.
87. The method of claim 81 wherein the detector comprises an array of photomultiplier elements.
88. The method of claim 81 further comprising forming images by deconvolving pixel values with a scattering correction function.
89. The method of claim 81 further comprising processing the image data with a computer program on an image processor.
90. The method of claim 89 further comprising processing with the program including a linear deconvolution process.
91. The method of claim 89 further comprising processing with the program including a non-linear deconvolution process.
92. The method of claim 89 further comprising processing with a deconvolution process including a scattering correction function.
93. The method of claim 92 further comprising processing image data including a three dimensional representation of a scanned region of interest, the representation having a plurality of pixel values, the scattering correction function including a plurality of adjacent pixel values for each pixel value of the representation.
94. The method of claim 93 further comprising using the plurality of adjacent pixel values as a weighting matrix that corrects for light scattering from tissue along adjacent optical pathways.
95. The method of claim 81 further comprising providing a holder for a material to be scanned.
96. The method of claim 81 further comprising using each of the plurality of optical pathways to illuminate a plurality of focal locations in an imaging plane.
97. The method of claim 89 further comprising processing the image data with a normalization process.
98. The method of claim 81 further comprising forming the plurality of optical pathways with a micro lens array, a diffractive optical element or a plurality of optical fibers.
99. The method of claim 96 further comprising providing a distance between adjacent focal locations in the imaging plane between 40 and 200 microns.
100. The method of claim 98 further comprising providing a micro lens array having at least 64 lens elements.
101. The method of claim 81 further comprising providing a detector array having a first detector array and a second detector array.
102. The method of claim 81 further comprising providing an optical system having a moveable objective lens.
103. The method of claim 102 further comprising moving the objective lens along an axis through the region of interest.
104. The method of claim 81 further comprising providing an optical system having a first lens and a tube lens.
105. The method of claim 98 further comprising using an iris defining an exit aperture of the micro lens array.
106. The method of claim 81 further comprising obtaining a confocal image of a material with the detector array.
107. The method of claim 81 further comprising providing a bandpass filter adjacent to the detector array.
108. The method of claim 81 further comprising providing a dichroic mirror that reflects light returning from the region of interest onto the detector array.
109. The method of claim 81 further comprising providing a first reflector positioned along an optical path between a light source and the multifocal optical device, the first reflector coupling light to a first detector.
110. The method of claim 81 further comprising providing a beam expander positioned between a light source and the multifocal optical device, a second reflector and a second detector.
111. The method of claim 101 further comprising providing a reflector that separates light returning from the region of interest onto a first optical path towards the first detector array and onto a second optical path towards the second detector array.
112. The method of claim 101 further comprising providing the first detector array including a first photomultiplier array and providing the second detector array including a second photomultiplier array.
113. The method of claim 81 further comprising providing optical pathways that have a focal location within the region of interest, adjacent focal locations being separated by a distance in a range between 0.2 and 20 times a mean free path of the illuminating light in the tissue or material being imaged.
114. The method of claim 81 further comprising providing a distance between adjacent focal locations that is correlated with a mean free path of light within a material to be scanned.
115. The method of claim 81 further comprising providing detector elements positioned at different focal distances to image at different depths within the region of interest.
116. The method of claim 81 further comprising providing a multifocal optical device having a plurality of optical pathways with different focal depths within the region of interest.
117. The method of claim 81 further comprising providing a light source that emits light at a wavelength such that at least two photons of the light that are incident at a focal location of a material within the region of interest are necessary to induce a fluorescence emission from the material.
118. The method of claim 117 further comprising illuminating with light at a wavelength such that at least three photons of the light incident at a focal location are necessary to induce fluorescence of the material.
119. The method of claim 81 further comprising providing a fiber optic device that couples a light source to the multifocal optical element.
120. The method of claim 81 further comprising providing a fiber optic device that transmits light along an optical path between the region of interest and the detector array.
121. The method of claim 81 further comprising applying a dye to a material to be imaged.
122. The method of claim 81 further comprising detecting a fluorescent protein in tissue.
123. The method of claim 81 further comprising detecting a genetically introduced fluorescent material.
124. The method of claim 81 further comprising detecting autofluorescence of a material.
125. The method of claim 81 further comprising collecting time resolved spectroscopic data from the region of interest.
126. The method of claim 125 wherein the step of collecting time resolved data comprises collecting fluorescence lifetime data.
127. The method of claim 81 further comprising performing harmonic generation microscopy.
128. The method of claim 81 further comprising detecting Raman scattered data from each of the focal locations.
129. The method of claim 128 further comprising performing a coherent anti-Stokes Raman scattering measurement of a material.
130. The method of claim 81 further comprising collecting a multiphoton quantum data image from the region of interest.
131. The method of claim 81 further comprising collecting a surface plasmon image from the region of interest.
132. The method of claim 81 further comprising performing stimulated emission depletion microscopy of a material.
133. The method of claim 81 further comprising providing a probe having a handle and a probe element connected to the handle and illuminating a tissue region of a subject with the probe to collect data.
134. The method of claim 133 further comprising inserting the probe element within the body of a mammalian subject to collect image data of tissue within the subject.
135. The method of claim 133 further comprising inserting the probe element within a body cavity or lumen of a subject.
136. The method of claim 133 further comprising providing a control circuit, a detector array, a multifocal optical element and an optical scanner within the handle.
137. The method of claim 133 further comprising coupling a light source to the handle with a fiber optic cable.
138. The method of claim 136 further comprising connecting the control circuit to an external image processor.
139. The method of claim 133 wherein the probe element comprises an endoscope body.
140. The method of claim 139 wherein the endoscope body has a length of at least 5 cm.
141. The method of claim 81 further comprising forming a plurality of beams that simultaneously provide focal locations at a plurality of depths within a material to be scanned, and scanning the material at the plurality of depths simultaneously to provide a three dimensional image data set.
142. The method of claim 81 further comprising performing time multiplexed illumination of focal locations.
143. The method of claim 142 further comprising using a controller to actuate a light source to provide the time multiplexed illumination.
144. The method of claim 142 further comprising selecting pulse separation and pulse width parameters.
145. The method of claim 142 further comprising detecting the focal locations with a single detection channel.
146. The method of claim 81 further comprising forming an image of a mammalian organ.
147. The method of claim 81 further comprising determining whether tissue cells are cancerous.
148. The method of claim 81 further comprising forming an image of vascular tissue.
149. The method of claim 81 further comprising sectioning a portion of tissue such as brain tissue.
150. The method of claim 81 further comprising measuring a response to a therapeutic agent in tissue.
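Claims 88, 90, and 92-94 recite forming images by deconvolving pixel values of a three dimensional representation with a scattering correction function, where a weighting matrix of adjacent pixel values corrects for light scattered from tissue along adjacent optical pathways. The following is a minimal numerical sketch of one such linear deconvolution; the FFT-based regularized inverse filter, the function name, and the stabilizing term `eps` are illustrative assumptions, not the claimed implementation:

```python
import numpy as np

def scattering_correction_deconvolve(measured, weights, eps=1e-3):
    """Linear deconvolution of a 3-D image stack with a scattering
    correction function (a sketch of claims 88, 90, and 92-94).

    `weights` is a small weighting matrix of adjacent-pixel
    contributions modeling how light scattered along adjacent optical
    pathways bleeds into each detector element. `eps` (illustrative)
    regularizes the inverse filter.
    """
    # Embed the small weighting matrix in a kernel the size of the stack.
    kernel = np.zeros_like(measured, dtype=float)
    sz = weights.shape
    kernel[:sz[0], :sz[1], :sz[2]] = weights
    # Center the kernel so the correction is applied symmetrically.
    kernel = np.roll(kernel, shift=[-(s // 2) for s in sz], axis=(0, 1, 2))

    K = np.fft.fftn(kernel)
    M = np.fft.fftn(measured)
    # Regularized inverse filter: T = conj(K) * M / (|K|^2 + eps)
    corrected = np.fft.ifftn(np.conj(K) * M / (np.abs(K) ** 2 + eps))
    return np.real(corrected)
```

A non-linear variant (claim 91), such as iterative Richardson-Lucy deconvolution, could substitute for the single-pass inverse filter shown here.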
151. A multifocal light detecting system comprising:
a multifocal optical device that provides a plurality of light beams;
an optical system that couples light from the optical device onto a region of interest of a material;
a detector device that detects light from a plurality of focal locations in the region of interest to generate data; and
a processor connected to the detector.
152. The system of claim 151 further comprising a scanner such as a rotating mirror or a resonant mirror.
153. The system of claim 151 wherein the detector has a collection area corresponding to a scattering distribution for each of the plurality of focal locations.
154. The system of claim 151 wherein the detector detects time resolved data for deconvolution.
155. The system of claim 151 further comprising a computer program that processes time resolved data in combination with spectroscopic data to distinguish components of tissue.
156. A method for multifocal light detection comprising:
illuminating a region of interest with light using a plurality of optical pathways; and
detecting light from a plurality of focal locations in the region of interest to generate data.
157. The method of claim 156 further comprising providing relative movement between the pathways and a material by scanning with a rotating mirror or a resonant mirror.
158. The method of claim 156 further comprising detecting with a detector array having a plurality of detector elements, each detector element having a collection area corresponding to a scattering distribution for each of a plurality of focal locations, and collecting time resolved data and fluorescence data.
159. The method of claim 156 further comprising detecting with an array of photomultiplier elements.
160. The method of claim 156 further comprising providing a light source that emits light at a wavelength such that at least two photons of the light that are incident at a focal location of a material within the region of interest are necessary to induce a fluorescence emission from the material.
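Claims 99 and 113-114 tie the spacing of adjacent focal locations to the scattering mean free path of the illuminating light in the material (0.2 to 20 mean free paths, with an in-plane spacing of 40 to 200 microns). A small helper sketching that selection rule; the function name, the clamping to claim 99's window, and the default factor are illustrative assumptions:

```python
def focal_spacing(mean_free_path_um, factor=1.0):
    """Pick the distance between adjacent focal locations as a multiple
    of the scattering mean free path of the illuminating light in the
    material (claims 113-114). The claimed range keeps the factor
    between 0.2 and 20 mean free paths; the result is then clamped to
    the 40-200 micron in-plane window of claim 99.
    """
    if not 0.2 <= factor <= 20:
        raise ValueError("factor outside claimed 0.2-20 range")
    spacing = factor * mean_free_path_um
    # Clamp to the imaging-plane spacing recited in claim 99.
    return min(max(spacing, 40.0), 200.0)
```

For example, with a mean free path of 100 microns (a rough figure for near-infrared light in tissue) and a factor of 1, the spacing is 100 microns, comfortably inside both claimed ranges.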
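Claims 142-145 recite time multiplexed illumination: a controller fires the focal locations in sequence with chosen pulse separation and pulse width, so a single detection channel can be demultiplexed into per-focus signals by binning its record against the pulse schedule. A minimal sketch of that demultiplexing, with all names and the fixed samples-per-pulse schedule being illustrative assumptions:

```python
def demultiplex(samples, n_foci, samples_per_pulse):
    """Demultiplex a single-channel detector record into per-focus
    signals (a sketch of claims 142-145). The controller actuates the
    light source so that consecutive windows of `samples_per_pulse`
    samples belong to consecutive focal locations, cycling through all
    `n_foci` beamlets.
    """
    signals = [[] for _ in range(n_foci)]
    for i in range(0, len(samples) - samples_per_pulse + 1, samples_per_pulse):
        # Which focal location was illuminated during this window.
        focus = (i // samples_per_pulse) % n_foci
        window = samples[i:i + samples_per_pulse]
        signals[focus].append(sum(window))  # integrate over the pulse width
    return signals
```

With two foci and two samples per pulse, the record `[1, 1, 2, 2, 3, 3, 4, 4]` demultiplexes into `[[2, 6], [4, 8]]`.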
US11/442,702 2005-05-25 2006-05-25 Multifocal imaging systems and method Abandoned US20070057211A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/442,702 US20070057211A1 (en) 2005-05-25 2006-05-25 Multifocal imaging systems and method
US15/815,536 US10598597B2 (en) 2005-05-25 2017-11-16 Multifocal imaging systems and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US68460805P 2005-05-25 2005-05-25
US11/442,702 US20070057211A1 (en) 2005-05-25 2006-05-25 Multifocal imaging systems and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/815,536 Continuation US10598597B2 (en) 2005-05-25 2017-11-16 Multifocal imaging systems and method

Publications (1)

Publication Number Publication Date
US20070057211A1 true US20070057211A1 (en) 2007-03-15

Family

ID=37075066

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/442,702 Abandoned US20070057211A1 (en) 2005-05-25 2006-05-25 Multifocal imaging systems and method
US15/815,536 Active US10598597B2 (en) 2005-05-25 2017-11-16 Multifocal imaging systems and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/815,536 Active US10598597B2 (en) 2005-05-25 2017-11-16 Multifocal imaging systems and method

Country Status (3)

Country Link
US (2) US20070057211A1 (en)
EP (2) EP2703871A3 (en)
WO (1) WO2006127967A2 (en)

Cited By (143)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070285658A1 (en) * 2006-06-12 2007-12-13 Neptec Optical Solutions, Inc. High-speed, rugged, time-resolved, raman spectrometer for sensing multiple components of a sample
US20080027279A1 (en) * 2007-10-24 2008-01-31 Abou El Kheir Tarek A N Endoscopic System and Method for Therapeutic Applications and Obtaining 3-Dimensional Human Vision Simulated Imaging With Real Dynamic Convergence
US20080068065A1 (en) * 2005-04-27 2008-03-20 International Business Machines Corp. Electronically scannable multiplexing device
US20080277595A1 (en) * 2007-05-10 2008-11-13 Pacific Biosciences Of California, Inc. Highly multiplexed confocal detection systems and methods of using same
US20080283772A1 (en) * 2007-05-10 2008-11-20 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
WO2008140758A1 (en) * 2007-05-10 2008-11-20 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
US20090015923A1 (en) * 2007-07-09 2009-01-15 Auld Jack R Multi-Spot Ophthalmic Laser Probe
US20090141140A1 (en) * 2007-12-03 2009-06-04 Robinson M Dirk End-to-end design of electro-optic imaging systems for color-correlated objects
US20090185734A1 (en) * 2008-01-18 2009-07-23 Hemocue Ab Apparatus and method for analysis of particles in a liquid sample
US20090236549A1 (en) * 2008-03-21 2009-09-24 Vogt William I Apparatus for improving detection efficiency of multiphoton microscopy systems by focus compensation, pupil image division, and parallel pupil rearrangement
US20100033719A1 (en) * 2007-02-26 2010-02-11 Koninklijke Philips Electronics N.V. Method and device for optical analysis of a tissue
US20100078575A1 (en) * 2008-08-22 2010-04-01 Reilly Michael T Versatile Surface Plasmon Resonance Analyzer with an Integral Surface Plasmon Resonance Enhanced Fluorescence Mode
US20100167413A1 (en) * 2007-05-10 2010-07-01 Paul Lundquist Methods and systems for analyzing fluorescent materials with reduced autofluorescence
WO2010141486A1 (en) * 2009-06-01 2010-12-09 Bio-Rad Laboratories, Inc. Calibration of imaging device for biological/chemical samples
US20100322502A1 (en) * 2009-06-18 2010-12-23 Olympus Corporation Medical diagnosis support device, image processing method, image processing program, and virtual microscope system
US20110077528A1 (en) * 2008-03-28 2011-03-31 Volcano Coproration Method and apparatus for simultaneous hemoglobin reflectivity measurement and oct measurement, thrombus detection and treatment, and oct flushing
WO2011056658A1 (en) * 2009-10-27 2011-05-12 Duke University Multi-photon microscopy via air interface objective lens
US20110122366A1 (en) * 2009-11-24 2011-05-26 Smith Ronald T Single-fiber multi-spot laser probe for ophthalmic endoillumination
US20110144627A1 (en) * 2009-12-15 2011-06-16 Smith Ronald T Multi-spot laser probe
US20110228116A1 (en) * 2010-03-16 2011-09-22 Eli Margalith Spectral imaging of moving objects with a stare down camera
US20110249148A1 (en) * 2008-12-22 2011-10-13 Koninklijke Philips Electronics N.V. Cmos imager with single photon counting capability
US20110304721A1 (en) * 2010-06-09 2011-12-15 Honeywell International Inc. Method and system for iris image capture
US20120005793A1 (en) * 2009-03-23 2012-01-05 Nenad Ocelic Near field optical microscope
US20120105593A1 (en) * 2010-10-29 2012-05-03 Sony Corporation Multi-view video and still 3d capture system
US20120257038A1 (en) * 2011-04-07 2012-10-11 Valerica Raicu High speed microscope with narrow detector and pixel binning
US20120320184A1 (en) * 2011-06-17 2012-12-20 Leica Microsystems Cms Gmbh Microscope and Method for Fluorescence Imaging Microscopy
US20130032734A1 (en) * 2010-04-26 2013-02-07 Santori Charles M Non-uniform grating
US20130153788A1 (en) * 2009-07-10 2013-06-20 The Govt. of the United States of America, as represented by the Secretary, D.H.H.S. Non-contact total emission detection method and system for multi-photon microscopy
WO2013096454A1 (en) * 2011-12-19 2013-06-27 Perceptron, Inc. Non-contact sensor having improved laser spot
US20130256564A1 (en) * 2010-11-22 2013-10-03 Deutsches Krebsforschungszentrum STED Microscopy With Pulsed Excitation, Continuous Stimulation, And Gated Registration Of Spontaneously Emitted Fluorescence Light
US8594455B2 (en) * 2011-09-28 2013-11-26 The United States Of America As Represented By The Secretary Of The Army System and method for image enhancement and improvement
US20140012104A1 (en) * 2012-02-21 2014-01-09 National Taiwan University Method for Observing, Identifying, and Detecting Blood Cells
US20140023993A1 (en) * 2011-04-08 2014-01-23 British Columbia Cancer Agency Branch Apparatus and methods for multiphoton microscopy
US20140098213A1 (en) * 2012-10-05 2014-04-10 Canon Kabushiki Kaisha Imaging system and control method for same
US20140118524A1 (en) * 2010-04-28 2014-05-01 Sebastian Munck Method And Apparatus For The Imaging Of A Labeled Biological Sample
US20140178078A1 (en) * 2012-08-14 2014-06-26 The Institute Of Optics And Electronics, The Chinese Academy Of Sciences Method of eliminating offset in spot centroid due to crosstalk
US8771978B2 (en) 2010-11-15 2014-07-08 Tissuevision, Inc. Systems and methods for imaging and processing tissue
WO2014134314A1 (en) * 2013-03-01 2014-09-04 The Johns Hopkins University Light sources, medical devices, and methods of illuminating an object of interest
WO2015022147A1 (en) * 2013-08-15 2015-02-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
WO2015022145A1 (en) * 2013-08-15 2015-02-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
US20150069268A1 (en) * 2009-06-17 2015-03-12 W.O.M. World Of Medicine Ag Device and method for multi-photon fluorescence microscopy for obtaining information from biological tissue
EP2881778A1 (en) * 2013-12-04 2015-06-10 Olympus Corporation Scanning light microscope with focus adjustment
US20150160446A1 (en) * 2012-01-24 2015-06-11 Carl Zeiss Microscopy Gmbh Microscope and method for high-resolution 3d fluorescence microscopy
WO2013103475A3 (en) * 2011-12-09 2015-06-18 Massachusetts Institute Of Technology Portable optical fiber probe-based spectroscopic scanner for rapid cancer diagnosis
US9095414B2 (en) * 2011-06-24 2015-08-04 The Regents Of The University Of California Nonlinear optical photodynamic therapy (NLO-PDT) of the cornea
US20150310242A1 (en) * 2014-04-24 2015-10-29 Sick Ag Camera and method for the detection of a moved flow of objects
US9282237B2 (en) 2014-07-17 2016-03-08 Schlage Lock Company Llc Multifocal iris recognition device
US20160195793A1 (en) * 2010-01-22 2016-07-07 Fianium Ltd. Optical Sources
JP2017502300A (en) * 2013-12-24 2017-01-19 ティシュヴィジョン、インコーポレーテッド Multifocal multiphoton imaging system and method
US20170030835A1 (en) * 2014-04-17 2017-02-02 The Regents Of The University Of California Parallel acquisition of spectral signals from a 2-d laser beam array
US9594024B2 (en) * 2015-02-06 2017-03-14 Commissariat à l'énergie atomique et aux énergies alternatives Method for correcting a signal backscattered by a sample and associated device
DE102015116598A1 (en) 2015-09-30 2017-03-30 Carl Zeiss Microscopy Gmbh Method and microscope for high-resolution imaging by means of SIM
AU2015296920B2 (en) * 2014-07-28 2017-04-13 Alcon Inc. Increased depth of field microscope and associated devices, systems, and methods
US9677869B2 (en) 2012-12-05 2017-06-13 Perimeter Medical Imaging, Inc. System and method for generating a wide-field OCT image of a portion of a sample
WO2017184420A1 (en) * 2016-04-21 2017-10-26 Bribbla Dynamics Llc Optical system for reference switching
US9820652B2 (en) 2013-12-09 2017-11-21 The Board Of Trustees Of The Leland Stanford Junior University Multi-photon microscope having an excitation-beam array
CN108027425A (en) * 2015-09-18 2018-05-11 罗伯特·博世有限公司 Laser radar sensor
IT201600117339A1 (en) * 2016-11-21 2018-05-21 Crestoptics S P A APPARATUS AS A SUPER-RESOLUTION FOR FLUORESCENT ANALYSIS OF THE EYE BACKGROUND
WO2018094290A1 (en) 2016-11-18 2018-05-24 Tissuevision, Inc. Automated tissue section capture, indexing and storage system and methods
US9989754B2 (en) 2012-03-09 2018-06-05 Carl Zeiss Microscopy Gmbh Light scanning microscope with spectral detection
US20180157021A1 (en) * 2015-06-02 2018-06-07 Life Technologies Corporation Systems and methods for generating a structured illumination image
WO2018169601A1 (en) * 2017-03-13 2018-09-20 The Charles Stark Draper Laboratory, Inc. Light detection and ranging (lidar) system and method
US20180267283A1 (en) * 2015-01-20 2018-09-20 Hamamatsu Photonics K.K. Image acquisition device and image acquisition method
US20180329196A1 (en) * 2017-05-11 2018-11-15 Kaiser Optical Systems Inc. Endoscopic immersion probe end optics for laser spectroscopy
CN109073873A (en) * 2016-04-01 2018-12-21 国立大学法人浜松医科大学 Image capturing device and image acquisition method
US10245181B2 (en) 2012-12-21 2019-04-02 Alcon Research, Ltd. Grin fiber multi-spot laser probe
US10261298B1 (en) * 2014-12-09 2019-04-16 The Board Of Trustees Of The Leland Stanford Junior University Near-infrared-II confocal microscope and methods of use
CN109656014A (en) * 2019-01-31 2019-04-19 北京超维景生物科技有限公司 Multichannel phosphor collection device and three dimensional non-linear laser scanning cavity endoscope
US10274426B2 (en) 2014-12-23 2019-04-30 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US20190163132A1 (en) * 2016-05-06 2019-05-30 Uwm Research Foundation, Inc. Snapshot optical tomography system and method of acquiring an image with the system
JP2019174245A (en) * 2018-03-28 2019-10-10 国立大学法人 東京大学 X-ray photography method and x-ray photography device
CN110441311A (en) * 2019-07-22 2019-11-12 中国科学院上海光学精密机械研究所 The multifocal camera lens of multiaxis for the imaging of more object planes
US10499797B2 (en) 2007-07-03 2019-12-10 The Board Of Trustees Of The Leland Stanford Junior University System and method useful for sarcomere imaging via objective-based microscopy
US10527538B2 (en) 2015-05-28 2020-01-07 Accellix Ltd. System and apparatus for blind deconvolution of flow cytometer particle emission
US10551605B2 (en) 2014-12-23 2020-02-04 Apple Inc. Confocal inspection system having non-overlapping annular illumination and collection regions
US10577573B2 (en) 2017-07-18 2020-03-03 Perimeter Medical Imaging, Inc. Sample container for stabilizing and aligning excised biological tissue samples for ex vivo analysis
US10598597B2 (en) 2005-05-25 2020-03-24 Massachusetts Institute Of Technology Multifocal imaging systems and method
WO2019118582A3 (en) * 2017-12-12 2020-03-26 Trustees Of Boston University Multi-z confocal imaging system
US10626453B2 (en) 2014-04-14 2020-04-21 Sri International Portable nucleic acid analysis system and high-performance microfluidic electroactive polymer actuators
US10718931B2 (en) 2014-12-23 2020-07-21 Apple Inc. Confocal inspection system having averaged illumination and averaged collection paths
CN111693545A (en) * 2020-06-02 2020-09-22 哈尔滨工程大学 Composite structure array probe for testing lath and optical fiber white light interference device
US10788403B2 (en) 2015-03-11 2020-09-29 Tissuevision, Inc. Systems and methods for serial staining and imaging
US10816472B2 (en) 2015-01-20 2020-10-27 Hamamatsu Photonics K.K. Image acquisition device and image acquisition method
CN112051247A (en) * 2020-08-21 2020-12-08 杭州电子科技大学 Lens-free imaging device based on laminated imaging and phase recovery method thereof
WO2020257513A1 (en) * 2019-06-18 2020-12-24 Ruolin Li Method, system and apparatus for a raman spectroscopic measurement system
US10898726B2 (en) * 2018-09-25 2021-01-26 Siemens Healthcare Gmbh Providing an annotated medical image data set for a patient's radiotherapy planning
US10908084B2 (en) 2008-10-14 2021-02-02 Timothy M. Ragan Devices and methods for direct-sampling analog time-resolved detection
US20210116693A1 (en) * 2018-03-27 2021-04-22 PXYL Limited Non-linear optical scanning microscope
US11025885B2 (en) 2015-09-24 2021-06-01 Ouster, Inc. Optical system for collecting distance information within a field
CN113168019A (en) * 2019-10-18 2021-07-23 谷歌有限责任公司 Diffractive optical element for large field imaging
US11076080B2 (en) 2019-12-05 2021-07-27 Synaptics Incorporated Under-display image sensor for eye tracking
CN113390850A (en) * 2021-06-02 2021-09-14 复旦大学 Gastric Raman femtosecond picosecond image mapping method based on U-shaped convolution neural network
US11153513B2 (en) * 2019-08-19 2021-10-19 Synaptics Incorporated Light source for camera
US11154188B2 (en) 2019-06-20 2021-10-26 Cilag Gmbh International Laser mapping imaging and videostroboscopy of vocal cords
US11172826B2 (en) 2016-03-08 2021-11-16 Enspectra Health, Inc. Non-invasive detection of skin disease
CN113820035A (en) * 2021-09-28 2021-12-21 应急管理部天津消防研究所 Femtosecond laser filamentation remote non-contact temperature measuring device and measuring method
US11307143B2 (en) * 2017-12-12 2022-04-19 Allen Institute Systems, apparatuses and methods for simultaneous multi-plane imaging
US11389066B2 (en) 2019-06-20 2022-07-19 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11398011B2 (en) 2019-06-20 2022-07-26 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed laser mapping imaging system
US11412152B2 (en) 2019-06-20 2022-08-09 Cilag Gmbh International Speckle removal in a pulsed hyperspectral imaging system
US11412920B2 (en) 2019-06-20 2022-08-16 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11432706B2 (en) 2019-06-20 2022-09-06 Cilag Gmbh International Hyperspectral imaging with minimal area monolithic image sensor
US11471055B2 (en) 2019-06-20 2022-10-18 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11477390B2 (en) 2019-06-20 2022-10-18 Cilag Gmbh International Fluorescence imaging with minimal area monolithic image sensor
US11516374B2 (en) 2019-06-05 2022-11-29 Synaptics Incorporated Under-display image sensor
US11516388B2 (en) 2019-06-20 2022-11-29 Cilag Gmbh International Pulsed illumination in a fluorescence imaging system
US11533417B2 (en) 2019-06-20 2022-12-20 Cilag Gmbh International Laser scanning and tool tracking imaging in a light deficient environment
US11540696B2 (en) 2019-06-20 2023-01-03 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11550057B2 (en) 2019-06-20 2023-01-10 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11579080B2 (en) 2017-09-29 2023-02-14 Apple Inc. Resolve path optical sampling architectures
US11585749B2 (en) 2015-09-01 2023-02-21 Apple Inc. Reference switch architectures for noncontact sensing of substances
US11589819B2 (en) 2019-06-20 2023-02-28 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a laser mapping imaging system
WO2023035281A1 (en) * 2021-09-07 2023-03-16 清华大学 Dot matrix laser scanning-based flow imaging system
US11617541B2 (en) 2019-06-20 2023-04-04 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for fluorescence imaging
US11622094B2 (en) 2019-06-20 2023-04-04 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11624830B2 (en) 2019-06-20 2023-04-11 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for laser mapping imaging
US11633089B2 (en) 2019-06-20 2023-04-25 Cilag Gmbh International Fluorescence imaging with minimal area monolithic image sensor
US11633149B2 (en) 2017-04-28 2023-04-25 Enspectra Health, Inc. Systems and methods for imaging and measurement of sarcomeres
US11668919B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Driving light emissions according to a jitter specification in a laser mapping imaging system
US11671691B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Image rotation in an endoscopic laser mapping imaging system
US11674848B2 (en) 2019-06-20 2023-06-13 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for hyperspectral imaging
US11700995B2 (en) 2019-06-20 2023-07-18 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11716543B2 (en) 2019-06-20 2023-08-01 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11727542B2 (en) 2019-06-20 2023-08-15 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11758256B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11754500B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Minimizing image sensor input/output in a pulsed fluorescence imaging system
US11762180B2 (en) * 2017-06-16 2023-09-19 University Court Of The University Of St Andrews Three-photon light sheet imaging
EP3394579B1 (en) * 2015-12-21 2023-09-20 Verily Life Sciences LLC Systems and methods for determining an identity of a probe in a target based on colors and locations of two or more fluorophores in the probe and in the target
US11789250B2 (en) * 2019-11-06 2023-10-17 Technische Universität Braunschweig Optical detection device and method for operating an optical detection device
US11793399B2 (en) 2019-06-20 2023-10-24 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed hyperspectral imaging system
WO2023211772A1 (en) * 2022-04-25 2023-11-02 The United States Of America, As Represented By The Secretary, Department Of Health And Human Services Lens array based imaging system with improved field of view
US11821989B2 (en) 2019-06-20 2023-11-21 Cilag GmbH International Hyperspectral, fluorescence, and laser mapping imaging with fixed pattern noise cancellation
US11823403B2 (en) 2017-12-27 2023-11-21 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11854175B2 (en) 2019-06-20 2023-12-26 Cilag Gmbh International Fluorescence imaging with fixed pattern noise cancellation
US11852318B2 (en) 2020-09-09 2023-12-26 Apple Inc. Optical system for noise mitigation
US11877065B2 (en) 2019-06-20 2024-01-16 Cilag Gmbh International Image rotation in an endoscopic hyperspectral imaging system
US11882352B2 (en) 2019-06-20 2024-01-23 Cilag GmbH International Controlling integral energy of a laser pulse in a hyperspectral, fluorescence, and laser mapping imaging system
US11898909B2 (en) 2019-06-20 2024-02-13 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11903563B2 (en) 2019-06-20 2024-02-20 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11925328B2 (en) 2019-06-20 2024-03-12 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral imaging system
US11931009B2 (en) 2019-06-20 2024-03-19 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a hyperspectral imaging system
US11937784B2 (en) 2019-06-20 2024-03-26 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11960131B2 (en) 2022-01-13 2024-04-16 Apple Inc. Integrated photonics device having integrated edge outcouplers

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080262484A1 (en) * 2007-04-23 2008-10-23 Nlight Photonics Corporation Motion-controlled laser surface treatment apparatus
DE102009013147A1 (en) * 2009-03-05 2010-09-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for spectroscopy analysis e.g. Raman-spectroscopy, of surface or volume section of sample e.g. biological cell, in measuring arrangement, involves integrating radiations registered in focus volume over duration of measuring interval
US8903192B2 (en) 2010-10-14 2014-12-02 Massachusetts Institute Of Technology Noise reduction of imaging data
WO2013131062A1 (en) * 2012-03-02 2013-09-06 The Regents Of The University Of California System and method for time-resolved fluorescence imaging and pulse shaping
DE102012204128B4 (en) * 2012-03-15 2023-11-16 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
DE102013015933A1 (en) * 2013-09-19 2015-03-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
DE102013015932A1 (en) * 2013-09-19 2015-03-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
US10083507B2 (en) 2014-04-07 2018-09-25 Mimo Ag Method for the analysis of image data representing a three-dimensional volume of biological tissue
CN111855621B (en) 2015-02-24 2023-11-10 国立大学法人东京大学 Dynamic high-speed high-sensitivity imaging device and imaging method
WO2017073737A1 (en) 2015-10-28 2017-05-04 国立大学法人東京大学 Analysis device
WO2020206362A1 (en) * 2019-04-04 2020-10-08 Inscopix, Inc. Multi-modal microscopic imaging
EP4160319A1 (en) 2016-08-15 2023-04-05 Osaka University Electromagnetic wave phase/amplitude generation device, electromagnetic wave phase/amplitude generation method, and electromagnetic wave phase/amplitude generation program
DE102016119730A1 (en) 2016-10-17 2018-04-19 Carl Zeiss Microscopy Gmbh Optical group for detection light for a microscope, method for microscopy and microscope
GB201701691D0 (en) * 2017-02-01 2017-03-15 Illumina Inc System and method with reflective fiducials
US11896944B2 (en) 2017-02-01 2024-02-13 Illumina, Inc. System and method with fiducials responding to multiple excitation frequencies
GB201701688D0 (en) 2017-02-01 2017-03-15 Illumina Inc System and method with fiducials in non-rectilinear layouts
GB201701689D0 (en) 2017-02-01 2017-03-15 Illumina Inc System and method with fiducials of non-closed shapes
GB201701686D0 (en) 2017-02-01 2017-03-15 Illumina Inc System & method with fiducials having offset layouts
US11169092B2 (en) * 2017-07-11 2021-11-09 Hamamatsu Photonics K.K. Sample observation device and sample observation method
US11041756B2 (en) * 2017-10-20 2021-06-22 Charted Scientific Inc. Method and apparatus of filtering light using a spectrometer enhanced with additional spectral filters with optical analysis of fluorescence and scattered light from particles suspended in a liquid medium using confocal and non confocal illumination and imaging
WO2019148024A1 (en) 2018-01-26 2019-08-01 Park Jong Kang Systems and methods to reduce scattering in temporal focusing multiphoton microscopy
JP7369385B2 (en) 2018-06-13 2023-10-26 シンクサイト株式会社 Methods and systems for cytometry
JP6953632B2 (en) * 2018-07-09 2021-10-27 オリンパス株式会社 Optical analyzers, optical analysis methods and trained models
CN109557653B (en) * 2018-12-20 2021-06-29 浙江大学 Differential confocal microscopic imaging method and device based on algorithm recovery
CN109668869A (en) * 2018-12-28 2019-04-23 中国科学院长春光学精密机械与物理研究所 A kind of hand-held reflection Confocal laser-scanning microscopy detection device
KR20220024561A (en) 2019-06-19 2022-03-03 라이프 테크놀로지스 홀딩스 프리베이트 리미티드 Biological Analysis Devices and Systems
WO2022056385A1 (en) 2020-09-14 2022-03-17 Singular Genomics Systems, Inc. Methods and systems for multidimensional imaging
US20220139743A1 (en) * 2020-11-04 2022-05-05 Tokyo Electron Limited Optical Sensor for Inspecting Pattern Collapse Defects
CN113126205B (en) * 2021-04-20 2022-08-02 电子科技大学 Sectional type plane imaging system and method based on optical switch
FR3127048A1 (en) * 2021-09-10 2023-03-17 Universite de Bordeaux Optical microscope and method for high resolution optical microscopy with increased field of view
US20230371813A1 (en) * 2022-05-17 2023-11-23 EyeQ Technologies, Inc. Three-dimensional ocular endoscope device and methods of use

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5109149A (en) * 1990-03-15 1992-04-28 Albert Leung Laser, direct-write integrated circuit production system
US5233197A (en) * 1991-07-15 1993-08-03 University Of Massachusetts Medical Center High speed digital imaging microscope
US5583342A (en) * 1993-06-03 1996-12-10 Hamamatsu Photonics K.K. Laser scanning optical system and laser scanning optical apparatus
US5633695A (en) * 1995-08-14 1997-05-27 Canon Kabushiki Kaisha Beam steering optical system and method and ophthalmic apparatus using same having spaced apart irradiation and observation paths
US5691839A (en) * 1993-04-15 1997-11-25 Kowa Company Ltd. Laser scanning optical microscope
US5783814A (en) * 1994-01-18 1998-07-21 Ultrapointe Corporation Method and apparatus for automatically focusing a microscope
US6020591A (en) * 1997-07-11 2000-02-01 Imra America, Inc. Two-photon microscopy with plane wave illumination
US6248988B1 (en) * 1998-05-05 2001-06-19 Kla-Tencor Corporation Conventional and confocal multi-spot scanning optical microscope
US6392795B2 (en) * 1998-08-28 2002-05-21 Olympus Optical Co., Ltd. Microscope with a dynamic damper
US20040125372A1 (en) * 2002-10-17 2004-07-01 Walla Peter Jomo Multi-parameter fluorimetric analysis in a massively parallel multi-focal arrangement and the use thereof
US7209287B2 (en) * 2000-09-18 2007-04-24 Vincent Lauer Confocal optical scanning device
US20070254280A1 (en) * 2003-04-16 2007-11-01 Lingvitae As Method of Identifying Characteristic of Molecules
US7366394B2 (en) * 2002-12-27 2008-04-29 Kansai Technology Licensing Organization Co., Ltd. Multilayer observation optical microscope and multilayer observation unit
US7372985B2 (en) * 2003-08-15 2008-05-13 Massachusetts Institute Of Technology Systems and methods for volumetric tissue scanning microscopy
US20080130093A1 (en) * 2004-09-14 2008-06-05 Yeda Research & Development Company Ltd. Microscope System and Method
US7502107B2 (en) * 2003-08-22 2009-03-10 Secretary, Department Of Atomic Energy, Government Of India Apparatus and method for transport of microscopic object(s)
US8771978B2 (en) * 2010-11-15 2014-07-08 Tissuevision, Inc. Systems and methods for imaging and processing tissue

Family Cites Families (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4960330A (en) 1985-07-16 1990-10-02 Kerschmann Russell L Image recording apparatus
IL88359A (en) 1988-11-10 1993-06-10 Cubital Ltd Method and apparatus for volumetric digitization of 3-dimensional objects
JP2659429B2 (en) * 1989-03-17 1997-09-30 株式会社日立製作所 Photoacoustic signal detection method and apparatus, and semiconductor element internal defect detection method
US5156019A (en) 1990-11-19 1992-10-20 Mccormick James B Frozen tissue sectioning apparatus and method
FR2705917B1 (en) 1993-06-02 1995-09-08 Tabone Herve Aspiration microtome, especially for histological and similar work.
WO1998002851A1 (en) 1996-07-12 1998-01-22 Advanced Pathology Systems, Inc. Image recording with optical sectioning
JP3816632B2 (en) * 1997-05-14 2006-08-30 オリンパス株式会社 Scanning microscope
US6603537B1 (en) 1998-08-21 2003-08-05 Surromed, Inc. Optical architectures for microvolume laser-scanning cytometers
JP3099063B2 (en) 1998-12-28 2000-10-16 大阪大学長 Multiphoton microscope
DE19904592C2 (en) 1999-02-05 2001-03-08 Lavision Gmbh Beam splitter device
JP2000292705A (en) 1999-04-05 2000-10-20 Olympus Optical Co Ltd Scanning microscope
US6387653B1 (en) 1999-04-09 2002-05-14 Culterra, Llc Apparatus and method for automatically producing tissue slides
US6449039B1 (en) * 1999-07-28 2002-09-10 Thermo Noran Inc. Laser scanning fluorescence microscopy with compensation for spatial dispersion of fast laser pulses
US7217573B1 (en) * 1999-10-05 2007-05-15 Hitachi, Ltd. Method of inspecting a DNA chip
US7003143B1 (en) 1999-11-02 2006-02-21 Hewitt Charles W Tomographic microscope for high resolution imaging and method of analyzing specimens
US6623977B1 (en) * 1999-11-05 2003-09-23 Real-Time Analyzers, Inc. Material for surface-enhanced Raman spectroscopy, and SER sensors and method for preparing same
AU2839601A (en) 1999-12-02 2001-06-12 Evotec Oai Ag High rate screening method and device for optically detecting samples
AU782452B2 (en) 1999-12-13 2005-07-28 Government of The United States of America, as represented by The Secretary Department of Health & Human Services, The National Institutes of Health, The High-throughput tissue microarray technology and applications
US6423960B1 (en) 1999-12-31 2002-07-23 Leica Microsystems Heidelberg Gmbh Method and system for processing scan-data from a confocal microscope
US7127573B1 (en) * 2000-05-04 2006-10-24 Advanced Micro Devices, Inc. Memory controller providing multiple power modes for accessing memory devices by reordering memory transactions
AU2002232892B2 (en) 2000-10-24 2008-06-26 Intrexon Corporation Method and device for selectively targeting cells within a three -dimensional specimen
JP2004515779A (en) * 2000-12-15 2004-05-27 ザ ジェネラル ホスピタル コーポレーション Indirect imaging
US7110118B2 (en) 2000-12-19 2006-09-19 Trustees Of Boston University Spectral imaging for vertical sectioning
US7274446B2 (en) 2001-04-07 2007-09-25 Carl Zeiss Jena Gmbh Method and arrangement for the deep resolved optical recording of a sample
DE10118355B4 (en) 2001-04-12 2005-07-21 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and apparatus for multiphoton excitation of a sample
US7155076B2 (en) * 2001-06-15 2006-12-26 The Regents Of The University Of California Target molecules detection by waveguiding in a photonic silicon membrane
US6785432B2 (en) * 2001-06-15 2004-08-31 The Regents Of The University Of California Target molecules detection by waveguiding in a photonic silicon membrane
DE10151217B4 (en) 2001-10-16 2012-05-16 Carl Zeiss MicroImaging GmbH Method for operating a laser scanning microscope
EP1461592B1 (en) 2001-12-05 2019-04-10 The J. David Gladstone Institutes Robotic microscopy systems
JP4632634B2 (en) * 2002-03-27 2011-02-16 オリンパス株式会社 Confocal microscope apparatus and observation method using confocal microscope apparatus
US7738945B2 (en) 2002-04-19 2010-06-15 University Of Washington Method and apparatus for pseudo-projection formation for optical tomography
US7197193B2 (en) 2002-05-03 2007-03-27 Creatv Microtech, Inc. Apparatus and method for three dimensional image reconstruction
AU2003272667A1 (en) * 2002-09-26 2004-04-19 Bio Techplex Corporation Method and apparatus for screening using a waveform modulated led
US7038848B2 (en) 2002-12-27 2006-05-02 Olympus Corporation Confocal microscope
DE10300091A1 (en) 2003-01-04 2004-07-29 Lubatschowski, Holger, Dr. microtome
WO2004070653A2 (en) * 2003-01-31 2004-08-19 Discovery Partners International Image analysis system and method
DE10327531B4 (en) * 2003-06-17 2006-11-30 Leica Microsystems Cms Gmbh Method for measuring fluorescence correlations in the presence of slow signal fluctuations
DE10327486B4 (en) * 2003-06-17 2006-07-20 Leica Microsystems Cms Gmbh Device for determining directional transport processes
JP2005017127A (en) 2003-06-26 2005-01-20 Institute Of Physical & Chemical Research Interferometer and shape measuring system
DE10335471A1 (en) * 2003-08-02 2005-03-03 Leica Microsystems Heidelberg Gmbh Detector and method for detecting weak fluorescent radiation with a microscope system
EP2259050A3 (en) 2003-08-26 2010-12-22 Blueshift Biotechnologies, Inc. Time dependent fluorescence measurements
DE10339311B4 (en) * 2003-08-27 2006-04-27 Leica Microsystems Cms Gmbh System and method for setting a fluorescence spectral measurement system for microscopy
US20060211752A1 (en) 2004-03-16 2006-09-21 Kohn Leonard D Use of phenylmethimazoles, methimazole derivatives, and tautomeric cyclic thiones for the treatment of autoimmune/inflammatory diseases associated with toll-like receptor overexpression
JP4593141B2 (en) 2004-03-26 2010-12-08 オリンパス株式会社 Optical scanning observation device
DE602005007403D1 (en) 2004-03-25 2008-07-24 Olympus Corp Scanning confocal microscope
US7170675B2 (en) * 2004-05-19 2007-01-30 Celloptic, Inc. Method and system for wide-field multi-photon microscopy having a confocal excitation plane
US7919325B2 (en) * 2004-05-24 2011-04-05 Authentix, Inc. Method and apparatus for monitoring liquid for the presence of an additive
WO2005123909A2 (en) 2004-06-09 2005-12-29 The Board Of Trustees Of The Leland Stanford Junior University Isolation and characterization of muscle regenerating cells
US7677289B2 (en) 2004-07-08 2010-03-16 President And Fellows Of Harvard College Methods and apparatuses for the automated production, collection, handling, and imaging of large numbers of serial tissue sections
DE102004034970A1 (en) 2004-07-16 2006-02-02 Carl Zeiss Jena Gmbh Scanning microscope and use
DE102004034962A1 (en) * 2004-07-16 2006-02-16 Carl Zeiss Jena Gmbh Microscope with increased resolution
US20070258122A1 (en) 2004-10-06 2007-11-08 Bc Cancer Agency Computer-Tomography Microscope and Computer-Tomography Image Reconstruction Methods
KR100624454B1 (en) 2004-12-23 2006-09-18 삼성전기주식회사 Hybrid lens unit and hybrid lens array
US7382464B2 (en) 2005-01-20 2008-06-03 Carl Zeiss Meditec, Inc. Apparatus and method for combined optical-coherence-tomographic and confocal detection
US7767414B1 (en) 2005-04-20 2010-08-03 The Board Of Trustees Of The Leland Stanford Junior University Optical imaging of molecular characteristics of biological specimen
US20070057211A1 (en) 2005-05-25 2007-03-15 Karsten Bahlman Multifocal imaging systems and method
KR20080023292A (en) 2005-05-27 2008-03-13 더 보드 오브 리전츠 오브 더 유니버시티 오브 텍사스 시스템 Optical coherence tomographic detection of cells and compositions
US8355776B2 (en) 2005-05-27 2013-01-15 Board Of Regents, The University Of Texas System Hemoglobin contrast in magneto-motive optical doppler tomography, optical coherence tomography, and ultrasound imaging methods and apparatus
US7801590B2 (en) 2005-05-27 2010-09-21 Board Of Regents, The University Of Texas System Optical coherence tomographic detection of cells and killing of the same
US7329860B2 (en) 2005-11-23 2008-02-12 Illumina, Inc. Confocal imaging methods and apparatus
ATE529732T1 (en) 2006-10-30 2011-11-15 Ventana Med Syst Inc THIN FILM APPARATUS AND METHOD
GB2477648B (en) 2008-08-29 2013-02-20 Lee H Angros In situ heat induced antigen recovery and staining apparatus and method

Cited By (248)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7514327B2 (en) * 2005-04-27 2009-04-07 International Business Machines Corporation Electronically scannable multiplexing device
US8178362B2 (en) 2005-04-27 2012-05-15 International Business Machines Corporation Electronically scannable multiplexing device
US20080068065A1 (en) * 2005-04-27 2008-03-20 International Business Machines Corp. Electronically scannable multiplexing device
US7795044B2 (en) * 2005-04-27 2010-09-14 International Business Machines Corporation Electronically scannable multiplexing device
US8552414B2 (en) 2005-04-27 2013-10-08 International Business Machines Corporation Electronically scannable multiplexing device
US10598597B2 (en) 2005-05-25 2020-03-24 Massachusetts Institute Of Technology Multifocal imaging systems and method
WO2007146753A2 (en) * 2006-06-12 2007-12-21 Neptec Optical Solutions, Inc. High-speed, rugged, time-resolved, raman spectrometer for sensing multiple components of a sample
WO2007146753A3 (en) * 2006-06-12 2008-07-24 Neptec Optical Solutions Inc High-speed, rugged, time-resolved, raman spectrometer for sensing multiple components of a sample
US7602488B2 (en) * 2006-06-12 2009-10-13 Neptec Optical Solutions, Inc. High-speed, rugged, time-resolved, raman spectrometer for sensing multiple components of a sample
US20070285658A1 (en) * 2006-06-12 2007-12-13 Neptec Optical Solutions, Inc. High-speed, rugged, time-resolved, raman spectrometer for sensing multiple components of a sample
US8040495B2 (en) * 2007-02-26 2011-10-18 Koninklijke Philips Electronics N.V. Method and device for optical analysis of a tissue
US20100033719A1 (en) * 2007-02-26 2010-02-11 Koninklijke Philips Electronics N.V. Method and device for optical analysis of a tissue
AU2008251861B2 (en) * 2007-05-10 2014-03-20 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
WO2008140758A1 (en) * 2007-05-10 2008-11-20 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
US20080283772A1 (en) * 2007-05-10 2008-11-20 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
US20100167413A1 (en) * 2007-05-10 2010-07-01 Paul Lundquist Methods and systems for analyzing fluorescent materials with reduced autofluorescence
US7714303B2 (en) 2007-05-10 2010-05-11 Pacific Biosciences Of California, Inc. Methods and systems for analyzing fluorescent materials with reduced autofluorescence
US20080277595A1 (en) * 2007-05-10 2008-11-13 Pacific Biosciences Of California, Inc. Highly multiplexed confocal detection systems and methods of using same
US10499797B2 (en) 2007-07-03 2019-12-10 The Board Of Trustees Of The Leland Stanford Junior University System and method useful for sarcomere imaging via objective-based microscopy
US20090015923A1 (en) * 2007-07-09 2009-01-15 Auld Jack R Multi-Spot Ophthalmic Laser Probe
US7566173B2 (en) * 2007-07-09 2009-07-28 Alcon, Inc. Multi-spot ophthalmic laser probe
WO2009038668A3 (en) * 2007-09-14 2009-05-22 Pacific Biosciences California Highly multiplexed confocal detection systems and methods of using same
WO2009038668A2 (en) * 2007-09-14 2009-03-26 Pacific Biosciences Of California, Inc. Highly multiplexed confocal detection systems and methods of using same
US20080027279A1 (en) * 2007-10-24 2008-01-31 Abou El Kheir Tarek A N Endoscopic System and Method for Therapeutic Applications and Obtaining 3-Dimensional Human Vision Simulated Imaging With Real Dynamic Convergence
US8105233B2 (en) 2007-10-24 2012-01-31 Tarek Ahmed Nabil Abou El Kheir Endoscopic system and method for therapeutic applications and obtaining 3-dimensional human vision simulated imaging with real dynamic convergence
US20090141140A1 (en) * 2007-12-03 2009-06-04 Robinson M Dirk End-to-end design of electro-optic imaging systems for color-correlated objects
US8149319B2 (en) * 2007-12-03 2012-04-03 Ricoh Co., Ltd. End-to-end design of electro-optic imaging systems for color-correlated objects
WO2009091317A1 (en) * 2008-01-18 2009-07-23 Hemocue Ab Apparatus and method for analysis of particles in a liquid sample
US20090185734A1 (en) * 2008-01-18 2009-07-23 Hemocue Ab Apparatus and method for analysis of particles in a liquid sample
US20090236549A1 (en) * 2008-03-21 2009-09-24 Vogt William I Apparatus for improving detection efficiency of multiphoton microscopy systems by focus compensation, pupil image division, and parallel pupil rearrangement
US7943889B2 (en) * 2008-03-21 2011-05-17 Prairie Technologies, Inc. Apparatus for improving detection efficiency of multiphoton microscopy systems by focus compensation, pupil image division, and parallel pupil rearrangement
US20110077528A1 (en) * 2008-03-28 2011-03-31 Volcano Corporation Method and apparatus for simultaneous hemoglobin reflectivity measurement and OCT measurement, thrombus detection and treatment, and OCT flushing
US8526001B2 (en) * 2008-08-22 2013-09-03 Ciencia, Inc. Versatile surface plasmon resonance analyzer with an integral surface plasmon resonance enhanced fluorescence mode
US8368897B2 (en) * 2008-08-22 2013-02-05 Ciencia, Inc. Versatile surface plasmon resonance analyzer with an integral surface plasmon resonance enhanced fluorescence mode
US20100078575A1 (en) * 2008-08-22 2010-04-01 Reilly Michael T Versatile Surface Plasmon Resonance Analyzer with an Integral Surface Plasmon Resonance Enhanced Fluorescence Mode
US10908084B2 (en) 2008-10-14 2021-02-02 Timothy M. Ragan Devices and methods for direct-sampling analog time-resolved detection
US8610808B2 (en) * 2008-12-22 2013-12-17 Koninklijke Philips N.V. Color CMOS imager with single photon counting capability
US20110249148A1 (en) * 2008-12-22 2011-10-13 Koninklijke Philips Electronics N.V. Cmos imager with single photon counting capability
US20120005793A1 (en) * 2009-03-23 2012-01-05 Nenad Ocelic Near field optical microscope
US8832861B2 (en) * 2009-03-23 2014-09-09 Neaspec Gmbh Near field optical microscope
US8913127B2 (en) 2009-06-01 2014-12-16 Bio-Rad Laboratories, Inc. Calibration of imaging device for biological/chemical samples
US20110134238A1 (en) * 2009-06-01 2011-06-09 Bio-Rad Laboratories, Inc. Calibration of imaging device for biological/chemical samples
WO2010141486A1 (en) * 2009-06-01 2010-12-09 Bio-Rad Laboratories, Inc. Calibration of imaging device for biological/chemical samples
US9846121B2 (en) * 2009-06-17 2017-12-19 W.O.M. World Of Medicine Gmbh Device and method for multi-photon fluorescence microscopy for obtaining information from biological tissue
US20150069268A1 (en) * 2009-06-17 2015-03-12 W.O.M. World Of Medicine Ag Device and method for multi-photon fluorescence microscopy for obtaining information from biological tissue
US20100322502A1 (en) * 2009-06-18 2010-12-23 Olympus Corporation Medical diagnosis support device, image processing method, image processing program, and virtual microscope system
US8759792B2 (en) * 2009-07-10 2014-06-24 The United States Of America, As Represented By The Secretary, Dept. Of Health And Human Services Non-contact total emission detection method and system for multi-photon microscopy
US20130153788A1 (en) * 2009-07-10 2013-06-20 The Govt. of the United States of America, as represented by the Secretary, D.H.H.S. Non-contact total emission detection method and system for multi-photon microscopy
WO2011056658A1 (en) * 2009-10-27 2011-05-12 Duke University Multi-photon microscopy via air interface objective lens
US8398240B2 (en) 2009-11-24 2013-03-19 Alcon Research, Ltd. Single-fiber multi-spot laser probe for ophthalmic endoillumination
US20110122366A1 (en) * 2009-11-24 2011-05-26 Smith Ronald T Single-fiber multi-spot laser probe for ophthalmic endoillumination
US20110144627A1 (en) * 2009-12-15 2011-06-16 Smith Ronald T Multi-spot laser probe
US8951244B2 (en) 2009-12-15 2015-02-10 Alcon Research, Ltd. Multi-spot laser probe
US20160195793A1 (en) * 2010-01-22 2016-07-07 Fianium Ltd. Optical Sources
US10527907B2 (en) * 2010-01-22 2020-01-07 Nkt Photonics A/S Optical sources
US11675245B2 (en) 2010-01-22 2023-06-13 Nkt Photonics A/S Optical sources
US8687055B2 (en) * 2010-03-16 2014-04-01 Eli Margalith Spectral imaging of moving objects with a stare down camera
US20110228116A1 (en) * 2010-03-16 2011-09-22 Eli Margalith Spectral imaging of moving objects with a stare down camera
US9529128B2 (en) * 2010-04-26 2016-12-27 Hewlett Packard Enterprise Development Lp Non-uniform grating
US20130032734A1 (en) * 2010-04-26 2013-02-07 Santori Charles M Non-uniform grating
US20140118524A1 (en) * 2010-04-28 2014-05-01 Sebastian Munck Method And Apparatus For The Imaging Of A Labeled Biological Sample
US9569828B2 (en) * 2010-04-28 2017-02-14 Vib Vzw Method and apparatus for the imaging of a labeled biological sample
US8957956B2 (en) * 2010-06-09 2015-02-17 Honeywell International Inc. Method and system for iris image capture
US20110304721A1 (en) * 2010-06-09 2011-12-15 Honeywell International Inc. Method and system for iris image capture
US20120105593A1 (en) * 2010-10-29 2012-05-03 Sony Corporation Multi-view video and still 3d capture system
US8842168B2 (en) * 2010-10-29 2014-09-23 Sony Corporation Multi-view video and still 3D capture system
US10908087B2 (en) 2010-11-15 2021-02-02 Tissuevision, Inc. Systems and methods for imaging and processing tissue
US8771978B2 (en) 2010-11-15 2014-07-08 Tissuevision, Inc. Systems and methods for imaging and processing tissue
US9983134B2 (en) 2010-11-15 2018-05-29 Timothy Ragan Systems and methods for imaging and processing tissue
US20130256564A1 (en) * 2010-11-22 2013-10-03 Deutsches Krebsforschungszentrum STED Microscopy With Pulsed Excitation, Continuous Stimulation, And Gated Registration Of Spontaneously Emitted Fluorescence Light
US9551658B2 (en) * 2010-11-22 2017-01-24 Max-Planck-Gesellschaft Zur Foerderung Der Wissenschaften E.V. STED microscopy with pulsed excitation, continuous stimulation, and gated registration of spontaneously emitted fluorescence light
US9103721B2 (en) 2011-04-07 2015-08-11 Uwm Research Foundation, Inc. High speed microscope with spectral resolution
US20120257038A1 (en) * 2011-04-07 2012-10-11 Valerica Raicu High speed microscope with narrow detector and pixel binning
US8982206B2 (en) * 2011-04-07 2015-03-17 Uwm Research Foundation, Inc. High speed microscope with narrow detector and pixel binning
US20140023993A1 (en) * 2011-04-08 2014-01-23 British Columbia Cancer Agency Branch Apparatus and methods for multiphoton microscopy
US9687152B2 (en) * 2011-04-08 2017-06-27 British Columbia Cancer Agency Branch Apparatus and methods for multiphoton microscopy
US9372334B2 (en) * 2011-06-17 2016-06-21 Leica Microsystems Cms Gmbh Microscope and method for fluorescence imaging microscopy
US20120320184A1 (en) * 2011-06-17 2012-12-20 Leica Microsystems Cms Gmbh Microscope and Method for Fluorescence Imaging Microscopy
US9095414B2 (en) * 2011-06-24 2015-08-04 The Regents Of The University Of California Nonlinear optical photodynamic therapy (NLO-PDT) of the cornea
US10292865B2 (en) 2011-06-24 2019-05-21 The Regents Of The University Of California Nonlinear optical photodynamic therapy (NLO-PDT) of the cornea
US8594455B2 (en) * 2011-09-28 2013-11-26 The United States Of America As Represented By The Secretary Of The Army System and method for image enhancement and improvement
WO2013103475A3 (en) * 2011-12-09 2015-06-18 Massachusetts Institute Of Technology Portable optical fiber probe-based spectroscopic scanner for rapid cancer diagnosis
US8520219B2 (en) 2011-12-19 2013-08-27 Perceptron, Inc. Non-contact sensor having improved laser spot
CN104136882A (en) * 2011-12-19 2014-11-05 感知器股份有限公司 Non-contact sensor having improved laser spot
WO2013096454A1 (en) * 2011-12-19 2013-06-27 Perceptron, Inc. Non-contact sensor having improved laser spot
US20150160446A1 (en) * 2012-01-24 2015-06-11 Carl Zeiss Microscopy Gmbh Microscope and method for high-resolution 3d fluorescence microscopy
US9885860B2 (en) * 2012-01-24 2018-02-06 Carl Zeiss Microscopy Gmbh Microscope and method for high-resolution 3D fluorescence microscopy
US20140012104A1 (en) * 2012-02-21 2014-01-09 National Taiwan University Method for Observing, Identifying, and Detecting Blood Cells
US9989754B2 (en) 2012-03-09 2018-06-05 Carl Zeiss Microscopy Gmbh Light scanning microscope with spectral detection
US9236943B2 (en) * 2012-08-14 2016-01-12 The Institute Of Optics And Electronics, The Chinese Academy Of Sciences Method of eliminating offset in spot centroid due to crosstalk
US20140178078A1 (en) * 2012-08-14 2014-06-26 The Institute Of Optics And Electronics, The Chinese Academy Of Sciences Method of eliminating offset in spot centroid due to crosstalk
US20140098213A1 (en) * 2012-10-05 2014-04-10 Canon Kabushiki Kaisha Imaging system and control method for same
US10359271B2 (en) 2012-12-05 2019-07-23 Perimeter Medical Imaging, Inc. System and method for tissue differentiation in imaging
US9677869B2 (en) 2012-12-05 2017-06-13 Perimeter Medical Imaging, Inc. System and method for generating a wide-field OCT image of a portion of a sample
US10245181B2 (en) 2012-12-21 2019-04-02 Alcon Research, Ltd. Grin fiber multi-spot laser probe
WO2014134314A1 (en) * 2013-03-01 2014-09-04 The Johns Hopkins University Light sources, medical devices, and methods of illuminating an object of interest
US10317657B2 (en) 2013-08-15 2019-06-11 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
JP2019117387A (en) * 2013-08-15 2019-07-18 カール ツァイス マイクロスコピー ゲーエムベーハーCarl Zeiss Microscopy Gmbh High-resolution scanning microscopic method
WO2015022147A1 (en) * 2013-08-15 2015-02-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
JP2016532154A (en) * 2013-08-15 2016-10-13 カール ツァイス マイクロスコピー ゲーエムベーハーCarl Zeiss Microscopy Gmbh High resolution scanning microscopy
US10281701B2 (en) 2013-08-15 2019-05-07 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
CN108873285A (en) * 2013-08-15 2018-11-23 卡尔蔡司显微镜有限责任公司 High resolution scanning microscopy
WO2015022145A1 (en) * 2013-08-15 2015-02-19 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
US11372223B2 (en) 2013-08-15 2022-06-28 Carl Zeiss Microscopy Gmbh High-resolution scanning microscopy
CN105556370A (en) * 2013-08-15 2016-05-04 卡尔蔡司显微镜有限责任公司 High-resolution scanning microscopy
JP2016528557A (en) * 2013-08-15 2016-09-15 カール ツァイス マイクロスコピー ゲーエムベーハーCarl Zeiss Microscopy Gmbh High resolution scanning microscopy
EP2881778A1 (en) * 2013-12-04 2015-06-10 Olympus Corporation Scanning light microscope with focus adjustment
US9983396B2 (en) 2013-12-04 2018-05-29 Olympus Corporation Scanning microscope having focal position adjustment unit which includes a deflecting element
JP2015108718A (en) * 2013-12-04 2015-06-11 オリンパス株式会社 Scanning microscope
US9820652B2 (en) 2013-12-09 2017-11-21 The Board Of Trustees Of The Leland Stanford Junior University Multi-photon microscope having an excitation-beam array
JP2017502300A (en) * 2013-12-24 2017-01-19 ティシュヴィジョン、インコーポレーテッド Multifocal multiphoton imaging system and method
US10626453B2 (en) 2014-04-14 2020-04-21 Sri International Portable nucleic acid analysis system and high-performance microfluidic electroactive polymer actuators
US20170030835A1 (en) * 2014-04-17 2017-02-02 The Regents Of The University Of California Parallel acquisition of spectral signals from a 2-d laser beam array
US10156522B2 (en) * 2014-04-17 2018-12-18 The Regents Of The University Of California Parallel acquisition of spectral signals from a 2-D laser beam array
US20150310242A1 (en) * 2014-04-24 2015-10-29 Sick Ag Camera and method for the detection of a moved flow of objects
US9282237B2 (en) 2014-07-17 2016-03-08 Schlage Lock Company Llc Multifocal iris recognition device
AU2015296920B2 (en) * 2014-07-28 2017-04-13 Alcon Inc. Increased depth of field microscope and associated devices, systems, and methods
US9844314B2 (en) * 2014-07-28 2017-12-19 Novartis Ag Increased depth of field microscope and associated devices, systems, and methods
US10261298B1 (en) * 2014-12-09 2019-04-16 The Board Of Trustees Of The Leland Stanford Junior University Near-infrared-II confocal microscope and methods of use
US11726036B2 (en) 2014-12-23 2023-08-15 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US11035793B2 (en) 2014-12-23 2021-06-15 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US10551605B2 (en) 2014-12-23 2020-02-04 Apple Inc. Confocal inspection system having non-overlapping annular illumination and collection regions
US10274426B2 (en) 2014-12-23 2019-04-30 Apple Inc. Optical inspection system and method including accounting for variations of optical path length within a sample
US10718931B2 (en) 2014-12-23 2020-07-21 Apple Inc. Confocal inspection system having averaged illumination and averaged collection paths
US10488640B2 (en) * 2015-01-20 2019-11-26 Hamamatsu Photonics K.K. Image acquisition device and image acquisition method
US20180267283A1 (en) * 2015-01-20 2018-09-20 Hamamatsu Photonics K.K. Image acquisition device and image acquisition method
US10816472B2 (en) 2015-01-20 2020-10-27 Hamamatsu Photonics K.K. Image acquisition device and image acquisition method
US9594024B2 (en) * 2015-02-06 2017-03-14 Commissariat à l'énergie atomique et aux énergies alternatives Method for correcting a signal backscattered by a sample and associated device
US11519832B2 (en) 2015-03-11 2022-12-06 Tissuevision, Inc. Systems and methods for serial staining and imaging
US10788403B2 (en) 2015-03-11 2020-09-29 Tissuevision, Inc. Systems and methods for serial staining and imaging
US10809179B2 (en) 2015-05-28 2020-10-20 Accellix Ltd. System and apparatus for blind deconvolution of flow cytometer particle emission
US10527538B2 (en) 2015-05-28 2020-01-07 Accellix Ltd. System and apparatus for blind deconvolution of flow cytometer particle emission
US20180157021A1 (en) * 2015-06-02 2018-06-07 Life Technologies Corporation Systems and methods for generating a structured illumination image
US10634892B2 (en) * 2015-06-02 2020-04-28 Life Technologies Corporation Systems and methods for generating a structured illumination image
US11585749B2 (en) 2015-09-01 2023-02-21 Apple Inc. Reference switch architectures for noncontact sensing of substances
CN108027425A (en) * 2015-09-18 2018-05-11 Robert Bosch GmbH Lidar sensor
US10996322B2 (en) * 2015-09-18 2021-05-04 Robert Bosch Gmbh Lidar sensor
US20180267148A1 (en) * 2015-09-18 2018-09-20 Robert Bosch Gmbh Lidar sensor
US11190750B2 (en) 2015-09-24 2021-11-30 Ouster, Inc. Optical imaging system with a plurality of sense channels
US11025885B2 (en) 2015-09-24 2021-06-01 Ouster, Inc. Optical system for collecting distance information within a field
US11178381B2 (en) 2015-09-24 2021-11-16 Ouster, Inc. Optical system for collecting distance information within a field
US11956410B2 (en) 2015-09-24 2024-04-09 Ouster, Inc. Optical system for collecting distance information within a field
US11196979B2 (en) 2015-09-24 2021-12-07 Ouster, Inc. Optical system for collecting distance information within a field
US11202056B2 (en) 2015-09-24 2021-12-14 Ouster, Inc. Optical system with multiple light emitters sharing a field of view of a pixel detector
US11627298B2 (en) 2015-09-24 2023-04-11 Ouster, Inc. Optical system for collecting distance information within a field
DE102015116598A1 (en) 2015-09-30 2017-03-30 Carl Zeiss Microscopy Gmbh Method and microscope for high-resolution imaging by means of SIM
EP3394579B1 (en) * 2015-12-21 2023-09-20 Verily Life Sciences LLC Systems and methods for determining an identity of a probe in a target based on colors and locations of two or more fluorophores in the probe and in the target
US11172826B2 (en) 2016-03-08 2021-11-16 Enspectra Health, Inc. Non-invasive detection of skin disease
US11877826B2 (en) 2016-03-08 2024-01-23 Enspectra Health, Inc. Non-invasive detection of skin disease
US10890530B2 (en) 2016-04-01 2021-01-12 National University Corporation Hamamatsu University School Of Medicine Image acquisition device and image acquisition method
CN109073873A (en) * 2016-04-01 2018-12-21 National University Corporation Hamamatsu University School of Medicine Image acquisition device and image acquisition method
AU2017253712B8 (en) * 2016-04-21 2019-11-21 Apple Inc. Optical system for reference switching
WO2017184420A1 (en) * 2016-04-21 2017-10-26 Bribbla Dynamics Llc Optical system for reference switching
US20190128734A1 (en) * 2016-04-21 2019-05-02 Apple Inc. Optical system for reference switching
AU2017253712A1 (en) * 2016-04-21 2019-06-27 Apple Inc. Optical system for reference switching
US10788366B2 (en) * 2016-04-21 2020-09-29 Apple Inc. Optical system for reference switching
US11243115B2 (en) 2016-04-21 2022-02-08 Apple Inc. Optical system for reference switching
AU2017253712B2 (en) * 2016-04-21 2019-11-07 Apple Inc. Optical system for reference switching
US20190163132A1 (en) * 2016-05-06 2019-05-30 Uwm Research Foundation, Inc. Snapshot optical tomography system and method of acquiring an image with the system
US10845759B2 (en) * 2016-05-06 2020-11-24 Uwm Research Foundation, Inc. Snapshot optical tomography system and method of acquiring an image with the system
WO2018094290A1 (en) 2016-11-18 2018-05-24 Tissuevision, Inc. Automated tissue section capture, indexing and storage system and methods
IT201600117339A1 (en) * 2016-11-21 2018-05-21 Crestoptics S P A Super-resolution apparatus for fluorescence analysis of the fundus of the eye
WO2018169601A1 (en) * 2017-03-13 2018-09-20 The Charles Stark Draper Laboratory, Inc. Light detection and ranging (lidar) system and method
US10473766B2 (en) 2017-03-13 2019-11-12 The Charles Stark Draper Laboratory, Inc. Light detection and ranging (LiDAR) system and method
US11633149B2 (en) 2017-04-28 2023-04-25 Enspectra Health, Inc. Systems and methods for imaging and measurement of sarcomeres
CN108872188A (en) * 2017-05-11 2018-11-23 Kaiser Optical Systems Inc. Endoscope immersion probe tip optics for laser spectroscopy
CN108872188B (en) * 2017-05-11 2021-07-13 凯塞光学系统股份有限公司 Endoscope immersion probe tip optics for laser spectroscopy
US10481385B2 (en) * 2017-05-11 2019-11-19 Kaiser Optical Systems Inc. Endoscopic immersion probe end optics for laser spectroscopy
US20180329196A1 (en) * 2017-05-11 2018-11-15 Kaiser Optical Systems Inc. Endoscopic immersion probe end optics for laser spectroscopy
US11762180B2 (en) * 2017-06-16 2023-09-19 University Court Of The University Of St Andrews Three-photon light sheet imaging
US10577573B2 (en) 2017-07-18 2020-03-03 Perimeter Medical Imaging, Inc. Sample container for stabilizing and aligning excised biological tissue samples for ex vivo analysis
US10894939B2 (en) 2017-07-18 2021-01-19 Perimeter Medical Imaging, Inc. Sample container for stabilizing and aligning excised biological tissue samples for ex vivo analysis
US11579080B2 (en) 2017-09-29 2023-02-14 Apple Inc. Resolve path optical sampling architectures
WO2019118582A3 (en) * 2017-12-12 2020-03-26 Trustees Of Boston University Multi-z confocal imaging system
US11042016B2 (en) 2017-12-12 2021-06-22 Trustees Of Boston University Multi-Z confocal imaging system
US11307143B2 (en) * 2017-12-12 2022-04-19 Allen Institute Systems, apparatuses and methods for simultaneous multi-plane imaging
US11823403B2 (en) 2017-12-27 2023-11-21 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11900623B2 (en) 2017-12-27 2024-02-13 Cilag Gmbh International Hyperspectral imaging with tool tracking in a light deficient environment
US20210116693A1 (en) * 2018-03-27 2021-04-22 PXYL Limited Non-linear optical scanning microscope
JP2019174245A (en) * 2018-03-28 2019-10-10 The University of Tokyo X-ray imaging method and X-ray imaging device
JP7078815B2 (en) 2018-03-28 2022-06-01 Hidekazu Mimura X-ray imaging method and X-ray imaging device
US10898726B2 (en) * 2018-09-25 2021-01-26 Siemens Healthcare Gmbh Providing an annotated medical image data set for a patient's radiotherapy planning
CN109656014A (en) * 2019-01-31 2019-04-19 Beijing Transcend Vivoscope Biotech Co., Ltd. Multichannel fluorescence collection device and three-dimensional nonlinear laser scanning cavity endoscope
US11516374B2 (en) 2019-06-05 2022-11-29 Synaptics Incorporated Under-display image sensor
US11280675B2 (en) 2019-06-18 2022-03-22 Ruolin Li Method, system and apparatus for a Raman spectroscopic measurement system
WO2020257513A1 (en) * 2019-06-18 2020-12-24 Ruolin Li Method, system and apparatus for a raman spectroscopic measurement system
US11617541B2 (en) 2019-06-20 2023-04-04 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for fluorescence imaging
US11740448B2 (en) 2019-06-20 2023-08-29 Cilag Gmbh International Driving light emissions according to a jitter specification in a fluorescence imaging system
US11477390B2 (en) 2019-06-20 2022-10-18 Cilag Gmbh International Fluorescence imaging with minimal area monolithic image sensor
US11432706B2 (en) 2019-06-20 2022-09-06 Cilag Gmbh International Hyperspectral imaging with minimal area monolithic image sensor
US11516388B2 (en) 2019-06-20 2022-11-29 Cilag Gmbh International Pulsed illumination in a fluorescence imaging system
US11412920B2 (en) 2019-06-20 2022-08-16 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11533417B2 (en) 2019-06-20 2022-12-20 Cilag Gmbh International Laser scanning and tool tracking imaging in a light deficient environment
US11540696B2 (en) 2019-06-20 2023-01-03 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11550057B2 (en) 2019-06-20 2023-01-10 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11412152B2 (en) 2019-06-20 2022-08-09 Cilag Gmbh International Speckle removal in a pulsed hyperspectral imaging system
US11398011B2 (en) 2019-06-20 2022-07-26 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed laser mapping imaging system
US11589819B2 (en) 2019-06-20 2023-02-28 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a laser mapping imaging system
US11944273B2 (en) 2019-06-20 2024-04-02 Cilag Gmbh International Fluorescence videostroboscopy of vocal cords
US11612309B2 (en) 2019-06-20 2023-03-28 Cilag Gmbh International Hyperspectral videostroboscopy of vocal cords
US11949974B2 (en) 2019-06-20 2024-04-02 Cilag Gmbh International Controlling integral energy of a laser pulse in a fluorescence imaging system
US11622094B2 (en) 2019-06-20 2023-04-04 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11389066B2 (en) 2019-06-20 2022-07-19 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11624830B2 (en) 2019-06-20 2023-04-11 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for laser mapping imaging
US11633089B2 (en) 2019-06-20 2023-04-25 Cilag Gmbh International Fluorescence imaging with minimal area monolithic image sensor
US11291358B2 (en) 2019-06-20 2022-04-05 Cilag Gmbh International Fluorescence videostroboscopy of vocal cords
US11668919B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Driving light emissions according to a jitter specification in a laser mapping imaging system
US11668920B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Driving light emissions according to a jitter specification in a fluorescence imaging system
US11668921B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Driving light emissions according to a jitter specification in a hyperspectral, fluorescence, and laser mapping imaging system
US11671691B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Image rotation in an endoscopic laser mapping imaging system
US11674848B2 (en) 2019-06-20 2023-06-13 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for hyperspectral imaging
US11937784B2 (en) 2019-06-20 2024-03-26 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11686847B2 (en) 2019-06-20 2023-06-27 Cilag Gmbh International Pulsed illumination in a fluorescence imaging system
US11700995B2 (en) 2019-06-20 2023-07-18 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11716543B2 (en) 2019-06-20 2023-08-01 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11712155B2 (en) 2019-06-20 2023-08-01 Cilag Gmbh International Fluorescence videostroboscopy of vocal cords
US11940615B2 (en) 2019-06-20 2024-03-26 Cilag Gmbh International Driving light emissions according to a jitter specification in a multispectral, fluorescence, and laser mapping imaging system
US11727542B2 (en) 2019-06-20 2023-08-15 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11471055B2 (en) 2019-06-20 2022-10-18 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11747479B2 (en) 2019-06-20 2023-09-05 Cilag Gmbh International Pulsed illumination in a hyperspectral, fluorescence and laser mapping imaging system
US11758256B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11754500B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Minimizing image sensor input/output in a pulsed fluorescence imaging system
US11154188B2 (en) 2019-06-20 2021-10-26 Cilag Gmbh International Laser mapping imaging and videostroboscopy of vocal cords
US11931009B2 (en) 2019-06-20 2024-03-19 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a hyperspectral imaging system
US11788963B2 (en) 2019-06-20 2023-10-17 Cilag Gmbh International Minimizing image sensor input/output in a pulsed fluorescence imaging system
US11925328B2 (en) 2019-06-20 2024-03-12 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral imaging system
US11793399B2 (en) 2019-06-20 2023-10-24 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed hyperspectral imaging system
US11924535B2 (en) 2019-06-20 2024-03-05 Cilag Gmbh International Controlling integral energy of a laser pulse in a laser mapping imaging system
US11821989B2 (en) 2019-06-20 2023-11-21 Cilag Gmbh International Hyperspectral, fluorescence, and laser mapping imaging with fixed pattern noise cancellation
US11903563B2 (en) 2019-06-20 2024-02-20 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11898909B2 (en) 2019-06-20 2024-02-13 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11854175B2 (en) 2019-06-20 2023-12-26 Cilag Gmbh International Fluorescence imaging with fixed pattern noise cancellation
US11882352B2 (en) 2019-06-20 2024-01-23 Cilag Gmbh International Controlling integral energy of a laser pulse in a hyperspectral, fluorescence, and laser mapping imaging system
US11877065B2 (en) 2019-06-20 2024-01-16 Cilag Gmbh International Image rotation in an endoscopic hyperspectral imaging system
CN110441311A (en) * 2019-07-22 2019-11-12 Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences Multi-axis multifocal lens for imaging of multiple object planes
US11153513B2 (en) * 2019-08-19 2021-10-19 Synaptics Incorporated Light source for camera
CN113168019A (en) * 2019-10-18 2021-07-23 Google LLC Diffractive optical element for large field imaging
US11789250B2 (en) * 2019-11-06 2023-10-17 Technische Universität Braunschweig Optical detection device and method for operating an optical detection device
US11076080B2 (en) 2019-12-05 2021-07-27 Synaptics Incorporated Under-display image sensor for eye tracking
CN111693545A (en) * 2020-06-02 2020-09-22 Harbin Engineering University Composite structure array probe for testing lath and optical fiber white light interference device
CN112051247A (en) * 2020-08-21 2020-12-08 Hangzhou Dianzi University Lens-free imaging device based on ptychographic imaging and phase recovery method thereof
US11852318B2 (en) 2020-09-09 2023-12-26 Apple Inc. Optical system for noise mitigation
CN113390850A (en) * 2021-06-02 2021-09-14 Fudan University Gastric Raman femtosecond-picosecond image mapping method based on U-shaped convolutional neural network
US11835442B2 (en) 2021-09-07 2023-12-05 Tsinghua University Flow imaging system based on matrix laser scanning
WO2023035281A1 (en) * 2021-09-07 2023-03-16 Tsinghua University Dot matrix laser scanning-based flow imaging system
CN113820035A (en) * 2021-09-28 2021-12-21 Tianjin Fire Research Institute of the Ministry of Emergency Management Femtosecond laser filamentation remote non-contact temperature measuring device and measuring method
US11960131B2 (en) 2022-01-13 2024-04-16 Apple Inc. Integrated photonics device having integrated edge outcouplers
WO2023211772A1 (en) * 2022-04-25 2023-11-02 The United States Of America, As Represented By The Secretary, Department Of Health And Human Services Lens array based imaging system with improved field of view

Also Published As

Publication number Publication date
US10598597B2 (en) 2020-03-24
WO2006127967A3 (en) 2007-03-08
EP2703871A2 (en) 2014-03-05
US20180202935A1 (en) 2018-07-19
EP1889111A2 (en) 2008-02-20
EP2703871A3 (en) 2014-09-03
WO2006127967A2 (en) 2006-11-30

Similar Documents

Publication Publication Date Title
US10598597B2 (en) Multifocal imaging systems and method
JP4067826B2 (en) Imaging system and imaging method thereof
JP5112430B2 (en) System and method for quantifying matrix related tissue dynamics and disease
CN101228428B (en) Fluorescent nanoscopy method
Homma et al. Wide-field and two-photon imaging of brain activity with voltage and calcium-sensitive dyes
Kim et al. Multifocal multiphoton microscopy based on multianode photomultiplier tubes
Becker et al. Picosecond fluorescence lifetime microscopy by TCSPC imaging
US8970671B2 (en) Nondiffracting beam detection devices for three-dimensional imaging
Becker et al. Multiwavelength TCSPC lifetime imaging
US20040068193A1 (en) Optical devices for medical diagnostics
HU227859B1 (en) Real-time 3D nonlinear microscope measuring system and its application
JP2007504445A (en) Time-dependent fluorescence measurement
US7692160B2 (en) Method and system of optical imaging for target detection in a scattering medium
US8964183B2 (en) Systems and methods for screening of biological samples
Haustein et al. Trends in fluorescence imaging and related techniques to unravel biological information
JP2006275964A (en) Shading correction method for scanning fluorescence microscope
Niesner et al. Intravital two‐photon microscopy: focus on speed and time resolved imaging modalities
US20230221178A1 (en) Apparatus and a method for fluorescence imaging
Crosignani et al. Deep tissue imaging by enhanced photon collection
Becker et al. Lifetime imaging with the Zeiss LSM-510
Larson et al. Fundamentals of reflectance confocal microscopy
Olivier et al. Confocal laser scanning microscopy
Talbot et al. A multispectral FLIM tomograph for in-vivo imaging of skin cancer
Becker et al. Fluorescence lifetime images and correlation spectra obtained by multidimensional TCSPC
Luu et al. More than double the fun with two-photon excitation microscopy

Legal Events

Date Code Title Description
AS Assignment

Owner name: MASSACHUSETTS INSTITUTE OF TECHNOLOGY, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAHLMAN, KARSTEN;KIM, KI H.;RAGAN, TIMOTHY;AND OTHERS;REEL/FRAME:018631/0007

Effective date: 20061114

AS Assignment

Owner name: NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT

Free format text: CONFIRMATORY LICENSE;ASSIGNOR:MASSACHUSETTS INSTITUTE OF TECHNOLOGY;REEL/FRAME:027802/0634

Effective date: 20120216

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION