US20080033301A1 - Method for performing a procedure according to a biometric image - Google Patents

Method for performing a procedure according to a biometric image

Info

Publication number
US20080033301A1
US20080033301A1 (application US11/739,342)
Authority
US
United States
Prior art keywords
image
iris
procedure
patient
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/739,342
Inventor
Michael DellaVecchia
Larry Donoso
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PHILADELPHIA RETINA ENDOWMENT FUND A PENNSYLVANIA Corp
PHILADELPHIA RETINA ENDOWMENT FUND
Original Assignee
PHILADELPHIA RETINA ENDOWMENT FUND
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/011,187 external-priority patent/US6648473B2/en
Priority to US11/739,342 priority Critical patent/US20080033301A1/en
Application filed by PHILADELPHIA RETINA ENDOWMENT FUND filed Critical PHILADELPHIA RETINA ENDOWMENT FUND
Assigned to PHILADELPHIA RETINA ENDOWMENT FUND reassignment PHILADELPHIA RETINA ENDOWMENT FUND ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DONOSO, LARRY, MR.
Publication of US20080033301A1 publication Critical patent/US20080033301A1/en
Priority to US12/121,038 priority patent/US7775665B2/en
Priority to US12/754,750 priority patent/US20100204571A1/en
Assigned to PHILADELPHIA RETINA ENDOWMENT FUND reassignment PHILADELPHIA RETINA ENDOWMENT FUND ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DONOSO, LARRY A
Assigned to PHILADELPHIA RETINA ENDOWMENT FUND, A PENNSYLVANIA CORPORATION reassignment PHILADELPHIA RETINA ENDOWMENT FUND, A PENNSYLVANIA CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DONOSO, LARRY
Assigned to PHILADELPHIA RETINA ENDOWMENT FUND reassignment PHILADELPHIA RETINA ENDOWMENT FUND ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DONOSO, LARRY, DELLAVECCHIA, MICHAEL, VORONTSOV, MIKHAIL
Priority to US13/175,910 priority patent/US20110263972A1/en
Priority to US13/400,085 priority patent/US20120150064A1/en
Priority to US13/707,293 priority patent/US20130096544A1/en
Priority to US14/023,488 priority patent/US20140009742A1/en
Priority to US14/514,461 priority patent/US20160198952A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/007 Methods or devices for eye surgery
    • A61F9/008 Methods or devices for eye surgery using laser
    • A61F9/00821 Methods or devices for eye surgery using laser for coagulation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons

Definitions

  • This invention relates to a method and a system for high-resolution retinal imaging, eye aberration compensation, and diagnostics based on adaptive optics with direct optimization of an image quality metric using a stochastic parallel perturbative gradient descent technique.
  • Adaptive optics is a promising technique for both diagnostics of optical aberrations of the eye and substantially aberration-free high-resolution imaging of the retina.
  • adaptive correction is based on illumination of the retina by a collimated laser beam to create a small laser spot on the retina surface, with subsequent measurement of phase aberrations of the wave scattered by the retina tissue. Correction of eye optical aberrations is then performed using the conventional phase conjugation technique.
  • This traditional approach has several important drawbacks.
  • One important drawback is the danger due to an invasive use of the laser beam focused onto the retina.
  • Other drawbacks include overall system complexity and the high cost of the necessary adaptive optics elements such as a wavefront sensor and wavefront reconstruction hardware.
  • the laser spot size on the retina is not small enough to use it as a reference point-type light source and hence conjugation of the measured wavefront does not result in optimal optical aberration correction.
  • the traditional approach can produce a turbid image that can make performing an operation with a microscope difficult.
  • U.S. Pat. No. 5,912,731 entitled “Hartmann-type Optical Wavefront Sensor” issued to DeLong, et al. on Jun. 5, 1999 teaches an adaptive optics system using adjustable optical elements to compensate for aberrations in an optical beam.
  • the aberrations may be caused, for example, by propagation of the beam through the atmosphere.
  • the aberrated beam can be reflected from a deformable mirror having many small elements, each having an associated separate actuator.
  • Part of the reflected beam taught by DeLong can be split off and directed to impinge on a sensor array which provides measurements indicative of the wavefront distortion in the reflected beam.
  • the wavefront distortion measurements can then be fed back to the deformable mirror to provide continuous corrections by appropriately moving the mirror elements.
  • Configurations such as this, wherein the array of small lenses is referred to as a lenslet array, can be referred to as Shack-Hartmann wavefront sensors.
  • DeLong teaches a wavefront sensor for use in measuring local phase tilt in two dimensions over an optical beam cross section, using only one lenslet arrangement and one camera sensor array.
  • the measurements of DeLong are made with respect to first and second orthogonal sets of grid lines intersecting at points of interest corresponding to positions of optical device actuators. While this method does teach a way to correct aberrations in a non-laser light system, it cannot be used in cases where lasers are required.
  • U.S. Pat. No. 6,019,472 issued to Koester, et al. entitled “Contact Lens Element For Examination or Treatment of Ocular Tissues” issued on Feb. 1, 2000 teaches a multi-layered contact lens element including a plurality of lens elements wherein a first lens element has a recess capable of holding a volume of liquid against a cornea of the eye. A microscope is connected to the contact lens element to assist in the examination or treatment of ocular tissues.
  • U.S. Pat. No. 6,143,011 issued to Hood, et al. entitled “Hydrokeratome For Refractive Surgery” issued on Nov. 7, 2000 teaches a high speed liquid jet for forming ophthalmic incisions.
  • the Hood, et al. system is adapted for high precision positioning of the jet carrier.
  • An alignment beam may be provided by a collimated LED or laser diode. The laser beam can be used to align the system.
  • U.S. Pat. No. 6,155,684 issued to Billie, et al. entitled “Method and Apparatus for Precompensating The Refractive Properties of the Human Eye With Adaptive Optical Feedback Control” issued on Dec. 5, 2000.
  • Billie, et al. teaches a system for directing a beam of light through the eye and reflecting the light from the retina.
  • a lenslet array is used to obtain a digitized acuity map from the reflected light for generating a signal that programs an active mirror.
  • the optical paths of individual beams within the beam of light are made to appear to be substantially equal to each other.
  • the incoming beam can be precompensated to allow for the refractive aberrations of the eyes that are evidenced by the acuity map.
  • the invention includes a method for clarifying an optical/digital image of an object in order to perform a procedure on the object, having the steps of applying to the object a light beam formed of incoherent light, reflecting the applied incoherent light beam from the object to provide a reflected light beam, and providing electrical signals representative of the reflected light beam.
  • An image quality metric is determined in accordance with the electrical signals and an image is determined in accordance with the image quality metric.
  • the procedure is performed in accordance with the image quality metric.
  • a procedure is performed on an eye having an iris.
  • An iris biometric image representative of the iris is obtained and the procedure is performed on an eye in accordance with the iris biometric image.
  • a method for optimizing electromagnetic energy in a system for processing an image of an object in order to perform a procedure on an object includes applying to the object a plurality of light beams formed of incoherent light at a plurality of differing frequencies and reflecting the plurality of applied incoherent light beams from the object to provide a plurality of reflected light beams.
  • the method also includes providing a corresponding plurality of electrical signals representative of the reflected light beams of the plurality of reflected light beams and determining a corresponding plurality of image quality metrics in accordance with the plurality of electrical signals.
  • a corresponding plurality of images is determined in accordance with the plurality of image quality metrics and an image of the plurality of images is selected in accordance with a predetermined image criterion to provide a selected image.
  • the method also includes determining a frequency of the plurality of differing frequencies in accordance with the selected image to provide a determined frequency and performing the procedure on an object in accordance with the determined frequency.
  • the invention also deals with new methods of high-resolution imaging and construction of images of the retina, and adaptive correction and diagnostics of eye optical aberrations, as well as such imaging of articles of manufacture, identifying articles and controlling a manufacturing process. Additionally, the method is applicable to identifying individuals in accordance with such images for medical purposes and for security purposes, such as a verification of an identity of an individual. These applications can be performed using adaptive optics techniques based on parallel stochastic perturbative gradient descent (PSPGD) optimization. This method of optimization is also known as simultaneous perturbation stochastic approximation (SPSA) optimization. Compensation of optical aberrations of the eye and improvement of retina image resolution can be accomplished using an electronically controlled phase spatial light modulator (SLM) as a wavefront aberration corrector interfaced with an imaging sensor and a feedback controller that implements the PSPGD control algorithm.
  • SLM: electronically controlled phase spatial light modulator
  • Examples of the electronically-controlled phase SLMs include a pixelized liquid-crystal device, micro mechanical mirror array, and deformable, piston or tip-tilt mirrors.
  • Wavefront correction can be performed at the SLM and the wavefront aberration compensation is performed using retina image data obtained with an imaging camera (CCD, CMOS etc.) or with a specially designed very large scale integration imaging chip (VLSI imager).
  • the retina imaging data are processed to obtain a signal characterizing the quality of the retinal image (image quality metric) used to control the wavefront correction and compensate the eye aberrations.
  • the image quality computation can be performed externally using an imaging sensor connected with a computer or internally directly on an imaging chip.
  • the image quality metric signal is used as an input signal for the feedback controller.
  • the controller computes control voltages applied to the wavefront aberration corrector.
  • the controller can be implemented as a computer module, a field programmable gate array (FPGA) or a VLSI micro-electronic system performing computations required for optimization of image quality metrics based on the PSPGD algorithm.
  • the use of the PSPGD optimization technique for adaptive compensation of eye aberration provides considerable performance improvement if compared with the existing techniques for retina imaging and eye aberration compensation and diagnostics, and therapeutic applications.
  • the first advantage is that the PSPGD algorithm does not require the use of laser illumination of the retina and consequently significantly reduces the risk of retina damage caused by a focused coherent laser beam.
  • a further advantage is that the PSPGD algorithm does not require the use of a wavefront sensor or wavefront aberration reconstruction computation. This makes the entire system low-cost and compact if compared with the existing adaptive optics systems for retina imaging.
  • the PSPGD algorithm can be implemented using a parallel analog, mixed-mode analog-digital or parallel digital controller because of its parallel nature. This significantly speeds up the operations of the PSPGD algorithm, providing continuous retina image improvement, eye aberration compensation and diagnostics.
  • Optical aberration correction is based on direct optimization of the quality of a retina image obtained using a white-light, incoherent, or partially coherent imaging system.
  • the novel imaging system includes a multi-electrode phase spatial light modulator, or an adaptive mirror controlled with a computer or with a specially designed FPGA or VLSI system.
  • the calculated image quality metric is optimized using a parallel stochastic gradient descent algorithm.
  • the adaptive optical system is used in order to compensate severe optical aberrations of the eye and thus provide a high-resolution image of the retina tissue and/or an eye aberration diagnostic.
  • FIGS. 1 A,B show a schematic representation of a system suitable for practicing the eye aberration correcting method of the present invention.
  • FIG. 2 shows a flow chart representation of a control algorithm suitable for use in the system of FIG. 1 when practicing the method of the present invention.
  • FIGS. 3 A,B show images of an artificial retina before and after correction of an aberration.
  • FIGS. 4 A,B show an eye and a biometric image of the iris of the eye.
  • FIG. 5 shows a block diagram representation of an iris biometric image comparison system which can be used with the aberration correcting system of FIG. 1 .
  • FIG. 6 shows a block diagram representation of an iris positioning system which can be used in cooperation with the aberration correcting system of FIG. 1 .
  • FIG. 7 shows an illumination frequency optimization system which can be used in cooperation with the aberration correcting system of FIG. 1 .
  • FIG. 8 shows an image superpositioning system which can be used with the aberration correcting system of FIG. 1 .
  • FIGS. 1 A,B there are shown schematic representations of the aberration correcting system 10 of the present invention.
  • a light beam from a white light source 1 is redirected by a mirror 2 in order to cause it to enter an eye.
  • the white light beam from the light source 1 can be any kind of incoherent light.
  • the light from the mirror 2 reaches the retina 4 of the eye and reflected light exits the eye to provide two light beams, one passing in each direction, as indicated by arrow 3 .
  • the exiting light beam then passes through an SLM 5 .
  • the light beam from the SLM 5 enters an image sensor 6 .
  • the image sensor 6 can be a charge-coupled device or any other device capable of sensing and digitizing the light beam from the SLM 5.
  • the imaging sensor 6 can include an imaging chip for performing the calculations required to determine an image quality metric.
  • the image quality metric can thus be computed on the imaging chip directly or it can be calculated using a separate computational device/computer 7 that calculates the image quality metric of the retina image. It is the use of a digitized image in this manner that permits the use of an incoherent light rather than a coherent light for performing the operations of the aberration correcting system 10.
  • the computational device 7 sends a measurement signal representative of the image quality metric to a controller 8 .
  • the controller 8 implements a PSPGD algorithm by computing control voltages and applying the computed control voltages to the SLM 5 .
  • the PSPGD algorithm used by the controller 8 can be any conventional PSPGD algorithm known to those of ordinary skill in the art.
  • the controller 8 continuously receives digital information about the quality of the image and continuously updates the control voltages applied to the SLM 5 until the quality of the retina image is optimized according to predetermined image quality optimization criteria.
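  • As an illustrative sketch only (not the patent's implementation), the closed loop formed by the image sensor 6, the computational device 7, the controller 8 and the SLM 5 can be expressed in Python as shown below; the hardware interfaces (capture_image, apply_voltages), the iteration limit and the acceptance threshold are assumptions introduced for illustration, and the single two-sided iteration pspgd_step is sketched separately after the discussion of step 35.

```python
import numpy as np

def run_correction_loop(pspgd_step, capture_image, apply_voltages, image_metric,
                        n_channels, max_iters=2000, target_metric=1.0):
    """Minimal sketch of the feedback loop: the image sensor (6) captures the
    retina image, the computational device (7) reduces it to a scalar image
    quality metric, and the controller (8) updates the voltages applied to the
    SLM (5) until a predetermined optimization criterion is met.

    pspgd_step     : callable performing one PSPGD iteration (sketched later)
    capture_image  : callable returning the current retina image as an array
    apply_voltages : callable sending a voltage vector to the SLM electrodes
    image_metric   : callable mapping an image to a scalar quality metric
    target_metric  : illustrative placeholder for the acceptance criterion
    """
    u = np.zeros(n_channels)              # one control voltage per SLM electrode
    apply_voltages(u)
    for _ in range(max_iters):
        u = pspgd_step(u, capture_image, apply_voltages, image_metric)
        if image_metric(capture_image()) >= target_metric:
            break                         # predetermined image quality reached
    return u
```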
  • FIGS. 2 and 3 A,B there are shown a flow chart representation of a portion of a PSPGD control algorithm 20 for use in cooperation with the aberration correcting system 10 in order to practice the present invention as well as representations of the corrected image, both before correction ( 3 A) and after correction ( 3 B).
  • a single iterative step of the PSPGD control algorithm 20 is shown with a loop for repeating the single iterative step until the quality of the compensation is acceptable.
  • step 25 of the PSPGD control algorithm 20 a measurement and calculation of the image quality metric is performed.
  • This step includes the retinal image capture performed by the image sensor 6 and the calculation of the image quality metric performed by the computational device 7 within the aberration correcting system 10.
  • the image captured by the image sensor 6 at the beginning of the operation of the PSPGD control algorithm 20 can be substantially as shown in FIG. 3A, as previously described.
  • the image quality metric can be a sharpness function.
  • the Laplacian can be calculated by convolving the image with a Laplacian kernel.
  • the convolving of the image can be performed by a special purpose VLSI microchip.
  • the convolving of the image can be performed using a computer that receives an image from a digital camera as described in more detail below.
  • different digital high-pass filters can be used rather than the Laplacian operator.
  • a frequency distribution function can be used rather than a sharpness function when determining the image quality metric.
  • the use of a frequency distribution function allows the system to distinguish tissues of different colors. This is useful where different kinds of tissue, for example, different tumors, have different colors. Locating tumors in this manner also permits the invention to provide tumor location information, such as a grid location on a grid having a pre-determined reference in order to assist in diagnosis and surgery. It also permits the invention to provide tumor size and type information. Additionally, the use of a frequency distribution function permits a surgeon to determine which light frequencies are best for performing diagnosis and surgery.
  • F is the Fourier transform operator and the accompanying parameter is dependent upon the dynamic range of the image used.
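  • As a hedged illustration of the two kinds of image quality metric mentioned above, the sketch below computes a Laplacian-based sharpness metric and a simple frequency-distribution metric; the kernel, normalization and band edges are assumptions for illustration, not the patent's exact definitions.

```python
import numpy as np
from scipy.signal import convolve2d

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpness_metric(image):
    """Sharpness-type metric: energy of the Laplacian-filtered image.
    Larger values correspond to a better focused (sharper) image."""
    lap = convolve2d(image, LAPLACIAN, mode="same", boundary="symm")
    return float(np.sum(lap ** 2))

def frequency_metric(image, low=0.1, high=0.5):
    """Frequency-distribution style metric: fraction of spectral energy inside
    a chosen spatial-frequency band (band edges are illustrative values given
    as fractions of the sampling frequency)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    fy = np.fft.fftshift(np.fft.fftfreq(image.shape[0]))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(image.shape[1]))[None, :]
    radius = np.hypot(fy, fx)
    band = (radius >= low) & (radius <= high)
    return float(spectrum[band].sum() / spectrum.sum())
```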
  • step 30 of the PSPGD control algorithm 20 random perturbations in the voltages applied to the SLM 5 electrodes are generated.
  • the SLM 5 can be a liquid crystal membrane for modifying the light beam according to the electrical signals from controller 8 in a manner well understood by those skilled in the art.
  • step 35 of the PSPGD control algorithm 20 a measurement of the perturbed image quality metric and a computation of the image quality perturbation δJ(m) are performed.
  • a two-sided perturbation can be used.
  • two measurements of the cost function perturbations J+ and J− are taken.
  • the two measurements correspond to sequentially applied differential perturbations +u_j/2 and −u_j/2.
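  • A single two-sided PSPGD (SPSA) iteration of the kind described in steps 30-45 might look like the sketch below; the Bernoulli perturbation, its amplitude and the update gain are illustrative assumptions, and the image quality metric is assumed to be one that should be maximized.

```python
import numpy as np

def pspgd_step(u, capture_image, apply_voltages, image_metric,
               perturb_amp=0.05, gain=1.0, rng=np.random.default_rng()):
    """One two-sided PSPGD iteration: all SLM electrode voltages are perturbed
    in parallel by random +/- amounts, the image quality metric is measured for
    +du/2 and -du/2, and the control voltages are updated in proportion to the
    measured image quality perturbation."""
    du = perturb_amp * rng.choice([-1.0, 1.0], size=u.shape)   # step 30: random parallel perturbation

    apply_voltages(u + du / 2)                                 # step 35:
    j_plus = image_metric(capture_image())                     #   measure J+

    apply_voltages(u - du / 2)
    j_minus = image_metric(capture_image())                    #   measure J-

    delta_j = j_plus - j_minus                                 # image quality perturbation dJ
    u_next = u + gain * delta_j * du                           # gradient-ascent style update
    apply_voltages(u_next)
    return u_next
```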
  • the process steps 25-45 of the PSPGD control algorithm 20 are repeated iteratively until the image quality metric has reached an acceptable level as determined in step 50.
  • the choice of an acceptable level of the image quality metric is a conventional one well known to those skilled in the art.
  • the aberration is then corrected and an image of the retina can be taken.
  • the image resulting from the operation of the PSPGD algorithm 20 can be as shown in FIG. 3B .
  • the eye aberration function φ(x,y) can be calculated from the known voltages {u_j} applied to the wavefront corrector at the end of the iterative optimization process and the known response functions {S_j(x,y)} of the wavefront corrector.
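  • Expressed as code under assumed array shapes, the reconstruction described above is a weighted sum of the corrector response functions; this is a sketch, not the patent's computation.

```python
import numpy as np

def reconstruct_aberration(u, response_functions):
    """Estimate the eye aberration function phi(x, y) as the sum of the
    wavefront corrector response functions S_j(x, y) weighted by the converged
    control voltages u_j.

    u                  : (n_channels,) converged control voltages
    response_functions : (n_channels, ny, nx) response functions S_j(x, y)
    """
    return np.tensordot(u, response_functions, axes=1)   # phi(x, y), shape (ny, nx)
```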
  • the iris biometric image 90 is a biometric image of the iris 84 , which can be obtained using an iris scanning system, such as the aberration correcting system 10 .
  • the iris biometric image 90 can be obtained by any other system (not shown) capable of scanning and digitizing an iris and providing an image that is characteristic of the iris, such as a bar code type output as shown in FIG. 4B .
  • every human eye has a unique iris biometric image when it is scanned and digitized in this manner.
  • an iris biometric image can be used as a unique identifier of an individual in the manner that fingerprints are used or even to distinguish between the left and right eyes of an individual.
  • a plurality of locations 92 within the iris 84 can be defined.
  • four locations 92 can be selected.
  • the four locations 92 can be disposed on the corners of a rectangle which is concentric with the iris 84 .
  • the locations 92 can thus be easily used to find the center of the iris 84 .
  • the four locations 92 are represented on the iris biometric image 90 in accordance with the mathematical relationships previously described.
  • the xy coordinates of the locations 92 may be mapped into corresponding xy coordinates within the iris biometric image 90 if a spatial transform such as the sharpness function is used, while they may be convolved over areas of the iris biometric image 90 if a frequency or other transform is used.
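  • As a small illustrative sketch (the coordinates and the mean-based estimate are assumptions), finding the center of the iris 84 from the four rectangle-corner locations 92 reduces to averaging their coordinates:

```python
import numpy as np

def iris_center(locations):
    """The locations 92 sit on the corners of a rectangle concentric with the
    iris 84, so their centroid coincides with the iris center."""
    pts = np.asarray(locations, dtype=float)   # shape (4, 2): (x, y) pairs
    return pts.mean(axis=0)

# hypothetical corner locations in image coordinates
print(iris_center([(120, 80), (220, 80), (220, 160), (120, 160)]))  # -> [170. 120.]
```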
  • Various features already occurring in the eye 80 also have corresponding representations within the iris biometric image 90 .
  • the location and study of such features can be used to diagnose pathologies, for example, to diagnose tumors and to determine the position of the eye iris 84 .
  • a feature can be studied several times over a period of time to determine how its parameters are changing.
  • the iris biometric image comparison system 100 receives the previously determined iris biometric image 90 as one of its inputs. Additionally, a new iris biometric image 95 is produced, for example, before or during the performance of a procedure on the eye 80 . The new iris biometric image 95 is received by the image comparison system 100 as a second input. The new iris biometric image 95 can be provided by the aberration correction system 10 . The light beam used to obtain the iris biometric image 95 can be the same light beam being used for other purposes during the procedure.
  • the image can be optimized by executing additional iterations of the PSPGD control algorithm 20 .
  • the algorithm can be iterated until a predetermined image quality is obtained, computing the image quality metric within the computer 7 as previously described.
  • increased image sensitivity and quality can be obtained by increasing the number of pixels in the digitized image, or increased image sensitivity can be obtained by increasing the number of measuring points in the iris 84.
  • the iris biometric image 90 can be assumed by the image comparison system 100 to be the correct iris biometric image of the iris 84 upon which the procedure is to be performed. Furthermore, it can be assumed that the iris biometric image 90 applied to the image comparison system 100 was obtained when the position and orientation of the eye 80 were correct.
  • the iris biometric images 90 , 95 are compared by the image comparison system 100 at decision 104 .
  • a determination is made as to whether the iris biometric image 95 is an image of the same iris 84 that was imaged to produce the enrolled iris biometric image 90 .
  • Any of the well known correlation techniques can be used for the comparison.
  • Substantially similar correlation techniques can be used for the comparison if the locations 92 are used or if other markings within the iris 84 are used. The sensitivity of the comparison can be adjusted by those skilled in the art.
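  • One conventional way to implement the comparison of decision 104, sketched below with an illustrative normalization and threshold (the values are assumptions, not the patent's), is a normalized correlation between the enrolled and newly captured iris biometric images:

```python
import numpy as np

def same_iris(enrolled_image, new_image, threshold=0.85):
    """Decision 104: compare the enrolled iris biometric image (90) with the
    newly captured one (95) using a normalized correlation coefficient.  The
    threshold sets the sensitivity of the comparison."""
    a = enrolled_image.astype(float).ravel()
    b = new_image.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    corr = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return corr >= threshold, corr
```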
  • if the determination of decision 104 is negative, the procedure being performed on the eye 80 is not continued, as shown in block 102. If the determination of decision 104 is positive, then a determination can be made in decision 106 whether the iris 84 is positioned in the xy directions correctly and oriented or rotated correctly at the time that the iris biometric image 95 was produced.
  • the determination of decision 106 can be used for a number of purposes. For example, it could be used to direct a beam of light to a predetermined location within the eye 80. Thus, if the determination of decision 106 is negative, the beam can be redirected as shown in block 110. The position of the iris 84 can be checked again in decision 106. When the position of the iris 84 is correct, the procedure can begin, as shown in block 112.
  • the determination of decision 106 can be made in accordance with the representations of locations 92 within the iris 84 selected when iris biometric image 90 was obtained. If corresponding locations are found in the iris biometric image 95 in the same positions, the determination of decision 106 is positive. Alternately, the determination of decision 106 can be made in accordance with predetermined features or markings within the iris 84 other than the locations 92 .
  • the method of the image comparison system 100 can be used to determine whether the iris 84 is rotated or translated in the direction of either of the axes orthogonal to the arrow 3 shown in FIGS. 1 A,B.
  • the iris positioning system 120 is adapted to precisely position the iris 84 while performing a procedure on the eye 80 .
  • the iris positioning system 120 differs from the iris biometric image comparison system 100 primarily in the fact that the iris positioning system 120 is provided with a servo 124 .
  • the servo 124 is effective in modifying the relative positions of the iris 84 and the camera 6 of the aberration correcting system 10 which can be coupled to equipment (not shown) used to perform the procedure in the eye.
  • a correction signal representative of the error is calculated.
  • the error correction signal is applied to the servo 124 .
  • the servo 124 is adapted to receive the error correction signal resulting from the determinations of decision 106 and to adjust the relative positions of the iris 84 and the equipment performing the procedure in accordance with the signal in a manner well understood by those skilled in the art.
  • Servos 124 capable of applying both rotational and multi-axis translational corrections are provided in the preferred embodiment of the invention. Either the object such as the iris 84 or the equipment can be moved in response to the determination of decision 106.
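  • Decision 106 and the correction applied through the servo 124 can be sketched as estimating a translation and an in-plane rotation from corresponding landmark locations and commanding a correction; the servo interface, the least-squares rotation estimate and the tolerances below are illustrative assumptions.

```python
import numpy as np

def pose_error(enrolled_points, observed_points):
    """Estimate the xy translation and in-plane rotation of the iris from the
    representations of locations 92 in the enrolled and new biometric images.
    Both inputs are (n, 2) arrays of corresponding (x, y) landmark positions."""
    p = np.asarray(enrolled_points, dtype=float)
    q = np.asarray(observed_points, dtype=float)
    pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)
    # least-squares (Procrustes-style) in-plane rotation angle, in radians
    angle = np.arctan2(np.sum(pc[:, 0] * qc[:, 1] - pc[:, 1] * qc[:, 0]),
                       np.sum(pc[:, 0] * qc[:, 0] + pc[:, 1] * qc[:, 1]))
    translation = q.mean(axis=0) - p.mean(axis=0)
    return translation, angle

def correct_position(servo, translation, angle, tol_xy=0.5, tol_rot=0.01):
    """Apply an error correction signal to the servo (124) unless the iris is
    already within (illustrative) positional and rotational tolerances.  The
    servo is assumed to expose a move(dx, dy, droll) interface."""
    if np.hypot(*translation) > tol_xy or abs(angle) > tol_rot:
        servo.move(dx=-translation[0], dy=-translation[1], droll=-angle)
        return False   # not yet in position; check again (decision 106)
    return True        # position correct; the procedure can begin (block 112)
```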
  • the method of the iris positioning system 120 can be repeatedly performed, or constantly performed, during the performance of a procedure on the eye 80 to re-capture, re-evaluate or refine the processing of the eye 80.
  • the relative positions of the iris 84 and the procedure equipment can be kept correct at all times.
  • the illumination frequency optimization system 130 is an alternate embodiment of the aberration correcting system 10 .
  • a variable frequency light source 132 rather than a single frequency light source applies a light beam to the eye 80 .
  • the variable frequency light source 132 can be a tunable laser, a diode, filters in front of a light source, a diffraction grating or any other source of a plurality of frequencies of light.
  • An image quality metric can be obtained and optimized in the manner previously described with respect to system 10 .
  • with the variable frequency light source 132 it is possible to conveniently adjust the frequency of the light beam used to illuminate the eye 80 or object 80 at a plurality of differing frequencies and to obtain a plurality of corresponding image quality metrics.
  • the frequency of the light applied to the eye 80 by the variable frequency light source 132 can be repeatedly adjusted and a new image quality metric can be obtained at each frequency.
  • Each image quality metric obtained in this manner can be optimized to a predetermined level. The levels of optimization can be equal or they can differ. While the optimizations should be done using the frequency distribution, it is possible to return to images optimized using the frequency distribution and sharpen them using the sharpness function.
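  • The frequency sweep just described might be organized as in the sketch below; the wavelength list, the light-source interface and the automatic selection criterion are illustrative assumptions (in practice the best frequency for each purpose can be a judgment made by a skilled practitioner, as noted below).

```python
def sweep_illumination(set_wavelength, capture_image, image_metric,
                       wavelengths_nm=(450, 500, 550, 600, 650, 700)):
    """For each illumination frequency (expressed here as a wavelength), tune
    the variable frequency light source (132), capture the optimized image and
    record its image quality metric.  Returns the per-wavelength metrics and
    the wavelength with the highest metric under this simple criterion."""
    results = {}
    for wl in wavelengths_nm:
        set_wavelength(wl)              # adjust the light source 132
        image = capture_image()         # image after PSPGD optimization
        results[wl] = image_metric(image)
    best = max(results, key=results.get)
    return results, best
```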
  • tissue can be visualized best with differing frequencies of light.
  • tumors, lesions, blood and various tissues as well as tissues of varying pathologies can be optimally visualized at different frequencies since their absorption and reflection properties vary.
  • the best light for visualizing selected features can be determined.
  • there can be several optimized images for one eye. For example, there can be different optimized images for a tumor, for a lesion and for blood. The determination of the best frequency for each image can be a subjective judgment made by a skilled practitioner.
  • a skilled practitioner can use the illumination frequency optimization system 130 to emphasize and de-emphasize selected features within images of the eye 80 .
  • the iris 84 may be clouded due to inflammation of the eye 80 or the presence of blood in the eye 80. It is possible to effectively remove the effects of the inflammation or blood with the assistance of the frequency optimization system 130 by varying the frequency of the light provided by the light source 132 until the optimum frequency is found for de-emphasizing the inflammation or blood and permitting the obscured features to be seen. In general, it is often possible to visualize features when another feature is superimposed on them by removing the superimposed feature using system 130.
  • a plurality of images of the eye 80 can be provided and the frequency at which the blood or inflammation is least apparent can be determined. Removing these features from the iris biometric image 95 can facilitate its comparison with the iris biometric image 90. Furthermore, when the biometric image 95 is obtained from the iris 84 of a person wearing sunglasses, it is possible to remove the effects of the sunglasses in the same manner and identify an eye 80 behind the sunglasses. This feature is useful when identifying people outside of laboratory conditions.
  • In many cases it is desirable to perform a procedure on an eye 80 when selected features of the eye 80 are obscured by other features, where different features are visualized best at different frequencies, or where the criteria for emphasizing and de-emphasizing features can change during a procedure. The image superposition system 150 can be used to obtain improved feature visualization under these and other circumstances.
  • white light is often preferred for illuminating an iris 84 because in many cases white light shows the most features.
  • when white light is used to illuminate an iris 84 that is clouded with blood, the blood can block the white light. This can make it difficult, or even impossible, to visualize the features that are obscured by the blood.
  • One solution to this problem is to use red light to illuminate the iris 84 and visualize the features obscured by the blood.
  • the image superposition system 150 can solve this problem by superimposing two images such as the direct image 166 and the projected image 170 , where the images 166 , 170 are obtained using light sources of differing frequencies.
  • the optimum frequencies for obtaining each of the images 166 , 170 can be determined using the illumination frequency optimization system 130 .
  • an object 168 to be visualized can be illuminated with incoherent white light to provide the direct image 166 .
  • Illumination of the object 168 by white light to produce the direct image 166 can be provided using any of the known methods for providing such illumination of objects to provide digital images.
  • the direct image 166 can be sensed and digitized using an image sensor 152 which senses light traveling from the object 168 in the direction indicated by the arrows 156 , 164 .
  • the image sensor 152 senses the direct image 166 of the object 168 by way of a superposition screen 160 .
  • the superposition screen 160 can be formed of any material capable of transmitting a portion of the light applied to it from the object 168 to the image sensor 152, and reflecting a portion of the same light.
  • the superposition screen 160 can be formed of glass or plastic.
  • a viewer, a TV screen or a gradient filter can also serve as the superposition screen 160 .
  • the screen 160 can also be a gradient filter.
  • the angle 172 of the superposition screen 160 can be adjusted to control the amount of light it transmits and the amount it reflects.
  • the projected image 170 of the object 168 can be obtained using, for example, the aberration correcting system 10 as previously described. Illumination with red light or any other frequency of light can be used within the aberration correcting system 10 to obtain the superposition image 178 .
  • the superposition image 178 is applied to an image projector 176 by the aberration correcting system 10 .
  • the image projector 176 transmits the projected image 170 in accordance with the superposition image 178 in the direction indicated by the arrow 174 and applies it to the superposition screen 160 .
  • a portion of the projected image 170 applied to the superposition screen 160 by the projector 176 is reflected off of the superposition screen 160 and applied to the image sensor 152 in the direction indicated by the arrow 156 .
  • the amount of the projected image 170 reflected to the image sensor 152 can be adjusted by adjusting the angle 172 of the superposition screen 160 .
  • the image projector 176 is disposed in a location adapted to apply the projected image 170 to the superposition screen 160 in the same region of the superposition screen 160 where the direct image 166 is applied.
  • Adjustment of the angle 172 results in emphasizing and de-emphasizing the images 166 , 170 relative to each other. This is useful, for example, where different features visualized selectively at differing frequencies must be brought in and out of visualization in the composite image for different purposes. Another time where this is useful is when the intensity of one of the images 166 , 170 is too high relative to the other and must be adjusted down or too low and must be adjusted up.
  • either or both of the images 166 , 170 can be optimized using the PSPGD algorithm 20 within the aberration correction system 10 .
  • the images 166, 170 can be optimized to differing degrees by the PSPGD algorithm 20 and with differing optimization criteria in order to emphasize one over the other or to selectively visualize selected features within the images 166, 170 and thus, within the composite image sensed by image sensor 152. This permits selected features of the eye 80 to be brought into view and brought out of view as convenient at different times during a diagnosis or a procedure.
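  • Digitally, the effect of adjusting the superposition screen angle 172 can be approximated as a weighted blend of the direct and projected images, as in the sketch below; the linear mapping from angle to blend weight is an illustrative assumption rather than a physical model of the screen.

```python
import numpy as np

def superimpose(direct_image, projected_image, screen_angle_deg=45.0):
    """Approximate the composite image seen by the image sensor (152): the
    superposition screen (160) transmits part of the direct image (166) and
    reflects part of the projected image (170).  A real screen would need a
    measured calibration curve relating angle to transmission/reflection."""
    w_reflect = np.clip(screen_angle_deg / 90.0, 0.0, 1.0)
    w_transmit = 1.0 - w_reflect
    return (w_transmit * direct_image.astype(float)
            + w_reflect * projected_image.astype(float))
```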
  • the illumination used to obtain the images 166 , 170 superimposed by the image superposition system 150 does not need to be red and white light.
  • the illumination used can be light of any differing frequencies.
  • the frequencies selected for obtaining the images 166, 170 can be selected in accordance with the sharpness function or the frequency distribution as previously described.
  • the images superimposed by the image superposition system 150 do not need to be obtained by way of a camera, such as the camera 6 of the aberration correction system 10 .
  • a microscope, an endoscope, or any other type of device having an image sensor capable of capturing transmission, absorption or reflection properties of an object or tissue in a normal state or enhancement by such materials as markers and chromophores and thereby providing an optical/digital signal that can be applied to the computer 7 for optimization using the PSPGD algorithm 20 can be used.
  • an image obtained from an endoscope or a microscope can be superimposed upon an image obtained from a camera using the method of the present invention.
  • Images from endoscopes, microscopes and other devices can be digitized, and superimposed and synthesized with each other. It will be understood that images obtained from such devices and optimized using the PSPGD algorithm 20 can be used in any other way that images obtained from the PSPGD algorithm 20 using camera 6 are used.
  • the invention may be used for ophthalmological procedures such as photocoagulation, optical biopsies such as measuring tumors anywhere in the eye, providing therapy, performing surgery, diagnosis or measurements. Additionally, it can be used for performing procedures on eyes outside of laboratory or medical environments.
  • the method of the present invention can be applied to any other objects capable of being imaged in addition to eyes, and images of an object provided in accordance with the method of the invention can be used when performing such procedures on other objects.

Abstract

A method for performing a procedure on a patient includes obtaining a biometric image representative of the patient and performing the procedure on the patient in accordance with the biometric image. The patient has an iris, the biometric image comprises an iris biometric image, and the procedure comprises a medical procedure. First and second iris biometric images are obtained and the first and second iris biometric images are compared to provide a biometric comparison result. A patient is identified in accordance with the biometric comparison result. The patient has at least one feature and the feature is represented within at least one of the first and second iris biometric images.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of Invention
  • This invention relates to a method and a system for high-resolution retinal imaging, eye aberration compensation, and diagnostics based on adaptive optics with direct optimization of an image quality metric using a stochastic parallel perturbative gradient descent technique.
  • Adaptive optics is a promising technique for both diagnostics of optical aberrations of the eye and substantially aberration-free high-resolution imaging of the retina. In existing adaptive optics techniques adaptive correction is based on illumination of the retina by a collimated laser beam to create a small laser spot on the retina surface, with subsequent measurement of phase aberrations of the wave scattered by the retina tissue. Correction of eye optical aberrations is then performed using the conventional phase conjugation technique.
  • This traditional approach has several important drawbacks. One important drawback is the danger due to an invasive use of the laser beam focused onto the retina. Other drawbacks include overall system complexity and the high cost of the necessary adaptive optics elements such as a wavefront sensor and wavefront reconstruction hardware. More importantly, due to aberrations the laser spot size on the retina is not small enough to use it as a reference point-type light source and hence conjugation of the measured wavefront does not result in optimal optical aberration correction. Additionally, the traditional approach can produce a turbid image that can make performing an operation with a microscope difficult.
  • One prior art method using a laser is taught in U.S. Pat. No. 6,095,651 entitled “Method and Apparatus for Improving Vision and the Resolution of Retinal Images”, issued to Williams, et al. on Aug. 1, 2000. Williams, et al. teaches a method and apparatus for improving the resolution of retinal images. In this method, a point source of light is produced on the retina by a laser beam. The source is reflected from the retina and received at a lenslet array of a Hartmann-Shack wavefront sensor. Thus, higher order aberrations of the eye can be measured and data can be obtained for compensating the aberrations using a system including a laser. U.S. Pat. Nos. 5,777,719 and 5,949,521 provide essentially the same teachings. While these references teach satisfactory methods for compensating aberrations, there is some small risk of damaging the retina since these methods require applying laser beams to the retina.
  • U.S. Pat. No. 5,912,731, entitled “Hartmann-type Optical Wavefront Sensor” issued to DeLong, et al. on Jun. 5, 1999 teaches an adaptive optics system using adjustable optical elements to compensate for aberrations in an optical beam. The aberrations may be caused, for example, by propagation of the beam through the atmosphere. The aberrated beam can be reflected from a deformable mirror having many small elements, each having an associated separate actuator.
  • Part of the reflected beam taught by DeLong can be split off and directed to impinge on a sensor array which provides measurements indicative of the wavefront distortion in the reflected beam. The wavefront distortion measurements can then be fed back to the deformable mirror to provide continuous corrections by appropriately moving the mirror elements. Configurations such as this, wherein the array of small lenses is referred to as a lenslet array, can be referred to as Shack-Hartmann wavefront sensors.
  • Additionally, DeLong teaches a wavefront sensor for use in measuring local phase tilt in two dimensions over an optical beam cross section, using only one lenslet arrangement and one camera sensor array. The measurements of DeLong are made with respect to first and second orthogonal sets of grid lines intersecting at points of interest corresponding to positions of optical device actuators. While this method does teach a way to correct aberrations in a non-laser light system, it cannot be used in cases where lasers are required.
  • U.S. Pat. No. 6,007,204 issued to Fahrenkrug, et al. entitled “Compact Ocular Measuring System”, issued on Dec. 28, 1999, teaches a method for determining refractive aberrations of the eye. In the system taught by Fahrenkrug, et al. a beam of light is focused at the back of the eye of the patient so that a return light path from the eye impinges upon a sensor having a light detecting surface. A micro optics array is disposed between the sensor and the eye along the light path. The lenslets of the micro optics array focus incremental portions of the outgoing wavefront onto the light detecting surface so that the deviations and the positions of the focused portions can be measured. A pair of conjugate lenses having differing focal lengths is also disposed along the light path between the eye and the micro optics array.
  • U.S. Pat. No. 6,019,472, issued to Koester, et al. entitled “Contact Lens Element For Examination or Treatment of Ocular Tissues” issued on Feb. 1, 2000 teaches a multi-layered contact lens element including a plurality of lens elements wherein a first lens element has a recess capable of holding a volume of liquid against a cornea of the eye. A microscope is connected to the contact lens element to assist in the examination or treatment of ocular tissues.
  • U.S. Pat. No. 6,086,204, entitled “Methods and Devices To Design and Fabricate Surfaces on Contact Lenses and On Corneal Tissue That Correct the Eyes Optical Aberrations”, was issued to Magnante on Jul. 11, 2000. Magnante teaches a method for measuring the optical aberrations of an eye either with or without a contact lens in place on the cornea. A mathematical analysis is performed on the optical aberrations of the eye to design a modified shape for the original contact lens or cornea that will correct the optical aberrations. An aberration correcting surface is fabricated on the contact lens by a process that includes laser ablation and thermal molding. The source of light can be coherent or incoherent.
  • U.S. Pat. No. 6,143,011, issued to Hood, et al. entitled “Hydrokeratome For Refractive Surgery” issued on Nov. 7, 2000 teaches a high speed liquid jet for forming ophthalmic incisions. The Hood, et al. system is adapted for high precision positioning of the jet carrier. An alignment beam may be provided by a collimated LED or laser diode. The laser beam can be used to align the system.
  • U.S. Pat. No. 6,155,684, issued to Billie, et al. entitled “Method and Apparatus for Precompensating The Refractive Properties of the Human Eye With Adaptive Optical Feedback Control” issued on Dec. 5, 2000. Billie, et al. teaches a system for directing a beam of light through the eye and reflecting the light from the retina. A lenslet array is used to obtain a digitized acuity map from the reflected light for generating a signal that programs an active mirror. In accordance with the signal the optical paths of individual beams within the beam of light are made to appear to be substantially equal to each other. Thus, the incoming beam can be precompensated to allow for the refractive aberrations of the eyes that are evidenced by the acuity map.
  • Additional methods for using adaptive optics to compensate for aberrations of the human eye are taught in J. Liang, D. Williams and D. Miller, “Supernormal Vision and High-Resolution Retinal Imaging Through Adaptive Optics,” J. Opt. Soc. Am. A, Vol. 14, No. 11, pp. 2884-2891, 1997 and F. Vargas-Martin, P. Prieto, and P. Artal, “Correction of the Aberrations in the Human Eye with a Liquid-Crystal Spatial Light Modulator: Limits to Performance,” J. Opt. Soc. Am. A, Vol. 15, No. 9, pp. 2552-2561, 1998. Additionally, J. Liang, B. Grimm, S. Goelz, and J. Bille, “Objective Measurement of Wave Aberrations of the Human Eye with the Use of a Hartmann-Shack Wave-Front Sensor,” J. Opt. Soc. Am. A, Vol. 11, No. 7, pp. 1949-1957, 1994 teaches such a use of adaptive optics.
  • Furthermore, it is known in the art to use a PSPGD optimization algorithm in different applications. For example, see M. Vorontsov, and V. Sivokon, “Stochastic Parallel-Gradient-Descent Technique for High-Resolution Wave-Front Phase-Distortion Correction,” J. Opt. Soc. Am. A, Vol. 15, No. 10, pp. 2745-2758, 1998. Also see M. Vorontsov, G. Carhart, and J. Ricklin, “Adaptive Phase-Distortion Correction Based on Parallel Gradient-Descent Optimization,” Optics Letters, Vol. 22, No. 12, pp. 907-909, 1997.
  • It is well known in the art to scan an iris and obtain an iris biometric image. See, for example, U.S. Pat. Nos. 4,641,349, 5,291,560, 5,359,669, 5,719,950, 6,289,113, 6,377,699, 6,526,160, 6,532,298, 6,539,100, 6,542,624, 6,546,121, 6,549,118, 6,556,699, 6,594,377, 6,614,919, and U.S. Patent Application Nos. 20010026632A1, 20020080256A1, 20030095689A1, 20030120934A1, 20020057438A1, 20020132663A1, 20030018522A1, 20020158750A1. However, such images were often not optimal and their applicability was somewhat limited.
  • 2. Description of Related Art
  • All references cited herein are incorporated herein by reference in their entireties.
  • BRIEF SUMMARY OF THE INVENTION
  • The invention includes a method for clarifying an optical/digital image of an object in order to perform a procedure on the object, having the steps of applying to the object a light beam formed of incoherent light, reflecting the applied incoherent light beam from the object to provide a reflected light beam, and providing electrical signals representative of the reflected light beam. An image quality metric is determined in accordance with the electrical signals and an image is determined in accordance with the image quality metric. The procedure is performed in accordance with the image quality metric.
  • In a further method of the invention a procedure is performed on an eye having an iris. An iris biometric image representative of the iris is obtained and the procedure is performed on an eye in accordance with the iris biometric image.
  • Additionally a method for optimizing electromagnetic energy in a system for processing an image of an object in order to perform a procedure on an object is provided. The method includes applying to the object a plurality of light beams formed of incoherent light at a plurality of differing frequencies and reflecting the plurality of applied incoherent light beams from the object to provide a plurality of reflected light beams. The method also includes providing a corresponding plurality of electrical signals representative of the reflected light beams of the plurality of reflected light beams and determining a corresponding plurality of image quality metrics in accordance with the plurality of electrical signals. A corresponding plurality of images is determined in accordance with the plurality of image quality metrics and an image of the plurality of images is selected in accordance with a predetermined image criterion to provide a selected image. The method also includes determining a frequency of the plurality of differing frequencies in accordance with the selected image to provide a determined frequency and performing the procedure on an object in accordance with the determined frequency.
  • The invention also deals with new methods of high-resolution imaging and construction of images of the retina, and adaptive correction and diagnostics of eye optical aberrations, as well as such imaging of articles of manufacture, identifying articles and controlling a manufacturing process. Additionally, the method is applicable to identifying individuals in accordance with such images for medical purposes and for security purposes, such as a verification of an identity of an individual. These applications can be performed using adaptive optics techniques based on parallel stochastic perturbative gradient descent (PSPGD) optimization. This method of optimization is also known as simultaneous perturbation stochastic approximation (SPSA) optimization. Compensation of optical aberrations of the eye and improvement of retina image resolution can be accomplished using an electronically controlled phase spatial light modulator (SLM) as a wavefront aberration corrector interfaced with an imaging sensor and a feedback controller that implements the PSPGD control algorithm.
  • Examples of the electronically-controlled phase SLMs include a pixelized liquid-crystal device, micro mechanical mirror array, and deformable, piston or tip-tilt mirrors. Wavefront correction can be performed at the SLM and the wavefront aberration compensation is performed using retina image data obtained with an imaging camera (CCD, CMOS etc.) or with a specially designed very large scale integration imaging chip (VLSI imager). The retina imaging data are processed to obtain a signal characterizing the quality of the retinal image (image quality metric) used to control the wavefront correction and compensate the eye aberrations.
  • The image quality computation can be performed externally using an imaging sensor connected with a computer or internally directly on an imaging chip. The image quality metric signal is used as an input signal for the feedback controller. The controller computes control voltages applied to the wavefront aberration corrector. The controller can be implemented as a computer module, a field programmable gate array (FPGA) or a VLSI micro-electronic system performing computations required for optimization of image quality metrics based on the PSPGD algorithm.
  • The use of the PSPGD optimization technique for adaptive compensation of eye aberration provides considerable performance improvement if compared with the existing techniques for retina imaging and eye aberration compensation and diagnostics, and therapeutic applications. The first advantage is that the PSPGD algorithm does not require the use of laser illumination of the retina and consequently significantly reduces the risk of retina damage caused by a focused coherent laser beam. A further advantage is that the PSPGD algorithm does not require the use of a wavefront sensor or wavefront aberration reconstruction computation. This makes the entire system low-cost and compact if compared with the existing adaptive optics systems for retina imaging. Additionally, the PSPGD algorithm can be implemented using a parallel analog, mixed-mode analog-digital or parallel digital controller because of its parallel nature. This significantly speeds up the operations of the PSPGD algorithm, providing continuous retina image improvement, eye aberration compensation and diagnostics.
  • Thus, in the adaptive correction technique of the present invention neither laser illumination nor wavefront sensing are required. Optical aberration correction is based on direct optimization of the quality of a retina image obtained using a white-light, incoherent, or partially coherent imaging system. The novel imaging system includes a multi-electrode phase spatial light modulator, or an adaptive mirror controlled with a computer or with a specially designed FPGA or VLSI system. The calculated image quality metric is optimized using a parallel stochastic gradient descent algorithm. The adaptive optical system is used in order to compensate severe optical aberrations of the eye and thus provide a high-resolution image of the retina tissue and/or an eye aberration diagnostic.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • The invention will be described in conjunction with the following drawings in which like reference numerals designate like elements and wherein:
  • FIGS. 1A,B show a schematic representation of a system suitable for practicing the eye aberration correcting method of the present invention.
  • FIG. 2 shows a flow chart representation of a control algorithm suitable for use in the system of FIG. 1 when practicing the method of the present invention.
  • FIGS. 3A,B show images of an artificial retina before and after correction of an aberration.
  • FIGS. 4A,B show an eye and a biometric image of the iris of the eye.
  • FIG. 5 shows a block diagram representation of an iris biometric image comparison system which can be used with the aberration correcting system of FIG. 1.
  • FIG. 6 shows a block diagram representation of an iris positioning system which can be used in cooperation with the aberration correcting system of FIG. 1.
  • FIG. 7 shows an illumination frequency optimization system which can be used in cooperation with the aberration correcting system of FIG. 1.
  • FIG. 8 shows an image superpositioning system which can be used with the aberration correcting system of FIG. 1.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Referring now to FIGS. 1A,B there are shown schematic representations of the aberration correcting system 10 of the present invention. In the aberration correcting system 10 a light beam from a white light source 1 is redirected by a mirror 2 in order to cause it to enter an eye. In accordance with the present invention the white light beam from the light source 1 can be any kind of incoherent light.
  • The light from the mirror 2 reaches the retina 4 of the eye and reflected light exits the eye to provide two light beams, one passing in each direction, as indicated by arrow 3. The exiting light beam then passes through an SLM 5. The light beam from the SLM 5 enters an image sensor 6. The image sensor 6 can be a charge-coupled device or any other device capable of sensing and digitizing the light beam from the SLM 5.
  • The imaging sensor 6 can include an imaging chip for performing the calculations required to determine an image quality metric. The image quality metric can thus be computed on the imaging chip directly or it can be calculated using a separate computational device/computer 7 that calculates the image quality metric of the retina image. It is the use of a digitized image in this manner that permits the use of incoherent light rather than coherent light for performing the operations of the aberration correcting system 10.
  • The computational device 7 sends a measurement signal representative of the image quality metric to a controller 8. The controller 8 implements a PSPGD algorithm by computing control voltages and applying the computed control voltages to the SLM 5. The PSPGD algorithm used by the controller 8 can be any conventional PSPGD algorithm known to those of ordinary skill in the art. In the preferred embodiment of the invention, the controller 8 continuously receives digital information about the quality of the image and continuously updates the control voltages applied to the SLM 5 until the quality of the retina image is optimized according to predetermined image quality optimization criteria.
  • Referring now to FIGS. 2 and 3A,B there are shown a flow chart representation of a portion of a PSPGD control algorithm 20 for use in cooperation with the aberration correcting system 10 in order to practice the present invention, as well as representations of the retina image before correction (FIG. 3A) and after correction (FIG. 3B). In order to simplify the drawing, a single iterative step of the PSPGD control algorithm 20 is shown with a loop for repeating the single iterative step until the quality of the compensation is acceptable.
  • In step 25 of the PSPGD control algorithm 20 a measurement and calculation of the image quality metric is performed. This step includes the retinal image capture performed by the image sensor 6 and the calculation of the image quality metric performed by the computational device 7 within the aberration correcting system 10. The image captured by the image sensor 6 at the beginning of the operation of the PSPGD control algorithm 20 can be substantially as shown in FIG. 3A, as previously described. Any relevant metric can be used as the image quality metric. For example, in one embodiment of the PSPGD control algorithm 20 the image quality metric can be a sharpness function. A sharpness function suitable for use in the present invention can be defined as
    J = ∫ |∇²I(x,y)| dx dy
    where I(x,y) is the intensity distribution in the image, and ∇2 is the Laplacian operator over the image. The Laplacian can be calculated by convolving the image with a Laplacian kernel. The convolving of the image can be performed by a special purpose VLSI microchip. Alternately, the convolving of the image can be performed using a computer that receives an image from a digital camera as described in more detail below. In another embodiment different digital high-pass filters can be used rather than the Laplacian operator.
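  • For illustration only, a minimal digital implementation of this sharpness metric might look like the following sketch, assuming a grayscale retinal image stored in a NumPy array; the function name and the particular 3×3 Laplacian kernel are illustrative choices, not specified by the present description.

```python
# Hedged sketch: approximate J = ∫|∇²I(x,y)| dx dy on a discrete image by
# convolving with a 3x3 Laplacian kernel and summing absolute values.
import numpy as np
from scipy.ndimage import convolve

def sharpness_metric(image: np.ndarray) -> float:
    """Discrete approximation of the sharpness function J."""
    laplacian_kernel = np.array([[0,  1, 0],
                                 [1, -4, 1],
                                 [0,  1, 0]], dtype=float)
    lap = convolve(image.astype(float), laplacian_kernel, mode="nearest")
    return float(np.abs(lap).sum())
```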
  • Additionally, a frequency distribution function can be used rather than a sharpness function when determining the image quality metric. The use of a frequency distribution function allows the system to distinguish tissues of different colors. This is useful where different kinds of tissue, for example, different tumors, have different colors. Locating tumors in this manner also permits the invention to provide tumor location information, such as a grid location on a grid having a pre-determined reference in order to assist in diagnosis and surgery. It also permits the invention to provide tumor size and type information. Additionally, the use of a frequency distribution function permits a surgeon to determine which light frequencies are best for performing diagnosis and surgery.
  • The image quality metric J can also be calculated, either optically or digitally, using the expression:
    J = ∫ |F{exp[iγI(x,y)]}|⁴ dx dy
  • where F is the Fourier transform operator and γ is a parameter that is dependent upon the dynamic range of the image being used.
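  • As an illustrative sketch only, the Fourier-based metric can be evaluated digitally with a discrete FFT as shown below; the assumption that the image is normalized to [0, 1] and the default value chosen for γ are not taken from the present description.

```python
# Hedged sketch: J = ∫ |F{exp[iγI(x,y)]}|⁴ dx dy evaluated with a discrete FFT.
import numpy as np

def fourier_quality_metric(image: np.ndarray, gamma: float = 2.0 * np.pi) -> float:
    phase_object = np.exp(1j * gamma * image.astype(float))   # exp[iγI(x,y)]
    spectrum = np.fft.fft2(phase_object)                       # F{...}
    return float(np.sum(np.abs(spectrum) ** 4))
```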
  • In step 30 of the PSPGD control algorithm 20 random perturbations in the voltages applied to the SLM 5 electrodes are generated. The SLM 5 can be a liquid crystal membrane for modifying the light beam according to the electrical signals from controller 8 in a manner well understood by those skilled in the art.
  • In order to generate the perturbations for application to the electrodes of the SLM 5, random numbers with any statistical properties can be used as perturbations. For example, uncorrelated random coin-flip perturbations having identical amplitudes |δu_j| and the Bernoulli probability distribution:
    δu_j = ±p, Pr(δu_j = +p) = 0.5
    for all j = 1, . . . , N (N = the number of control channels) and all iteration numbers can be used. Note that non-Bernoulli perturbations are also allowed in the PSPGD control algorithm 20.
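  • A minimal sketch of generating such Bernoulli perturbations for N control channels is given below; the parameter names and the use of NumPy's random generator are illustrative assumptions, not part of the present description.

```python
# Hedged sketch: uncorrelated coin-flip perturbations δu_j = ±p with
# Pr(δu_j = +p) = Pr(δu_j = -p) = 0.5 for all control channels.
import numpy as np

def bernoulli_perturbations(n_channels: int, p: float, rng=None) -> np.ndarray:
    rng = np.random.default_rng() if rng is None else rng
    signs = rng.choice([-1.0, 1.0], size=n_channels)
    return p * signs
```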
  • In step 35 of the PSPGD control algorithm 20 a measurement of the perturbed image quality metric and a computation of the image quality perturbation δJ^(m) are performed. Following the determination of the perturbed image quality metric, the gradient estimations
    J̃′_j^(m) = δJ^(m) δu_j^(m)
    are computed as shown in step 40.
  • The updated control voltages are then determined as shown in step 45. Therefore, a calculation of:
    u_j^(m+1) = u_j^(m) − γ δJ^(m) δu_j^(m)
    is performed.
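  • Steps 40 and 45 reduce to a few array operations; the sketch below is illustrative only, with the gain γ and the variable names chosen for clarity rather than taken from the present description.

```python
# Hedged sketch of steps 40 and 45: gradient estimate and control-voltage update.
import numpy as np

def spgd_update(u: np.ndarray, delta_u: np.ndarray,
                delta_J: float, gamma: float) -> np.ndarray:
    grad_estimate = delta_J * delta_u      # step 40: J~'_j = δJ · δu_j
    return u - gamma * grad_estimate       # step 45: u_j <- u_j − γ · δJ · δu_j
```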
  • To further improve the accuracy of gradient estimation in the PSPGD control algorithm 20 a two-sided perturbation can be used. In a two-sided perturbation two measurements of the cost function perturbations δJ_+ and δJ_− are taken. The two measurements correspond to sequentially applied differential perturbations +δu_j/2 and −δu_j/2.
  • It follows that:
    δJ = δJ_+ − δJ_− and
    J̃′_j = δJ δu_j
    which can produce a more accurate gradient estimate.
  • The process steps 25-45 of the PSPGD control algorithm 20 are repeated iteratively until the image quality metric has reached an acceptable level as determined in step 50. The choice of an acceptable level of the image quality metric is a conventional one well known to those skilled in the art. As shown in step 55 the aberration is then corrected and an image of the retina can be taken. The image resulting from the operation of the PSPGD algorithm 20 can be as shown in FIG. 3B.
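  • Pulling steps 25-50 together, one illustrative loop using the two-sided perturbation variant might be sketched as follows; measure_metric is a placeholder standing in for applying the voltages to the SLM 5, capturing a retinal image and computing its quality metric, and is not an interface defined by the present description.

```python
# Hedged sketch of the iterative PSPGD loop with two-sided perturbations.
import numpy as np

def spgd_optimize(measure_metric, n_channels, p=0.05, gamma=1.0,
                  target_metric=None, max_iters=500):
    rng = np.random.default_rng()
    u = np.zeros(n_channels)
    for _ in range(max_iters):
        delta_u = p * rng.choice([-1.0, 1.0], size=n_channels)  # step 30
        J_plus = measure_metric(u + delta_u / 2)                 # step 35 (+δu/2)
        J_minus = measure_metric(u - delta_u / 2)                # step 35 (−δu/2)
        delta_J = J_plus - J_minus
        # Steps 40-45; the sign convention of gamma sets ascent vs. descent on J.
        u = u - gamma * delta_J * delta_u
        if target_metric is not None and measure_metric(u) >= target_metric:
            break                                                # step 50: acceptable quality
    return u
```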
  • The eye aberration function φ(x,y) can be calculated from the known voltages {u_j} applied to the wavefront corrector at the end of the iterative optimization process and the known response functions {S_j(x,y)} of the wavefront corrector:
    φ(x,y) = Σ_{j=1..N} u_j S_j(x,y)
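  • This reconstruction is simply a weighted sum of the response functions; a minimal sketch, assuming the response functions are supplied as an (N, H, W) array, is shown below. The array layout is an assumption made for illustration.

```python
# Hedged sketch: φ(x,y) = Σ_j u_j · S_j(x,y) from final voltages and responses.
import numpy as np

def reconstruct_aberration(u: np.ndarray, response_functions: np.ndarray) -> np.ndarray:
    # u has shape (N,); response_functions has shape (N, H, W).
    return np.tensordot(u, response_functions, axes=1)
```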
  • Referring now to FIGS. 4A,B, there is shown an eye 80 having an iris 84 with a pupil 88 therein and an iris biometric image 90. The iris biometric image 90 is a biometric image of the iris 84, which can be obtained using an iris scanning system, such as the aberration correcting system 10. In an alternate embodiment of the invention, the iris biometric image 90 can be obtained by any other system (not shown) capable of scanning and digitizing an iris and providing an image that is characteristic of the iris, such as a bar code type output as shown in FIG. 4B. Furthermore, it will be understood that every human eye has a unique iris biometric image when it is scanned and digitized in this manner. Thus, an iris biometric image can be used as a unique identifier of an individual in the manner that fingerprints are used or even to distinguish between the left and right eyes of an individual.
  • When the predetermined image quality is obtained, a plurality of locations 92 within the iris 84 can be defined. In one preferred embodiment of the invention, four locations 92 can be selected. The four locations 92 can be disposed on the corners of a rectangle which is concentric with the iris 84. The locations 92 can thus be easily used to find the center of the iris 84. The four locations 92 are represented on the iris biometric image 90 in accordance with the mathematical relationships previously described. Thus, the xy coordinates of the locations 92 may be mapped into corresponding xy coordinates within the iris biometric image 90 if a spatial transform such as the sharpness function is used, while they may be convolved over areas of the iris biometric image 90 if a frequency or other transform is used.
  • Various features already occurring in the eye 80 also have corresponding representations within the iris biometric image 90. The location and study of such features can be used to diagnose pathologies, for example, to diagnose tumors and to determine the position of the eye iris 84. As a further example, a feature can be studied several times over a period of time to determine how its parameters are changing.
  • Referring now to FIG. 5, there is shown the iris biometric image comparison system 100. The iris biometric image comparison system 100 receives the previously determined iris biometric image 90 as one of its inputs. Additionally, a new iris biometric image 95 is produced, for example, before or during the performance of a procedure on the eye 80. The new iris biometric image 95 is received by the image comparison system 100 as a second input. The new iris biometric image 95 can be provided by the aberration correction system 10. The light beam used to obtain the iris biometric image 95 can be the same light beam being used for other purposes during the procedure.
  • When using the aberration correcting system 10, the image can be optimized by executing additional iterations of the PSPGD control algorithm 20, computing the image quality metric within the computer 7 as previously described, until a predetermined image quality is obtained. In addition to performing more iterations of the PSPGD control algorithm 20, increased image sensitivity can be obtained by increasing the number of pixels in the digitized image or by increasing the number of measuring points in the iris 84.
  • When performing the method of the image comparison system 100 the iris biometric image 90 can be assumed by the image comparison system 100 to be the correct iris biometric image of the iris 84 upon which the procedure is to be performed. Furthermore, it can be assumed that the iris biometric image 90 applied to the image comparison system 100 was obtained when the position and orientation of the eye 80 were correct.
  • The iris biometric images 90, 95 are compared by the image comparison system 100 at decision 104. A determination is made as to whether the iris biometric image 95 is an image of the same iris 84 that was imaged to produce the enrolled iris biometric image 90. Any of the well known correlation techniques can be used for the comparison. Substantially similar correlation techniques can be used for the comparison if the locations 92 are used or if other markings within the iris 84 are used. The sensitivity of the comparison can be adjusted by those skilled in the art.
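  • As one possible sketch of decision 104, the comparison could use a zero-shift normalized cross-correlation with a tunable threshold, as below; the present description states only that any well-known correlation technique may be used, so the particular correlation measure and the threshold value here are illustrative assumptions.

```python
# Hedged sketch: compare enrolled and candidate iris biometric images.
import numpy as np

def same_iris(enrolled: np.ndarray, candidate: np.ndarray,
              threshold: float = 0.9) -> bool:
    a = (enrolled - enrolled.mean()) / (enrolled.std() + 1e-12)
    b = (candidate - candidate.mean()) / (candidate.std() + 1e-12)
    correlation = float(np.mean(a * b))    # normalized cross-correlation at zero shift
    return correlation >= threshold        # sensitivity set by the threshold
```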
  • If the determination of decision 104 is negative, then the procedure being performed on the eye 80 is not continued as shown in block 102. If the determination of decision 104 is positive, then a determination can be made in decision 106 whether the iris 84 is positioned in the xy directions correctly and oriented or rotated correctly at the time that the iris biometric image 95 was produced. The determination of decision 106 can be used for a number of purposes. For example, it could be used to direct a beam of light to a predetermined location within the eye 80. Thus, if the determination of decision 106 is negative, the beam can be redirected as shown in block 110. The position of the iris 84 can be checked again in decision 106. When the position of the iris 84 is correct, the procedure can begin, as shown in block 112.
  • The determination of decision 106 can be made in accordance with the representations of locations 92 within the iris 84 selected when iris biometric image 90 was obtained. If corresponding locations are found in the iris biometric image 95 in the same positions, the determination of decision 106 is positive. Alternately, the determination of decision 106 can be made in accordance with predetermined features or markings within the iris 84 other than the locations 92. The method of the image comparison system 100 can be used to determine whether the iris 84 is rotated or translated in the direction of either of the axes orthogonal to the arrow 3 shown in FIGS. 1A,B.
  • Referring now to FIG. 6, there is shown the iris positioning system 120. The iris positioning system 120 is adapted to precisely position the iris 84 while performing a procedure on the eye 80. The iris positioning system 120 differs from the iris biometric image comparison system 100 primarily in the fact that the iris positioning system 120 is provided with a servo 124. The servo 124 is effective in modifying the relative positions of the iris 84 and the camera 6 of the aberration correcting system 10 which can be coupled to equipment (not shown) used to perform the procedure in the eye.
  • In the iris positioning system 120 a determination is made in decision 104 whether the iris biometric images 90, 95 were made on the same eye as previously described with respect to image comparison system 100. The procedure is continued only if a positive determination is made. A determination is then made in decision 106 whether the iris 84 is in the correct position. The determination of decision 106 can be made by comparing the iris biometric images 90, 95 in accordance with the locations 92 or any other markings within the iris 84 as previously described. The determination made can be, for example, whether the iris 84 is rotated or translated in the x or y direction at the time that the iris biometric image 95 is obtained.
  • When a determination is made that the iris 84 is in an incorrect position, a correction signal representative of the error is calculated. The error correction signal is applied to the servo 124. The servo 124 is adapted to receive the error correction signal resulting from the determinations of decision 106 and to adjust the relative positions of the iris 84 and the equipment performing the procedure in accordance with the signal in a manner well understood by those skilled in the art. Servos 124 capable of applying both rotational and multi-axis translational corrections are provided in the preferred embodiment of the invention. Either the object, such as the iris 84, or the equipment can be moved in response to the determination of decision 106.
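  • One illustrative way to derive such an error correction signal from the locations 92 and their observed counterparts is a least-squares translation and in-plane rotation estimate, sketched below; the point-set layout and the function name are assumptions, and the hardware interface to the servo 124 is not specified by the present description.

```python
# Hedged sketch: translation and rotation error between enrolled and observed points.
import numpy as np

def position_error(enrolled_pts: np.ndarray, observed_pts: np.ndarray):
    # Both arrays have shape (K, 2) with matching point order.
    translation = observed_pts.mean(axis=0) - enrolled_pts.mean(axis=0)
    a = enrolled_pts - enrolled_pts.mean(axis=0)
    b = observed_pts - observed_pts.mean(axis=0)
    # Least-squares in-plane rotation angle between the two centered point sets.
    angle = np.arctan2(np.sum(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0]),
                       np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1]))
    return translation, angle
```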
  • The method of the iris positioning system 120 can be repeatedly performed, or constantly performed, during the performance of a procedure on the eye 80 to re-capture, re-evaluate or refine the processing of the eye 80. Thus, the relative positions of the iris 84 and the procedure equipment can be kept correct at all times.
  • Referring now to FIG. 7, there is shown the illumination frequency optimization system 130. The illumination frequency optimization system 130 is an alternate embodiment of the aberration correcting system 10. Within the frequency optimization system 130 a variable frequency light source 132 rather than a single frequency light source applies a light beam to the eye 80. The variable frequency light source 132 can be a tunable laser, a diode, filters in front of a light source, a diffraction grating or any other source of a plurality of frequencies of light. An image quality metric can be obtained and optimized in the manner previously described with respect to system 10.
  • Using the variable frequency light source 132, it is possible to conveniently illuminate the eye 80 or object 80 at a plurality of differing frequencies and to obtain a plurality of corresponding image quality metrics. In order to do this, the frequency of the light applied to the eye 80 by the variable frequency light source 132 can be repeatedly adjusted and a new image quality metric can be obtained at each frequency. Each image quality metric obtained in this manner can be optimized to a predetermined level. The levels of optimization can be equal or they can differ. While the optimizations should be done using the frequency distribution, it is possible to return to images optimized using the frequency distribution and sharpen them using the sharpness function.
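  • A minimal sketch of such a frequency sweep is given below; set_wavelength and capture_and_optimize are placeholders for the light source control and for the PSPGD optimization performed at a given illumination, neither of which is an interface defined by the present description.

```python
# Hedged sketch: sweep illumination wavelengths and keep the optimized image
# and image quality metric obtained at each one.
def sweep_illumination(wavelengths_nm, set_wavelength, capture_and_optimize):
    results = {}
    for wl in wavelengths_nm:
        set_wavelength(wl)                        # tune the variable frequency source
        image, metric = capture_and_optimize()    # run the optimization at this wavelength
        results[wl] = (image, metric)
    return results
```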
  • It is well understood that differing types of tissue can be visualized best with differing frequencies of light. For example, tumors, lesions, blood and various tissues as well as tissues of varying pathologies can be optimally visualized at different frequencies since their absorption and reflection properties vary. Thus, by adjusting the frequency applied to the eye 80 by the variable frequency light source 132 and viewing the results, the best light for visualizing selected features can be determined. Furthermore, using this method there can be several optimized images for one eye. For example, there can be different optimized images, for a tumor, for a lesion and for blood. The determination of the best frequency for each image can be a subjective judgment made by a skilled practitioner.
  • A skilled practitioner can use the illumination frequency optimization system 130 to emphasize and de-emphasize selected features within images of the eye 80. For example, when obtaining an iris biometric image 95, the iris 84 may be clouded due to inflammation of the eye 80 or the presence of blood in the eye 80. It is possible to effectively remove the effects of the inflammation or blood with the assistance of the frequency optimization system 130 by varying the frequency of the light provided by the light source 132 until the optimum frequency is found for de-emphasizing the inflammation or blood and permitting the obscured features to be seen. In general, it is often possible to visualize features when another feature is superimposed on them by removing the superimposed feature using system 130.
  • In order to remove the effects of the inflammation or blood, a plurality of images of the eye 80 can be provided and the frequency at which the blood or inflammation is least apparent can be determined. Removing these features from the iris biometric image 95 can facilitate its comparison with the iris biometric image 90. Furthermore, when the biometric image 95 is obtained from the iris 84 of a person wearing sunglasses, it is possible to remove the effects of the sunglasses in the same manner and identify an eye 80 behind the sunglasses. This feature is useful when identifying people outside of laboratory conditions.
  • Referring now to FIG. 8, there is shown the image superposition system 150. In many cases it is desirable to perform a procedure on an eye 80 when selected features of the eye 80 are obscured by other features, where different features are visualized best at different frequencies, or where the criteria for emphasizing and de-emphasizing features can change during a procedure. The image superposition system 150 can be used to obtain improved feature visualization under these and other circumstances.
  • For example, white light is often preferred for illuminating an iris 84 because in many cases white light shows the most features. However, if white light is used to illuminate an iris 84 when the iris 84 is clouded with blood, the blood can block the white light. This can make it difficult, or even impossible, to visualize the features that are obscured by the blood. One solution to this problem is to use red light to illuminate the iris 84 and visualize the features obscured by the blood.
  • However, the red light could fail to optimally visualize the features which are normally visualized best using, for example, white light. The image superposition system 150 can solve this problem by superimposing two images such as the direct image 166 and the projected image 170, where the images 166, 170 are obtained using light sources of differing frequencies. The optimum frequencies for obtaining each of the images 166, 170 can be determined using the illumination frequency optimization system 130.
  • For example, an object 168 to be visualized can be illuminated with incoherent white light to provide the direct image 166. Illumination of the object 168 by white light to produce the direct image 166 can be provided using any of the known methods for providing such illumination of objects to provide digital images. The direct image 166 can be sensed and digitized using an image sensor 152 which senses light traveling from the object 168 in the direction indicated by the arrows 156, 164.
  • The image sensor 152 senses the direct image 166 of the object 168 by way of a superposition screen 160. The superposition screen 160 can be formed of any material capable of transmitting a portion of the light applied to it from the object 168 to the image sensor 152, and reflecting a portion of the same light. For example, the superposition screen 160 can be formed of glass or plastic. A viewer, a TV screen or a gradient filter can also serve as the superposition screen 160. In a preferred embodiment of the invention, the angle 172 of the superposition screen 160 can be adjusted to control the amount of light it transmits and the amount it reflects.
  • The projected image 170 of the object 168 can be obtained using, for example, the aberration correcting system 10 as previously described. Illumination with red light or any other frequency of light can be used within the aberration correcting system 10 to obtain the superposition image 178. The superposition image 178 is applied to an image projector 176 by the aberration correcting system 10. The image projector 176 transmits the projected image 170 in accordance with the superposition image 178 in the direction indicated by the arrow 174 and applies it to the superposition screen 160.
  • A portion of the projected image 170 applied to the superposition screen 160 by the projector 176 is reflected off of the superposition screen 160 and applied to the image sensor 152 in the direction indicated by the arrow 156. The amount of the projected image 170 reflected to the image sensor 152 can be adjusted by adjusting the angle 172 of the superposition screen 160. The image projector 176 is disposed in a location adapted to apply the projected image 170 to the superposition screen 160 in the same region of the superposition screen 160 where the direct image 166 is applied. When the images 166, 170 are applied to the superposition screen 160 in this manner, they are superimposed and the image sensed by the image sensor 152 is thus the superposition or composite of the images 166, 170.
  • Adjustment of the angle 172 results in emphasizing and de-emphasizing the images 166, 170 relative to each other. This is useful, for example, where different features visualized selectively at differing frequencies must be brought in and out of visualization in the composite image for different purposes. Another time where this is useful is when the intensity of one of the images 166, 170 is too high relative to the other and must be adjusted down or too low and must be adjusted up.
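  • In digital terms the composite seen by the image sensor 152 behaves like a weighted sum of the two images, with the weights playing the role of the transmitted and reflected fractions set by the angle 172; the sketch below is illustrative only and assumes the two images are registered and of equal size.

```python
# Hedged sketch: weighted superposition of the direct image 166 and projected image 170.
import numpy as np

def composite_image(direct: np.ndarray, projected: np.ndarray,
                    transmit_fraction: float) -> np.ndarray:
    reflect_fraction = 1.0 - transmit_fraction
    return transmit_fraction * direct + reflect_fraction * projected
```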
  • In various alternate embodiments of the image superposition system 150, either or both of the images 166, 170 can be optimized using the PSPGD algorithm 20 within the aberration correction system 10. Furthermore, the images 166, 170 can be optimized to differing degrees by the PSPGD algorithm 20 and with differing optimization criteria in order to emphasize one over the other or to selectively visualize selected features within the images 166, 170 and thus, within the composite image sensed by the image sensor 152. This permits selected features of the eye 80 to be brought into view and brought out of view as convenient at different times during a diagnosis or a procedure.
  • Thus, the illumination used to obtain the images 166, 170 superimposed by the image superposition system 150 does not need to be red and white light. The illumination used can be light of any differing frequencies. The frequencies selected for obtaining the images 166, 170 can be selected in accordance with the sharpness function or the frequency distribution as previously described.
  • The images superimposed by the image superposition system 150 do not need to be obtained by way of a camera, such as the camera 6 of the aberration correction system 10. A microscope, an endoscope, or any other type of device having an image sensor capable of capturing the transmission, absorption or reflection properties of an object or tissue, either in a normal state or enhanced by such materials as markers and chromophores, and thereby providing an optical/digital signal that can be applied to the computer 7 for optimization using the PSPGD algorithm 20, can be used. Thus, for example, an image obtained from an endoscope or a microscope can be superimposed upon an image obtained from a camera using the method of the present invention. Images from endoscopes, microscopes and other devices can be digitized, superimposed and synthesized with each other. It will be understood that images obtained from such devices and optimized using the PSPGD algorithm 20 can be used in any other way that images obtained from the PSPGD algorithm 20 using camera 6 are used.
  • The description herein will so fully illustrate my invention that others may, by applying current or future knowledge, adopt the same for use under various conditions of service. For example, the invention may be used for ophthalmological procedures such as photocoagulation, optical biopsies such as measuring tumors anywhere in the eye, providing therapy, performing surgery, diagnosis or measurements. Additionally, it can be used for performing procedures on eyes outside of laboratory or medical environments. Furthermore, the method of the present invention can be applied to any other objects capable of being imaged in addition to eyes, and images of an object provided in accordance with the method of the invention can be used when performing such procedures on such other objects.
  • While the invention has been described in detail and with reference to specific examples thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope thereof.

Claims (16)

1-24. (canceled)
25. A method for performing a procedure on a patient, comprising:
(a) obtaining a biometric image representative of said patient; and
(b) performing said procedure on said patient in accordance with said biometric image.
26. The method for performing a procedure on a patient of claim 25, wherein said patient has an iris and said biometric image comprises an iris biometric image.
27. The method for performing a procedure on a patient of claim 26, wherein said procedure comprises a medical procedure.
28. The method for performing a procedure on a patient of claim 26, further comprising:
(a) obtaining first and second iris biometric images; and
(b) comparing said first and second iris biometric images to provide a biometric comparison result.
29. The method for performing a procedure on a patient of claim 28, further comprising identifying a patient in accordance with said biometric comparison result.
30. The method for performing a procedure on a patient of claim 28, wherein said patient has at least one feature and said feature is represented within at least one of said first and second iris biometric images.
31. The method for performing a procedure on a patient of claim 30, further comprising identifying an iris in accordance with said at least one feature.
32. The method for performing a procedure on a patient of claim 31, further comprising performing a medical procedure in accordance with said identifying.
33. The method for performing a procedure on a patient of claim 30, further comprising determining a location of said iris in accordance with said at least one feature.
34. The method for performing a procedure on a patient of claim 30, further comprising determining an orientation of said iris in accordance with said at least one feature.
35. The method for performing a procedure on a patient of claim 30, further comprising altering a relative location of said iris in accordance with said at least one feature.
36. The method for performing a procedure on a patient of claim 26, further comprising performing a surgical procedure in accordance with said biometric image.
37. The method for performing a procedure on a patient of claim 26, further comprising performing a medical diagnosis on said patient in accordance with said biometric image.
38. The method for performing a procedure on a patient of claim 27, further comprising translating said patient within a coordinate system in accordance with said biometric image.
39. The method for performing a procedure on a patient of claim 27, further comprising performing said medical procedure upon an eye of said patient in accordance with said biometric image.
US11/739,342 2001-11-13 2007-04-24 Method for performing a procedure according to a biometric image Abandoned US20080033301A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US11/739,342 US20080033301A1 (en) 2001-11-13 2007-04-24 Method for performing a procedure according to a biometric image
US12/121,038 US7775665B2 (en) 2001-11-13 2008-05-15 Method for optically scanning objects
US12/754,750 US20100204571A1 (en) 2001-11-13 2010-04-06 Method for performing a procedure according to a biometric image
US13/175,910 US20110263972A1 (en) 2001-11-13 2011-07-04 Method for performing a procedure according to a biometric image
US13/400,085 US20120150064A1 (en) 2001-11-13 2012-02-19 Method for performing a procedure according to a biometric image
US13/707,293 US20130096544A1 (en) 2001-11-13 2012-12-06 Method for performing a procedure according to a biometric image
US14/023,488 US20140009742A1 (en) 2001-11-13 2013-09-11 Imaging Device
US14/514,461 US20160198952A1 (en) 2001-11-13 2014-10-15 Imaging Device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/011,187 US6648473B2 (en) 2001-11-13 2001-11-13 High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics
US10/696,046 US7377647B2 (en) 2001-11-13 2003-10-29 Clarifying an image of an object to perform a procedure on the object
US11/739,342 US20080033301A1 (en) 2001-11-13 2007-04-24 Method for performing a procedure according to a biometric image

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US10/011,187 Continuation-In-Part US6648473B2 (en) 2001-11-13 2001-11-13 High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics
US10/696,046 Division US7377647B2 (en) 2001-11-13 2003-10-29 Clarifying an image of an object to perform a procedure on the object

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US10/696,046 Division US7377647B2 (en) 2001-11-13 2003-10-29 Clarifying an image of an object to perform a procedure on the object
US12/754,750 Continuation US20100204571A1 (en) 2001-11-13 2010-04-06 Method for performing a procedure according to a biometric image

Publications (1)

Publication Number Publication Date
US20080033301A1 true US20080033301A1 (en) 2008-02-07

Family

ID=39030130

Family Applications (8)

Application Number Title Priority Date Filing Date
US10/696,046 Expired - Fee Related US7377647B2 (en) 2001-11-13 2003-10-29 Clarifying an image of an object to perform a procedure on the object
US11/739,342 Abandoned US20080033301A1 (en) 2001-11-13 2007-04-24 Method for performing a procedure according to a biometric image
US12/754,750 Abandoned US20100204571A1 (en) 2001-11-13 2010-04-06 Method for performing a procedure according to a biometric image
US13/175,910 Abandoned US20110263972A1 (en) 2001-11-13 2011-07-04 Method for performing a procedure according to a biometric image
US13/400,085 Abandoned US20120150064A1 (en) 2001-11-13 2012-02-19 Method for performing a procedure according to a biometric image
US13/707,293 Abandoned US20130096544A1 (en) 2001-11-13 2012-12-06 Method for performing a procedure according to a biometric image
US14/023,488 Abandoned US20140009742A1 (en) 2001-11-13 2013-09-11 Imaging Device
US14/514,461 Abandoned US20160198952A1 (en) 2001-11-13 2014-10-15 Imaging Device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/696,046 Expired - Fee Related US7377647B2 (en) 2001-11-13 2003-10-29 Clarifying an image of an object to perform a procedure on the object

Family Applications After (6)

Application Number Title Priority Date Filing Date
US12/754,750 Abandoned US20100204571A1 (en) 2001-11-13 2010-04-06 Method for performing a procedure according to a biometric image
US13/175,910 Abandoned US20110263972A1 (en) 2001-11-13 2011-07-04 Method for performing a procedure according to a biometric image
US13/400,085 Abandoned US20120150064A1 (en) 2001-11-13 2012-02-19 Method for performing a procedure according to a biometric image
US13/707,293 Abandoned US20130096544A1 (en) 2001-11-13 2012-12-06 Method for performing a procedure according to a biometric image
US14/023,488 Abandoned US20140009742A1 (en) 2001-11-13 2013-09-11 Imaging Device
US14/514,461 Abandoned US20160198952A1 (en) 2001-11-13 2014-10-15 Imaging Device

Country Status (1)

Country Link
US (8) US7377647B2 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090318911A1 (en) * 2006-09-12 2009-12-24 Shalesh Kaushal Devices and Methods for Computer-Assisted Surgery
US20110007949A1 (en) * 2005-11-11 2011-01-13 Global Rainmakers, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
CN102058389A (en) * 2009-11-17 2011-05-18 佳能株式会社 Adaptive optics apparatus, adaptive optics method, and imaging apparatus
US20120019780A1 (en) * 2010-07-23 2012-01-26 Canon Kabushiki Kaisha Ophthalmic apparatus, control method for the same, and storage medium
CN102947834A (en) * 2010-05-19 2013-02-27 普罗秋斯数字健康公司 Tracking and delivery confirmation of pharmaceutical products
US8945005B2 (en) 2006-10-25 2015-02-03 Proteus Digital Health, Inc. Controlled activation ingestible identifier
US8956288B2 (en) 2007-02-14 2015-02-17 Proteus Digital Health, Inc. In-body power source having high surface area electrode
US8956287B2 (en) 2006-05-02 2015-02-17 Proteus Digital Health, Inc. Patient customized therapeutic regimens
US8961412B2 (en) 2007-09-25 2015-02-24 Proteus Digital Health, Inc. In-body device with virtual dipole signal amplification
US9060708B2 (en) 2008-03-05 2015-06-23 Proteus Digital Health, Inc. Multi-mode communication ingestible event markers and systems, and methods of using the same
US9083589B2 (en) 2006-11-20 2015-07-14 Proteus Digital Health, Inc. Active signal processing personal health signal receivers
US9195074B2 (en) 2012-04-05 2015-11-24 Brien Holden Vision Institute Lenses, devices and methods for ocular refractive error
US9198608B2 (en) 2005-04-28 2015-12-01 Proteus Digital Health, Inc. Communication system incorporated in a container
US9201250B2 (en) 2012-10-17 2015-12-01 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US9235683B2 (en) 2011-11-09 2016-01-12 Proteus Digital Health, Inc. Apparatus, system, and method for managing adherence to a regimen
US20160022119A1 (en) * 2014-07-24 2016-01-28 Z Square Ltd. Multicore fiber endoscopes
US9541773B2 (en) 2012-10-17 2017-01-10 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US9603550B2 (en) 2008-07-08 2017-03-28 Proteus Digital Health, Inc. State characterization based on multi-variate data fusion techniques
US9756874B2 (en) 2011-07-11 2017-09-12 Proteus Digital Health, Inc. Masticable ingestible product and communication system therefor
US9883819B2 (en) 2009-01-06 2018-02-06 Proteus Digital Health, Inc. Ingestion-related biofeedback and personalized medical therapy method and system
US9941931B2 (en) 2009-11-04 2018-04-10 Proteus Digital Health, Inc. System for supply chain management
US10084880B2 (en) 2013-11-04 2018-09-25 Proteus Digital Health, Inc. Social media networking based on physiologic information
US10187121B2 (en) 2016-07-22 2019-01-22 Proteus Digital Health, Inc. Electromagnetic sensing and detection of ingestible event markers
US10223905B2 (en) 2011-07-21 2019-03-05 Proteus Digital Health, Inc. Mobile device and system for detection and communication of information received from an ingestible device
US10398294B2 (en) 2014-07-24 2019-09-03 Z Square Ltd. Illumination sources for multicore fiber endoscopes
US10398161B2 (en) 2014-01-21 2019-09-03 Proteus Digital Health, Inc. Masticable ingestible product and communication system therefor
US10441194B2 (en) 2007-02-01 2019-10-15 Proteus Digital Health, Inc. Ingestible event marker systems
US10517506B2 (en) 2007-05-24 2019-12-31 Proteus Digital Health, Inc. Low profile antenna for in body device
US11612321B2 (en) 2007-11-27 2023-03-28 Otsuka Pharmaceutical Co., Ltd. Transbody communication systems employing communication channels
US11744481B2 (en) 2013-03-15 2023-09-05 Otsuka Pharmaceutical Co., Ltd. System, apparatus and methods for data collection and assessing outcomes
US11950615B2 (en) 2021-11-10 2024-04-09 Otsuka Pharmaceutical Co., Ltd. Masticable ingestible product and communication system therefor

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7775665B2 (en) * 2001-11-13 2010-08-17 Dellavecchia Michael A Method for optically scanning objects
US7802883B2 (en) 2007-12-20 2010-09-28 Johnson & Johnson Vision Care, Inc. Cosmetic contact lenses having a sparkle effect
US8823488B2 (en) * 2010-02-19 2014-09-02 Wavelight Ag Medical treatment system and method for operation thereof
DE102010051281A1 (en) * 2010-11-12 2012-05-16 Carl Zeiss Meditec Ag Method for the model-based determination of the biometry of eyes
US10076671B2 (en) * 2012-05-25 2018-09-18 Ojai Retinal Technology, Llc Apparatus for retina phototherapy
US11157944B2 (en) * 2013-09-13 2021-10-26 Liveramp, Inc. Partner encoding of anonymous links to protect consumer privacy
EP2919641A1 (en) 2013-09-30 2015-09-23 Vysoké Ucení Technické V Brne Ophthalmic diagnostic apparatus and method of its operation
US10061125B2 (en) * 2014-03-04 2018-08-28 California Institute Of Technology Directional optical receiver
US10925479B2 (en) * 2016-10-13 2021-02-23 Ronald Michael Kurtz Networked system of mobile communication platforms for nonpharmacologic constriction of a pupil
CN106821696B (en) * 2017-02-08 2021-07-27 王三铭 Far-distance reading and writing device for preventing and treating myopia of students and using method
CN108710289B (en) * 2018-05-18 2021-11-09 厦门理工学院 Relay base quality optimization method based on improved SPSA
CN108982399B (en) * 2018-07-09 2021-04-06 安徽建筑大学 Flue ammonia concentration laser on-line measuring system
CN110448266B (en) * 2018-12-29 2022-03-04 中国科学院宁波工业技术研究院慈溪生物医学工程研究所 Random laser confocal line scanning three-dimensional ophthalmoscope and imaging method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5029220A (en) * 1990-07-31 1991-07-02 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Optical joint correlator for real-time image tracking and retinal surgery
US20020084904A1 (en) * 1996-12-20 2002-07-04 Carlos De La Huerga Electronic identification apparatus
US6961448B2 (en) * 1999-12-30 2005-11-01 Medtronic, Inc. User authentication in medical device systems
US7561183B1 (en) * 2002-10-08 2009-07-14 Unisys Corporation Mobile issuance of official documents with biometric information encoded thereon

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4641349A (en) * 1985-02-20 1987-02-03 Leonard Flom Iris recognition system
US7512254B2 (en) * 2001-11-07 2009-03-31 Symbol Technologies, Inc. System and method for mobile biometric authentication
US5179950A (en) * 1989-11-13 1993-01-19 Cyberonics, Inc. Implanted apparatus having micro processor controlled current and voltage sources with reduced voltage levels when not providing stimulation
US5291560A (en) * 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US5359669A (en) * 1992-04-13 1994-10-25 Motorola, Inc. Remote retinal scan identifier
ES2110841T5 (en) 1994-03-24 2005-12-16 Minnesota Mining And Manufacturing Company BIOMETRIC PERSONAL AUTHENTICATION SYSTEM.
US5777719A (en) * 1996-12-23 1998-07-07 University Of Rochester Method and apparatus for improving vision and the resolution of retinal images
CA2277276C (en) * 1997-01-17 2007-08-21 British Telecommunications Public Limited Company Security apparatus and method
US6019472A (en) * 1997-05-12 2000-02-01 Koester; Charles J. Contact lens element for examination or treatment of ocular tissues
US6783459B2 (en) * 1997-08-22 2004-08-31 Blake Cumbers Passive biometric customer identification and tracking system
US5999639A (en) * 1997-09-04 1999-12-07 Qualia Computing, Inc. Method and system for automated detection of clustered microcalcifications from digital mammograms
US5912731A (en) * 1997-12-04 1999-06-15 Trw Inc. Hartmann-type optical wavefront sensor
JP3271750B2 (en) * 1998-03-05 2002-04-08 沖電気工業株式会社 Iris identification code extraction method and device, iris recognition method and device, data encryption device
US6143011A (en) * 1998-04-13 2000-11-07 Surgijet, Inc. Hydrokeratome for refractive surgery
US6007204A (en) * 1998-06-03 1999-12-28 Welch Allyn, Inc. Compact ocular measuring system
JP3315648B2 (en) * 1998-07-17 2002-08-19 沖電気工業株式会社 Iris code generation device and iris recognition system
JP3610234B2 (en) * 1998-07-17 2005-01-12 株式会社メディア・テクノロジー Iris information acquisition device and iris identification device
US6289113B1 (en) * 1998-11-25 2001-09-11 Iridian Technologies, Inc. Handheld iris imaging apparatus and method
US6377699B1 (en) * 1998-11-25 2002-04-23 Iridian Technologies, Inc. Iris imaging telephone security module and method
US6532298B1 (en) * 1998-11-25 2003-03-11 Iridian Technologies, Inc. Portable authentication device and method using iris patterns
JP2000189403A (en) * 1998-12-25 2000-07-11 Oki Electric Ind Co Ltd Iris region extraction and individual identifying device
KR100320465B1 (en) * 1999-01-11 2002-01-16 구자홍 Iris recognition system
US6539100B1 (en) * 1999-01-27 2003-03-25 International Business Machines Corporation Method and apparatus for associating pupils with subjects
US6050687A (en) * 1999-06-11 2000-04-18 20/10 Perfect Vision Optische Geraete Gmbh Method and apparatus for measurement of the refractive properties of the human eye
US6086204A (en) * 1999-09-20 2000-07-11 Magnante; Peter C. Methods and devices to design and fabricate surfaces on contact lenses and on corneal tissue that correct the eye's optical aberrations
JP3652951B2 (en) * 2000-02-10 2005-05-25 株式会社ニデック Ophthalmic equipment
JP3825222B2 (en) * 2000-03-24 2006-09-27 松下電器産業株式会社 Personal authentication device, personal authentication system, and electronic payment system
US20020057438A1 (en) * 2000-11-13 2002-05-16 Decker Derek Edward Method and apparatus for capturing 3D surface and color thereon in real time
US6930707B2 (en) * 2000-12-22 2005-08-16 International Business Machines Corporation Digital camera apparatus with biometric capability
US20020091937A1 (en) * 2001-01-10 2002-07-11 Ortiz Luis M. Random biometric authentication methods and systems
US20020158750A1 (en) * 2001-04-30 2002-10-31 Almalik Mansour Saleh System, method and portable device for biometric identification
US20030018522A1 (en) * 2001-07-20 2003-01-23 Psc Scanning, Inc. Biometric system and method for identifying a customer upon entering a retail establishment
US6648473B2 (en) * 2001-11-13 2003-11-18 Philadelphia Retina Endowment Fund High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5029220A (en) * 1990-07-31 1991-07-02 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Optical joint correlator for real-time image tracking and retinal surgery
US20020084904A1 (en) * 1996-12-20 2002-07-04 Carlos De La Huerga Electronic identification apparatus
US6961448B2 (en) * 1999-12-30 2005-11-01 Medtronic, Inc. User authentication in medical device systems
US7561183B1 (en) * 2002-10-08 2009-07-14 Unisys Corporation Mobile issuance of official documents with biometric information encoded thereon

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9198608B2 (en) 2005-04-28 2015-12-01 Proteus Digital Health, Inc. Communication system incorporated in a container
US8798334B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US9613281B2 (en) 2005-11-11 2017-04-04 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US8818053B2 (en) 2005-11-11 2014-08-26 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798330B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798333B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8260008B2 (en) 2005-11-11 2012-09-04 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US20110007949A1 (en) * 2005-11-11 2011-01-13 Global Rainmakers, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798331B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US9792499B2 (en) 2005-11-11 2017-10-17 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US10102427B2 (en) 2005-11-11 2018-10-16 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US8956287B2 (en) 2006-05-02 2015-02-17 Proteus Digital Health, Inc. Patient customized therapeutic regimens
US11928614B2 (en) 2006-05-02 2024-03-12 Otsuka Pharmaceutical Co., Ltd. Patient customized therapeutic regimens
US20090318911A1 (en) * 2006-09-12 2009-12-24 Shalesh Kaushal Devices and Methods for Computer-Assisted Surgery
US8512323B2 (en) 2006-09-12 2013-08-20 University Of Florida Research Foundation, Inc. Devices and methods for computer-assisted surgery
US11357730B2 (en) 2006-10-25 2022-06-14 Otsuka Pharmaceutical Co., Ltd. Controlled activation ingestible identifier
US8945005B2 (en) 2006-10-25 2015-02-03 Proteus Digital Health, Inc. Controlled activation ingestible identifier
US10238604B2 (en) 2006-10-25 2019-03-26 Proteus Digital Health, Inc. Controlled activation ingestible identifier
US9083589B2 (en) 2006-11-20 2015-07-14 Proteus Digital Health, Inc. Active signal processing personal health signal receivers
US9444503B2 (en) 2006-11-20 2016-09-13 Proteus Digital Health, Inc. Active signal processing personal health signal receivers
US10441194B2 (en) 2007-02-01 2019-10-15 Proteus Digital Health, Inc. Ingestible event marker systems
US8956288B2 (en) 2007-02-14 2015-02-17 Proteus Digital Health, Inc. In-body power source having high surface area electrode
US11464423B2 (en) 2007-02-14 2022-10-11 Otsuka Pharmaceutical Co., Ltd. In-body power source having high surface area electrode
US10517506B2 (en) 2007-05-24 2019-12-31 Proteus Digital Health, Inc. Low profile antenna for in body device
US8961412B2 (en) 2007-09-25 2015-02-24 Proteus Digital Health, Inc. In-body device with virtual dipole signal amplification
US9433371B2 (en) 2007-09-25 2016-09-06 Proteus Digital Health, Inc. In-body device with virtual dipole signal amplification
US11612321B2 (en) 2007-11-27 2023-03-28 Otsuka Pharmaceutical Co., Ltd. Transbody communication systems employing communication channels
US9258035B2 (en) 2008-03-05 2016-02-09 Proteus Digital Health, Inc. Multi-mode communication ingestible event markers and systems, and methods of using the same
US9060708B2 (en) 2008-03-05 2015-06-23 Proteus Digital Health, Inc. Multi-mode communication ingestible event markers and systems, and methods of using the same
US9603550B2 (en) 2008-07-08 2017-03-28 Proteus Digital Health, Inc. State characterization based on multi-variate data fusion techniques
US11217342B2 (en) 2008-07-08 2022-01-04 Otsuka Pharmaceutical Co., Ltd. Ingestible event marker data framework
US10682071B2 (en) 2008-07-08 2020-06-16 Proteus Digital Health, Inc. State characterization based on multi-variate data fusion techniques
US9883819B2 (en) 2009-01-06 2018-02-06 Proteus Digital Health, Inc. Ingestion-related biofeedback and personalized medical therapy method and system
US10305544B2 (en) 2009-11-04 2019-05-28 Proteus Digital Health, Inc. System for supply chain management
US9941931B2 (en) 2009-11-04 2018-04-10 Proteus Digital Health, Inc. System for supply chain management
CN102058389A (en) * 2009-11-17 2011-05-18 佳能株式会社 Adaptive optics apparatus, adaptive optics method, and imaging apparatus
US20150208917A1 (en) * 2009-11-17 2015-07-30 Canon Kabushiki Kaisha Adaptive optics apparatus, adaptive optics method, and imaging apparatus
US9044174B2 (en) * 2009-11-17 2015-06-02 Canon Kabushiki Kaisha Adaptive optics apparatus, adaptive optics method, and imaging apparatus
US10105050B2 (en) * 2009-11-17 2018-10-23 Canon Kabushiki Kaisha Adaptive optics apparatus, adaptive optics method, and imaging apparatus
US20110116042A1 (en) * 2009-11-17 2011-05-19 Canon Kabushiki Kaisha Adaptive optics apparatus, adaptive optics method, and imaging apparatus
TWI557672B (en) * 2010-05-19 2016-11-11 波提亞斯數位康健公司 Computer system and computer-implemented method to track medication from manufacturer to a patient, apparatus and method for confirming delivery of medication to a patient, patient interface device
US10529044B2 (en) * 2010-05-19 2020-01-07 Proteus Digital Health, Inc. Tracking and delivery confirmation of pharmaceutical products
US20130073312A1 (en) * 2010-05-19 2013-03-21 Proteus Digital Health, Inc. Tracking and Delivery Confirmation of Pharmaceutical Products
CN102947834A (en) * 2010-05-19 2013-02-27 普罗秋斯数字健康公司 Tracking and delivery confirmation of pharmaceutical products
US20120019780A1 (en) * 2010-07-23 2012-01-26 Canon Kabushiki Kaisha Ophthalmic apparatus, control method for the same, and storage medium
CN102370458A (en) * 2010-07-23 2012-03-14 佳能株式会社 Ophthalmic apparatus, control method for the same, and storage medium
US8646915B2 (en) * 2010-07-23 2014-02-11 Canon Kabushiki Kaisha Ophthalmic apparatus, control method for the same, and storage medium
US9756874B2 (en) 2011-07-11 2017-09-12 Proteus Digital Health, Inc. Masticable ingestible product and communication system therefor
US10223905B2 (en) 2011-07-21 2019-03-05 Proteus Digital Health, Inc. Mobile device and system for detection and communication of information received from an ingestible device
US9235683B2 (en) 2011-11-09 2016-01-12 Proteus Digital Health, Inc. Apparatus, system, and method for managing adherence to a regimen
US9195074B2 (en) 2012-04-05 2015-11-24 Brien Holden Vision Institute Lenses, devices and methods for ocular refractive error
US10838235B2 (en) 2012-04-05 2020-11-17 Brien Holden Vision Institute Limited Lenses, devices, and methods for ocular refractive error
US11809024B2 (en) 2012-04-05 2023-11-07 Brien Holden Vision Institute Limited Lenses, devices, methods and systems for refractive error
US11644688B2 (en) 2012-04-05 2023-05-09 Brien Holden Vision Institute Limited Lenses, devices and methods for ocular refractive error
US10209535B2 (en) 2012-04-05 2019-02-19 Brien Holden Vision Institute Lenses, devices and methods for ocular refractive error
US10203522B2 (en) 2012-04-05 2019-02-12 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US10466507B2 (en) 2012-04-05 2019-11-05 Brien Holden Vision Institute Limited Lenses, devices and methods for ocular refractive error
US9535263B2 (en) 2012-04-05 2017-01-03 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US9575334B2 (en) 2012-04-05 2017-02-21 Brien Holden Vision Institute Lenses, devices and methods of ocular refractive error
US10948743B2 (en) 2012-04-05 2021-03-16 Brien Holden Vision Institute Limited Lenses, devices, methods and systems for refractive error
US11320672B2 (en) 2012-10-07 2022-05-03 Brien Holden Vision Institute Limited Lenses, devices, systems and methods for refractive error
US9759930B2 (en) 2012-10-17 2017-09-12 Brien Holden Vision Institute Lenses, devices, systems and methods for refractive error
US9201250B2 (en) 2012-10-17 2015-12-01 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US9541773B2 (en) 2012-10-17 2017-01-10 Brien Holden Vision Institute Lenses, devices, methods and systems for refractive error
US10534198B2 (en) 2012-10-17 2020-01-14 Brien Holden Vision Institute Limited Lenses, devices, methods and systems for refractive error
US10520754B2 (en) 2012-10-17 2019-12-31 Brien Holden Vision Institute Limited Lenses, devices, systems and methods for refractive error
US11333903B2 (en) 2012-10-17 2022-05-17 Brien Holden Vision Institute Limited Lenses, devices, methods and systems for refractive error
US11744481B2 (en) 2013-03-15 2023-09-05 Otsuka Pharmaceutical Co., Ltd. System, apparatus and methods for data collection and assessing outcomes
US10084880B2 (en) 2013-11-04 2018-09-25 Proteus Digital Health, Inc. Social media networking based on physiologic information
US10398161B2 (en) 2014-01-21 2019-09-03 Proteus Digital Health, Inc. Masticable ingestible product and communication system therefor
US20160022119A1 (en) * 2014-07-24 2016-01-28 Z Square Ltd. Multicore fiber endoscopes
US10398294B2 (en) 2014-07-24 2019-09-03 Z Square Ltd. Illumination sources for multicore fiber endoscopes
US9661986B2 (en) * 2014-07-24 2017-05-30 Z Square Ltd. Multicore fiber endoscopes
US10797758B2 (en) 2016-07-22 2020-10-06 Proteus Digital Health, Inc. Electromagnetic sensing and detection of ingestible event markers
US10187121B2 (en) 2016-07-22 2019-01-22 Proteus Digital Health, Inc. Electromagnetic sensing and detection of ingestible event markers
US11950615B2 (en) 2021-11-10 2024-04-09 Otsuka Pharmaceutical Co., Ltd. Masticable ingestible product and communication system therefor

Also Published As

Publication number Publication date
US20120150064A1 (en) 2012-06-14
US20100204571A1 (en) 2010-08-12
US20130096544A1 (en) 2013-04-18
US20110263972A1 (en) 2011-10-27
US20140009742A1 (en) 2014-01-09
US7377647B2 (en) 2008-05-27
US20160198952A1 (en) 2016-07-14
US20040165146A1 (en) 2004-08-26

Similar Documents

Publication Publication Date Title
US7377647B2 (en) Clarifying an image of an object to perform a procedure on the object
US8714741B2 (en) Method for selecting images
KR100797857B1 (en) Customized corneal profiling
US7077521B2 (en) System and method for reconstruction of aberrated wavefronts
US6439720B1 (en) Method and apparatus for measuring optical aberrations of the human eye
US6379005B1 (en) Method and apparatus for improving vision and the resolution of retinal images
US7659971B2 (en) Lensometers and wavefront sensors and methods of measuring aberration
EP3001945B1 (en) Lensometers and wavefront sensors and methods of measuring aberration
US6575574B2 (en) High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics
AU781722B2 (en) Spatial filter for enhancing Hartmann-Shack images and associated methods
WO1992001417A1 (en) Vision measurement and correction
JP2001524662A (en) Objective measurement and correction of optical system using wavefront analysis
US7226166B2 (en) Optimizing the properties of electromagnetic energy in a medium using stochastic parallel perturbation gradient descent optimization adaptive optics
US20020154269A1 (en) Stereoscopic measurement of cornea and illumination patterns
US7690787B2 (en) Arrangement for improving the image field in ophthalmological appliances
US20040165147A1 (en) Determining iris biometric and spatial orientation of an iris in accordance with same
US20040263779A1 (en) Hartmann-Shack wavefront measurement
US9665771B2 (en) Method and apparatus for measuring aberrations of an ocular optical system

Legal Events

Date Code Title Description
AS Assignment

Owner name: PHILADELPHIA RETINA ENDOWMENT FUND, PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DONOSO, LARRY, MR.;REEL/FRAME:019697/0569

Effective date: 20070727

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: PHILADELPHIA RETINA ENDOWMENT FUND,PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DONOSO, LARRY A;REEL/FRAME:024433/0490

Effective date: 20100522

AS Assignment

Owner name: PHILADELPHIA RETINA ENDOWMENT FUND, A PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DONOSO, LARRY;REEL/FRAME:024596/0320

Effective date: 20100429

AS Assignment

Owner name: PHILADELPHIA RETINA ENDOWMENT FUND, PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DELLAVECCHIA, MICHAEL;DONOSO, LARRY;VORONTSOV, MIKHAIL;SIGNING DATES FROM 20040220 TO 20040311;REEL/FRAME:024736/0358