US20050264759A1 - Fundus imaging - Google Patents

Fundus imaging

Info

Publication number
US20050264759A1
Authority
US
United States
Prior art keywords
image
vectors
fft
row
column
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/196,222
Inventor
Martin Gersten
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/196,222
Publication of US20050264759A1
Abandoned (current legal status)

Classifications

    • G06T5/73
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration by non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30041 Eye; Retina; Ophthalmic

Abstract

Degradations of the image acquired by a fundus camera, including those due to intraocular defects, are reduced by digitizing the image, taking an FFT of the row and column image data, and correlating the FFTs of the rows and of the columns to obtain resultant row and column product vectors. The Nth root of each resultant product vector is computed, where N is the respective number of rows or columns. A minimum offset term is subtracted from each element of the resultant vector to obtain the PSF spatial spectrum (MTF) of the eye. Each row FFT and each column FFT is then divided by the MTF, after which the inverse FFT yields a restored, distortion-reduced image.

Description

    FIELD OF THE INVENTION
  • This invention relates to digital signal processing and, more particularly, to processing images of the retina to obtain accurate information.
  • BACKGROUND OF THE INVENTION
  • While the ophthalmoscope is useful in detecting early signs of diabetes, cardiovascular disease and other related conditions, obtaining a wide view of the interior of the eye and the periphery of the retina with this instrument generally requires that the iris be dilated. A more convenient tool for patient screening is provided by the non-mydriatic fundus camera, which can be used as a diagnostic tool by ophthalmologists, optometrists and other medical professionals. Information is sought, for example, about the condition of the retina's dendritic pattern of blood vessels, the macula and the optic nerve, among the main features of the fundus. The non-mydriatic camera attempts to focus a ring of light through the pupil to illuminate the retina without the need for dilation of the pupil. The ring of light diverges as it exits the posterior crystalline lens of the eye so that a large, defocused spot is formed on the retina. The return optical path between the fundus and the camera again traverses the crystalline lens. The retinal image is acquired by the camera through the central hole in the ring of light.
  • Unfortunately, the retinal image obtained this way contains several major types of distortion. Some of these, such as distortion due to the camera and its optical system, can be compensated for by calibration with a standard test image. Others, arising in inaccessible portions of the optical path, such as intraocular reflections, haze in the vitreous humor, haze and flare due to a cataractous lens diffusing the peripheral ring of light into the central ‘viewing hole’, or corneal inhomogeneities, have not yielded to available image processing methods. In addition to the foregoing, and despite the use of strobed flash, images may be blurred as a result of axial movement of the patient's head, high-velocity micro-saccadic eye movements, or the operator's inability to achieve simultaneously all of the conditions necessary for a sharp image. Thus, the very eyes which require the highest level of diagnostic image quality (those exhibiting intraocular pathology) are the ones most likely to suffer from one or more of the foregoing image degradation problems. Accordingly, images obtained heretofore by conventional fundus cameras have suffered from an indeterminate amount of distortion and artifactual noise. It would be of inestimable value to be able to remove such distortions from the image acquired by a fundus camera.
  • SUMMARY OF THE INVENTION
  • We have discovered that the degradation of the retinal image acquired by a fundus camera may be considered to be the point spread function (PSF) of the optical path, including the media of the eye itself (cornea, aqueous humor, crystalline lens, vitreous humor, intraocular membranes, etc.) together with the image acquisition apparatus (principally the fundus camera). In accordance with the principles of the invention, for processing purposes, this point spread function is deemed to be constant from point to point throughout the image, or at least through particular segments thereof. However, the point-to-point amplitude of the retinal image, which consists largely of dendritic networks of blood vessels, varies from point to point in a quasi-random manner. The detected (distorted) image, therefore, consists of the randomly varying ideal image convolved with the relatively constant PSF, which affects every part of the image in substantially the same way. The retina, sampled for example in straight lines, such as rows and columns, yields a “random” video signal due to the dendritic fractal nature of the retina's anatomy. Any amplitude vector in the image, no matter in which direction the image is sliced, e.g., whether vertically or horizontally, will contain the PSF convolved with the retinal image data. Simple inspection of any vector will reveal nothing but random variations in amplitude because the convolved PSF has the effect only of blurring and adding haze to the ideal “random” image features.
  • However, if the image is converted from a conventional ‘space-domain’ representation to a ‘frequency-domain’ representation, then the spatial spectrum of the PSF (the modulation transfer function, or MTF) of the image may easily be separated from the spatial spectra of the ideal image data, as elucidated below. As is well known (Castleman et al.), a convolution in the space domain is equivalent to an element-wise vector multiplication in the frequency domain. In accordance with the invention, the image is first divided into rows and columns and each row is converted from a space representation to a frequency representation through the application of a fast Fourier transform. Each transformed vector is stored in computer memory. Each of these vectors contains the spectrum of the row's image data multiplied by the spectrum of the PSF (the MTF). If each transformed vector is correlated with every other transformed vector in the image, the random variation produced by the “ideal” signal will eventually be swamped by the steady value of the MTF. The value of the MTF (or its transform, the PSF), however, is what we are looking for and what we need to obtain the ideal image from the image obtained by the fundus camera. When all of the rows have thus been correlated to form a resultant vector, the Nth root of the resultant vector is computed, where N is the number of rows that have been multiplied. The resultant vector now contains the mean MTF of the optical system. Each element of this mean MTF vector is offset by a more or less constant DC term. This term is the product of all of the random amplitudes of the ideal image data of each column. We have found, in practice, that the minimum value in the result vector may now be located and subtracted from all of the elements of the result vector to yield the correct MTF values. However, in some special cases, coefficients may be applied to the offset data to yield a more accurate estimate of the MTF. Dividing each FFT-transformed row by the MTF and obtaining the inverse FFT of the rows yields the horizontally restored ideal image. The procedure may then be repeated by similarly correlating all of the column vectors to yield the vertically restored ideal image.
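  • A minimal sketch of the row-direction MTF estimate described above, assuming NumPy is available: the function name estimate_mtf_rows, the use of spectral magnitudes for the MTF, and the log-domain geometric mean (mathematically equal to the Nth root of the product of the row spectra) are assumptions of the sketch, not the patent's exact implementation.

```python
# Illustrative sketch of the row-direction MTF estimate (assumptions noted above).
import numpy as np

def estimate_mtf_rows(image, eps=1e-12):
    """Correlate (multiply) the spectra of all rows, take the Nth root, and
    subtract the minimum element as the quasi-constant offset contributed by
    the ideal image data."""
    rows = np.asarray(image, dtype=float)
    # FFT of every row; the MTF is taken from spectral magnitudes (assumption).
    spectra = np.abs(np.fft.fft(rows, axis=1)) + eps
    # Nth root of the product of all row spectra, computed as a geometric
    # mean in the log domain for numerical stability.
    mean_mtf = np.exp(np.log(spectra).mean(axis=0))
    # Subtract the minimum value to remove the more or less constant offset.
    return mean_mtf - mean_mtf.min()
```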
  • BRIEF DESCRIPTION OF THE DRAWING
  • The foregoing objects and features of the invention may become more apparent from the ensuing written description when read together with the drawing in which:
  • FIG. 1 is a flow chart of the process of the invention.
  • DETAILED DESCRIPTION
  • Before referring to FIG. 1, an overall review of the process of the invention may be helpful. Let I represent the object of interest, e.g., the retina of the human eye. Let E represent the point spread function (PSF) of the media of the eye, and let the PSF of a camera such as a fundus camera (not shown) and its imaging optics be represented by C. Then the actual image A reported by the camera is the retinal object (I) degraded by being convolved with E and C. In the frequency domain, a convolution is a simple element-wise vector multiplication. Therefore, a fast Fourier transform (FFT) taken of the actual image A is the product of the FFTs of I, E and C. If, for the moment, we accept that the PSF of a high-quality fundus camera makes a negligible contribution to the image, then the FFT of A can be assumed to be the product of the FFTs of I and E. So, to improve A, we must divide its Fourier transform by the Fourier transform of E. In the simplest case, E should remain constant throughout the media of the eye. Let us scan and digitize the image obtained by the fundus camera. For example, let the image be scanned by rows and columns. In our acquired discrete digital image, therefore, E has been convolved with every row and column of the image A. Now, let us consider the retinal object, I, which contains blood vessels and nerves arranged in 2-dimensional dendritic fractal-like pseudo-random structures. Therefore, 1-dimensional column/row samples of I are essentially random waveforms. But, of course, the random waveforms are all convolved with E. So, if we correlate all of the rows, we will get a result row vector which is the constant E sitting on top of a “DC” offset which represents all of the randomness of the ideal image, I. We do the same for the columns of the image, measure the DC offset and subtract it, leaving the PSF of the eye itself. Then we divide each row and column of A by E, take an inverse FFT, and obtain a corrected and clear image.
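  • The relationship relied on above, that convolution by E in the space domain corresponds to element-wise multiplication of spectra in the frequency domain, can be checked numerically on a synthetic one-dimensional row; the signal length, the Gaussian stand-in for E, and the circular-convolution convention below are assumptions of the sketch.

```python
# Small numerical check of the convolution theorem used above (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 256
ideal_row = rng.random(n)                      # a "random" retinal row, standing in for I
psf = np.exp(-np.linspace(-4.0, 4.0, n) ** 2)  # a smooth blur, standing in for E
psf /= psf.sum()

# Circular convolution of I with E, computed directly in the space domain.
idx = (np.arange(n)[:, None] - np.arange(n)[None, :]) % n
blurred = (ideal_row[None, :] * psf[idx]).sum(axis=1)

# Its spectrum equals the element-wise product of the two spectra.
print(np.allclose(np.fft.fft(blurred),
                  np.fft.fft(ideal_row) * np.fft.fft(psf)))  # True
```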
  • Referring now to FIG. 1, at step 100 the image acquired by the fundus camera is digitized. At step 101 an FFT is taken of the “next” row of the digitized image data. The first time the process is executed the “next” row is, of course, the first row. At step 102 a determination is made whether the last row of the digitized data has been FFT'd. At step 103, if the last row has been reached in step 102, the row vector of FFT data is copied to the FFT result vector. In step 105 the result vector is multiplied by the next FFT row vector. This is the correlation step discussed above. In step 107 a determination is made whether all of the FFT row vectors have been correlated. The process continues with step 110, which computes the Nth root of the FFT result vector, where N is the number of rows. In step 112 a minimum offset is subtracted from all elements of the FFT result vector. In step 114 each FFT row vector is divided by the FFT result vector. In step 116 the inverse FFT is taken, and in step 200 the foregoing procedure is repeated for each column of the acquired image data. The inverse FFT produces a corrected image.
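  • Putting the steps of FIG. 1 together, an end-to-end sketch might look like the following; the helper names, the magnitude-based MTF estimate, and the small-divisor guard eps are assumptions rather than the exact implementation of the flowchart, and steps 101 through 114 are collapsed into vectorized NumPy operations.

```python
# Illustrative end-to-end sketch of the FIG. 1 procedure (assumptions noted above).
import numpy as np

def _estimate_mtf(vectors, eps=1e-12):
    # Steps 101-112 for one scan direction: FFT every vector, correlate
    # (multiply) the spectra, take the Nth root, and subtract the minimum offset.
    spectra = np.abs(np.fft.fft(np.asarray(vectors, dtype=float), axis=1)) + eps
    mtf = np.exp(np.log(spectra).mean(axis=0))   # Nth root of the product
    return mtf - mtf.min()                        # step 112: remove the offset term

def restore_fundus_image(image, eps=1e-3):
    img = np.asarray(image, dtype=float)

    # Steps 100-116: estimate the row-direction MTF, divide each row FFT by it,
    # and take the inverse FFT (the guard eps avoids division by zero-valued bins).
    mtf_rows = _estimate_mtf(img)
    img = np.real(np.fft.ifft(np.fft.fft(img, axis=1)
                              / np.maximum(mtf_rows, eps), axis=1))

    # Step 200: repeat the whole procedure for the columns of the image.
    mtf_cols = _estimate_mtf(img.T)
    img = np.real(np.fft.ifft(np.fft.fft(img, axis=0)
                              / np.maximum(mtf_cols, eps)[:, None], axis=0))
    return img
```

  • Note that the offset-subtracted MTF contains at least one zero element (its former minimum), so a divisor guard such as the one above, or the coefficient adjustment mentioned in the summary, would be needed before the division in steps 114 and 200.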
  • What has been described is deemed to be illustrative of the principles of the invention. It will be apparent, for example, to those skilled in the art that the method of the present invention may be applied to restore images acquired in other ways (e.g., ultrasound, CT scans, snapshots, satellite photos, etc.) so long as the image does not contain appreciable amounts of periodic data. Transforms other than the FFT may advantageously be used. For example, the Z-transform may in general be used to take discrete signal amplitudes (such as those obtained by sampling an image of the retina taken by a fundus camera) into a complex-variable domain, where it plays a role similar to the one that the Laplace transform plays in the continuous-time domain. Like the Laplace transform, the Z-transform offers a different way of solving problems and designing discrete-domain applications. It will also be apparent that the scanning or sampling of the image may be done radially from any desired center of interest, such as the center of sight, the fovea, the macula, etc., and that non-linear scans may also be employed so that more samples are taken at areas of interest. It may also be useful, prior to the application of the restoration method described above, to re-sample images at a higher resolution than the original and then low-pass filter the result. Subjecting the images to such “anti-aliasing”, prior to computing the MTF, may produce a result that is useful in some applications; one possible form of this pre-processing is sketched below. Further and other modifications will be apparent to those skilled in the art and may be made without, however, departing from the spirit and scope of the invention.
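  • The optional re-sampling and low-pass filtering (“anti-aliasing”) mentioned above could take a form like the following sketch; the 2x zoom factor, the cubic interpolation, the Gaussian low-pass and the use of SciPy are all assumptions made for illustration.

```python
# Illustrative pre-processing step: up-sample, then low-pass filter
# ("anti-aliasing") before the MTF is computed (assumptions noted above).
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def upsample_and_lowpass(image, factor=2.0, sigma=1.0):
    """Re-sample the acquired image at a higher resolution than the original
    and low-pass filter the result prior to the restoration method."""
    upsampled = zoom(np.asarray(image, dtype=float), factor, order=3)
    return gaussian_filter(upsampled, sigma=sigma)
```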

Claims (7)

1. Removing from a retinal image acquired by a fundus camera, image degradations arising from intraocular defects, comprising the steps of:
a.) digitizing said acquired image;
b.) taking an FFT of said digitized image by rows and columns;
c.) correlating said FFTs to obtain resultant row product vectors and column vectors;
d.) finding the root equal to the respective numbers of rows and columns of the resultant row and column product vectors to obtain quotients;
e.) subtracting from each of said quotients a minimum offset term to obtain the PSF spatial spectrum (MTF) of the eye;
f.) dividing each row FFT and each column FFT by said MTF; and
g.) taking the inverse FFT to yield a restored distortion-reduced image.
2. The method of removing from an acquired image degradations arising from optical defects in inaccessible portions of the optical path, comprising the steps of:
a.) digitizing said acquired image;
b.) scanning said acquired image along predetermined paths to obtain vectors of data;
c.) taking a discrete transform of said vectors of data;
d.) correlating said discrete transform of said vectors to obtain resultant product vectors;
e.) finding roots of the resultant product vectors for each of said predetermined paths;
f.) subtracting from each of said roots a minimum offset term to obtain a point spread function spatial spectrum (MTF);
g.) dividing each discrete transform of said vectors by said MTF; and
h.) taking the inverse discrete transform to yield a restored distortion-reduced image.
3. The method of claim 2 wherein said acquired image is a retinal image acquired by a fundus camera.
4. The method of claim 3 wherein at least one of said predetermined paths traverses a predetermined feature of said retinal image.
5. The method of claim 3 wherein said discrete transform is a fast Fourier transform.
6. The method of claim 4 wherein said predetermined paths are row and column paths of said image.
7. The method of claim 5 wherein there are N of said predetermined paths and said root is the Nth root of said product vectors.
US11/196,222 2001-12-05 2005-08-03 Fundus imaging Abandoned US20050264759A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/196,222 US20050264759A1 (en) 2001-12-05 2005-08-03 Fundus imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/010,432 US6928193B2 (en) 2001-12-05 2001-12-05 Fundus imaging
US11/196,222 US20050264759A1 (en) 2001-12-05 2005-08-03 Fundus imaging

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/010,432 Continuation US6928193B2 (en) 2001-12-05 2001-12-05 Fundus imaging

Publications (1)

Publication Number Publication Date
US20050264759A1 2005-12-01

Family

ID=21745736

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/010,432 Expired - Fee Related US6928193B2 (en) 2001-12-05 2001-12-05 Fundus imaging
US11/196,222 Abandoned US20050264759A1 (en) 2001-12-05 2005-08-03 Fundus imaging

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/010,432 Expired - Fee Related US6928193B2 (en) 2001-12-05 2001-12-05 Fundus imaging

Country Status (1)

Country Link
US (2) US6928193B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101048240B1 (en) * 2006-06-09 2011-07-08 OSRAM Gesellschaft mit beschränkter Haftung Discharge lamp with holding device for electrodes
CN104933713A (en) * 2015-06-12 2015-09-23 杭州电子科技大学 Image MTF (Modulation Transfer Function) estimation method using edge analysis
CN106997773A (en) * 2015-10-13 2017-08-01 三星电子株式会社 Apparatus and method for performing Fourier transformation

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10225855A1 (en) * 2002-06-07 2003-12-24 Zeiss Carl Jena Gmbh Method and arrangement for evaluating images taken with a fundus camera
US20040086153A1 (en) * 2002-10-30 2004-05-06 Yichang Tsai Methods and systems for recognizing road signs in a digital image
JP4985062B2 (en) * 2006-04-14 2012-07-25 株式会社ニコン camera
CN110310235B (en) * 2019-05-21 2021-07-27 北京至真互联网技术有限公司 Fundus image processing method, device and equipment and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5193124A (en) * 1989-06-29 1993-03-09 The Research Foundation Of State University Of New York Computational methods and electronic camera apparatus for determining distance of objects, rapid autofocusing, and obtaining improved focus images
US5661816A (en) * 1991-10-22 1997-08-26 Optikos Corporation Image analysis system
US5737456A (en) * 1995-06-09 1998-04-07 University Of Massachusetts Medical Center Method for image reconstruction
US6231186B1 (en) * 2000-03-24 2001-05-15 Bausch & Lomb Surgical, Inc. Eye measurement system
US6285799B1 (en) * 1998-12-15 2001-09-04 Xerox Corporation Apparatus and method for measuring a two-dimensional point spread function of a digital image acquisition system
US20030001098A1 (en) * 2001-05-09 2003-01-02 Stoddart Hugh A. High resolution photon emission computed tomographic imaging tool
US6511180B2 (en) * 2000-10-10 2003-01-28 University Of Rochester Determination of ocular refraction from wavefront aberration data and design of optimum customized correction
US6567570B1 (en) * 1998-10-30 2003-05-20 Hewlett-Packard Development Company, L.P. Optical image scanner with internal measurement of point-spread function and compensation for optical aberrations
US6837855B1 (en) * 1997-12-18 2005-01-04 Michel Puech Use of an ultrasonic transducer for echographic exploration of human or animal body tissues or organs in particular of the eyeball posterior segment
US6854846B2 (en) * 2000-02-09 2005-02-15 Michael Quigley Fundus photographic technique to determine eye refraction for optic disc size calculations
US6887203B2 (en) * 2000-01-06 2005-05-03 Ultralink Ophthalmics Inc. Ophthalmological ultrasonography scanning apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4102563A (en) * 1975-12-01 1978-07-25 Canon Kabushiki Kaisha Eye fundus camera free from undesired reflected and diffused light beams
US5868134A (en) * 1993-09-21 1999-02-09 Kabushiki Kaisha Topcon Retinal disease analyzer
US5841911A (en) * 1995-06-06 1998-11-24 Ben Gurion, University Of The Negev Method for the restoration of images disturbed by the atmosphere
US5835265A (en) * 1996-07-18 1998-11-10 Computed Anatomy Incorporated Large numerical aperture imaging device
KR100247938B1 (en) * 1997-11-19 2000-03-15 윤종용 Digital focusing apparatus and method of image processing system

Also Published As

Publication number Publication date
US20030103660A1 (en) 2003-06-05
US6928193B2 (en) 2005-08-09

Similar Documents

Publication Publication Date Title
US20050264759A1 (en) Fundus imaging
Salmon et al. An automated reference frame selection (ARFS) algorithm for cone imaging with adaptive optics scanning light ophthalmoscopy
US8998411B2 (en) Light field camera for fundus photography
Ricco et al. Correcting motion artifacts in retinal spectral domain optical coherence tomography via image registration
US5579063A (en) Methods and devices for the measurement of the degradation of image quality on the retina of the human eye due to cataract
US20060126019A1 (en) Methods and systems for wavefront analysis
JP2014509544A (en) System and method for efficiently obtaining measurements of a human eyeball using tracking
Iskander et al. Analyzing the dynamic wavefront aberrations in the human eye
Hammer et al. Active retinal tracker for clinical optical coherence tomography systems
Meitav et al. Improving retinal image resolution with iterative weighted shift-and-add
US20030090629A1 (en) High-resolution retina imaging and eye aberration diagnostics using stochastic parallel perturbation gradient descent optimization adaptive optics
Bek Fine structure in diabetic retinopathy lesions as observed by adaptive optics imaging. A qualitative study
Cheng et al. Robust three-dimensional registration on optical coherence tomography angiography for speckle reduction and visualization
US8184149B2 (en) Ophthalmic apparatus and method for increasing the resolution of aliased ophthalmic images
Molodij et al. Enhancing retinal images by extracting structural information
Schramm et al. 3D retinal imaging and measurement using light field technology
US8144960B2 (en) Method for measuring the concentration of a substance using advanced image processing techniques
Nourrit et al. Blind deconvolution for high-resolution confocal scanning laser ophthalmoscopy
JP2023511723A (en) Method and apparatus for orbit determination and imaging
EP3459434A1 (en) Ophthalmologic apparatus and method of controlling the same
Dan et al. Evaluation of optic disc measurements with the glaucoma-scope
Xu et al. Tracking retinal motion with a scanning laser ophthalmoscope.
JP6776317B2 (en) Image processing equipment, image processing methods and programs
Tsaritsyn Automatic Retinal Blood Velocity Estimation in the Living Human Retina, using Adaptive Optics Scanning Laser Ophthalmoscopy
KR20230165812A (en) Dry eye classification method, ophthalmic device using the same, and learning device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION