US20110129134A1 - Methods and systems for detection of retinal changes


Info

Publication number
US20110129134A1
Authority
US
United States
Prior art keywords: images, analysis, image, patient, user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/762,545
Inventor
João Diogo de Oliveira e Ramos
Nélson Augusto de Sousa Vilhena
Frederico Teles de Campos Costa Santos
João Paulo da Silva Pinto
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US12/762,545
Publication of US20110129134A1
Legal status: Abandoned

Classifications

    • A61B 3/0041: Apparatus for testing or examining the eyes; operational features characterised by display arrangements
    • A61B 3/12: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for looking at the eye fundus, e.g. ophthalmoscopes
    • G16H 10/60: ICT specially adapted for the handling or processing of patient-specific data, e.g. for electronic patient records
    • G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/63: ICT specially adapted for the operation of medical equipment or devices; local operation

Definitions

  • the present invention is directed to systems and methods for the presentation and analysis of digital color fundus images (retinographies) to aid in the diagnosis of retinopathies, e.g., diabetic retinopathy and age-related macular degeneration, among others.
  • Diabetic retinopathy is a cause of blindness in diabetic patients, so it is important to be able to screen patients for evidence of this disease. Such screening typically involves examination of digital color fundus images to identify different symptoms related to the disease: microaneurysms, hemorrhages, and vascular abnormalities, among others. In particular, the number, density and locations of microaneurysms are important factors in quantifying the progression of diabetic retinopathy in its early stages.
  • the system includes a record module configured to permit digital color fundus images taken at different times to be imported; an image set module configured to group two or more of said images (e.g., on an eye-by-eye and/or patient-by-patient basis) as part of an analysis; a processing module configured to generate analysis results for selected groups of said images; and an analysis tool set configured to permit viewing of the analysis results, as well as the images, on an eye-by-eye basis and to collect inputs and instructions relating to said analyses and images from a user.
  • the record module allows each image to be associated with various information, including some or all of the following indicators: a date on which the respective image was captured, whether the image is of a right or left eye, the kind of equipment used to capture the image, the field and angle of acquisition, and whether or not mydriasis was used at the time the image was acquired.
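The per-image indicators listed above can be sketched as a simple record type. This is an illustrative data model only; the `FundusImageRecord` class and its field names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FundusImageRecord:
    """One imported fundus image plus the indicators the record module stores."""
    path: str           # location of the image file (hypothetical layout)
    capture_date: date  # date on which the image was captured
    eye: str            # "left" or "right"
    equipment: str      # kind of equipment used to capture the image
    field: str          # field of acquisition (e.g., macula-centered)
    angle_deg: int      # angle of acquisition in degrees
    mydriasis: bool     # whether mydriasis was used at acquisition time

record = FundusImageRecord(
    path="patient42/od_2010-04-19.png",
    capture_date=date(2010, 4, 19),
    eye="right",
    equipment="fundus camera",
    field="macula-centered",
    angle_deg=45,
    mydriasis=False,
)
```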
  • the image set module is adapted to group selected images either manually (with the system automatically arranging them in chronological order), or automatically, with one image being selected as a reference and images within a designated time interval being grouped with respect to that reference.
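The automatic grouping described above (a reference image plus a designated time interval) might look like the following sketch, where each image is reduced to a (date, name) pair and the function name is hypothetical:

```python
from datetime import date, timedelta

def group_around_reference(images, reference, interval_days):
    """Select images whose capture date falls within interval_days of the
    reference image's date, and return them in chronological order."""
    window = timedelta(days=interval_days)
    ref_date = reference[0]
    selected = [img for img in images if abs(img[0] - ref_date) <= window]
    return sorted(selected, key=lambda img: img[0])

images = [
    (date(2009, 1, 10), "visit1"),
    (date(2009, 7, 2), "visit2"),
    (date(2010, 3, 15), "visit3"),
]
# visit3 falls outside a 200-day window around visit2, so it is excluded
group = group_around_reference(images, images[1], interval_days=200)
```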
  • the processing module is configured to permit image pre-processing and co-registration and various information analysis. Images may be pre-processed prior to analysis through enhancements in contrast and/or brightness.
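The contrast and brightness enhancement mentioned here is not specified in detail in the text; a minimal sketch, assuming a simple linear point operation on 8-bit intensity values, could be:

```python
def adjust(pixels, contrast=1.0, brightness=0):
    """Linear contrast/brightness enhancement on 8-bit pixel values:
    out = clamp(contrast * p + brightness, 0, 255)."""
    return [max(0, min(255, round(contrast * p + brightness))) for p in pixels]

row = [10, 100, 200, 250]
enhanced = adjust(row, contrast=1.2, brightness=15)  # bright pixels clamp at 255
```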
  • the analysis of the images is performed so as to automatically detect pathological markers of retinopathy, for example through detection of significant retinal changes over time (e.g., as presented in a sequence of images of a patient), detection, identification and quantification of microaneurysms, and/or detection and identification of other relevant pathological markers of retinopathy. Detecting significant retinal changes over time generally involves automatically detecting changes that manifest between different images, indicating either progression or regression of pathological markers (e.g., microaneurysms) of retinopathy.
  • microaneurysms may be automatically detected and identified in various patient images.
  • the system may suggest a follow-up/referral to an ophthalmologist in accordance with a decision tree or other diagnostic aid.
  • embodiments of the present system are configured to compute clinically relevant information such as microaneurysm formation rates, microaneurysm disappearance rates, and microaneurysm activity levels, for example by means of the number of microaneurysms at each visit, the number of new microaneurysms formed during a follow-up period and the number of microaneurysms that suffered occlusion during the follow-up period.
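The turnover indicators above can be expressed with simple set arithmetic once microaneurysms have been matched across two visits. The matching step and the exact rate formulas are assumptions in this sketch; the patent does not fix them:

```python
def turnover(visit_a, visit_b, years_between):
    """Compute microaneurysm turnover between two visits.
    visit_a and visit_b are sets of identifiers for microaneurysms matched
    across the co-registered images (the matching itself is out of scope)."""
    new = visit_b - visit_a        # formed during the follow-up period
    occluded = visit_a - visit_b   # disappeared during the follow-up period
    formation_rate = len(new) / years_between      # microaneurysms per year
    disappearance_rate = len(occluded) / years_between
    activity = len(new) + len(occluded)            # one possible activity level
    return formation_rate, disappearance_rate, activity

a = {"ma1", "ma2", "ma3"}
b = {"ma2", "ma3", "ma4", "ma5"}
form, disap, act = turnover(a, b, years_between=1.0)
```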
  • the present invention may include an analysis tool that has a log-in module (which may accept user credentials such as a user name and password), a main page, a patient module, a patient dashboard, and an image explorer.
  • the main page permits access to a user dashboard, which tracks analysis jobs either in progress or those that have been already processed but have not yet been reviewed, a patient list, a search page and a new patient registration module.
  • Each patient has a dedicated patient dashboard, which acts as a repository for associated information (e.g., clinical histories and demographic information), images and analyses for the patient and allows a user to explore those images and analyses in detail.
  • the patient information can be displayed in any convenient fashion, for example in a collapsed or expanded view.
  • the patient dashboard also provides the user an opportunity to choose/view information such as any detected significant retinal changes over time. This may include the detection, identification and quantification of microaneurysms and/or other relevant pathological markers of retinopathy.
  • a user can submit new images to be processed as part of an analysis and can have that analysis information returned in a convenient, timeline view.
  • the image explorer is configured to permit a user to view the images in detail. Views may be configured to include one, two or multiple images, allowing co-registered side-by-side comparisons of images taken at different times, etc. Full screen and partial screen viewing can be accommodated. Both the image explorer and the analysis explorer have associated tool sets, allowing a user to mark up or comment on an image or analysis. Such markups or comments are stored as separate layers with the associated image. Tools include zoom features (e.g., magnifying lenses), pointers (which allow selection and repositioning of objects), pencils (which permit drawing), commenting fields for text, lesion identifiers, and measuring tools to indicate distances.
  • a calibration feature allows for accurate scaling, for example with reference to the optic disc and fovea (both of which are automatically detected and may be manually confirmed by the user). Different highlights, colors, display options and screen choices (e.g., backgrounds and the like) can also be used. The comments and/or markups can be viewed, edited and/or deleted according to user wishes.
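One way such calibration could work is to derive a millimetres-per-pixel scale from the detected optic disc and fovea centers. The 4.5 mm disc-to-fovea distance used below is a commonly cited anatomical approximation, not a value given in the patent:

```python
import math

# Assumed anatomical landmark distance; treat as an approximation.
DISC_TO_FOVEA_MM = 4.5

def mm_per_pixel(disc_center, fovea_center, landmark_mm=DISC_TO_FOVEA_MM):
    """Derive an image scale from the automatically detected (and user-confirmed)
    optic disc and fovea centers, given in pixel coordinates."""
    px = math.dist(disc_center, fovea_center)
    return landmark_mm / px

def measure_mm(p1, p2, scale):
    """Convert a pixel distance between two points into millimetres."""
    return math.dist(p1, p2) * scale

scale = mm_per_pixel((100, 200), (400, 200))           # 300 px apart
lesion_mm = measure_mm((150, 210), (250, 210), scale)  # 100 px lesion span
```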
  • FIG. 1 illustrates an example of a computer system in which embodiments of the present invention may be instantiated.
  • FIG. 2 illustrates an example of an image processing system configured according to an embodiment of the present invention.
  • FIG. 3 illustrates an example of a program flow for a user of an instantiation of the present invention, when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 4 illustrates an example of a log-in screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 5 illustrates an example of a main page for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 6 illustrates an example of a user dashboard for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 7 illustrates an example of a patient record selected from a patient list for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 8 illustrates an example of a patient dashboard for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 9 illustrates an example of a microaneurysm analysis output screen for an instantiation of the present invention, in particular for the detection and identification of microaneurysms, when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 10 illustrates an example of an analysis explorer screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 11 illustrates an example of the use of a calibration tool for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 12 illustrates an example of an image explorer screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1 .
  • FIG. 13 illustrates an example of a differences analysis output screen for an instantiation of the present invention, in particular for the detection and identification of differences between a baseline image and images of an image set, when executing on a computer system such as that illustrated in FIG. 1 .
  • Described herein are systems and methods for analyzing digital color fundus images. In brief, the present system allows users to import digital color fundus images over time and group such images for processing so as to generate analyses.
  • Analyses are information modules based on a selected group of images and, optionally, related information.
  • An analysis tool allows users to view and manipulate the analyses via a graphical user interface for aid in identifying and classifying microaneurysms and other features observable in the images, and, more generally, to allow for the detection of retinal changes over time.
  • microaneurysms are detected in identified locations, allowing the system to compute microaneurysm turnover indicators such as formation and disappearance rates. Additionally, the present system automatically highlights differences between retinographies and clearly identifies changes that are difficult for the human eye to detect.
  • the system includes an analysis explorer that includes a tool set, allowing a user to document, draw over, and comment on images, as well as an on-screen dashboard. A variety of reports can be created and images archived.
  • Various embodiments of the present invention may be implemented with the aid of computer-implemented processes or methods (i.e., computer programs or routines) that may be rendered in any computer language including, without limitation, C#, C/C++, Matlab, assembly language, markup languages, and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ and the like.
  • the present invention can be implemented with an apparatus to perform the operations described herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a computer system that has been selectively activated or configured by a computer program executed by the computer system.
  • a computer program may be stored in/on a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable ROMs (EPROMs), electrically erasable and programmable ROMs (EEPROMs), magnetic or optical cards, or any type of tangible media suitable for storing electronic instructions, and each readable by a computer system processor or the like.
  • FIG. 1 shows a computer system 100 upon which an embodiment of the invention may be implemented.
  • Computer system 100 includes a bus 102 or other communication mechanism for communicating information, and a processor 104 coupled with the bus 102 for executing the computer software which is an embodiment of the invention and for processing information (such as digital color fundus images) in accordance therewith.
  • Computer system 100 also includes a main memory 106 , such as a RAM or other dynamic storage device, coupled to the bus 102 for storing information and instructions to be executed by processor 104 .
  • Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104 .
  • Computer system 100 further includes a ROM or other static storage device 108 coupled to the bus 102 for storing static information and instructions for the processor 104 .
  • a storage device 110 such as a hard drive or solid state storage device, is provided and coupled to the bus 102 for storing information and instructions.
  • Computer system 100 may be coupled via the bus 102 to a display 112 , such as a liquid crystal display (LCD) or other display device, for displaying information to a user.
  • An input device 114 is coupled to the bus 102 for communicating information and command selections to the processor 104 .
  • Another type of user input device is cursor control 116, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 104 and for controlling cursor movement on the display 112.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y) allowing the device to specify positions in a plane.
  • Computer system 100 also includes a communication interface 118 coupled to the bus 102 .
  • Communication interface 118 provides two-way data communication between computer system 100 and other devices, for example via a local area network (LAN) 120. These communications may take place over wired and/or wireless communication links.
  • communication interface 118 sends and receives electrical, electromagnetic or optical signals which carry digital data streams representing various types of information.
  • two or more computer systems 100 may be networked together in a conventional manner with each using the communication interface 118 .
  • Network 120 may also provide communication to one or more networks and other devices associated with those networks.
  • network 120 may provide a connection to a server 122 (which may store images for processing) and/or to data equipment operated by an Internet Service Provider (ISP) 124 .
  • ISP 124 in turn provides data communication services through the world wide communication network now commonly referred to as the “Internet” 126 , through which remote computer systems such as remote server 128 may be accessed.
  • System 200 includes a recording module 202 configured to allow digital color fundus images to be added to an image database 204 over time.
  • the database 204 is located locally, at a personal computer on which an instantiation of system 200 is executing, or may be located remotely for example at a network server 122 or a remote server 128 .
  • System 200 also includes an analysis tool 206 which manages the input and output of information from/to a user interface 208 .
  • the user interface 208 may be presented via a display, such as display 112 , and may be adapted to receive alphanumeric and/or cursor control inputs via a keyboard 114 and/or a cursor control device 116 .
  • the user interface is adapted, as discussed in detail below, to display information to a user and collect inputs and instructions from a user, said instructions generally relating to the manipulation and control of images to allow for noninvasive diagnosis for retinopathies.
  • Associated with analysis tool 206, more specifically with its patient dashboard module 220, is an image set module 210, which is configured to permit grouping of digital color fundus images as part of the analysis thereof.
  • the actual analysis is performed by a processing module 212 , configured to generate information to aid in diagnostic activities based on a selected group of the images.
  • recording module 202 allows for digital color fundus images to be imported and stored in database 204 .
  • various information may be imported and stored with the images. For example, information such as the date the image was captured, an indication of whether the image is of a right or left eye, the kind of equipment used to capture the image, the field and angle of acquisition, and whether or not mydriasis was used at the time the image was captured, can all be recorded and associated with each image as part of database 204. Such information is recorded at the same time as the image is imported.
  • image set module 210 is adapted to group images either manually (through user direction via interface 208 ), or automatically.
  • images are grouped, patient-by-patient, in a chronological order with one image being selected as a reference, a time interval defined, and all images with a date indicator falling within that interval being grouped automatically.
  • the processing module includes sub-modules for image pre-processing 214 , image co-registration 216 and information analysis 218 .
  • Details of image pre-processing and co-registration are provided in co-pending application Ser. No. 12/629,661, filed on even date herewith.
  • the image pre-processing sub-module 214 is adapted to enhance digital color fundus images by applying changes in contrast and brightness.
  • Co-registration sub-module 216 is adapted to co-register the pre-processed images so that the features in the images are aligned with one another across different images.
  • the information analysis sub-module 218 is adapted to detect pathological markers of retinopathy by detecting significant retinal changes over time; detecting, identifying and quantifying microaneurysms that appear in the images; and/or detecting and identifying other relevant pathological markers of retinopathy.
  • detecting significant retinal changes over time involves automatically detecting differences that exist between different images of a common eye, indicating either progression or regression of pathological markers of retinopathy.
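After co-registration, change detection can be as simple as thresholding per-pixel intensity differences between a baseline image and a follow-up image. This toy sketch (plain nested lists instead of real image arrays, with a hypothetical threshold) illustrates the idea only:

```python
def changed_pixels(baseline, follow_up, threshold=30):
    """Flag pixels whose intensity differs by more than `threshold` between two
    co-registered grayscale images (2-D lists of 0-255 values). Clusters of
    flagged pixels would indicate progression or regression of pathological
    markers such as microaneurysms."""
    changes = []
    for y, (row_a, row_b) in enumerate(zip(baseline, follow_up)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                changes.append((x, y))
    return changes

before = [[50, 52, 51],
          [49, 50, 53]]
after  = [[50, 120, 51],   # one pixel changed markedly: a candidate new lesion
          [49, 50, 55]]
flagged = changed_pixels(before, after)
```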
  • Detecting and identifying microaneurysms involves automatically detecting and identifying microaneurysms in the various images.
  • the information analysis sub-module 218 is further adapted to permit exploration of the information obtained from the analysis of the digital color fundus images by computing the position and areas of detected pathological signs of retinopathy and, in some instances, suggest recommended referral/follow-up actions in accordance with a decision tree. Computation of the position and areas of detected pathological signs of the retinopathy involves computation of the position and surface area of, for example, microaneurysms detected in the images.
  • Clinically relevant information that can be obtained in this fashion includes microaneurysm formation rate; microaneurysm disappearance rate; and microaneurysm activity level (for example by determining the number of microaneurysms at each visit/image capture, the number of new microaneurysms during a prescribed follow-up period and the number of microaneurysms that suffered occlusion during the follow-up period).
  • analysis tool 206 receives as inputs images added to database 204 by the user and returns information gleaned from those images as determined by processing module 212 .
  • the analysis tool includes a variety of sub-modules, such as a log-in or registration module 224 , a main page 226 , a patient module 222 , a patient dashboard 220 , and an image explorer 228 .
  • the log-in/registration module 224 is configured to restrict access to system 200 by authorized users, e.g., those that have a valid user name/password combination, which can be entered via the user interface 208 .
  • the main page 226 is displayed via the user interface and provides a variety of options, including a jobs dashboard, which includes a list of submitted jobs (i.e., analysis projects) that are in progress, waiting in queue, have failed or have finished successfully but are as-yet unreviewed; a patient tab; a search tab, which allows the user to search for a particular patient, rather than selecting one from a default list displayed in the patient tab; and a “new” tab, to add new patients to the system.
  • the various sub-modules or tabs when selected, may launch new windows in the user interface which facilitate user interaction with the features provided by those sub-modules.
  • the new tab provides facilities for the user to create new patient records, which can contain associated images and information and any analyses of those images.
  • Patient module 222 is the collection of patient files, accessed by means of selection through the patient or search tab in the main page.
  • information and analyses pertaining to a respective individual patient are displayed via user interface 208 for user review and interaction.
  • Patient information may include general identification information and a brief clinical history of the respective patient.
  • An expanded/collapsed viewing scheme allows the user to review only that information which is of interest at the time.
  • the user may create, edit and delete aspects of a respective patient's records, and can also launch the patient dashboard 220 , populated with images and information related to the selected patient.
  • Patient dashboard 220 comprises a display of images, analysis results and in-progress analyses related to the selected patient.
  • the dashboard is configured to provide the user the opportunity to choose various information analyses, on a right eye/left eye basis, such as the detection of significant retinal changes over time; the detection, identification and quantification of microaneurysms; and the detection and identification of other relevant pathological markers of retinopathy. Further, via the dashboard a user can submit new images (right eye and/or left eye) to be processed.
  • Image explorer 228 provides a separate window in user interface 208 in which a user may view a full-screen version of digital color fundus images.
  • the images may be displayed individually, two at a time, or even four at a time in a common screen. This multi-image display capability allows a user to review how images of the same eye, taken at different times, compare with one another, and thereby aids in the doctor's diagnostic process. Images can be selected for display in time sequence or in another order selected by the user. Such selection may be made by dragging and dropping representations of the images into the image explorer. Images can be seen in the image explorer if opened directly from the image database (in which case only one image is visible) or opened through an analysis (in which case the images of that analysis may be compared).
  • the image explorer also includes a variety of image review tools, such as a zoom capability (e.g., a magnifying glass object that can be dragged over portions of the image); various drawing and marking tools (e.g., pointers which allow for selection and repositioning of objects, pencils which allow for free drawing by a user, text and commenting tools, rulers or other measuring devices, lesion identifiers, and calibration instruments which correspond to a user's validation of automated detection of features such as the optic disc and fovea within a respective image); and various overlay tools, allowing different background images to be overlaid with analysis results, comments, drawings, matrices showing principal components or axes of the eye, and colors used to highlight items of significance, such as lesions and the like.
  • Having thus provided an overview of system 200, a detailed discussion of the operation of its various modules will now be presented. This will be done from the point of view of a user of system 200, and will include a presentation of various screens of user interface 208.
  • This user-oriented discussion is intended to provide an example of the use of system 200, but the system need not be used in the order presented.
  • the discussion assumes that a software instantiation of system 200 has been installed on a compatible computer system, such as computer system 100 shown in FIG. 1, and that the software is running on that system. This may require the installation of a conventional operating system, such as the Windows™ operating system available from Microsoft Corporation of Redmond, Wash., or another operating system, and other conventional components of a computer system, such as a browser, etc.
  • An example of a log-in screen 400 is shown in FIG. 4.
  • Log-in screen 400 includes text boxes 402 and 404 that permit a user to enter a user name and password, respectively, and a button 406 .
  • When the user activates button 406, e.g., using cursor control device 116 or keyboard 114, the log-in process takes place.
  • system 200 may be configured to run without requiring the user to successfully navigate any identification challenge. Where such identification is required, however, a log-out facility may also be provided so that users can secure system 200 when it is not in use. In some cases, the user may be automatically logged-out after a defined period of inactivity in which system 200 does not receive any user input.
  • the main page module 304 is displayed for the user, preferably in the form of a menu.
  • An example of a main page menu 500 is shown in FIG. 5 .
  • this menu is displayed, either fully or in a minimized fashion, in a reserved area of the screen or as an overlay over a currently displayed window. This allows the user to always have the option of navigating between the components accessed through the main page, i.e., the user dashboard 306 , patient list 308 , search page 310 and new patient page 312 .
  • The user dashboard 306 displays those analyses which a user has submitted but which have not yet been reviewed.
  • FIG. 6 presents an example of a user dashboard screen 600 .
  • An analysis status menu 602 is presented. This menu includes, for each analysis, a patient identification (ID) 604, including age and gender, an analysis ID (including analysis type) 606, and the date and time 608 when the analysis was submitted. Analyses generally require some time to be processed, and so it is expected that users will queue up several analyses (image comparison jobs) to be processed in the background while other tasks (such as the review of individual images or of previously submitted analyses) are performed in the foreground.
  • System 200 will alert the user, via an appropriately displayed analysis status message 610 in the user dashboard, when an analysis is ready for review.
  • Such review can be undertaken by selecting the completed analysis from the user dashboard.
  • Once reviewed, an analysis is removed from the user dashboard, but it remains available in the respective patient dashboard for the patient to which it pertains.
  • The user dashboard also lists analyses which are still being processed, those which are awaiting processing, and those which may have stopped processing prematurely due to errors in co-registration or otherwise.
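  • The dashboard's mix of queued, processing, completed and errored analyses suggests a small job-tracking structure. The following Python sketch is purely illustrative (the class and status names are assumptions); it models the life cycle described above, including removal from the user dashboard once an analysis is reviewed.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AnalysisJob:
    """One queued image-comparison job, as listed in the user dashboard."""
    patient_id: str
    analysis_type: str                     # e.g. "microaneurysm" or "differences"
    submitted: datetime = field(default_factory=datetime.now)
    status: str = "queued"                 # queued -> processing -> ready | error

class UserDashboard:
    """Holds submitted analyses until they are reviewed."""

    def __init__(self):
        self.jobs = []

    def submit(self, job):
        self.jobs.append(job)
        return job

    def pending(self):
        """Jobs still queued, processing, or stopped by an error."""
        return [j for j in self.jobs if j.status in ("queued", "processing", "error")]

    def ready_for_review(self):
        """Jobs for which an analysis status message would be displayed."""
        return [j for j in self.jobs if j.status == "ready"]

    def review(self, job):
        """Reviewing removes the job from the user dashboard; in the described
        system it would remain in the respective patient dashboard."""
        self.jobs.remove(job)
```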
  • The patient list 308 allows access to individual patient images and information in the form of a patient record.
  • An example of a patient list screen 700 is shown in FIG. 7 .
  • An alphabetical list 702 of the user's patients is presented by default.
  • Each record includes the patient ID 704 , the date of the patient's last visit 706 , and any known, relevant clinical information 708 .
  • The list can be sorted in a variety of ways 710 to suit the user's requirements.
  • Any associated actions can be revealed or hidden 722.
  • A user can access a respective patient record 712, edit the patient's demographic data 714, add or remove patients from the list 716, and also filter the patient list using a search feature 718.
  • The user may use the search page 310 to search for a particular patient.
  • New patient records can be added using the new patient module 312 . Selecting a patient record in any of these fashions opens the respective patient dashboard 314 .
  • An example of a patient dashboard 800 is shown in FIG. 8.
  • The patient dashboard displays an overview of the analyses 802 (corresponding to the information module) for the selected patient.
  • This information is laid out in timeline-style modules for both right and left eyes: images and analyses from both eyes are displayed one above the other, for example organized by date of image capture. Users can navigate between the data displayed in these modules using cursor control keys on a keyboard or a cursor control device.
  • A patient dashboard may display a scrollable timeline 804 containing OD (right eye) and OS (left eye) microaneurysm analyses, a scrollable timeline 806 containing OD and OS differences analyses, an OD and OS image database (not shown in this illustration but selectable at the user's option) displaying previously imported OD and OS images, and a jobs module (not shown in this illustration) displaying analyses submitted but not yet ready for the selected patient.
  • Each patient dashboard thus provides an image database for the associated patient, i.e., an archive of the respective patient's available images, organized in a scrollable timeline containing images from OD and OS.
  • To add images to a patient's image database, a user needs only to access that patient's dashboard, e.g., from the patient list, and select an "add image" button 810.
  • This will present the user with an appropriate screen where the user will be prompted for images he/she wishes to import from database 204 or from external sources. Images may be selected for import in any of a variety of fashions, for example by selecting associated checkboxes or by dragging the images into an appropriate portion of the screen.
  • A user may be prompted to provide additional details such as the date of the image (i.e., the date it was captured), the field of view, whether it is an OS or OD image, the angle, etc.
  • Microaneurysm analyses examine image sets for microaneurysms that appear within the constituent images. Microaneurysm determinations can be made by comparing features found in an image under consideration with a reference pattern. While the microaneurysm analysis is run on an image-by-image basis, results are reported with respect to image sets so that microaneurysm turnover rates can be computed and reported. Microaneurysm detection is an automated process. Differences analyses examine sets of images against a designated baseline or reference image to determine differences between the baseline and another image.
  • Differences detections can reveal features such as hemorrhages, drusen, exudates, etc. Differences detections result in differences being highlighted, but no automated characterization of the difference is provided. Instead, it is left to the user to examine the highlighted difference and decide what it represents.
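  • A minimal sketch of this differences-highlighting idea follows, assuming co-registered 8-bit grayscale images of equal size and an illustrative intensity threshold; the description does not specify the actual comparison method, so both the threshold and the function names are assumptions.

```python
import numpy as np

def difference_mask(baseline, image, threshold=25):
    """Return a boolean mask of pixels whose intensity changed by more than
    `threshold` between the baseline and the comparison image. The images
    are assumed to be co-registered 8-bit grayscale arrays of equal shape."""
    diff = np.abs(image.astype(np.int16) - baseline.astype(np.int16))
    return diff > threshold

def highlight(image, mask, value=255):
    """Overlay the detected differences on a copy of the image, leaving
    classification of each highlighted region to the reviewing user."""
    out = image.copy()
    out[mask] = value
    return out
```

Note that, consistent with the description, the sketch only marks where a change occurred; deciding whether a highlighted region is a hemorrhage, drusen, exudate, etc. remains the user's task.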
  • To run a microaneurysm analysis for a patient, it is preferable to use at least three representative images taken over a minimum twelve-month period.
  • The user selects a "new analysis" button 812, and in response a user interface screen is presented which permits the user to select the images to be used in the analysis.
  • The user can select any number of images, preferably three or more (per eye), for the analysis, and both OS and OD analyses can be run simultaneously.
  • Once the images are selected, the user can instruct the analysis module to begin processing the new analysis.
  • In some instances, the analysis output includes the image set used for the analysis, the microaneurysm count for each such image, and the computed microaneurysm turnover rates (microaneurysm formation and disappearance rates). In other instances, the analysis will compare each selected image to a baseline image and highlight relevant differences.
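  • The turnover computation can be illustrated as follows, assuming each image's detections have been reduced (after co-registration) to a set of lesion identifiers, so that the same microaneurysm carries the same identifier across images. The per-interval averaging is an assumption made for illustration; the description does not fix an exact formula.

```python
def turnover(ma_sets):
    """Given a chronological list of sets of microaneurysm identifiers
    (one set per co-registered image), count formations and disappearances
    between consecutive images and return simple per-interval rates."""
    formed = disappeared = 0
    for prev, curr in zip(ma_sets, ma_sets[1:]):
        formed += len(curr - prev)        # new microaneurysms this interval
        disappeared += len(prev - curr)   # microaneurysms no longer visible
    intervals = max(len(ma_sets) - 1, 1)
    return {
        "counts": [len(s) for s in ma_sets],
        "formation_rate": formed / intervals,
        "disappearance_rate": disappeared / intervals,
    }
```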
  • To review a completed analysis, the user selects the appropriate patient (e.g., from the patient list) and enters his/her associated patient dashboard.
  • The available analyses will be presented in the scrollable timeline 804 containing OS and OD microaneurysm analyses and can be opened by appropriate selection (e.g., a double click on an analysis thumbnail 814). Once opened, the selected analysis will be presented for review in an analysis output screen 318, which may be presented in a separate window.
  • For a differences analysis, the process is similar. From the patient dashboard, the user selects a new differences analysis button 816 and identifies the image dataset which the user wishes to analyze. When the differences analysis is complete, the results are returned to the patient dashboard for review in the form of a differences analysis output. Selecting the differences analysis will then allow the user to review the results in a differences analysis output screen, which may be presented in a separate window. This is discussed in greater detail below, with reference to FIG. 13.
  • An analysis explorer 902 is provided within a microaneurysm analysis output screen 900, an example of which is shown in FIG. 9.
  • The analysis explorer contains a toolset to allow the user to document (e.g., using pencil and pointer tools), zoom in (using a magnifying lens tool), comment (using a text tool), measure (using a ruler tool) and compare all features in each individual image included in an analysis. These images can also be compared side-by-side, enhancing the user's ability to identify clinically significant features in the images and visually inspect the progression of those features in succeeding images.
  • Individual images from the image database which are not part of any analysis can also be explored, and, optionally, commented upon, using the analysis explorer (e.g., for comparison purposes).
  • Special tools allow the user to designate or mark (e.g., with appropriate icons) features as microaneurysms, hemorrhages, exudates, vascular caliber abnormalities, or drusen.
  • A calibration tool allows for fine tuning of the auto-detection of the optic disc and fovea in each image. As analyses are added to a patient's dashboard they are saved, allowing the user to return to previous analyses to compare results from various analyses over time. Analyses not yet ready for review can be monitored in the jobs module, which displays progress of queued jobs.
  • The system allows the user to interact with the automatically identified microaneurysms.
  • The user may reclassify microaneurysms as being pre-existing or nonexistent.
  • The user may also add microaneurysms using a tool designed for that purpose.
  • When a microaneurysm is manually marked, it is propagated, as an empty placeholder, through all the other images of the analysis.
  • The system also offers the user a tool (a set of navigation arrows that appear near the marked microaneurysm/placeholder) for fast, centered browsing through the referred placeholders, allowing the user to classify them as visible or not visible.
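  • The placeholder-propagation behavior can be sketched as follows. The class and method names, and the use of a (row, column) location as the co-registered key, are illustrative assumptions rather than details from the description.

```python
class Analysis:
    """Sketch of placeholder propagation: marking a microaneurysm in one
    image creates an unclassified placeholder at the same (co-registered)
    location in every other image of the analysis."""

    def __init__(self, image_ids):
        self.image_ids = list(image_ids)
        # marks[image_id][location] -> "visible" | "not visible" | None
        self.marks = {i: {} for i in self.image_ids}

    def mark_microaneurysm(self, image_id, location):
        self.marks[image_id][location] = "visible"
        for other in self.image_ids:
            if other != image_id:
                # empty placeholder awaiting the user's classification
                self.marks[other].setdefault(location, None)

    def classify(self, image_id, location, visible):
        self.marks[image_id][location] = "visible" if visible else "not visible"

    def unclassified(self):
        """Placeholders the navigation arrows would step through."""
        return [(i, loc) for i in self.image_ids
                for loc, v in self.marks[i].items() if v is None]
```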
  • Each microaneurysm analysis is provided a unique ID 904 to permit its identification.
  • The individual images 906 that make up the set of images used to compute the analysis are provided. As shown, these images are arranged in a timeline fashion, allowing the user to observe changes over time.
  • The computed differences 908 between the images are displayed. These differences highlight microaneurysms which were located in the images.
  • Factors such as microaneurysm turnover rates are presented both numerically and graphically (910 and 912), and users can switch between outputs and analysis documentation 914, displayed on the right side of the screen. Individual images can also be explored further by activating the image explorer 916.
  • An example of the analysis explorer screen 1000 is shown in FIG. 10. Illustrated alongside the image under consideration are icons representing the toolset 1002 discussed above, and examples of the commenting and other outputs that can be produced using these tools are shown. For example, in this screen, a user has indicated one microaneurysm 1004, hemorrhages 1006, exudates 1008, drusen 1010, and VCAs 1012. The user has also annotated the image with comments 1014. An example of measuring tool 1016 is also shown. Users can manipulate the measuring tool by dragging and dropping the ends of the ruler onto locations of interest in the image.
  • FIG. 11 illustrates an example of the use of the calibration tool.
  • A set of axes 1104 is displayed, overlaying the image under consideration.
  • The system automatically detects the optic disc and fovea center and positions the axes accordingly. This positioning needs to be confirmed by the user.
  • The calibration tool is used in conjunction with the measurement tool.
  • The system assumes the optic disc diameter to be 1500 μm in size and calibrates the measurement tool accordingly. If the user updates the calibration, the referenced measurements are adjusted accordingly and the changes are saved.
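  • This calibration step can be sketched directly from the stated 1500 μm optic disc assumption: the detected disc diameter in pixels yields a scale factor that the measurement tool applies to ruler distances. The function names below are illustrative.

```python
import math

OPTIC_DISC_DIAMETER_UM = 1500  # physical diameter assumed by the system

def microns_per_pixel(disc_diameter_px):
    """Derive the image scale from the detected optic disc diameter."""
    return OPTIC_DISC_DIAMETER_UM / disc_diameter_px

def measure_um(p1, p2, disc_diameter_px):
    """Distance in micrometers between two ruler endpoints (pixel coords)."""
    dist_px = math.dist(p1, p2)
    return dist_px * microns_per_pixel(disc_diameter_px)
```

For example, if the detected optic disc spans 300 pixels, each pixel corresponds to 5 μm, and a 60-pixel ruler reads 300 μm.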
  • FIG. 12 illustrates an example of an image explorer screen 1200, this one being configured for co-registered, side-by-side display of multiple images. From this screen, any image can be selected for full-screen view. Further, the magnifier lens tool 1202 is shown overlaying the images and provides a higher-resolution (i.e., zoomed) view, user-configurable using a mouse wheel, of the portion of the image over which it is positioned. This allows for detailed comparison of common areas of different images, e.g., images that were captured at different times.
  • Users can enter comments in a text field 1204, switch between various screen layouts 1206 (allowing for single or multiple images to be viewed), turn on or off various information layers 1208 (such as commenting field 1204), and zoom in or out 1210 to review the images being displayed.
  • The analysis explorer contains a toolset similar to that described above, to allow the user to document (e.g., using pencil and pointer tools), zoom in (using a magnifying lens tool), comment (using a text tool), measure (using a ruler tool) and compare all features in each individual image included in an analysis.
  • Special tools allow the user to designate or mark (e.g., with appropriate icons) features as hemorrhages, exudates, vascular caliber abnormalities, or drusen.
  • A documentation area 1310 allows the user to review findings and annotate the analysis.
  • The difference information 1306 (as determined from a comparison of the subject image with the baseline or reference image) is shown.
  • A toggle control 1308 allows this information to be hidden or displayed, at the user's option.
  • Individual images from the analyzed image set can be viewed as thumbnails in the image panel 1312 , and the baseline image 1314 is also available. Below the images, thumbnails 1316 of the differences between the individual images and the baseline are presented.
  • A difference can be any change in morphology, color, brightness or contrast in the same region of an eye as between a reference image and a comparison image. Differences that are found during an analysis can be overlaid on top of the baseline image for easy reference.

Abstract

An image analysis system allows users to import digital color fundus images over time, and group such images for processing so as to generate analyses. Analyses are information modules based on a selected group of images and, optionally, related information. An analysis tool allows users to view and manipulate the analyses via a graphical user interface for aid in identifying and classifying microaneurysms and other symptoms related to retinopathy and, more generally, to allow for the detection of retinal changes over time.

Description

    RELATED APPLICATION
  • This is a CONTINUATION of and claims priority to U.S. patent application Ser. No. 12/629,798, filed 2 Dec. 2009, incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention is directed to systems and methods for the presentation and analysis of digital color fundus images (retinographies) to aid in the diagnosis of retinopathies, e.g., diabetic retinopathy and age-related macular degeneration, among others.
  • BACKGROUND
  • Diabetic retinopathy is a cause of blindness in diabetic patients, and so it is important to be able to screen patients for evidence of this disease. Such screening typically involves examination of digital color fundus images to identify different symptoms related to the disease: microaneurysms, hemorrhages, and vascular abnormalities, among others. In particular, the number, density and locations of microaneurysms are important factors in quantifying the progression of diabetic retinopathy in its early stages.
  • SUMMARY OF THE INVENTION
  • In order to address the above-described problems, systems and methods for the display and analysis of digital color fundus images to aid in the diagnosis of diabetic retinopathy and other pathologies are provided. In one embodiment, the system includes a record module configured to permit digital color fundus images taken at different times to be imported; an image set module configured to group two or more of said images (e.g., on an eye-by-eye and/or patient-by-patient basis) as part of an analysis; a processing module configured to generate analyses results for selected groups of said images; and an analysis tool set configured to permit viewing of the analysis results, as well as the images, on an eye-by-eye basis and to collect inputs and instructions relating to said analyses and images from a user.
  • In various instantiations of the invention, the record module allows each image to be associated with various information, including some or all of the following indicators: a date on which the respective image was captured, whether the image is of a right or left eye, the kind of equipment used to capture the image, the field and angle of acquisition, and whether or not mydriasis was used at the time the image was acquired. The image set module is adapted to group selected images either manually (the system automatically arranges them in a chronological manner), or with one image being selected as a reference and images within a designated time interval being grouped with respect to that reference automatically.
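  • The reference-based grouping can be sketched as a date-window filter; the data representation (a list of identifier/date pairs) is an illustrative assumption.

```python
from datetime import date, timedelta

def group_around_reference(images, reference, interval_days):
    """Select all images captured within `interval_days` of the reference
    image's capture date, returned in chronological order. `images` is a
    list of (image_id, capture_date) pairs."""
    ref_date = dict(images)[reference]
    window = timedelta(days=interval_days)
    group = [(i, d) for i, d in images if abs(d - ref_date) <= window]
    return sorted(group, key=lambda pair: pair[1])
```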
  • The processing module is configured to permit image pre-processing and co-registration and various information analyses. Images may be pre-processed prior to analysis through enhancements in contrast and/or brightness. The analysis of the images is performed so as to automatically detect pathological markers of retinopathy, for example through detection of significant retinal changes over time (e.g., as presented in a sequence of images of a patient), detection, identification and quantification of microaneurysms, and/or detection and identification of other relevant pathological markers of retinopathy. Detecting significant retinal changes over time generally involves automatically detecting changes that manifest between different images, indicating either progression or regression of pathological markers (e.g., microaneurysms) of retinopathy. For example, microaneurysms may be automatically detected and identified in various patient images. In addition, once pathological markers of retinopathy have been detected in the images, the system may suggest a follow-up/referral to an ophthalmologist in accordance with a decision tree or other diagnostic aid.
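  • One simple form of the contrast enhancement mentioned for pre-processing is a linear percentile stretch, sketched below. The percentile choice is an illustrative assumption; the description does not specify the enhancement method.

```python
import numpy as np

def stretch_contrast(image, low_pct=1, high_pct=99):
    """Linearly rescale pixel intensities so the chosen percentiles map to
    the full 8-bit range, increasing contrast in a fundus image before
    analysis. Input and output are 8-bit grayscale arrays."""
    img = image.astype(np.float64)
    lo, hi = np.percentile(img, [low_pct, high_pct])
    if hi <= lo:
        return image.copy()  # degenerate image; nothing to stretch
    out = (img - lo) * 255.0 / (hi - lo)
    return np.clip(out, 0, 255).astype(np.uint8)
```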
  • In addition to presenting the existence of pathological markers of retinopathy, embodiments of the present system are configured to compute clinically relevant information such as microaneurysm formation rates, microaneurysm disappearance rates, and microaneurysm activity levels, for example by means of the number of microaneurysms at each visit, the number of new microaneurysms formed during a follow-up period and the number of microaneurysms that suffered occlusion during the follow-up period. Such computations are made on each analysis and are available for review, along with the images themselves and the analysis results, by the user.
  • Where instantiated as computer software, the present invention may include an analysis tool that has a log-in module (which may accept user credentials such as a user name and password), a main page, a patient module, a patient dashboard, and an image explorer. The main page permits access to a user dashboard, which tracks analysis jobs that are either in progress or have been processed but not yet reviewed, a patient list, a search page and a new patient registration module. Each patient has a dedicated patient dashboard, which acts as a repository for associated information (e.g., clinical histories and demographic information), images and analyses for the patient and allows a user to explore those images and analyses in detail. The patient information can be displayed in any convenient fashion, for example in a collapsed or expanded view.
  • The patient dashboard also provides the user an opportunity to choose/view information such as any detected significant retinal changes over time. This may include the detection, identification and quantification of microaneurysms and/or other relevant pathological markers of retinopathy. Through the patient dashboard a user can submit new images to be processed as part of an analysis and can have that analysis information returned in a convenient, timeline view.
  • The image explorer is configured to permit a user to view the images in detail. Views may be configured to include one, two or multiple images, allowing co-registered side-by-side comparisons of images taken at different times, etc. Full screen and partial screen viewing can be accommodated. Both the image explorer and the analysis explorer have associated tool sets, allowing a user to mark up or comment on an image or analysis. Such markups or comments are stored as separate layers with the associated image. Tools include zoom features (e.g., magnifying lenses), pointers (which allow selection and repositioning of objects), pencils (which permit drawing), commenting fields for text, lesion identifiers, and measuring tools to indicate distances. A calibration feature allows for accurate scaling, for example with reference to the optic disc and fovea (both of which are automatically detected and may be manually confirmed by the user). Different highlights, colors, display options and screen choices (e.g., backgrounds and the like) can also be used. The comments and/or markups can be viewed, edited and/or deleted according to user wishes.
  • These and other features of the present invention are discussed in greater detail below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is illustrated by way of example, and not limitation, in the figures of the accompanying drawings, in which:
  • FIG. 1 illustrates an example of a computer system in which embodiments of the present invention may be instantiated; and
  • FIG. 2 illustrates an example of an image processing system configured according to an embodiment of the present invention.
  • FIG. 3 illustrates an example of a program flow for a user of an instantiation of the present invention, when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 4 illustrates an example of a log-in screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 5 illustrates an example of a main page for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 6 illustrates an example of a user dashboard for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 7 illustrates an example of a patient record selected from a patient list for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 8 illustrates an example of a patient dashboard for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 9 illustrates an example of a microaneurysm analysis output screen for an instantiation of the present invention, in particular for the detection and identification of microaneurysms, when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 10 illustrates an example of an analysis explorer screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 11 illustrates an example of the use of a calibration tool for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 12 illustrates an example of an image explorer screen for an instantiation of the present invention when executing on a computer system such as that illustrated in FIG. 1.
  • FIG. 13 illustrates an example of a differences analysis output screen for an instantiation of the present invention, in particular for the detection and identification of differences between a baseline image and images of an image set, when executing on a computer system such as that illustrated in FIG. 1.
  • DETAILED DESCRIPTION
  • Described herein are systems and methods for analyzing digital color fundus images. In brief, the present system allows users to import digital color fundus images over time, and group such images for processing so as to generate analyses. Analyses are information modules based on a selected group of images and, optionally, related information. An analysis tool allows users to view and manipulate the analyses via a graphical user interface for aid in identifying and classifying microaneurysms and other features observable in the images, and, more generally, to allow for the detection of retinal changes over time.
  • In embodiments of the invention, microaneurysms are detected in identified locations, allowing the system to compute microaneurysm turnover indicators such as formation and disappearance rates. Additionally, the present system automatically highlights differences between retinographies and clearly identifies changes that are difficult to detect with the human eye. The system includes an analysis explorer that includes a tool set, allowing a user to document, draw over and comment on images, as well as an on-screen dashboard. A variety of reports can be created and images archived.
  • Various embodiments of the present invention may be implemented with the aid of computer-implemented processes or methods (i.e., computer programs or routines) that may be rendered in any computer language including, without limitation, C#, C/C++, Matlab, assembly language, markup languages, and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ and the like. In general, however, all of the aforementioned terms as used herein are meant to encompass any series of logical steps performed in a sequence to accomplish a given purpose.
  • In view of the above, it should be appreciated that some portions of the detailed description that follows are presented in terms of algorithms and symbolic representations of operations on data within a computer system memory or other data store. These algorithmic descriptions and representations are the means used by those skilled in the computer science arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring manipulations of representations of physical items, such as performing enhancements or other operations involving fundus images of the eye. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like. It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, it will be appreciated that throughout the description of the present invention, use of terms such as "processing", "computing", "calculating", "determining", "displaying" or the like, refer to the action and processes of a computer system or systems, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within its registers and memories into other data similarly represented as physical quantities within its memories or registers or other such information storage, transmission or display devices.
  • The present invention can be implemented with an apparatus to perform the operations described herein. This apparatus may be specially constructed for the required purposes, or it may comprise a computer system that has been selectively activated or configured by a computer program executed by the computer system. Such a computer program may be stored in/on a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable ROMs (EPROMs), electrically erasable and programmable ROMs (EEPROMs), magnetic or optical cards, or any type of tangible media suitable for storing electronic instructions, and each readable by a computer system processor or the like.
  • The algorithms and processes presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems (which become special purpose systems once appropriately programmed) may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required methods. For example, any of the methods according to the present invention can be implemented in hard-wired circuitry, by programming a general-purpose processor or by any combination of hardware and software. One of ordinary skill in the art will immediately appreciate that the invention can be practiced with computer system configurations other than those described below, including hand-held/mobile devices, multiprocessor systems, microprocessor-based and/or digital signal processor-based devices, and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. The required structure for a variety of these systems will appear from the description below.
  • Referring now to FIG. 1, an exemplary computer system 100 upon which an embodiment of the invention may be implemented is shown. Computer system 100 includes a bus 102 or other communication mechanism for communicating information, and a processor 104 coupled with the bus 102 for executing the computer software which is an embodiment of the invention and for processing information (such as digital color fundus images) in accordance therewith. Computer system 100 also includes a main memory 106, such as a RAM or other dynamic storage device, coupled to the bus 102 for storing information and instructions to be executed by processor 104. Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104. Computer system 100 further includes a ROM or other static storage device 108 coupled to the bus 102 for storing static information and instructions for the processor 104. A storage device 110, such as a hard drive or solid state storage device, is provided and coupled to the bus 102 for storing information and instructions.
  • Computer system 100 may be coupled via the bus 102 to a display 112, such as a liquid crystal display (LCD) or other display device, for displaying information to a user. An input device 114, including alphanumeric and other keys, is coupled to the bus 102 for communicating information and command selections to the processor 104. Another type of user input device is cursor control 116, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 104 and for controlling cursor movement on the display 112. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y) allowing the device to specify positions in a plane.
  • In embodiments of the invention which are instantiated as computer programs (i.e., computer-readable/executable instructions stored on a computer-readable medium), such programs are typically stored on storage device 110 and at run time are loaded into memory 106. Processor 104 then executes sequences of the instructions contained in main memory 106 to perform the steps described below. In alternative embodiments, dedicated circuitry or modules (or, in some cases, firmware-enabled application specific integrated circuits or specialized processors) may be used in place of or in combination with computer software instructions to implement the invention.
  • Computer system 100 also includes a communication interface 118 coupled to the bus 102. Communication interface 118 provides two-way data communication between computer system 100 and other devices, for example via a local area network (LAN) 120. These communications may take place over wired and/or wireless communication links. In any such implementation, communication interface 118 sends and receives electrical, electromagnetic or optical signals which carry digital data streams representing various types of information. For example, two or more computer systems 100 may be networked together in a conventional manner with each using the communication interface 118.
  • Network 120 may also provide communication to one or more networks and other devices associated with those networks. For example, network 120 may provide a connection to a server 122 (which may store images for processing) and/or to data equipment operated by an Internet Service Provider (ISP) 124. ISP 124 in turn provides data communication services through the worldwide communication network now commonly referred to as the “Internet” 126, through which remote computer systems such as remote server 128 may be accessed.
  • Referring to FIG. 2, an example of an image processing system 200 configured according to an embodiment of the present invention is illustrated. System 200 includes a recording module 202 configured to allow digital color fundus images to be added to an image database 204 over time. In one embodiment, the database 204 is located locally, at a personal computer on which an instantiation of system 200 is executing; alternatively, it may be located remotely, for example at a network server 122 or a remote server 128.
  • System 200 also includes an analysis tool 206 which manages the input and output of information from/to a user interface 208. In practice, the user interface 208 may be presented via a display, such as display 112, and may be adapted to receive alphanumeric and/or cursor control inputs via a keyboard 114 and/or a cursor control device 116. The user interface is adapted, as discussed in detail below, to display information to a user and collect inputs and instructions from a user, said instructions generally relating to the manipulation and control of images to allow for noninvasive diagnosis for retinopathies.
  • Associated with analysis tool 206, more specifically with its patient dashboard module 220, is an image set module 210, which is configured to permit grouping of digital color fundus images as part of the analysis thereof. The actual analysis is performed by a processing module 212, configured to generate information to aid in diagnostic activities based on a selected group of the images.
  • As mentioned above, recording module 202 allows for digital color fundus images to be imported and stored in database 204. Along with the images themselves, various information may be imported and stored with the images. For example, information such as the date the image was captured, an indication of whether the image is of a right or left eye, the kind of equipment used to capture the image, the field and angle of acquisition, and whether or not mydriasis was used at the time the image was captured, can all be recorded and associated with each image as part of database 204. Such information is recorded at the same time as the image is imported.
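The metadata listed above can be captured in a simple record at import time. The following sketch is illustrative only; the field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FundusImageRecord:
    """One imported digital color fundus image plus the metadata the
    recording module associates with it (field names are illustrative)."""
    image_path: str
    capture_date: date
    eye: str            # "OD" (right eye) or "OS" (left eye)
    equipment: str      # kind of equipment used to capture the image
    field_of_view: str  # field of acquisition
    angle_deg: int      # angle of acquisition, in degrees
    mydriasis: bool     # whether mydriasis was used at capture time
```

A record of this shape would be created by the recording module for each image as it is imported into database 204.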
  • As will be explained in greater detail below, analysis of the digital color fundus images requires images to be selected, the selected images comprising an analysis set, in order to observe differences between images of the same eye that were captured at different times. Accordingly, image set module 210 is adapted to group images either manually (through user direction via interface 208), or automatically. In one embodiment, images are grouped, patient-by-patient, in chronological order, with one image being selected as a reference, a time interval defined, and all images with a date indicator falling within that interval being grouped automatically.
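The automatic grouping described above (a reference image, a defined time interval, and all images falling within that interval) can be sketched as follows, assuming each image carries the capture date recorded at import time:

```python
from datetime import date, timedelta

def build_analysis_set(images, reference_date, interval_days):
    """Return an analysis set: the images (sorted chronologically) whose
    capture date falls within `interval_days` of the reference image's date.
    `images` is a list of dicts with at least a "date" key."""
    window = timedelta(days=interval_days)
    selected = [img for img in images
                if abs(img["date"] - reference_date) <= window]
    return sorted(selected, key=lambda img: img["date"])
```

In the manual mode described in the text, the user would instead pick the members of the set directly via interface 208.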
  • Once selected to define the analysis set, the images can be processed by processing module 212. The processing module includes sub-modules for image pre-processing 214, image co-registration 216 and information analysis 218. Examples of image pre-processing and co-registration are provided in co-pending application Ser. No. 12/629,661, filed on even date herewith. In general, the image pre-processing sub-module 214 is adapted to enhance digital color fundus images by applying changes in contrast and brightness. Co-registration sub-module 216 is adapted to co-register the pre-processed images so that the features in the images are aligned with one another across different images.
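The patent does not specify the enhancement algorithm used by pre-processing sub-module 214; a minimal sketch of the kind of contrast and brightness adjustment described, assuming grayscale intensities in the 0-255 range stored as lists of rows, might look like:

```python
def enhance(pixels, contrast=1.2, brightness=10):
    """Linear contrast/brightness enhancement: p' = clamp(contrast*p + brightness).
    `pixels` is a row-major list of rows of 0-255 intensity values.
    The default contrast/brightness values are illustrative assumptions."""
    def clamp(v):
        return max(0, min(255, int(round(v))))
    return [[clamp(contrast * p + brightness) for p in row] for row in pixels]
```

Co-registration sub-module 216 would then align the enhanced images (per the co-pending application referenced above) before any cross-image comparison.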
  • The information analysis sub-module 218 is adapted to detect pathological markers of retinopathy by detecting significant retinal changes over time; detecting, identifying and quantifying microaneurysms that appear in the images; and/or detecting and identifying other relevant pathological markers of retinopathy. For each individual patient, detecting significant retinal changes over time involves automatically detecting differences that exist between different images of a common eye, indicating either progression or regression of pathological markers of retinopathy. Detecting and identifying microaneurysms involves automatically detecting and identifying microaneurysms in the various images.
  • The information analysis sub-module 218 is further adapted to permit exploration of the information obtained from the analysis of the digital color fundus images by computing the positions and areas of detected pathological signs of retinopathy and, in some instances, suggesting recommended referral/follow-up actions in accordance with a decision tree. Computation of the positions and areas of detected pathological signs of the retinopathy involves computation of the position and surface area of, for example, microaneurysms detected in the images. Clinically relevant information that can be obtained in this fashion includes the microaneurysm formation rate; the microaneurysm disappearance rate; and the microaneurysm activity level (for example, by determining the number of microaneurysms at each visit/image capture, the number of new microaneurysms during a prescribed follow-up period and the number of microaneurysms that suffered occlusion during the follow-up period).
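The turnover quantities described above (count per visit, new microaneurysms, and occluded microaneurysms) can be computed from per-visit sets of microaneurysm identifiers. This sketch assumes microaneurysms have already been detected and matched across the co-registered images:

```python
def microaneurysm_turnover(visits):
    """`visits` is a chronologically ordered list of sets of microaneurysm
    identifiers observed at each visit.  Returns, per visit, the total count,
    the number newly formed since the prior visit, and the number that
    disappeared (suffered occlusion) since the prior visit."""
    stats = []
    for i, current in enumerate(visits):
        prev = visits[i - 1] if i > 0 else set()
        stats.append({
            "count": len(current),
            "new": len(current - prev) if i > 0 else 0,
            "disappeared": len(prev - current),
        })
    return stats
```

Dividing the "new" and "disappeared" counts by the length of the follow-up period would yield the formation and disappearance rates mentioned in the text.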
  • As shown in the illustration, analysis tool 206 receives as inputs images added to database 204 by the user and returns information gleaned from those images as determined by processing module 212. The analysis tool includes a variety of sub-modules, such as a log-in or registration module 224, a main page 226, a patient module 222, a patient dashboard 220, and an image explorer 228. The log-in/registration module 224 is configured to restrict access to system 200 by authorized users, e.g., those that have a valid user name/password combination, which can be entered via the user interface 208.
  • Upon a successful log-in, the main page 226 is displayed via the user interface and provides a variety of options, including a jobs dashboard, which includes a list of submitted jobs (i.e., analysis projects) that are in progress, waiting in queue, have failed or have finished successfully but are as-yet unreviewed; a patient tab; a search tab, which allows the user to search for a particular patient, rather than selecting one from the default list displayed in the patient tab; and a “new” tab, to add new patients to the system. The various sub-modules or tabs, when selected, may launch new windows in the user interface which facilitate user interaction with the features provided by those sub-modules. The new tab provides facilities for the user to create new patient records, which can contain associated images and information and any analyses of those images.
  • Patient module 222 is the collection of patient files, accessed by means of selection through the patient or search tab in the main page. Within the patient module, information and analyses pertaining to a respective individual patient are displayed via user interface 208 for user review and interaction. Patient information may include general identification information and a brief clinical history of the respective patient. An expanded/collapsed viewing scheme allows the user to review only that information which is of interest at the time. Within the patient module 222, the user may create, edit and delete aspects of a respective patient's records, and can also launch the patient dashboard 220, populated with images and information related to the selected patient.
  • Patient dashboard 220 comprises a display of images, analysis results and in-progress analyses related to the selected patient. The dashboard is configured to provide the user the opportunity to choose various information analyses, on a right eye/left eye basis, such as the detection of significant retinal changes over time; the detection, identification and quantification of microaneurysms; and the detection and identification of other relevant pathological markers of retinopathy. Further, via the dashboard a user can submit new images (right eye and/or left eye) to be processed.
  • Image explorer 228 provides a separate window in user interface 208 in which a user may view a full-screen version of digital color fundus images. The images may be displayed individually, or two at a time or even four at a time in a common screen. This multi-image display capability allows a user to review how images of the same eye, taken at different times, compare with one another and thereby aids in the doctor's diagnostic process. Images can be selected for display in time sequence or in another order selected by the user. Such selection may be made by dragging and dropping representations of the images into the image explorer. Images can be seen in the image explorer if opened directly from the image database (in which case only one image is visible) or opened through an analysis (in which case the images of that analysis may be compared).
  • The image explorer also includes a variety of image review tools, such as a zoom capability (e.g., using a magnifying glass object that can be dragged over portions of the image); various drawing and marking tools (e.g., pointers which allow for selection and repositioning of objects, pencils which allow for free drawing by a user, text and commenting tools, rulers or other measuring devices, lesion identifiers, and calibration instruments which correspond to a user's validation of automated detection of features such as the optic disc and fovea within a respective image); and various overlay tools, allowing different background images to be overlaid with analysis results, comments, drawings, matrices showing principal components or axes of the eye, and colors used to highlight items of significance, such as lesions and the like.
  • Having thus provided an overview of system 200, a detailed discussion of the operation of the various modules thereof will now be presented. This will be done from the point of view of a user of system 200, and will include a presentation of various screens of user interface 208. This user-oriented discussion is intended to provide an example of the use of system 200, but the system need not be used in the order presented. The discussion assumes that a software instantiation of system 200 has been installed on a compatible computer system, such as computer system 100 shown in FIG. 1, and that the software is running on that system. This may require the installation of a conventional operating system, such as the Windows™ operating system available from Microsoft Corporation of Redmond, Wash., or another operating system, and other conventional components of a computer system, such as a browser, etc.
  • Referring to FIG. 3, use of system 200 begins with a user logging-in 302 by entering his or her user credentials. Typically, this will be a user name/password combination, but other credentials can be used. For example, in some instances a user may be required to provide a cryptographic key or other means of identification. An example of a log-in screen 400 is shown in FIG. 4. Log-in screen 400 includes text boxes 402 and 404 that permit a user to enter a user name and password, respectively, and a button 406. Upon selection of button 406, e.g., using cursor control device 116 or keyboard 114, the log-in process takes place. Of course, in some cases, system 200 may be configured to run without requiring the user to successfully navigate any identification challenge. Where such identification is required, however, a log-out facility may also be provided so that users can secure system 200 when it is not in use. In some cases, the user may be automatically logged-out after a defined period of inactivity in which system 200 does not receive any user input.
  • Upon successful log-in, the main page module 304 is displayed for the user, preferably in the form of a menu. An example of a main page menu 500 is shown in FIG. 5. In some instances, this menu is displayed, either fully or in a minimized fashion, in a reserved area of the screen or as an overlay over a currently displayed window. This allows the user to always have the option of navigating between the components accessed through the main page, i.e., the user dashboard 306, patient list 308, search page 310 and new patient page 312.
  • The user dashboard 306 displays those analyses which a user has submitted but which have not yet been reviewed. FIG. 6 presents an example of a user dashboard screen 600. Within screen 600, an analysis status menu 602 is presented. This menu includes, for each analysis, a patient identification (ID) 604, including age and gender, an analysis ID (including analysis type) 606, and the date and time 608 when the analysis was submitted. Analyses generally require some time to be processed, and so it is expected that users will queue up several analyses (image comparison jobs) to be processed in the background while other tasks (such as the review of individual images or of previously submitted analyses) are performed in the foreground. System 200 will alert the user, via an appropriately displayed analysis status message 610 in the user dashboard, when an analysis is ready for review. Such review can be undertaken by selecting the completed analysis from the user dashboard. Upon review, an analysis is removed from the user dashboard but remains available in the respective patient dashboard for the patient to which it pertains. In addition to showing analyses that are ready for review, the user dashboard also lists analyses which are still being processed, those which are awaiting processing, and those which may have stopped processing prematurely due to errors in co-registration or otherwise.
  • The patients list 308 allows access to individual patient images and information in the form of a patient's record. An example of a patient list screen 700 is shown in FIG. 7. Within the screen, an alphabetical list 702 of the user's patients is presented by default. Each record includes the patient ID 704, the date of the patient's last visit 706, and any known, relevant clinical information 708. The list can be sorted in a variety of ways 710 to suit the user's requirements.
  • When an individual patient record is selected from list 702, it is shown in detail 720 within the screen and any associated actions can be revealed or hidden 722. From the patient list a user can access a respective patient record 712, edit the patient's demographic data 714, add or remove patients from a list 716 and also filter the patient list using a search feature 718. Alternatively, the user may use the search page 310 to search for a particular patient. New patient records can be added using the new patient module 312. Selecting a patient record in any of these fashions opens the respective patient dashboard 314.
  • An example of a patient dashboard 800 is shown in FIG. 8. The patient dashboard displays an overview of the analyses 802 (corresponding to the information module) for the selected patient. This information is laid out in timeline-style modules for both right and left eyes: images and analyses from both eyes are displayed one above the other, for example organized by date of image capture. Users can navigate between the data displayed in these modules using cursor control keys on a keyboard or a cursor control device. By way of example, a patient dashboard may display a scrollable timeline 804 containing OD (right eye) and OS (left eye) microaneurysm analyses, a scrollable timeline 806 containing OD and OS differences analyses, an OD and OS image database (not shown in this illustration but selectable at the user's option) displaying previously imported OD and OS images, and a jobs module (not shown in this illustration) displaying analyses submitted but not yet ready for the selected patient.
  • Each patient dashboard thus provides an image database for the associated patient, i.e., an archive of the respective patient's available images, organized in a scrollable timeline containing images from OD and OS. To import images into a patient dashboard, a user needs only to access that dashboard, e.g., from the patient list, and select an “add image” button 810. This will provide the user an appropriate screen where the user will be prompted for images he/she wishes to import from database 204 or from external sources. Images may be selected for import in any of a variety of fashions, for example by selecting associated checkboxes or by dragging the images into an appropriate portion of the screen. At the time an image is imported, a user may be prompted to provide additional details such as the date of the image (i.e., the date it was captured), the field of view, whether it is an OS or OD image, the angle, etc.
  • From the patient dashboard, the user can also create new analyses for a patient. Two different types of analyses can be done: microaneurysm analyses and differences analyses. As the name implies, a microaneurysm analysis examines image sets for microaneurysms that appear within the constituent images. Microaneurysm determinations can be made by comparing features found in an image under consideration with a reference pattern. While the microaneurysm analysis is run on an image-by-image basis, results are reported with respect to image sets so that microaneurysm turnover rates can be computed and reported. Microaneurysm detection is an automated process. Differences analyses examine sets of images against a designated baseline or reference image to determine differences between the baseline and another image. Differences detections can reveal features such as hemorrhages, drusen, exudates, etc. Differences detections result in differences being highlighted, but no automated characterization of the difference is provided. Instead, it is left to the user to examine the highlighted difference and decide what it represents.
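A differences analysis reduces, at its core, to a per-pixel comparison of each image against the co-registered baseline. The following sketch highlights pixels whose intensity differs significantly from the baseline; the threshold value is an assumption, since the patent does not specify one:

```python
def difference_mask(baseline, image, threshold=25):
    """Return a binary mask marking pixels whose intensity differs from the
    co-registered baseline by more than `threshold`.  Both inputs are
    row-major lists of rows of 0-255 intensities; the threshold is an
    illustrative assumption."""
    return [[1 if abs(b - p) > threshold else 0
             for b, p in zip(brow, prow)]
            for brow, prow in zip(baseline, image)]
```

Consistent with the text, a mask like this only highlights where a change occurred; characterizing the change (hemorrhage, drusen, exudate, etc.) is left to the user.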
  • For a microaneurysm analysis for a patient, it is preferable to use at least three representative images taken over a minimum twelve month period. From the patient dashboard, the user selects a “new analysis” button 812, and in response a user interface screen which permits the user to select the images to be used in the analysis will be presented. The user can select any number of images, preferably three or more (per eye), for the analysis, and both OS and OD analyses can be run simultaneously. When the desired images for analysis have been selected, the user can instruct the analysis module to begin processing the new analysis.
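The guidance above, at least three representative images per eye taken over a minimum twelve-month period, can be expressed as a simple validation check; the thresholds below merely restate that recommendation:

```python
from datetime import date, timedelta

def validate_analysis_selection(dates, min_images=3, min_span_days=365):
    """Check the recommendation that a microaneurysm analysis use at least
    `min_images` images whose capture dates span at least `min_span_days`.
    `dates` is a list of datetime.date capture dates for one eye."""
    if len(dates) < min_images:
        return False
    return max(dates) - min(dates) >= timedelta(days=min_span_days)
```

A system following the text might warn, rather than refuse, when this check fails, since the three-image/twelve-month figure is stated as a preference.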
  • When an analysis is complete, the results are returned to the patient dashboard for review in the form of a microaneurysm analysis output. The analysis output includes the image set used for the analysis, the microaneurysm count for each such image and the computed microaneurysm turnover rates (microaneurysm formation and disappearance rates). In other instances, the analysis will compare each selected image to a baseline image and highlight relevant differences. To review a microaneurysm analysis the user selects the appropriate patient (e.g., from the patients list) and enters his/her associated patient dashboard. The available analyses will be presented in the scrollable timeline 806 containing OS and OD microaneurysm analyses and can be opened by appropriate selection (e.g., a double click on an analysis thumbnail 814). Once opened, the selected analysis will be presented for review in an analysis output screen 318, which may be presented in a separate window.
  • In the case of a differences analysis, the process is similar. From the patient dashboard, the user selects a new differences analysis button 816 and identifies the image dataset which the user wishes to analyze. When the differences analysis is complete, the results are returned to the patient dashboard for review in the form of a differences analysis output. Selecting the differences analysis will then allow the user to review the results in a differences analysis output screen, which may be presented in a separate window. This is discussed in greater detail below, with reference to FIG. 13.
  • Within a microaneurysm analysis output screen 900, an example of which is shown in FIG. 9, an analysis explorer 902 is provided. The analysis explorer contains a toolset to allow the user to document (e.g., using pencil and pointer tools), zoom in (using a magnifying lens tool), comment (using a text tool), measure (using a ruler tool) and compare all features in each individual image included in an analysis. These images can also be compared side-by-side, enhancing the user's ability to identify clinically significant features in the images and visually inspect the progression of those features in succeeding images. In some instantiations, individual images from the image database, which are not part of any analysis, can also be explored, and, optionally, commented, using the analysis explorer (e.g., for comparison purposes). Special tools allow the user to designate or mark (e.g., with appropriate icons) features as microaneurysms, hemorrhages, exudates, vascular caliber abnormalities, or drusen. In addition, a calibration tool allows for fine tuning of the auto-detection of the optic disc and fovea in each image. As analyses are added to a patient's dashboard they are saved, allowing the user to return to previous analyses to compare results from various analyses over time. Analyses not yet ready for review can be monitored in the jobs module, which displays progress of queued jobs.
  • Regarding a microaneurysm analysis output, the system allows the user to interact with the automatically identified microaneurysms. The user may reclassify microaneurysms as existent or inexistent. The user may also add microaneurysms using a tool designed for that purpose. When a microaneurysm is manually marked, it is propagated, as an empty placeholder, through all the other images of the analysis. The system also offers the user a tool (a set of navigation arrows that appear near the marked microaneurysm/placeholder) for fast, centered browsing through these placeholders, allowing the user to classify them as being visible or not visible.
  • Each microaneurysm analysis is provided a unique ID 904 to permit its identification. Within the output screen, the individual images 906 that make up the set of images used to compute the analysis are provided. As shown, these images are arranged in a timeline fashion, allowing the user to observe changes over time. In addition to the actual images, the computed differences 908 between the images are displayed. These differences highlight microaneurysms which were located in the images. In addition, factors such as microaneurysm turnover rates (determined from comparisons of the number of microaneurysms found in different images taken over time) are presented both numerically and graphically, 910 and 912, and users can switch between outputs and analysis documentation 914, displayed on the right side of the screen. Individual images can also be explored further by activating the image explorer 916.
  • An example of the analysis explorer screen 1000 is shown in FIG. 10. Illustrated alongside the image under consideration are icons representing the toolset 1002 discussed above, and examples of the commenting and other outputs that can be produced using these tools are shown. For example, in this screen, a user has indicated one microaneurysm 1004, hemorrhages 1006, exudates 1008, drusen 1010, and VCAs 1012. The user has also annotated the image with comments 1014. An example of measuring tool 1016 is also shown. Users can manipulate the measuring tool by dragging and dropping the ends of the ruler onto locations of interest in the image.
  • FIG. 11 illustrates an example of the use of the calibration tool. By selecting the icon 1102 representing the calibration tool, a set of axes 1104 is displayed, overlaying the image under consideration. The system automatically detects the optic disc and fovea center and positions the axes accordingly. This positioning must be confirmed by the user. The calibration tool is used in conjunction with the measurement tool. The system assumes the optic disc diameter to be 1500 μm and calibrates the measurement tool accordingly. Any adjustments made by the user are applied to the referenced measurements, and these changes are saved.
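Because the optic disc diameter is assumed to be 1500 μm, the measurement tool's pixel-to-micrometre scale follows directly from the detected disc diameter in pixels. A minimal sketch of that calibration (function names are illustrative):

```python
OPTIC_DISC_DIAMETER_UM = 1500  # anatomical assumption stated in the text

def microns_per_pixel(disc_diameter_px):
    """Derive the image scale from the detected (and user-confirmed)
    optic disc diameter, measured in pixels."""
    return OPTIC_DISC_DIAMETER_UM / disc_diameter_px

def measure_um(length_px, disc_diameter_px):
    """Convert a ruler measurement in pixels to micrometres."""
    return length_px * microns_per_pixel(disc_diameter_px)
```

For example, if the detected optic disc spans 300 pixels, a 100-pixel ruler measurement corresponds to 500 μm.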
  • FIG. 12 illustrates an example of an image explorer screen 1200, this one being configured for co-registered, side-by-side display of multiple images. From this screen, any image can be selected for full screen view. Further, the magnifier lens tool 1202 is shown overlaying the images and provides a higher-resolution (zoom) view, user-configurable using a mouse wheel, of the portion of the image over which it is positioned. This allows for detailed comparison of common areas of different images, e.g., that were captured at different times. From the image explorer, users can enter comments in a text field 1204, switch between various screen layouts 1206 (allowing for single or multiple images to be viewed), turn on or off various information layers 1208 (such as commenting field 1204), and zoom in or out 1210 to review the images being displayed.
  • Referring now to FIG. 13, an example of the differences analysis output screen 1300 is shown. Within the differences analysis output screen an analysis explorer 1302 is provided. The analysis explorer contains a toolset similar to that described above, to allow the user to document (e.g., using pencil and pointer tools), zoom in (using a magnifying lens tool), comment (using a text tool), measure (using a ruler tool) and compare all features in each individual image included in an analysis. Special tools allow the user to designate or mark (e.g., with appropriate icons) features as hemorrhages, exudates, vascular caliber abnormalities, or drusen. A documentation area 1310 allows the user to review findings and annotate the analysis.
  • Along with a preview of an individual image 1304, the difference information 1306 (as determined from a comparison of the subject image with the baseline or reference image) is shown. A toggle control 1308 allows this information to be hidden or displayed, at the user's option. Individual images from the analyzed image set can be viewed as thumbnails in the image panel 1312, and the baseline image 1314 is also available. Below the images, thumbnails 1316 of the differences between the individual images and the baseline are presented.
  • Within the context of the present invention, a difference can be any change in morphology, color, brightness or contrast in the same region of an eye as between a reference image and a comparison image. Differences that are found during an analysis can be overlaid on top of the baseline image for easy reference.
  • Thus, systems and methods for analyzing digital color fundus images have been described. Of course, the foregoing discussion of certain illustrated embodiments of the invention was not intended to place limits thereon. A variety of other screens and functions can be provided through the present system. The invention should, therefore, only be measured in terms of the claims, which now follow.

Claims (20)

1. A system comprising:
a record module configured to permit importing of digital color fundus images for processing by said system;
an image set module configured to group two or more of said images for analysis;
a processing module to generate analysis information for a selected group of said images; and
an analysis tool configured to provide a user a view of the analysis information and to collect inputs and instructions from a user via one or more tools of a graphical user interface.
2. A system according to claim 1, wherein said record module is configured to associate with each respective imported image, one or more of the following:
a date of the respective image;
an indication of whether the respective image is of a right or left eye;
information concerning equipment used to capture the respective image;
information concerning a field of acquisition for the respective image;
information concerning an angle of acquisition for the respective image;
information concerning whether or not mydriasis was used at the time of acquisition of the respective image.
3. A system according to claim 1, wherein said image set module is configured to group said images manually in a chronological manner, or with respect to a selected reference image.
4. A system according to claim 1, wherein said processing module is configured to provide image pre-processing; image co-registration; and information analysis.
5. A system according to claim 4, wherein said image pre-processing comprises enhancing said images by changes in contrast and brightness.
6. A system according to claim 4, wherein said image co-registration comprises performing co-registration of at least two of said images.
7. A system according to claim 4, wherein said information analysis comprises automated detection of pathological markers of retinopathy in at least one of the following ways: detection of significant retinal changes in time; detection, identification and quantification of microaneurysms; and detection and identification of other relevant pathological markers of retinopathy.
8. A system according to claim 7, wherein said detection of significant retinal changes in time comprises automatically detecting changes between identified ones of said images, and indicating either progression or regression of said pathological markers of retinopathy based on analysis of said identified images.
9. A system according to claim 7, wherein said detection and identification of microaneurysms comprises automatically detecting and identifying microaneurysms in identified ones of said images.
10. A system according to claim 4, wherein said information analysis comprises exploration of retrieved information through computation of positions and areas of detected pathological signs of retinopathy in analyzed ones of said images, and suggestion of recommended actions according to a decision tree.
11. A system according to claim 10, wherein the computation of positions and areas of detected pathological signs of retinopathy in analyzed ones of said images comprises computation of positions of microaneurysms and surface areas of other detected pathological signs of retinopathy in the analyzed ones of said images.
12. A system according to claim 10, wherein said suggestion of recommended actions comprises guidelines for therapeutic action.
13. A system according to claim 1, wherein the processing module is further configured to compute information regarding detection and identification of microaneurysms in the images and to present clinically relevant information including some or all of: microaneurysm formation rate; microaneurysm disappearance rate; microaneurysms activity level by means of numbers of microaneurysms at each patient visit, numbers of new microaneurysms during a follow-up period; and numbers of microaneurysms that suffered occlusion during the follow-up period.
14. A system according to claim 1, wherein the analysis tool is configured to receive as inputs said images and to return information regarding said images according to computations performed by said processing module.
15. A system according to claim 1, wherein the analysis tool includes a log-in module, a main page module, a patient module, a patient dashboard, and an image explorer.
16. A system according to claim 15, wherein said log-in module is configured to accept user identification credentials including a user name and password.
17. A system according to claim 15, wherein the main page module includes a user dashboard, a patient list, a search page, and a new patient module.
18. A system according to claim 17, wherein said user dashboard includes a list of submitted analysis jobs and their status.
19. A system according to claim 15, wherein said patient dashboard comprises those of said images related to a patient identified by said dashboard, and analysis results relevant for said patient identified by said dashboard.
20. A system according to claim 15, wherein said patient dashboard includes means to submit new analyses for a patient identified by said dashboard.
US12/762,545 2009-12-02 2010-04-19 Methods and systems for detection of retinal changes Abandoned US20110129134A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/762,545 US20110129134A1 (en) 2009-12-02 2010-04-19 Methods and systems for detection of retinal changes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/629,798 US20110129133A1 (en) 2009-12-02 2009-12-02 Methods and systems for detection of retinal changes
US12/762,545 US20110129134A1 (en) 2009-12-02 2010-04-19 Methods and systems for detection of retinal changes

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/629,798 Continuation US20110129133A1 (en) 2009-12-02 2009-12-02 Methods and systems for detection of retinal changes

Publications (1)

Publication Number Publication Date
US20110129134A1 true US20110129134A1 (en) 2011-06-02

Family

ID=44068948

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/629,798 Abandoned US20110129133A1 (en) 2009-12-02 2009-12-02 Methods and systems for detection of retinal changes
US12/762,545 Abandoned US20110129134A1 (en) 2009-12-02 2010-04-19 Methods and systems for detection of retinal changes
US13/046,758 Expired - Fee Related US8041091B2 (en) 2009-12-02 2011-03-13 Methods and systems for detection of retinal changes

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/629,798 Abandoned US20110129133A1 (en) 2009-12-02 2009-12-02 Methods and systems for detection of retinal changes

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/046,758 Expired - Fee Related US8041091B2 (en) 2009-12-02 2011-03-13 Methods and systems for detection of retinal changes

Country Status (1)

Country Link
US (3) US20110129133A1 (en)

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8265729B2 (en) * 2005-03-11 2012-09-11 Apteryx, Inc. Third party acquisition of images at the direction of an independent imaging application
US8996570B2 (en) * 2010-09-16 2015-03-31 Omnyx, LLC Histology workflow management system
KR101726606B1 (en) * 2010-11-18 2017-04-13 삼성전자주식회사 Method and apparatus for displaying information of mobile terminal
JP2012217144A (en) * 2011-03-30 2012-11-08 Panasonic Corp Image editing device, image editing method, and program
CN103458772B (en) 2011-04-07 2017-10-31 香港中文大学 Retinal images analysis method and device
JP5097288B2 (en) * 2011-04-28 2012-12-12 シャープ株式会社 Image forming apparatus
JP5953666B2 (en) * 2011-07-27 2016-07-20 株式会社ニデック Fundus photographing apparatus, fundus analysis method, and fundus analysis program
KR101644466B1 (en) * 2012-08-30 2016-08-01 Canon Inc. Information processing apparatus and method
US10064546B2 (en) 2012-10-24 2018-09-04 Nidek Co., Ltd. Ophthalmic analysis apparatus and ophthalmic analysis program
JP6229255B2 (en) * 2012-10-24 2017-11-15 株式会社ニデック Ophthalmic analysis apparatus and ophthalmic analysis program
WO2014129339A1 (en) * 2013-02-22 2014-08-28 Sony Corporation Fundus image output device and method, and program
WO2014152058A1 (en) * 2013-03-15 2014-09-25 Steven Verdooner Method for detecting a disease by analysis of retinal vasculature
AU2014237346B2 (en) * 2013-03-15 2020-02-27 Hologic, Inc. System and method for reviewing and analyzing cytological specimens
US20150002812A1 (en) * 2013-06-27 2015-01-01 Nidek Co., Ltd. Image processing apparatus and storage medium
WO2014207904A1 (en) * 2013-06-28 2014-12-31 Canon Inc. Image processing device and image processing method
WO2015013632A1 (en) * 2013-07-26 2015-01-29 The Regents Of The University Of Michigan Automated measurement of changes in retinal, retinal pigment epithelial, or choroidal disease
WO2015047981A1 (en) * 2013-09-24 2015-04-02 The Regents Of The University Of Michigan Systems and methods for diagnosing inherited retinal diseases
WO2015060897A1 (en) 2013-10-22 2015-04-30 Eyenuk, Inc. Systems and methods for automated analysis of retinal images
USD752616S1 (en) * 2013-10-23 2016-03-29 Ares Trading S.A. Display screen with graphical user interface
USD786900S1 (en) * 2014-03-31 2017-05-16 EMC IP Holding Company LLC Display screen with a graphical user interface
USD786899S1 (en) * 2014-03-31 2017-05-16 EMC IP Holding Company LLC Display screen with graphical user interface
US10586618B2 (en) * 2014-05-07 2020-03-10 Lifetrack Medical Systems Private Ltd. Characterizing states of subject
USD768671S1 (en) 2014-08-26 2016-10-11 Hipmunk, Inc. Portion of a display with a graphical user interface
US9757023B2 (en) 2015-05-27 2017-09-12 The Regents Of The University Of Michigan Optic disc detection in retinal autofluorescence images
EP3567599A1 (en) * 2015-06-26 2019-11-13 KCI Licensing, Inc. System and methods for implementing wound therapy protocols
CN105069803A (en) * 2015-08-19 2015-11-18 西安交通大学 Classifier for micro-angioma of diabetes lesion based on colored image
AT16426U1 (en) * 2016-03-02 2019-08-15 Mathias Zirm Dr Remote diagnostic support method
JP6843521B2 (en) * 2016-04-28 2021-03-17 キヤノン株式会社 Image processing device and image processing method
AU2016265973A1 (en) * 2016-11-28 2018-06-14 Big Picture Medical Pty Ltd System and method for identifying a medical condition
JP6927724B2 (en) * 2017-03-24 2021-09-01 株式会社トプコン Display control device, display control method, and program
JP6569701B2 (en) * 2017-06-12 2019-09-04 株式会社ニデック Ophthalmic analysis apparatus and ophthalmic analysis program
US11154194B2 (en) 2017-11-03 2021-10-26 Nanoscope Technologies, LLC Device and method for optical retinography
CN111435612B (en) * 2018-12-26 2022-06-21 福州依影健康科技有限公司 Method and system for personalized health service of mobile medical treatment
WO2020160606A1 (en) * 2019-02-07 2020-08-13 Commonwealth Scientific And Industrial Research Organisation Diagnostic imaging for diabetic retinopathy
JP2021087817A (en) * 2021-02-24 2021-06-10 キヤノン株式会社 Image processing apparatus and image processing method

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020052551A1 (en) * 2000-08-23 2002-05-02 Sinclair Stephen H. Systems and methods for tele-ophthalmology
US6698885B2 (en) * 1999-12-22 2004-03-02 Trustees Of The University Of Pennsylvania Judging changes in images of the eye
US20040105074A1 (en) * 2002-08-02 2004-06-03 Peter Soliz Digital stereo image analyzer for automated analyses of human retinopathy
US20050171974A1 (en) * 2002-06-07 2005-08-04 Axel Doering Method and arrangement for evaluating images taken with a fundus camera
US20070002275A1 (en) * 2005-07-01 2007-01-04 Siemens Corporate Research Inc. Method and System For Local Adaptive Detection Of Microaneurysms In Digital Fundus Images
US7177486B2 (en) * 2002-04-08 2007-02-13 Rensselaer Polytechnic Institute Dual bootstrap iterative closest point method and algorithm for image registration
US20070188705A1 (en) * 2004-03-12 2007-08-16 Yokohama Tlo Company Ltd. Ocular fundus portion analyzer and ocular fundus portion analyzing method
US20070214017A1 (en) * 2006-03-13 2007-09-13 General Electric Company Diagnostic imaging simplified user interface methods and apparatus
US20070222946A1 (en) * 2006-03-24 2007-09-27 Yasufumi Fukuma Fundus Observation Device
US7283653B2 (en) * 2002-11-26 2007-10-16 Siemens Aktiengesellschaft Method and system for supporting the evaluation of a picture of an eye
US20070258630A1 (en) * 2006-05-03 2007-11-08 Tobin Kenneth W Method and system for the diagnosis of disease using retinal image content and an archive of diagnosed human patient data
US20080059234A1 (en) * 1997-03-17 2008-03-06 The Board Of Regents Of The University Of Oklahoma Digital disease management system
US20080100612A1 (en) * 2006-10-27 2008-05-01 Dastmalchi Shahram S User interface for efficiently displaying relevant oct imaging data
US7474775B2 (en) * 2005-03-31 2009-01-06 University Of Iowa Research Foundation Automatic detection of red lesions in digital color fundus photographs
US7512436B2 (en) * 2004-02-12 2009-03-31 The Regents Of The University Of Michigan Method of evaluating metabolism of the eye
US7520611B2 (en) * 2001-03-01 2009-04-21 Richard Franz System for vision examination utilizing telemedicine
US20090136100A1 (en) * 2007-11-08 2009-05-28 Takao Shinohara Device and method for creating retinal fundus maps
US20090143685A1 (en) * 2007-11-13 2009-06-04 The Regents Of The University Of Michigan Method and Apparatus for Detecting Diseases Associated with the Eye
US7568800B2 (en) * 2006-06-15 2009-08-04 Topcon Corporation Apparatus and method for spectrally measuring fundus
US7583827B2 (en) * 2001-10-03 2009-09-01 Retinalyze Danmark A/S Assessment of lesions in an image
US7593559B2 (en) * 2005-11-18 2009-09-22 Duke University Method and system of coregistrating optical coherence tomography (OCT) with other clinical tests

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10234674A (en) 1997-02-28 1998-09-08 Nippon Telegr & Teleph Corp <Ntt> Method for judging presence or absence of change of fundus oculi image with time
WO2000051080A1 (en) 1999-02-23 2000-08-31 The Board Of Regents Of The University Of Oklahoma Computer system for analyzing images and detecting early signs of abnormalities
WO2001087145A2 (en) 2000-05-18 2001-11-22 Michael David Abramoff Screening system for inspecting a patient's retina
WO2003020112A2 (en) 2001-08-30 2003-03-13 Philadelphia Ophthalmic Imaging Systems System and method for screening patients for diabetic retinopathy
CN1707477B (en) * 2004-05-31 2011-08-17 株式会社东芝 Group information generating system and group information generating method
JP4533028B2 (en) * 2004-07-20 2010-08-25 キヤノン株式会社 Ophthalmic image recording apparatus, method, and program
JP4901620B2 (en) 2007-07-19 2012-03-21 興和株式会社 Fundus examination image analysis system and fundus examination image analysis program
US10398599B2 (en) * 2007-10-05 2019-09-03 Topcon Medical Laser Systems Inc. Semi-automated ophthalmic photocoagulation method and apparatus
US8687862B2 (en) 2008-04-08 2014-04-01 National University Of Singapore Retinal image analysis systems and methods
JP4669891B2 (en) * 2008-10-20 2011-04-13 キヤノン株式会社 Ophthalmic imaging equipment
CN102573497B (en) * 2009-04-01 2015-09-09 眼泪科学公司 Ocular tear film Image-forming instrument
US7856135B1 (en) * 2009-12-02 2010-12-21 Aibili—Association for Innovation and Biomedical Research on Light and Image System for analyzing ocular fundus images


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tsai et al., "Automated Retinal Image Analysis Over the Internet", July 2008, IEEE Transactions on Information Technology in Biomedicine, Vol. 12, No. 4, 480-487 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140204242A1 (en) * 2011-06-27 2014-07-24 Koninklijke Philips N.V. Exam review facilitated by clinical findings management with anatomical tagging
KR101598048B1 (en) 2015-08-24 2016-02-26 주식회사 창가비앤텍 High strength lattice girder
US10163241B2 (en) * 2016-12-09 2018-12-25 Microsoft Technology Licensing, Llc Automatic generation of fundus drawings
US20190096111A1 (en) * 2016-12-09 2019-03-28 Microsoft Technology Licensing, Llc Automatic generation of fundus drawings
US10740940B2 (en) * 2016-12-09 2020-08-11 Microsoft Technology Licensing, Llc Automatic generation of fundus drawings
US11896382B2 (en) 2017-11-27 2024-02-13 Retispec Inc. Hyperspectral image-guided ocular imager for alzheimer's disease pathologies
CN109544540A (en) * 2018-11-28 2019-03-29 东北大学 A kind of diabetic retina picture quality detection method based on image analysis technology
CN109886955A (en) * 2019-03-05 2019-06-14 百度在线网络技术(北京)有限公司 Method and apparatus for handling eye fundus image

Also Published As

Publication number Publication date
US20110129133A1 (en) 2011-06-02
US20110160562A1 (en) 2011-06-30
US8041091B2 (en) 2011-10-18

Similar Documents

Publication Publication Date Title
US8041091B2 (en) Methods and systems for detection of retinal changes
US8826173B2 (en) Graphical interface for the management of sequential medical data
US7805320B2 (en) Methods and systems for navigating a large longitudinal dataset using a miniature representation in a flowsheet
JP5377144B2 (en) Single choice clinical informatics
US8929627B2 (en) Examination information display device and method
US20080208631A1 (en) Methods and systems for providing clinical documentation for a patient lifetime in a single interface
US20080208630A1 (en) Methods and systems for accessing a saved patient context in a clinical information system
US20080208624A1 (en) Methods and systems for providing clinical display and search of electronic medical record data from a variety of information systems
US20130093781A1 (en) Examination information display device and method
US20080256490A1 (en) Decision-Based Displays for Medical Information Systems
US20180292978A1 (en) Apparatus and method for presentation of medical data
US20150154361A1 (en) Interactive whiteboard system and method
CN112292730B (en) Computing device with improved user interface for interpreting and visualizing data
WO2015095343A9 (en) Interactive whiteboard system and method
JP6128883B2 (en) Endoscopic image management apparatus and endoscopic image display method
JP7020022B2 (en) Healthcare data analysis method, healthcare data analysis program and healthcare data analysis device
KR101480429B1 (en) Apparatus and method for searching data object based emr system
JP7172093B2 (en) Computer program, display device, display system and display method
US20210251580A1 (en) Blood pressure and blood glucose measurement and tracking system and method
KR102482941B1 (en) A system for inquirying and inputting a medical information, a method and a program for that
JP6935680B2 (en) Computer programs, display devices, display systems and display methods
US20210151175A1 (en) Systems and methods indicating pending information for patients
US20140324461A1 (en) Diabetes management system with contextual drill down reports
JP7140109B2 (en) DISPLAY DEVICE, DISPLAY SYSTEM, COMPUTER PROGRAM, RECORDING MEDIUM AND DISPLAY METHOD
EP2823753A1 (en) Graphical user interface for blood glucose analysis

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION