WO1999030264A1 - Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition - Google Patents

Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition Download PDF

Info

Publication number
WO1999030264A1
Authority
WO
WIPO (PCT)
Prior art keywords
specimen
image
clinician
interest
regions
Prior art date
Application number
PCT/US1998/023760
Other languages
French (fr)
Inventor
Stanley A. Mcclellan
Norman Wayne Fleming
Gregg L. Vaughn
Thomas S. Winokur
Gary J. Grimes
Original Assignee
Bellsouth Intellectual Property Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bellsouth Intellectual Property Corporation filed Critical Bellsouth Intellectual Property Corporation
Priority to AU13136/99A priority Critical patent/AU1313699A/en
Publication of WO1999030264A1 publication Critical patent/WO1999030264A1/en

Links

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/40 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

A digital telemedicine imaging system (10) for enabling a clinician at a local site to observe, analyze and provide diagnostic opinions on medical specimens from patients at one or more remote surgical sites. The system (10) includes a server system (12) at each remote surgical site and a client system (14) at the local site. The server systems (12) are programmed with artificial intelligence-based software to enable the automatic identification, acquisition and transmission of image regions of likely potential interest to the clinician before the clinician begins a diagnostic session or while the clinician is observing previously transmitted images. The image identification, mechanical prepositioning and pretransmission function enhances the overall efficiency of both the imaging system and the diagnostic process supported by the system.

Description

DIGITAL TELEPATHOLOGY IMAGING SYSTEM
PROGRAMMED FOR IDENTIFYING IMAGE REGIONS OF
POTENTIAL INTEREST AND ANTICIPATING SEQUENCE OF
IMAGE ACQUISITION
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates generally to microscope and other imaging systems used to analyze biological tissue specimens during diagnostic sessions. In particular, the present invention is a digital telepathology imaging system incorporating artificial intelligence to automatically identify tissue regions of possible interest to a pathologist, anticipate the sequence of image acquisition and transmit the acquired images.
Description of the Related Art
Pathologists routinely assist surgeons by analyzing and providing diagnostic opinions on frozen sections of tissue samples removed from patients during surgery. Frozen sections are small tissue specimens which are quick-frozen, sliced, stained with dyes and placed on slides. The slides are then analyzed through a microscope by the pathologists. The diagnoses provided by the pathologists are used by the surgeons during the course of the surgery for determining the nature and extent of further surgical procedures. For example, pathologists often analyze tissue samples to determine whether they are cancerous, and to determine the type of cancer present. Using this information, the surgeon will determine which tissue should be removed from the patient.
A typical turnaround time for preparing a frozen section is about fifteen minutes. During the tissue sample diagnosis session a pathologist will typically first rapidly scan the entire sample for gross features at a relatively low magnification, and identify particular sample regions of interest for more detailed study. The selected regions of interest are then observed at relatively high magnifications. It is typical for a pathologist to observe many separate regions at a number of different magnifications during an analysis session. The length of time required by the pathologist to complete this procedure and provide a diagnosis is often about five to seven minutes. Large hospitals and universities often maintain an on-site clinical pathology laboratory and staff of pathologists to provide these services on demand. Since the diagnoses can be used to guide procedures to completion during a single surgery, immediate access to the pathologist's expertise during surgery can provide significant benefits in terms of both economic and recuperative aspects of healthcare delivery.
Smaller and outlying hospitals often rely on periodic visits from "circuit-riding" pathologists who provide scheduled service to a group of hospitals. This commuting arrangement can be inefficient, since certain types of surgeries must be scheduled to coincide with the pathologist's visit. Travel is also an inefficient use of the pathologist's time. Furthermore, if routine or emergency surgical procedures being performed when the pathologist is not scheduled suddenly require a pathologist's expertise, the tissue sample must be shipped to a remote laboratory for diagnosis. The patient may then have to undergo surgery yet another time after the pathological diagnosis has been completed. These procedures can result in increased costs and reduced effectiveness of healthcare delivery.
A known solution to the problems associated with the lack of timely on-site access to the expertise of pathologists and other medical personnel is the application of telecommunications technology, also known as telemedicine. Telepathology imaging systems, a type of remotely-controlled imaging system which uses communication technology to enable the remote observation of medical specimens, are known. So-called static telepathology systems make use of a trained pathologist and a laboratory having a microscope with digital electronic imaging capabilities. Systems of this type are disclosed generally in the Weinstein et al. article, Telepathology: Long-Distance Diagnosis, American Journal of Clinical Pathology, Vol. 91, pp. S39-S42, 1989. A clinician or other "sender" prepares the frozen sample tissue specimen, analyzes the specimen through the microscope, and identifies a number of specimen regions that appear to be of interest. Digital images of the selected regions of interest are then generated, and electronically transferred to another site for analysis and diagnosis by a pathologist. The pathologist can also communicate with the sender (e.g., by telephone) during these procedures, and request images of specific sample regions. Unfortunately, these static systems can be inefficient due to the relatively large time delays inherent in the procedure. The accuracy of diagnoses rendered through the use of these systems also may be lower than those rendered with dynamic systems, since the static systems do not provide the pathologist with the capability of selecting the images he or she may desire.
Dynamic systems such as those disclosed in the Weinstein et al. U.S. Patent 5,216,596 make use of a trained clinician and a laboratory having a microscope with a conventional analog video camera. "Live" images of the specimen sample produced by the video camera are transmitted to a television monitor using conventional NTSC or PAL analog video signal protocols, or compressed using well-known digital video conferencing technologies for transmission to remote sites. Although systems of this type offer real-time imaging capabilities, the images are often relatively poor in quality and have poor control-response.
A digital telepathology imaging system which offers advantages of both "static" and "dynamic" architectures is disclosed in the commonly assigned United States Patent Application Serial No. 08/926,699, entitled Digital Telepathology Imaging System. Briefly, this system includes a number of "server" systems which are interfaced to a "client" system over a digital telecommunication channel. The client system is located at a hospital or other local site of a pathologist, and includes a workstation, monitor and a keyboard or other operator-actuated user input. Each server system will typically be located at a hospital or other remote surgical site which does not have a staff pathologist, and includes a workstation, monitor, microscope station and keyboard or other user input. During surgical procedures being performed at the remote surgical site of a server system, a technician or other clinician will prepare slides of frozen sample or other tissue specimens and position the specimen slides on the microscope station. A pathologist or other clinician at the local site of the client system remotely controls the operation of the server system through the use of a graphical user interface (GUI) displayed on the monitor and the operator-actuated user input. In response to digital server system control commands received from the client system over the communication channel, digital still images of the specimen are generated by the microscope station, transmitted to the client system and displayed in diagnostic-quality form on the monitor. The pathologist can, for example, observe an image of the entire specimen at a relatively low magnification, and identify particular specimen regions for more detailed analysis at greater magnification levels. Server system control commands such as specimen region commands designating the selected specimen regions and magnification commands designating the desired magnification are then generated by the client system and transmitted to the server system over the communication channel. The server system then processes the control commands, manipulates the microscope station, and transmits to the client system specimen image data for the selected specimen region at the desired magnification. The communication channel will typically be a high bandwidth public network communication link capable of supporting packet-switched data communications.
In practice, the transmission times of high-resolution digital pathology images of the type produced by the system described immediately above can be relatively long. This is especially the case when relatively low-bandwidth telecommunications links are used. Additional delays in the acquisition of images selected by the pathologist are due to the relatively slow response of the mechanical systems of the microscope station (e.g., the stage drive used to reposition the slide and the objective drive used to change magnification levels). It is therefore evident that there is a need for improved digital telepathology and other such imaging systems which avoid these delays. In particular, there is a need for a system capable of more quickly generating the display of specimen images of interest to the pathologist.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a diagrammatic illustration of a telepathology imaging system including an automatic image identification and acquisition system in accordance with the present invention.
Figure 2 is a detailed diagrammatic illustration of one of the server systems shown in Figure 1.
Figure 3 is an illustration of the graphic user interface (GUI) menu generated and displayed by the client system shown in Figure 1.
Figure 4 is an illustration of a GUI Open Session menu.
Figure 5 is an illustration of a tiled slide overview specimen image.
Figure 6 is an illustration of several cascaded specimen images with the GUI microscope control menu and zoom factor menu on the top image.
Figure 7 is an illustration of several cascaded specimen images with the GUI image region selection reticle on the top image.
Figure 8 is a flow diagram of an automatic image acquisition and transmission procedure in accordance with the present invention.
Figure 9 is an illustration of a tiled overview specimen image divided into subimages for evaluation in accordance with the present invention.
Figure 10 is an illustration of a specimen region image with identified regions of likely potential interest highlighted by boxes surrounding the regions.
Figure 11 is a flow diagram of a subimage evaluation procedure which can be performed in the automatic image acquisition and transmission procedure shown in Figure 8.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
System Architecture and Overview
Figure 1 is an illustration of a "virtual microscope" imaging system 10 for telepathology. Imaging system 10 includes an artificial intelligence-based system in accordance with the present invention for anticipating the sequence of image acquisition desired by a pathologist. In effect, this automatic image identification and acquisition system provides a "prediagnosis" function by identifying "suspicious" areas of the image that the pathologist is likely to examine in greater detail, and predicting and generating additional images, typically at a greater magnification, of likely interest to the pathologist. This system offers a number of important advantages. They include the ability to acquire and transmit images of likely possible interest before they are specifically requested by the pathologist. The microscope imaging system can be mechanically repositioned on the basis of specific behavior patterns of the pathologist. Efficiency of the imaging system 10 is thereby increased, and the latency of diagnoses reduced.
In the embodiment shown, system 10 includes one or more (three are shown) "server" systems 12 which are interfaced to a "client" system 14 over a digital telecommunication channel 16. Client system 14 is located at a hospital or other local site of a pathologist, and includes a workstation 18, monitor 20 and operator-actuated user input 21 such as keyboard 22 and/or mouse 24. Each server system 12 will typically be located at a hospital or other remote surgical site which does not have a staff pathologist, and includes a workstation 26, monitor 28, microscope station 30 and keyboard or other user input (not shown). During surgical procedures being performed at the remote site of a server system 12, a technician or other clinician will prepare slides of frozen sample or other tissue specimens and position the specimen slides on microscope station 30. A pathologist or other clinician at the local site of client system 14 remotely controls the operation of the server system 12 through the use of a graphical user interface (GUI) displayed on monitor 20 and the operator-actuated user input 21. In response to digital server system control commands received from the client system 14 over the communication channel 16, digital still images of the specimen are generated by the microscope station 30, transmitted to the client system and displayed in diagnostic-quality form on the monitor 20. The pathologist can, for example, observe an image of the entire specimen at a relatively low magnification, and identify particular specimen regions for more detailed analysis at greater magnification levels. Server system control commands such as specimen region commands designating the selected specimen regions, magnification commands designating the desired magnification and illumination commands for controlling specimen illumination and imaging parameters of the camera are then generated by the client system 14 and transmitted to the server system 12 over the communication channel 16. Server system 12 processes the control commands, manipulates the specimen illumination and camera parameters, and transmits to the client system 14 specimen image data for the selected specimen region at the desired magnification.
Communication channel 16 will typically be a high bandwidth public network communication link capable of supporting packet-switched data communications. For example, ISDN (integrated services digital network) or T1 telephone lines can be used. Conventional and commercially available digital switching or routing systems (not shown) can be used to interface server system 12 and client system 14 to the communication channel 16. Prototypes of system 10 were developed using an ATM-based (asynchronous transfer mode) optical channel 16 and an Ethernet network.
A representative server system 12 can be described in greater detail with reference to Figure 2. Microscope station 30 is a conventional and commercially available instrument which includes a microscope 31 and a stage 34 for receiving and supporting specimen slides (not separately shown) to be imaged by lenses or objectives 36. The stage 34 is mounted to the base of the microscope 31 by an x-y stage drive 40 so the position of the slide with respect to the objectives 36, and therefore the region of the specimen being imaged, can be changed. The microscope 31 includes several objectives 36, each of which provides a different magnification power. Focusing is performed by a conventional automatic focus control system (not separately shown). The objectives 36 are driven and positioned by a drive 38 to change the magnification of the microscope. Although not shown, the microscope 31 also includes a conventional slide illumination light source and actuator-driven neutral density filter for controlling the specimen illumination level.
A high-resolution camera 42 generates true-color image data representative of the specimen images produced by the microscope. This image data is digitized by the workstation 26. The camera 42 and digitizing system in workstation 26 function as a digital imaging system. Prototypes of system 10 include a camera 42 having a base resolution of at least approximately 600 x 800 pixels, and a digitizing system capable of capturing 8-bit per pixel (bpp) per color band still images (for a total of 24 bpp color). Conventional imaging control parameters of camera 42 can be accessed and controlled.
Microscope station 30 is interconnected to workstation 26 through one or more interfaces illustrated generally at 44. Prototypes of system 10, for example, include an analog interface for connecting camera 42 to the workstation 26 and its digitizing system, and RS-232 busses for interconnecting the stage drive 40, objective drive 38, automatic focus system and image illumination system to the workstation 26. Monitor 28 is interconnected to workstation 26 by bus 46. These systems are capable of acquiring a still image and displaying that image at full resolution on the monitor 28 in less than about 1 second.
Workstation 26 is a conventional programmable computer, and is programmed with server system control software to control the operation of microscope station 30 in response to digital server system control commands received from the client system 14. Control commands received from client system 14 include camera commands, specimen region commands, magnification commands, focus commands and illumination commands. The specimen region commands include information designating a specific or desired region of interest on the specimen slide, and in one embodiment include information designating coordinates on the specimen slide in terms of "stage units." In response to the receipt of specimen region commands, workstation 26 actuates stage drive 40 and moves stage 34 to position the selected region of the specimen slide under objective 36. The magnification commands include information designating the desired magnification at which the selected specimen region is to be imaged. In response to the receipt of magnification commands, workstation 26 actuates drive 38 and causes the selected objective 36 to be used to image the specimen region. In response to camera commands, camera 42 generates specimen image data representative of a still image of the specimen region being imaged by the microscope 31. As described above, this image data is digitized by the workstation 26.
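As an illustration only, the following Python sketch shows one way the command types described above could be represented and dispatched by the server system control software. The patent does not disclose source code; the class names and the `station` hardware abstraction (with its `stage_drive`, `objective_drive`, `neutral_density_filter` and `camera` members) are assumptions made for this sketch.

```python
# Hypothetical sketch (not from the patent) of representing and dispatching the
# server system control commands. All names are illustrative assumptions.
from dataclasses import dataclass
from typing import Union

@dataclass
class SpecimenRegionCommand:
    x_stage_units: int      # desired stage position along x, in "stage units"
    y_stage_units: int      # desired stage position along y

@dataclass
class MagnificationCommand:
    power: int              # e.g. 2, 4, 10, 20 -- must match an installed objective

@dataclass
class IlluminationCommand:
    level: float            # 0.0 (dark) .. 1.0 (full illumination)

@dataclass
class CameraCommand:
    pass                    # "acquire a still image now"

Command = Union[SpecimenRegionCommand, MagnificationCommand, IlluminationCommand, CameraCommand]

def dispatch(cmd: Command, station) -> None:
    """Route a decoded client command to the appropriate microscope-station actuator."""
    if isinstance(cmd, SpecimenRegionCommand):
        station.stage_drive.move_to(cmd.x_stage_units, cmd.y_stage_units)
    elif isinstance(cmd, MagnificationCommand):
        station.objective_drive.select(cmd.power)
        station.autofocus()                       # refocus after every magnification change
    elif isinstance(cmd, IlluminationCommand):
        station.neutral_density_filter.set(cmd.level)
    elif isinstance(cmd, CameraCommand):
        station.camera.acquire_still()
```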
As noted above, microscope 31 includes automatic focus and specimen slide illumination control systems. In response to focus commands received from client system 14, workstation 26 initiates an automatic focus routine by the automatic focus system of the microscope 31. Workstation 26 also initiates an automatic focus routine each time the magnification is changed. Workstation 26 also controls the illumination system of the microscope 31 (e.g., the neutral density filter) to change the illumination level of the specimen slide in response to illumination commands received from the client system 14, and can vary the imaging parameters of camera 42.
Prototypes of system 10 use conventional TCP/IP protocol (Transmission Control Protocol/Internet Protocol) for the transfer of the server system control commands and specimen image data. The TCP/IP protocol of the prototype system 10 is supported by the Windows Sockets application programming interface (API) which provides handshaking, error reporting and associated functions, and is therefore compatible with local area network (LAN) as well as LAN-over-broadband telecommunications architectures.
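The prototypes' use of TCP/IP and the Windows Sockets API is described only at this level of detail. As a hedged illustration of the kind of framing such a link needs, the sketch below length-prefixes each command or image message over an ordinary TCP socket; the message layout is an assumption, not the prototype's wire format.

```python
# Minimal sketch (an assumption) of length-prefixed message framing over TCP for
# carrying control commands and still-image data between client and server systems.
import socket
import struct

def send_message(sock: socket.socket, msg_type: int, payload: bytes) -> None:
    # 1-byte message type + 4-byte big-endian payload length, then the payload itself.
    sock.sendall(struct.pack(">BI", msg_type, len(payload)) + payload)

def recv_message(sock: socket.socket):
    header = _recv_exact(sock, 5)
    msg_type, length = struct.unpack(">BI", header)
    return msg_type, _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before full message received")
        buf += chunk
    return buf
```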
Client system 14 can include a workstation 18 and monitor 20 similar to those of server system 12. Workstation 18 is programmed with client system control software and GUI software. The GUI software generates and displays a Windows™-format GUI on monitor 20. Through the use of the GUI and user input 21, the pathologist or other clinician can select the specimen regions to be imaged by the microscope station 30 of a server system 12, and the desired magnification of the selected specimen region. Server system control commands, including the specimen region commands and magnification commands designating the image selected by the pathologist, are generated by the client system control software and transmitted to the server system 12 over communication channel 16. The client system control software also processes the digital image data received from the server system 12, and generates visual displays on monitor 20 of the associated specimen images represented by the data.
Graphic User Interface And System Operation
Figure 3 is an illustration of the GUI menu 50 generated by workstation 18 and displayed on monitor 20 at client system 14. The illustrated embodiment of menu 50 includes a menu bar 52 and a toolbar 54. The pathologist operates system 10 by using keyboard 22 and/or mouse 24 in a conventional manner to select commands on menu bar 52 and toolbar 54. As described below, other commands not graphically displayed on the GUI can be selected by clicking the left and right buttons (not separately shown) of the mouse 24. In addition to conventional Windows™ commands (e.g., Window and Help), menu bar 52 includes the system-specific commands System, Session and Microscope. Toolbar 54 includes a number of conventional Windows™ command buttons 55 - 58, microscope (e.g., stage drive) control buttons 59, and a new slide button 60. Buttons 55 and 56 are format buttons for selecting tiled or cascaded specimen image displays, respectively. Arrange icon button 57 and information button 58 provide conventional Windows™ functions when selected. Get slide button 60 can be actuated by the pathologist to obtain a tiled slide overview image (described in greater detail below) when a new specimen slide has been positioned for imaging on the remote microscope station 30. The tiled slide overview image can also be initiated by the technician at the server system 12. A listing of the commands and associated functions available through the selection of the System, Session, Microscope, Window and Help commands on menu bar 52 follows. Many of the commands provide conventional Windows™-format functions, and are identified as "Standard" in the listing.
Menu Bar Options
System — (command table reproduced as an image in the original publication)
Session — (command table reproduced as an image in the original publication)
Microscope — (command table reproduced as an image in the original publication)
Window — (command table reproduced as an image in the original publication)
Figure 4 is an illustration of a GUI open session menu displayed on monitor 20 and used by the pathologist to initiate a remote specimen observation session. The open session menu will be presented to the pathologist upon the selection of the Connect command available through menu bar 52. As shown, the open session menu includes a list of server systems 12 set up for operation with the client system 14, and displays the then-current specimen slide illumination (intensity) and image magnification (the established "user preference" values). Workstation 18 initiates and establishes an operational communication interface with the server system 12 selected by the pathologist through the open session menu. The pathologist will typically begin the diagnosis of a specimen by observing a slide overview image 70 such as that shown in Figure 5. The slide overview image 70 is a relatively low magnification image of a relatively large portion of the specimen on the slide, and in one embodiment is an image of the entire specimen on the slide. In the embodiment shown, the slide overview image 70 is an assembly of separate and discrete specimen region images 72 into a composite or tiled overview image of the entire specimen on the slide. The slide overview image 70 shown in Figure 5, for example, is formed from a 3 x 4 array of specimen region images 72.
Generation of the overview image 70 is controlled by the server system control software in workstation 26. In response to Get Slide commands (or Send Slide commands initiated by the clinician at server system 12), the workstation 26 actuates objective drive 38 to select a relatively low magnification (e.g., 2x) objective 36 and initiates an automatic focus routine. Stage drive 40 and camera 42 are then actuated to sequentially obtain the adjacent specimen region images 72. The image data for the specimen region images 72 is processed by the server system to reduce the resolution of the images, and transmitted to the client system 14 where it is assembled and displayed as the slide overview image 70. The image data for each of the specimen region images 72 can be cached in memory of the workstation 18 so the image can be redisplayed as the topmost image on monitor 20.
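A minimal sketch of this overview acquisition loop follows, assuming the same hypothetical `station` abstraction used earlier. The grid dimensions, the mapping from tile indices to stage units, and the `send_tile_to_client` callback are illustrative assumptions; the patent specifies only that adjacent low-magnification fields are captured, reduced in resolution, and transmitted for assembly at the client.

```python
# Sketch (assumptions, not the patent's code) of acquiring the tiled slide overview.
import numpy as np

def acquire_overview(station, send_tile_to_client, rows=3, cols=4,
                     stage_step_x=1000, stage_step_y=1000, reduction=4):
    """Capture a rows x cols grid of adjacent low-magnification fields."""
    station.objective_drive.select(2)      # relatively low magnification objective, e.g. 2x
    station.autofocus()
    for r in range(rows):
        for c in range(cols):
            # Step the stage so the next adjacent specimen region sits under the objective.
            # The step size in "stage units" per camera field depends on the optics (assumed here).
            station.stage_drive.move_to(c * stage_step_x, r * stage_step_y)
            tile = station.camera.acquire_still()           # H x W x 3 uint8 array (assumed)
            reduced = tile[::reduction, ::reduction, :]     # reduce resolution before transmission
            send_tile_to_client(row=r, col=c, image=np.ascontiguousarray(reduced))
```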
As noted above, the generation of a slide overview image 70 can be initiated at either the server system 12 or the client system 14. After positioning a new specimen slide on microscope station 30, the clinician can, for example, initiate the generation of a slide overview image 70 by selecting a Send Slide command through a graphic user interface and user input (not separately shown) at the server system 12. Image data representative of the slide overview image 70 will then be transmitted to the client system 14 and displayed on monitor 20, thereby indicating to the pathologist the availability of the specimen slide for observation. If the image data has already been transmitted by the server system 12, the pathologist can also initiate the generation of a new slide overview image 70 from the client system through use of the Get Slide command button 60 on toolbar 54 or the Get Slide command available through menu bar 52. During observation of the slide overview image 70 the pathologist will typically identify specimen regions that he or she would like to observe in greater detail. Using the GUI and user input 21, the pathologist can select the specimen region of interest and the desired magnification at which the specimen region is to be observed. The appropriate server system control commands are then generated by the client system control software and transmitted to the server system 12 to initiate the retrieval and display of the specimen region of interest image (i.e., a detail image).
Specimen regions of interest can, for example, be selected by positioning the GUI cursor over the image and clicking the right button on mouse 24. As shown in Figure 6, this command causes a GUI microscope control menu 80 to be generated and displayed on monitor 20. Available commands in the microscope control menu 80 include Zoom and Zoom Factor. When the Zoom command is selected through user input 21, the GUI software generates and displays on the then-displayed specimen region image an indicator such as reticle 82 shown in Figure 7. Using the user input 21 the pathologist can then reposition the reticle 82 over the specimen region of interest on the image, and thereby select the specimen region for further observation. Returning to the microscope control menu 80, the pathologist can then select the Zoom Factor command and cause the zoom factor menu 84 (Figure 6) to be generated and displayed by the GUI software. The zoom factor menu 84 includes a number of discrete magnification level commands corresponding to the available magnification levels (e.g., objectives 36) on microscope station 30, as well as In and Out step commands. The In command causes the magnification to increase one available power level from the then-current magnification at the microscope station 30, while the Out command causes the magnification to decrease one available power level from the then-current magnification. By clicking the left button of mouse 24 after selecting the desired magnification and image region of interest in the manner described above, the pathologist will initiate the generation and transmission of the appropriate magnification and specimen region commands to retrieve the image.
Using the client system 14 in the manner described above, the pathologist is able to relatively quickly observe images of many different specimen regions at different magnifications. As is evident from Figure 6, multiple images can be simultaneously displayed on monitor 20. The images can be displayed in a manner which enables the pathologist to easily identify the location of the specimen region image with respect to the image overview and with respect to all other images. The magnification of the image is also displayed. Workstation 26 automatically maintains a log or cache describing in chronological sequence the control commands transmitted to the server system 12 and the coordinates of the specimen region of the retrieved image. In addition to providing an audit trail, this information streamlines the image acquisition and transfer functions and provides the pathologist with immediate feedback regarding the regions of the specimen which have been imaged. Each of the retrieved specimen images can be stored by the workstation 26 for subsequent retrieval and observation.
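A possible shape for this chronological log is sketched below; the field names are assumptions. The same entries serve both as an audit trail and, as described later, as raw material for pathologist-specific training sets.

```python
# Hedged sketch of the chronological session log/cache described above.
from dataclasses import dataclass
import time

@dataclass
class SessionLogEntry:
    timestamp: float            # when the command was issued
    command: str                # e.g. "specimen_region", "magnification", "camera"
    x_stage_units: int          # coordinates of the retrieved specimen region
    y_stage_units: int
    magnification: int

def log_command(log: list, command: str, x: int, y: int, magnification: int) -> None:
    log.append(SessionLogEntry(time.time(), command, x, y, magnification))
```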
System 10 offers a number of important advantages, especially with respect to known "static" and "dynamic" architectures. High-resolution, diagnostic-quality, discrete color images can be retrieved and displayed in real time. The system can be efficiently operated through the user-friendly graphic user interface. Commercially available hardware can be used to implement the system. The network protocol is also robust, scalable (works on many different size channels), fast and relatively secure. The system allows pathologists to quickly provide high-quality diagnostic opinions to multiple remote locations, thereby making efficient use of their time. Accurate, precise, real-time control of the remote imaging system is achieved. The system can be integrated with electronic patient record systems to include patient information and diagnostic results. The audit trails which can be provided are also advantageous.
Automatic Image Identification and Acquisition Function
Artificial intelligence-based automatic image identification and acquisition software in accordance with the present invention is incorporated into the server system control software of workstation 26. The image identification and acquisition software identifies locations on previously-generated images such as 70 that will likely be of potential interest to the pathologist (e.g., are "suspicious"). The image identification and acquisition software also predicts which additional images (e.g., location and magnification) are likely to be of potential interest to the pathologist. These functions of the image identification and acquisition software are typically performed in the absence of express commands from the pathologist or client system 14, and can take place while previously selected image data is being transmitted to the client system. On the basis of these computer-assisted decisions the server system control software can control the operation of workstation 26 and/or microscope station 30 to effectively provide "prediagnoses" and enhance the overall efficiency of both the imaging system 10 and the diagnostic process supported by the imaging system.
The automatic image identification function can be described generally with reference to the flowchart in Figure 8. As shown at step 200, the automatic image identification function is initiated by generating a composite or tiled overview image. Figure 9 is an illustration of a composite or tiled overview image 100 of a stained tissue specimen 102 mounted on a slide (not shown). Overview image 100 is a composite 4 x 2 array of relatively low magnification (e.g., 2x) specimen region images 104. Following the generation of the overview image 100, and in some embodiments while the overview image is being transmitted to the client system 14, the image identification and acquisition software divides or breaks up the overview image into a plurality of subimages 106 (step 202). The subimages 106 will typically contain a relatively small number of pixels, and in one embodiment have about 1K to 4K pixels. In the embodiment described with reference to Figure 9, the image identification and acquisition software divides the overview image 100 into subimages 106 formed by a rectangular array of immediately-adjacent pixels in two-dimensional Euclidean space. In other embodiments (not shown) the grouping of neighboring pixels in each subimage can be based on measures of "closeness" in other spaces (e.g., various color spaces).
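For illustration, a sketch of this subdivision step (step 202) is shown below, using square blocks whose pixel counts fall in the 1K to 4K range mentioned above. The block size and the NumPy representation are assumptions.

```python
# Minimal sketch (an assumption) of dividing the overview image into rectangular
# subimages of immediately-adjacent pixels, e.g. 32x32 = 1024 or 64x64 = 4096 pixels.
import numpy as np

def split_into_subimages(overview: np.ndarray, block: int = 64):
    """Yield (row, col, subimage) tuples covering the whole overview image."""
    h, w = overview.shape[:2]
    for y in range(0, h, block):
        for x in range(0, w, block):
            yield y, x, overview[y:y + block, x:x + block]
```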
Using neural net and/or conventional or otherwise known artificial intelligence methodologies (e.g., image processing techniques), the subimages are evaluated to identify regions of likely potential interest to the pathologist (step 204). The evaluation performed at step 204 uses a stored tissue-specific training set of information. The tissue-specific training set contains information which describes characteristics and features of the specimen image that are likely to be associated with portions of the image that are of likely potential interest to the pathologist. The tissue-specific training set of information can include information which varies from pathologist to pathologist (i.e., is often at least partially pathologist-specific) since different pathologists will look for different characteristics and features to identify areas of potential interest. For example, color-blind pathologists will look for different characteristics than those with good color vision. The subimages are effectively evaluated at step 204 to determine whether they contain image content similar to that characterized by the tissue-specific training set.
For example, in frozen sections stained with conventional pink and blue dyes, it is known that relatively blue portions of the image are most likely to be cancerous and malignant, and therefore of likely potential interest to the pathologist. Furthermore, the greater the amount of blue color in a portion of an image, the more likely the portion of the image will be of interest to the pathologist. Conversely, the more pink a portion of an image is, the less likely it is to contain cancerous cells. The color of the portion of the tissue specimen being evaluated, also known as the color-space, is therefore a criterion that can be incorporated into the tissue-specific training set.
Another characteristic of tissue samples having a known relationship to regions of potential interest is the degree of homogeneity of the regions, also known as the spatial frequency-space. Healthy tissue will generally have cell nuclei which appear as small, isolated spots inside a regularly shaped (e.g., elliptical or circular) membrane. The bulk of the cell contains relatively large sections of matter of relatively consistent or unvarying nature (e.g., relatively large patches of pink-stained matter). Cancerous tissue, on the other hand, typically contains cells formed to a large degree by cell nuclei, and each nucleus has a well defined mass of DNA. Cancerous tissue therefore tends to be "busier," having a higher degree of variation. Known mathematical algorithms (such as those for performing discrete cosine transforms) can be used to evaluate the spatial frequency of subregions. The spatial frequency of the portion of the tissue specimen under evaluation is therefore another criterion that can be incorporated into the tissue-specific training set.
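As a hedged example of such a frequency-space criterion, the sketch below scores a grayscale subimage by the fraction of its two-dimensional DCT energy lying outside the lowest-frequency terms, so that "busier" regions score higher. The specific transform, cutoff and normalization are assumptions rather than the patent's algorithm.

```python
# Illustrative spatial-frequency ("busyness") measure based on the 2-D DCT.
import numpy as np
from scipy.fftpack import dct

def busyness(gray_subimage: np.ndarray, low_freq: int = 4) -> float:
    """Return the fraction of DCT energy above the lowest spatial frequencies."""
    x = gray_subimage.astype(np.float64)
    coeffs = dct(dct(x, axis=0, norm='ortho'), axis=1, norm='ortho')
    energy = coeffs ** 2
    total = energy.sum()
    if total == 0.0:
        return 0.0
    low = energy[:low_freq, :low_freq].sum()   # DC and low-frequency terms
    return float((total - low) / total)        # higher value -> more fine-grained variation
```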
The color-space and frequency-space tissue-specific training set information described above is only an example of the types of such information which can be used. Other types of tissue-specific training set information (e.g., templates of organ-specific anomalies) can be used as alternatives or in addition to those described above. Furthermore, the particular color-space and frequency-space training set information described above is directed to specific types of cancers and the commonly used pink and blue dye sets. This information will vary depending upon factors such as the type of cancers or other tissue material being looked for, the specific type of organs from which the specimen was removed, and the colors of the dye sets used to stain the specimen.
The tissue-specific training sets of information can be generated in a number of different manners. For example, they can be programmed on the basis of observations of a skilled pathologist (i.e., empirically identifying the characteristics of specimen regions selected by a pathologist as being of interest). Alternatively, a neural net can be taught the tissue-specific training set in a conventional manner through use of electronic results of evaluations performed by pathologists (e.g., the stored caches of commands from previous diagnostic sessions on similar tissues with similar stains for similar diseases).
After regions of likely potential interest have been identified, they are ranked as shown at step 206. Ranking step 206 is performed to assign a relative degree of interest or priority to the identified regions. Factors and methods for performing the ranking operation are described below.
After they have been ranked, the image identification function continues by taking action on the identified regions as indicated generally by step 208. The identified regions will typically be acted upon at step 208 in order of the ranking assigned at step 206. As shown in Figure 8, a number of different actions can be initiated by the image identification and acquisition program at step 208. For example, the identified areas of likely potential interest can be highlighted on the tiled overview image 100 (step 210). Figure 10, for example, illustrates an image of a tissue sample in which identified areas of potential interest have been highlighted by surrounding them with bounding boxes. Although not visible in Figure 10, a feature of the box such as its color or line width can be used to indicate the ranking assigned to the identified region at step 206 (e.g., red boxes around the highest ranked regions of interest, and yellow boxes around regions of interest having a lower ranking). The highlighting function can be performed relatively quickly. Since the overview image will typically have been previously transmitted to the client system 14, the server system 12 needs only to transmit the coordinates on the overview image at which the highlights should appear. By observing an overview image highlighted in this manner, the pathologist is able to quickly identify regions that he or she would like to view in greater detail, thereby enhancing the efficiency of the diagnostic process.
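A sketch of this highlighting step follows: the server sends only bounding-box coordinates and ranks, and the client draws color-coded rectangles on the overview image it already holds. The red/yellow two-tier scheme mirrors the example given above; the cutoff between tiers and the use of OpenCV are assumptions.

```python
# Hedged sketch of overlaying ranked bounding boxes on the previously-received overview.
import cv2
import numpy as np

def highlight_regions(overview_bgr: np.ndarray, boxes, top_n: int = 3) -> np.ndarray:
    """boxes: list of (x, y, w, h) tuples sorted by rank, highest-priority first."""
    out = overview_bgr.copy()
    for rank, (x, y, w, h) in enumerate(boxes):
        color = (0, 0, 255) if rank < top_n else (0, 255, 255)   # BGR: red vs. yellow
        cv2.rectangle(out, (x, y), (x + w, y + h), color, thickness=2)
    return out
```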
Additional and/or alternative actions which can be initiated by the image identification and acquisition program include the acquisition and transmission of additional images that are of likely potential interest to the pathologist (step 212). This action is undertaken by the server system workstation 26 which actuates the stage drive 40 to position the identified region of the specimen 102 for imaging, actuates the objective drive 38 to select the desired (and typically higher magnification) objective 36, and then initiates an automatic focus routine. The camera 42 can then be operated to generate the specimen image data of the identified region of likely potential interest being imaged by the microscope 31. This image data can be transmitted to the client system 14, where it will be stored for possible display and observation by the pathologist.
The automatic image acquisition and transmission function described above will typically be performed on the highest ranked identified region of likely potential interest. The function can also be repeated for additional identified regions of likely potential interest as indicated by flow path 214. Repetition of this type will typically be performed on identified regions in sequence of their ranking.
Alternatively, rather than continuing to acquire and transmit more detailed images of identified regions (i.e., rather than following flow path 214), a detailed image acquired at step 212 can be broken into subimages which are then evaluated and ranked to identify regions of likely potential interest therein (i.e., steps 202, 204 and 206 are repeated for the detailed image). Any desired action, including those described above, can then be taken on these identified regions of likely potential interest in the detailed image. To evaluate subimages at greater magnifications, workstation 26 can scale the information in the training set being utilized for the particular magnification factor of the subimage.
These automatic image acquisition and transmission functions can be stopped by the pathologist through use of the user input 21 of the client system 14. Direct pathologist control over the server system 12 through use of user input 21 will typically take precedence and temporarily override the automatic image acquisition and transmission functions.
The automatic acquisition and transmission of detailed images of identified regions of likely potential interest in the manner described above offers important advantages. Mechanical repositioning of the stage drive 40 and objective drive 38, and the autofocus routine, are relatively time consuming procedures. Transmission of the image data can also take time. If the pathologist were to initiate these activities through the client system user input 21 following his or her observation of an image, these response times cause delays which reduce the efficiency of the pathologist and increase the latency of diagnoses. Undertaking these "prepositioning" and/or "pretransmission" actions before the pathologist is even available for diagnoses, or while the pathologist is reviewing previously transmitted images, results in the detailed images that the pathologist is likely going to want to observe being present for effectively immediate display on the client system monitor 20 when requested. The pathologist's time is thereby used more efficiently. Latency of diagnoses is also reduced.
Figure 11 is a flow diagram of one artificial intelligence method 204' which can be performed by workstation 26 of server system 12 to evaluate subimages and identify regions of likely potential interest (i.e., step 204 of the method shown in Figure 8). Evaluation method 204' is a first order approach based on a color-space training set of information, and is presented as an example of an image processing-type of artificial intelligence evaluation. The method 204' begins with the transformation of the RGB (red/green/blue) color coordinates of all the subimage pixels (i.e., the image data) into a suitable color-space. In this example the color coordinates are transformed into the HSV (hue/saturation/value) space (step 220). Equivalent results can be obtained in many other color-spaces by varying processing steps, initial values, etc.
Using the hue channel, which represents an index of "color," a two-sided thresholding operation is performed around pixel values representing the "blue" color band (i.e., the color of interest). In one embodiment this thresholding operation is performed around a blue color band represented by pixel values of about "184 - 200". In another version the blue color band is represented by pixel values in the more limited range of "190 - 197". All pixels which are within the threshold range (i.e., are "fairly blue") are set to binary "1". Pixels which are not within the range and are therefore not sufficiently blue to be of potential interest are set to a binary "0". In this manner the subregion image is effectively converted into a "blue and white" binary image (step 222).
Conventional image processing morphological operations of erosion (step 224) and dilation (step 226) are performed on the binary image. The erosion operation eliminates relatively isolated "blue" pixels, much like median filtering in a binary space. The dilation operation strengthens or enlarges remaining "blue" pixels (which are not isolated), effectively causing "blue" pixels in a localized neighborhood to "grow together."
Next, as shown by step 228, the blob characteristics (e.g., centroid, spread, perimeter, number of pixels) of the "blue" pixel groups are computed in a conventional manner to determine the size of the identified region of likely potential interest. These blob characteristics can, for example, be used to determine the size and location of the bounding boxes which are overlaid on the original image.
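The sketch below walks through steps 220 through 228 for a single subimage using common OpenCV and NumPy operations. Note that the hue thresholds of 184 - 200 quoted above appear to assume a 0-255 hue scale, whereas OpenCV's 8-bit hue runs 0-179, so the band is rescaled here; that rescaling, the structuring element, and the function name are assumptions, not the patent's implementation.

```python
# Minimal sketch of evaluation method 204' (steps 220-228) under the assumptions above.
import cv2
import numpy as np

def find_blue_blobs(subimage_rgb: np.ndarray):
    # Step 220: transform RGB color coordinates into HSV space.
    hsv = cv2.cvtColor(subimage_rgb, cv2.COLOR_RGB2HSV)
    hue = hsv[:, :, 0]

    # Step 222: two-sided threshold around the "blue" band. The 184-200 band is
    # rescaled from an assumed 0-255 hue scale to OpenCV's 0-179 scale (~129-140).
    lo, hi = int(184 * 179 / 255), int(200 * 179 / 255)
    binary = ((hue >= lo) & (hue <= hi)).astype(np.uint8) * 255   # "fairly blue" -> 255, else 0

    # Steps 224 and 226: erosion removes isolated blue pixels; dilation grows the
    # surviving groups so neighboring blue pixels merge together.
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.erode(binary, kernel)
    binary = cv2.dilate(binary, kernel, iterations=2)

    # Step 228: blob characteristics (bounding box, number of pixels, centroid).
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    blobs = []
    for i in range(1, n):                                   # label 0 is the background
        x, y, w, h, area = stats[i]
        blobs.append({"bbox": (int(x), int(y), int(w), int(h)),
                      "area": int(area),
                      "centroid": (float(centroids[i][0]), float(centroids[i][1]))})
    return blobs
```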
The bounding boxes or other information describing the size and shape of the identified regions of likely potential interest can be used to rank or prioritize the identified regions (step 206 in Figure 8). For example, the area of the identified region (e.g., the number of pixels in the bounding box) can be used as a measure of importance. Another characteristic which can be used as a measure of importance is the concentration of relatively small bounding boxes (e.g., the number of spatially co-located identified regions). Yet another characteristic is the compactness or density of "blue" pixels in each identified region. These characteristics can be used individually or in combinations to prioritize the identified regions of likely potential interest for subsequent action.
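One way to combine these measures into a ranking is sketched below. The particular weighting of area, blue-pixel density and neighborhood concentration is an assumption; the patent states only that the characteristics can be used individually or in combination.

```python
# Illustrative ranking sketch; the score combination and weights are assumptions.
import math

def rank_regions(blobs, neighborhood=200.0, w_area=1.0, w_density=100.0, w_concentration=50.0):
    scored = []
    for b in blobs:
        x, y, w, h = b["bbox"]
        cx, cy = b["centroid"]
        density = b["area"] / float(max(w * h, 1))        # compactness of blue pixels in the box
        neighbors = sum(                                  # spatially co-located identified regions
            1 for other in blobs
            if other is not b and math.dist((cx, cy), other["centroid"]) < neighborhood
        )
        score = w_area * b["area"] + w_density * density + w_concentration * neighbors
        scored.append((score, b))
    scored.sort(key=lambda s: s[0], reverse=True)
    return [b for _, b in scored]                         # highest-priority regions first
```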
The image acquisition and transmission procedure described above with reference to step 212 in Figure 8 can make use of artificial intelligence methods to enhance the likelihood that the detailed images which are automatically acquired and transmitted are images that a pathologist will want to observe (i.e., images that the pathologist would have selected). An object of this procedure can be to acquire and transmit detailed images in a sequence or "trajectory" that matches the sequence the pathologist would have followed. To achieve this objective, artificial intelligence techniques can be applied to the image acquisition and transmission procedure 212. A pathologist-specific training set of information can, for example, be used in the image acquisition and transmission procedure 212. It has been observed that the behavior of individual pathologists during image diagnosis sessions often exhibits specific tendencies. In other words, the manner by which they observe specimen images tends to be similar from specimen to specimen. By way of example, some pathologists scan an image (i.e., the sequence in which they acquire images of regions of interest) in a spiral pattern, while others scan images in rows like the text on this document. Some pathologists tend to jump or skip from the relatively low magnification (e.g., 2x or 4x) of the overview image such as 100 directly to relatively high magnification (e.g., 20x) images of regions of interest, while others will increase the magnification of sequential images in more gradual steps (e.g., 2x to 6x to 20x). Other information that can be incorporated into the pathologist-specific training set includes preferred specimen illumination settings and camera imaging parameters settings. Furthermore, these pathologist-specific tendencies can vary with tissue-specific information such as the type of organ from which the specimen is taken and the suspected tumor or disease type (e.g., how does the pathologist observe specimens for different types of tissue), and/or with patient-specific information such as age, gender and condition. Regions of likely potential interest can also be identified as a function of the length of time a pathologist spends observing images having specific types of features.
As noted above, the server system workstation 26 can automatically maintain a log or cache describing in chronological sequence the control commands transmitted to the server system 12 and the coordinates of retrieved specimen image regions. These stored histories, alone or in combination with the associated image data, can be used to develop training sets for associated pathologist-specific, tissue-specific and patient-specific information. Neural nets can also be taught with these stored histories. Alternatively, the pathologist-specific, tissue-specific and patient-specific information for the training sets can be empirically derived by observing pathologists during diagnostic sessions associated with the type of information desired. Furthermore, pathologist-specific information can be obtained "on the fly" during a given diagnostic session, and used to control the acquisition of subsequent images. The application of predictive methodologies such as those described above to the automatic acquisition and transmission of images of identified regions enhances the previously described advantages of the automatic acquisition function. In particular, these methodologies increase the likelihood that the detailed images which are automatically acquired and transmitted will be those that the pathologist is most interested in observing, and are present for essentially immediate display and observation.
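As a final illustration, the sketch below shows how a pathologist-specific profile mined from such logs might order the pre-acquisition of detail images: the identified regions are re-ordered according to the pathologist's habitual scan pattern, and the next magnification is taken from his or her habitual magnification steps. The profile fields and the scheduling logic are assumptions about how such a training set could drive step 212.

```python
# Hedged sketch of pathologist-specific scheduling of pre-acquisition; all names
# and the profile structure are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PathologistProfile:
    scan_order: str = "row_major"                 # or "spiral", learned from logged region sequences
    magnification_steps: List[int] = field(default_factory=lambda: [2, 20])  # e.g. jumps straight to 20x
    preferred_illumination: float = 0.7

def next_acquisitions(ranked_regions: List[dict], profile: PathologistProfile,
                      current_mag: int) -> List[Tuple[Tuple[int, int], int]]:
    """Return (region centroid, magnification) pairs in the order to pre-acquire them."""
    regions = list(ranked_regions)
    if profile.scan_order == "row_major":
        # Re-order the shortlist the way this pathologist habitually scans: by rows, left to right.
        regions.sort(key=lambda r: (round(r["centroid"][1]), r["centroid"][0]))
    # Next magnification the pathologist usually requests after the current one.
    higher = [m for m in profile.magnification_steps if m > current_mag]
    next_mag = higher[0] if higher else current_mag
    return [((int(r["centroid"][0]), int(r["centroid"][1])), next_mag) for r in regions]
```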
Although the present invention has been described with reference to preferred embodiments, those skilled in the art will recognize that changes can be made in form and detail without departing from the spirit and scope of the invention. In particular, although described in connection with a telepathology system, the invention can be applied to other microscope or imaging systems used to observe stained or other biologic tissue samples (e.g., Pap smear analyses).

Claims

WHAT IS CLAIMED IS:
1. A method for operating a digital imaging system to automatically identify regions on a biologic tissue specimen of likely potential interest to a clinician providing diagnostic opinions on the specimen, including: providing a training set of tissue-specific information representative of tissue specimen characteristics of likely potential interest to a clinician; generating a first specimen image at a first magnification; evaluating the first specimen image as a function of the training set of information to identify interest regions of likely potential interest; and taking action on at least one of the identified interest regions.
2. The method of claim 1 wherein evaluating the first specimen image includes: breaking the first specimen image into a plurality of subimages; and evaluating the subimages as a function of the training set information to identify the interest regions.
3. The method of claim 1 wherein: the method further includes displaying the first specimen image; and taking action on at least one of the identified interest regions includes highlighting at least one of the displayed identified interest regions.
4. The method of claim 3 wherein: the method further includes ranking the interest regions in terms of their relative potential interest; and highlighting the identified interest regions includes highlighting the interest regions as a function of their ranking.
5. The method of claim 1 and further including ranking the identified interest regions in terms of their relative potential interest.
6. The method of claim 1 wherein taking action on at least one of the identified interest regions includes generating a detailed specimen image of a first of the identified interest regions at a magnification which is greater than the first magnification.
7. The method of claim 6 and further including transmitting the detailed specimen image of the first identified interest region to a remote location.
8. The method of claim 7 and further including: generating a detailed specimen image of a second of the identified interest regions at a magnification which is greater than the first magnification; and transmitting the detailed specimen image of the second of the identified interest regions to the remote location.
9. The method of claim 8 and further including repeating the steps of generating and transmitting a detailed specimen image at a magnification greater than the first magnification for at least a third of the identified interest regions.
10. The method of claim 9 wherein: the method further includes ranking the identified interest regions in terms of their relative potential interest; and the steps of generating and transmitting detailed specimen images are performed on identified interest regions as a function of their ranking.
11. The method of claim 10 wherein: the method further includes providing a training set of clinician-specific information representative of a clinician's specimen observation behavior; and generating detailed specimen images includes generating the detailed specimen images as a function of the training set of clinician-specific information.
12. The method of claim 11 wherein: providing the training set of clinician-specific information includes providing a training set of information including information representative of a clinician's detailed image magnification behavior; and generating detailed specimen images includes generating detailed specimen images at magnifications determined as a function of the information representative of the image magnification behavior.
13. The method of claim 6 wherein: the method further includes ranking the identified interest regions in terms of their relative potential interest; and generating a detailed specimen image includes generating a detailed specimen image on a relatively highly ranked identified interest region.
14. The method of claim 6 wherein: the method further includes providing a training set of clinician-specific information representative of a clinician's specimen observation behavior; and generating detailed specimen images includes generating the detailed specimen images as a function of the training set of clinician-specific information.
15. The method of claim 14 wherein: providing the training set of clinician-specific information includes providing a training set of information including information representative of a clinician's detailed image magnification behavior; and generating detailed specimen images includes generating detailed specimen images at magnifications determined as a function of the information representative of the image magnification behavior.
16. The method of claim 15 wherein : providing the training set of clinician-specific information includes providing a training set of information including information representative of a clinician's interest region scanning sequence behavior; and generating detailed specimen images includes generating detailed specimen images on identified interest regions in an order determined as a function of the information representative of the scanning sequence behavior.
17. The method of claim 14 wherein: providing the training set of clinician-specific information includes providing a training set of information including information representative of a clinician's interest region scanning sequence behavior; and generating detailed specimen images includes generating detailed specimen images on identified interest regions in an order determined as a function of the information representative of the scanning sequence behavior.
18. The method of claim 6 and further including: evaluating the detailed specimen image as a function of the training set of information to identify interest regions of likely potential interest; and taking action on at least one of the identified interest regions in the detailed specimen image.
19. The method of claim 18 wherein: the method further includes ranking the interest regions in the detailed specimen image in terms of their relative potential interest; and highlighting the identified interest regions includes highlighting the interest regions as a function of their ranking.
20. Operating a server system workstation in accordance with the method of claim 1, wherein the server system workstation is configured for use in connection with a digital telemedicine imaging system of the type having a server system at each of at least one remote surgical site and a client system at a local site, for enabling a clinician at the local site to observe, analyze and provide diagnostic opinions on medical specimens from patients at the remote surgical sites, the client system workstation adapted to be interconnected to a digital communication channel for telecommunication with the server system.
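The claims above describe an image-analysis pipeline: claims 1 and 2 break a low-magnification overview image into subimages and evaluate each one against a training set of tissue-specific information, while claims 3 through 5 rank and highlight the resulting regions of likely potential interest. The sketch below illustrates one way such a tiling-and-scoring step could be organized; the TissueClassifier stand-in, the 256-pixel tile size, and the 0.5 score threshold are assumptions made for the example and are not taken from the disclosure.

```python
# Illustrative sketch only: tiling a low-magnification overview image and
# scoring each tile against a trained, tissue-specific model (claims 1-5).
# TissueClassifier, the tile size, and the threshold are example assumptions.
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class InterestRegion:
    row: int        # tile row index within the overview image
    col: int        # tile column index within the overview image
    score: float    # higher score = more likely to interest the clinician


class TissueClassifier:
    """Stand-in for a model trained on tissue-specific examples."""

    def score(self, tile: np.ndarray) -> float:
        # A real system would apply a trained statistical or neural model;
        # this placeholder simply uses normalized mean intensity.
        return float(tile.mean()) / 255.0


def find_interest_regions(overview: np.ndarray,
                          classifier: TissueClassifier,
                          tile_size: int = 256,
                          threshold: float = 0.5) -> List[InterestRegion]:
    """Break the first specimen image into subimages, evaluate each one,
    and return the qualifying regions ranked by relative potential interest."""
    regions: List[InterestRegion] = []
    rows, cols = overview.shape[:2]
    for r in range(0, rows - tile_size + 1, tile_size):
        for c in range(0, cols - tile_size + 1, tile_size):
            tile = overview[r:r + tile_size, c:c + tile_size]
            s = classifier.score(tile)
            if s >= threshold:
                regions.append(InterestRegion(r // tile_size, c // tile_size, s))
    regions.sort(key=lambda region: region.score, reverse=True)   # ranking step
    return regions
```

Under these assumptions, highlighting (claims 3 and 4) would then amount to drawing the ranked tiles back onto the displayed overview, for example with heavier outlines on the more highly ranked regions.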
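Claims 6 through 10 add mechanical prepositioning and pretransmission: detailed images of the ranked regions are acquired at a magnification greater than the overview magnification and sent to the clinician's remote site in rank order, before they are requested. A minimal sketch of that acquire-and-transmit loop follows, reusing the InterestRegion records from the previous sketch; the Microscope and Channel interfaces are hypothetical stand-ins for the motorized-stage controller and the digital communication channel, not the system's actual API.

```python
# Illustrative sketch only: acquiring and pretransmitting detailed images of
# ranked interest regions (claims 6-10). Microscope and Channel are
# hypothetical interfaces for the stage controller and communication channel.
from typing import Iterable, Protocol

import numpy as np


class Microscope(Protocol):
    def move_to(self, row: int, col: int) -> None: ...
    def capture(self, magnification: float) -> np.ndarray: ...


class Channel(Protocol):
    def send(self, image: np.ndarray, metadata: dict) -> None: ...


def pretransmit_detailed_images(regions: Iterable["InterestRegion"],
                                scope: Microscope,
                                channel: Channel,
                                detail_magnification: float = 20.0) -> None:
    """Acquire detailed images of the ranked regions, in rank order, at a
    magnification greater than the overview magnification, and transmit each
    one to the remote site so it is queued before the clinician requests it."""
    for rank, region in enumerate(regions, start=1):
        scope.move_to(region.row, region.col)          # preposition the stage
        detail = scope.capture(detail_magnification)   # detailed specimen image
        channel.send(detail, {
            "rank": rank,
            "row": region.row,
            "col": region.col,
            "magnification": detail_magnification,
        })
```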
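Claims 11 through 17 layer a second, clinician-specific training set over the tissue-specific one: the system learns a clinician's preferred detail magnification and region-scanning sequence and uses that profile to decide at what magnification, and in what order, detailed images are generated. The profile sketched below uses a running average and a simple score-then-position sort as illustrative heuristics only, not as the method of the disclosure.

```python
# Illustrative sketch only: a clinician-specific profile that learns preferred
# detail magnification and scanning order (claims 11-17). The averaging and
# sorting heuristics are assumptions made for the example.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ClinicianProfile:
    magnifications_used: List[float] = field(default_factory=list)
    prefers_top_to_bottom_scan: bool = True   # learned scanning-sequence trait

    def record_magnification(self, magnification: float) -> None:
        """Update the profile from an observed diagnostic session."""
        self.magnifications_used.append(magnification)

    def preferred_magnification(self, default: float = 20.0) -> float:
        """Magnification to use for prefetched detail images (claims 12, 15)."""
        if not self.magnifications_used:
            return default
        return sum(self.magnifications_used) / len(self.magnifications_used)

    def order_regions(self, regions: List["InterestRegion"]) -> List["InterestRegion"]:
        """Order regions the way this clinician tends to scan them
        (claims 16 and 17): here by score, then top-to-bottom, left-to-right."""
        if self.prefers_top_to_bottom_scan:
            return sorted(regions, key=lambda r: (-r.score, r.row, r.col))
        return sorted(regions, key=lambda r: -r.score)
```

Chaining the three sketches, a server session could call profile.order_regions(find_interest_regions(overview, classifier)) and pass the result to pretransmit_detailed_images(...) together with profile.preferred_magnification().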
PCT/US1998/023760 1997-12-11 1998-11-10 Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition WO1999030264A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU13136/99A AU1313699A (en) 1997-12-11 1998-11-10 Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US98912197A 1997-12-11 1997-12-11
US08/989,121 1997-12-11

Publications (1)

Publication Number Publication Date
WO1999030264A1 true WO1999030264A1 (en) 1999-06-17

Family

ID=25534784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/023760 WO1999030264A1 (en) 1997-12-11 1998-11-10 Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition

Country Status (2)

Country Link
AU (1) AU1313699A (en)
WO (1) WO1999030264A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001078009A2 (en) * 2000-03-31 2001-10-18 Glaxo Group Limited Quantitative analysis of biological images
WO2002033867A2 (en) * 2000-10-18 2002-04-25 Rageshkumar Mahendrabhai Shah An electronic system to consult doctor at & from any place in the world
WO2002048680A1 (en) * 2000-12-13 2002-06-20 THE GOVERNMENT OF THE UNITED STATES OF AMERICA AS REPRESENTED BY THE SECRETARY OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES. The National Institutes of Health Method and system for processing regions of interest for objects comprising biological material
EP1223853A1 (en) * 1999-10-08 2002-07-24 The Research Foundation Of State University Of New York Virtual telemicroscope
WO2001057777A3 (en) * 2000-02-04 2002-08-22 Arch Dev Corp Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images
EP1234176A1 (en) * 1999-11-01 2002-08-28 Keren Mechkarim Ichilov, Pnimit D' System and method for generating a profile of particulate components of a body fluid sample
WO2003010708A1 (en) * 2000-11-30 2003-02-06 University Of Medicine & Dentistry Of New Jersey Collaborative diagnostic systems
WO2003067256A2 (en) * 2002-02-05 2003-08-14 University Of Medicine & Dentistry Of New Jersey Systems for analyzing microtissue arrays
EP1422648A2 (en) * 2002-10-29 2004-05-26 National Institute of Radiological Sciences Sample picture data processing method and sample inspection system and sample inspection method
EP1455305A2 (en) 2003-03-05 2004-09-08 Fairfield Imaging Ltd. Method for providing quantitive data and images for use in pathology analysis
US7171030B2 2000-11-30 2007-01-30 University Of Medicine & Dentistry Of New Jersey Systems for analyzing microtissue arrays
US7292251B1 (en) 2000-10-06 2007-11-06 The Research Foundation Of State University Of New York Virtual telemicroscope
US7542596B2 (en) 1996-08-23 2009-06-02 Olympus America Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
US7596249B2 (en) 2002-02-22 2009-09-29 Olympus America Inc. Focusable virtual microscopy apparatus and method
WO2010133375A1 (en) * 2009-05-22 2010-11-25 Leica Microsystems Cms Gmbh System and method for computer-controlled execution of at least one test in a scanning microscope
US7854899B2 (en) 2004-08-26 2010-12-21 The United States Of America As Represented By The Secretary Of Health And Human Services Template methods and devices for preparing sample arrays
EP2535753A1 (en) * 2011-06-15 2012-12-19 Möller-Wedel GmbH Operation microscope with video editing function
EP2734838A1 (en) * 2011-07-20 2014-05-28 Mikroscan Technologies, Inc. Network-based pathology system with desktop slide scanner
US10074148B2 (en) 2011-03-31 2018-09-11 Rite Aid Hdqtrs. Corp. Medical kiosk and method of use
US10119901B2 (en) 2013-11-15 2018-11-06 Mikroscan Technologies, Inc. Geological scanner
US10162166B2 (en) 2014-10-28 2018-12-25 Mikroscan Technologies, Inc. Microdissection viewing system
US10223681B2 (en) 2012-08-15 2019-03-05 Rite Aid Hdqtrs. Corp. Veterinary kiosk with integrated veterinary medical devices
WO2024018351A1 (en) * 2022-07-19 2024-01-25 Ntp Nano Tech Projects S.R.L. Method and system for managing in real time digital images generated by a digital optical platform

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5297034A (en) * 1987-04-30 1994-03-22 Corabi International Telemetrics, Inc. Telepathology diagnostic network
US5331550A (en) * 1991-03-05 1994-07-19 E. I. Du Pont De Nemours And Company Application of neural networks as an aid in medical diagnosis and general anomaly detection
GB2288511A (en) * 1994-04-15 1995-10-18 David Harris Diagnostic method and apparatus

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7542596B2 (en) 1996-08-23 2009-06-02 Olympus America Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
EP1223853A4 (en) * 1999-10-08 2006-03-15 Univ New York State Res Found Virtual telemicroscope
EP1223853A1 (en) * 1999-10-08 2002-07-24 The Research Foundation Of State University Of New York Virtual telemicroscope
EP1234176A1 (en) * 1999-11-01 2002-08-28 Keren Mechkarim Ichilov, Pnimit D' System and method for generating a profile of particulate components of a body fluid sample
EP1234176A4 (en) * 1999-11-01 2005-09-21 Inflamet Ltd System and method for generating a profile of particulate components of a body fluid sample
WO2001057777A3 (en) * 2000-02-04 2002-08-22 Arch Dev Corp Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images
US6901156B2 (en) 2000-02-04 2005-05-31 Arch Development Corporation Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images
US7184582B2 (en) 2000-02-04 2007-02-27 Arch Development Corporation Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images
WO2001078009A3 (en) * 2000-03-31 2002-08-08 Glaxo Group Ltd Quantitative analysis of biological images
WO2001078009A2 (en) * 2000-03-31 2001-10-18 Glaxo Group Limited Quantitative analysis of biological images
US7292251B1 (en) 2000-10-06 2007-11-06 The Research Foundation Of State University Of New York Virtual telemicroscope
WO2002033867A3 (en) * 2000-10-18 2005-06-02 Rageshkumar Mahendrabhai Shah An electronic system to consult doctor at & from any place in the world
WO2002033867A2 (en) * 2000-10-18 2002-04-25 Rageshkumar Mahendrabhai Shah An electronic system to consult doctor at & from any place in the world
US7027633B2 (en) 2000-11-30 2006-04-11 Foran David J Collaborative diagnostic systems
US7171030B2 2000-11-30 2007-01-30 University Of Medicine & Dentistry Of New Jersey Systems for analyzing microtissue arrays
WO2003010708A1 (en) * 2000-11-30 2003-02-06 University Of Medicine & Dentistry Of New Jersey Collaborative diagnostic systems
WO2002048680A1 (en) * 2000-12-13 2002-06-20 THE GOVERNMENT OF THE UNITED STATES OF AMERICA AS REPRESENTED BY THE SECRETARY OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES. The National Institutes of Health Method and system for processing regions of interest for objects comprising biological material
WO2003067256A3 (en) * 2002-02-05 2003-10-16 Univ New Jersey Med Systems for analyzing microtissue arrays
US7079673B2 2002-02-05 2006-07-18 University Of Medicine & Dentistry Of NJ Systems for analyzing microtissue arrays
WO2003067256A2 (en) * 2002-02-05 2003-08-14 University Of Medicine & Dentistry Of New Jersey Systems for analyzing microtissue arrays
US7925067B2 (en) 2002-02-22 2011-04-12 Olympus America Inc. Focusable virtual microscopy apparatus and method
US8306300B2 (en) 2002-02-22 2012-11-06 Olympus America Inc. Focusable virtual microscopy apparatus and method
US7596249B2 (en) 2002-02-22 2009-09-29 Olympus America Inc. Focusable virtual microscopy apparatus and method
EP1422648A2 (en) * 2002-10-29 2004-05-26 National Institute of Radiological Sciences Sample picture data processing method and sample inspection system and sample inspection method
EP1422648A3 (en) * 2002-10-29 2004-08-18 National Institute of Radiological Sciences Sample picture data processing method and sample inspection system and sample inspection method
US7593556B2 (en) 2002-10-29 2009-09-22 National Institute Of Radiological Sciences Sample picture data processing method and sample inspection system and method
EP1455305A3 (en) * 2003-03-05 2010-11-24 Hamamatsu Photonics K.K. Method for providing quantitative data and images for use in pathology analysis
EP1455305A2 (en) 2003-03-05 2004-09-08 Fairfield Imaging Ltd. Method for providing quantitive data and images for use in pathology analysis
US7854899B2 (en) 2004-08-26 2010-12-21 The United States Of America As Represented By The Secretary Of Health And Human Services Template methods and devices for preparing sample arrays
US9599804B2 (en) 2009-05-22 2017-03-21 Leica Microsystems Cms Gmbh System and method for computer-controlled execution of at least one test in a scanning microscope
WO2010133375A1 (en) * 2009-05-22 2010-11-25 Leica Microsystems Cms Gmbh System and method for computer-controlled execution of at least one test in a scanning microscope
US10074148B2 (en) 2011-03-31 2018-09-11 Rite Aid Hdqtrs. Corp. Medical kiosk and method of use
EP2535753A1 (en) * 2011-06-15 2012-12-19 Möller-Wedel GmbH Operation microscope with video editing function
EP2734838A4 (en) * 2011-07-20 2015-04-01 Mikroscan Technologies Inc Network-based pathology system with desktop slide scanner
US9495577B2 (en) 2011-07-20 2016-11-15 Mikroscan Technologies, Inc. Network-based pathology system with desktop slide scanner
US9871960B2 (en) 2011-07-20 2018-01-16 Mikroscan Technologies, Inc. Network-based pathology system with desktop slide scanner
US9883093B2 (en) 2011-07-20 2018-01-30 Mikroscan Technologies, Inc. Network-based pathology system with desktop slide scanner
EP2734838A1 (en) * 2011-07-20 2014-05-28 Mikroscan Technologies, Inc. Network-based pathology system with desktop slide scanner
US10223681B2 (en) 2012-08-15 2019-03-05 Rite Aid Hdqtrs. Corp. Veterinary kiosk with integrated veterinary medical devices
US10119901B2 (en) 2013-11-15 2018-11-06 Mikroscan Technologies, Inc. Geological scanner
US10162166B2 (en) 2014-10-28 2018-12-25 Mikroscan Technologies, Inc. Microdissection viewing system
WO2024018351A1 (en) * 2022-07-19 2024-01-25 Ntp Nano Tech Projects S.R.L. Method and system for managing in real time digital images generated by a digital optical platform

Also Published As

Publication number Publication date
AU1313699A (en) 1999-06-28

Similar Documents

Publication Publication Date Title
WO1999030264A1 (en) Digital telepathology imaging system programmed for identifying image regions of potential interest and anticipating sequence of image acquisition
EP3776458B1 (en) Augmented reality microscope for pathology with overlay of quantitative biomarker data
US20210224541A1 (en) Augmented Reality Microscope for Pathology
JP4542386B2 (en) Image display system, image providing apparatus, image display apparatus, and computer program
US5216596A (en) Telepathology diagnostic network
Yagi et al. Digital imaging in pathology: the case for standardization
EP1016031B1 (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
CN104981798B (en) The selection and display of biological marker expression
TWI412949B (en) Automated selection of image regions
JP2022534155A (en) A Neural Network-Based Identification Method for Regions of Interest in Digital Pathology Images
JP6348504B2 (en) Biological sample split screen display and system and method for capturing the records
US6246785B1 (en) Automated, microscope-assisted examination process of tissue or bodily fluid samples
CA2398736C (en) Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
US20060159325A1 (en) System and method for review in studies including toxicity and risk assessment studies
US20020061127A1 (en) Apparatus for remote control of a microscope
WO1998044446A9 (en) Method and apparatus for acquiring and reconstructing magnified specimen images from a computer-controlled microscope
JP2008535528A (en) System and method for forming variable quality images of slides
WO1999013360A2 (en) Digital telepathology imaging system with bandwidth optimization and virtual focussing
EP0293083A2 (en) Remote transmission diagnostic system
Weinstein et al. Static and dynamic imaging in pathology
JP5702943B2 (en) Pathological diagnosis support device, pathological diagnosis support method, control program for pathological diagnosis support, and recording medium recording the control program
Gilbertson et al. Clinical slide digitization: whole slide imaging in clinical practice experience from the university of pittsburgh
JP4490225B2 (en) Cell image display method, cell image display system, cell image display device, and computer program
WO2022201992A1 (en) Medical image analysis device, medical image analysis method, and medical image analysis system
Ljubojević et al. Improving the Teaching of Histology by Using the Manual Whole Slide Imaging Technology

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: KR

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA