CA2232164A1 - A neural network assisted multi-spectral segmentation system - Google Patents

A neural network assisted multi-spectral segmentation system

Info

Publication number
CA2232164A1
Authority
CA
Canada
Prior art keywords
images
neural network
nuclear
cellular material
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002232164A
Other languages
French (fr)
Inventor
Ryan S. Raz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Veracel Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of CA2232164A1 publication Critical patent/CA2232164A1/en
Abandoned legal-status Critical Current

Classifications

    • G01N15/1433
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20036Morphological image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro

Abstract

A neural network assisted multi-spectral segmentation method and system.
According to the invention, three images having different optical bands are acquired for the same micrographic scene of a biological sample. The images are processed and a cellular material map is generated identifying cellular material. The cellular material map is then applied to a neural network. The neural network classifies the cellular material map into nuclear objects and cytoplasmic objects by determining a threshold surface in the 3-dimensional space separating the cytoplasmic and nuclear regions. In another aspect, the neural network comprises a hardware-encoded algorithm in the form of a look-up table.

Description

A NEURAL NETWORK ASSISTED MULTI-SPECTRAL SEGMENTATION SYSTEM

FIELD OF THE INVENTION
The present invention relates to automated diagnostic techniques in medicine and biology, and more particularly to a neural network for multi-spectral segmentation of nuclear and cytoplasmic objects.

BACKGROUND OF THE INVENTION
Automated diagnostic systems in medicine and biology often rely on the visual inspection of microscopic images. Known systems attempt to mimic or imitate the procedures employed by humans. An appropriate example of this type of system is an automated instrument designed to assist a cyto-technologist in the review or diagnosis of Pap smears. In its usual operation such a system will rapidly acquire microscopic images of the cellular content of the Pap smears and then subject them to a battery of image analysis procedures. The goal of these procedures is the identification of images that are likely to contain unusual or potentially abnormal cervical cells.
The image analysis techniques utilized by these automated instruments are similar to the procedures consciously, and often unconsciously, performed by the human cyto-technologist. There are three distinct operations that must follow each other for this type of evaluation: (1) segmentation; (2) feature extraction; and (3) classification.
Segmentation is the delineation of the objects of interest within the micrographic image. In addition to the cervical cells required for an analysis, there is a wide range of "background" material, debris and contamination that interferes with the identification of the cervical cells and therefore must be delineated. Also, for each cervical cell it is necessary to delineate the nucleus within the cytoplasm.
The feature extraction operation is performed after the completion of the segmentation operation. Feature extraction comprises characterizing the segmented regions as a series of descriptors based on the morphological, textural, densitometric and colorimetric attributes of these regions.

The classification step is the final step in the image analysis. The features extracted in the previous stage are used in some type of discriminant-based classification procedure. The results of this classification are then translated into a "diagnosis" of the cells in the image.
Of the three stages outlined above, segmentation is the most crucial and the most difficult. This is particularly true for the types of images typically encountered in medical or biological specimens.
In the case of a Pap smear, the goal of segmentation is to accurately delineate the cervical cells and their nuclei. The situation is complicated not only by the variety of cells found in the smear, but also by the alterations in morphology produced by the sample preparation technique and by the quantity of debris associated with these specimens. Furthermore, during preparation it is difficult to control the way cervical cells are deposited on the surface of the slide, which leads to a large amount of cell overlap and distortion.
Under these circumstances a segmentation operation is difficult. One known way to improve the accuracy and speed of segmentation for these types of images involves exploiting the differential staining procedure associated with all Pap smears. According to the Papanicolaou protocol, the nuclei are stained dark blue while the cytoplasm is stained anything from a blue-green to an orange-pink. The Papanicolaou Stain is a combination of several stains or dyes together with a specific protocol designed to emphasize and delineate cellular structures of importance for pathological analysis. The stains or dyes included in the Papanicolaou Stain are Haematoxylin, Orange G and Eosin Azure (a mixture of two acid dyes, Eosin Y and Light Green SF Yellowish, together with Bismarck Brown). Each stain component is sensitive to or binds selectively to a particular cell structure or material. Haematoxylin binds to the nuclear material, colouring it dark blue. Orange G is an indicator of keratin protein content. Eosin Y stains nucleoli, red blood cells and mature squamous epithelial cells. Light Green SF Yellowish stains metabolically active epithelial cells. Bismarck Brown stains vegetable material and cellulose.
The combination of these stains and their diagnostic interpretation has evolved into a stable medical protocol which predates the advent of computer-aided imaging instruments. Consequently, the dyes present a complex pattern of spectral properties to standard image analysis procedures. Specifically, a simple spectral decomposition based on the optical behaviour of the dyes is not sufficient on its own to reliably distinguish the cellular components within an image. The overlap of the spectral response of the dyes is too large for this type of straightforward segmentation.
The use of differential staining characteristics is only a means to an end in the solution of the segmentation problem. Of equal importance is the procedure for handling the information provided by the spectral character of the cellular objects when making a decision concerning identity.
In the art, attempts have been made to automate diagnostic procedures; however, there remains a need for a system for performing the segmentation process.

BRIEF SUMMARY OF THE INVENTION
The present invention provides a Neural-Network Assisted Multi-Spectral Segmentation (also referred to as NNA-MSS) method and system.
The first stage according to the present invention comprises the acquisition of three images of the same micrographic scene. Each image is obtained using a different narrow band-pass optical filter, which has the effect of selecting a narrow band of optical wavelengths associated with distinguishing absorption peaks in the stain spectra. The choice of optical wavelength bands is guided by the degree of separation afforded by these peaks when used to distinguish the different types of cellular material on the slide surface.
The second stage according to the invention comprises a neural network (trained on an extensive set of typical examples) to make decisions on the identity of material already deemed to be cellular in origin. The neural network decides whether a picture element in the digitized image is nuclear or non-nuclear in character. With the completion of this step the system can continue by applying a standard range of image processing techniques to refine the segmentation. The relationship between the cellular components and the transmission intensity of the light images in each of the three spectral bands is a complex and non-linear one. By using a neural network to combine the information from these three images it is possible to achieve a high degree of success in separating the cervical cell from the background and the nuclei from the cytoplasm, a success that would not be possible with a set of linear operations alone.
The diagnosis and evaluation of Pap smears is aided by the introduction of a differential staining procedure called the Papanicolaou Stain. The Papanicolaou Stain is a combination of several stains or dyes together with a specific protocol designed to emphasize and delineate cellular structures of importance to pathological analysis. The stains or dyes included in the Papanicolaou Stain are Haematoxylin, Orange G and Eosin Azure (a mixture of two acid dyes, Eosin Y and Light Green SF Yellowish, together with Bismarck Brown). Each stain component is sensitive to or binds selectively to a particular cellular structure or material. Haematoxylin binds to the nuclear material, colouring it dark blue; Orange G is an indicator of keratin protein content; Eosin Y stains nucleoli, red blood cells and mature squamous epithelial cells; Light Green SF Yellowish stains metabolically active epithelial cells; Bismarck Brown stains vegetable material and cellulose.
According to another aspect of the invention, three optical wavelength bands are used in a complex procedure to segment Papanicolaou-stained epithelial cells in digitized images. The procedure utilizes standard segmentation operations (erosion, dilation, etc.) together with the neural network to identify the location of nuclear components in areas already determined to be cellular material. The purpose of the segmentation is to extract the cellular objects, i.e. to distinguish the nucleus of the cell from the cytoplasm. According to this segmentation, the multi-spectral images are divided into two classes, cytoplasmic objects and nuclear objects, which are separated by a multi-dimensional threshold t in a 3-dimensional space.
The neural network according to the invention comprises a Probability Projection Neural Network (PPNN). The PPNN according to the present invention features fast training for a large volume of data, processing of multi-modal non-Gaussian data distributions, and good generalization combined with high sensitivity to small clusters of patterns representing useful subclasses of cells. In another aspect, the PPNN is implemented as a hardware-encoded algorithm.
In one aspect, the present invention provides a method for identifying nuclear and cytoplasmic objects in a biological specimen, said method comprising the steps of: (a) acquiring a plurality of images of said biological specimen; (b) identifying cellular material from said images and creating a cellular material map; (c) applying a neural network to said cellular material map and classifying nuclear and cytoplasmic objects from said images.
In a second aspect, the present invention provides a system for identifying nuclear and cytoplasmic objects in a biological specimen, said system comprising: (a) image acquisition means for acquiring a plurality of images of said biological specimen; (b) processing means for processing said images and generating a cellular material map identifying cellular material; (c) neural processor means for processing said cellular material map and including means for classifying nuclear and cytoplasmic objects from said images.
In a third aspect, the present invention provides a hardware-encoded neural processor for classifying input data, said hardware-encoded neural processor comprising: (a) a memory having a plurality of addressable storage locations; (b) said addressable storage locations containing classification information associated with the input data; (c) address generation means for generating an address from said input data for accessing the classification information stored in said memory for selected input data.
A preferred embodiment of the present invention will now be described, by way of example, with reference to the following specification, claims, and drawings.

BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows in flow chart form a neural network assisted multi-spectral segmentation method according to the present invention;
Fig. 2 shows in diagrammatic form a processing element for the neural network;
Fig. 3 shows in diagrammatic form a neural network comprising the processing elements of Fig. 2;
Fig. 4 shows in diagrammatic form a training step for the neural network;
Fig. 5 shows in flow chart form a clustering algorithm for the neural network according to the present invention; and
Fig. 6 shows a hardware implementation for the neural network according to the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
The present invention provides a Neural Network Assisted Multi-Spectral Segmentation (also referred to as NNA-MSS) system and method. The multi-spectral segmentation method is related to that described and claimed in co-pending International Patent Application No. CA96/00477, filed July 18, 1996 in the name of the applicant.
The NNA-MSS according to the present invention is particularly suited to Papanicolaou-stained gynaecological smears and will be described in this context. It is, however, to be understood that the present invention has wider applicability to applications outside of Papanicolaou-stained smears.
Reference is first made to Fig. 1, which shows in flow chart form a Neural Network Assisted Multi-Spectral Segmentation (NNA-MSS) method 1 according to the present invention.
The first step 10 involves inputting three digitized images, i.e. micrographic scenes, of a cellular specimen. The images are taken in each of three narrow optical bands: 540 ± 5 nm, 577 ± 5 nm and 630 ± 5 nm. (The images are generated by an imaging system (not shown), as will be understood by one skilled in the art, and thus need not be described in detail here.) The images are next processed by the multi-spectral segmentation method 1 and neural network as will be described.
As shown in Fig. 1, the images are subjected to a levelling operation (block 12). The levelling operation 12 involves removing the spatial variations in illumination intensity from the images. The levelling operation is implemented as a simple mathematical routine using known image processing techniques. The result of the levelling operation is a set of 8-bit digitized images with uniform illumination across their fields.
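By way of illustration, a levelling step of this kind might be implemented along the following lines. This is a minimal sketch, assuming a blank-field reference image is available; the patent does not specify the routine, so the function names and the flat-field approach are assumptions:

```python
import numpy as np

def level_image(image: np.ndarray, blank_field: np.ndarray) -> np.ndarray:
    """Remove spatial illumination variation from an 8-bit image.

    `blank_field` is assumed to be a reference image of an empty region
    of the slide captured through the same optics; dividing by its
    normalized profile flattens the illumination across the field.
    """
    profile = blank_field.astype(np.float64)
    profile /= profile.mean()                      # unit-mean illumination profile
    levelled = image.astype(np.float64) / np.maximum(profile, 1e-6)
    return np.clip(levelled, 0, 255).astype(np.uint8)
```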
The 8-bit digitized images first undergo a series of processing steps to identify cellular material in the digitized images. The digitized images are then processed by the neural network to segment the nuclear objects from the cytoplasmic objects.
Referring to Fig. 1, following the levelling operation 12 the next operation comprises a threshold procedure (block 14). The threshold procedure involves analyzing the levelled images in a search for material of cellular origin. The threshold procedure 14 is applied to the 530 nm and 630 nm optical wavelength bands and comprises identifying material of cellular origin as regions of the digitized image that fall within a range of specific digital values. The threshold procedure 14 produces a single binary "map" of the image where the single binary bit identifies regions that are, or are not, cellular material.
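A minimal sketch of such a threshold step is shown below, taking the two levelled band images as input. The digital range (lo, hi) and the use of a logical AND across the two bands are illustrative assumptions; the patent does not give the specific values or combination rule:

```python
import numpy as np

def cellular_material_map(img_530: np.ndarray, img_630: np.ndarray,
                          lo: int = 30, hi: int = 220) -> np.ndarray:
    """Binary map of cellular material: a pixel is flagged as cellular
    when its digital value in both bands falls within [lo, hi]."""
    in_range = lambda im: (im >= lo) & (im <= hi)
    return (in_range(img_530) & in_range(img_630)).astype(np.uint8)
```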

The threshold operation 14 is followed by a dilation operation (block 16). The dilation operation 16 is a conventional image processing operation which modifies the binary map of cellular material generated in block 14. The dilation operation allows the regions of cellular material to grow or dilate by one pixel in order to fill small voids in large regions. Preferably, the dilation operation 16 is modified with the condition that the dilation does not allow two separate regions of cellular material to join into a single region, i.e. a "no-join" condition. This condition allows the accuracy of the binary map to be preserved through the dilation operation 16. Preferably, the dilation operation is applied twice to ensure a proper filling of voids. The result of the dilation operations 16 is a modified binary map of cellular material.
As shown in Fig. 1, the dilation operation 16 is followed by an erosion operation (block 18). The erosion operation 18 brings the modified binary map of cellular material (a result of the dilation operation 16) back to its original boundaries. The erosion operation 18 is implemented using conventional image processing techniques. The erosion operation 18 allows the cellular boundaries in the binary image to shrink or erode but will not affect the filled voids. Advantageously, the erosion operation 18 has the additional effect of eliminating small regions of cellular material that are not important to the later diagnostic analysis. The result of the erosion operation 18 is a final binary map of the regions in the digitized image that are cytoplasm.
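One plausible reading of the no-join dilation and the subsequent erosion is sketched below. The interpretation that "no-join" means a background pixel is turned on only when it touches exactly one labelled region, and the use of 8-connectivity, are assumptions of this sketch:

```python
import numpy as np
from scipy import ndimage

def dilate_no_join(binary_map: np.ndarray) -> np.ndarray:
    """One-pixel dilation that refuses to merge separate regions."""
    labels, _ = ndimage.label(binary_map)          # label connected regions
    out = binary_map.copy()
    h, w = binary_map.shape
    for y in range(h):
        for x in range(w):
            if binary_map[y, x]:
                continue
            neigh = labels[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
            regions = np.unique(neigh[neigh > 0])
            if len(regions) == 1:                  # touches one region only: safe to grow
                out[y, x] = 1
    return out

def fill_voids(binary_map: np.ndarray) -> np.ndarray:
    """Two no-join dilations (block 16) then two erosions (block 18),
    returning the map to its original boundaries with voids filled."""
    m = dilate_no_join(dilate_no_join(binary_map))
    return ndimage.binary_erosion(m, iterations=2).astype(np.uint8)
```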
The next stage according to the invention is the operation of the neural network at block 20. The neural network 20 is applied to the 8-bit digitized images, with attention restricted to those regions that lie within the cytoplasm as determined by the final binary cytoplasm map generated by the previous operations. The neural network 20 makes decisions concerning the identity of individual picture elements (or "pixels") in the binary image as either being part of a nucleus or not part of a nucleus. The result of the operation of the neural network is a digital map of the regions within the cytoplasm that are considered to be nuclear material. The nuclear material map is then subjected to further processing.
The neural network 20 according to the present invention is described in detail below.
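Conceptually, the per-pixel operation at block 20 can be sketched as follows. The `classify_pixel` callable stands in for the trained network and is an assumption of this sketch:

```python
import numpy as np

def nuclear_material_map(images: list, cytoplasm_map: np.ndarray,
                         classify_pixel) -> np.ndarray:
    """Apply the pixel classifier only where the cytoplasm map is set.

    `images` holds the three levelled 8-bit images; `classify_pixel`
    maps a 3-value pixel vector to 1 (nuclear) or 0 (not nuclear).
    """
    out = np.zeros_like(cytoplasm_map, dtype=np.uint8)
    ys, xs = np.nonzero(cytoplasm_map)
    for y, x in zip(ys, xs):
        triple = np.array([im[y, x] for im in images], dtype=np.float64)
        out[y, x] = classify_pixel(triple)
    return out
```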
Following the application of the neural network 20, the resulting nuclear material map is subjected to an erosion operation (block 22). The erosion operation 22 eliminates regions of the nuclear material map that are too small to be of diagnostic significance. The result is a modified binary map of nuclear regions.
The modified binary map resulting from the erosion operation 22 is then subjected to a dilation operation (block 24). The dilation operation 24 is subject to a no-join condition, such that the dilation operation does not allow two separate regions of nuclear material to join into a single region. In this way the accuracy of the binary map is preserved notwithstanding the dilation operation. The dilation operation 24 is preferably applied twice to ensure a proper filling of voids. The result of these dilation operations is a modified binary map of nuclear material.
Following the dilation operation 24, an erosion operation is applied (block 26). Double application of the erosion operation 26 eliminates regions of nuclear material in the binary map that are too small to be of diagnostic significance. The result is a modified binary map of nuclear regions.
The remaining operations involve constructing a binary map comprising high gradients, i.e. boundaries, of pixel intensity, in order to sever nuclear regions that share high-gradient boundaries. The presence of these high-gradient boundaries is evidence of two closely spaced but separate nuclei.
The first step in severing the high-gradient boundaries in the nuclear map is to construct a binary map of these high-gradient boundaries using a threshold operation (block 28) applied to a Sobel map.

The Sobel map is generated by applying the Sobel gradient operator to the 577 nm 8-bit digitized image to determine regions of that image that contain high gradients of pixel intensity (block 29). (The 8-bit digitized image for the 577 nm band was obtained from the levelling operation in block 12.) The result of the Sobel operation in block 29 is an 8-bit map of gradient intensity.
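A short sketch of blocks 29 and 28, a Sobel gradient-intensity map followed by thresholding; the threshold value is an illustrative assumption:

```python
import numpy as np
from scipy import ndimage

def high_gradient_map(img_577: np.ndarray, thresh: int = 64) -> np.ndarray:
    """8-bit Sobel gradient-intensity map of the 577 nm image (block 29),
    thresholded into a binary map of high-gradient boundaries (block 28)."""
    gx = ndimage.sobel(img_577.astype(np.float64), axis=1)
    gy = ndimage.sobel(img_577.astype(np.float64), axis=0)
    magnitude = np.hypot(gx, gy)
    sobel_map = (255 * magnitude / magnitude.max()).astype(np.uint8)
    return (sobel_map >= thresh).astype(np.uint8)
```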
Following the threshold Sobel operation 28, a logical NOT operation is performed (block 30). The logical NOT operation 30 determines the coincidence of the two states, high gradients and nuclei, and reverses the pixel value of the nuclear map at the point of coincidence in order to eliminate it from regions that are presumed to be nuclear material. The result of this logical operation is a modified nuclear map.
The modified nuclear map is next subjected to an erosion operation (block 32). The erosion operation 32 eliminates regions in the modified nuclear map that are too small to be of diagnostic significance. The result is a modified binary map of nuclear regions.
After the application of the gradient technique for severing close nuclear boundaries (blocks 28 and 30) and the erosion operation (block 32) for clearing the image of insignificant regions, the binary map of nuclear regions is dramatically altered. To restore the map to its original boundaries while preserving the newly-formed separations, the process applies a dilation operation at block 34. The dilation operation 34 includes the conditions that no two nuclear regions will become joined as they dilate and that no nuclear region will be allowed to grow outside its old boundary as defined by the binary map that existed before the Sobel procedure was applied. The dilation operation 34 is preferably applied four times. The result is a modified binary map of nuclear material.
With the application of the dilation operation 34, the nuclear segmentation procedure according to the multi-spectral segmentation process 1 is complete. The resulting binary nuclear map is labelled in block 36 and, if required, further image processing is applied.

As described above, the operation at block 20 in Fig. 1 comprises neural network processing of the digitized images. In general, the neural network 20 is a highly parallel, distributed, information processing system that has the topology of a directed graph. The network comprises a set of "nodes" and a series of "connections" between the nodes. The nodes comprise processing elements and the connections between the nodes represent the transfer of information from one node to another.
Reference is made to Fig. 2, which shows a node or processing element 100a for a backpropagation neural network 20. Each of the nodes 100a accepts one or more inputs 102, shown individually as a1, a2, a3 ... an in Fig. 2. The inputs 102 are taken into the node 100a and each input 102 is multiplied by its own mathematical weighting factor before being summed together with the threshold factor for the processing element 100a. The processing element 100a then generates a single output 104 (i.e. bj) according to the "transfer function" being used in the network 20. The output 104 is then available as an input to other nodes or processing elements, for example processing elements 100b, 100c, 100d, 100e and 100f as depicted in Fig. 2.
The transfer function may be any suitable mathematical function but it is usual to employ a "sigmoid" function. The relationship between the inputs 102 into the node 100 and the output 104 is given by expression (1) as follows:

$$b_j = f\Big(\sum_i w_{ji}\, a_i - \theta_j\Big) \qquad (1)$$

where $b_j$ is the output 104 of the node 100, $a_i$ is the value of the input 102 to the node labelled $i$, $w_{ji}$ is the weighting given to that input 102, $f$ is the transfer function, and $\theta_j$ is the threshold value for the node 100. In the present application, the transfer function is modelled after a sigmoid function.
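In code, a single processing element per expression (1) might look like this; a minimal sketch, with the unit-interval logistic sigmoid assumed as the transfer function:

```python
import numpy as np

def node_output(a: np.ndarray, w: np.ndarray, theta: float) -> float:
    """Output b_j of one processing element: sigmoid of the weighted
    input sum minus the node's threshold, per expression (1)."""
    z = np.dot(w, a) - theta
    return 1.0 / (1.0 + np.exp(-z))                # sigmoid transfer function
```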
In its general form, the nodes or processing elements for the neural network are arranged in a series of layers denoted by 106, 108 and 110 as shown in Fig. 3. The first layer 106 comprises nodes or processing elements 112, shown individually as 112a, 112b, 112c, 112d and 112e. The first layer 106 is an input layer and accepts the information required for a decision.
The second layer 108 in the neural network 20 is known as the hidden layer and comprises processing elements 114, shown individually as 114a, 114b, 114c, 114d and 114e. All of the nodes 112 in the input layer 106 are connected to all of the nodes 114 in the hidden layer 108. It will be understood that there may be more than one hidden layer, with each node in the successive layer connected to each node of the previous layer. For convenience only one hidden layer 108 is shown in Fig. 3.
The (last) hidden layer 108 leads to the output layer 110. The output layer 110 comprises processing elements 116, shown individually as 116a, 116b, 116c, 116d and 116e in Fig. 3. Each node 114 of the (last) hidden layer 108 (Fig. 3) is connected to each node 116 of the output layer 110. The output layer 110 renders the decision to be interpreted by subsequent computing machinery.
The strength of the neural network architecture is its ability to generalize based on previous training on particular examples. To take advantage of this, the neural network is presented with a series of examples of the type of objects that it is destined to classify. The backpropagation neural network organizes itself by altering the multiplicity of its connection weights and thresholds according to its success in rendering a correct decision. This is called supervised learning, wherein the operator provides the network with information regarding its success in classification. The network relies on a standard general rule for modifying its connection weights and thresholds based on the success of its performance, i.e. back-propagation.
In the context of the multi-spectral segmentation process, the multi-spectral images are divided into two classes, C0 (cytoplasm) and C1 (nuclear), separated by the multi-dimensional threshold t in a 3-dimensional space. The distribution of the pixels for the nuclear and cytoplasmic objects is complex, and the 3-D space comprises numerous clusters and non-overlapped regions. It has been found that the optimal threshold has a complex non-linear surface in the 3-D space, and the neural network according to the present invention provides the means for finding this complex threshold surface in order to segment the nuclear and cytoplasmic objects.
According to this aspect of the invention, the neural network 20 comprises an input layer 106, a single hidden layer 108, and an output layer 110. The input layer 106 comprises three nodes or processing elements 112 (Fig. 3), one for each of the three 8-bit digitized values for the particular pixel being examined. (The three digitized values arise from the three levelled images collected in each of the three optical bands, as described above with reference to Fig. 1.) The output layer 110 comprises a single processing element 116 (Fig. 3) which indicates whether the pixel under examination is or is not part of the nucleus.
Before the neural network 20 can be successfully operated for decision-making it must first be "trained" in order to establish the proper combination of weights and thresholds. The training is performed outside of the segmentation procedure on a large set of examples. Errors made in the classification of pixels in the examples are "back-propagated" as corrections to the connection weights and the threshold values in each of the processing units. Once the classification error is acceptable, the network is "frozen" at these weight and threshold values and is integrated as a simple algebraic operation into the segmentation procedure as shown at block 20 in Fig. 1.
In a preferred embodiment, the neural network 20 according to the invention comprises a Probability Projection Neural Network, which will also be referred to as a PPNN. The PPNN according to the present invention features fast training for a large volume of data, processing of multi-modal non-Gaussian data distributions, and good generalization combined with high sensitivity to small clusters of patterns representing useful subclasses of cells. In another aspect, the PPNN is well suited to a hardware-encoded implementation.
The PPNN according to the invention utilizes a Probability Density Function (PDF) estimator. As a result, the PPNN is suitable for use as a Probability Density Function estimator or as a general classifier in pattern recognition. The PPNN uses the training data to create an N-dimensional PDF array which in turn is used to estimate the likelihood of a feature vector being within the given classes, as will now be described. To create and train the PPNN, the input space is partitioned into m x m x m discrete nodes (if the discrete input space is known, then m is usually selected less than the range). For example, for a 3-D PDF array a 26 x 26 x 26 grid is sufficient. As shown in Fig. 4, the next step involves mapping or projecting the influence of each training pattern onto the neighbouring nodes. This is accomplished according to expression (2) shown below:

$$P_j[x_0, x_1, \dots, x_{n-1}] = P_{j-1}[x_0, x_1, \dots, x_{n-1}] + d_j[x_0, x_1, \dots, x_{n-1}]$$

$$d_j^{(k)}[x_0, x_1, \dots, x_{n-1}] =
\begin{cases}
1, & \text{if } r_k = 0 \\
0, & \text{if } r_k \ge r_0 \\
\dfrac{1 - r_k}{\sum_{i=0}^{2n}(1 - r_i)}, & \text{if } 0 < r_k < r_0
\end{cases} \qquad (2)$$

where $P_j[x_0, x_1, \dots, x_{n-1}]$ is the current value of the $(x_0, x_1, \dots, x_{n-1})$ node after the j'th iteration; $d_j[x_0, x_1, \dots, x_{n-1}]$ represents the influence of the j'th input pattern on the $(x_0, x_1, \dots, x_{n-1})$ node; $r_k$ is the distance from the pattern to the k'th node; $r_0$ is the minimum distance between two neighbouring nodes; and $n$ is the dimension of the space. From expression (2) it will be appreciated that the values $d_j$ are normalized, i.e. $\sum_{k} d_j^{(k)} = 1$ for each pattern $j$.

Once the accumulation of $P_N[x_0, x_1, \dots, x_{n-1}]$ (where $j = N$, the number of training patterns) is completed, a normalization operation is performed to obtain a total energy value for the PPNN of $E_{PPNN} = 1$. The normalized values (i.e. $P^*$) for the PPNN are calculated according to expression (3) as follows:

$$P^*_N[x_0, x_1, \dots, x_{n-1}] = P_N[x_0, x_1, \dots, x_{n-1}] / N \qquad (3)$$

For feed-forward calculations the trained and normalized nodes $P^*_N[x_0, x_1, \dots, x_{n-1}]$ and the reverse mapping are utilized according to expression (4) given below:

$$h_j[x_0, \dots, x_{n-1}] = \sum_{i=0}^{2n} P^{*(i)}_N[x_0, x_1, \dots, x_{n-1}] \; d_j^{(i)}[x_0, x_1, \dots, x_{n-1}] \qquad (4)$$

where the $d_j^{(i)}[x_0, x_1, \dots, x_{n-1}]$ are calculated according to expression (2) above.
To solve a two-class (i.e. C0 = cytoplasm and C1 = nuclear) application using the PPNN according to the present invention, two networks must be trained, one for each class, that is, $P_{C0}[x_0, x_1, \dots, x_{n-1}]$ and $P_{C1}[x_0, x_1, \dots, x_{n-1}]$. Because both PPNN are normalized, they can be joined together according to expression (5) below as follows:

$$P_{C0/C1}[x_0, x_1, \dots, x_{n-1}] = P^*_{C0}[x_0, x_1, \dots, x_{n-1}] - P^*_{C1}[x_0, x_1, \dots, x_{n-1}] \qquad (5)$$

The final decision from expressions (4) and (5) is given by:

$$\text{Pattern}_j \in
\begin{cases}
C_0, & \text{if } h_j > 0 \\
C_1, & \text{if } h_j \le 0
\end{cases} \qquad (6)$$
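The following sketch condenses expressions (2)-(6) into a toy two-class PPNN. For brevity each training pattern votes only for its nearest grid node (the $r_k = 0$ case of expression (2)) rather than spreading influence over the $2n+1$ neighbouring nodes; the grid size and value range are assumptions:

```python
import numpy as np

class PPNN:
    """Toy Probability Projection Neural Network for one class."""

    def __init__(self, m: int = 26, lo: float = 0.0, hi: float = 255.0):
        self.grid = np.zeros((m, m, m))            # 3-D PDF array
        self.m, self.lo, self.hi = m, lo, hi
        self.count = 0

    def _node(self, x: np.ndarray) -> tuple:
        idx = (x - self.lo) / (self.hi - self.lo) * (self.m - 1)
        return tuple(np.clip(np.round(idx), 0, self.m - 1).astype(int))

    def train(self, patterns: np.ndarray) -> None:
        for x in patterns:                         # accumulate P_j, expression (2)
            self.grid[self._node(x)] += 1.0
        self.count += len(patterns)

    def density(self, x: np.ndarray) -> float:
        return self.grid[self._node(x)] / self.count   # P*, expression (3)

def classify(ppnn_c0: PPNN, ppnn_c1: PPNN, x: np.ndarray) -> int:
    """Join the two normalized networks (expression (5)) and decide by
    the sign of the result (expression (6))."""
    h = ppnn_c0.density(x) - ppnn_c1.density(x)
    return 0 if h > 0 else 1                       # 0 = cytoplasm, 1 = nuclear
```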
While the PPNN according to the present invention is particularly suited to handling multi-modal data distributions, in many practical situations there will be an unbalanced data set. This means that some clusters will contain fewer data samples than other clusters, and as a result some natural clusters which were represented by a small number of patterns could be lost after PPNN joining. To solve this problem there is provided, according to another aspect of the invention, an algorithm which equalizes all natural clusters.
Reference is next made to Fig. 5, which shows in flow chart form an embodiment of a clustering algorithm 200 according to the present invention. All training patterns, i.e. N samples (block 202), and a given number K of clusters (block 204) are applied to a K-mean clustering operation (block 206). The clustering operation 206 clusters the input data and generates clusters 1 through K (block 208). Next, all the training data belonging to the i'th cluster is extracted into a separate sub-class. For each sub-class of training data, a normalized PPNN, i.e. $E_i = 1$, is created (block 210). The final operation in the clustering algorithm comprises joining all of the K PPNN's together and normalizing the resulting PPNN by dividing all nodes by the number of clusters (block 212). The operation performed in block 212 may be expressed as follows:

$$E = (E_1 + \dots + E_K)/K = 1$$

It will also be understood that the clustering algorithm 200 may be applied to each class separately before creating the final classifier according to expression (6) above, as follows. The optimal number of clusters for each of the two classes may be found from final PPNN performance analysis (expression (6) above). First, the number of clusters for PPN2 is fixed at 1 and the optimal number of clusters for PPN1 is found. Next, the reverse variant is modelled: PPN1 = 1, PPN2 = opt. Lastly, the two optimal networks PPN1_opt and PPN2_opt are combined together according to expression (6).
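A sketch of the equalization procedure, reusing the toy PPNN class from the sketch above; the choice of scipy's kmeans2 for the K-mean step is an assumption:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def equalized_grid(patterns: np.ndarray, k: int, m: int = 26) -> np.ndarray:
    """Blocks 206-212: K-mean the N training patterns, build one
    normalized PPNN per cluster (E_i = 1), then average the K grids
    so that every natural cluster carries equal weight (E = 1)."""
    _, labels = kmeans2(patterns.astype(np.float64), k, minit='++', seed=0)
    grids = []
    for i in range(k):
        sub = PPNN(m=m)
        sub.train(patterns[labels == i])           # one PPNN per sub-class
        grids.append(sub.grid / max(sub.count, 1)) # normalize: E_i = 1
    return sum(grids) / k                          # E = (E_1 + ... + E_K)/K = 1
```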
While the neural network assisted multi-spectral segmentation process is described with a Probability Projection Neural Network according to the present invention, it will be understood that other conventional neural networks are suitable, including, for example, Backpropagation (BP) networks, Elliptic Basis Function (EBF) networks, and Learning Vector Quantization (LVQ) networks. However, the PPNN is preferred: the performance results of the Probability Projection Neural Network have been found to exceed those achieved by conventional networks.
According to another aspect of the present invention, the neural network assisted multi-spectral segmentation process is implemented as a hardware-encoded procedure embedded in conventional FPGA (Field Programmable Gate Array) logic as part of a special-purpose computer.
The hardware implementation of this network takes the form of a look-up table contained in a portion of hardware memory (Fig. 6). As described above, the neural network 20 comprises three input nodes and a single, binary output node. The structure of the neural network 20 according to the present invention also simplifies the hardware implementation of the network.
As shown in Fig. 6, the three input nodes correspond to the three optical bands 301, 302, 303 used in gathering the images. The images taken in the 530 nm and 630 nm bands have 7 bits of useful resolution, while the 577 nm band retains all 8 bits. (The 577 nm band is centered on the nucleus.) The performance of the neural network 20 is then determined for all possible combinations of these three inputs. Since there are 22 bits in total, there are 2^22, or approximately 4.2 million, possible combinations. To create the look-up table, all input pixels in the space (2^7 x 2^7 x 2^8 variants for the three images in the present embodiment) are scanned and the look-up table is filled with the PPNN decision, i.e. 1, the pixel belongs to nuclear material, or 0, the pixel does not belong to nuclear material, for each of these pixel combinations.
The coding of the results (i.e. outputs) of the neural network comprises assigning each possible combination of inputs a unique address 304 in a look-up table 305 stored in memory. The address 304 in the table 305 is formed by joining together the binary values of the three channel values, indicated by 306, 307, 308, respectively, in Fig. 6. For example, as shown in Fig. 6, the pixel for the image from the first channel 301 (i.e. 530 nm) is binary 0101011, the pixel for the image from the second channel 302 (i.e. 630 nm) is binary 0101011, and the pixel for the image from the third channel 303 (i.e. 577 nm) is binary 00101011; the concatenated binary representations 306, 307, 308 form the address 304, which is binary 0101011010101100101011. The address 304 points to a location in the look-up table 305 (i.e. memory) which stores a single binary value 309 that represents the response of the neural network to this combination of inputs; e.g. a logic 0 at memory location 0101011010101100101011 signifies that the pixel in question does not belong to the nucleus.
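The address formation of Fig. 6 reduces to simple bit concatenation, as the sketch below illustrates; the bit ordering (530 nm, then 630 nm, then 577 nm) is taken from the worked example above:

```python
import numpy as np

def lut_address(p530: int, p630: int, p577: int) -> int:
    """22-bit address: 7-bit 530 nm value, 7-bit 630 nm value and
    8-bit 577 nm value concatenated, as in Fig. 6."""
    return ((p530 & 0x7F) << 15) | ((p630 & 0x7F) << 8) | (p577 & 0xFF)

# Worked example from Fig. 6.
addr = lut_address(0b0101011, 0b0101011, 0b00101011)
assert addr == 0b0101011010101100101011

# The table itself holds one bit per address (2**22 entries), filled
# offline by scanning every input combination through the PPNN.
lut = np.zeros(2 ** 22, dtype=np.uint8)
is_nuclear = lut[addr]                             # 1 = nuclear, 0 = not nuclear
```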
The hardware encoding of the NNA-MSS advantageously allows the process to execute at high speed while making a complex decision. Secondly, as experimental data is further tabulated and evaluated, more complex decision spaces can be utilized to improve segmentation accuracy. Thus, an algorithm according to the present invention can be optimized further by adjusting the table of coefficients that describe the neural-network connection weights, without the necessity of altering the system architecture.
The present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Therefore, the presently discussed embodiments are considered to be illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (18)

WHAT IS CLAIMED IS:
1. A method for identifying nuclear and cytoplasmic objects in a biological specimen, said method comprising the steps of:
(a) acquiring a plurality of images of said biological specimen;
(b) identifying cellular material from said images and creating a cellular material map;
(c) applying a neural network to said cellular material map and classifying nuclear and cytoplasmic objects from said images.
2. The method as claimed in claim 1, wherein said step of acquiring a plurality of images comprises capturing three digitized images of a micrographic scene for said biological specimen.
3. The method as claimed in claim 2, wherein said step of creating a cellular material map comprises a threshold operation for identifying regions in said images containing cellular material.
4. The method as claimed in claim 3, further including the application of dilation and erosion operations to said cellular material map.
5. The method as claimed in claim 1, wherein said step of applying a neural network comprises training said neural network with examples of types of nuclear and cytoplasmic objects to be classified, and said training step including backpropagating errors in classification of said examples.
6. The method as claimed in claim 1, wherein said step of classifying nuclear and cytoplasmic objects comprises determining a threshold surface in three-dimensional space, and said nuclear and cytoplasmic objects being separated by said three-dimensional space.
7. The method as claimed in claim 6, wherein said neural network comprises a probability projection neural network.
8. The method as claimed in claim 7, wherein said probability projection neural network utilizes a probability density function estimator to estimate a feature vector being within given classes.
9. The method as claimed in claim 8, further including the step of equalizing clusters of data appearing in said images.
10. A system for identifying nuclear and cytoplasmic objects in a biological specimen, said system comprising:
(a) image acquisition means for acquiring a plurality of images of said biological specimen;
(b) processing means for processing said images and generating a cellular material map identifying cellular material;
(c) neural processor means for processing said cellular material map and including means for classifying nuclear and cytoplasmic objects from said images.
11. The system as claimed in claim 10, wherein said neural processor means comprises a look-up table stored in memory having decision outputs stored in addressable locations of said memory, and including addressing means for generating an address to said memory for reading said decision output corresponding to a combination of said image inputs.
12. The system as claimed in claim 11, wherein said addressing means comprises means for combining binary values corresponding to said images and forming an address for accessing said memory from said combined binary values.
13. The system as claimed in claim 10, wherein said neural processor means comprises a probability projection neural network and includes a probability density function estimator to estimate a feature vector being within given classes.
14. The system as claimed in claim 13, wherein said neural processor means includes equalization means for equalizing clusters of data in said images.
15. The system as claimed in claim 10, wherein said neural processor means includes means for determining a threshold surface in three-dimensional space, said nuclear and cytoplasmic objects being separated by said three-dimensional space.
16. A hardware-encoded neural processor for classifying input data, said hardware-encoded neural processor comprising:
(a) a memory having a plurality of addressable storage locations;
(b) said addressable storage locations containing classification information associated with the input data;
(c) address generation means for generating an address from said input data for accessing the classification information stored in said memory for selected input data.
17. The device as claimed in claim 16, wherein said input data comprises image pixels in a digitized image of cellular material.
18. The device as claimed in claim 17, wherein said classification information comprises a binary digit stored in each of said addressable locations of said memory, one state of said binary digit indicating that said input data belongs to a predetermined class, and the other state of said binary digit indicating that the input data is outside said class.
CA002232164A 1995-09-19 1996-09-18 A neural network assisted multi-spectral segmentation system Abandoned CA2232164A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US396495P 1995-09-19 1995-09-19
US60/003,964 1995-09-19
PCT/CA1996/000619 WO1997011350A2 (en) 1995-09-19 1996-09-18 A neural network assisted multi-spectral segmentation system

Publications (1)

Publication Number Publication Date
CA2232164A1 true CA2232164A1 (en) 1997-03-27

Family

ID=21708431

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002232164A Abandoned CA2232164A1 (en) 1995-09-19 1996-09-18 A neural network assisted multi-spectral segmentation system

Country Status (6)

Country Link
US (2) US6463425B2 (en)
EP (1) EP0850405A2 (en)
JP (1) JPH11515097A (en)
AU (1) AU726049B2 (en)
CA (1) CA2232164A1 (en)
WO (1) WO1997011350A2 (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NZ515488A (en) * 1999-05-12 2002-09-27 Siemens Ag Automatic address reading and decision for mail items
JP2005331394A (en) 2004-05-20 2005-12-02 Olympus Corp Image processor
US20060017740A1 (en) * 2004-07-26 2006-01-26 Coleman Christopher R Diurnal variation of geo-specific terrain temperatures in real-time infrared sensor simulation
US20060020563A1 (en) * 2004-07-26 2006-01-26 Coleman Christopher R Supervised neural network for encoding continuous curves
US20070036467A1 (en) * 2004-07-26 2007-02-15 Coleman Christopher R System and method for creating a high resolution material image
US7709796B2 (en) * 2005-02-25 2010-05-04 Iscon Video Imaging, Inc. Methods and systems for detecting presence of materials
US7627540B2 (en) * 2005-06-28 2009-12-01 Neurosciences Research Foundation, Inc. Addressing scheme for neural modeling and brain-based devices using special purpose processor
US7533071B2 (en) * 2005-06-28 2009-05-12 Neurosciences Research Foundation, Inc. Neural modeling and brain-based devices using special purpose processor
US7765029B2 (en) * 2005-09-13 2010-07-27 Neurosciences Research Foundation, Inc. Hybrid control device
US8117137B2 (en) 2007-04-19 2012-02-14 Microsoft Corporation Field-programmable gate array based accelerator system
WO2010003044A2 (en) * 2008-07-03 2010-01-07 Nec Laboratories America, Inc. Epithelial layer detector and related methods
US20110208644A1 (en) * 2008-08-21 2011-08-25 Kazuhiro Doi Money management system
US8131659B2 (en) * 2008-09-25 2012-03-06 Microsoft Corporation Field-programmable gate array based accelerator system
US8301638B2 (en) * 2008-09-25 2012-10-30 Microsoft Corporation Automated feature selection based on rankboost for ranking
US9551700B2 (en) * 2010-12-20 2017-01-24 Milagen, Inc. Device and methods for the detection of cervical disease
JP5333570B2 (en) * 2011-12-21 2013-11-06 富士ゼロックス株式会社 Image processing apparatus, program, and image processing system
US9087301B2 (en) 2012-12-21 2015-07-21 International Business Machines Corporation Hardware architecture for simulating a neural network of neurons
US9053429B2 (en) * 2012-12-21 2015-06-09 International Business Machines Corporation Mapping neural dynamics of a neural model on to a coarsely grained look-up table
US9373059B1 (en) * 2014-05-05 2016-06-21 Atomwise Inc. Systems and methods for applying a convolutional network to spatial data
US9971966B2 (en) * 2016-02-26 2018-05-15 Google Llc Processing cell images using neural networks
US10546237B2 (en) 2017-03-30 2020-01-28 Atomwise Inc. Systems and methods for correcting error in a first classifier by evaluating classifier output in parallel
GB201705876D0 (en) 2017-04-11 2017-05-24 Kheiron Medical Tech Ltd Recist
GB201705911D0 (en) 2017-04-12 2017-05-24 Kheiron Medical Tech Ltd Abstracts
US10902581B2 (en) 2017-06-19 2021-01-26 Apeel Technology, Inc. System and method for hyperspectral image processing to identify foreign object
US10902577B2 (en) 2017-06-19 2021-01-26 Apeel Technology, Inc. System and method for hyperspectral image processing to identify object
WO2019028004A1 (en) 2017-07-31 2019-02-07 Smiths Detection Inc. System for determining the presence of a substance of interest in a sample
CA3102170A1 (en) 2018-06-14 2019-12-19 Kheiron Medical Technologies Ltd Immediate workup
EP3598194A1 (en) 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Method for microscopic assessment
US11151356B2 (en) * 2019-02-27 2021-10-19 Fei Company Using convolution neural networks for on-the-fly single particle reconstruction
CN113515798B (en) * 2021-07-05 2022-08-12 中山大学 Urban three-dimensional space expansion simulation method and device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4998284A (en) * 1987-11-17 1991-03-05 Cell Analysis Systems, Inc. Dual color camera microscope and methodology for cell staining and analysis
US4839807A (en) * 1987-08-03 1989-06-13 University Of Chicago Method and system for automated classification of distinction between normal lungs and abnormal lungs with interstitial disease in digital chest radiographs
US5544650A (en) * 1988-04-08 1996-08-13 Neuromedical Systems, Inc. Automated specimen classification system and method
US4965725B1 (en) * 1988-04-08 1996-05-07 Neuromedical Systems Inc Neural network based automated cytological specimen classification system and method
WO1991020048A1 (en) * 1990-06-21 1991-12-26 Applied Electronic Vision, Inc. Cellular analysis utilizing video processing and neural network
US5734022A (en) * 1990-08-01 1998-03-31 The Johns Hopkins University Antibodies to a novel mammalian protein associated with uncontrolled cell division
US5257182B1 (en) * 1991-01-29 1996-05-07 Neuromedical Systems Inc Morphological classification system and method
US5276772A (en) * 1991-01-31 1994-01-04 Ail Systems, Inc. Real time adaptive probabilistic neural network system and method for data sorting
US5784162A (en) * 1993-08-18 1998-07-21 Applied Spectral Imaging Ltd. Spectral bio-imaging methods for biological research, medical diagnostics and therapy
US5331550A (en) * 1991-03-05 1994-07-19 E. I. Du Pont De Nemours And Company Application of neural networks as an aid in medical diagnosis and general anomaly detection
IL98622A (en) * 1991-06-25 1996-10-31 Scitex Corp Ltd Method and apparatus for employing neural networks in color image processing
US5276771A (en) * 1991-12-27 1994-01-04 R & D Associates Rapidly converging projective neural network
EP0587093B1 (en) * 1992-09-08 1999-11-24 Hitachi, Ltd. Information processing apparatus using inference and adaptive learning
US6690817B1 (en) * 1993-08-18 2004-02-10 Applied Spectral Imaging Ltd. Spectral bio-imaging data for cell classification using internal reference
JP3207690B2 (en) * 1994-10-27 2001-09-10 シャープ株式会社 Image processing device
US6665060B1 (en) * 1999-10-29 2003-12-16 Cytyc Corporation Cytological imaging system and method

Also Published As

Publication number Publication date
AU726049B2 (en) 2000-10-26
US20020042785A1 (en) 2002-04-11
EP0850405A2 (en) 1998-07-01
US20020123977A1 (en) 2002-09-05
WO1997011350A3 (en) 1997-05-22
AU6921496A (en) 1997-04-09
JPH11515097A (en) 1999-12-21
WO1997011350A2 (en) 1997-03-27
US6463425B2 (en) 2002-10-08

Similar Documents

Publication Publication Date Title
CA2232164A1 (en) A neural network assisted multi-spectral segmentation system
CN112308158B (en) Multi-source field self-adaptive model and method based on partial feature alignment
CN109800736B (en) Road extraction method based on remote sensing image and deep learning
Buyssens et al. Multiscale convolutional neural networks for vision–based classification of cells
WO2018125580A1 (en) Gland segmentation with deeply-supervised multi-level deconvolution networks
Hao et al. Geometry-aware deep recurrent neural networks for hyperspectral image classification
CN111523521A (en) Remote sensing image classification method for double-branch fusion multi-scale attention neural network
CN111091527A (en) Method and system for automatically detecting pathological change area in pathological tissue section image
CN109102498B (en) Method for segmenting cluster type cell nucleus in cervical smear image
CN110110634B (en) Pathological image multi-staining separation method based on deep learning
Song et al. Hybrid deep autoencoder with Curvature Gaussian for detection of various types of cells in bone marrow trephine biopsy images
Huynh et al. Plant identification using new architecture convolutional neural networks combine with replacing the red of color channel image by vein morphology leaf
CN111160194B (en) Static gesture image recognition method based on multi-feature fusion
Jonnalagedda et al. [regular paper] mvpnets: Multi-viewing path deep learning neural networks for magnification invariant diagnosis in breast cancer
CN110414317B (en) Full-automatic leukocyte classification counting method based on capsule network
CN115116054A (en) Insect pest identification method based on multi-scale lightweight network
CN113592893A (en) Image foreground segmentation method combining determined main body and refined edge
Nawandhar et al. Image segmentation using thresholding for cell nuclei detection of colon tissue
Kim et al. Nucleus segmentation and recognition of uterine cervical pap-smears
Zubair Machine learning based biomedical image analysis and feature extraction methods
CN114187265A (en) Blood leukocyte segmentation method based on double-path and cavity space pyramid pooling
Pinz et al. Neuromorphic methods for recognition of compact image objects
Nishchhal et al. Accurate Cell Segmentation in Blood Smear Images Based on Color Analysis and Cnn Models
Hasan et al. Nuclei segmentation in er-ihc stained histopathology images using mask r-cnn
Sahaya Jeniba et al. A multilevel self-attention based segmentation and classification technique using Directional Hexagonal Mixed Pattern algorithm for lung nodule detection in thoracic CT image

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued