WO2001018739A1 - Searching for images electronically - Google Patents

Searching for images electronically

Info

Publication number
WO2001018739A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
reference image
images
module
index information
Prior art date
Application number
PCT/US2000/024516
Other languages
French (fr)
Inventor
Michael Y. Lu
Ziqiang Chen
Jiangshen You
Xiao-Jin Xiong
Hongguang Luo
Anthony Fisher
Bing Liu
Binglin Xie
Jing Ji
Thomas Cheatham
Original Assignee
Medical Online, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medical Online, Inc.
Priority to AU73545/00A
Publication of WO2001018739A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Abstract

A method of searching for images electronically includes receiving a reference image (20) that contains a version of a common feature shared with other images. A portion of the reference image that includes a version of the feature is selected (23) and indexing information is determined based on the selected portion. The determined index information is matched (25) against stored sets of indexing information (28), each set of which is associated with one of the other images.

Description

SEARCHING FOR IMAGES ELECTRONICALLY

Background of the Invention

The invention relates to searching for images electronically. Images provide useful visual records of features contained in them. A family photograph is a record of the members of the family, a chest X-ray is a record of the organs or tumors in the chest, and a magnetic resonance image is a record of structures or functionality of biological tissues.
Image records can take various forms including paintings, photographs, X-ray images, computed tomography (CT) images, electrocardiographs (ECG), positron emission tomography (PET) images, single photon emission computed tomography (SPECT) images, and magnetic resonance images (MR images). The records are often grouped together in a collection such as a photo album, a book, a folder, or a database.
It is often necessary to compare the images in a collection to a reference image to determine which images are similar to the reference image. For example, when identifying a suspect from a collection of mug shots, one compares the recalled image of the suspect with the various mug shots in the book.
Summary of the Invention

The invention electronically matches images in a collection in order of their similarity to a reference image. The images contain different versions of a common feature and the ranking is based on information derived from the images about the version of the common feature associated with the image. For example, a collection of tattoos of crosses could be matched against a reference tattoo of a cross based on such information as the size of the tattoo, its location on a person's body, and its color characteristics.
One application of the invention is in quickly comparing single-modality or multi-modality radiological images. For example, doctors can electronically search through a collection of chest X-ray images for past patients who had tumors similar to the tumor in an X-ray image from a current patient. The medical history of the past patients with similar tumors can be used to determine the prognosis and the treatment options for the current patient.
Other applications include identifying criminal suspects by comparing their identifying characteristics to those in stored images of known criminals. For example, a suspect whose tattooed arm was captured in an image by a security camera could be identified by matching the tattoo against a database of images of known criminals with tattooed arms.
Other advantages and features will become apparent from the following description and from the claims.
In one general aspect of the invention, a method of searching for images electronically includes receiving a reference image that contains a version of a common feature shared with other images. A portion of the reference image that includes a version of the feature is selected and index information is determined based on the selected portion. The determined index information is matched against stored sets of index information, each set of which is associated with one of the other images.
Embodiments of the invention may include one or more of the following. The index information may include the location of the portion in the reference image, the size of the portion, or pixel properties of the portion (for example its gray-scale or color levels).
The method may include extracting a representation of the version of the common feature. Extracted index information may be determined based on the extracted representation. The extracted index information may be matched against sets of stored extracted index information, each set of which is associated with one of said other images. The extracted index information may include the location of the extracted representation in the reference image, the size of the extracted representation, or the pixel properties of the extracted representation (for example its gray-scale or color levels).
The method may include receiving additional information associated with the reference image and matching the additional information with sets of stored additional information. Each set of said stored additional information may be associated with one of the other images. The matching of the portions may be limited to less than all of the sets of stored index information based on the additional information. The common feature may include an organism's tissue. The additional information may be information about the type of tissue represented in the reference image, or descriptive information about the organism whose tissue is represented in the image (e.g., the organism's age, weight, state of health, or smoking habits). The additional information may include information about the storage format of the reference image.
The other images may be stored, for example, in a database server. The reference image may be added to the database server to make it available to subsequent searches. The reference image may be normalized to conform to the stored images prior to said selecting. The normalization may include scaling or altering the image storage format. The reference image may be received on a communication channel, such as the Internet or an intranet. Each set of stored index information may be assigned a score depending on the matching. The results of the matching may be presented, through a web server, in order of how closely they match the reference image index information.
In another general aspect, the invention relates to a system for presenting images electronically that includes a first input communication interface equipped to receive a reference image. The reference image includes a version of a common feature that is shared with other images. A first selection module is configured to select a portion of the reference image containing the version of the common feature. The selection module also determines index information based on the selection. A first matching module is configured to match the index information against sets of stored index information. Each set of stored index information is associated with one of the other images.
Embodiments of the invention may contain one or more of the following. The system may include an extractor module which is equipped to extract a representation of the reference version of the common feature from the selected portion. The feature extractor module may also be equipped to determine extracted index information based on the extraction. The matching module may match the extracted index information against sets of stored extracted index information. Each set of said stored extracted index information may be associated with one of the other images.
The system may be equipped to receive additional information associated with the reference image. The matching module may match said additional information against sets of stored additional information. Each set of stored additional information may be associated with one of the other images. The matching module may limit the matching to less than all of the sets of stored index information based on the additional reference information. The common features may be tissues of an organism and the additional reference information may be about the organism. The additional information may also be about the reference image.
The system may include a normalizing module configured to normalize the reference image to conform to the other images prior to the selecting. The normalizing may include scaling or changing the format of the image.
The communication module may be the Internet or an intranet. The system could include an output communication module, such as a web server, to provide access to the results of the matching. The system may present the results in order of how closely they match said indexing information. The system could include a data module, such as a database server, for storing the other images. The results may provide access to the other images that correspond to the sets of other indexing information. The data module may be configured to add the reference image to the other reference images to make it available to subsequent searches.

At least one of the modules may be implemented in a computer program executed on a processor. The processor may execute computer programs associated with more than one module. Alternatively, one program associated with a module may be executed on more than one processor. Different modules could be executed on processors on different computers. The computers may communicate with each other through the Internet or through an intranet. The web server may communicate with the modules of the system through CGI, ISAPI, or NSAPI. The system may include a second set of input communication interface, selection module, and matching module that is equipped to process a search request when the first set is unavailable, for example, when the first set is processing another search request.
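As a rough sketch of the second-module-set idea only, the following Python fragment (all names invented for illustration, not taken from the patent) dispatches a search request to whichever pipeline of input interface, selection module, and matching module is not busy with another request.

```python
from dataclasses import dataclass

@dataclass
class ModuleSet:
    """One pipeline of input communication interface, selection module, and matching module."""
    name: str
    busy: bool = False  # True while this set is processing another search request

    def handle(self, request: str) -> str:
        # A real module set would normalize the image, select an ROI, and run the matching here.
        return f"{self.name} processed {request!r}"

def dispatch(request: str, module_sets: list) -> str:
    """Send the request to the first module set that is currently available."""
    for module_set in module_sets:
        if not module_set.busy:
            return module_set.handle(request)
    raise RuntimeError("all module sets are busy")

first_set, second_set = ModuleSet("set-1"), ModuleSet("set-2")
first_set.busy = True                                           # simulate the first set being unavailable
print(dispatch("chest X-ray search", [first_set, second_set]))  # handled by set-2
```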
Detailed Description

FIG. 1 is a screenshot of a web browser user interface to the search server.
FIG. 2 shows the communication link between multiple concurrent users and the search server.
FIG. 3 is a flow chart of the technique used to compare an input image to stored images.
FIG. 4 shows the subsystems of the search server.
FIG. 5 shows a chest X-ray of a tumor, a region of interest (ROI) including the tumor, and a segment of the tumor extracted from the ROI.
FIG. 5' shows a set of MR T2 images that is similar to the images in FIG. 5.
FIG. 6 shows the information used in a search.
FIG. 7 shows reference patient information and different matching records from the data bank.
FIG. 8 shows a searching system that uses multiple computers to provide searching functionality.
In diagnostic procedures, it is often useful for a doctor to compare a reference radiological image from a patient, such as the X-ray image 53 shown in Fig. 5, to documented images from other patients. Such a comparison could be part of a search for medical cases similar to that of the patient. The similar cases can be used to determine a course of treatment.
The doctor initiates such a search using a web browser, shown in FIG. 1, by typing the search server's uniform resource locator (URL) in the location box 7 and striking the enter key on his computer keyboard. This causes the browser to load a search web page 10. The search web page 10 is provided from a central server 14 (shown in FIG. 2) by way of the Internet 15.
Once the search web page is loaded, the doctor enters the name of the patient's image file in box 1 along with other information about the reference image in boxes 2, 63, 64, and 65. The patient's image file is, for example, an electronic file that is stored on the hard disk of the local computer. The image type, such as X-ray or MRI, is entered in box 2. The image storage format, such as DICOM, GIF, JPEG, or BMP, is entered in box 63. The section of the patient's body portrayed in the image (e.g., chest) is entered in box 64. The feature on which to base the search is entered in box 65. The search shown in FIG. 1 thus represents a request for a search of chest X-rays based on a reference tumor.
The doctor can also enter a confidential identifier in box 5, which associates images from a specific patient to related images of the same patient without disclosing the personal identity of the patient. The doctor can enter other information about the patient in boxes 6, 7, 8, and 9. The additional information can be used to narrow the image search to a relevant demographic category. Possible examples of additional information are whether or not the patient smokes (box 9), the patient's diagnosis (box 8), and the patient's weight (box 7).
The doctor can also enter the name of the patient's health plan in box 3 and a patient ID given by the health plan in box 4. This information can be used to bill the patient's health plan for the image search. To preserve the anonymity of the patient, the system does not link this billing information with the patient's images.
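To make the set of form fields concrete, the following sketch gathers them into one Python structure. The field names are invented for illustration and simply mirror boxes 1-9 and 63-65, together with the image-only search choice described in the next paragraph; they are not defined anywhere in the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchRequest:
    """One search submission from the web page in FIG. 1 (field names are illustrative)."""
    image_file: str                        # box 1: the patient's image file
    image_type: str                        # box 2: e.g. "X-ray" or "MRI"
    storage_format: Optional[str]          # box 63: e.g. "DICOM", "JPEG"; the server can auto-detect
    body_section: str                      # box 64: e.g. "chest"
    feature: str                           # box 65: e.g. "tumor"
    health_plan: Optional[str] = None      # box 3: used only for billing, never linked to images
    health_plan_id: Optional[str] = None   # box 4: used only for billing
    confidential_id: Optional[str] = None  # box 5: links a patient's images without identifying the patient
    other_info: Optional[str] = None       # box 6: further patient information (contents not specified)
    weight: Optional[float] = None         # box 7
    diagnosis: Optional[str] = None        # box 8
    smoker: Optional[bool] = None          # box 9
    image_only: bool = False               # button 11 (image only) vs. button 12 (image plus additional info)

request = SearchRequest(
    image_file="patient_chest.dcm", image_type="X-ray", storage_format="DICOM",
    body_section="chest", feature="tumor", smoker=True, image_only=False)
```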
After entering all the relevant information, the doctor has the option of having the search match the records on the search server with either the reference image only or with both the reference image and the additional information. To match the records with the image only, the doctor clicks on button 11. Otherwise, the doctor clicks on button 12.

As shown in FIG. 2, multiple concurrent users 13, 18, and 19 can be connected to the search server 14 via the Internet 15. User 13's search requests 16 and search results 17 are transmitted over the Internet 15 to and from the server 14. The other users 18 and 19 are also communicating with the server. Another user 71 is connected to the search server 14 via an intranet 72. A doctor who has physical access to the search server may submit his search and receive the results directly on the search server 14 without using either the Internet 15 or the intranet 72.
As shown in FIG. 3, upon the server 14 receiving the patient's image file and its associated information (step 20), the scope of the search is determined (step 21) based on classification information entered by the doctor about the patient's image. In the example of FIG. 1, the search scope would be limited to chest X-ray images containing tumors based on the information in boxes 2, 64, and 65. If the doctor opted to have the search also match the additional patient information by clicking on button 12, the search scope would also be limited to the class of images from male smokers. The classification restricts the scope of the matching so that only images in the search server that meet the classification are considered. This assures that meaningful results are obtained in real time.
The input image and the associated classification information then undergo a feature-based matching procedure described below. First, the image is normalized (step 22) so that it is represented in a scale and image format consistent with a common format used for the images on the search server. Possible common image formats include BMP, JPEG, and GIF. In the example of FIG. 1, the doctor selects the format of his input image in box 63. The search server 14 is also capable of detecting the input format if the doctor does not fill in box 63. Incompatible patient image files are converted to the common format before they are matched.
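Step 22 could be sketched with the Pillow imaging library as follows. The target size and common format are arbitrary choices made here for illustration; a DICOM input would first need a dedicated reader such as pydicom, since Pillow does not read DICOM files.

```python
from PIL import Image  # pip install Pillow

COMMON_SIZE = (512, 512)  # illustrative common scale for all stored images
COMMON_FORMAT = "BMP"     # one of the common formats mentioned above

def normalize(path_in: str, path_out: str) -> str:
    """Rescale an input image and convert it to the server's common storage format."""
    img = Image.open(path_in)        # Pillow detects the input format automatically
    img = img.convert("L")           # single gray-scale channel, as for an X-ray
    img = img.resize(COMMON_SIZE)    # bring every image to the same scale
    img.save(path_out, format=COMMON_FORMAT)
    return path_out
```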
Depending on the nature of the initial image, the normalized image could contain irrelevant tissue in addition to the feature that is relevant to the search. For example, FIG. 5 shows a normalized chest X-ray 53 depicting a tumor 66. However, in addition to the tumor 66 the X-ray 53 shows the patient's arms 67-68 and neck 69, which are irrelevant to the matching. To reduce the amount of irrelevant tissue, the normalized image is windowed to locate a small region of interest (ROI) 54 (step 23 in FIG. 3) containing the relevant feature (in this case the tumor 66). Certain properties obtained during the location of the ROI 54 are useful in the search procedure because they are indicative of characteristics of the relevant feature 66. For example, the location of the ROI 54 in the patient's image 53 (represented by the horizontal coordinate 59 and the vertical coordinate 58 from the top left corner of the image) reflects the location of the tumor 66 in the X-ray. The width 60 and the height 57 of the ROI 54 indicate the size of the tumor 66. Pixel properties in the ROI 54, such as its gray-scale properties and its local structure, are indicative of the properties on and around the tumor.
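The index information obtained from the ROI might be represented as in the sketch below, which uses NumPy purely for convenience. The returned fields are one plausible reading of the location, size, and pixel properties described above, not the patent's exact feature set.

```python
import numpy as np

def roi_index_info(image: np.ndarray, roi: tuple) -> dict:
    """Derive index information from a region of interest.

    `image` is a 2-D array of gray levels for the normalized image and `roi`
    is (x, y, width, height) measured from the top-left corner, in the spirit
    of coordinates 59/58 and dimensions 60/57 in FIG. 5.
    """
    x, y, w, h = roi
    window = image[y:y + h, x:x + w]
    return {
        "x": x, "y": y,                     # location of the ROI in the image
        "width": w, "height": h,            # size of the ROI, hence of the tumor
        "mean_gray": float(window.mean()),  # coarse pixel properties inside the ROI
        "std_gray": float(window.std()),
    }

image = np.zeros((512, 512), dtype=np.uint8)
image[200:260, 150:220] = 180               # a bright synthetic "tumor"
print(roi_index_info(image, (150, 200, 70, 60)))
```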
The relevant feature in the selected ROI 54 from step 23 is then segmented (step 24) to yield an image segment 55 that includes even fewer pixels associated with tissue that is irrelevant to the search than the ROI does. For example, in FIG. 5, the ROI 54 still contains additional irrelevant tissue around the tumor 66. Step 24 segments the relevant feature 66 from the irrelevant tissues to yield an image segment 55 that depicts only the feature of interest 66. The location and the size of the tumor 66 are determined from the segment 55 and used in subsequent matching (step 25).
Two different methods can be used to segment the feature of interest. The first is a statistical method, illustrated in IEEE Trans. Medical Imaging, vol. 13, pp. 441-449. This method does not require human intervention and is therefore better suited for an automatic search server. Alternatively, the dynamic active contour model in IEEE Trans. Medical Imaging, vol. 16, pp. 199-209 could be used when the doctor requires more control over the segmentation process. The search server allows the doctor to select from the two methods depending on his needs. The search server could be configured to use other methods of segmentation.
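Neither cited method is reproduced here. The simplified stand-in below merely thresholds the ROI window to show what step 24 produces: a mask for the feature of interest together with its location and size inside the ROI.

```python
import numpy as np

def segment_feature(roi_window: np.ndarray):
    """Crude, illustrative segmentation of the bright feature inside an ROI window."""
    threshold = roi_window.mean() + roi_window.std()  # simple data-driven cut, not the cited methods
    mask = roi_window > threshold
    if not mask.any():
        return None                                   # nothing brighter than the surrounding tissue
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    top, bottom = int(rows[0]), int(rows[-1])
    left, right = int(cols[0]), int(cols[-1])
    return {
        "mask": mask,                                 # pixels belonging to the feature of interest
        "left": left, "top": top,                     # location of the feature inside the ROI
        "width": right - left + 1,
        "height": bottom - top + 1,
        "area_px": int(mask.sum()),                   # size of the segmented feature
    }
```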
Information 73 (FIG. 6) determined from the location, size, and properties of the ROI 54 and the segment 55 is used to match (step 25) the patient's reference image 53 with stored images. Matching the information instead of the images reduces the matching time, allowing the matching to be implemented in real time. A correlation algorithm, described in the Journal of Computer Assisted Tomography, vol. 16, pp. 620-633, 1993, is used to match the determined information 73 about the feature of interest 66 to an index 77 containing sets of information 74-76 determined from images stored in the database 28. As a result of the matching, each entry 74-76 in the index 77 is assigned a matching percentage score (step 26) that is based on its similarity to the determined information 73. The scores of the entries 74-76 indicate the similarity between the image associated with the entry and the patient's image. An implementation of the procedure described above can match several hundred patients' reference images against several million stored images in one second.

The database images corresponding to entries in the index 77 that produce higher match percentages with a reference image are presented to the doctor (step 27), along with their corresponding match percentages, as a result of the search. Such a result set for a search of brain tumor images is shown in FIG. 7. It contains different brain tumor images 57, 58, 59, and 60 along with their matching percentages, which are 99%, 98%, 93%, and 89% respectively. Additionally, the doctor has the option of requesting images with lower matching percentages by clicking on button 70. To reduce the load on the search server, it can be configured so that doctors only have access to images that meet a minimum match percentage.
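The cited correlation algorithm is not reproduced here. As a rough sketch of how each index entry might receive a percentage score (step 26) and how the best matches might be retained for presentation (step 27), the following uses a simple weighted distance over a few numeric index fields; all names and the scoring formula are illustrative.

```python
import math

def match_percentage(reference: dict, entry: dict, weights: dict = None) -> float:
    """Score one stored index entry against the reference index information (0-100)."""
    fields = ("x", "y", "width", "height", "mean_gray")
    weights = weights or {f: 1.0 for f in fields}
    distance = 0.0
    for f in fields:
        scale = max(abs(reference[f]), abs(entry[f]), 1.0)  # normalize each field's difference
        distance += weights[f] * ((reference[f] - entry[f]) / scale) ** 2
    return round(100.0 * math.exp(-math.sqrt(distance)), 1)

def rank_matches(reference: dict, index: list, minimum: float = 80.0) -> list:
    """Score every index entry and keep only those above the minimum match percentage."""
    scored = sorted(((match_percentage(reference, e), e) for e in index),
                    key=lambda pair: pair[0], reverse=True)
    return [(score, e) for score, e in scored if score >= minimum]

reference = {"x": 150, "y": 200, "width": 70, "height": 60, "mean_gray": 180.0}
stored = [
    {"id": 1, "x": 155, "y": 190, "width": 68, "height": 63, "mean_gray": 176.0},
    {"id": 2, "x": 40,  "y": 300, "width": 20, "height": 15, "mean_gray": 90.0},
]
print(rank_matches(reference, stored))  # entry 1 scores about 92%, entry 2 falls below the minimum
```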
The doctor can fine-tune the search results by running broader or narrower searches as needed. For instance, if the search yields insufficient results, the doctor can broaden the search scope by reloading the search page 10 and entering less information in boxes 5, 6, 7, 8, and 9. For example, the doctor might clear box 9 so that the search also returns images from patients who do not smoke. The doctor can also opt to search on the image only by clicking on button 11. Alternatively, if the search yields a lot of irrelevant information, the doctor can narrow the search by reloading the search page and entering more information in boxes 5, 6, 7, 8, and 9. For example, the doctor can specify that the patient has been diagnosed with a malignant lung tumor in box 8 to exclude images from patients with benign tumors.
The search server 52, shown in FIG. 4, comprises a web server 29, a database management system 35, and image manipulation modules 31-34. The search server components can be run on physically separate computers that are connected to each other by a local area network (LAN) or even the Internet.
The web server 29, which could be implemented on a Sun workstation, comprises a request receiving module 30 and a data serving module 36. The web server modules communicate with the rest of the search server via the common gateway interface (CGI), the Internet server application programming interface (ISAPI), the Netscape server application programming interface (NSAPI), or any other protocol for communicating between a web server and a client or server application.
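Purely to illustrate the CGI path, a request receiving script might look like the sketch below. The form field names are hypothetical, and Python's standard cgi module (deprecated and removed in Python 3.13) is used only to keep the example short; an ISAPI or NSAPI extension would expose the same data through a different interface.

```python
#!/usr/bin/env python3
"""Illustrative CGI stand-in for the request receiving module 30."""
import cgi

form = cgi.FieldStorage()                         # parses the submitted form data
image_bytes = form["image_file"].file.read() if "image_file" in form else b""  # box 1
image_type = form.getfirst("image_type", "")      # box 2, e.g. "X-ray"
body_section = form.getfirst("body_section", "")  # box 64, e.g. "chest"
feature = form.getfirst("feature", "")            # box 65, e.g. "tumor"

# ...hand image_bytes and the classification fields on to the scope, normalizing,
# ROI selection, and segmentation modules...

print("Content-Type: text/html")
print()
print(f"<p>Received {len(image_bytes)} bytes of a {image_type} image of the "
      f"{body_section}; searching for a {feature}.</p>")
```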
The database management system 35 comprises data 38 and a data comparing and matching module 37. Oracle VIR can be used as the database management system. The data stored in the database includes collections of normalized images 41, selected ROIs 42, segmented features 43, additional image information 40, and an obtained data index 39.
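As a toy stand-in for this data layout, the sketch below uses SQLite simply so that it runs anywhere; the patent names Oracle VIR, and the table and column names here are invented, loosely mirroring items 39-43.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE images (
    id           INTEGER PRIMARY KEY,
    normalized   BLOB,      -- normalized image 41
    roi          BLOB,      -- selected ROI 42
    segment      BLOB,      -- segmented feature 43
    image_type   TEXT,      -- additional image information 40
    body_section TEXT,
    feature      TEXT
);
CREATE TABLE image_index (   -- obtained data index 39
    image_id  INTEGER REFERENCES images(id),
    x INTEGER, y INTEGER, width INTEGER, height INTEGER,
    mean_gray REAL, std_gray REAL
);
""")
# Adding a newly searched image so it is available to later searches, an option
# the description notes below.
db.execute("INSERT INTO images (id, image_type, body_section, feature) "
           "VALUES (1, 'X-ray', 'chest', 'tumor')")
db.execute("INSERT INTO image_index VALUES (1, 150, 200, 70, 60, 180.0, 4.5)")
print(db.execute("SELECT image_id, width, height FROM image_index").fetchall())
```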
Upon receiving data 44 from a doctor, the request receiving module 30 of the web server 29 conveys the data 45 to the scope module 31 to determine the scope of the search. The image 46 is then normalized by the normalizing module 32 before being windowed by the ROI selection module 33 to produce the selected ROI and obtained information 48. The relevant features 49 are then extracted from the ROI 48 by the segmentation module 34 to derive an image segment and additional obtained information 49.
The additional information from the doctor 52, the normalized image 47, the selected ROI and obtained information 48, the segmented image and the additional information 49 are then conveyed to the comparing module 37 of the database management system 35. The database management system matches the data 38 to the conveyed information and assigns a matching percentage to each data record. The matched data 50 from the database management system 35 is then delivered to the data serving module 51 for eventual delivery to the doctor.
The database management system can also be configured so that the patient's image, the ROI, the segmented image, and the obtained additional information are added to the database so that they are available for later searches.
The search server functionality is often provided by multiple computers that are connected to each other in a local TCP/IP network 90, as shown in FIG. 8. The server computers are divided into two tiers: Tier II 88 and Tier III 89. Tier II 88 comprises web servers 83-85, which receive search requests from the users 78-82, and Tier III 89 comprises image servers 86, 87, which perform the requested searches. The web servers 83-85 could be Sun Microsystems computers running Netscape Web Server Software and the image servers 86, 87 could be Sun Microsystems computers running Oracle 8i.
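The following sketch mimics this two-tier arrangement: Tier II web servers accept requests and hand each one to the Tier III image server carrying the lightest load, with requests spread across the web servers in round-robin fashion as described in the next paragraph. All names and the load-tracking scheme are illustrative, not taken from the patent.

```python
import itertools

class ImageServer:
    """Tier III: performs the search; `load` counts searches in progress."""
    def __init__(self, name: str):
        self.name, self.load = name, 0

    def search(self, request: str) -> str:
        self.load += 1
        try:
            return f"{self.name} searched for {request!r}"
        finally:
            self.load -= 1

class WebServer:
    """Tier II: receives a request and forwards it to the lightest-loaded image server."""
    def __init__(self, name: str, image_servers: list):
        self.name, self.image_servers = name, image_servers

    def handle(self, request: str):
        target = min(self.image_servers, key=lambda server: server.load)
        return self.name, target.search(request)

image_servers = [ImageServer("image-86"), ImageServer("image-87")]
web_servers = itertools.cycle(WebServer(n, image_servers) for n in ("web-83", "web-84", "web-85"))

image_servers[0].load = 3  # pretend image-86 is already handling other searches
for request in ("search-1", "search-2", "search-3"):
    print(next(web_servers).handle(request))  # round-robin web servers, least-loaded image server
```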
The search requests are assigned to the web servers 83-85 in a round-robin fashion such that each search request is only received by one web server. Such a web request scheme can be implemented by commercially available DNS servers such as MetaDNS by Metainfo Inc. For example, a web request from user 78 may be assigned to web server 83, which would then transmit the request to the image server with the lightest load. Oracle Application Server can be used to determine which image server has the lightest load.

Other embodiments are within the following claims. For example, the searching of the database could be done by any health care provider, by a patient, by a researcher, or by any other party. The website on which the search is requested could be a health care portal providing a variety of other features and could be accessible to anyone with access to the World Wide Web.
Similar approaches could be used for other kinds of images in which features can be extracted for rapid comparison with similar common features of a large body of other images. The images could include military images, machine vision images and criminal investigation images, to name but a few.
The search server may be used to identify a suspect by comparing his identifying characteristics to those in stored images of known criminals. For example, a suspect whose tattooed arm was captured in an image by a security camera could be identified by matching the tattoo on his arm against a database of images of known criminals with tattooed arms.
Alternatively, the search server may be used to electronically search through an electronic archive of images to see if the archive contains prior images of a historical monument in a current image. The prior images could be compared to the current image to see if the historical monument was in need of repair.
What is claimed is:

Claims

1. A method comprising: receiving a reference image that includes a version of a common feature shared with other images; selecting a portion of said reference image that includes a version of the feature; determining index information based on said portion of said reference image; and matching said determined index information against stored sets of index information, each set of said stored index information being associated with one of said other images.
2. The method of claim 1 further comprising: extracting a representation of the version of the common feature; determining extracted index information based on said extracted representation; and matching said extracted index information against sets of stored extracted index information, each set of said extracted index information being associated with one of said other images.
3. The method of claim 1 wherein said index information includes the location of the portion in the reference image.
4. The method of claim 1 wherein said index information includes the size of the portion.
5. The method of claim 1 wherein said index information includes pixel properties of said portion.
6. The method of claim 5 wherein said pixel properties include gray-scale levels.
7. The method of claim 5 wherein said pixel properties include color levels.
8. The method of claim 2 wherein said extracted index information includes the location of the extracted representation in the reference image.
9. The method of claim 2 wherein said extracted index information includes the size of the extracted representation.
10. The method of claim 2 wherein said extracted index information comprises pixel properties of said extracted representation.
11. The method of claim 10 wherein said pixel properties include gray-scale levels.
12. The method of claim 10 wherein said pixel properties include color levels.
13. The method of claim 1 further comprising: receiving additional information associated with said reference image.
14. The method of claim 13 further comprising matching said additional information with sets of stored additional information, each set of said stored additional information being associated with one of said other images.
15. The method of claim 13 further comprising limiting said matching to less than all of said sets of stored indexing information based on said additional information.
16. The method of claim 13 wherein said common feature comprises tissue of an organism.
17. The method of claim 16 wherein said additional reference information comprises information about the type of tissue represented in the reference image.
18. The method of claim 13 wherein said additional reference information comprises information about storage format of the reference image.
19. The method of claim 16 wherein said additional reference information comprises descriptive information about the organism whose tissue is represented by the image.
20. The method of claim 19 wherein said descriptive information comprises the age of the organism.
21. The method of claim 19 where said descriptive information comprises the weight of the organism.
22. The method of claim 19 where said descriptive information comprises the state of health of the organism.
23. The method of claim 19 where said descriptive information comprises habits of the organism.
24. The method of claim 23 wherein said habits include whether or not the organism is a smoker.
25. The method of claim 1 further comprising storing said other images.
26. The method of claim 25 further comprising storing said reference image.
27. The method of claim 25, wherein said storing is done in a database server.
28. The method of claim 1 further comprising normalizing said reference image to conform to said other images prior to said selecting.
29. The method of claim 28 wherein said normalizing includes scaling.
30. The method of claim 28 wherein said normalizing includes changing the image storage format.
31. The method of claim 1, wherein said reference image is received on a communication channel.
32. The method of claim 31, wherein said communication channel comprises the Internet.
33. The method of claim 31, wherein said communication channel comprises an intranet.
34. The method of claim 1 further comprising providing access to results of said matching.
35. The method of claim 34, wherein said access to the results of the matching is provided through a web server.
36. The method of claim 34, wherein said results are presented in order of how closely they match.
37. The method of claim 1, wherein each of said sets of stored index information is assigned a score depending on said matching.
38. A system comprising: a first input communication interface equipped to receive a reference image, said reference image including a version of a common feature shared with other images; a first selection module configured to select a portion of said reference image containing said version of said common feature, said selection module also determining index information based on said selection; and a first matching module configured to match the index information against sets of stored index information, each set of said stored index information being associated with one of said other images.
39. The system of claim 38 further comprising an extractor module equipped to extract a representation of said reference version of said common feature from said selected portion, said feature extractor module also equipped to determine extracted index information based on said extraction, wherein: said matching module is further configured to match said extracted index information against sets of stored extracted index information, each set of said stored extracted index information being associated with one of said other images.
40. The system of claim 38 wherein: said system is equipped to receive additional information associated with said reference image.
41. The system of claim 40 wherein said matching module is further configured to match said additional information against sets of stored additional information, each set of said stored additional information being associated with one of said other images.
42. The system of claim 40 wherein said matching module is configured to limit said matching to less than all of said sets of stored index information based on said additional reference information.
43. The system of claim 40 wherein said common features are tissues of an organism.
44. The system of claim 43 wherein said additional reference information is about said organism.
45. The system of claim 43 wherein said additional reference information is about said reference image.
46. The system of claim 38 further comprising a normalizing module configured to normalize said reference image to conform to said other images prior to said selecting.
47. The system of claim 46 wherein said normalizing module is configured to scale said reference image.
48. The system of claim 46 wherein said normalizing module is configured to change the image storage format.
49. The system of claim 38, wherein said input communication interface comprises the Internet.
50. The system of claim 38, wherein said input communication interface comprises an intranet.
51. The system of claim 38 further comprising an output communication module for providing access to the results of said matching.
52. The system of claim 51, wherein said output communication module comprises a web server.
53. The system of claim 51, wherein said system is configured to present results in order of how closely they match said indexing information.
54. The system of claim 51 further comprising a data module for storing said other images.
55. The system of claim 54, wherein said data module comprises a database server.
56. The system of claim 54 wherein said results are configured to provide access to said other images that correspond to said sets of stored indexing information.
57. The system of claim 54 wherein said data module is configured to add said reference image to said stored reference images.
58. The system of claim 38 wherein at least one of said modules is implemented in a computer program.
59. The system of claim 58 wherein said computer program is executed on a processor.
60. The system of claim 59 wherein said processor executes computer programs associated with more than one module.
61. The system of claim 58 wherein said computer program is executed on more than one processor.
62. The system of claim 58 wherein different modules are executed on processors on different computers.
63. The system of claim 62 wherein said computers communicate with each other through the Internet.
64. The system of claim 62 wherein said computers communicate with each other through an intranet.
65. The system of claim 52 wherein said web server communicates with one of said modules of the system via CGI.
66. The system of claim 52 wherein said web server communicates with one of said modules of the system via ISAPI.
67. The system of claim 52 wherein said web server communicates with one of said modules of the system via NSAPI.
68. The system of claim 38 further comprising a second input communication interface equipped to receive a reference image, said second input communication interface being configured to receive a reference image when said first communication interface is unavailable.
69. The system of claim 38 further comprising a second selection module configured to select a portion of said reference image containing said version of said common feature, said second selection module being configured to select said portion when said first selection module is unavailable.
70. The system of claim 38 further comprising a second matching module configured to match the indexing information against sets of stored indexing information, said second matching module being configured to match said indexing information when said first matching module is unavailable.
71. The system of claim 68 wherein said first communication interface is unavailable because it is receiving another reference image.
72. The system of claim 69 wherein said first selection module is unavailable because it is selecting a portion from another reference image.
73. The system of claim 70 wherein said first matching module is unavailable because it is matching a set of indexing information from another reference image.
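
The system claims above recite normalizing a reference image (claims 28-30, 46-48), deriving index information from a selected portion, scoring each set of stored index information (claim 37), and presenting results ordered by how closely they match (claims 36, 53). The following Python sketch illustrates one possible reading of that flow. It assumes the Pillow imaging library is available and uses a coarse grey-level histogram as a stand-in for the index information; every name, size, and scoring choice in it is hypothetical rather than taken from the application.

# Illustrative sketch only; not the patented implementation.
# All function names, sizes, and the scoring rule are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple

from PIL import Image  # assumes Pillow is installed

TARGET_SIZE = (256, 256)   # assumed normalization size
HIST_BINS = 64             # assumed size of the index information


def normalize(reference: Image.Image) -> Image.Image:
    """Scale the reference image and change its storage format so it
    conforms to the stored images (claims 28-30, 46-48)."""
    return reference.convert("L").resize(TARGET_SIZE)


def index_info(img: Image.Image) -> List[float]:
    """Derive index information from the image; a coarse grey-level
    histogram stands in for whatever index the system actually uses."""
    hist = img.histogram()            # 256 bins for a mode-"L" image
    step = 256 // HIST_BINS
    coarse = [sum(hist[i:i + step]) for i in range(0, 256, step)]
    total = float(sum(coarse)) or 1.0
    return [c / total for c in coarse]


@dataclass
class StoredImage:
    image_id: str
    index: List[float]                # pre-computed stored index information


def match(reference_index: List[float],
          stored: List[StoredImage]) -> List[Tuple[str, float]]:
    """Assign each set of stored index information a score (claim 37) and
    return results ordered by how closely they match (claim 36)."""
    def score(a: List[float], b: List[float]) -> float:
        # Histogram intersection: a higher value means a closer match.
        return sum(min(x, y) for x, y in zip(a, b))

    ranked = [(s.image_id, score(reference_index, s.index)) for s in stored]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

Under those assumptions, a caller would compute index_info(normalize(reference)) once per query and pass the result to match() together with the pre-computed StoredImage records; the ordering of the returned (image_id, score) pairs mirrors the ranked presentation recited in claim 36.
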
PCT/US2000/024516 1999-09-03 2000-09-01 Searching for images electronically WO2001018739A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU73545/00A AU7354500A (en) 1999-09-03 2000-09-01 Searching for images electronically

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US39008999A 1999-09-03 1999-09-03
US09/390,089 1999-09-03

Publications (1)

Publication Number Publication Date
WO2001018739A1 true WO2001018739A1 (en) 2001-03-15

Family

ID=23541008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/024516 WO2001018739A1 (en) 1999-09-03 2000-09-01 Searching for images electronically

Country Status (2)

Country Link
AU (1) AU7354500A (en)
WO (1) WO2001018739A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4817050A (en) * 1985-11-22 1989-03-28 Kabushiki Kaisha Toshiba Database system
US5761655A (en) * 1990-06-06 1998-06-02 Alphatronix, Inc. Image file storage and retrieval system
US5901246A (en) * 1995-06-06 1999-05-04 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US5915038A (en) * 1996-08-26 1999-06-22 Philips Electronics North America Corporation Using index keys extracted from JPEG-compressed images for image retrieval

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1393242A1 (en) * 2001-05-18 2004-03-03 Leonard S. Schultz Methods and apparatus for image recognition and dictation
EP1393242A4 (en) * 2001-05-18 2007-06-06 Leonard S Schultz Methods and apparatus for image recognition and dictation
US7613336B2 (en) 2001-05-18 2009-11-03 Schultz Leonard S Methods and apparatus for image recognition and dictation
WO2007147059A2 (en) * 2006-06-15 2007-12-21 Revolutions Medical Corporation System for and method of performing a medical evaluation
WO2007147059A3 (en) * 2006-06-15 2008-12-18 Revolutions Medical Corp System for and method of performing a medical evaluation
WO2013066609A1 (en) * 2011-11-03 2013-05-10 Facebook, Inc. Feature-extraction-based image scoring
US8929615B2 (en) 2011-11-03 2015-01-06 Facebook, Inc. Feature-extraction-based image scoring
US9959320B2 (en) 2013-12-19 2018-05-01 Facebook, Inc. Generating card stacks with queries on online social networks
US10268733B2 (en) 2013-12-19 2019-04-23 Facebook, Inc. Grouping recommended search queries in card clusters
US10360227B2 (en) 2013-12-19 2019-07-23 Facebook, Inc. Ranking recommended search queries

Also Published As

Publication number Publication date
AU7354500A (en) 2001-04-10

Similar Documents

Publication Publication Date Title
US8189883B2 (en) Similar case search apparatus and method, and recording medium storing program therefor
Kumar et al. Content-based medical image retrieval: a survey of applications to multidimensional and multimodality data
US8180123B2 (en) Similar case search apparatus and method, and recording medium storing program therefor
US7139417B2 (en) Combination compression and registration techniques to implement temporal subtraction as an application service provider to detect changes over time to medical imaging
JP2009513205A (en) Image processing system especially used for diagnostic images
JP5502346B2 (en) Case image registration device, method and program, and case image search device, method, program and system
Lowe et al. Towards knowledge-based retrieval of medical images. The role of semantic indexing, image content representation and knowledge-based retrieval.
JP7082993B2 (en) Medical image processing equipment, methods and programs, diagnostic support equipment, methods and programs, and medical support systems and methods
CN110598722B (en) Multi-modal neuroimaging data automatic information fusion system
US10318709B2 (en) Method and system for cross-modality case-based computer-aided diagnosis
JP5094770B2 (en) Case image retrieval apparatus, method and program
JP5739700B2 (en) Similar case browsing system, similar case browsing method
JP2008200373A (en) Similar case retrieval apparatus and its method and program and similar case database registration device and its method and program
Bueno et al. How to add content-based image retrieval capability in a PACS
Morishita et al. New solutions for automated image recognition and identification: challenges to radiologic technology and forensic pathology
US20090245609A1 (en) Anatomical illustration selecting method, anatomical illustration selecting device, and medical network system
WO2001018739A1 (en) Searching for images electronically
US20070129970A1 (en) Method and apparatus for location and presentation of information in an electronic patient record that is relevant to a user, in particular to a physician for supporting a decision
CN110752027A (en) Electronic medical record data pushing method and device, computer equipment and storage medium
JP7420914B2 (en) Information processing device, information processing method, and information processing program
Lamard et al. Use of a JPEG-2000 Wavelet Compression Scheme for Content-Based Ophtalmologic Retinal Images Retrieval.
Jin et al. OBIA: an open biomedical imaging archive
Murugan et al. Efficient clustering of unlabeled brain DICOM images based on similarity
Jin et al. Content and semantic context based image retrieval for medical image grid
Antani et al. Geographically distributed complementary content-based image retrieval systems for biomedical image informatics

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP