CA2423485A1 - Selection of medical images based on image data - Google Patents
- Publication number: CA2423485A1
- Authority
- CA
- Canada
- Prior art keywords
- image
- images
- cardiac cycle
- computer
- readable medium
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B6/504 — Clinical applications involving diagnosis of blood vessels, e.g. by angiography
- A61B6/5288 — Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
- G06F18/00 — Pattern recognition
- G06T11/005 — Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
- G06T7/251 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
- A61B5/7289 — Retrospective gating, i.e. associating measured signals or images with a physiological event after the actual measurement or image acquisition
- A61B6/541 — Control of apparatus or devices for radiation diagnosis involving acquisition triggered by a physiological signal
- G06T2207/10072 — Tomographic images
- G06T2207/30048 — Heart; Cardiac
- G06T2211/412 — Dynamic (computed tomography)
- G06V2201/031 — Recognition of patterns in medical or anatomical images of internal organs
Abstract
Systems and methods for deriving a cardiac cycle signal by selecting images of a portion of a cardiovascular system include receiving a plurality of images from a scanner that have been recorded over a period of time. The images represent one or more locations along the extent of the cardiovascular system.
The images are then selected based on common criteria determined from the plurality of images and without reference to an external signal. The common criteria comprise changes in the size of a cross section of the aorta, changes in the volume of the heart, or changes in the area of a cross section of the heart. In addition, the criteria can include the mean pixel difference between adjacent images.
Description
SELECTION OF MEDICAL IMAGES BASED ON IMAGE DATA
Field

The present invention relates generally to medical imaging, and more particularly to systems and methods for performing temporal selection of medical images based on image data.
Copyright Notice/Permission

A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings hereto:
Copyright © 2000, Vital Images, Inc. All Rights Reserved.
Background

Heart disease is a significant public health problem. Therefore, in medicine, it is of considerable value to obtain cross-sectional and volumetric images of the human heart. Applications include angiography of coronary vessels for detecting stenosis, examination of the thoracic aorta for dissection of the vessel wall, and quantification of calcium deposits in the coronary arteries as a marker for atherosclerotic plaque.
Recently, this last application has gathered significant interest due to the finding that the exam can be performed with a conventional high-speed helical CT (Computed Tomography) scanner, as opposed to the more expensive and less prevalent electron beam CT scanner. However, helical CT scanners are still too slow to "freeze" the motion of the heart, resulting in some cross-sectional images being blurred and out of register with other images captured at a different phase of the heart's cycle of contraction (systole) and relaxation (diastole).
As a consequence, methods have been introduced to determine the heart cycle phase of axial CT images by simultaneously recording the electrocardiogram (EKG), and synchronizing this signal with the sequence of images. Since the high-amplitude "R-wave" of the EKG is a reliable and very brief indicator of the onset of ventricular contraction, it is straightforward to use its peak as an indicator of when heart motion will be near its most violent.
It is then possible to either shut down image acquisition during these times, or retrospectively eliminate images that have been acquired at these times. This method is called EKG gating.
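As a rough illustration of the EKG-gating idea described above, the sketch below (hypothetical names; the patent gives no implementation) finds R-wave peaks in a sampled EKG trace with a simple threshold-and-refractory rule. Images acquired within a window after each detected peak could then be discarded.

```python
import numpy as np

def rwave_peaks(ekg, fs, min_rr_s=0.3, frac=0.6):
    """Find R-wave peak indices in an EKG trace sampled at fs Hz.

    A simple threshold-plus-refractory sketch: a sample counts as a peak
    if it is a local maximum, exceeds `frac` of the global maximum, and
    falls at least `min_rr_s` seconds after the previous accepted peak.
    """
    ekg = np.asarray(ekg, dtype=float)
    thresh = frac * ekg.max()
    min_gap = int(min_rr_s * fs)  # refractory period in samples
    peaks = []
    for i in range(1, len(ekg) - 1):
        if ekg[i] >= thresh and ekg[i] > ekg[i - 1] and ekg[i] >= ekg[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks
```

A production detector would band-pass filter the trace first (as in the classic Pan–Tompkins approach); the refractory period here simply prevents double-counting one QRS complex.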
The disadvantage of EKG gating is that it significantly lengthens the exam time due to the need to wire the patient with electrodes, etc. It also introduces additional complications in the design of the CT scanner and of the software used to post-process the acquired images. It may additionally require manual verification of the selection of images, and introduces the possibility of mismatching EKG traces and images from different patients.
As a result, there is a need in the art for systems and methods that can filter images acquired during an image scan of a patient without the need for external monitoring devices such as EKG monitors.
Summary

The above-mentioned shortcomings, disadvantages and problems are addressed by the present invention, which will be understood by reading and studying the following specification.
In one embodiment of the invention, a method for selecting images of a portion of a cardiovascular system includes receiving a plurality of images from a scanner that have been recorded over a period of time. The images represent one or more locations along the extent of the cardiovascular system. The images are then selected based on common criteria determined from the plurality of images and without reference to an external signal.
In some embodiments, the common criteria comprises changes in the size of a cross section of the aorta. In alternative embodiments, the common criteria comprises changes in the volume of the heart. In still further embodiments, the common criteria comprises changes in the area of a cross section of the heart.
In yet other embodiments, the criteria includes the mean pixel difference between adjacent images.
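The mean-pixel-difference criterion can be sketched as follows (a minimal illustration with hypothetical names, not the patent's implementation): the mean absolute difference between temporally adjacent images acts as a motion signal, and images bracketed by small differences are the ones most likely acquired during the quiet diastolic phase.

```python
import numpy as np

def mean_pixel_difference(images):
    """Motion surrogate: mean absolute pixel difference between each
    pair of temporally adjacent images.

    `images` is an (N, H, W) array of slices recorded over time; the
    returned signal has N-1 samples and tends to peak during systole,
    when the heart moves most.
    """
    images = np.asarray(images, dtype=float)
    return np.abs(np.diff(images, axis=0)).mean(axis=(1, 2))

def select_quiet_images(images, keep_fraction=0.5):
    """Keep the images bracketed by the smallest inter-frame
    differences, i.e. those most likely acquired near diastole."""
    d = mean_pixel_difference(images)
    # score each image by the differences to its neighbours
    score = np.empty(len(images))
    score[0], score[-1] = d[0], d[-1]
    score[1:-1] = np.maximum(d[:-1], d[1:])
    n_keep = max(1, int(keep_fraction * len(images)))
    return np.sort(np.argsort(score)[:n_keep])
```

The same scoring idea would apply to the other criteria (aortic cross-section size, heart volume or area), substituting the measured quantity for the pixel difference.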
The present invention describes systems, clients, servers, methods, and computer-readable media of varying scope. In addition to the aspects and advantages of the present invention described in this summary, further aspects and advantages of the invention will become apparent by reference to the drawings and by reading the detailed description that follows.
Brief Description Of The Drawings

FIG. 1 is a block diagram of the hardware and operating environment in which different embodiments of the invention can be practiced;
FIG. 2 is a diagram illustrating a system level overview of an exemplary embodiment of the invention;
FIG. 3 is a flowchart illustrating a method for performing retrospective gating of medical image data according to an exemplary embodiment of the invention;
FIG. 4 is a flowchart providing further details on determining a signal used to filter images; and FIG. 5 provides a graph comparing the image-based retrospective gating to prior art EKG gating.
Detailed Description

In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense.
In the Figures, the same reference number is used throughout to refer to an identical component which appears in multiple Figures. Signals and connections may be referred to by the same reference number or label, and the actual meaning will be clear from its use in the context of the description.
The detailed description is divided into multiple sections. In the first section the hardware and operating environment of different embodiments of the invention is described. In the second section, the software environment of varying embodiments of the invention is described. In the third section, methods of various embodiments of the invention are described. In the final section, a conclusion is provided.
Hardware and Operating Environment

FIG. 1 is a diagram of the hardware and operating environment in conjunction with which embodiments of the invention may be practiced. The description of FIG. 1 is intended to provide a brief, general description of suitable computer hardware and a suitable computing environment in conjunction with which the invention may be implemented. Although not required, the invention is described in the general context of computer-executable instructions, such as program modules, being executed by a computer, such as a personal computer, workstation, or a server computer.
Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
As shown in FIG. 1, the computing system 100 includes a processor.
The invention can be implemented on computers based upon microprocessors such as the PENTIUM® family of microprocessors manufactured by the Intel Corporation, the MIPS® family of microprocessors from the Silicon Graphics Corporation, the POWERPC® family of microprocessors from both the Motorola Corporation and the IBM Corporation, the PRECISION ARCHITECTURE® family of microprocessors from the Hewlett-Packard Company, the SPARC®
family of microprocessors from the Sun Microsystems Corporation, or the ALPHA® family of microprocessors from the Compaq Computer Corporation.
Computing system 100 represents any personal computer, laptop, server, or even a battery-powered, pocket-sized, mobile computer known as a hand-held PC.
The computing system 100 includes system memory 113 (including read-only memory (ROM) 114 and random access memory (RAM) 115), which is connected to the processor 112 by a system data/address bus 116. ROM 114 represents any device that is primarily read-only including electrically erasable programmable read-only memory (EEPROM), flash memory, etc. RAM 115 represents any random access memory such as Synchronous Dynamic Random Access Memory.
Within the computing system 100, input/output bus 118 is connected to the data/address bus 116 via bus controller 119. In one embodiment, input/output bus 118 is implemented as a standard Peripheral Component Interconnect (PCI) bus. The bus controller 119 examines all signals from the processor 112 to route the signals to the appropriate bus. Signals between the processor 112 and the system memory 113 are merely passed through the bus controller 119. However, signals from the processor 112 intended for devices other than system memory 113 are routed onto the input/output bus 118.
Various devices are connected to the input/output bus 118 including hard disk drive 120, floppy drive 121 that is used to read floppy disk 151, and optical drive 122, such as a CD-ROM drive that is used to read an optical disk 152.
The video display 124 or other kind of display device is connected to the input/output bus 118 via a video adapter 125.
A user enters commands and information into the computing system 100 by using a keyboard 40 and/or pointing device, such as a mouse 42, which are connected to bus 118 via input/output ports 128. Other types of pointing devices (not shown in FIG. 1) include track pads, trackballs, joysticks, data gloves, head trackers, and other devices suitable for positioning a cursor on the video display 124.
As shown in FIG. 1, the computing system 100 also includes a modem 129. Although illustrated in FIG. 1 as external to the computing system 100, those of ordinary skill in the art will quickly recognize that the modem 129 may also be internal to the computing system 100. The modem 129 is typically used to communicate over wide area networks (not shown), such as the global Internet. The computing system may also contain a network interface card 53, as is known in the art, for communication over a network.
Software applications 136 and data are typically stored via one of the memory storage devices, which may include the hard disk 120, floppy disk 151, and CD-ROM 152, and are copied to RAM 115 for execution. In one embodiment, however, software applications 136 are stored in ROM 114 and are copied to RAM 115 for execution or are executed directly from ROM 114.
In general, the operating system 135 executes software applications 136 and carries out instructions issued by the user. For example, when the user wants to load a software application 136, the operating system 135 interprets the instruction and causes the processor 112 to load software application 136 into RAM 115 from either the hard disk 120 or the optical disk 152. Once software application 136 is loaded into the RAM 115, it can be used by the processor 112.
In case of large software applications 136, processor 112 loads various portions of program modules into RAM 115 as needed.
The Basic Input/Output System (BIOS) 117 for the computing system 100 is stored in ROM 114 and is loaded into RAM 115 upon booting. Those skilled in the art will recognize that the BIOS 117 is a set of basic executable routines that have conventionally helped to transfer information between the computing resources within the computing system 100. These low-level service routines are used by operating system 135 or other software applications 136.
In one embodiment computing system 100 includes a registry (not shown), which is a system database that holds configuration information for computing system 100. For example, Windows 95®, Windows 98®, Windows NT®, and Windows 2000® by Microsoft maintain the registry in two hidden files, called USER.DAT and SYSTEM.DAT, located on a permanent storage device such as an internal disk.
Software Environment

The embodiments of the invention describe a software environment of systems and methods that provide for the retrospective gating of medical images.
FIG. 2 is a block diagram describing the major components of such a system.
As shown, the system includes an image scanner 202 and an image processing system 206.
Image scanner 202 in one embodiment of the invention is a CT scanner.
The scanner can be a high-speed helical CT scanner, or it can be an electron beam CT scanner. However, the invention is not limited to CT scanners, and any scanner that can provide a sequence of images recorded over a period of time is within the scope of the invention. For example, scanner 202 could be a Magnetic Resonance Imaging (MRI) or ultrasound scanner.
Scanner 202 produces image data 204 that comprises a sequence of two-dimensional images of the human body. This image data is then sent to image processing system 206 for processing. In one embodiment of the invention, image processing system 206 is the ImageGate system from Vital Images, Inc.
The image data can be transferred from scanner 202 to image processing system 206 using any data transmission means, including tape media, CD-ROM, floppy-disk, removable hard drive, and network means, including the Internet.
Image processing system 206 is a suitably configured computer, such as the computer illustrated in FIG. 1, and employs the methods detailed below to perform retrospective gating of the image data. The output of system 206 comprises cardiac cycle signal 208 and filtered image data 210. Filtered image data 210 comprises the image data that corresponds to images acquired at desired points in the cardiac cycle signal 208.
This section has described the various system components in a system that performs image-based retrospective gating of cardiac images. As those of skill in the art will appreciate, the software can be written in any of a number of programming languages known in the art, including but not limited to C/C++, Visual Basic, Smalltalk, Pascal, Ada and similar programming languages. The invention is not limited to any particular programming language for implementation.
Methods of an Exemplary Embodiment of the Invention

In the previous section, a system level overview of the operation of an exemplary embodiment of the invention was described. In this section, the particular methods of various embodiments of the invention performed by an operating environment executing an exemplary embodiment are described by reference to the flowcharts shown in FIGS. 3 and 4. The methods to be performed by the operating environment constitute computer programs made up of computer-executable instructions. Describing the methods by reference to a flowchart enables one skilled in the art to develop such programs, including such instructions to carry out the methods on suitable computers (the processor of the computer executing the instructions from computer-readable media). The methods illustrated in FIGS. 3 and 4 are inclusive of the acts required to be taken by an operating environment executing an exemplary embodiment of the invention.
The methods of the various embodiments of the invention illustrated below operate to perform time categorization and selection of a subset of images from a plurality of images based on the image data alone, that is, without reference to any external signals such as an EKG. In some embodiments of the invention, the plurality of images comprise a series of images that are received from a CT scanning system in which multiple images are taken at different points in time of a single space of a body. In other embodiments, the plurality of images can comprise a sequence of images in which one image of a space of a body at a single point in time is taken.
A method for categorizing and selecting images according to an embodiment of the invention for performing image-based retrospective gating of scanned medical image data is illustrated in FIG. 3. A system executing the method begins by receiving scanned image data (block 302). In one embodiment of the invention, the scanned image data is received from a CT
scanner such as scanner 202 (FIG. 2). However, the invention is not limited to any particular method of obtaining the scanned image data. As will be appreciated by those of skill in the art, any system capable of producing a volumetric sequence of images is within the scope of the invention. It is desirable that the images are heavily overlapped axial images of the chest.
The axial overlap provides the ability for the system to select a subset of images acquired when the heart was most at rest and still adequately sample the heart to its full anatomical extent.
In one embodiment of the invention, the categorization and selection of images operates to select those images that are the least blurred. In the case of cardiac image data, the least blurred images represent those points in time in the cardiac cycle when the heart is at rest in between beats. In some embodiments of the invention, a Fourier transform is used to determine "blurriness" of an image.
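The text above only states that a Fourier transform is used to gauge blurriness. One plausible sketch, assuming the blur metric is the fraction of 2D spectral energy at high spatial frequencies (a common choice, but an assumption not confirmed by the source), is:

```python
import numpy as np

def sharpness_score(image, cutoff=0.25):
    """Illustrative Fourier-based sharpness measure (metric is an assumption).

    Returns the fraction of 2D spectral energy at radial spatial
    frequencies above `cutoff` (in cycles/pixel); blurred images,
    having lost high-frequency content, score lower.
    """
    img = np.asarray(image, dtype=np.float64)
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)      # radial frequency per bin
    total = spec.sum()
    return float(spec[radius > cutoff].sum() / total) if total else 0.0
```

Under this metric, the least blurred images of a sequence are those with the highest scores.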
In alternative embodiments, the Mean Pixel Difference (MPD) between one image and a subsequent adjacent image is determined. The images with the lowest MPD are selected as being the least blurry images. In MPD, each pixel on a first axial image is subtracted from the corresponding pixel on the subsequent adjacent axial image. The absolute values of the pixel differences for the image are summed and divided by the number of pixels in the image. This single number for each image is the MPD. An image with a lower MPD than another image can be said to be less blurry.
Next, those images that are the least blurry, or whose blurriness is within a particular tolerance, are selected for inclusion in the set of images for which further analysis, such as volumetric analysis, will be performed.
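The MPD computation and least-blurry selection described above can be sketched as follows; `least_blurry_indices` and its `keep` parameter are illustrative names, not from the source:

```python
import numpy as np

def mean_pixel_difference(img_a, img_b):
    """Mean Pixel Difference between two adjacent axial images:
    sum of absolute per-pixel differences divided by the pixel count."""
    a = np.asarray(img_a, dtype=np.float64)
    b = np.asarray(img_b, dtype=np.float64)
    return np.abs(a - b).sum() / a.size

def least_blurry_indices(images, keep):
    """Rank each image by the MPD to its following neighbour
    (lower MPD = less motion blur) and return the `keep` best indices."""
    mpds = [mean_pixel_difference(images[i], images[i + 1])
            for i in range(len(images) - 1)]
    return sorted(range(len(mpds)), key=lambda i: mpds[i])[:keep]
```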
The image processing system derives a cardiac cycle signal from the image data (block 304). The various phases of the cardiac cycle can be derived from the image data by detecting in the image data the corresponding motion or changes of different parts of the heart and/or other parts of the body affected by the motion of the heart, such as blood vessels. By measuring this motion as a function of a time, a periodic signal can be derived that can be used to determine heart rate and cardiac cycle phase for any time during the image data collection.
Finally, the image data is filtered according to the cardiac cycle signal determined at block 304 to select those images that were acquired when the heart was most at rest (block 306). The filtered images can then be used to render accurate volumetric images of the heart. Various methods can be used to determine the cardiac cycle in block 304 above. The methods generally determine the cycle based on criteria common to the images. One such method according to an embodiment of the invention detects changes in the cross-section of the aorta, and is illustrated in FIG. 4. The method begins by estimating the position of the aorta (block 402). Various means can be used to detect the position of the aorta. For example, in some embodiments of the invention, a matching filter as is known in the art is used to detect the aorta. In alternative embodiments, an Active Appearance Model is used to detect the position. In further alternative embodiments, an Active Shape Model is used.
In still further embodiments of the invention, an algorithm called the Hough Transform is used to find the approximate aorta position. As is known in the art, the Hough Transform can be regarded as a generalized template matching method, and is typically used to extract edges or curves from an image.
The Hough Transform can be used to extract circles and even generalized (perhaps non-symmetrical) edges. Use of the Hough Transform is desirable because it is invariant to rotation and translation. Further details on the Hough Transform can be found in U.S. Patent 3,069,654, which is hereby incorporated by reference herein. For the purpose of the Hough Transform, the aorta can be described as a circle with a radius in the range of 8-25 mm.
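A minimal accumulator-based circular Hough transform, of the kind the text describes for finding a roughly circular aorta cross-section, might look like the following sketch (the angular sampling and pixel-unit radii are illustrative assumptions; the source specifies radii only in millimetres):

```python
import numpy as np

def hough_circle(edge_points, shape, radii):
    """Minimal circular Hough transform: vote for (cx, cy) centers per radius.

    edge_points: iterable of (x, y) edge-pixel coordinates.
    shape: (height, width) of the image.
    radii: candidate radii in pixels (e.g. the aorta's 8-25 mm converted
    to pixels would go here).
    Returns the best-voted (cx, cy, r).
    """
    h, w = shape
    best = (0, 0, radii[0], -1)                 # cx, cy, r, votes
    thetas = np.linspace(0.0, 2 * np.pi, 64, endpoint=False)
    for r in radii:
        acc = np.zeros((h, w), dtype=np.int32)  # one accumulator per radius
        for (x, y) in edge_points:
            # Each edge point votes for all centers at distance r from it.
            cx = np.rint(x - r * np.cos(thetas)).astype(int)
            cy = np.rint(y - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            np.add.at(acc, (cy[ok], cx[ok]), 1)
        cy0, cx0 = np.unravel_index(acc.argmax(), acc.shape)
        if acc[cy0, cx0] > best[3]:
            best = (cx0, cy0, r, acc[cy0, cx0])
    return best[:3]
```

In practice a production detector would vote with gradient-weighted fit values rather than binary edge votes, as the text goes on to describe.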
In one embodiment of the invention, the position of the aorta is determined only in the first image. For the following images, the results of segmentation in the previous image are used. This is desirable because it reduces the time required to execute the method. The approximate aorta border in image (z + 1) is defined as a circle of radius R_res, where R_res is the radius of the circle found by the Hough Transform in image z. The circle is centered at position (Xc(z), Yc(z)), which is the center of the aorta found in the previous image z. Because processing each image is a sequential process, it is vulnerable to error propagation. In one embodiment of the invention, the likelihood of possible error propagation is reduced by using the same circle radius to initialize the aorta segmentation and only updating its position in subsequent images.
In another embodiment of the invention, a voting mechanism for detecting the aorta is used. The Hough Transform is applied to a sequence of five consecutive images. A result is accepted if the aorta center is found in a similar location in at least four of these images. If such a match is not found, another set of five images is investigated.
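The five-image voting rule can be sketched as below; the pixel tolerance `tol` and the use of the median center as the agreement reference are assumptions, since the text only requires "a similar location" in at least four of the five images:

```python
import statistics

def accept_by_voting(centers, tol=5.0, needed=4):
    """Voting over consecutive per-image aorta-center detections.

    centers: list of (x, y) detected centers, one per image.
    A detection set is accepted when at least `needed` centers lie
    within `tol` pixels of the median center; otherwise None is
    returned and the next set of images would be investigated.
    """
    mx = statistics.median(x for x, _ in centers)
    my = statistics.median(y for _, y in centers)
    votes = sum(1 for x, y in centers
                if abs(x - mx) <= tol and abs(y - my) <= tol)
    return (mx, my) if votes >= needed else None
```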
In the Hough transform, a fit value is determined for every pixel in an image. The fit value represents the probability that the point belongs to the aorta border. The computation of fit values is based on a priori knowledge of the aorta properties in CT images.
For every pixel (x, y) in an image, the fit value is computed as:

fit(x, y) = 0, if I(x, y) < -100 HU or I(x, y) > 200 HU
fit(x, y) = f(g(x, y) . n_expected), otherwise (1)

where:
I(x, y) is the image intensity in Hounsfield Units [HU];
g is the image gradient; and
n_expected is the expected direction of the gradient.
In one embodiment of the invention, a directional gradient detector based on a 5 x 5 Gaussian mask is used. In the case of the Hough Transform, the expected direction is not known. Therefore, one embodiment of the invention uses the following formula to set n_expected:

n_expected = g / |g| (2)

Function f(.) in formula (1) is a transformation function that limits the maximum value of the considered gradient. The transformation function f(.) in one embodiment of the invention is:

f(a) = a, if a < a_MAX
f(a) = a_MAX, if a >= a_MAX (3)

where a_MAX is the gradient response to a step edge of 100 HU.
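Formulas (1)-(3) can be combined into a single per-pixel routine, sketched here with `fit_value`, `n_expected`, and `a_max` as illustrative parameter names:

```python
import numpy as np

def fit_value(intensity, gradient, n_expected, a_max):
    """Per-pixel fit value following formulas (1)-(3).

    Pixels outside the expected aorta intensity range (-100..200 HU)
    get fit 0; otherwise the fit is the gradient component along the
    expected direction, clamped at a_max (the gradient response to a
    100 HU step edge).
    """
    if intensity < -100 or intensity > 200:
        return 0.0                              # formula (1), first case
    a = float(np.dot(gradient, n_expected))     # g . n_expected
    return a if a < a_max else a_max            # formula (3) clamp
```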
After estimating the position of the aorta, the method then segments (i.e., detects) the aorta in each of the 2D images in the sequence (block 404). In some embodiments of the invention, edge-based dynamic programming as is known in the art is applied to the images in order to segment the aorta. In alternative embodiments, Active Contours, also referred to as "Snakes," are used to detect the aorta. In further embodiments of the invention, region growing algorithms are used to detect the aorta. Such algorithms are known in the art.
In one particular embodiment of the invention, the aorta is segmented using a dynamic programming based boundary detection algorithm. Details of such a boundary detection algorithm are presented in M. Sonka, V. Hlavac, and R. Boyle, Image Processing, Analysis, and Machine Vision, PWS Publishing, 2nd edition, 1998, pp. 158-163, which is hereby incorporated by reference herein.
The approximate aorta location determined at block 402 is used to create a Region of Interest (ROI) within which the aorta boundary is sought. For the purpose of searching the image for the aorta, the ROI is mapped into a rectangular graph, in which every node (x_G, y_G) corresponds to one image pixel. This is expressed in the following formula:

(x_G, y_G) = centerlinePoint(x_G) + n(x_G) . (y_G - width/2) (4)

where:
n is the unit normal vector to the circular centerline at the x_G-th point along the centerline; and
width is the width of the ROI.
Every node in the rectangular graph has a cost associated with it. The cost is given by the formula:

cost(x, y) = max_{i,j}( fit(i, j) ) - fit(x, y) (5)

where:
fit is defined by Equation (1) with n_expected = n, the unit normal to the ROI centerline.
The method finds a path of minimal cost between any node in the first column and any node in the last column of the rectangular graph. To ensure that the found boundary is a closed contour, the search is restricted by forcing the first and last points of the minimal cost path to have the same y_G coordinate in the rectangular graph.
In addition to assigning a cost to every graph node, a cost is assigned to every link between graph nodes. Only nodes in two consecutive columns are connected by links. The costs are assigned according to the following formula:

cost[(x_i, y_j), (x_i + 1, y_k)] = 0, if k = j
cost[(x_i, y_j), (x_i + 1, y_k)] = 0.3 . cost(x_i, y_j), if k = j +/- 1
cost[(x_i, y_j), (x_i + 1, y_k)] = infinity, otherwise (6)

Such a cost assignment effectively connects only nodes whose y coordinates differ by no more than one. It ensures connectivity of the resulting boundary. The middle rule in Equation (6) for assigning cost reflects the knowledge that the sought border should be circular. Thus, a border that is parallel to the ROI centerline (a circle) is preferred over one that diverges from the ROI centerline direction. The minimum cost path is then mapped into the original image and represents the aorta border.
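The column-by-column dynamic program with the link costs of formula (6) can be sketched as follows; running the recursion once per candidate start row to enforce the closed-contour constraint is one straightforward (if not the fastest) implementation choice:

```python
import numpy as np

def min_cost_closed_path(cost):
    """Dynamic-programming boundary search on a (rows x cols) node-cost grid.

    Transitions between consecutive columns allow a row change of at
    most one; staying on the same row is free, moving by one row adds
    0.3 * the source node's cost (formula (6)).  The contour is closed
    by requiring the first and last columns to use the same row: the DP
    is run once per candidate start row and the cheapest closed path wins.
    """
    rows, cols = cost.shape
    best_path, best_total = None, np.inf
    for start in range(rows):
        acc = np.full((rows, cols), np.inf)   # accumulated path cost
        back = np.zeros((rows, cols), dtype=int)
        acc[start, 0] = cost[start, 0]
        for c in range(1, cols):
            for r in range(rows):
                for dr in (-1, 0, 1):
                    p = r + dr                # predecessor row in column c-1
                    if 0 <= p < rows and np.isfinite(acc[p, c - 1]):
                        link = 0.0 if dr == 0 else 0.3 * cost[p, c - 1]
                        total = acc[p, c - 1] + link + cost[r, c]
                        if total < acc[r, c]:
                            acc[r, c], back[r, c] = total, p
        if acc[start, cols - 1] < best_total:  # closed: same row at both ends
            best_total = acc[start, cols - 1]
            path, r = [], start
            for c in range(cols - 1, 0, -1):   # backtrack through columns
                path.append(r)
                r = back[r, c]
            path.append(r)
            best_path = path[::-1]
    return best_path, best_total
```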
After determining the position of the aorta, the method then computes the area of the aorta cross-section in the image (block 406). The cross-sectional area in the image is then used to compute an approximation of the volume of aorta at the time of an image acquisition. The area is computed as an area enclosed by the aorta border found by the aorta segmentation in block 404. In one embodiment of the invention, for every image, the found border is projected onto a plane that is perpendicular to the aorta direction. This provides better approximation of the true cross-sectional area of the aorta in cases when the image acquisition plane is not perpendicular to the aorta direction.
The aorta direction for a given image can be approximated by a tangent to a line connecting centers of gravity of outlined aorta boundaries. In some embodiments of the invention, the line is smoothed using algorithms known in the art.
Every border point (x, y, z) is projected into a point (x_P, y_P, z_P) as follows:

(x_P, y_P, z_P) = (x, y, z) - t . (t . c) (7)

where:
t is the unit vector representing the aorta direction; and
c is the vector connecting the boundary point (x, y, z) with the aorta center in the given image.
The aorta center can be computed from its boundary points, yielding the following computation of the vector c:

c = (x, y, z) - (1/N_B) . Sum_{i=1}^{N_B} (x_i, y_i, z_i) (8)

where:
(x_i, y_i, z_i) are the aorta boundary points; and
N_B is the total number of boundary points.
In some embodiments of the invention, the area computation is simplified by expressing the projected border in a new coordinate system (w_x, w_y, w_z) in which the w_z coordinate of all the boundary points is zero. The new coordinate system is determined as follows:

w_z = t
w_y = (e_y - w_z . (w_z . e_y)) / |e_y - w_z . (w_z . e_y)|
w_x = (e_x - w_z . (w_z . e_x) - w_y . (w_y . e_x)) / |e_x - w_z . (w_z . e_x) - w_y . (w_y . e_x)| (9)

where e_x and e_y are the unit vectors of the original coordinate axes.
Thus, the coordinate transformation can be computed as:

(x', y', z') = T . (x_P, y_P, z_P) (10)

where:

T = | w_x . e_x   w_x . e_y   w_x . e_z |
    | w_y . e_x   w_y . e_y   w_y . e_z |
    | w_z . e_x   w_z . e_y   w_z . e_z | (11)

Because z' = 0 for all points of the transformed border, in some embodiments the area of the aorta cross-section is computed using a discrete implementation of Simpson's Rule. Such algorithms are known in the art.
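The projection of formulas (7)-(11) followed by an area computation can be sketched as below; the shoelace formula is used here in place of the discrete Simpson's-rule integration the text mentions, and the in-plane basis is built directly rather than via the full Gram-Schmidt of formula (9):

```python
import numpy as np

def cross_section_area(border, t):
    """Approximate aorta cross-sectional area from ordered 3D border points.

    border: (N, 3) array of boundary points for one image, in order
    around the contour.
    t: the local aorta direction vector.
    Each point is projected onto the plane through the border centroid
    perpendicular to t (formula (7)), expressed in in-plane coordinates,
    and the enclosed polygon area is computed with the shoelace formula.
    """
    border = np.asarray(border, dtype=np.float64)
    t = np.asarray(t, dtype=np.float64)
    t = t / np.linalg.norm(t)
    center = border.mean(axis=0)               # centroid, as in formula (8)
    c = border - center
    proj = border - np.outer(c @ t, t)         # drop the along-axis component
    # Build an in-plane orthonormal basis (w_x, w_y) perpendicular to t.
    ref = np.array([1.0, 0.0, 0.0])
    if abs(ref @ t) > 0.9:                     # avoid a near-parallel reference
        ref = np.array([0.0, 1.0, 0.0])
    w_x = ref - t * (ref @ t)
    w_x /= np.linalg.norm(w_x)
    w_y = np.cross(t, w_x)
    u = (proj - center) @ w_x                  # in-plane coordinates
    v = (proj - center) @ w_y
    # Shoelace formula on the closed 2D polygon.
    return 0.5 * abs(np.dot(u, np.roll(v, -1)) - np.dot(v, np.roll(u, -1)))
```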
The sequence of values representing the aorta cross-sectional areas represents a regularly sampled in-time signal that can be used to compute the heart rate and the phase of the cardiac cycle for every image in the sequence.
In some embodiments of the invention, the derived signal is filtered to remove noise and to simplify cardiac phase determination (block 408). In some embodiments of the invention, filtering is performed by a Butterworth filter. It is desirable that the filter is designed so that there is no more than 3 dB attenuation in the pass band and at least 40 dB attenuation in the stop band.
In order to determine the pass band of the filter, the aorta area signal is first High Pass Filtered (HPF) in order to remove the DC component and low frequencies that may correspond to the change of aorta size in space rather than in time. In some embodiments, the HPF signal (s_HPF) is computed using a digital high-pass filter based on the Butterworth filter.
Next, the heart rate is determined. In some embodiments, it is estimated by determining the main frequency component in the area signal. It is desirable to avoid rasterization and to ensure stability, thus some embodiments determine the main frequency not from the original signal, but from the signal's autocorrelation:

f_MAX = arg max Fourier{s_HPF * s_HPF} (12)

where * denotes the correlation operation.
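Formula (12) can be sketched using the fact that the Fourier transform of a signal's autocorrelation is the signal's power spectrum:

```python
import numpy as np

def dominant_frequency(s, fs):
    """Estimate the main frequency of a signal via its autocorrelation.

    Following formula (12): the Fourier transform of the autocorrelation
    equals the power spectrum |FFT(s)|^2, so we take the frequency of the
    largest non-DC peak of the power spectrum.  fs is the sampling rate
    in Hz (for the aorta-area signal, the image acquisition rate).
    """
    s = np.asarray(s, dtype=np.float64)
    s = s - s.mean()                         # drop the DC component first
    power = np.abs(np.fft.rfft(s)) ** 2      # FFT of autocorrelation
    freqs = np.fft.rfftfreq(s.size, d=1.0 / fs)
    return freqs[1:][np.argmax(power[1:])]   # skip the zero-frequency bin
```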
The raw aorta area signal is then filtered with a Butterworth filter with a pass band of <0.7 f, 1.3 f>, which represents an estimated variation of the heart rate for a single patient during the scanning procedure. In order to improve the filtering results, some embodiments use the signal obtained using the above-described filtering to compute new values of f_L and f_H. The beginning of each period is determined by positively sloped zero crossings of the signal. The newly obtained limits are used to finally filter the signal according to the following formula:

s_BPF(x) = Butterworth{s(x), f_L, f_H} (13)

where:
s denotes the raw aorta area signal; and
<f_L, f_H> is the frequency range of the pass band.

In alternative embodiments, signal filtering is performed by applying an alternative HPF to the aorta signal, again in order to remove the DC component and low frequencies that may correspond to change of aorta size in space rather than in time. In these embodiments, the HPF can be approximated by subtracting a moving average from the original signal according to the following:

s_HPF(x) = s(x) - (1/(2k+1)) . Sum_{i=-k}^{k} s(x + i) (14)

where s denotes the raw aorta area signal and 2k+1 is the moving-average window width.
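The heart-rate-centered band-pass of formula (13) can be sketched with SciPy's Butterworth design; the filter order and the zero-phase forward-backward application are illustrative assumptions, not specified by the source:

```python
from scipy.signal import butter, filtfilt

def bandpass_around_heart_rate(s, fs, f_est):
    """Band-pass the aorta-area signal around an estimated heart rate.

    Pass band <0.7 * f_est, 1.3 * f_est> as in the text, covering the
    expected variation of a single patient's heart rate during the scan.
    A 4th-order Butterworth is applied forward-backward (filtfilt) so
    the filtered signal is not phase-shifted relative to the images.
    """
    low, high = 0.7 * f_est, 1.3 * f_est
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, s)
```

The refined limits f_L and f_H obtained from the first pass would simply replace the 0.7/1.3 multiples in a second call.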
Next, in order to remove noise from the signal and thus simplify the determination of the points of maximum positive gradient that correspond to the opening of the aortic valve, the signal is smoothed.
In alternative embodiments of the invention, smoothing is performed using Band Pass Filtering (BPF) in the frequency domain. A moving window Fourier Transform can be used to smooth the signal separately in short segments.
The window width is chosen so that it corresponds to approximately three cardiac cycles and is a power of 2 (to simplify use of the Fast Fourier Transform algorithm). The width of the BPF is chosen to be one, thus considering only the basic frequency that occurs in the given segment of the signal. The BPF frequency is selected as the frequency with the highest magnitude in the spectrum (not considering the zero frequency - the signal offset). The window for the Fourier Transform is always moved by just one sample, and the smoothed signals are averaged in the time domain. The filtering is illustrated in the following formula:
BPF_i(s_<i,i+w>) = Fourier^{-1}{ Fourier{s_<i,i+w>} . Filter_{f_i} } (15)

where:
Filter_{f_i} is a BPF with a passing frequency:

f_i = arg max_{f in <1, w/2>} Fourier{s_<i,i+w>}_f (16)

and the window width w is a power of 2.
The final filtered signal is thus:

s_BPF(i) = [ Sum_{j=max(i-w,0)}^{min(i,N-w)} BPF(s_<j,j+w>)(i - j) ] / [ min(i, N - w) - max(i - w, 0) ] (17)

where N is the number of samples in the aorta area signal.
After the signal has been filtered, the method determines the cardiac phase (block 410). Depending on the heart rate of the person being scanned and on the image acquisition speed, 6-12 images can typically be obtained per cardiac cycle. It is generally the case that the opening of the aortic valve is followed by the maximum rate of change (maximum gradient) of aortic pressure, which is reflected by the maximum gradient of increase of the aorta cross-sectional area.
This happens approximately 0.1 seconds after the beginning of ventricular depolarization. To match the aorta area signal to the heart cycle, the method searches for the points of maximum positive gradient, which generally correspond to the moments of aortic valve opening.
The filtered signal approximates the time change of the aorta cross-sectional area (volume) during a cardiac cycle. Thus, one period of the signal corresponds to one cardiac cycle. Since the signal can be approximated by a sine wave, the maximum positive derivative of the signal can be determined and used as the point of a positively sloped zero crossing. Using linear interpolation, the method can determine the zero crossing point x_Z between signal points (x, y_1) and (x + 1, y_2) as:

x_Z = x + y_1 / (y_1 - y_2) (18)

Thus, the set of times {x_Z} represents the instants corresponding to aortic valve openings. From the aortic valve openings, the point in the cardiac cycle signal can be determined where the heart is most at rest.
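The zero-crossing search of formula (18) can be sketched as:

```python
def zero_crossings(s):
    """Positively sloped zero crossings of a sampled signal.

    For each sample pair (x, y1), (x+1, y2) with y1 < 0 <= y2, the
    sub-sample crossing time is found by linear interpolation,
    x_z = x + y1 / (y1 - y2), matching formula (18).  Each crossing
    marks one aortic valve opening, i.e. the start of a cardiac cycle.
    """
    crossings = []
    for x in range(len(s) - 1):
        y1, y2 = s[x], s[x + 1]
        if y1 < 0 <= y2:
            crossings.append(x + y1 / (y1 - y2))
    return crossings
```

The intervals between consecutive crossings then give the per-cycle heart period, from which the heart-rate statistics used in the post-processing below can be computed.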
In order to remove possible errors in the signal filtering, some embodiments of the invention perform post-processing on the signal. Based on the derived heart rate statistics (mean and standard deviation), outliers representing a heart period that is too short can be removed, and/or a heart cycle can be added if the measured one is too long.
The above-described signal filtering techniques can be used for both helical scanning systems and for multi-slice scanning systems. In the case of multi-slice scanning systems, the scanning protocol is generally a "step and shoot" procedure, which results in a signal consisting of short measured pieces (1-2 heart cycles) interlaced with segments of 1-2 heart cycles without any data.
In such systems there is typically more than one measurement of every segment. Usually there are four segments, and the segments are synchronized.
In order to use the same filtering scheme as above, the existing signals are averaged to obtain one signal, resulting in a higher S/N ratio. Then, the missing segments of the signal are reconstructed. A method for restoration of lost samples in digital signals is described in detail in Raymond Veldhuis, Restoration of Lost Samples in Digital Signals, Prentice Hall International Series in Acoustics, Speech and Signal Processing, Prentice Hall International (UK) Ltd., 1990, ISBN 0-13-775198-2, Chapter 3 (Autoregressive processes), pp. 28-56. In some embodiments, this algorithm is used to fill in the missing data.
However, the invention is not limited to the algorithm described in Veldhuis, and any algorithm capable of restoring lost samples can be used. When the signal is reconstructed to its full length, the same filtering scheme as described above can be employed. The method of determining the cardiac cycle illustrated in FIG. 4 is desirable for several reasons. First, it is easy to measure accurately because a reliable segmentation algorithm exists. Second, measuring the cross section of the aorta is insensitive to the shifting and twisting that occurs as the heart beats.
Finally, measuring the cross section of the aorta is independent of heart geometry changes in the space domain (changes from one acquired image to another). However, the invention is not limited to deriving a cardiac cycle signal based on the changes in the area of the cross section of the aorta. For example, changes in the area of cross section of other blood vessels besides the aorta can be detected and used to derive a signal. In addition, the invention is not limited to detecting changes in the area of a blood vessel such as the aorta. In alternative embodiments of the invention, the motion of a blood vessel wall is determined and used to derive a signal.
FIG. 5 illustrates a sample of the signals obtained using the methods described above. Graph 502 illustrates the difference between an EKG-derived signal and an image-derived signal for ventricular systole. Graph 504 illustrates the difference between an EKG-derived heart rate and an image-derived heart rate.
Graph 506 presents two signals, Raw signal 508 which represents the signal before filtering, and filtered signal 510 which represents the signal after the filtering described above has been applied.
In an alternative embodiment of the invention, the cardiac signal can be derived using a "Mean Pixel Difference" (MPD) between the acquired images.
As noted above, in MPD, each pixel on a first axial image is subtracted from the corresponding pixel on the subsequent adjacent axial image. The absolute value of the pixel differences for the image are summed and divided by the number of pixels in the image. This single number for each image is the MPD.
The MPD thus represents a direct measure of changes in the data between two image slices. The sequence of MPD values can thus be used to derive a cardiac cycle signal. Using the MPD is not as desirable as using the changes in the area of cross-sections of the aorta, because other factors besides heart motion can affect the calculation of the MPD. For example, the difference is often caused not only by the heart motion, but also by the change of the heart geometry as the scan progresses both in the time and spatial domains.
In a further alternative embodiment of the invention, the cardiac signal can be derived using the fact that the heart volume periodically decreases and increases during systole and diastole. By measuring these changes in heart volume (or area of heart cross-section) across a sequence of images, a cardiac cycle can be computed. The reliable measurement of the heart cross-sectional area requires accurate segmentation of the heart in each of the images. The cross-sectional area can then be used to determine a cardiac cycle signal in a manner similar to that described above in reference to FIG. 4.
In a still further embodiment of the invention, heart border motion can be used to derive a cardiac cycle signal. Here, heart border motion is measured by detecting the motion in the walls of the heart, for example, the atrium and ventricle borders. This can provide the same information as measurement of heart cross-sectional area. In this embodiment, easy-to-segment parts of the heart are measured, and the changes in the measurements are used to derive a cardiac cycle signal.
Conclusion

Systems and methods for using image data to derive signals, such as cardiac cycle signals, have been disclosed. The embodiments of the invention provide advantages over previous systems. For example, there is no need to attach wires to the patient for EKG, which can reduce patient apprehension and nervousness about the imaging procedure. Furthermore, deriving the cardiac cycle from the image data eliminates the possibility that the EKG cycle data does not match the image data due to mishandling of the EKG data, since no EKG
data is required. Moreover, EKG data can provide a less than accurate cardiac cycle due to variations in the measurement of electrical signals generated by the heart. The image data captures the heart motion directly, unlike the EKG.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention.
The terminology used in this application is meant to include all of these environments. It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Therefore, it is manifestly intended that this invention be limited only by the following claims and equivalents thereof.
In alternative embodiments, the Mean Pixel Difference (MPD) between one image and a subsequent adjacent image is determined. The images with the lowest MPD are selected as being the least blurry images. In MPD, each pixel on a first axial image is subtracted from the corresponding pixel on the subsequent adjacent axial image. The absolute value of the pixel differences for the image are summed and divided by the number of pixels in the image. This single number for each image is the MPD. An image with a lower MPD than another image can be said to be less blurry.
Next, those images that are the least blurry, or whose blurriness is within a particular tolerance, are selected for inclusion in the set of images on which further analysis, such as volumetric analysis, will be performed.
The image processing system derives a cardiac cycle signal from the image data (block 304). The various phases of the cardiac cycle can be derived from the image data by detecting in the image data the corresponding motion or changes of different parts of the heart and/or other parts of the body affected by the motion of the heart, such as blood vessels. By measuring this motion as a function of time, a periodic signal can be derived that can be used to determine the heart rate and cardiac cycle phase for any time during the image data collection.
Finally, the image data is filtered according to the cardiac cycle signal determined at block 304 to select those images that were acquired when the heart was most at rest (block 306). The filtered images can then be used to render accurate volumetric images of the heart. Various methods can be used to determine the cardiac cycle in block 304 above. The methods generally determine the cycle based on criteria common to the images. One such method according to an embodiment of the invention detects changes in the cross-section of the aorta, and is illustrated in FIG.
4. The method begins by estimating the position of the aorta (block 402). Various means can be used to detect the position of the aorta. For example, in some embodiments of the invention, a matching filter as is known in the art is used to detect the aorta. In alternative embodiments, an Active Appearance Model is used to detect the position. In further alternative embodiments, an Active Shape Model is used.
In still further embodiments of the invention, an algorithm called the Hough Transform is used to find the approximate aorta position. As is known in the art, the Hough Transform can be regarded as a generalized template matching method, and is typically used to extract edges or curves from an image.
The Hough Transform can be used to extract circles and even generalized (perhaps non-symmetrical) edges. Use of the Hough Transform is desirable because it is invariant to rotation and translation. Further details on the Hough Transform can be found in U.S. Patent 3,069,654, which is hereby incorporated by reference herein. For the purpose of the Hough Transform, the aorta can be described as a circle with a radius in the range of 8 to 25 mm.
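A minimal sketch of a circular Hough Transform for a single known radius follows. It is a simplification of the detector described above: the full method would also search over the 8 to 25 mm radius range, and the edge-point list, accumulator resolution, and function name are all illustrative assumptions.

```python
import numpy as np

def hough_circle(edge_points, radius, shape):
    """Vote for circle centres of a known radius: every edge point votes
    for all candidate centres lying `radius` away from it.  The centre
    with the most votes is returned as the detected circle centre."""
    acc = np.zeros(shape)
    thetas = np.linspace(0.0, 2.0 * np.pi, 120, endpoint=False)
    for x, y in edge_points:
        # Candidate centres on a circle of `radius` around the edge point.
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < shape[0]) & (cy >= 0) & (cy < shape[1])
        np.add.at(acc, (cx[ok], cy[ok]), 1)
    return np.unravel_index(np.argmax(acc), acc.shape)
```

Feeding it edge points sampled from a circle of radius 8 around (20, 20) recovers a centre within a pixel of (20, 20).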
In one embodiment of the invention, the position of the aorta is determined only in the first image. For the following images, the results of segmentation in the previous image are used. This is desirable because it reduces the time required to execute the method. The approximate aorta border in image i + 1 is defined as a circle of radius R_res, where R_res is the radius of the circle found by the Hough Transform in image i. The circle is centered at position (Xc(i), Yc(i)), which is the center of the aorta found in the previous image, i. Because processing each image is a sequential process, it is vulnerable to error propagation. In one embodiment of the invention, the likelihood of error propagation is reduced by using the same circle radius to initialize the aorta segmentation and only updating its position in subsequent images.
In another embodiment of the invention, a voting mechanism for detecting the aorta is used. The Hough Transform is applied to a sequence of five consecutive images. A result is accepted if the aorta center is found in a similar location in at least four of these images. If such a match is not found, another set of five images is investigated.
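The voting rule above can be sketched as follows. The agreement tolerance `tol` (in pixels) and the averaging of agreeing centres are assumptions; the patent states only the "at least four of five in a similar location" criterion.

```python
def vote_on_centres(centres, tol=3.0, need=4):
    """Accept a detection when at least `need` of the candidate centres
    agree within `tol` pixels; return the mean of the agreeing centres,
    or None when no sufficiently large agreeing group exists."""
    for cx, cy in centres:
        close = [c for c in centres
                 if (c[0] - cx) ** 2 + (c[1] - cy) ** 2 <= tol ** 2]
        if len(close) >= need:
            sx = sum(c[0] for c in close) / len(close)
            sy = sum(c[1] for c in close) / len(close)
            return (sx, sy)
    return None
```

When `vote_on_centres` returns `None`, the caller would move on to the next set of five images, as the text describes.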
In the Hough transform, a fit value is determined for every pixel in an image. The fit value represents the probability that the point belongs to the aorta border. The computation of fit values is based on a priori knowledge of the aorta properties in CT images.
For every pixel (x, y) in an image, the fit value is computed as:

fit(x, y) = 0 if I(x, y) < -100 HU or I(x, y) > 200 HU; f(g(x, y) . n_expected) otherwise (1)

where:
I(x, y) is the image intensity in Hounsfield Units [HU].
g is the image gradient.
n_expected is the expected direction of the gradient.
In one embodiment of the invention, a directional gradient detector based on a 5 x 5 Gaussian mask is used. In the case of the Hough Transform, the expected direction is not known. Therefore, one embodiment of the invention uses the following formula to set n_expected:

n_expected = g / |g| (2)

Function f(.) in formula (1) is a transformation function that limits the maximum value of the considered gradient. The transformation function f(.) in one embodiment of the invention is:
f(a) = a if a < a_MAX; a_MAX if a >= a_MAX (3)

where a_MAX is the gradient response to a step edge of 100 HU.
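A compact sketch of the fit-value computation follows. It assumes the gradient images have already been computed (for example, with a Gaussian-derivative mask) and that n_expected is taken as the normalized gradient direction, so the dot product g . n_expected reduces to the gradient magnitude |g|:

```python
import numpy as np

def fit_values(image_hu, grad_x, grad_y, a_max):
    """Per-pixel fit values: zero outside the -100..200 HU window;
    elsewhere the gradient magnitude |g|, clipped at a_max by the
    transformation function f(.)."""
    mag = np.hypot(grad_x, grad_y)                    # |g|
    fit = np.minimum(mag, a_max)                      # f(.): clip at a_max
    fit[(image_hu < -100) | (image_hu > 200)] = 0.0   # HU window
    return fit
```

The function and argument names are illustrative; a_max would be calibrated as the gradient response to a 100 HU step edge.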
After estimating the position of the aorta, the method then segments (i.e., detects) the aorta in each of the 2D images in the sequence (block 404). In some embodiments of the invention, edge-based dynamic programming as is known in the art is applied to the images in order to segment the aorta. In alternative embodiments, Active Contours, also referred to as "Snakes," are used to detect the aorta. In further embodiments of the invention, region growing algorithms are used to detect the aorta. Such algorithms are known in the art.
In one particular embodiment of the invention, the aorta is segmented using a dynamic programming based boundary detection algorithm. Details of such a boundary detection algorithm are presented in M. Sonka, V. Hlavac, and R. Boyle, Image Processing, Analysis, and Machine Vision, PWS Publishing, 2nd edition, 1998, pp. 158-163, which is hereby incorporated by reference herein.
The approximate aorta location determined at block 402 is used to create a Region of Interest (ROI) within which the aorta boundary is sought. For the purpose of searching the image for the aorta, the ROI is mapped into a rectangular graph, in which every node (x_G, y_G) corresponds to one image pixel.
This is expressed in the following formula:
(x, y) = centerlinePoint(x_G) + n_c(x_G) * (y_G - width / 2) (4)

where:
n_c is the unit normal vector to the circular centerline at the x_G-th point along the centerline; and width is the width of the ROI.
Every node in the rectangular graph has a cost associated with it. The cost is given by the formula:
cost(x, y) = max_(i,j)(fit(i, j)) - fit(x, y) (5)
where:
fit is defined by Equation 1 with n_expected = n_c.
The method finds a path of minimal cost between any node in the first column and any node in the last column of the rectangular graph. To ensure that the found boundary is a closed contour, the search is restricted by forcing the first and last points of the minimal cost path to have the same y_G coordinate in the rectangular graph.
In addition to assigning a cost to every graph node, a cost is assigned to every link between graph nodes. Only nodes in two consecutive columns are connected by links. The costs are assigned according to the following formula:
cost[(x_i, y_j), (x_i+1, y_k)] = 0 if k = j; 0.3 * cost(x_i, y_j) if k = j +/- 1; infinite otherwise (6)

Such a cost assignment effectively connects only nodes whose y coordinates differ by no more than one, which ensures connectivity of the resulting boundary. The middle rule in Equation 6 reflects the knowledge that the sought border should be circular. Thus, a border that is parallel to the ROI centerline (a circle) is preferred over one that diverges from the ROI centerline direction. The minimum cost path is then mapped into the original image and represents the aorta border.
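The dynamic programming search over the rectangular graph can be sketched as below. It assumes a node-cost matrix already computed (one column per centerline point, one row per candidate boundary position), uses the 0.3-weighted diagonal link cost from Equation 6, and for brevity omits the closed-contour constraint that the first and last rows match:

```python
import numpy as np

def min_cost_path(cost):
    """Minimal-cost left-to-right path through a node-cost matrix,
    stepping between consecutive columns only to the same row or an
    adjacent one (|dy| <= 1).  Diagonal steps add 0.3 * the source
    node's cost as the link cost."""
    rows, cols = cost.shape
    total = np.full((rows, cols), np.inf)
    back = np.zeros((rows, cols), dtype=int)
    total[:, 0] = cost[:, 0]
    for x in range(1, cols):
        for y in range(rows):
            for dy in (-1, 0, 1):
                py = y + dy
                if 0 <= py < rows:
                    link = 0.0 if dy == 0 else 0.3 * cost[py, x - 1]
                    cand = total[py, x - 1] + link + cost[y, x]
                    if cand < total[y, x]:
                        total[y, x] = cand
                        back[y, x] = py
    # Backtrack from the cheapest node in the last column.
    y = int(np.argmin(total[:, -1]))
    path = [y]
    for x in range(cols - 1, 0, -1):
        y = back[y, x]
        path.append(y)
    return path[::-1]
```

On a matrix whose middle row is uniformly cheapest, the returned path simply follows that row.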
After determining the position of the aorta, the method then computes the area of the aorta cross-section in the image (block 406). The cross-sectional area in the image is then used to compute an approximation of the volume of aorta at the time of an image acquisition. The area is computed as an area enclosed by the aorta border found by the aorta segmentation in block 404. In one embodiment of the invention, for every image, the found border is projected onto a plane that is perpendicular to the aorta direction. This provides better approximation of the true cross-sectional area of the aorta in cases when the image acquisition plane is not perpendicular to the aorta direction.
The aorta direction for a given image can be approximated by a tangent to a line connecting centers of gravity of outlined aorta boundaries. In some embodiments of the invention, the line is smoothed using algorithms known in the art.
Every border point (x, y, z) is projected into a point (xP, yP, zP) as follows:
(x_P, y_P, z_P) = (x, y, z) - (t . c) * (t_x, t_y, t_z) (7)
where:
t is the unit vector representing aorta direction; and c is a vector connecting the boundary point (x, y, z) with the aorta center in the given image.
The aorta center can be computed from its boundary points, yielding the following computation of the vector c:

c = (x, y, z) - (1/N_B) * sum_{i=1}^{N_B} (x_i, y_i, z_i) (8)
where:
(x_i, y_i, z_i) are the aorta boundary points; and N_B is the total number of boundary points.
In some embodiments of the invention, the area computation is simplified by expressing the projected border in a new coordinate system (wX, wY, wZ) in which the wZ coordinate of all the boundary points is zero. The new coordinate system is determined as follows:
wZ = t

wY = (eY - wZ * (wZ . eY)) / |eY - wZ * (wZ . eY)|

wX = (eX - wZ * (wZ . eX) - wY * (wY . eX)) / |eX - wZ * (wZ . eX) - wY * (wY . eX)| (9)
Thus, the coordinate transformation can be computed as:
(x', y', z') = T * (x_P, y_P, z_P) (10)
where:
T = | wX . x   wX . y   wX . z |
    | wY . x   wY . y   wY . z |
    | wZ . x   wZ . y   wZ . z | (11)

Because z' = 0 for all points of the transformed border, in some embodiments the area of the aorta cross-section is computed using a discrete implementation of Simpson's Rule. Such algorithms are known in the art.
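Once the border lies in a plane (z' = 0 for every transformed point), its enclosed area can be computed from the (x', y') coordinates alone. The patent uses a discrete Simpson's Rule; as a sketch of the same quantity, the standard shoelace formula for a closed polygonal border is shown here instead, which is an assumption and not the patent's stated method:

```python
import numpy as np

def polygon_area(xs, ys):
    """Area enclosed by a closed planar boundary given as ordered vertex
    coordinates (shoelace formula)."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    # Sum of cross products of consecutive vertices, wrapping around.
    return 0.5 * abs(np.dot(xs, np.roll(ys, -1)) - np.dot(ys, np.roll(xs, -1)))
```

For a unit square traversed in order, the result is exactly 1.0.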
The sequence of values representing the aorta cross-sectional areas forms a regularly sampled signal in time that can be used to compute the heart rate and the phase of the cardiac cycle for every image in the sequence.
In some embodiments of the invention, the signal determined is filtered to remove noise and to simplify cardiac phase determination (block 408). In some embodiments of the invention, filtering is performed by a Butterworth filter. It is desirable that the filter be designed so that there is no more than 3 dB attenuation in the pass band and at least 40 dB attenuation in the stop band.
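A Butterworth filter meeting the stated specification (at most 3 dB ripple in the pass band, at least 40 dB attenuation in the stop band) can be designed as below. SciPy is assumed to be available, and the sampling rate and band edges are illustrative values, not taken from the patent:

```python
import numpy as np
from scipy import signal

fs = 10.0           # image samples per second (assumed)
wp = [0.8, 2.0]     # pass band edges in Hz (assumed heart-rate range)
ws = [0.4, 3.5]     # stop band edges in Hz (assumed)

# buttord picks the minimum filter order meeting the gpass/gstop spec.
order, wn = signal.buttord(wp, ws, gpass=3, gstop=40, fs=fs)
sos = signal.butter(order, wn, btype="bandpass", output="sos", fs=fs)

def bandpass(x):
    """Zero-phase filtering, so the cardiac phase is not shifted."""
    return signal.sosfiltfilt(sos, x)
```

Second-order sections (`output="sos"`) are used for numerical stability at the higher orders a 40 dB spec can require.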
In order to determine the pass band of the filter, the aorta area signal is first High Pass Filtered (HPF) in order to remove the DC component and low frequencies that may correspond to a change of aorta size in space rather than in time. In some embodiments, the HPF signal (s_HPF) is computed using a digital high-pass filter based on the Butterworth filter.
Next, the heart rate is determined. In some embodiments, it is estimated by determining the main frequency component in the area signal. To avoid rasterization effects and to ensure stability, some embodiments determine the main frequency not from the original signal, but from the signal's autocorrelation:
f_MAX = arg max Fourier{s_HPF * s_HPF} (12)

where * denotes the correlation operation.
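The heart-rate estimate of Equation 12 can be sketched as follows, assuming the high-pass-filtered area signal is available as a NumPy array sampled at `fs` images per second:

```python
import numpy as np

def main_frequency(s_hpf, fs):
    """Dominant frequency of the signal's autocorrelation: correlating
    the signal with itself sharpens the main spectral peak relative to
    noise before the Fourier peak search."""
    ac = np.correlate(s_hpf, s_hpf, mode="full")   # s_HPF * s_HPF
    spectrum = np.abs(np.fft.rfft(ac))
    freqs = np.fft.rfftfreq(len(ac), d=1.0 / fs)
    spectrum[0] = 0.0                              # ignore any residual DC
    return freqs[np.argmax(spectrum)]
```

For a clean 1 Hz test tone the estimate lands within a frequency bin of 1 Hz.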
The raw aorta area signal is then filtered with a Butterworth filter with a pass band of <0.7 * f_MAX, 1.3 * f_MAX>, which represents an estimated variation of the heart rate for a single patient during the scanning procedure. In order to improve the filtering results, some embodiments use the signal obtained from the above-described filtering to compute new values of f_L and f_H. The beginning of each period is determined by positively sloped zero crossings of the signal. The newly obtained limits are used to finally filter the signal according to the following formula:
s_BPF(x) = Butterworth{s(x), f_L, f_H} (13)

where:

s denotes the raw aorta area signal; and <f_L, f_H> is the frequency range of the pass band.

In alternative embodiments, signal filtering is performed by applying an alternative HPF to the aorta signal, again in order to remove the DC component and low frequencies that may correspond to a change of aorta size in space rather than in time. In these embodiments, the HPF can be approximated by subtracting a moving average from the original signal according to the following:
s_HPF(x) = s(x) - (1/w) * sum_{i=-w/2}^{w/2} s(x + i) (14)

where s denotes the raw aorta area signal and w is the width of the averaging window.
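The moving-average high-pass approximation can be sketched as below. The source does not specify the averaging window, so a centred window of width `w` is assumed:

```python
import numpy as np

def highpass_moving_average(s, w):
    """Approximate high-pass filter: subtract a centred moving average
    of width w (assumed odd) from the raw signal, removing the DC
    component and slow spatial trends."""
    kernel = np.ones(w) / w
    baseline = np.convolve(s, kernel, mode="same")
    return s - baseline
```

Away from the array edges, a constant input maps to zero, confirming that the DC component is removed.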
Next, in order to remove noise from the signal and thus simplify the determination of points of maximum positive gradient that corresponds to the opening of aortic valve, the signal is smoothed.
In alternative embodiments of the invention, smoothing is performed using Band Pass Filtering (BPF) in the frequency domain. A moving window Fourier Transform can be used to smooth the signal separately in short segments.
The window width is chosen so that it corresponds to approximately three cardiac cycles and is a power of 2 (to simplify the employment of the Fast Fourier Transform algorithm). The width of the BPF is chosen to be one, thus considering only the basic frequency that occurs in the given segment of the signal. The BPF frequency is selected as the strongest frequency component in the spectrum (not considering the zero frequency, i.e., the signal offset). The window for the Fourier Transform is always moved by just one sample, and the smoothed signals are averaged in the time domain. The filtering is illustrated in the following formula:
BPF_i(s_<i,i+w>) = Fourier^-1{ Fourier{s_<i,i+w>} * Filter_{f_i} } (15)

where:

Filter_{f_i} is a BPF with a passing frequency:

f_i = arg max_{f in <1, w/2>} Fourier{s_<i,i+w>}_f (16)
and the window width w is a power of 2.
The final filtered signal is thus:

s_BPF(i) = ( sum_{j=max(i-w,0)}^{min(i,N-w)} BPF(s_<j,j+w>)(i - j) ) / ( min(i, N - w) - max(i - w, 0) ) (17)

where N is the number of samples in the aorta area signal.
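A sketch of the moving-window scheme described above follows: each window keeps only its strongest non-DC frequency bin, the window slides by one sample, and the per-window reconstructions are averaged in the time domain. Function names are illustrative:

```python
import numpy as np

def window_bpf(seg):
    """Keep only the strongest non-DC frequency bin of one window."""
    spec = np.fft.rfft(seg)
    fi = 1 + int(np.argmax(np.abs(spec[1:])))   # strongest bin, DC excluded
    keep = np.zeros_like(spec)
    keep[fi] = spec[fi]
    return np.fft.irfft(keep, n=len(seg))

def smooth(s, w):
    """Slide the window by one sample and average the per-window
    results in the time domain.  w should be a power of 2."""
    N = len(s)
    out = np.zeros(N)
    counts = np.zeros(N)
    for j in range(N - w + 1):
        out[j:j + w] += window_bpf(s[j:j + w])
        counts[j:j + w] += 1
    return out / counts
```

A pure sinusoid with an integer number of cycles per window passes through unchanged, since each window's spectrum has a single non-DC bin.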
After the signal has been filtered, the method determines the cardiac phase (block 410). Depending on the heart rate of the person being scanned and on the image acquisition speed, 6-12 images can typically be obtained per cardiac cycle. It is generally the case that the opening of the aortic valve is followed by the maximum rate of change (maximum gradient) of aortic pressure, which is reflected by the maximum gradient of increase of the aorta cross-sectional area.
This happens approximately 0.1 seconds after the beginning of ventricular depolarization. To match the aorta area signal to the heart cycle, the method searches for the points of maximum positive gradient, which generally correspond to the moments of aortic valve opening.
The filtered signal approximates the time change of the aorta cross-sectional area (volume) during a cardiac cycle. Thus, one period of the signal corresponds to one cardiac cycle. Since the signal can be approximated by a sine wave, the maximum positive derivative of the signal can be determined and used as the point of a positively sloped zero crossing. Using linear interpolation, the method can determine the zero crossing point x_Z between signal points (x, y_1) and (x + 1, y_2) as:

x_Z = x + y_1 / (y_1 - y_2) (18)

Thus the set of times {x_Z} represents the instants corresponding to aortic valve openings. From the aortic valve openings, the point in the cardiac cycle signal can be determined where the heart is most at rest.
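The sub-sample zero-crossing search of Equation 18 can be sketched as:

```python
def zero_crossings(s):
    """Positively sloped zero crossings with sub-sample precision:
    for y1 < 0 <= y2, the crossing lies at x + y1 / (y1 - y2)."""
    xs = []
    for x in range(len(s) - 1):
        y1, y2 = s[x], s[x + 1]
        if y1 < 0 <= y2:
            xs.append(x + y1 / (y1 - y2))
    return xs
```

For the pair (-1, 1) the crossing is found exactly halfway, at 0.5.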
In order to remove possible errors in the signal filtering, some embodiments of the invention perform post-processing on the signal. Based on the derived heart rate statistics (mean and standard deviation), outliers representing heart cycles that are too short can be removed, and/or a heart cycle can be added if a measured cycle is too long.
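One way to sketch the too-short-cycle removal is shown below. The rejection rule (intervals more than `k` standard deviations below the mean interval) is an assumed concretization; the patent states only that outliers are removed based on the mean and standard deviation:

```python
import numpy as np

def remove_short_cycles(openings, k=1.0):
    """Drop valve-opening times that create cycle intervals shorter than
    mean - k * std of the measured intervals (assumed rejection rule)."""
    openings = list(openings)
    iv = np.diff(openings)
    threshold = iv.mean() - k * iv.std()
    keep = [openings[0]]
    for t in openings[1:]:
        if t - keep[-1] >= threshold:
            keep.append(t)
    return keep
```

Adding a cycle when an interval is too long would follow the same statistics in the opposite direction.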
The above-described signal filtering techniques can be used both for helical scanning systems and for multi-slice scanning systems. In the case of multi-slice scanning systems, the scanning protocol is generally a "step and shoot" procedure, which results in a signal consisting of short measured pieces (1-2 heart cycles) interlaced with segments of 1-2 heart cycles without any data.
In such systems there is typically more than one measurement of every segment. Usually there are four segments, and the segments are synchronized.
In order to use the same filtering scheme as above, the existing signals are averaged to obtain one signal, resulting in a higher S/N ratio. Then, the missing segments of the signal are reconstructed. A method for restoration of lost samples in digital signals is described in detail in Raymond Veldhuis, Restoration of Lost Samples in Digital Signals, Prentice Hall International Series in Acoustics, Speech and Signal Processing, Prentice Hall International (UK) Ltd., 1990, ISBN 0-13-775198-2, Chapter 3, "Autoregressive processes," pp. 28-56. In some embodiments, this algorithm is used to fill in the missing data.
However, the invention is not limited to the algorithm described in Veldhuis, and any algorithm capable of restoring lost samples can be used. When the signal is reconstructed to its full length, the same filtering scheme as described above can be employed. The method of determining the cardiac cycle illustrated in FIG. 4 is desirable for several reasons. First, it is easy to measure accurately because a reliable segmentation algorithm exists. Second, measuring the cross section of the aorta is insensitive to the shifting and twisting that occurs as the heart beats.
Finally, measuring the cross section of the aorta is independent of heart geometry changes in the space domain (changes from one acquired image to another). However, the invention is not limited to deriving a cardiac cycle signal based on the changes in the area of the cross section of the aorta. For example, changes in the area of cross section of other blood vessels besides the aorta can be detected and used to derive a signal. In addition, the invention is not limited to detecting changes in the area of a blood vessel such as the aorta. In alternative embodiments of the invention, the motion of a blood vessel wall is determined and used to derive a signal.
FIG. 5 illustrates a sample of the signals obtained using the methods described above. Graph 502 illustrates the difference between an EKG-derived signal and an image-derived signal for ventricular systole. Graph 504 illustrates the difference between an EKG-derived heart rate and an image-derived heart rate.
Graph 506 presents two signals, Raw signal 508 which represents the signal before filtering, and filtered signal 510 which represents the signal after the filtering described above has been applied.
In an alternative embodiment of the invention, the cardiac signal can be derived using a "Mean Pixel Difference" (MPD) between the acquired images.
As noted above, in MPD, each pixel on a first axial image is subtracted from the corresponding pixel on the subsequent adjacent axial image. The absolute values of the pixel differences for the image are summed and divided by the number of pixels in the image. This single number for each image is the MPD.
The MPD thus represents a direct measure of changes in the data between two image slices. The sequence of MPD values can thus be used to derive a cardiac cycle signal. Using the MPD is not as desirable as using the changes in the area of cross-sections of the aorta, because other factors besides heart motion can affect the calculation of the MPD. For example, the difference is often caused not only by the heart motion, but also by the change of the heart geometry as the scan progresses both in the time and spatial domains.
In a further alternative embodiment of the invention, the cardiac signal can be derived using the fact that the heart volume periodically decreases and increases during systole and diastole. By measuring these changes in heart volume (or area of heart cross-section) across a sequence of images, a cardiac cycle can be computed. The reliable measurement of the heart cross-sectional area requires accurate segmentation of the heart in each of the images. The cross-sectional area can then be used to determine a cardiac cycle signal in a manner similar to that described above in reference to FIG. 4.
In a still further embodiment of the invention, heart border motion can be used to derive a cardiac cycle signal. Here, heart border motion is measured by detecting the motion in the walls of the heart, for example, the atrium and ventricle borders. This can provide the same information as measurement of heart cross-sectional area. In this embodiment, easy-to-segment parts of the heart are measured, and the changes in the measurements are used to derive a cardiac cycle signal.
Conclusion Systems and methods for using image data to derive signals, such as S cardiac cycle signals have been disclosed. The embodiments of the invention provide advantages over previous systems. For example, there is no need to attach wires to the patient for EKG, which can reduce patient apprehension and nervousness about the imaging procedure. Furthermore, deriving the cardiac cycle from the image data eliminates the possibility that the EKG cycle data does not match the image data due to mishandling of the EKG data, since no EKG
data is required. Moreover, EKG data can provide a less than accurate cardiac cycle due to variations in the measurement of electrical signals generated by the heart. The image data captures the heart motion directly, unlike the EKG.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention.
The terminology used in this application is meant to include all of these environments. It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Therefore, it is manifestly intended that this invention be limited only by the following claims and equivalents thereof.
Claims (56)
1. A method for selecting images of a portion of a cardiovascular system comprising:
receiving from an image scanner a plurality of images recorded over a period of time, the images representing one or more locations along the extent of the cardiovascular system; and selecting at least a subset of the images based on common criteria determined from the plurality of images and without reference to an external signal.
2. The method of claim 1, wherein the portion of the cardiovascular system is the heart.
3. The method of claim 1, wherein the image scanner is a CT (Computer Tomography) scanner.
4. The method of claim 1, wherein the image scanner is a MRI (Magnetic Resonance Image) scanner.
5. The method of claim 1, wherein the image scanner is an Ultrasound scanner.
6. The method of claim 1, wherein selecting a subset of the images results from a determination of the blurriness of each image.
7. The method of claim 6, wherein the blurriness of the image is determined by a Fourier transform applied to the image.
8. The method of claim 6, wherein the blurriness of the image is determined by the mean pixel difference between the image and an adjacent image.
9. The method of claim 1, wherein selecting a subset of the images results from a determination of a change of a relative position of at least one vessel edge in each image.
10. A method for ordering a plurality of images of a portion of a cardiovascular system comprising:
receiving from an image scanner a plurality of images recorded over a period of time, the images representing one or more locations along the extent of the cardiovascular system;
deriving a cardiac cycle signal from the plurality of scanned images; and assigning a phase in the cardiac cycle to each scanned image.
11. The method of claim 10, wherein the portion of the cardiovascular system is the heart.
12. The method of claim 10, wherein the image scanner is a CT (Computer Tomography) scanner.
13. The method of claim 10, wherein the image scanner is a MRI (Magnetic Resonance Image) scanner.
14. The method of claim 10, wherein the image scanner is an Ultrasound scanner.
15. The method of claim 10, wherein deriving the cardiac signal comprises:
segmenting a set of data representing a blood vessel in each image;
computing a change value for the blood vessel; and determining the cardiac cycle signal based on a sequence of the change values for each image.
16. The method of claim 15, wherein the change value is a change in the area of a cross section of the blood vessel.
17. The method of claim 15, wherein the change value is a change in the position of a wall of the blood vessel.
18. The method of claim 10, wherein deriving the cardiac cycle signal comprises:
segmenting a set of data representing a cross-section of the aorta in each image;
computing an area value representing an area of the cross-section; and determining the cardiac cycle signal based on a sequence of the area values for each image.
19. The method of claim 18, further comprising estimating the position of the aorta within the image data prior to segmenting the set of data.
20. The method of claim 19, wherein estimating the position of the aorta utilizes a Hough transform.
21. The method of claim 18, wherein segmenting of the aorta cross-section utilizes Dynamic Programming.
22. The method of claim 18, further comprising filtering the cardiac cycle signal to produce a smoothed cardiac cycle signal.
23. The method of claim 10, wherein deriving the cardiac cycle signal comprises:
for each image in the plurality of images performing the tasks of:
selecting an adjacent subsequent image;
calculating a mean pixel difference between the image and the subsequent image; and determining the cardiac cycle signal based on the mean pixel differences of the images.
24. The method of claim 10, wherein deriving the cardiac cycle signal comprises:
segmenting a set of data representing a cross-section of a heart in each image;
computing an area value representing an area of the cross-section; and determining the cardiac cycle signal based on a sequence of the area values for each image.
25. The method of claim 10, wherein deriving the cardiac cycle signal comprises:
for each image in the plurality of images performing the tasks of:
determining a first border of a heart in the image;
determining a second border of the heart in a subsequent adjacent image;
determining the difference between the first border and the second border; and determining the cardiac cycle based on a sequence of the differences.
26. The method of claim 10, wherein the ordered set of images is further filtered to produce a subset of images, said subset of images comprising images acquired at a desired point in the cardiac cycle signal.
27. The method of claim 10, wherein the derived cardiac cycle signal is used to interpolate or reconstruct new images at specific phases in the cardiac cycle from the original scanned images or other related data.
28. A computer-readable medium having computer executable instructions for performing a method for selecting images of a portion of a cardiovascular system, the method comprising:
receiving from an image scanner a plurality of images recorded over a period of time, the images representing one or more locations along the extent of the cardiovascular system; and selecting at least a subset of the images based on common criteria determined from the plurality of images and without reference to an external signal.
29. The computer-readable medium of claim 28, wherein the portion of the cardiovascular system is the heart.
30. The computer-readable medium of claim 28, wherein the image scanner is a CT (Computer Tomography) scanner.
31. The computer-readable medium of claim 28, wherein the image scanner is a MRI (Magnetic Resonance Image) scanner.
32. The computer-readable medium of claim 28, wherein the image scanner is an Ultrasound scanner.
33. The computer-readable medium of claim 28, wherein selecting a subset of the images results from a determination of the blurriness of each image.
34. The computer-readable medium of claim 33, wherein the blurriness of the image is determined by a Fourier transform applied to the image.
35. The computer-readable medium of claim 33, wherein the blurriness of the image is determined by the mean pixel difference between the image and an adjacent image.
36. The computer-readable medium of claim 28, wherein selecting a subset of the images results from a determination of a change of a relative position of at least one vessel edge in each image.
37. A computer-readable medium having computer executable instructions for performing a method for ordering a plurality of images of a portion of a cardiovascular system, the method comprising:
receiving from an image scanner a plurality of images recorded over a period of time, the images representing one or more locations along the extent of the cardiovascular system;
deriving a cardiac cycle signal from the plurality of scanned images; and assigning a phase in the cardiac cycle to each scanned image.
38. The computer-readable medium of claim 37, wherein the portion of the cardiovascular system is the heart.
39. The computer-readable medium of claim 37, wherein the image scanner is a CT (Computer Tomography) scanner.
40. The computer-readable medium of claim 37, wherein the image scanner is an MRI (Magnetic Resonance Imaging) scanner.
41. The computer-readable medium of claim 37, wherein the image scanner is an Ultrasound scanner.
42. The computer-readable medium of claim 37, wherein deriving the cardiac cycle signal comprises:
segmenting a set of data representing a blood vessel in each image;
computing a change value for the blood vessel; and determining the cardiac cycle signal based on a sequence of the change values for each image.
43. The computer-readable medium of claim 42, wherein the change value is a change in the area of a cross section of the blood vessel.
44. The computer-readable medium of claim 42, wherein the change value is a change in the position of a wall of the blood vessel.
45. The computer-readable medium of claim 37, wherein deriving the cardiac cycle signal comprises:
segmenting a set of data representing a cross-section of the aorta in each image;
computing an area value representing an area of the cross-section; and determining the cardiac cycle signal based on a sequence of the area values for each image.
46. The computer-readable medium of claim 45, further comprising estimating the position of the aorta within the image data prior to segmenting the set of data.
47. The computer-readable medium of claim 46, wherein estimating the position of the aorta utilizes a Hough transform.
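The Hough-transform step of claim 47 can be sketched as a minimal circular Hough transform: each edge pixel votes for every candidate circle center at distance r from it, and the accumulator peak estimates the center and radius of a roughly circular structure such as the aortic cross-section. This is an illustrative sketch only, assuming edge points have already been extracted; the function name and vote resolution are hypothetical.

```python
import numpy as np

def circular_hough(edge_points, shape, radii):
    """Locate a circle from (y, x) edge points.  For each candidate
    radius, every edge point casts votes along the circle of possible
    centres; the accumulator maximum gives (cy, cx, r)."""
    acc = np.zeros((len(radii),) + shape, dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
    for ri, r in enumerate(radii):
        for (y, x) in edge_points:
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
            np.add.at(acc[ri], (cy[ok], cx[ok]), 1)
    ri, cy, cx = np.unravel_index(np.argmax(acc), acc.shape)
    return cy, cx, radii[ri]
```

In practice the aorta's expected radius range would bound `radii`, and the resulting center estimate would seed the segmentation of claim 48.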
48. The computer-readable medium of claim 45, wherein segmenting of the aorta cross-section utilizes Dynamic Programming.
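One common dynamic-programming formulation of the boundary search in claim 48 works in polar coordinates around the estimated center: pick one radius per angle so that the total edge cost is minimized while the radius changes by at most one bin between adjacent angles. The sketch below assumes a precomputed `(n_angles, n_radii)` cost map (low where an edge is likely, e.g. negative gradient magnitude) and is illustrative, not the claimed implementation; a full version would also enforce that the contour closes on itself.

```python
import numpy as np

def dp_boundary(cost):
    """Minimum-cost radius path through a polar cost map, with a
    +/-1-bin smoothness constraint between adjacent angles."""
    n_ang, n_rad = cost.shape
    total = cost.copy().astype(float)          # running path costs
    back = np.zeros((n_ang, n_rad), dtype=int) # backpointers
    for a in range(1, n_ang):
        for r in range(n_rad):
            lo, hi = max(0, r - 1), min(n_rad, r + 2)
            prev = total[a - 1, lo:hi]
            k = int(np.argmin(prev))
            back[a, r] = lo + k
            total[a, r] += prev[k]
    # Backtrack from the cheapest final radius.
    path = [int(np.argmin(total[-1]))]
    for a in range(n_ang - 1, 0, -1):
        path.append(back[a, path[-1]])
    return path[::-1]                          # radius index per angle
```

The area enclosed by the returned contour supplies the per-image area value of claim 45.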
49. The computer-readable medium of claim 45, further comprising filtering the cardiac cycle signal to produce a smoothed cardiac cycle signal.
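The filtering step of claim 49 can be as simple as a moving average over the per-image area sequence: segmentation jitter varies frame to frame, while the cardiac oscillation spans many frames. A minimal sketch, assuming the raw signal is a 1-D sequence of area values; the window length is a hypothetical choice.

```python
import numpy as np

def smooth_signal(areas, window=5):
    """Moving-average filter producing a smoothed cardiac cycle signal
    of the same length as the input (edges are padded by replication)."""
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(np.asarray(areas, float), pad, mode="edge")
    return np.convolve(padded, kernel, mode="valid")
```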
50. The computer-readable medium of claim 37, wherein deriving the cardiac cycle signal comprises:
for each image in the plurality of images performing the tasks of:
selecting an adjacent subsequent image;
calculating a mean pixel difference between the image and the subsequent image; and determining the cardiac cycle signal based on the mean pixel differences of the images.
51. The computer-readable medium of claim 37, wherein deriving the cardiac cycle signal comprises:
segmenting a set of data representing a cross-section of a heart in each image;
computing an area value representing an area of the cross-section; and determining the cardiac cycle signal based on a sequence of the area values for each image.
52. The computer-readable medium of claim 37, wherein deriving the cardiac cycle signal comprises:
for each image in the plurality of images performing the tasks of:
determining a first border of a heart in the image;
determining a second border of the heart in a subsequent adjacent image;
determining the difference between the first border and the second border; and determining the cardiac cycle based on a sequence of the differences.
53. The computer-readable medium of claim 37, wherein the ordered set of images is further filtered to produce a subset of images, said subset of images comprising images acquired at a desired point in the cardiac cycle signal.
54. The computer-readable medium of claim 37, wherein the derived cardiac cycle signal is used to interpolate or reconstruct new images at specific phases in the cardiac cycle from the original scanned images or other related data.
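The phase assignment of claim 37 and the phase-based filtering of claim 53 can be illustrated together: local maxima of the derived cardiac cycle signal mark equivalent points in successive heartbeats, frames between two maxima receive a linearly interpolated phase in [0, 1), and frames near a desired phase are then selected. This is a hedged sketch, not the claimed implementation; the peak criterion and tolerance are illustrative assumptions.

```python
import numpy as np

def assign_phases(signal):
    """Assign each frame a cardiac phase in [0, 1) from the derived
    cycle signal.  Frames outside a complete cycle get NaN."""
    s = np.asarray(signal, float)
    peaks = [i for i in range(1, len(s) - 1)
             if s[i] > s[i - 1] and s[i] >= s[i + 1]]
    phases = np.full(len(s), np.nan)
    for p0, p1 in zip(peaks, peaks[1:]):
        for i in range(p0, p1):
            phases[i] = (i - p0) / (p1 - p0)
    return phases, peaks

def select_at_phase(phases, target, tol=0.1):
    """Indices of frames acquired near the desired phase (claim 53)."""
    return [i for i, ph in enumerate(phases)
            if not np.isnan(ph) and abs(ph - target) <= tol]
```

The same per-frame phases could drive the interpolation or reconstruction of new images at specific phases mentioned in claim 54.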
55. A computerized image processing system comprising:
a data storage subsystem operable to store a plurality of medical images;
an image processing subsystem operable to select at least a subset of the plurality of medical images based on common criteria determined from the plurality of medical images and without reference to an external signal.
56. The system of claim 55, wherein the image processing subsystem derives a cardiac cycle signal and wherein the selection of medical images is based on the cardiac cycle signal.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/669,395 | 2000-09-26 | ||
US09/669,395 US7031504B1 (en) | 2000-09-26 | 2000-09-26 | Image data based retrospective temporal selection of medical images |
PCT/US2001/030011 WO2002026125A2 (en) | 2000-09-26 | 2001-09-26 | Selection of medical images based on image data |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2423485A1 true CA2423485A1 (en) | 2002-04-04 |
Family
ID=24686183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002423485A Abandoned CA2423485A1 (en) | 2000-09-26 | 2001-09-26 | Selection of medical images based on image data |
Country Status (8)
Country | Link |
---|---|
US (2) | US7031504B1 (en) |
EP (1) | EP1322219B1 (en) |
JP (2) | JP2004509686A (en) |
AT (1) | ATE361698T1 (en) |
AU (1) | AU2001294711A1 (en) |
CA (1) | CA2423485A1 (en) |
DE (1) | DE60128376D1 (en) |
WO (1) | WO2002026125A2 (en) |
Families Citing this family (177)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8016823B2 (en) | 2003-01-18 | 2011-09-13 | Tsunami Medtech, Llc | Medical instrument and method of use |
US7892229B2 (en) | 2003-01-18 | 2011-02-22 | Tsunami Medtech, Llc | Medical instruments and techniques for treating pulmonary disorders |
US6937696B1 (en) | 1998-10-23 | 2005-08-30 | Varian Medical Systems Technologies, Inc. | Method and system for predictive physiological gating |
US8788020B2 (en) | 1998-10-23 | 2014-07-22 | Varian Medical Systems, Inc. | Method and system for radiation application |
US7031504B1 (en) * | 2000-09-26 | 2006-04-18 | Vital Images, Inc. | Image data based retrospective temporal selection of medical images |
US7549987B2 (en) | 2000-12-09 | 2009-06-23 | Tsunami Medtech, Llc | Thermotherapy device |
US9433457B2 (en) | 2000-12-09 | 2016-09-06 | Tsunami Medtech, Llc | Medical instruments and techniques for thermally-mediated therapies |
US8444636B2 (en) | 2001-12-07 | 2013-05-21 | Tsunami Medtech, Llc | Medical instrument and method of use |
FR2848093B1 (en) * | 2002-12-06 | 2005-12-30 | Ge Med Sys Global Tech Co Llc | METHOD FOR DETECTING THE CARDIAC CYCLE FROM AN ANGIOGRAM OF CORONARY VESSELS |
JP2006509613A (en) * | 2002-12-13 | 2006-03-23 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | System and method for processing a series of image frames indicative of a cardiac cycle |
GB2397738B (en) † | 2003-01-21 | 2007-08-29 | Elekta Ab | Computed tomography scanning |
US20050018889A1 (en) * | 2003-07-09 | 2005-01-27 | Jianying Li | Systems and methods for filtering images |
US8571639B2 (en) | 2003-09-05 | 2013-10-29 | Varian Medical Systems, Inc. | Systems and methods for gating medical procedures |
US7567696B2 (en) * | 2004-03-02 | 2009-07-28 | Siemens Medical Solutions Usa, Inc. | System and method for detecting the aortic valve using a model-based segmentation technique |
FR2872019B1 (en) | 2004-06-24 | 2012-06-15 | Gen Electric | METHOD FOR MONITORING CARDIAC ACTIVITY FROM A SERIES OF HEART IMAGES |
US20060047291A1 (en) * | 2004-08-20 | 2006-03-02 | Uptake Medical Corporation | Non-foreign occlusion of an airway and lung collapse |
US7970625B2 (en) | 2004-11-04 | 2011-06-28 | Dr Systems, Inc. | Systems and methods for retrieval of medical data |
US7787672B2 (en) | 2004-11-04 | 2010-08-31 | Dr Systems, Inc. | Systems and methods for matching, naming, and displaying medical images |
US7885440B2 (en) | 2004-11-04 | 2011-02-08 | Dr Systems, Inc. | Systems and methods for interleaving series of medical images |
US7920152B2 (en) | 2004-11-04 | 2011-04-05 | Dr Systems, Inc. | Systems and methods for viewing medical 3D imaging volumes |
US7660488B2 (en) | 2004-11-04 | 2010-02-09 | Dr Systems, Inc. | Systems and methods for viewing medical images |
KR20070108141A (en) | 2004-11-16 | 2007-11-08 | 로버트 엘 베리 | Device and method for lung treatment |
EP1845856B1 (en) * | 2005-01-31 | 2019-03-13 | Koninklijke Philips N.V. | Method and system for deriving a heart rate without the use of an electrocardiogram in non-3d imaging applications |
DE102005005919B4 (en) * | 2005-02-09 | 2007-01-25 | Siemens Ag | Method and CT device for taking X-ray CT images of a beating heart of a patient |
US7775978B2 (en) * | 2005-03-09 | 2010-08-17 | Siemens Medical Solutions Usa, Inc. | Cyclical information determination with medical diagnostic ultrasound |
US7352370B2 (en) * | 2005-06-02 | 2008-04-01 | Accuray Incorporated | Four-dimensional volume of interest |
US20070032785A1 (en) | 2005-08-03 | 2007-02-08 | Jennifer Diederich | Tissue evacuation device |
US7991210B2 (en) * | 2005-11-23 | 2011-08-02 | Vital Images, Inc. | Automatic aortic detection and segmentation in three-dimensional image data |
US7587232B2 (en) * | 2006-02-28 | 2009-09-08 | Kabushiki Kaisha Toshiba | Magnetic resonance imaging apparatus, magnetic resonance data processing apparatus, magnetic resonance data processing program and magnetic resonance imaging apparatus control method |
DE602007006194D1 (en) * | 2006-06-28 | 2010-06-10 | Philips Intellectual Property | MODEL-BASED DETERMINATION OF THE CONTRACTION STATUS E |
US9867530B2 (en) | 2006-08-14 | 2018-01-16 | Volcano Corporation | Telescopic side port catheter device with imaging system and method for accessing side branch occlusions |
JP4912807B2 (en) * | 2006-09-22 | 2012-04-11 | 株式会社東芝 | Ultrasound diagnostic imaging equipment |
US8585645B2 (en) * | 2006-11-13 | 2013-11-19 | Uptake Medical Corp. | Treatment with high temperature vapor |
US7993323B2 (en) | 2006-11-13 | 2011-08-09 | Uptake Medical Corp. | High pressure and high temperature vapor catheters and systems |
US7953614B1 (en) | 2006-11-22 | 2011-05-31 | Dr Systems, Inc. | Smart placement rules |
JP2008212634A (en) * | 2007-02-06 | 2008-09-18 | Toshiba Corp | Magnetic resonance imaging apparatus and image analysis method and image analysis program therefor |
EP1956383B1 (en) | 2007-02-06 | 2010-10-20 | Kabushiki Kaisha Toshiba | MRI involving a cine prescan for motion analysis |
US9629571B2 (en) | 2007-03-08 | 2017-04-25 | Sync-Rx, Ltd. | Co-use of endoluminal data and extraluminal imaging |
US10716528B2 (en) | 2007-03-08 | 2020-07-21 | Sync-Rx, Ltd. | Automatic display of previously-acquired endoluminal images |
US8781193B2 (en) * | 2007-03-08 | 2014-07-15 | Sync-Rx, Ltd. | Automatic quantitative vessel analysis |
US9375164B2 (en) | 2007-03-08 | 2016-06-28 | Sync-Rx, Ltd. | Co-use of endoluminal data and extraluminal imaging |
US9968256B2 (en) | 2007-03-08 | 2018-05-15 | Sync-Rx Ltd. | Automatic identification of a tool |
US11197651B2 (en) | 2007-03-08 | 2021-12-14 | Sync-Rx, Ltd. | Identification and presentation of device-to-vessel relative motion |
US11064964B2 (en) | 2007-03-08 | 2021-07-20 | Sync-Rx, Ltd | Determining a characteristic of a lumen by measuring velocity of a contrast agent |
JP5639764B2 (en) * | 2007-03-08 | 2014-12-10 | シンク−アールエックス,リミティド | Imaging and tools for use with moving organs |
WO2014002095A2 (en) | 2012-06-26 | 2014-01-03 | Sync-Rx, Ltd. | Flow-related image processing in luminal organs |
WO2012176191A1 (en) | 2011-06-23 | 2012-12-27 | Sync-Rx, Ltd. | Luminal background cleaning |
WO2008144751A1 (en) | 2007-05-21 | 2008-11-27 | Cornell University | Method for segmenting objects in images |
EP2170198B1 (en) | 2007-07-06 | 2015-04-15 | Tsunami Medtech, LLC | Medical system |
US9596993B2 (en) | 2007-07-12 | 2017-03-21 | Volcano Corporation | Automatic calibration systems and methods of use |
WO2009009802A1 (en) | 2007-07-12 | 2009-01-15 | Volcano Corporation | Oct-ivus catheter for concurrent luminal imaging |
WO2009009799A1 (en) | 2007-07-12 | 2009-01-15 | Volcano Corporation | Catheter for in vivo imaging |
EP2198797B1 (en) | 2007-08-23 | 2011-04-13 | Aegea Medical, Inc. | Uterine therapy device |
US9347765B2 (en) * | 2007-10-05 | 2016-05-24 | Volcano Corporation | Real time SD-OCT with distributed acquisition and processing |
BRPI0818239A2 (en) * | 2007-10-22 | 2017-12-05 | Uptake Medical Corp | determination of patient-specific treatment parameters and steam delivery |
US8322335B2 (en) * | 2007-10-22 | 2012-12-04 | Uptake Medical Corp. | Determining patient-specific vapor treatment and delivery parameters |
US8009887B2 (en) * | 2007-11-02 | 2011-08-30 | Siemens Corporation | Method and system for automatic quantification of aortic valve function from 4D computed tomography data using a physiological model |
FR2923152A1 (en) * | 2007-11-06 | 2009-05-08 | Gen Electric | METHOD OF ACQUIRING A THREE DIMENSIONAL RADIOLOGICAL IMAGE OF A MOVING ORGAN |
EP2210118B1 (en) * | 2007-11-09 | 2016-10-12 | Koninklijke Philips N.V. | Cyclic motion correction in combined mr/pet(or spect) scanner system |
JP5751738B2 (en) * | 2007-12-07 | 2015-07-22 | 株式会社東芝 | Magnetic resonance imaging system |
US8909321B2 (en) | 2007-12-07 | 2014-12-09 | Kabushiki Kaisha Toshiba | Diagnostic imaging apparatus, magnetic resonance imaging apparatus, and X-ray CT apparatus |
US9924992B2 (en) * | 2008-02-20 | 2018-03-27 | Tsunami Medtech, Llc | Medical system and method of use |
US8721632B2 (en) | 2008-09-09 | 2014-05-13 | Tsunami Medtech, Llc | Methods for delivering energy into a target tissue of a body |
US8579888B2 (en) | 2008-06-17 | 2013-11-12 | Tsunami Medtech, Llc | Medical probes for the treatment of blood vessels |
ES2450391T3 (en) * | 2008-06-19 | 2014-03-24 | Sync-Rx, Ltd. | Progressive progress of a medical instrument |
US8255038B2 (en) * | 2008-08-28 | 2012-08-28 | Siemens Medical Solutions Usa, Inc. | System and method for non-uniform image scanning and acquisition |
US10667727B2 (en) | 2008-09-05 | 2020-06-02 | Varian Medical Systems, Inc. | Systems and methods for determining a state of a patient |
US8391950B2 (en) * | 2008-09-30 | 2013-03-05 | Siemens Medical Solutions Usa, Inc. | System for multi-dimensional anatomical functional imaging |
US10695126B2 (en) | 2008-10-06 | 2020-06-30 | Santa Anna Tech Llc | Catheter with a double balloon structure to generate and apply a heated ablative zone to tissue |
CN102238920B (en) | 2008-10-06 | 2015-03-25 | 维兰德.K.沙马 | Method and apparatus for tissue ablation |
US10064697B2 (en) | 2008-10-06 | 2018-09-04 | Santa Anna Tech Llc | Vapor based ablation system for treating various indications |
US9561066B2 (en) | 2008-10-06 | 2017-02-07 | Virender K. Sharma | Method and apparatus for tissue ablation |
US9561068B2 (en) | 2008-10-06 | 2017-02-07 | Virender K. Sharma | Method and apparatus for tissue ablation |
US9095313B2 (en) | 2008-11-18 | 2015-08-04 | Sync-Rx, Ltd. | Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe |
US9101286B2 (en) | 2008-11-18 | 2015-08-11 | Sync-Rx, Ltd. | Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points |
US9144394B2 (en) | 2008-11-18 | 2015-09-29 | Sync-Rx, Ltd. | Apparatus and methods for determining a plurality of local calibration factors for an image |
US10362962B2 (en) | 2008-11-18 | 2019-07-30 | Synx-Rx, Ltd. | Accounting for skipped imaging locations during movement of an endoluminal imaging probe |
US11064903B2 (en) | 2008-11-18 | 2021-07-20 | Sync-Rx, Ltd | Apparatus and methods for mapping a sequence of images to a roadmap image |
US9974509B2 (en) | 2008-11-18 | 2018-05-22 | Sync-Rx Ltd. | Image super enhancement |
US8855744B2 (en) | 2008-11-18 | 2014-10-07 | Sync-Rx, Ltd. | Displaying a device within an endoluminal image stack |
US8380533B2 (en) | 2008-11-19 | 2013-02-19 | DR Systems Inc. | System and method of providing dynamic and customizable medical examination forms |
US11284931B2 (en) | 2009-02-03 | 2022-03-29 | Tsunami Medtech, Llc | Medical systems and methods for ablating and absorbing tissue |
GB0906461D0 (en) * | 2009-04-15 | 2009-05-20 | Siemens Medical Solutions | Partial volume correction via smoothing at viewer |
US8712120B1 (en) | 2009-09-28 | 2014-04-29 | Dr Systems, Inc. | Rules-based approach to transferring and/or viewing medical images |
US8900223B2 (en) | 2009-11-06 | 2014-12-02 | Tsunami Medtech, Llc | Tissue ablation systems and methods of use |
US8224056B2 (en) * | 2009-12-15 | 2012-07-17 | General Electronic Company | Method for computed tomography motion estimation and compensation |
US9161801B2 (en) | 2009-12-30 | 2015-10-20 | Tsunami Medtech, Llc | Medical system and method of use |
JP5801995B2 (en) * | 2010-02-03 | 2015-10-28 | 日立アロカメディカル株式会社 | Ultrasonic diagnostic equipment |
US9044196B2 (en) | 2010-06-17 | 2015-06-02 | Koninklijke Philips N.V. | Automated heart rate detection for 3D ultrasonic fetal imaging |
US9943353B2 (en) | 2013-03-15 | 2018-04-17 | Tsunami Medtech, Llc | Medical system and method of use |
US8971493B2 (en) | 2010-09-08 | 2015-03-03 | Siemens Medical Solutions Usa, Inc. | System for image scanning and acquisition with low-dose radiation |
WO2012064864A1 (en) | 2010-11-09 | 2012-05-18 | Aegea Medical Inc. | Positioning method and apparatus for delivering vapor to the uterus |
US11141063B2 (en) | 2010-12-23 | 2021-10-12 | Philips Image Guided Therapy Corporation | Integrated system architectures and methods of use |
US11040140B2 (en) | 2010-12-31 | 2021-06-22 | Philips Image Guided Therapy Corporation | Deep vein thrombosis therapeutic methods |
US9075899B1 (en) | 2011-08-11 | 2015-07-07 | D.R. Systems, Inc. | Automated display settings for categories of items |
US9360630B2 (en) | 2011-08-31 | 2016-06-07 | Volcano Corporation | Optical-electrical rotary joint and methods of use |
US10146403B2 (en) | 2011-09-26 | 2018-12-04 | Koninklijke Philips N.V. | Medical image system and method |
CA2851355C (en) | 2011-10-07 | 2020-02-18 | Aegea Medical Inc. | Integrity testing method and apparatus for delivering vapor to the uterus |
US8799358B2 (en) | 2011-11-28 | 2014-08-05 | Merge Healthcare Incorporated | Remote cine viewing of medical images on a zero-client application |
KR101334064B1 (en) | 2012-01-03 | 2013-11-28 | 연세대학교 산학협력단 | Apparatus and method for measureing velocity vector imaging of blood vessel |
US20140016847A1 (en) * | 2012-07-13 | 2014-01-16 | General Electric Company | Multi-phase computed tomography image reconstruction |
US9367965B2 (en) | 2012-10-05 | 2016-06-14 | Volcano Corporation | Systems and methods for generating images of tissue |
US9858668B2 (en) | 2012-10-05 | 2018-01-02 | Volcano Corporation | Guidewire artifact removal in images |
US9292918B2 (en) | 2012-10-05 | 2016-03-22 | Volcano Corporation | Methods and systems for transforming luminal images |
US11272845B2 (en) | 2012-10-05 | 2022-03-15 | Philips Image Guided Therapy Corporation | System and method for instant and automatic border detection |
JP2015532536A (en) | 2012-10-05 | 2015-11-09 | デイビッド ウェルフォード, | System and method for amplifying light |
US10568586B2 (en) | 2012-10-05 | 2020-02-25 | Volcano Corporation | Systems for indicating parameters in an imaging data set and methods of use |
US9307926B2 (en) | 2012-10-05 | 2016-04-12 | Volcano Corporation | Automatic stent detection |
US9324141B2 (en) | 2012-10-05 | 2016-04-26 | Volcano Corporation | Removal of A-scan streaking artifact |
US10070827B2 (en) | 2012-10-05 | 2018-09-11 | Volcano Corporation | Automatic image playback |
US9286673B2 (en) | 2012-10-05 | 2016-03-15 | Volcano Corporation | Systems for correcting distortions in a medical image and methods of use thereof |
US9840734B2 (en) | 2012-10-22 | 2017-12-12 | Raindance Technologies, Inc. | Methods for analyzing DNA |
JP6322210B2 (en) | 2012-12-13 | 2018-05-09 | ボルケーノ コーポレイション | Devices, systems, and methods for targeted intubation |
EP2934282B1 (en) | 2012-12-20 | 2020-04-29 | Volcano Corporation | Locating intravascular images |
JP2016504589A (en) | 2012-12-20 | 2016-02-12 | ナサニエル ジェイ. ケンプ, | Optical coherence tomography system reconfigurable between different imaging modes |
US10942022B2 (en) | 2012-12-20 | 2021-03-09 | Philips Image Guided Therapy Corporation | Manual calibration of imaging system |
WO2014099899A1 (en) | 2012-12-20 | 2014-06-26 | Jeremy Stigall | Smooth transition catheters |
US11406498B2 (en) | 2012-12-20 | 2022-08-09 | Philips Image Guided Therapy Corporation | Implant delivery system and implants |
US10939826B2 (en) | 2012-12-20 | 2021-03-09 | Philips Image Guided Therapy Corporation | Aspirating and removing biological material |
US10166003B2 (en) | 2012-12-21 | 2019-01-01 | Volcano Corporation | Ultrasound imaging with variable line density |
US10413317B2 (en) | 2012-12-21 | 2019-09-17 | Volcano Corporation | System and method for catheter steering and operation |
WO2014099896A1 (en) | 2012-12-21 | 2014-06-26 | David Welford | Systems and methods for narrowing a wavelength emission of light |
US9486143B2 (en) | 2012-12-21 | 2016-11-08 | Volcano Corporation | Intravascular forward imaging device |
US10058284B2 (en) | 2012-12-21 | 2018-08-28 | Volcano Corporation | Simultaneous imaging, monitoring, and therapy |
US10191220B2 (en) | 2012-12-21 | 2019-01-29 | Volcano Corporation | Power-efficient optical circuit |
JP2016508757A (en) | 2012-12-21 | 2016-03-24 | ジェイソン スペンサー, | System and method for graphical processing of medical data |
US9612105B2 (en) | 2012-12-21 | 2017-04-04 | Volcano Corporation | Polarization sensitive optical coherence tomography system |
WO2014099672A1 (en) | 2012-12-21 | 2014-06-26 | Andrew Hancock | System and method for multipath processing of image signals |
JP2016502884A (en) | 2012-12-21 | 2016-02-01 | ダグラス メイヤー, | Rotating ultrasound imaging catheter with extended catheter body telescope |
US9495604B1 (en) | 2013-01-09 | 2016-11-15 | D.R. Systems, Inc. | Intelligent management of computerized advanced processing |
EP2945556A4 (en) | 2013-01-17 | 2016-08-31 | Virender K Sharma | Method and apparatus for tissue ablation |
US9370330B2 (en) | 2013-02-08 | 2016-06-21 | Siemens Medical Solutions Usa, Inc. | Radiation field and dose control |
CN105103163A (en) | 2013-03-07 | 2015-11-25 | 火山公司 | Multimodal segmentation in intravascular images |
US10226597B2 (en) | 2013-03-07 | 2019-03-12 | Volcano Corporation | Guidewire with centering mechanism |
US20140276923A1 (en) | 2013-03-12 | 2014-09-18 | Volcano Corporation | Vibrating catheter and methods of use |
EP3895604A1 (en) | 2013-03-12 | 2021-10-20 | Collins, Donna | Systems and methods for diagnosing coronary microvascular disease |
US11026591B2 (en) | 2013-03-13 | 2021-06-08 | Philips Image Guided Therapy Corporation | Intravascular pressure sensor calibration |
US9301687B2 (en) | 2013-03-13 | 2016-04-05 | Volcano Corporation | System and method for OCT depth calibration |
CN105120759B (en) | 2013-03-13 | 2018-02-23 | 火山公司 | System and method for producing image from rotation intravascular ultrasound equipment |
US10292677B2 (en) | 2013-03-14 | 2019-05-21 | Volcano Corporation | Endoluminal filter having enhanced echogenic properties |
EP2967606B1 (en) | 2013-03-14 | 2018-05-16 | Volcano Corporation | Filters with echogenic characteristics |
US10219887B2 (en) | 2013-03-14 | 2019-03-05 | Volcano Corporation | Filters with echogenic characteristics |
CN104434078A (en) * | 2013-09-13 | 2015-03-25 | 施乐公司 | System and method for determining video-based pulse transit time with time-series signals |
US9782211B2 (en) | 2013-10-01 | 2017-10-10 | Uptake Medical Technology Inc. | Preferential volume reduction of diseased segments of a heterogeneous lobe |
JP6653667B2 (en) | 2014-05-06 | 2020-02-26 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Devices, systems and methods for vascular evaluation |
WO2015179666A1 (en) | 2014-05-22 | 2015-11-26 | Aegea Medical Inc. | Systems and methods for performing endometrial ablation |
ES2942296T3 (en) | 2014-05-22 | 2023-05-31 | Aegea Medical Inc | Integrity test method and apparatus for administering vapor to the uterus |
CN106456128B (en) * | 2014-06-12 | 2020-10-23 | 皇家飞利浦有限公司 | Medical image processing apparatus and method |
JP6373654B2 (en) * | 2014-06-25 | 2018-08-15 | キヤノンメディカルシステムズ株式会社 | X-ray diagnostic equipment |
CA2963866C (en) * | 2014-10-14 | 2023-11-07 | East Carolina University | Methods, systems and computer program products for determining hemodynamic status parameters using signals derived from multispectral blood flow and perfusion imaging |
US11553844B2 (en) * | 2014-10-14 | 2023-01-17 | East Carolina University | Methods, systems and computer program products for calculating MetaKG signals for regions having multiple sets of optical characteristics |
US10722173B2 (en) | 2014-10-14 | 2020-07-28 | East Carolina University | Methods, systems and computer program products for visualizing anatomical structures and blood flow and perfusion physiology using imaging techniques |
US10485604B2 (en) | 2014-12-02 | 2019-11-26 | Uptake Medical Technology Inc. | Vapor treatment of lung nodules and tumors |
DE102014225846B4 (en) * | 2014-12-15 | 2016-07-28 | Siemens Healthcare Gmbh | Determination of magnetic resonance angiography images with time-of-flight angiography and magnetic resonance apparatus |
US10531906B2 (en) | 2015-02-02 | 2020-01-14 | Uptake Medical Technology Inc. | Medical vapor generator |
US10390718B2 (en) | 2015-03-20 | 2019-08-27 | East Carolina University | Multi-spectral physiologic visualization (MSPV) using laser imaging methods and systems for blood flow and perfusion imaging and quantification in an endoscopic design |
US10058256B2 (en) | 2015-03-20 | 2018-08-28 | East Carolina University | Multi-spectral laser imaging (MSLI) methods and systems for blood flow and perfusion imaging and quantification |
JP6348865B2 (en) * | 2015-03-30 | 2018-06-27 | 株式会社リガク | CT image processing apparatus and method |
US20170039321A1 (en) | 2015-04-30 | 2017-02-09 | D.R. Systems, Inc. | Database systems and interactive user interfaces for dynamic interaction with, and sorting of, digital medical image data |
US10123761B2 (en) | 2015-07-01 | 2018-11-13 | William E. Butler | Device and method for spatiotemporal reconstruction of a moving vascular pulse wave in the brain and other organs |
US10102633B2 (en) * | 2015-11-30 | 2018-10-16 | Hyland Switzerland Sarl | System and methods of segmenting vessels from medical imaging data |
US11331037B2 (en) | 2016-02-19 | 2022-05-17 | Aegea Medical Inc. | Methods and apparatus for determining the integrity of a bodily cavity |
US11331140B2 (en) | 2016-05-19 | 2022-05-17 | Aqua Heart, Inc. | Heated vapor ablation systems and methods for treating cardiac conditions |
JP2018110637A (en) * | 2017-01-10 | 2018-07-19 | コニカミノルタ株式会社 | Dynamic image processing system |
US11129673B2 (en) | 2017-05-05 | 2021-09-28 | Uptake Medical Technology Inc. | Extra-airway vapor ablation for treating airway constriction in patients with asthma and COPD |
US11344364B2 (en) | 2017-09-07 | 2022-05-31 | Uptake Medical Technology Inc. | Screening method for a target nerve to ablate for the treatment of inflammatory lung disease |
US11350988B2 (en) | 2017-09-11 | 2022-06-07 | Uptake Medical Technology Inc. | Bronchoscopic multimodality lung tumor treatment |
USD845467S1 (en) | 2017-09-17 | 2019-04-09 | Uptake Medical Technology Inc. | Hand-piece for medical ablation catheter |
US11419658B2 (en) | 2017-11-06 | 2022-08-23 | Uptake Medical Technology Inc. | Method for treating emphysema with condensable thermal vapor |
US11490946B2 (en) | 2017-12-13 | 2022-11-08 | Uptake Medical Technology Inc. | Vapor ablation handpiece |
US11278259B2 (en) * | 2018-02-23 | 2022-03-22 | Verathon Inc. | Thrombus detection during scanning |
CA3102080A1 (en) | 2018-06-01 | 2019-12-05 | Santa Anna Tech Llc | Multi-stage vapor-based ablation treatment methods and vapor generation and delivery systems |
CN109674493B (en) * | 2018-11-28 | 2021-08-03 | 深圳蓝韵医学影像有限公司 | Method, system and equipment for medical ultrasonic automatic tracking of carotid artery blood vessel |
CN113395935B (en) | 2019-02-06 | 2023-11-24 | 威廉·E·巴特勒 | Method and computer system for reconstructing a representation of heart beat frequency angiography |
US11653927B2 (en) | 2019-02-18 | 2023-05-23 | Uptake Medical Technology Inc. | Vapor ablation treatment of obstructive lung disease |
GB2596015B (en) | 2019-03-27 | 2022-10-12 | E Butler William | Reconstructing cardiac frequency phenomena in angiographic data |
US11514577B2 (en) | 2019-04-04 | 2022-11-29 | William E. Butler | Intrinsic contrast optical cross-correlated wavelet angiography |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3069654A (en) | 1960-03-25 | 1962-12-18 | Paul V C Hough | Method and means for recognizing complex patterns |
US4547892A (en) * | 1977-04-01 | 1985-10-15 | Technicare Corporation | Cardiac imaging with CT scanner |
JPS596042A (en) * | 1982-07-02 | 1984-01-13 | 株式会社東芝 | Image treating apparatus |
JPS59155234A (en) * | 1983-02-23 | 1984-09-04 | 株式会社東芝 | Image input apparatus |
US4788975B1 (en) * | 1987-11-05 | 1999-03-02 | Trimedyne Inc | Control system and method for improved laser angioplasty |
JPH01189772A (en) * | 1988-01-25 | 1989-07-28 | Toshiba Corp | Image filing device |
US5133020A (en) * | 1989-07-21 | 1992-07-21 | Arch Development Corporation | Automated method and system for the detection and classification of abnormal lesions and parenchymal distortions in digital medical images |
JP3018194B2 (en) * | 1989-10-18 | 2000-03-13 | ジーイー横河メディカルシステム株式会社 | X-ray CT scanner |
JPH04146729A (en) * | 1990-10-11 | 1992-05-20 | Toshiba Corp | Cardiac function analyzing method and device therefor |
JPH06301765A (en) * | 1993-04-19 | 1994-10-28 | Fuji Photo Film Co Ltd | Picture processing method |
JPH07193766A (en) * | 1993-12-27 | 1995-07-28 | Toshiba Corp | Picture information processor |
JPH07192111A (en) * | 1993-12-27 | 1995-07-28 | Kawasaki Steel Corp | Calculator for cross sectional area of cavity of organ for medical diagnostic image |
JP3403804B2 (en) * | 1994-05-12 | 2003-05-06 | アロカ株式会社 | Ultrasound diagnostic equipment |
US5570430A (en) * | 1994-05-31 | 1996-10-29 | University Of Washington | Method for determining the contour of an in vivo organ using multiple image frames of the organ |
JP3443189B2 (en) * | 1994-11-08 | 2003-09-02 | アロカ株式会社 | Ultrasound diagnostic equipment |
JP3516497B2 (en) * | 1994-12-21 | 2004-04-05 | ジーイー横河メディカルシステム株式会社 | Ultrasound diagnostic equipment |
US6690963B2 (en) * | 1995-01-24 | 2004-02-10 | Biosense, Inc. | System for determining the location and orientation of an invasive medical instrument |
US5533085A (en) | 1995-02-27 | 1996-07-02 | University Of Washington | Automatic indexing of cine-angiograms |
JP3532311B2 (en) * | 1995-07-31 | 2004-05-31 | Hitachi Medical Corporation | Magnetic resonance imaging system |
JP3502513B2 (en) * | 1996-09-25 | 2004-03-02 | Toshiba Corporation | Ultrasonic image processing method and ultrasonic image processing apparatus |
JPH1099328A (en) * | 1996-09-26 | 1998-04-21 | Toshiba Corp | Image processor and image processing method |
US6859548B2 (en) * | 1996-09-25 | 2005-02-22 | Kabushiki Kaisha Toshiba | Ultrasonic picture processing method and ultrasonic picture processing apparatus |
US5809105A (en) * | 1997-03-19 | 1998-09-15 | General Electric Company | Noise filter for digital x-ray imaging system |
JPH1139479A (en) * | 1997-07-16 | 1999-02-12 | Dainippon Screen Mfg Co Ltd | Method for evaluating definition |
US5910111A (en) * | 1997-09-30 | 1999-06-08 | Hunziker; Patrick | Display of heart motion |
US6154516A (en) * | 1998-09-04 | 2000-11-28 | Picker International, Inc. | Cardiac CT system |
DE19854939C2 (en) * | 1998-11-27 | 2001-11-22 | Siemens Ag | Method and device for generating CT images |
JP3897925B2 (en) * | 1999-01-29 | 2007-03-28 | Hitachi Medical Corporation | Cone beam CT system |
JP2000222578A (en) * | 1999-02-02 | 2000-08-11 | Matsushita Electric Ind Co Ltd | Pattern matching method and detection of movement vector |
US6252924B1 (en) * | 1999-09-30 | 2001-06-26 | General Electric Company | Method and apparatus for motion-free cardiac CT imaging |
US6510337B1 (en) * | 1999-11-26 | 2003-01-21 | Koninklijke Philips Electronics, N.V. | Multi-phase cardiac imager |
US6393091B1 (en) * | 1999-12-13 | 2002-05-21 | General Electric Company | Methods and apparatus for non-uniform temporal cardiac imaging |
US6563941B1 (en) * | 1999-12-14 | 2003-05-13 | Siemens Corporate Research, Inc. | Model-based registration of cardiac CTA and MR acquisitions |
US6539074B1 (en) * | 2000-08-25 | 2003-03-25 | General Electric Company | Reconstruction of multislice tomographic images from four-dimensional data |
US7031504B1 (en) * | 2000-09-26 | 2006-04-18 | Vital Images, Inc. | Image data based retrospective temporal selection of medical images |
2000
- 2000-09-26 US US09/669,395 patent/US7031504B1/en not_active Expired - Lifetime

2001
- 2001-09-26 CA CA002423485A patent/CA2423485A1/en not_active Abandoned
- 2001-09-26 AT AT01975377T patent/ATE361698T1/en not_active IP Right Cessation
- 2001-09-26 DE DE60128376T patent/DE60128376D1/en not_active Expired - Lifetime
- 2001-09-26 WO PCT/US2001/030011 patent/WO2002026125A2/en active IP Right Grant
- 2001-09-26 AU AU2001294711A patent/AU2001294711A1/en not_active Abandoned
- 2001-09-26 JP JP2002529958A patent/JP2004509686A/en active Pending
- 2001-09-26 EP EP01975377A patent/EP1322219B1/en not_active Expired - Lifetime

2006
- 2006-04-18 US US11/379,183 patent/US20070036417A1/en not_active Abandoned

2012
- 2012-06-15 JP JP2012135501A patent/JP5591873B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP1322219B1 (en) | 2007-05-09 |
DE60128376D1 (en) | 2007-06-21 |
EP1322219A2 (en) | 2003-07-02 |
WO2002026125A3 (en) | 2003-01-23 |
US7031504B1 (en) | 2006-04-18 |
WO2002026125A2 (en) | 2002-04-04 |
JP2012228520A (en) | 2012-11-22 |
AU2001294711A1 (en) | 2002-04-08 |
JP2004509686A (en) | 2004-04-02 |
ATE361698T1 (en) | 2007-06-15 |
US20070036417A1 (en) | 2007-02-15 |
JP5591873B2 (en) | 2014-09-17 |
Similar Documents
Publication | Title |
---|---|
EP1322219B1 (en) | Selection of medical images based on image data |
Beichel et al. | Robust active appearance models and their application to medical image analysis |
JP4918048B2 (en) | Image processing apparatus and method |
Manzke et al. | Automatic phase determination for retrospectively gated cardiac CT |
EP1917641A2 (en) | Method and apparatus for automatic 4d coronary modeling and motion vector field estimation |
EP3743883B1 (en) | Flow analysis in 4d mr image data |
Amini et al. | Pointwise tracking of left-ventricular motion in 3D |
Amini et al. | Non-rigid motion models for tracking the left-ventricular wall |
Veronesi et al. | Tracking of left ventricular long axis from real-time three-dimensional echocardiography using optical flow techniques |
Van Stevendaal et al. | A motion-compensated scheme for helical cone-beam reconstruction in cardiac CT angiography |
Van der Geest et al. | Automated detection of left ventricular epi- and endocardial contours in short-axis MR images |
Toumoulin et al. | Fast detection and characterization of vessels in very large 3-D data sets using geometrical moments |
Barrett et al. | Determination of left ventricular contours: A probabilistic algorithm derived from angiographic images |
Bosch et al. | Developments towards real-time frame-to-frame automatic contour detection on echocardiograms |
CN111093506A | Motion compensated heart valve reconstruction |
van Stralen et al. | Left Ventricular Volume Estimation in Cardiac Three-dimensional Ultrasound: A Semiautomatic Border Detection Approach |
Prasad et al. | Quantification of 3D regional myocardial wall thickening from gated magnetic resonance images |
Duan et al. | Surface function actives |
Meng et al. | Automatic identification of end-diastolic and end-systolic cardiac frames from invasive coronary angiography videos |
Qian et al. | Intermodal registration of CTA and IVUS-VH, and its application on CTA-based plaque composition analysis |
Santarelli et al. | A new algorithm for 3D automatic detection and tracking of cardiac wall motion |
Hansis et al. | Automatic optimum phase point selection based on centerline consistency for 3D rotational coronary angiography |
Zhang et al. | Nonrigid registration and template matching for coronary motion modeling from 4D CTA |
Jiji | Identifying the anomaly in LV wall motion using eigenspace |
Clarysse et al. | Integrated quantitative analysis of tagged magnetic resonance images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FZDE | Discontinued |