US20160358334A1 - Motion correction for medical imaging - Google Patents
- Publication number
- US20160358334A1 (application US15/174,130, filed Jun. 6, 2016)
- Authority
- US
- United States
- Prior art keywords
- motion
- gating
- emitting sources
- image data
- interest
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/0012—Biomedical image inspection
- A61B6/032—Transmission computed tomography [CT]
- A61B6/037—Emission tomography
- A61B6/5205—Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving extracting a diagnostic or physiological parameter from medical diagnostic data
- A61B6/5264—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
- A61B6/5288—Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
- A61B6/5294—Devices using data or image processing specially adapted for radiation diagnosis involving using additional data, e.g. patient information, image labeling, acquisition parameters
- G06T11/005—Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
- G06T5/40—Image enhancement or restoration by the use of histogram techniques
- G06T5/73
- G06T7/0085
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining for calculating health indices; for individual health risk assessment
- G06T2207/10104—Positron emission tomography [PET]
- G06T2207/20201—Motion blur correction
- G06T2207/30016—Brain
- G06T2207/30096—Tumor; Lesion
- G06T2211/412—Dynamic (computed tomography)
Definitions
- The present invention relates to motion correction in medical imaging systems.
- Medical imaging systems/scanners (e.g., positron emission tomography (PET), computed tomography (CT), etc.) are typically used for diagnostic purposes. Patient movement during medical imaging, however, can result in degraded image quality and reduced diagnostic confidence. Two primary sources of patient movement are head and neck motion and respiratory motion.
- Conventional solutions to correct for patient movement have significant drawbacks. One such technique consists of recording motion data from an electronic device attached to the patient (e.g., a waist band) to monitor patient movement while the patient is being scanned by the medical imaging scanner.
- Motion correction is performed through post-processing of the scanned data by correlating the scanned data with the motion data.
- In addition to the problems caused by the electronic hardware itself, such as difficult placement on the patient, delays may be present between the scanned data and the motion data, and the electronic device may shift on the patient's body during the scan.
- Another technique involves using a video monitoring system to detect patient movement from a video recording of the patient during the scan. The motion data detected by such a video monitoring system usually lacks accuracy, and patients find video recording of their medical examination intrusive.
- Thus, these motion correction methods are not only invasive and uncomfortable, but can also result in inaccurate or erroneous motion correction of the scanned data.
- The present invention provides methods for motion correction for use in medical imaging systems. These methods require no attached electronic hardware devices or invasive camera systems, and offer high-resolution motion tracking that can automatically detect and correct patient movement during imaging.
- In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in at least three off-axis positions.
- Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner.
- Listmode is a known data format for recording events, e.g., during a PET session.
- The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources.
- A coincident line of response, also known as a line of response (LOR), is an imaginary line connecting the points where a pair of annihilation photons is detected by a medical imaging scanner.
- Thus, the locations and motion of the external emitting sources may be tracked in a three-dimensional space and recorded throughout the course of the scan.
- Static regions corresponding to subsequent locations of limited or no motion of the external emitting sources may be determined.
- Imaging data coinciding with the static regions are stored, while imaging data corresponding to transition regions from one static region to another are discarded.
- Motion vectors between the static regions are recorded and then used in reconstruction to create a motion-corrected dataset.
- The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
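The static-region determination described above can be sketched in a few lines. This is an illustrative sketch only, not the patent's implementation: the `(t, x, y, z)` sample format, the function name, and the motion threshold are all assumptions.

```python
def segment_static_regions(samples, threshold=2.0):
    """Group consecutive tracked-source samples into 'static' runs.

    `samples` is a list of (t, x, y, z) tuples for one external source.
    A new region starts whenever the source moves farther than `threshold`
    (e.g. millimeters, an assumed unit) from the current region's first
    sample.  Returns (start_index, end_index) pairs, end exclusive; events
    recorded during the transitions between regions would be discarded.
    """
    regions = []
    start = 0
    for i in range(1, len(samples)):
        ref, cur = samples[start], samples[i]
        # Euclidean distance over the spatial components only
        dist = sum((a - b) ** 2 for a, b in zip(ref[1:], cur[1:])) ** 0.5
        if dist > threshold:
            regions.append((start, i))
            start = i
    regions.append((start, len(samples)))
    return regions
```

A real implementation would also have to smooth the tracked positions, since individual LOR-derived locations are noisy; the thresholding idea is the same.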
- In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
- Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner.
- The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources.
- Thus, the locations and motion of the external emitting sources may be tracked and recorded throughout the course of the scan. Respiratory motion information may be derived from the tracked motion of the sources and a respiratory waveform may be generated.
- The respiratory waveform may be analyzed and marked for gating of the listmode data.
- Gating is a data processing technique applicable to listmode data, in which data that lies outside of specified “gate” areas are discarded.
- According to an embodiment, gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner.
- The listmode data may also be altered such that the motion-affected events are repositioned into a user-defined “stationary” geometry for use in subsequent histogramming and reconstruction.
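The waveform marking described above can be sketched as a simple peak search, assuming the respiratory waveform arrives as `(time, amplitude)` samples and that one gating tag is placed per breathing cycle at the local maximum. The three-point peak test and all names are purely illustrative:

```python
def gating_tags(waveform):
    """Return the times of local maxima in a respiratory waveform.

    `waveform` is a list of (time, amplitude) samples.  Each returned time
    corresponds to one gating tag to be inserted into the listmode stream.
    """
    tags = []
    for i in range(1, len(waveform) - 1):
        prev_a = waveform[i - 1][1]
        a = waveform[i][1]
        next_a = waveform[i + 1][1]
        if a > prev_a and a >= next_a:  # local maximum of the cycle
            tags.append(waveform[i][0])
    return tags
```

On real data the waveform would be denoised first (the experiments below mention wavelet denoising), otherwise high-frequency noise produces spurious peaks.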
- In an embodiment, an object of interest of the patient, such as a lesion, affected by respiratory or head and neck motion may be selected.
- A bounding region containing the object of interest may be identified.
- Listmode data of a medical imaging scanner may be used to determine the coincident lines of response measured within the bounding region.
- Respiratory motion information may be derived from the tracked motion of the object of interest and a respiratory waveform may be generated.
- The respiratory waveform may be analyzed and marked for gating of the listmode data.
- The gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner.
- Head and neck correction information may also be derived from the tracked motion of the object of interest, enabling identification of periods of non-motion and removal of motion-affected data.
- The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
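The bounding-region step in this embodiment amounts to testing whether each line of response intersects the user-selected region. A sketch using a standard slab intersection test, assuming (purely for illustration) that the bounding region is an axis-aligned box:

```python
def lor_intersects_box(p0, p1, box_min, box_max):
    """True if the LOR segment from p0 to p1 passes through the box.

    p0, p1 are (x, y, z) detection points; box_min/box_max are the box
    corners.  Standard slab test: clip the segment's parameter range
    [0, 1] against each axis-aligned slab in turn.
    """
    t_min, t_max = 0.0, 1.0
    for k in range(3):
        d = p1[k] - p0[k]
        if abs(d) < 1e-12:
            # Segment parallel to this slab: inside or miss entirely.
            if not (box_min[k] <= p0[k] <= box_max[k]):
                return False
            continue
        t0 = (box_min[k] - p0[k]) / d
        t1 = (box_max[k] - p0[k]) / d
        if t0 > t1:
            t0, t1 = t1, t0
        t_min = max(t_min, t0)
        t_max = min(t_max, t1)
        if t_min > t_max:
            return False
    return True
```

Only LORs passing this test would contribute to tracking the object of interest; everything else is left for normal image formation.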
- FIG. 1 illustrates a medical imaging system according to an embodiment of the present invention.
- FIG. 2 illustrates a method for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
- FIG. 3 compares images from a dataset that has been corrected for head and neck motion to images from an uncorrected dataset.
- FIG. 4 illustrates a method for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
- FIG. 5(a) illustrates exemplary respiratory waveforms for a preclinical subject.
- FIG. 5(b) illustrates exemplary respiratory waveforms for a clinical subject.
- FIG. 6 compares an image from a dataset that has been corrected for respiratory motion to an image from an uncorrected dataset.
- FIG. 7 illustrates a medical imaging system according to an embodiment of the present invention.
- FIG. 8 illustrates a method for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
- FIG. 1 illustrates a simplified diagram of a medical imaging system 100, according to an embodiment of the present invention.
- An example of the medical imaging system 100 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof.
- The medical imaging system 100 may include a detector 102, a coincidence processing module 104, a listmode data acquisition module 106, a motion correction module 108, an image reconstruction module 110, an image output module 112, a memory 114, and a processor 116.
- A patient 118 may commonly be positioned within the detector 102, as shown in FIG. 1, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient may be moved continually through the horizontal scan range.
- The memory 114 may be provided as a volatile memory, a non-volatile memory, or a combination thereof.
- The memory 114 may store program instructions, scan data generated by the medical imaging system 100, and any data as needed by the medical imaging system 100.
- Algorithms to operate the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112 may be provided as software stored in the memory 114.
- The processor 116 may be a microcontroller or a microprocessor.
- The processor 116 may execute the instructions stored in the memory 114 and may control the operations of the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112.
- The motion correction module 108 may be coupled externally to the medical imaging system 100.
- The motion correction module 108 may include a separate memory and processor.
- FIG. 2 illustrates a method 200 for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
- In step 202, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in three off-axis positions, as shown by positions 120, 122, and 124 in FIG. 1, for example.
- Pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100.
- The listmode data may be in 64-bit listmode format.
- The motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources.
- FIG. 1 shows an example of a coincident line of response 126.
- The motion correction module 108 may track the locations of the external emitting sources at positions 120, 122, and 124 in a three-dimensional space and record these locations throughout the course of the scan.
- The motion correction module 108 may determine static regions corresponding to subsequent locations of limited or no motion of the external emitting sources.
- The motion correction module 108 stores imaging data coinciding with the static regions and discards imaging data corresponding to transition regions from one static region to another.
- Imaging data may be any data among the listmode data that do not correspond to pairs of annihilation photons from the external emitting sources.
- The motion correction module may generate motion vectors between each static position.
- The image reconstruction module 110 may use the motion vectors to create a motion-corrected dataset, thereby reconstructing an image.
- The head and neck motion correction techniques described herein were tested in an experimental setup similar to the one shown in FIG. 1.
- A plurality of patients were imaged on a PET/CT scanner, i.e., the medical imaging system 100.
- Low activity/dose point sources were placed on the heads of the patients, in asymmetrical locations to enable three-dimensional tracking, as illustrated by positions 120, 122, and 124 in FIG. 1, for example.
- The patients were instructed to move during their scans, simulating typical patient movements during imaging.
- The PET/CT scanner acquired PET data for 10 minutes in a 64-bit listmode format.
- A motion correction module 108 used algorithms, as outlined in steps 206 and 208 above, to track head motions during the scans.
- The motion correction module 108 corrected data by calculating centroid locations for each source at time points when the patient was still. Events associated with transitional motion were discarded. The initial CT position was used as the reference position. Subsequent static positions were transformed to the reference frame by calculating transformation matrices from the calculated centroid locations. All reoriented static positions were summed to create the final dataset.
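The alignment step above, mapping each static position back to the initial CT reference frame, can be sketched as follows. For brevity the sketch computes only the translational component of the transformation from the source centroids; a full implementation would also estimate rotation from the three off-axis points (e.g. with the Kabsch algorithm). Function names and the pure-Python tuple representation are illustrative assumptions.

```python
def mean_point(points):
    """Centroid of a list of (x, y, z) points."""
    n = len(points)
    return tuple(sum(p[k] for p in points) / n for k in range(3))

def translation_to_reference(ref_centroids, static_centroids):
    """Vector that moves a static position's source centroid onto the
    reference (initial CT) position's source centroid."""
    r = mean_point(ref_centroids)
    s = mean_point(static_centroids)
    return tuple(r[k] - s[k] for k in range(3))

def apply_translation(event_xyz, t):
    """Reposition one event into the reference frame."""
    return tuple(event_xyz[k] + t[k] for k in range(3))
```

Events from each static region would be repositioned this way and then summed into the final dataset, as described above.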
- Anatomical modalities may be used to generate a reference point from which the transformation matrix may be generated.
- The list mode data itself may be used to generate one or more reference points for the reconstruction of three-dimensional volumes from specific time segments within the acquired list mode data.
- An alternative reconstruction process may involve segmenting the list mode data, designating specific segments as corresponding to “stationary” geometries, reconstructing a three-dimensional volume for each designated segment, and then combining the reconstructed volumes into a single volume.
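The final combine step of this alternative process might look like the following sketch, where each reconstructed volume is represented as a flattened voxel array (an assumption made purely for illustration; real volumes would be aligned to a common frame before summing):

```python
def combine_volumes(volumes):
    """Voxel-wise sum of equally sized, already-aligned volumes.

    `volumes` is a list of flattened voxel arrays (plain lists of floats),
    one per reconstructed "stationary" segment.
    """
    size = len(volumes[0])
    return [sum(v[i] for v in volumes) for i in range(size)]
```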
- FIG. 3 illustrates a sample of images resulting from the experiment.
- FIG. 3 shows, on the left, images resulting from a dataset that has not been motion-corrected.
- The images on the right result from the same, but motion-corrected, dataset.
- The head and neck motion correction method 200 is a robust method for automatic motion correction in head and neck patients. Automatic correction of motion may prevent inaccurate radiological examinations and avoid burdening patients with repeated imaging procedures.
- Respiratory motion in medical imaging affects diagnostic image quality for a wide range of cancers, including lung, liver, pancreatic, and gastric cancers.
- The medical imaging system 100 shown in FIG. 1 may also be used for respiratory motion correction.
- The external emitting sources are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
- The torso of the patient 118 must be positioned within detector 102 accordingly or pass through the axial field of view during the course of data acquisition.
- FIG. 4 illustrates a method 400 for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
- In step 402, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
- Pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100.
- The motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources.
- FIG. 1 shows an example of a coincident line of response 126.
- The motion correction module 108 may track the locations of the external emitting sources and record these locations throughout the course of the scan.
- The motion correction module 108 may derive respiratory motion information from the tracked motion of the sources and generate a respiratory waveform.
- The motion correction module 108 may analyze and mark the respiratory waveform for gating of the listmode data.
- The motion correction module may insert gating tags into the listmode data.
- The image reconstruction module 110 may reconstruct a motion-corrected image using the inserted gating tags and generate histograms.
- The respiratory motion correction techniques described herein were tested in both preclinical and clinical imaging systems. For both systems, low activity/dose point sources were placed on animals or humans at sites of respiratory motion for software tracking by the motion correction module 108. Standard electronic gating systems were also attached to the subjects, with a respiratory pad used for mouse imaging and a respiratory band used for human imaging. PET data were collected for 10 minutes for clinical and preclinical subjects. 64-bit listmode data were acquired with tags inserted from the standard electronic systems. The raw listmode data were processed by the motion correction module 108, as discussed above, inserting the gating tags into the listmode data. The motion correction module 108 was configured to insert gating tags at local maxima in the y-axis for each respiratory cycle. High-frequency noise was removed by applying a discrete wavelet transformation denoising technique. Amplitude-based gating was used to reconstruct static images with a duty cycle of 20%. It should be noted that phase-based gating may also be used for the image reconstruction.
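The amplitude-based gating with a 20% duty cycle can be sketched as follows. The hypothetical `amplitude_gate` below keeps the 20% of waveform samples with the lowest amplitude; placing the accepted band at the amplitude minimum (near end-expiration) is an assumption for illustration, and phase-based gating would select by position within each cycle instead.

```python
def amplitude_gate(samples, duty_cycle=0.2):
    """Indices of the `duty_cycle` fraction of waveform samples with the
    lowest amplitude.  Events recorded at the accepted times would be kept
    for reconstruction; all other events would be discarded.
    """
    order = sorted(range(len(samples)), key=lambda i: samples[i])
    keep = max(1, int(round(len(samples) * duty_cycle)))
    return sorted(order[:keep])
```

With a 20% duty cycle, only one fifth of the counts survive gating, which is the usual trade-off between motion blur and image noise in gated reconstruction.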
- FIGS. 5(a) and 5(b) illustrate exemplary respiratory waveforms for a preclinical subject and a clinical subject, respectively.
- The waveforms are marked for gating tags.
- The solid vertical marks, some of which are labeled 52, correspond to tags to be inserted into the listmode data by the respiratory motion correction method 400.
- The broken vertical marks, some of which are labeled 55, correspond to tags from the standard electronic gating system.
- Comparison of waveforms between electronic and software-based gating indicated correlation between insertion points of greater than 99%, while timing drift in gating tag entry was observed only in the signals generated by the standard electronic gating system.
- FIG. 6 illustrates exemplary images resulting from the experiment.
- FIG. 6 shows, on the left, an image resulting from a dataset that has not been motion-corrected, and, on the right, a corresponding image resulting from the same, but motion-corrected, dataset.
- Phase-based and amplitude-based gated reconstructions are possible using this respiratory motion correction method 400 and show improved image quality in regions of respiratory motion. The respiratory motion correction method 400 is therefore able to produce accurate respiratory waveforms and correctly insert gating tags. Visual comparison of the data indicates that images reconstructed using tags inserted by the respiratory motion correction method 400 exhibit reduced motion artifacts.
- FIG. 7 illustrates a simplified diagram of a medical imaging system 700, according to an embodiment of the present invention.
- An example of the medical imaging system 700 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof.
- The medical imaging system 700 may include a detector 702, a coincidence processing module 704, a listmode data acquisition module 706, a motion correction module 708, an image reconstruction module 710, an image output module 712, a memory 714, and a processor 716.
- A patient 718 may commonly be positioned within the detector 702, as shown in FIG. 7.
- A user 730 may interact with the medical imaging system 700 to select an object of interest 732, the location of which needs to be tracked, as will be explained in more detail below.
- The object of interest 732 may be, but is not limited to, a lesion in the lung or an edge of an anatomical surface, such as the dome of the liver.
- FIG. 8 illustrates a method 800 for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
- The user 730 may select an object of interest 732, such as a lesion, of the patient affected by respiratory or head and neck motion.
- The user 730 may then identify a bounding region 734 containing the object of interest 732.
- FIG. 7 illustrates, for example, an object of interest 732 bounded by a bounding region 734.
- The motion correction module 708 may use the listmode data from the listmode data acquisition module 706 to identify the coincident lines of response, from the coincidence processing module 704, measured within the bounding region 734.
- FIG. 7 shows an example of a coincident line of response 736.
- The motion correction module 708 may track the location of the object of interest 732 throughout the course of the scan.
- The motion correction module 708 may derive respiratory motion information from the tracked motion of the object of interest 732 and generate a respiratory waveform.
- The motion correction module 708 may analyze and mark the respiratory waveform for gating of the listmode data.
- The motion correction module may insert gating tags into the listmode data.
- The image reconstruction module 710 may reconstruct a motion-corrected image and generate histograms.
Description
- The present application claims the benefit of U.S. Provisional Application Ser. No. 62/171,489, filed Jun. 5, 2015, entitled “MOTION CORRECTION FOR MEDICAL IMAGING,” the disclosure of which is incorporated herein by reference in its entirety.
- The present invention relates to motion correction in medical imaging systems.
- Medical imaging systems/scanners (e.g., positron emission tomography (PET), computed tomography (CT), etc.) are typically used for diagnostic purposes. Patient movement during medical imaging, however, can result in degraded image quality and reduced diagnostic confidence. Two primary sources of patient movements are head and neck motion and respiratory motion.
- Conventional solutions to correct for patient movement in medical imaging have significant drawbacks. One such technique consists of recording motion data from an electronic device attached to the patient (e.g., a waist band attached to the patient) to monitor the patient movement, while the patient is being scanned by the medical imaging scanner. Motion correction is performed through post-processing of the scanned data by correlating the scanned data with the motion data. In addition to the problems caused by the electronic hardware itself such as difficult placement on the patient, delays may be present between the scanned data and the motion data. Movement of the electronic device itself on the patient's body may occur during the scan. Another technique involves using a video monitoring system to detect the patient movement from a video recording of the patient during the scan. The motion data detected by this video monitoring system usually lacks accuracy, and patients find video recording of their medical examination intrusive. Thus, these motion correction methods not only are invasive and uncomfortable, but also result in inaccurate or erroneous motion correction of the scanned data.
- The present invention provides methods for motion correction for use in medical imaging systems. These methods require no attached electronic hardware devices or invasive camera systems, and offer high resolution tracking of motion that can automatically detect and correct patient movement during imaging.
- In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in at least three off-axis positions. Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner. Listmode is a known data format for recording events, e.g., during a PET session. The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources. A coincident line of response, also known as a line of response (LOR), is an imaginary line connecting the points where a pair of annihilation photons is detected by a medical imaging scanner. Thus, the locations and motion of the external emitting sources may be tracked in a three-dimensional space and recorded throughout the course of the scan. Static regions corresponding to subsequent locations of limited or no motion of the external emitting sources may be determined. Imaging data coinciding with the static regions are stored, while imaging data corresponding to transition regions from one static region to another are discarded. Motion vectors between each static region are recorded and then used in reconstruction to create a motion-corrected dataset. The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
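For illustration only (not part of the claimed embodiments), the source tracking described above can be sketched in Python: the three-dimensional position of one external emitting source may be estimated as the point minimizing the summed squared distance to the LORs attributed to that source. The function name and array layout are assumptions of this sketch.

```python
import numpy as np

def source_position_from_lors(points, dirs):
    """Least-squares 3D point closest to a bundle of lines of response.

    points: (N, 3) array with one point on each LOR (e.g. a detector crystal).
    dirs:   (N, 3) array with each LOR's direction (any nonzero length).
    """
    d = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    # Projector onto the normal space of each line: M_i = I - d_i d_i^T.
    M = np.eye(3)[None, :, :] - d[:, :, None] * d[:, None, :]
    # Minimize sum_i |M_i (x - p_i)|^2  =>  (sum_i M_i) x = sum_i M_i p_i.
    A = M.sum(axis=0)
    b = np.einsum('nij,nj->i', M, points)
    return np.linalg.solve(A, b)
```

Fitting each source's LOR bundle in successive time windows would yield the tracked trajectory referred to above.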
- In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging. Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner. The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources. Thus, the locations and motion of the external emitting sources may be tracked and recorded throughout the course of the scan. Respiratory motion information may be derived from the tracked motion of the sources and a respiratory waveform may be generated. The respiratory waveform may be analyzed and marked for gating of the listmode data. Gating is a data processing technique applicable to listmode data, in which data that lies outside of specified “gate” areas are discarded. According to an embodiment, gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner. The listmode data may also be altered such that the motion affected events are repositioned into a user defined “stationary” geometry for use in subsequent histogramming and reconstruction.
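As a rough sketch of how a respiratory waveform could be derived from the tracked sources (illustrative Python; the binning scheme and names are assumptions, not taken from the disclosure): the vertical coordinate of a torso-mounted source, averaged per time bin, rises and falls with respiration.

```python
import numpy as np

def respiratory_waveform(times, y_positions, bin_width=0.1):
    """Bin the tracked source height over time to form a respiratory trace.

    times: event timestamps in seconds; y_positions: tracked vertical
    coordinate of an external source at those times. Returns bin centers
    and the mean height per bin (NaN where a bin holds no events).
    """
    edges = np.arange(times.min(), times.max() + bin_width, bin_width)
    n_bins = len(edges) - 1
    idx = np.clip(np.digitize(times, edges) - 1, 0, n_bins - 1)
    counts = np.bincount(idx, minlength=n_bins)
    sums = np.bincount(idx, weights=y_positions, minlength=n_bins)
    with np.errstate(invalid='ignore', divide='ignore'):
        mean_y = sums / counts
    centers = edges[:-1] + bin_width / 2
    return centers, mean_y
```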
- In an embodiment, an object of interest, such as a lesion, of the patient affected by respiratory or head and neck motion may be selected. A bounding region containing the object of interest may be identified. Listmode data of a medical imaging scan may be used to determine the coincident lines of response measured within the bounding region. Thus, the locations and motion of the object of interest may be tracked. Respiratory motion information may be derived from the tracked motion of the object of interest and a respiratory waveform may be generated. The respiratory waveform may be analyzed and marked for gating of the listmode data. The gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner. Head and neck correction information may also be derived from the tracked motion of the object of interest, enabling identification of periods of non-motion and removal of motion-affected data. The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
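The selection of coincident lines of response measured within the bounding region can be illustrated with a standard line-box ("slab") intersection test (Python sketch; the helper name and a box-shaped bounding region are assumptions of this illustration):

```python
import numpy as np

def lor_hits_box(p, d, box_min, box_max):
    """True if the line through point p with direction d crosses the
    axis-aligned bounding region [box_min, box_max] (slab method)."""
    t_lo, t_hi = -np.inf, np.inf
    for axis in range(3):
        if abs(d[axis]) < 1e-12:
            # Line parallel to this slab: must already lie inside it.
            if not (box_min[axis] <= p[axis] <= box_max[axis]):
                return False
            continue
        t1 = (box_min[axis] - p[axis]) / d[axis]
        t2 = (box_max[axis] - p[axis]) / d[axis]
        t_lo = max(t_lo, min(t1, t2))
        t_hi = min(t_hi, max(t1, t2))
    return t_lo <= t_hi
```

Only events whose LOR passes this test would contribute to tracking the object of interest.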
-
FIG. 1 illustrates a medical imaging system according to an embodiment of the present invention. -
FIG. 2 illustrates a method for head and neck motion correction in medical imaging, according to an embodiment of the present invention. -
FIG. 3 compares images from a dataset that has been corrected for head and neck motion to images from an uncorrected dataset. -
FIG. 4 illustrates a method for respiratory motion correction in medical imaging, according to an embodiment of the present invention. -
FIG. 5(a) illustrates exemplary respiratory waveforms for a preclinical subject. -
FIG. 5(b) illustrates exemplary respiratory waveforms for a clinical subject. -
FIG. 6 compares an image from a dataset that has been corrected for respiratory motion to an image from an uncorrected dataset. -
FIG. 7 illustrates a medical imaging system according to an embodiment of the present invention. -
FIG. 8 illustrates a method for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention. -
FIG. 1 illustrates a simplified diagram of a medical imaging system 100, according to an embodiment of the present invention. An example of the medical imaging system 100 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof. The medical imaging system 100 may include a detector 102, a coincidence processing module 104, a listmode data acquisition module 106, a motion correction module 108, an image reconstruction module 110, an image output module 112, a memory 114, and a processor 116. A patient 118 may commonly be positioned within the detector 102, as shown in FIG. 1, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient may be moved continually through the horizontal scan range. - The
memory 114 may be provided as a volatile memory, a non-volatile memory, or a combination thereof. The memory 114 may store program instructions, scan data generated by the medical imaging system 100, and any data as needed by the medical imaging system 100. Algorithms to operate the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112 may be provided as software stored in the memory 114. The processor 116 may be a microcontroller or a microprocessor. The processor 116 may execute the instructions stored in the memory 114 and may control the operations of the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112. - In another embodiment, the
motion correction module 108 may be coupled externally to the medical imaging system 100. In such an embodiment, the motion correction module 108 may include a separate memory and processor. -
FIG. 2 illustrates a method 200 for head and neck motion correction in medical imaging, according to an embodiment of the present invention. - In
step 202, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in three off-axis positions, as shown by the positions in FIG. 1, for example. - In
step 204, pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100. For example, the listmode data may be in 64-bit listmode format. - In
step 206, the motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources. FIG. 1 shows an example of a coincident line of response 126. - In
step 208, from the corresponding coincident lines of response, the motion correction module 108 may track the locations of the external emitting sources at their respective positions and record these locations throughout the course of the scan. - In
step 210, the motion correction module 108 may determine static regions corresponding to subsequent locations of limited or no motion of the external emitting sources. The motion correction module 108 stores imaging data coinciding with the static regions and discards imaging data corresponding to transition regions from one static region to another. Imaging data may be any data among the listmode data that do not correspond to pairs of annihilation photons from the external emitting sources. - In
step 212, the motion correction module 108 may generate motion vectors between each static region. - In
step 214, the image reconstruction module 110 may use the motion vectors to create a motion-corrected dataset, thereby reconstructing an image. - One skilled in the art would appreciate that, given that motion correction may be performed on the raw listmode data prior to image reconstruction, the head and neck motion correction techniques described herein may be applied to many PET imaging systems.
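Steps 210 and 212 above can be sketched as follows (illustrative Python, assuming the source positions have already been tracked; the threshold and names are assumptions of this sketch): contiguous runs of near-zero displacement are treated as static regions, and each region's mean position is compared with the first (reference) region to obtain a motion vector.

```python
import numpy as np

def static_segments(positions, threshold):
    """Split a tracked position series into runs where successive samples
    move less than `threshold`; returns (start, end) index pairs (end
    exclusive) for each static run. Transition samples fall outside runs."""
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=-1)
    static = steps < threshold
    segments, start = [], None
    for i, s in enumerate(static):
        if s and start is None:
            start = i
        elif not s and start is not None:
            segments.append((start, i + 1))
            start = None
    if start is not None:
        segments.append((start, len(positions)))
    return segments

def motion_vectors(positions, segments):
    """Displacement of each static segment's mean position from the first
    segment, which serves as the reference frame."""
    means = [positions[a:b].mean(axis=0) for a, b in segments]
    return [m - means[0] for m in means]
```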
- The head and neck motion correction techniques described herein were tested in an experimental setup similar to the one shown in
FIG. 1. A plurality of patients were imaged on a PET/CT scanner, i.e., the medical imaging system 100. Low activity/dose point sources were placed on the heads of the patients, in asymmetrical locations to enable three-dimensional tracking, as illustrated by the positions in FIG. 1, for example. The patients were instructed to move during their scans, simulating typical patient movements during imaging. For each patient, the PET/CT scanner acquired PET data for 10 minutes in a 64-bit listmode format. A motion correction module 108 processed the data using the algorithms outlined in the steps of FIG. 2. The motion correction module 108 corrected data by calculating centroid locations for each source at time points when the patient was still. Events associated with transitional motion were discarded. The initial CT position was used as the reference position. Subsequent static positions were transformed to the reference frame by calculating transformation matrices from the calculated centroid locations. All reoriented static positions were summed to create the final dataset. - As an alternative to using a CT position as a reference, other anatomical modalities may be used to generate a reference point for which the transformation matrix may be generated. For instance, the list mode data itself may be used to generate one or more reference points for the reconstruction of three-dimensional volumes from specific time segments within the acquired list mode data. Thus, an alternative reconstruction process may involve segmenting the list mode data, designating specific segments as corresponding to "stationary" geometries, reconstructing a three-dimensional volume for those designated segments, then combining the reconstructed volumes into a single volume.
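The transformation matrices mentioned above can be computed from the marker centroids with a standard rigid-alignment procedure; the Kabsch algorithm below is one common choice, shown as a hedged numpy sketch (assuming at least three non-collinear centroids), not necessarily the patent's exact procedure.

```python
import numpy as np

def rigid_transform(ref, moved):
    """Kabsch-style rigid transform mapping `moved` marker centroids onto
    `ref` (both (N, 3), N >= 3 non-collinear points). Returns (R, t) such
    that ref is approximately moved @ R.T + t."""
    ref_c = ref.mean(axis=0)
    mov_c = moved.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (moved - mov_c).T @ (ref - ref_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the optimal orthogonal matrix.
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = ref_c - R @ mov_c
    return R, t
```

Applying the recovered (R, t) to the events of a static segment reorients that segment into the reference frame before summation.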
-
FIG. 3 illustrates a sample of images resulting from the experiment. FIG. 3 shows, on the left, images resulting from a dataset that has not been motion-corrected. The images on the right result from the same, but motion-corrected, dataset. - Analysis of the images indicated that patient motion during the scan severely degraded the quality of the images. Key features of the brain are hardly discernible. The algorithms used by
the motion correction module 108 successfully tracked all three source positions during the scan. Following the motion correction, the images of the brain were crisp with no signs of motion, as shown on the right side of FIG. 3. In addition, this head and neck motion correction method 200 enabled recovery of about 97% of the data. Therefore, the head and neck motion correction method 200 is a robust method for automatic motion correction in head and neck patients. Automatic correction of motion may prevent inaccurate radiological examinations and prevent burdening patients with repeated imaging procedures. - Respiratory motion in medical imaging affects diagnostic image quality for a wide range of cancers, including lung, liver, pancreatic, and gastric cancers. The
medical imaging system 100 shown in FIG. 1 may also be used for respiratory motion correction. However, the external emitting sources are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging. The torso of the patient 118 must be positioned within detector 102 accordingly or pass through the axial field of view during the course of data acquisition. -
FIG. 4 illustrates a method 400 for respiratory motion correction in medical imaging, according to an embodiment of the present invention. - In
step 402, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging. - In
step 404, pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100. - In
step 406, the motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources. FIG. 1 shows an example of a coincident line of response 126. - In
step 408, from the corresponding coincident lines of response, the motion correction module 108 may track the locations of the external emitting sources and record these locations throughout the course of the scan. - In
step 410, the motion correction module 108 may derive respiratory motion information from the tracked motion of the sources and generate a respiratory waveform. - In
step 412, the motion correction module 108 may analyze and mark the respiratory waveform for gating of the listmode data. - In
step 414, the motion correction module 108 may insert gating tags into the listmode data. - In
step 416, based on the gating tags, the image reconstruction module 110 may reconstruct a motion-corrected image using the inserted gating tags and generate histograms. For example, as noted below, the gating tags may mark locations of local maxima for each respiratory cycle, enabling reconstruction through amplitude or phase-based gating. - One skilled in the art would appreciate that, given that motion correction may also be performed on the raw listmode data prior to image reconstruction, the respiratory motion correction techniques described herein may be applied to many PET imaging systems.
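The tag-marking of steps 412 and 414 can be sketched as simple peak picking on the respiratory waveform (illustrative Python; the minimum-separation rule is an assumption standing in for whatever cycle analysis the module applies):

```python
import numpy as np

def gating_tag_indices(waveform, min_separation):
    """Indices of local maxima of a respiratory trace, at least
    `min_separation` samples apart: one candidate gating tag per cycle."""
    w = np.asarray(waveform)
    # Strict local maxima (ties broken toward the left sample).
    peaks = np.flatnonzero((w[1:-1] > w[:-2]) & (w[1:-1] >= w[2:])) + 1
    kept = []
    for p in peaks[np.argsort(w[peaks])[::-1]]:  # highest peaks first
        if all(abs(p - q) >= min_separation for q in kept):
            kept.append(p)
    return sorted(kept)
```

Each returned index would be translated to a timestamp and a gating tag inserted into the listmode stream at that point.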
- The respiratory motion correction techniques described herein were tested in both preclinical and clinical imaging systems. For both systems, low activity/dose point sources were placed on animals or humans in sites of respiratory motion for software tracking by the
motion correction module 108. Standard electronic gating systems were also attached to the subjects, with a respiratory pad used for mouse imaging and a respiratory band used for human imaging. PET data were collected for 10 minutes for clinical and preclinical subjects. 64-bit listmode data were acquired with tags inserted from the standard electronic systems. The raw listmode data were processed by the motion correction module 108, as discussed above, inserting the gating tags into the listmode data. The motion correction module 108 was configured to insert gating tags at local maxima in the y-axis for each respiratory cycle. High frequency noise was removed by applying a discrete wavelet transformation denoising technique. Amplitude-based gating was used to reconstruct static images with a duty cycle of 20%. It should be noted that phase-based gating may also be used for the image reconstruction. -
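Amplitude-based gating with a 20% duty cycle can be illustrated as follows (Python sketch; accepting the samples nearest end-expiration, the low-amplitude end of the trace, is one common convention and an assumption here):

```python
import numpy as np

def amplitude_gate(amplitudes, duty_cycle=0.2):
    """Boolean mask accepting the fraction `duty_cycle` of samples whose
    respiratory amplitude is closest to end-expiration (the minimum)."""
    cutoff = np.quantile(amplitudes, duty_cycle)
    return amplitudes <= cutoff
```

Events whose waveform amplitude falls inside the accepted band are kept for the static reconstruction; the rest are discarded.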
FIGS. 5(a) and 5(b) illustrate exemplary respiratory waveforms for a preclinical subject and a clinical subject, respectively. The waveforms are marked for gating tags. The solid vertical marks, some of which are labeled 52, correspond to tags to be inserted into the listmode data by the respiratory motion correction method 400, while the broken vertical marks, some of which are labeled 55, correspond to tags from the standard electronic gating system. Comparison of waveforms between electronic and software-based gating indicated correlation between insertion points of greater than 99%, while timing drift in gating tag entry was only observed in the electronic signals generated by the standard electronic gating system. -
FIG. 6 illustrates exemplary images resulting from the experiment. FIG. 6 shows, on the left, an image resulting from a dataset that has not been motion-corrected, and, on the right, a corresponding image resulting from the same, but motion-corrected, dataset. - Phase-based and amplitude-based gated reconstructions are possible using this respiratory
motion correction method 400 and show improved image quality in regions of respiratory motion. Therefore, the respiratory motion correction method 400 is able to produce accurate respiratory waveforms and correctly insert gating tags. Visual comparison of the data indicates that images reconstructed using tags inserted by the respiratory motion correction method 400 exhibit reduced motion artifacts. - Motion Correction without External Emitting Sources
- Respiratory or head and neck motion correction may also be realized without the use of external emitting sources.
FIG. 7 illustrates a simplified diagram of a medical imaging system 700, according to an embodiment of the present invention. An example of the medical imaging system 700 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof. The medical imaging system 700 may include a detector 702, a coincidence processing module 704, a listmode data acquisition module 706, a motion correction module 708, an image reconstruction module 710, an image output module 712, a memory 714, and a processor 716. A patient 718 may commonly be positioned within the detector 702, as shown in FIG. 7, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient 718 may be moved continually through the horizontal scan range. A user 730 may interact with the medical imaging system 700 to select an object of interest 732, the location of which needs to be tracked, as will be explained in more detail below. Examples of the object of interest 732 may be, but are not limited to, a lesion in the lung and edges of anatomical surfaces, such as the dome of the liver. -
FIG. 8 illustrates a method 800 for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention. - In
step 802, using a reconstructed image from the image output module 712, the user 730 may select an object of interest 732, such as a lesion, of the patient affected by respiratory or head and neck motion. The user 730 may then identify a bounding region 734 containing the object of interest 732. FIG. 7 illustrates, for example, an object of interest 732 bounded by a bounding region 734. - In
step 804, the motion correction module 708 may use the listmode data from the listmode data acquisition module 706 to identify the coincident lines of response, from the coincidence processing module 704, measured within the bounding region 734. FIG. 7 shows an example of a coincident line of response 736. - In
step 806, from the identified coincident lines of response, the motion correction module 708 may track the location of the object of interest 732 throughout the course of the scan. - In
step 808, the motion correction module 708 may derive respiratory motion information from the tracked motion of the object of interest 732 and generate a respiratory waveform. - In
step 810, the motion correction module 708 may analyze and mark the respiratory waveform for gating of the listmode data. - In
step 812, the motion correction module 708 may insert gating tags into the listmode data. - In
step 814, based on the gating tags and using either amplitude-based gating or phase-based gating, the image reconstruction module 710 may reconstruct a motion-corrected image and generate histograms. - One skilled in the art would appreciate that, given that motion correction may be performed on the raw listmode data prior to image reconstruction, the respiratory motion correction techniques described herein may be applied to many PET imaging systems.
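A count-based surrogate is one data-driven way to illustrate steps 806 and 808 without external sources (hedged Python sketch; the patent tracks the object's location, whereas this simplified stand-in only counts, per time bin, the coincidences whose LORs cross the bounding region, whose modulation likewise follows respiration):

```python
import numpy as np

def roi_count_trace(event_times, in_roi, bin_width):
    """Per-time-bin count of coincidences whose LOR crossed the bounding
    region; the periodic modulation of this count is a respiratory
    surrogate from which gating tags can be derived."""
    t0 = event_times.min()
    idx = ((event_times - t0) / bin_width).astype(int)
    n_bins = idx.max() + 1
    return np.bincount(idx[in_roi], minlength=n_bins)
```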
- The foregoing description has been set forth merely to illustrate the invention and is not intended as being limiting. Each of the disclosed aspects and embodiments of the present invention may be considered individually or in combination with other aspects, embodiments, and variations of the invention. Further, while certain features of embodiments of the present invention may be shown in only certain figures, such features can be incorporated into other embodiments shown in other figures while remaining within the scope of the present invention. In addition, unless otherwise specified, none of the steps of the methods of the present invention are confined to any particular order of performance. Modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art and such modifications are within the scope of the present invention. Furthermore, all references cited herein are incorporated by reference in their entirety.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/174,130 US10255684B2 (en) | 2015-06-05 | 2016-06-06 | Motion correction for PET medical imaging based on tracking of annihilation photons |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562171489P | 2015-06-05 | 2015-06-05 | |
US15/174,130 US10255684B2 (en) | 2015-06-05 | 2016-06-06 | Motion correction for PET medical imaging based on tracking of annihilation photons |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160358334A1 true US20160358334A1 (en) | 2016-12-08 |
US10255684B2 US10255684B2 (en) | 2019-04-09 |
Family
ID=57452252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/174,130 Expired - Fee Related US10255684B2 (en) | 2015-06-05 | 2016-06-06 | Motion correction for PET medical imaging based on tracking of annihilation photons |
Country Status (1)
Country | Link |
---|---|
US (1) | US10255684B2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10075640B2 (en) * | 2015-12-31 | 2018-09-11 | Sony Corporation | Motion compensation for image sensor with a block based analog-to-digital converter |
CN110507336A (en) * | 2019-07-23 | 2019-11-29 | 广东省医疗器械研究所 | A kind of personalized method for cervical vertebra monitoring and correction |
US10664979B2 (en) | 2018-09-14 | 2020-05-26 | Siemens Healthcare Gmbh | Method and system for deep motion model learning in medical images |
JP2022513233A (en) * | 2018-12-17 | 2022-02-07 | シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド | Automatic motion correction during PET imaging |
US11270434B2 (en) * | 2019-10-07 | 2022-03-08 | Siemens Medical Solutions Usa, Inc. | Motion correction for medical image data |
US11894126B1 (en) * | 2023-02-24 | 2024-02-06 | Ix Innovation Llc | Systems and methods for tracking movement of a wearable device for advanced image stabilization |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040102695A1 (en) * | 2002-11-25 | 2004-05-27 | Stergios Stergiopoulos | Method and device for correcting organ motion artifacts in MRI systems |
US20050123183A1 (en) * | 2003-09-02 | 2005-06-09 | Paul Schleyer | Data driven motion correction for nuclear imaging |
US20070265528A1 (en) * | 2006-04-10 | 2007-11-15 | Tong Xu | Method and apparatus for real-time tumor tracking |
US20070280508A1 (en) * | 2006-05-19 | 2007-12-06 | Ernst Thomas M | Motion tracking system for real time adaptive imaging and spectroscopy |
US20090253980A1 (en) * | 2008-04-08 | 2009-10-08 | General Electric Company | Method and apparatus for determining the effectiveness of an image transformation process |
US20110293143A1 (en) * | 2009-02-17 | 2011-12-01 | Koninklijke Philips Electronics N.V. | Functional imaging |
US20120051664A1 (en) * | 2010-08-31 | 2012-03-01 | General Electric Company | Motion compensation in image processing |
US8224056B2 * | 2009-12-15 | 2012-07-17 | General Electric Company | Method for computed tomography motion estimation and compensation |
US20130079626A1 (en) * | 2011-09-26 | 2013-03-28 | Andriy Shmatukha | Systems and methods for automated dynamic contrast enhancement imaging |
US20130287278A1 (en) * | 2011-01-05 | 2013-10-31 | Koninklijke Philips Electronics N.V. | Method and apparatus to detect and correct motion in list-mode pet data with a gated signal |
US20140133717A1 (en) * | 2011-06-21 | 2014-05-15 | Koninklijke Philips N.V. | Respiratory motion determination apparatus |
US20150134261A1 (en) * | 2013-11-14 | 2015-05-14 | J. Michael O'Connor | Synchronization of patient motion detection equipment with medical imaging systems |
US20150302613A1 (en) * | 2014-04-16 | 2015-10-22 | Siemens Medical Solutions Usa, Inc. | Method To Compensate Gating Effects On Image Uniformity And Quantification For PET Scan With Continuous Bed Motion |
US20160095565A1 (en) * | 2014-10-01 | 2016-04-07 | Siemens Aktiengesellschaft | Method and imaging system for compensating for location assignment errors in pet data occurring due to a cyclical motion of a patient |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0324374D0 (en) * | 2003-10-17 | 2003-11-19 | Hammersmith Imanet Ltd | Method of, and software for, conducting motion correction for a tomographic scanner |
US8170302B1 (en) * | 2005-09-30 | 2012-05-01 | Ut-Battelle, Llc | System and method for generating motion corrected tomographic images |
EP2109399B1 (en) * | 2007-02-07 | 2014-03-12 | Koninklijke Philips N.V. | Motion estimation in treatment planning |
US20160247293A1 (en) * | 2015-02-24 | 2016-08-25 | Brain Biosciences, Inc. | Medical imaging systems and methods for performing motion-corrected image reconstruction |
US9606245B1 (en) * | 2015-03-24 | 2017-03-28 | The Research Foundation For The State University Of New York | Autonomous gamma, X-ray, and particle detector |
-
2016
- 2016-06-06 US US15/174,130 patent/US10255684B2/en not_active Expired - Fee Related
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040102695A1 (en) * | 2002-11-25 | 2004-05-27 | Stergios Stergiopoulos | Method and device for correcting organ motion artifacts in MRI systems |
US20050123183A1 (en) * | 2003-09-02 | 2005-06-09 | Paul Schleyer | Data driven motion correction for nuclear imaging |
US20070265528A1 (en) * | 2006-04-10 | 2007-11-15 | Tong Xu | Method and apparatus for real-time tumor tracking |
US20070280508A1 (en) * | 2006-05-19 | 2007-12-06 | Ernst Thomas M | Motion tracking system for real time adaptive imaging and spectroscopy |
US20090253980A1 (en) * | 2008-04-08 | 2009-10-08 | General Electric Company | Method and apparatus for determining the effectiveness of an image transformation process |
US20110293143A1 (en) * | 2009-02-17 | 2011-12-01 | Koninklijke Philips Electronics N.V. | Functional imaging |
US8224056B2 * | 2009-12-15 | 2012-07-17 | General Electric Company | Method for computed tomography motion estimation and compensation |
US20120051664A1 (en) * | 2010-08-31 | 2012-03-01 | General Electric Company | Motion compensation in image processing |
US20130287278A1 (en) * | 2011-01-05 | 2013-10-31 | Koninklijke Philips Electronics N.V. | Method and apparatus to detect and correct motion in list-mode pet data with a gated signal |
US20140133717A1 (en) * | 2011-06-21 | 2014-05-15 | Koninklijke Philips N.V. | Respiratory motion determination apparatus |
US20130079626A1 (en) * | 2011-09-26 | 2013-03-28 | Andriy Shmatukha | Systems and methods for automated dynamic contrast enhancement imaging |
US20150134261A1 (en) * | 2013-11-14 | 2015-05-14 | J. Michael O'Connor | Synchronization of patient motion detection equipment with medical imaging systems |
US20150302613A1 (en) * | 2014-04-16 | 2015-10-22 | Siemens Medical Solutions Usa, Inc. | Method To Compensate Gating Effects On Image Uniformity And Quantification For PET Scan With Continuous Bed Motion |
US20160095565A1 (en) * | 2014-10-01 | 2016-04-07 | Siemens Aktiengesellschaft | Method and imaging system for compensating for location assignment errors in pet data occurring due to a cyclical motion of a patient |
Non-Patent Citations (3)
Title |
---|
Büther, F., Ernst, I., Hamill, J., Eich, H. T., Schober, O., Schäfers, M., & Schäfers, K. P. (2013). External radioactive markers for PET data-driven respiratory gating in positron emission tomography. European journal of nuclear medicine and molecular imaging, 40(4), 602-614. * |
Harteela, M., Hirvi, H., Mäkipää, A., Teuho, J., Koivumäki, T., Mäkelä, M. M., & Teräs, M. (2014). Comparison of end-expiratory respiratory gating methods for PET/CT. Acta Oncologica, 53(8), 1079-1085. * |
Nehmeh, S. A., Erdi, Y. E., Rosenzweig, K. E., Schoder, H., Larson, S. M., Squire, O. D., & Humm, J. L. (2003). Reduction of respiratory motion artifacts in PET imaging of lung cancer by respiratory correlated dynamic PET: methodology and comparison with respiratory gated PET. Journal of Nuclear Medicine, 44(10), 1644-1648. * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10075640B2 (en) * | 2015-12-31 | 2018-09-11 | Sony Corporation | Motion compensation for image sensor with a block based analog-to-digital converter |
US10664979B2 (en) | 2018-09-14 | 2020-05-26 | Siemens Healthcare Gmbh | Method and system for deep motion model learning in medical images |
JP2022513233A (en) * | 2018-12-17 | 2022-02-07 | シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド | Automatic motion correction during PET imaging |
JP7238134B2 (en) | 2018-12-17 | 2023-03-13 | シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド | Automatic motion compensation during PET imaging |
CN110507336A (en) * | 2019-07-23 | 2019-11-29 | 广东省医疗器械研究所 | A kind of personalized method for cervical vertebra monitoring and correction |
US11270434B2 (en) * | 2019-10-07 | 2022-03-08 | Siemens Medical Solutions Usa, Inc. | Motion correction for medical image data |
US11894126B1 (en) * | 2023-02-24 | 2024-02-06 | Ix Innovation Llc | Systems and methods for tracking movement of a wearable device for advanced image stabilization |
Also Published As
Publication number | Publication date |
---|---|
US10255684B2 (en) | 2019-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10255684B2 (en) | Motion correction for PET medical imaging based on tracking of annihilation photons | |
US9414773B2 (en) | Respiratory motion determination apparatus | |
US9451926B2 (en) | Respiratory motion correction with internal-external motion correlation, and associated systems and methods | |
JP6243121B2 (en) | Method and apparatus for motion detection and correction in imaging scans using time-of-flight information | |
Büther et al. | Detection of respiratory tumour motion using intrinsic list mode-driven gating in positron emission tomography | |
JP5947813B2 (en) | Method and apparatus for detecting and correcting motion in list mode PET data with gating signal |
Olesen et al. | List-mode PET motion correction using markerless head tracking: proof-of-concept with scans of human subject | |
US20040258286A1 (en) | Systems and methods for retrospective internal gating | |
US20080287772A1 (en) | Motion Compensation in PET Reconstruction | |
US9579070B2 (en) | Optimal respiratory gating in medical imaging | |
US8658979B2 (en) | Nuclear image reconstruction | |
JP5389907B2 (en) | Geometric transformations that maintain list mode format | |
EP2575616B1 (en) | Amplitude/slope-based motion phase mapping | |
Visvikis et al. | Respiratory motion in positron emission tomography for oncology applications: Problems and solutions | |
Feng et al. | Real-time data-driven rigid motion detection and correction for brain scan with listmode PET | |
KR20140042461A (en) | Method and apparatus to correct motion | |
US20230022425A1 (en) | Apparatus, system, method and computer program for providing a nuclear image of a region of interest of a patient |
CN110215226B (en) | Image attenuation correction method, image attenuation correction device, computer equipment and storage medium | |
Goddard et al. | Non-invasive PET head-motion correction via optical 3d pose tracking | |
US20230008263A1 (en) | Motion compensation of positron emission tomographic data | |
van den Hoff et al. | Motion Compensation in Emission Tomography | |
Woo et al. | Development of a motion correction system for respiratory-gated PET study | |
Woo et al. | Motion correction of respiratory-gated PET/CT images using polynomial warping | |
Breuilly et al. | Image-based motion detection in 4D images and application to respiratory motion suppression | |
van den Hoff et al. | Emission Tomography |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: UNIVERSITY OF TENNESSEE RESEARCH FOUNDATION, TENNESSEE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OSBORNE, DUSTIN R.;HU, DONGMING;LEE, SANG HYEB;SIGNING DATES FROM 20160609 TO 20160612;REEL/FRAME:039126/0046 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20230409 |