US20160358334A1 - Motion correction for medical imaging

Motion correction for medical imaging

Info

Publication number
US20160358334A1
Authority
US
United States
Prior art keywords
motion
gating
emitting sources
image data
interest
Prior art date
Legal status
Granted
Application number
US15/174,130
Other versions
US10255684B2 (en
Inventor
Dustin R. Osborne
Dongming Hu
Sang Hyeb Lee
Current Assignee
University of Tennessee Research Foundation
Original Assignee
University of Tennessee Research Foundation
Priority date
Filing date
Publication date
Application filed by University of Tennessee Research Foundation
Priority to US15/174,130 (patent US10255684B2)
Assigned to UNIVERSITY OF TENNESSEE RESEARCH FOUNDATION. Assignors: LEE, SANG HYEB; HU, DONGMING; OSBORNE, DUSTIN R.
Publication of US20160358334A1
Application granted
Publication of US10255684B2
Status: Expired - Fee Related
Anticipated expiration

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/0012: Biomedical image inspection
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/037: Emission tomography
    • A61B 6/5205: Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • A61B 6/5217: Devices using data or image processing specially adapted for radiation diagnosis involving extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 6/5264: Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
    • A61B 6/5288: Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
    • A61B 6/5294: Devices using data or image processing specially adapted for radiation diagnosis involving using additional data, e.g. patient information, image labeling, acquisition parameters
    • G06T 11/005: Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G06T 5/40: Image enhancement or restoration by the use of histogram techniques
    • G06T 5/73
    • G06T 7/0085
    • G16H 50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices or for individual health risk assessment
    • G06T 2207/10104: Positron emission tomography [PET]
    • G06T 2207/20201: Motion blur correction
    • G06T 2207/30016: Brain
    • G06T 2207/30096: Tumor; Lesion
    • G06T 2211/412: Dynamic (computed tomography)

Definitions

  • The present invention relates to motion correction in medical imaging systems.
  • Medical imaging systems/scanners (e.g., positron emission tomography (PET), computed tomography (CT), etc.) are typically used for diagnostic purposes; patient movement during imaging, however, can degrade image quality and reduce diagnostic confidence.
  • Two primary sources of patient movements are head and neck motion and respiratory motion.
  • One such technique consists of recording motion data from an electronic device attached to the patient (e.g., a waist band attached to the patient) to monitor the patient movement, while the patient is being scanned by the medical imaging scanner.
  • Motion correction is performed through post-processing of the scanned data by correlating the scanned data with the motion data.
  • In addition to the problems caused by the electronic hardware itself, such as difficult placement on the patient, delays may be present between the scanned data and the motion data, and the electronic device itself may move on the patient's body during the scan.
  • Another technique involves using a video monitoring system to detect the patient movement from a video recording of the patient during the scan. The motion data detected by this video monitoring system usually lacks accuracy, and patients find video recording of their medical examination intrusive.
  • Thus, these motion correction methods are not only invasive and uncomfortable, but can also result in inaccurate or erroneous motion correction of the scanned data.
  • The present invention provides methods for motion correction for use in medical imaging systems. These methods require no attached electronic hardware devices or invasive camera systems, and offer high-resolution tracking of motion that can automatically detect and correct patient movement during imaging.
  • In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in at least three off-axis positions.
  • Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner.
  • Listmode is a known data format for recording events, e.g., during a PET session.
  • The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources.
  • A coincident line of response, also known as a line of response (LOR), is an imaginary line connecting the points where a pair of annihilation photons are detected by a medical imaging scanner.
  • Thus, the locations and motion of the external emitting sources may be tracked in a three-dimensional space and recorded throughout the course of the scan.
  • Static regions corresponding to subsequent locations of limited or no motion of the external emitting sources may be determined.
  • Imaging data coinciding with the static regions are stored, while imaging data corresponding to transition regions from one static region to another are discarded.
  • Motion vectors between each static region are recorded and then used in reconstruction to create a motion-corrected dataset.
  • The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
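The tracking-and-static-region idea above can be sketched as follows. This is an illustrative sketch, not the patented implementation: the `(t, x, y, z)` event layout (source positions already attributed to one external source), the time-bin width, and the motion threshold are all assumptions for the example.

```python
from statistics import mean

def bin_centroids(events, bin_width):
    """Group (t, x, y, z) source events into time bins and return the
    per-bin 3-D centroid as (bin_index, (cx, cy, cz))."""
    bins = {}
    for t, x, y, z in events:
        bins.setdefault(int(t // bin_width), []).append((x, y, z))
    return sorted(
        (b, tuple(mean(p[i] for p in pts) for i in range(3)))
        for b, pts in bins.items()
    )

def static_regions(centroids, threshold):
    """Split the centroid track into static regions: consecutive bins whose
    centroid moves less than `threshold` stay together; larger jumps mark
    transitions, whose data would be discarded."""
    regions, current = [], [centroids[0]]
    for prev, cur in zip(centroids, centroids[1:]):
        dist = sum((a - b) ** 2 for a, b in zip(prev[1], cur[1])) ** 0.5
        if dist < threshold:
            current.append(cur)
        else:
            regions.append(current)  # close the region at the jump
            current = [cur]
    regions.append(current)
    return regions
```

A real system would run this per source and per listmode time window; here a single simulated jump splits the track into two static regions.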
  • In another embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
  • Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner.
  • The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources.
  • Thus, the locations and motion of the external emitting sources may be tracked and recorded throughout the course of the scan. Respiratory motion information may be derived from the tracked motion of the sources and a respiratory waveform may be generated.
  • The respiratory waveform may be analyzed and marked for gating of the listmode data.
  • Gating is a data processing technique applicable to listmode data, in which data that lies outside of specified “gate” areas are discarded.
  • According to an embodiment, gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner.
  • The listmode data may also be altered such that the motion-affected events are repositioned into a user-defined “stationary” geometry for use in subsequent histogramming and reconstruction.
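Gating as defined above can be illustrated with a small amplitude-based sketch. The `(t, amplitude)` event layout and the duty-cycle rule for choosing the gate window are assumptions modeled on common practice, not text from the patent:

```python
def amplitude_window(amplitudes, duty_cycle):
    """Pick the amplitude band, anchored at the quietest (end-expiration)
    samples, that retains roughly `duty_cycle` of the data."""
    ordered = sorted(amplitudes)
    keep = max(1, int(len(ordered) * duty_cycle))
    return ordered[0], ordered[keep - 1]

def gate_events(events, low, high):
    """Discard events whose respiratory amplitude lies outside the gate."""
    return [e for e in events if low <= e[1] <= high]
```

With amplitudes 1 through 10 and a 20% duty cycle, the window keeps only the two quietest amplitude levels; everything outside the gate is dropped, exactly as the definition of gating above describes.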
  • In another embodiment, an object of interest, such as a lesion, of the patient affected by respiratory or head and neck motion may be selected.
  • A bounding region containing the object of interest may be identified.
  • Listmode data of a medical imaging scanner may be used to determine the coincident lines of response measured within the bounding region.
  • Respiratory motion information may be derived from the tracked motion of the object of interest and a respiratory waveform may be generated.
  • The respiratory waveform may be analyzed and marked for gating of the listmode data.
  • The gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner.
  • Head and neck correction information may also be derived from the tracked motion of the object of interest, making it possible to identify periods of non-motion and remove motion-affected data.
  • The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
  • FIG. 1 illustrates a medical imaging system according to an embodiment of the present invention.
  • FIG. 2 illustrates a method for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
  • FIG. 3 compares images from a dataset that has been corrected for head and neck motion to images from an uncorrected dataset.
  • FIG. 4 illustrates a method for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
  • FIG. 5(a) illustrates exemplary respiratory waveforms for a preclinical subject.
  • FIG. 5(b) illustrates exemplary respiratory waveforms for a clinical subject.
  • FIG. 6 compares an image from a dataset that has been corrected for respiratory motion to an image from an uncorrected dataset.
  • FIG. 7 illustrates a medical imaging system according to an embodiment of the present invention.
  • FIG. 8 illustrates a method for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
  • FIG. 1 illustrates a simplified diagram of a medical imaging system 100 , according to an embodiment of the present invention.
  • The medical imaging system 100 may employ, for example, positron emission tomography (PET), computed tomography (CT), or a combination thereof.
  • The medical imaging system 100 may include a detector 102, a coincidence processing module 104, a listmode data acquisition module 106, a motion correction module 108, an image reconstruction module 110, an image output module 112, a memory 114, and a processor 116.
  • A patient 118 may commonly be positioned within the detector 102, as shown in FIG. 1, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient may be moved continually through the horizontal scan range.
  • The memory 114 may be provided as a volatile memory, a non-volatile memory, or a combination thereof.
  • The memory 114 may store program instructions, scan data generated by the medical imaging system 100, and any data needed by the medical imaging system 100.
  • Algorithms to operate the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112 may be provided as software stored in the memory 114.
  • The processor 116 may be a microcontroller or a microprocessor.
  • The processor 116 may execute the instructions stored in the memory 114 and may control the operations of the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112.
  • Alternatively, the motion correction module 108 may be coupled externally to the medical imaging system 100.
  • In that case, the motion correction module 108 may include a separate memory and processor.
  • FIG. 2 illustrates a method 200 for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
  • In step 202, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in three off-axis positions, as shown by positions 120, 122, and 124 in FIG. 1, for example.
  • Pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by the detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100.
  • The listmode data may be in a 64-bit listmode format.
  • The motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources.
  • FIG. 1 shows an example of a coincident line of response 126 .
  • The motion correction module 108 may track the locations of the external emitting sources at positions 120, 122, and 124 in a three-dimensional space and record these locations throughout the course of the scan.
  • The motion correction module 108 may determine static regions corresponding to subsequent locations of limited or no motion of the external emitting sources.
  • The motion correction module 108 stores imaging data coinciding with the static regions and discards imaging data corresponding to transition regions from one static region to another.
  • Imaging data may be any data among the listmode data that do not correspond to pairs of annihilation photons from the external emitting sources.
  • The motion correction module 108 may generate motion vectors between each static position.
  • The image reconstruction module 110 may use the motion vectors to create a motion-corrected dataset, thereby reconstructing an image.
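The motion-vector and repositioning steps above can be sketched as follows. This is a translation-only simplification: the described method computes transformation matrices from the three off-axis source centroids, which may also include rotation, and the `(t, x, y, z)` event layout is hypothetical:

```python
def motion_vector(ref_centroids, cur_centroids):
    """Mean displacement mapping a static position's source centroids onto
    the reference position (translation only; a full implementation would
    also solve for rotation from the three off-axis sources)."""
    n = len(ref_centroids)
    return tuple(
        sum(r[i] - c[i] for r, c in zip(ref_centroids, cur_centroids)) / n
        for i in range(3)
    )

def reposition(events, vec):
    """Apply the motion vector to (t, x, y, z) imaging events so they share
    the reference ("motion free") geometry."""
    dx, dy, dz = vec
    return [(t, x + dx, y + dy, z + dz) for t, x, y, z in events]
```

Summing all repositioned static positions then yields the motion-corrected dataset described in the text.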
  • The head and neck motion correction techniques described herein were tested in an experimental setup similar to the one shown in FIG. 1.
  • A plurality of patients were imaged on a PET/CT scanner, i.e., the medical imaging system 100.
  • Low activity/dose point sources were placed on the heads of the patients, in asymmetrical locations to enable three-dimensional tracking, as illustrated by positions 120, 122, and 124 in FIG. 1, for example.
  • The patients were instructed to move during their scans, simulating typical patient movements during imaging.
  • The PET/CT scanner acquired PET data for 10 minutes in a 64-bit listmode format.
  • The motion correction module 108 used algorithms, as outlined in steps 206 and 208 above, to track head motion during the scans.
  • The motion correction module 108 corrected data by calculating centroid locations for each source at time points when the patient was still. Events associated with transitional motion were discarded. The initial CT position was used as the reference position. Subsequent static positions were transformed to the reference frame by calculating transformation matrices from the calculated centroid locations. All reoriented static positions were summed to create the final dataset.
  • Anatomical modalities may be used to generate a reference point from which the transformation matrix may be generated.
  • The listmode data itself may be used to generate one or more reference points for the reconstruction of three-dimensional volumes from specific time segments within the acquired listmode data.
  • An alternative reconstruction process may involve segmenting the listmode data, designating specific segments as corresponding to “stationary” geometries, reconstructing a three-dimensional volume for each designated segment, and then combining the reconstructed volumes into a single volume.
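The segment-and-combine alternative just described might look like the following sketch, using coarse integer-voxel histogramming in place of a real reconstruction. The grid size and event layout are assumptions, and each segment is presumed to have already been realigned to the common geometry:

```python
def histogram_segment(events, shape):
    """Histogram one "stationary" listmode segment into a coarse volume."""
    nx, ny, nz = shape
    vol = [[[0] * nz for _ in range(ny)] for _ in range(nx)]
    for _, x, y, z in events:
        xi, yi, zi = int(x), int(y), int(z)
        if 0 <= xi < nx and 0 <= yi < ny and 0 <= zi < nz:
            vol[xi][yi][zi] += 1  # events outside the grid are dropped
    return vol

def combine_volumes(volumes):
    """Sum the per-segment volumes voxel by voxel into a single volume."""
    out = volumes[0]
    for vol in volumes[1:]:
        out = [[[a + b for a, b in zip(row_a, row_b)]
                for row_a, row_b in zip(plane_a, plane_b)]
               for plane_a, plane_b in zip(out, vol)]
    return out
```

In practice each segment would pass through the scanner's full reconstruction chain; the voxel-sum step at the end is the part the text describes as combining the reconstructed volumes.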
  • FIG. 3 illustrates a sample of images resulting from the experiment.
  • FIG. 3 shows, on the left, images resulting from a dataset that has not been motion-corrected.
  • The images on the right result from the same, but motion-corrected, dataset.
  • The head and neck motion correction method 200 is thus a robust method for automatic motion correction in head and neck patients. Automatic correction of motion may prevent inaccurate radiological examinations and spare patients repeated imaging procedures.
  • Respiratory motion in medical imaging affects diagnostic image quality for a wide range of cancers, including: lung, liver, pancreatic, and gastric.
  • The medical imaging system 100 shown in FIG. 1 may also be used for respiratory motion correction.
  • In this case, the external emitting sources are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
  • The torso of the patient 118 must be positioned within the detector 102 accordingly, or pass through the axial field of view during the course of data acquisition.
  • FIG. 4 illustrates a method 400 for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
  • In step 402, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
  • Pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by the detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100.
  • The motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources.
  • FIG. 1 shows an example of a coincident line of response 126 .
  • The motion correction module 108 may track the locations of the external emitting sources and record these locations throughout the course of the scan.
  • The motion correction module 108 may derive respiratory motion information from the tracked motion of the sources and generate a respiratory waveform.
  • The motion correction module 108 may analyze and mark the respiratory waveform for gating of the listmode data.
  • The motion correction module 108 may insert gating tags into the listmode data.
  • The image reconstruction module 110 may reconstruct a motion-corrected image using the inserted gating tags and generate histograms.
  • The gating tags may mark locations of local maxima for each respiratory cycle, enabling reconstruction through amplitude-based or phase-based gating.
  • The respiratory motion correction techniques described herein were tested in both preclinical and clinical imaging systems. For both systems, low activity/dose point sources were placed on animals or humans at sites of respiratory motion for software tracking by the motion correction module 108. Standard electronic gating systems were also attached to the subjects, with a respiratory pad used for mouse imaging and a respiratory band used for human imaging. PET data were collected for 10 minutes for clinical and preclinical subjects. 64-bit listmode data were acquired with tags inserted from the standard electronic systems. The raw listmode data were processed by the motion correction module 108, as discussed above, inserting the gating tags into the listmode data. The motion correction module 108 was configured to insert gating tags at local maxima in the y-axis for each respiratory cycle. High-frequency noise was removed by applying a discrete wavelet transformation denoising technique. Amplitude-based gating was used to reconstruct static images with a duty cycle of 20%. It should be noted that phase-based gating may also be used for the image reconstruction.
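The wavelet denoising step can be illustrated with a one-level Haar transform and soft thresholding. This is a stand-in for the discrete wavelet transformation used in the experiment, whose wavelet family, decomposition depth, and threshold are not specified above:

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar DWT: soft-threshold the detail (high-frequency)
    coefficients, then invert, suppressing high-frequency noise while
    preserving the slow respiratory trend."""
    assert len(signal) % 2 == 0
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    # soft thresholding: shrink each detail coefficient toward zero
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out
```

A production pipeline would typically use a multi-level transform from a wavelet library rather than this single Haar level.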
  • FIGS. 5(a) and 5(b) illustrate exemplary respiratory waveforms for a preclinical subject and a clinical subject, respectively.
  • The waveforms are marked for gating tags.
  • The solid vertical marks, some of which are labeled 52, correspond to tags to be inserted into the listmode data by the respiratory motion correction method 400, while the broken vertical marks, some of which are labeled 55, correspond to tags from the standard electronic gating system.
  • Comparison of waveforms between electronic and software-based gating indicated correlation between insertion points of greater than 99%, while timing drift in gating tag entry was only observed in the electronic signals generated by the standard electronic gating system.
  • FIG. 6 illustrates exemplary images resulting from the experiment.
  • FIG. 6 shows, on the left, an image resulting from a dataset that has not been motion-corrected, and, on the right, a corresponding image resulting from the same, but motion-corrected, dataset.
  • Phase-based and amplitude-based gated reconstructions are possible using this respiratory motion correction method 400, with improved image quality in regions of respiratory motion. The respiratory motion correction method 400 is thus able to produce accurate respiratory waveforms and correct insertion of gating tags. Visual comparison of the data indicates that images reconstructed using tags inserted by the respiratory motion correction method 400 exhibit reduced motion artifacts.
  • FIG. 7 illustrates a simplified diagram of a medical imaging system 700 , according to an embodiment of the present invention.
  • The medical imaging system 700 may employ, for example, positron emission tomography (PET), computed tomography (CT), or a combination thereof.
  • The medical imaging system 700 may include a detector 702, a coincidence processing module 704, a listmode data acquisition module 706, a motion correction module 708, an image reconstruction module 710, an image output module 712, a memory 714, and a processor 716.
  • A patient 718 may commonly be positioned within the detector 702, as shown in FIG. 7.
  • A user 730 may interact with the medical imaging system 700 to select an object of interest 732, the location of which needs to be tracked, as will be explained in more detail below.
  • The object of interest 732 may be, but is not limited to, a lesion in the lung or an edge of an anatomical surface, such as the dome of the liver.
  • FIG. 8 illustrates a method 800 for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
  • The user 730 may select an object of interest 732, such as a lesion, of the patient affected by respiratory or head and neck motion.
  • The user 730 may then identify a bounding region 734 containing the object of interest 732.
  • FIG. 7 illustrates, for example, an object of interest 732 bounded by a bounding region 734.
  • The motion correction module 708 may use the listmode data from the listmode data acquisition module 706 to identify the coincident lines of response, from the coincidence processing module 704, measured within the bounding region 734.
  • FIG. 7 shows an example of a coincident line of response 736.
  • The motion correction module 708 may track the location of the object of interest 732 throughout the course of the scan.
  • The motion correction module 708 may derive respiratory motion information from the tracked motion of the object of interest 732 and generate a respiratory waveform.
  • The motion correction module 708 may analyze and mark the respiratory waveform for gating of the listmode data.
  • The motion correction module 708 may insert gating tags into the listmode data.
  • The image reconstruction module 710 may reconstruct a motion-corrected image and generate histograms.
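The tracking steps of method 800 can be sketched by deriving a respiratory surrogate from events falling inside the bounding region. The `(t, x, y, z)` event layout, the choice of the y coordinate as the respiratory axis, and the time-bin width are assumptions for this example:

```python
def region_waveform(events, box, bin_width):
    """Mean y position per time bin of (t, x, y, z) events whose LOR
    midpoint falls inside the bounding region, giving a crude respiratory
    surrogate for the tracked object of interest."""
    (x0, x1), (y0, y1), (z0, z1) = box
    bins = {}
    for t, x, y, z in events:
        if x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1:
            bins.setdefault(int(t // bin_width), []).append(y)
    return [(b, sum(ys) / len(ys)) for b, ys in sorted(bins.items())]
```

The resulting waveform would then be marked for gating exactly as in method 400.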

Abstract

Systems and methods for correcting motion during medical imaging involve using a detector to track annihilation photons produced by one of (i) external emitting sources placed onto a body of a person being imaged or (ii) an object of interest in the body. Motion information is generated based on the tracking. A motion-corrected image is formed from recorded image data, using the motion information.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • The present application claims the benefit of U.S. Provisional Application Ser. No. 62/171,489, filed Jun. 5, 2015, entitled “MOTION CORRECTION FOR MEDICAL IMAGING,” the disclosure of which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The present invention relates to motion correction in medical imaging systems.
  • BACKGROUND
  • Medical imaging systems/scanners (e.g., positron emission tomography (PET), computed tomography (CT), etc.) are typically used for diagnostic purposes. Patient movement during medical imaging, however, can result in degraded image quality and reduced diagnostic confidence. Two primary sources of patient movements are head and neck motion and respiratory motion.
  • Conventional solutions to correct for patient movement in medical imaging have significant drawbacks. One such technique consists of recording motion data from an electronic device attached to the patient (e.g., a waist band attached to the patient) to monitor the patient movement while the patient is being scanned by the medical imaging scanner. Motion correction is performed through post-processing of the scanned data by correlating the scanned data with the motion data. In addition to the problems caused by the electronic hardware itself, such as difficult placement on the patient, delays may be present between the scanned data and the motion data, and the electronic device itself may move on the patient's body during the scan. Another technique involves using a video monitoring system to detect the patient movement from a video recording of the patient during the scan. The motion data detected by this video monitoring system usually lacks accuracy, and patients find video recording of their medical examination intrusive. Thus, these motion correction methods are not only invasive and uncomfortable, but can also result in inaccurate or erroneous motion correction of the scanned data.
  • SUMMARY
  • The present invention provides methods for motion correction for use in medical imaging systems. These methods require no attached electronic hardware devices or invasive camera systems, and offer high resolution tracking of motion that can automatically detect and correct patient movement during imaging.
  • In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in at least three off-axis positions. Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner. Listmode is a known data format for recording events, e.g., during a PET session. The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources. A coincident line of response, also known as a line of response (LOR), is an imaginary line connecting points where a pair of annihilation photons are detected by a medical imaging scanner. Thus, the locations and motion of the external emitting sources may be tracked in a three-dimensional space and recorded throughout the course of the scan. Static regions corresponding to subsequent locations of limited or no motion of the external emitting sources may be determined. Imaging data coinciding with the static regions are stored, while imaging data corresponding to transition regions from one static region to another are discarded. Motion vectors between each static region are recorded and then used in reconstruction to create a motion-corrected dataset. The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
  • In an embodiment, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging. Annihilation photons produced by the external emitting sources may be detected by a medical imaging scanner and recorded among the listmode data of the medical imaging scanner. The listmode data may be used to determine the coincident lines of response corresponding to the annihilation photons from the external emitting sources. Thus, the locations and motion of the external emitting sources may be tracked and recorded throughout the course of the scan. Respiratory motion information may be derived from the tracked motion of the sources and a respiratory waveform may be generated. The respiratory waveform may be analyzed and marked for gating of the listmode data. Gating is a data processing technique applicable to listmode data, in which data that lies outside of specified “gate” areas are discarded. According to an embodiment, gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner. The listmode data may also be altered such that the motion affected events are repositioned into a user defined “stationary” geometry for use in subsequent histogramming and reconstruction.
  • In an embodiment, an object of interest of the patient affected by respiratory or head and neck motion, such as a lesion, may be selected. A bounding region containing the object of interest may be identified. Listmode data of a medical imaging scanner may be used to determine the coincident lines of response measured within the bounding region. Thus, the locations and motion of the object of interest may be tracked. Respiratory motion information may be derived from the tracked motion of the object of interest and a respiratory waveform may be generated. The respiratory waveform may be analyzed and marked for gating of the listmode data. The gating tags are subsequently inserted into the listmode data for histogramming and motion-corrected image reconstruction by the medical imaging scanner. Head and neck correction information may also be derived from the tracked motion of the object of interest, enabling identification of periods of non-motion and removal of motion-affected data. The listmode data may also be altered such that the motion-affected events are repositioned into a common “motion free” geometry for use in subsequent histogramming and reconstruction.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a medical imaging system according to an embodiment of the present invention.
  • FIG. 2 illustrates a method for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
  • FIG. 3 compares images from a dataset that has been corrected for head and neck motion to images from an uncorrected dataset.
  • FIG. 4 illustrates a method for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
  • FIG. 5(a) illustrates exemplary respiratory waveforms for a preclinical subject.
  • FIG. 5(b) illustrates exemplary respiratory waveforms for a clinical subject.
  • FIG. 6 compares an image from a dataset that has been corrected for respiratory motion to an image from an uncorrected dataset.
  • FIG. 7 illustrates a medical imaging system according to an embodiment of the present invention.
  • FIG. 8 illustrates a method for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Head and Neck Motion Correction
  • FIG. 1 illustrates a simplified diagram of a medical imaging system 100, according to an embodiment of the present invention. An example of the medical imaging system 100 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof. The medical imaging system 100 may include a detector 102, a coincidence processing module 104, a listmode data acquisition module 106, a motion correction module 108, an image reconstruction module 110, an image output module 112, a memory 114, and a processor 116. A patient 118 may commonly be positioned within the detector 102, as shown in FIG. 1, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient may be moved continually through the horizontal scan range.
  • The memory 114 may be provided as a volatile memory, a non-volatile memory, or a combination thereof. The memory 114 may store program instructions, scan data generated by the medical imaging system 100, and any data as needed by the medical imaging system 100. Algorithms to operate the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112 may be provided as software stored in the memory 114. The processor 116 may be a microcontroller or a microprocessor. The processor 116 may execute the instructions stored in the memory 114 and may control the operations of the coincidence processing module 104, the listmode data acquisition module 106, the motion correction module 108, the image reconstruction module 110, and the image output module 112.
  • In another embodiment, the motion correction module 108 may be coupled externally to the medical imaging system 100. In such an embodiment, the motion correction module 108 may include a separate memory and processor.
  • FIG. 2 illustrates a method 200 for head and neck motion correction in medical imaging, according to an embodiment of the present invention.
  • In step 202, external emitting sources, such as positron emitting sources, are placed on the patient's head or neck in three off-axis positions, as shown by positions 120, 122, and 124 in FIG. 1, for example.
  • In step 204, pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100. For example, the listmode data may be in 64-bit listmode format.
  • In step 206, the motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources. FIG. 1 shows an example of a coincident line of response 126.
  • In step 208, from the corresponding coincident lines of response, the motion correction module 108 may track the locations of the external emitting sources at positions 120, 122, and 124 in a three-dimensional space and record these locations throughout the course of the scan.
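  • By way of illustration, the tracking in steps 206-208 can be reduced to a nearest-centroid update: each LOR attributed to the markers is collapsed to its midpoint, each midpoint is assigned to the closest previously known source position, and each source's centroid is recomputed. The following is a minimal sketch under these assumptions; the patent does not prescribe a particular clustering method, and the names `track_sources`, `midpoints`, and `seeds` are illustrative:

```python
import numpy as np

def track_sources(midpoints, seeds, window=200):
    """Estimate current source positions from a sliding window of LOR
    midpoints by nearest-seed assignment followed by a centroid update.

    midpoints: (N, 3) midpoints of recent coincident LORs attributed to
    the external sources; seeds: (3, 3) prior source positions."""
    mids = midpoints[-window:]
    # Distance from every midpoint to every prior source position.
    d = np.linalg.norm(mids[:, None, :] - seeds[None, :, :], axis=2)
    labels = d.argmin(axis=1)  # assign each midpoint to its closest source
    new = seeds.copy()
    for k in range(len(seeds)):
        if (labels == k).any():
            new[k] = mids[labels == k].mean(axis=0)  # updated centroid
    return new
```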
  • In step 210, the motion correction module 108 may determine static regions corresponding to subsequent locations of limited or no motion of the external emitting sources. The motion correction module 108 stores imaging data coinciding with the static regions and discards imaging data corresponding to transition regions from one static region to another. Imaging data may be any data among the listmode data that do not correspond to pairs of annihilation photons from the external emitting sources.
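  • A minimal sketch of the static-region determination in step 210 follows: a region ends when a tracked position drifts beyond a motion threshold from the region's first sample, and spans shorter than a minimum duration (the transitions) are discarded. The threshold values and the function name `detect_static_regions` are illustrative assumptions, not values from the patent:

```python
import numpy as np

def detect_static_regions(positions, times, motion_thresh=2.0, min_duration=0.5):
    """Split a tracked source trajectory into static regions.

    positions: (N, 3) tracked centroid locations (mm); times: (N,) seconds.
    Returns (start, end) index pairs of spans with limited or no motion."""
    regions = []
    start = 0
    for i in range(1, len(positions)):
        if np.linalg.norm(positions[i] - positions[start]) > motion_thresh:
            if times[i - 1] - times[start] >= min_duration:
                regions.append((start, i - 1))  # keep the static span
            start = i  # transition detected: begin a new candidate region
    if times[-1] - times[start] >= min_duration:
        regions.append((start, len(positions) - 1))
    return regions
```

Events falling inside the returned spans would be kept; everything else (the transitions) would be discarded, mirroring step 210.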
  • In step 212, the motion correction module 108 may generate motion vectors between each static region.
  • In step 214, the image reconstruction module 110 may use the motion vectors to create a motion-corrected dataset, thereby reconstructing an image.
  • One skilled in the art would appreciate that, given that motion correction may be performed on the raw listmode data prior to image reconstruction, the head and neck motion correction techniques described herein may be applied to many PET imaging systems.
  • The head and neck motion correction techniques described herein were tested in an experimental setup similar to the one shown in FIG. 1. A plurality of patients were imaged on a PET/CT scanner, i.e., the medical imaging system 100. Low activity/dose point sources were placed on the heads of the patients, in asymmetrical locations to enable three-dimensional tracking, as illustrated by positions 120, 122, and 124 in FIG. 1, for example. The patients were instructed to move during their scans, simulating typical patient movements during imaging. For each patient, the PET/CT scanner acquired PET data for 10 minutes in a 64-bit listmode format. A motion correction module 108 used algorithms, as outlined in steps 206 and 208 above, to track head motions during the scans. As in step 210, the motion correction module 108 corrected data by calculating centroid locations for each source at time points when the patient was still. Events associated with transitional motion were discarded. The initial CT position was used as the reference position. Subsequent static positions were transformed to the reference frame by calculating transformation matrices from the calculated centroid locations. All reoriented static positions were summed to create the final dataset.
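  • The transformation matrices mentioned above can be computed from the three marker centroids by a standard least-squares rigid registration such as the Kabsch algorithm. The patent does not name a specific algorithm, so the following is one plausible sketch:

```python
import numpy as np

def rigid_transform(src, ref):
    """Least-squares rigid transform (Kabsch algorithm) mapping marker
    centroids `src` onto reference centroids `ref`, each shaped (3, 3)
    with one row per marker. Returns (R, t) with ref ~ src @ R.T + t."""
    src_c, ref_c = src.mean(axis=0), ref.mean(axis=0)
    H = (src - src_c).T @ (ref - ref_c)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ref_c - R @ src_c
    return R, t
```

Each static position's centroids would be registered to the reference position this way, and the reoriented data summed into the final dataset.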
  • As an alternative to using a CT position as a reference, other anatomical modalities may be used to generate a reference point for which the transformation matrix may be generated. For instance, the listmode data itself may be used to generate one or more reference points for the reconstruction of three-dimensional volumes from specific time segments within the acquired listmode data. Thus, an alternative reconstruction process may involve segmenting the listmode data, designating specific segments as corresponding to “stationary” geometries, reconstructing a three-dimensional volume for those designated segments, then combining the reconstructed volumes into a single volume.
  • FIG. 3 illustrates a sample of images resulting from the experiment. FIG. 3 shows, on the left, images resulting from a dataset that has not been motion-corrected. The images on the right result from the same, but motion-corrected, dataset.
  • Analysis of the images indicated that patient motion during the scan severely degraded the quality of the images; key features of the brain were hardly discernible. The algorithms used by the motion correction module 108 successfully tracked all three source positions during the scan. Following the motion correction, the images of the brain were crisp with no signs of motion, as shown on the right side of FIG. 3. In addition, this head and neck motion correction method 200 enabled recovery of about 97% of the data. Therefore, head and neck motion correction method 200 is a robust method for automatic motion correction in head and neck patients. Automatic correction of motion may prevent inaccurate radiological examinations and prevent burdening patients with repeated imaging procedures.
  • Respiratory Motion Correction—With External Emitting Sources
  • Respiratory motion in medical imaging affects diagnostic image quality for a wide range of cancers, including lung, liver, pancreatic, and gastric cancers. The medical imaging system 100 shown in FIG. 1 may also be used for respiratory motion correction. However, the external emitting sources are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging. The torso of the patient 118 must be positioned within detector 102 accordingly or pass through the axial field of view during the course of data acquisition.
  • FIG. 4 illustrates a method 400 for respiratory motion correction in medical imaging, according to an embodiment of the present invention.
  • In step 402, external emitting sources, such as positron emitting sources, are placed on the patient's torso along regions of motion associated with the patient's clinical indication, such as near the chest for lung imaging or just above the belly button for liver or gastric imaging.
  • In step 404, pairs of annihilation photons produced by the external emitting sources and moving in approximately opposite directions may be detected by detector 102 and recorded among the listmode data acquired by the listmode data acquisition module 106 of the medical imaging system 100.
  • In step 406, the motion correction module 108 may use the listmode data from the listmode data acquisition module 106 to determine the coincident lines of response from the coincidence processing module 104 corresponding to pairs of annihilation photons from the external emitting sources. FIG. 1 shows an example of a coincident line of response 126.
  • In step 408, from the corresponding coincident lines of response, the motion correction module 108 may track the locations of the external emitting sources and record these locations throughout the course of the scan.
  • In step 410, the motion correction module 108 may derive respiratory motion information from the tracked motion of the sources and generate a respiratory waveform.
  • In step 412, the motion correction module 108 may analyze and mark the respiratory waveform for gating of the listmode data.
  • In step 414, the motion correction module may insert gating tags into the listmode data.
  • In step 416, the image reconstruction module 110 may use the inserted gating tags to reconstruct a motion-corrected image and generate histograms. For example, as noted below, the gating tags may mark the location of the local maximum of each respiratory cycle, enabling reconstruction through amplitude-based or phase-based gating.
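  • Steps 412-414 can be sketched as simple peak picking on the respiratory waveform: a gating tag is emitted at each local maximum, one per respiratory cycle. A deployed system would first denoise the signal and enforce a minimum cycle length; this minimal illustration omits both:

```python
import numpy as np

def gating_tag_indices(waveform):
    """Return sample indices of local maxima in a respiratory waveform,
    one candidate gating tag per respiratory cycle."""
    w = np.asarray(waveform, dtype=float)
    # Interior samples that rise from the left and do not rise to the right.
    return [i for i in range(1, len(w) - 1)
            if w[i - 1] < w[i] >= w[i + 1]]
```

The returned indices would then be converted to timestamps and inserted into the listmode stream as gating tags.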
  • One skilled in the art would appreciate that, given that motion correction may also be performed on the raw listmode data prior to image reconstruction, the respiratory motion correction techniques described herein may be applied to many PET imaging systems.
  • The respiratory motion correction techniques described herein were tested in both preclinical and clinical imaging systems. For both systems, low activity/dose point sources were placed on animals or humans in sites of respiratory motion for software tracking by the motion correction module 108. Standard electronic gating systems were also attached to the subjects with a respiratory pad used for mouse imaging and a respiratory band used for human imaging. PET data were collected for 10 minutes for clinical and preclinical subjects. 64-bit listmode data was acquired with tags inserted from standard electronic systems. The raw listmode data was processed by the motion correction module 108, as discussed above, inserting the gating tags into the listmode data. The motion correction module 108 was configured to insert gating tags at local maxima in the y-axis for each respiratory cycle. High frequency noise was removed by applying a discrete wavelet transformation denoising technique. Amplitude-based gating was used to reconstruct static images with a duty cycle of 20%. It should be noted that phase-based gating may also be used for the image reconstruction.
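  • Amplitude-based gating with a 20% duty cycle, as used above, can be illustrated as retaining the 20% of samples whose waveform amplitude lies nearest end-expiration. The patent does not specify which amplitude band was kept, so the band choice below is an assumption:

```python
import numpy as np

def amplitude_gate(amplitudes, duty_cycle=0.2):
    """Boolean mask keeping the fraction `duty_cycle` of samples with the
    lowest respiratory waveform amplitude (near end-expiration)."""
    amp = np.asarray(amplitudes, dtype=float)
    cutoff = np.quantile(amp, duty_cycle)  # amplitude below which 20% of samples lie
    return amp <= cutoff
```

Events in listmode data whose timestamps fall in the masked-out periods would be excluded from the static reconstruction.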
  • FIGS. 5(a) and 5(b) illustrate exemplary respiratory waveforms for a preclinical subject and a clinical subject, respectively. The waveforms are marked for gating tags. The solid vertical marks, some of which are labeled 52, correspond to tags to be inserted into the listmode data by the respiratory motion correction method 400, while the broken vertical marks, some of which are labeled 55, correspond to tags from the standard electronic gating system. Comparison of waveforms between electronic and software-based gating indicated correlation between insertion points of greater than 99%, while timing drift in gating tag entry was only observed in the electronic signals generated by the standard electronic gating system.
  • FIG. 6 illustrates exemplary images resulting from the experiment. FIG. 6 shows, on the left, an image resulting from a dataset that has not been motion-corrected, and, on the right, a corresponding image resulting from the same, but motion-corrected, dataset.
  • Phase-based and amplitude-based gated reconstructions are possible using this respiratory motion correction method 400 and show improved image quality in regions of respiratory motion. Therefore, the respiratory motion correction method 400 produces accurate respiratory waveforms and correctly inserts gating tags. Visual comparison of the data indicates that images reconstructed using tags inserted by the respiratory motion correction method 400 exhibit reduced motion artifacts.
  • Motion Correction without External Emitting Sources
  • Respiratory or head and neck motion correction may also be realized without the use of external emitting sources. FIG. 7 illustrates a simplified diagram of a medical imaging system 700, according to an embodiment of the present invention. An example of the medical imaging system 700 may employ, but is not limited to, positron emission tomography (PET) or computed tomography (CT), or a combination thereof. The medical imaging system 700 may include a detector 702, a coincidence processing module 704, a listmode data acquisition module 706, a motion correction module 708, an image reconstruction module 710, an image output module 712, a memory 714, and a processor 716. A patient 718 may commonly be positioned within the detector 702, as shown in FIG. 7, and may be moved horizontally depending on the region of interest of the patient's body that needs to be scanned. For continuous bed motion enabled systems, the patient 718 may be moved continually through the horizontal scan range. A user 730 may interact with the medical imaging system 700 to select an object of interest 732, the location of which needs to be tracked, as will be explained in more detail below. Examples of the object of interest 732 may be, but are not limited to, a lesion in the lung and edges of anatomical surfaces, such as the dome of the liver.
  • FIG. 8 illustrates a method 800 for respiratory motion correction in medical imaging without the use of external emitting sources, according to an embodiment of the present invention.
  • In step 802, using a reconstructed image from the image output module 712, the user 730 may select an object of interest 732, such as a lesion, of the patient affected by respiratory or head and neck motion. The user 730 may then identify a bounding region 734 containing the object of interest 732. FIG. 7 illustrates, for example, an object of interest 732 bounded by a bounding region 734.
  • In step 804, the motion correction module 708 may use the listmode data from the listmode data acquisition module 706 to identify the coincident lines of response, from the coincidence processing module 704, measured within the bounding region 734. FIG. 7 shows an example of a coincident line of response 736.
  • In step 806, from the identified coincident lines of response, the motion correction module 708 may track the location of the object of interest 732 throughout the course of the scan.
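  • Steps 804-806 can be sketched as binning over time the coincidences that fall inside the bounding region and taking a per-bin centroid as a surrogate motion trace of the object of interest. The event layout below (a timestamp plus a single representative point per LOR) is a simplification, since real listmode data stores detector pairs:

```python
import numpy as np

def object_motion_trace(events, box_min, box_max, bin_width=0.5):
    """Per-time-bin axial centroid of LOR positions inside the user's
    bounding region, as a surrogate motion trace of the object of interest.

    events: (N, 4) rows of (t, x, y, z), where (x, y, z) approximates the
    LOR's point of closest approach to the object (illustrative layout)."""
    t, pos = events[:, 0], events[:, 1:4]
    # Keep only coincidences measured within the bounding region.
    inside = np.all((pos >= box_min) & (pos <= box_max), axis=1)
    ev = events[inside]
    n_bins = int(np.ceil(t.max() / bin_width))
    trace = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = (ev[:, 0] >= b * bin_width) & (ev[:, 0] < (b + 1) * bin_width)
        if sel.any():
            trace[b] = ev[sel, 3].mean()  # mean axial (z) position per bin
    return trace
```

The resulting trace would then be processed into a respiratory waveform for gating, exactly as in the source-based method 400.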
  • In step 808, the motion correction module 708 may derive respiratory motion information from the tracked motion of the object of interest 732 and generate a respiratory waveform.
  • In step 810, the motion correction module 708 may analyze and mark the respiratory waveform for gating of the listmode data.
  • In step 812, the motion correction module may insert gating tags into the listmode data.
  • In step 814, based on the gating tags and using either amplitude-based gating or phase-based gating, the image reconstruction module 710 may reconstruct a motion-corrected image and generate histograms.
  • One skilled in the art would appreciate that, given that motion correction may be performed on the raw listmode data prior to image reconstruction, the respiratory motion correction techniques described herein may be applied to many PET imaging systems.
  • The foregoing description has been set forth merely to illustrate the invention and is not intended as being limiting. Each of the disclosed aspects and embodiments of the present invention may be considered individually or in combination with other aspects, embodiments, and variations of the invention. Further, while certain features of embodiments of the present invention may be shown in only certain figures, such features can be incorporated into other embodiments shown in other figures while remaining within the scope of the present invention. In addition, unless otherwise specified, none of the steps of the methods of the present invention are confined to any particular order of performance. Modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art and such modifications are within the scope of the present invention. Furthermore, all references cited herein are incorporated by reference in their entirety.

Claims (20)

What is claimed is:
1. A method for correcting motion during medical imaging, the method comprising:
tracking annihilation photons produced by one of (i) external emitting sources placed onto a body of a person being imaged or (ii) an object of interest in the body;
generating motion information based on the tracking; and
forming a motion-corrected image from recorded image data using the motion information.
2. The method of claim 1, wherein the motion information includes motion vectors between static regions corresponding to locations of limited or no motion.
3. The method of claim 2, further comprising:
discarding portions of the recorded image data that are associated with non-static regions.
4. The method of claim 2, wherein the motion-corrected image is formed by transforming the static regions to a reference position.
5. The method of claim 4, wherein annihilation photons produced by the external emitting sources are tracked, and wherein the transforming includes calculating transformation matrices from centroid locations of the external emitting sources.
6. The method of claim 1, wherein the motion information includes a respiratory waveform.
7. The method of claim 6, further comprising:
inserting gating tags into the recorded image data based on the respiratory waveform.
8. The method of claim 7, wherein the gating tags are inserted at local maxima for each of a plurality of respiratory cycles.
9. The method of claim 7, further comprising:
applying amplitude or phase-based gating to the recorded image data based on the inserted gating tags.
10. The method of claim 1, wherein annihilation photons produced by the object of interest are tracked, and wherein the object of interest is a lesion or an edge of an anatomical surface.
11. A system for correcting motion during medical imaging, the system comprising:
a detector device that tracks annihilation photons produced by one of (i) external emitting sources placed onto a body of a person being imaged or (ii) an object of interest in the body; and
a hardware processor that generates motion information based on the tracking, and forms a motion-corrected image from recorded image data using the motion information.
12. The system of claim 11, wherein the motion information includes motion vectors between static regions corresponding to locations of limited or no motion.
13. The system of claim 12, wherein the processor discards portions of the recorded image data that are associated with non-static regions.
14. The system of claim 12, wherein the processor forms the motion-corrected image by transforming the static regions to a reference position.
15. The system of claim 14, wherein the detector device tracks annihilation photons produced by the external emitting sources, and wherein the transforming includes calculating transformation matrices from centroid locations of the external emitting sources.
16. The system of claim 11, wherein the motion information includes a respiratory waveform.
17. The system of claim 16, wherein the processor inserts gating tags into the recorded image data based on the respiratory waveform.
18. The system of claim 17, wherein the gating tags are inserted at local maxima for each of a plurality of respiratory cycles.
19. The system of claim 17, wherein the processor applies amplitude or phase-based gating to the recorded image data based on the inserted gating tags.
20. The system of claim 11, wherein the detector device tracks annihilation photons produced by the object of interest, and wherein the object of interest is a lesion or an edge of an anatomical surface.
US15/174,130 2015-06-05 2016-06-06 Motion correction for PET medical imaging based on tracking of annihilation photons Expired - Fee Related US10255684B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/174,130 US10255684B2 (en) 2015-06-05 2016-06-06 Motion correction for PET medical imaging based on tracking of annihilation photons

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562171489P 2015-06-05 2015-06-05
US15/174,130 US10255684B2 (en) 2015-06-05 2016-06-06 Motion correction for PET medical imaging based on tracking of annihilation photons

Publications (2)

Publication Number Publication Date
US20160358334A1 true US20160358334A1 (en) 2016-12-08
US10255684B2 US10255684B2 (en) 2019-04-09

Family

ID=57452252

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/174,130 Expired - Fee Related US10255684B2 (en) 2015-06-05 2016-06-06 Motion correction for PET medical imaging based on tracking of annihilation photons

Country Status (1)

Country Link
US (1) US10255684B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10075640B2 (en) * 2015-12-31 2018-09-11 Sony Corporation Motion compensation for image sensor with a block based analog-to-digital converter
CN110507336A (en) * 2019-07-23 2019-11-29 广东省医疗器械研究所 A kind of personalized method for cervical vertebra monitoring and correction
US10664979B2 (en) 2018-09-14 2020-05-26 Siemens Healthcare Gmbh Method and system for deep motion model learning in medical images
JP2022513233A (en) * 2018-12-17 2022-02-07 シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド Automatic motion correction during PET imaging
US11270434B2 (en) * 2019-10-07 2022-03-08 Siemens Medical Solutions Usa, Inc. Motion correction for medical image data
US11894126B1 (en) * 2023-02-24 2024-02-06 Ix Innovation Llc Systems and methods for tracking movement of a wearable device for advanced image stabilization

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102695A1 (en) * 2002-11-25 2004-05-27 Stergios Stergiopoulos Method and device for correcting organ motion artifacts in MRI systems
US20050123183A1 (en) * 2003-09-02 2005-06-09 Paul Schleyer Data driven motion correction for nuclear imaging
US20070265528A1 (en) * 2006-04-10 2007-11-15 Tong Xu Method and apparatus for real-time tumor tracking
US20070280508A1 (en) * 2006-05-19 2007-12-06 Ernst Thomas M Motion tracking system for real time adaptive imaging and spectroscopy
US20090253980A1 (en) * 2008-04-08 2009-10-08 General Electric Company Method and apparatus for determining the effectiveness of an image transformation process
US20110293143A1 (en) * 2009-02-17 2011-12-01 Koninklijke Philips Electronics N.V. Functional imaging
US20120051664A1 (en) * 2010-08-31 2012-03-01 General Electric Company Motion compensation in image processing
US8224056B2 (en) * 2009-12-15 2012-07-17 General Electronic Company Method for computed tomography motion estimation and compensation
US20130079626A1 (en) * 2011-09-26 2013-03-28 Andriy Shmatukha Systems and methods for automated dynamic contrast enhancement imaging
US20130287278A1 (en) * 2011-01-05 2013-10-31 Koninklijke Philips Electronics N.V. Method and apparatus to detect and correct motion in list-mode pet data with a gated signal
US20140133717A1 (en) * 2011-06-21 2014-05-15 Koninklijke Philips N.V. Respiratory motion determination apparatus
US20150134261A1 (en) * 2013-11-14 2015-05-14 J. Michael O'Connor Synchronization of patient motion detection equipment with medical imaging systems
US20150302613A1 (en) * 2014-04-16 2015-10-22 Siemens Medical Solutions Usa, Inc. Method To Compensate Gating Effects On Image Uniformity And Quantification For PET Scan With Continuous Bed Motion
US20160095565A1 (en) * 2014-10-01 2016-04-07 Siemens Aktiengesellschaft Method and imaging system for compensating for location assignment errors in pet data occurring due to a cyclical motion of a patient

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0324374D0 (en) * 2003-10-17 2003-11-19 Hammersmith Imanet Ltd Method of, and software for, conducting motion correction for a tomographic scanner
US8170302B1 (en) * 2005-09-30 2012-05-01 Ut-Battelle, Llc System and method for generating motion corrected tomographic images
EP2109399B1 (en) * 2007-02-07 2014-03-12 Koninklijke Philips N.V. Motion estimation in treatment planning
US20160247293A1 (en) * 2015-02-24 2016-08-25 Brain Biosciences, Inc. Medical imaging systems and methods for performing motion-corrected image reconstruction
US9606245B1 (en) * 2015-03-24 2017-03-28 The Research Foundation For The State University Of New York Autonomous gamma, X-ray, and particle detector

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040102695A1 (en) * 2002-11-25 2004-05-27 Stergios Stergiopoulos Method and device for correcting organ motion artifacts in MRI systems
US20050123183A1 (en) * 2003-09-02 2005-06-09 Paul Schleyer Data driven motion correction for nuclear imaging
US20070265528A1 (en) * 2006-04-10 2007-11-15 Tong Xu Method and apparatus for real-time tumor tracking
US20070280508A1 (en) * 2006-05-19 2007-12-06 Ernst Thomas M Motion tracking system for real time adaptive imaging and spectroscopy
US20090253980A1 (en) * 2008-04-08 2009-10-08 General Electric Company Method and apparatus for determining the effectiveness of an image transformation process
US20110293143A1 (en) * 2009-02-17 2011-12-01 Koninklijke Philips Electronics N.V. Functional imaging
US8224056B2 (en) * 2009-12-15 2012-07-17 General Electric Company Method for computed tomography motion estimation and compensation
US20120051664A1 (en) * 2010-08-31 2012-03-01 General Electric Company Motion compensation in image processing
US20130287278A1 (en) * 2011-01-05 2013-10-31 Koninklijke Philips Electronics N.V. Method and apparatus to detect and correct motion in list-mode pet data with a gated signal
US20140133717A1 (en) * 2011-06-21 2014-05-15 Koninklijke Philips N.V. Respiratory motion determination apparatus
US20130079626A1 (en) * 2011-09-26 2013-03-28 Andriy Shmatukha Systems and methods for automated dynamic contrast enhancement imaging
US20150134261A1 (en) * 2013-11-14 2015-05-14 J. Michael O'Connor Synchronization of patient motion detection equipment with medical imaging systems
US20150302613A1 (en) * 2014-04-16 2015-10-22 Siemens Medical Solutions Usa, Inc. Method To Compensate Gating Effects On Image Uniformity And Quantification For PET Scan With Continuous Bed Motion
US20160095565A1 (en) * 2014-10-01 2016-04-07 Siemens Aktiengesellschaft Method and imaging system for compensating for location assignment errors in pet data occurring due to a cyclical motion of a patient

Non-Patent Citations (3)

Title
Büther, F., Ernst, I., Hamill, J., Eich, H. T., Schober, O., Schäfers, M., & Schäfers, K. P. (2013). External radioactive markers for PET data-driven respiratory gating in positron emission tomography. European journal of nuclear medicine and molecular imaging, 40(4), 602-614. *
Harteela, M., Hirvi, H., Mäkipää, A., Teuho, J., Koivumäki, T., Mäkelä, M. M., & Teräs, M. (2014). Comparison of end-expiratory respiratory gating methods for PET/CT. Acta Oncologica, 53(8), 1079-1085. *
Nehmeh, S. A., Erdi, Y. E., Rosenzweig, K. E., Schoder, H., Larson, S. M., Squire, O. D., & Humm, J. L. (2003). Reduction of respiratory motion artifacts in PET imaging of lung cancer by respiratory correlated dynamic PET: methodology and comparison with respiratory gated PET. Journal of Nuclear Medicine, 44(10), 1644-1648. *

Cited By (7)

Publication number Priority date Publication date Assignee Title
US10075640B2 (en) * 2015-12-31 2018-09-11 Sony Corporation Motion compensation for image sensor with a block based analog-to-digital converter
US10664979B2 (en) 2018-09-14 2020-05-26 Siemens Healthcare Gmbh Method and system for deep motion model learning in medical images
JP2022513233A (en) * 2018-12-17 2022-02-07 シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド Automatic motion correction during PET imaging
JP7238134B2 (en) 2018-12-17 2023-03-13 シーメンス メディカル ソリューションズ ユーエスエー インコーポレイテッド Automatic motion compensation during PET imaging
CN110507336A (en) * 2019-07-23 2019-11-29 广东省医疗器械研究所 Personalized cervical vertebra monitoring and correction method
US11270434B2 (en) * 2019-10-07 2022-03-08 Siemens Medical Solutions Usa, Inc. Motion correction for medical image data
US11894126B1 (en) * 2023-02-24 2024-02-06 Ix Innovation Llc Systems and methods for tracking movement of a wearable device for advanced image stabilization

Also Published As

Publication number Publication date
US10255684B2 (en) 2019-04-09

Similar Documents

Publication Publication Date Title
US10255684B2 (en) Motion correction for PET medical imaging based on tracking of annihilation photons
US9414773B2 (en) Respiratory motion determination apparatus
US9451926B2 (en) Respiratory motion correction with internal-external motion correlation, and associated systems and methods
JP6243121B2 (en) Method and apparatus for motion detection and correction in imaging scans using time-of-flight information
Büther et al. Detection of respiratory tumour motion using intrinsic list mode-driven gating in positron emission tomography
JP5947813B2 (en) Method and apparatus for detecting and correcting motion in list-mode PET data with a gated signal
Olesen et al. List-mode PET motion correction using markerless head tracking: proof-of-concept with scans of human subject
US20040258286A1 (en) Systems and methods for retrospective internal gating
US20080287772A1 (en) Motion Compensation in PET Reconstruction
US9579070B2 (en) Optimal respiratory gating in medical imaging
US8658979B2 (en) Nuclear image reconstruction
JP5389907B2 (en) Geometric transformations that maintain list mode format
EP2575616B1 (en) Amplitude/slope-based motion phase mapping
Visvikis et al. Respiratory motion in positron emission tomography for oncology applications: Problems and solutions
Feng et al. Real-time data-driven rigid motion detection and correction for brain scan with listmode PET
KR20140042461A (en) Method and apparatus to correct motion
US20230022425A1 (en) Apparatus, system, method and computer probram for providing a nuclear image of a region of interest of a patient
CN110215226B (en) Image attenuation correction method, image attenuation correction device, computer equipment and storage medium
Goddard et al. Non-invasive PET head-motion correction via optical 3D pose tracking
US20230008263A1 (en) Motion compensation of positron emission tomographic data
van den Hoff et al. Motion Compensation in Emission Tomography
Woo et al. Development of a motion correction system for respiratory-gated PET study
Woo et al. Motion correction of respiratory-gated PET/CT images using polynomial warping
Breuilly et al. Image-based motion detection in 4D images and application to respiratory motion suppression
van den Hoff et al. Emission Tomography

Legal Events

Date Code Title Description
AS Assignment

Owner name: UNIVERSITY OF TENNESSEE RESEARCH FOUNDATION, TENNESSEE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OSBORNE, DUSTIN R.;HU, DONGMING;LEE, SANG HYEB;SIGNING DATES FROM 20160609 TO 20160612;REEL/FRAME:039126/0046

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230409