US20070049827A1 - Clinical feedback of ablation efficacy during ablation procedure - Google Patents

Clinical feedback of ablation efficacy during ablation procedure

Info

Publication number
US20070049827A1
US20070049827A1 (application US 11/312,023)
Authority
US
United States
Prior art keywords
ultrasound
data
physiology
image
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/312,023
Inventor
Brenda Donaldson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Priority to US11/312,023
Assigned to GENERAL ELECTRIC COMPANY reassignment GENERAL ELECTRIC COMPANY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DONALDSON, BRENDA L.
Publication of US20070049827A1
Current legal status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/021 Measuring pressure in heart or blood vessels
    • A61B 5/0215 Measuring pressure in heart or blood vessels by means inserted into the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/25 Bioelectric electrodes therefor
    • A61B 5/279 Bioelectric electrodes therefor specially adapted for particular uses
    • A61B 5/28 Bioelectric electrodes therefor specially adapted for particular uses for electrocardiography [ECG]
    • A61B 5/283 Invasive
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B 8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0883 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/54 Control of the diagnostic device
    • A61B 8/543 Control of the diagnostic device involving acquisition triggered by a physiological signal

Definitions

  • Embodiments of the present invention generally relate to methods and systems for combining physiology information with ultrasound based anatomic structures. More particularly, embodiments relate to methods and systems that construct a 2D or 3D representation of an anatomical structure based on ultrasound data and superimpose thereon graphical information representative of physiologic characteristics of the anatomic structure in real-time or near real-time during an ablation procedure.
  • Ablation procedures are used to treat various conditions including atrial fibrillation, atrial flutter, A-V nodal re-entrant tachycardia, bypass tract tachycardia, ventricular tachycardia and others.
  • Clinicians use catheters which deliver radio frequency, cryo, laser and other forms of energy to destroy selected tissues.
  • Ablation size is determined by various factors including, but not limited to, catheter contact, the angle of the catheter in relation to the selected tissue, blood flow in the area, impedance, and the settings selected by the clinician on various ablation delivery systems.
  • Ablation efficacy requires tissue destruction at the catheter-tissue interface.
  • Ablation is either set to destroy tissue in a selected area or to permanently prevent electrical conduction by preventing a rhythm from crossing a line drawn by ablations, termed a line of block. It is known to provide mapping systems which can record the location of ablations, but such systems do not provide any feedback as to the status of the tissue at the catheter tip.
  • EP: electrophysiology
  • HD: hemodynamic
  • ablation procedures are carried out through the use of, among other things, EP catheters, HD catheters and mapping sensors.
  • the procedure room also includes a fluoroscopy system, a diagnostic ultrasound system, a patient monitoring device and an ablation system.
  • the ultrasound system may utilize a variety of probes, such as ultrasound catheters, transesophageal probes, surface probes and the like.
  • the ultrasound system may be used before, during or after an ablation procedure to monitor the position of the EP catheters and/or ablation catheters.
  • the mapping system is utilized with physiology catheters to detect and record desired physiologic parameters.
  • the mapping system includes equipment to monitor and track the position of a mapping catheter, from which a map is created of the region of interest.
  • mapping catheter positioned in a heart chamber that may include passive and active electrode sites.
  • the active electrode sites impose an electric field within the chamber.
  • the blood volume and wall motion modulate the electric field that is detected by passive electrode sites on the catheter.
  • Electrophysiology measurements and geometric measurements are taken from the catheter and used to construct a map and to display intrinsic heart activity.
  • Another type of conventional mapping system utilizes an external imaging modality such as ultrasound, SPECT, PET, MRI, CT system that is positioned external to the patient to capture a 3D image of the heart.
  • the diagnostic image is captured before the heart is mapped.
  • the mapping system utilizes data obtained from the catheter to generate a geometric map, with which the diagnostic image is then registered.
  • mapping, ablation, physiology and ultrasound systems include separate computers, monitors, and user interfaces, all of which are mounted on separate chassis.
  • a method for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site.
  • the method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site.
  • the method further includes generating an ultrasound image based on the ultrasound data.
  • the ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane.
  • the method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane.
  • the method further includes forming and saving a display image by combining the ultrasound image and physiologic data.
  • the method further includes detecting a change in the subject of interest proximate the ablation site.
  • the method further includes forming, in real-time or near real-time, a second display image combining the second physiology data, and saving the second image in the electrophysiology recording system.
  • Another embodiment includes comparing the first and second display images of the combined ultrasound image and physiology data, wherein the efficacy of the ablation procedure can be determined.
  • a method for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site.
  • the method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site.
  • the method further includes generating an ultrasound image based on the ultrasound data.
  • the ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane.
  • the method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane.
  • the method further includes forming and saving a display image by combining the ultrasound image and physiologic data.
  • the method further includes detecting a change in the subject of interest proximate the ablation site.
  • the method further includes receiving second physiology signals from the physiology catheter located proximate the ablation site and, based thereon, producing second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane.
  • the method further includes forming, in real-time or near real-time, a second display image combining the second physiology data, and saving the second image in the electrophysiology recording system. The method may also include tracking positions of the ultrasound probe and physiology catheter, generating tracking information denoting those positions with respect to a common reference coordinate system, and registering the ultrasound image and physiology data within that common coordinate reference system.
  • a method for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site.
  • the method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site.
  • the method further includes generating an ultrasound image based on the ultrasound data.
  • the ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane.
  • the method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane.
  • the method further includes forming and saving a display image by combining the ultrasound image and physiologic data.
  • the method further includes detecting a change in the subject of interest proximate the ablation site.
  • the method further includes forming, in real-time or near real-time, a second display image combining the second physiology data, and saving the second image in the electrophysiology recording system.
  • FIG. 1 illustrates a block diagram of a physiology system formed in accordance with an embodiment of the present invention.
  • FIG. 2 illustrates a block diagram of the functional modules, within the ultrasound processor module, that are utilized to carry out ultrasound mid-processing operations in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates a block diagram of the functional modules, within the display processor module, that are utilized to carry out the display processing operations in accordance with an embodiment of the present invention.
  • FIG. 4 illustrates a flowchart of the process to acquire, register and display ultrasound images in combination with physiology data in real-time or near real-time during an ablation procedure.
  • FIG. 5 illustrates an exemplary application by which ultrasound data and physiology data may be acquired in connection with an electrophysiology procedure within a subject of interest.
  • Complications such as complete heart block, myocardial infarctions and pulmonary vein stenosis, among others can occur when ablation lesions are larger in circumference or deeper in the tissue than intended. Conversely, if the lesions are not complete in critical areas, the patient may receive no benefit from the ablation procedure.
  • ultrasound systems have the capacity to perform tissue characterization.
  • When conductive tissue is ablated, the character of the tissue changes from muscle, neural fibers, etc., to non-conducting tissue that will eventually result in scar tissue.
  • an ultrasound physiology system may be used to gather tissue characterization data at the ablation site.
  • the system analyzes the information and provides the clinician with the depth and diameter of fully ablated tissue as well as the area of injury surrounding the ablation.
  • the data may be displayed to the clinician numerically and/or as a visual indicator. For instance, the clinician may see a notation in the log of the recording system stating that the ablation destroyed tissue. Alternatively, the system may color-code the location of the ablation, with an assigned color designating success and a different color designating inefficacy, with these indicators being displayed over an image previously acquired using, for example, an MR or CT system, or overlaid onto a map created by a separate 3-D mapping system.
  • FIG. 1 illustrates a physiology system 10 formed in accordance with an embodiment of the present invention.
  • a system controller 8 manages the overall interaction and operation of the various modules, accessories and the like.
  • the physiology system 10 includes a beam former module 12 configured to be joined with one or more ultrasound probes 14 - 16 .
  • ultrasound probes may include an intravascular ultrasound (IVUS) catheter 14, an intracardiac echocardiography (ICE) catheter, a transesophageal probe 15, an interventional probe, an ultrasound surface probe 16 and the like.
  • the beam former module 12 controls transmit and receive operations to and from the probes 14 - 16 .
  • a physiology signal processing module 20 is provided and joined with one or more catheters 22 - 24 .
  • catheters include a basket catheter 22, a multi-pole electrophysiology catheter 23 (e.g. a 4-pole, 8-pole, 10-pole, 20-pole and the like), a hemodynamic catheter 24 and the like.
  • the beam former module 12 processes radio frequency (RF) echo signals from one or more of probes 14 - 16 and produces therefrom I, Q data pairs associated with each data sample within a scan plane through the region of interest.
  • the beam former module 12 may supply the I, Q data pairs directly to the ultrasound processor module 30 .
  • the beam former module 12 may store the collection of I, Q data pairs defining the sample points within a single scan plane in the ultrasound data memory 38 as raw ultrasound data.
  • the ultrasound data memory 38 stores the I, Q data pairs for individual scan planes as two dimensional data sets, or alternatively for collections of scan planes as three dimensional data sets.
  • the ultrasound processor module 30 processes the raw I, Q data pairs, as explained below in more detail, to form ultrasound images (2D or 3D).
  • the ultrasound processor module 30 may form B-mode images, color flow images, power Doppler images, spectral Doppler images, M-mode images, ARFI images, strain images, strain rate images and the like.
  • the ultrasound images contain ultrasound image data representing voxels associated with data samples from the region of interest, where the ultrasound image data may be defined in Cartesian or polar coordinates.
  • the ultrasound images may be stored individually as two dimensional data sets. Alternatively, collections of ultrasound images may be stored as three dimensional data sets.
  • the beam former module 12 and ultrasound processor module 30 process the signals from the ultrasound probe in real-time during a physiology procedure so that the display 48 can display and continuously update the ultrasound image in real-time during the physiology procedure.
  • the ultrasound processor module may generate new ultrasound images at a frame rate of at least seven frames per second such that the display processor module is able to update the ultrasound image information within the displayed image at a frame rate of at least seven frames per second.
  • the frame rate at which new ultrasound images are generated and displayed may be increased to 16, 32 or 64 frames per second or higher.
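  • As a rough illustration of the B-mode path from I, Q data pairs to displayable image data, the processing can be sketched as envelope detection followed by log compression. This is a minimal sketch, not the patent's implementation; the function name, normalization and dynamic-range parameter are assumptions.

```python
import numpy as np

def b_mode_from_iq(i, q, dynamic_range_db=60.0):
    """Form B-mode image data from I, Q sample pairs.

    Envelope detection followed by log compression -- an illustrative
    sketch of the kind of processing a B-mode module performs.
    """
    envelope = np.sqrt(i**2 + q**2)        # magnitude of each I/Q pair
    envelope = envelope / envelope.max()   # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(envelope, 1e-12))
    # Clip to the display dynamic range and map to 8-bit gray levels.
    db = np.clip(db, -dynamic_range_db, 0.0)
    return ((db + dynamic_range_db) / dynamic_range_db * 255).astype(np.uint8)
```

A real mid-processor would also apply gain compensation and filtering before compression; those stages are omitted here for brevity.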
  • the physiology signal processor 20 passively and/or actively operates upon one or more of the catheters 22 - 24 to measure physiology signals.
  • the physiology signal processor module 20 receives physiology signals from one or more of the catheters 22 - 24 and produces physiology data representative of the physiologic activity of a portion of the regions of interest proximate the sensors on the corresponding catheter 22 - 24 .
  • the physiology data is stored in physiology data memory 40 .
  • ECG leads 26 are provided on the surface of the subject and produce ECG signals that are received by the physiology signal processor module 20 and/or to a cardiac cycle detection module 28 .
  • the cardiac cycle detection module 28 monitors the cardiac activity denoted by the ECG signals and generates therefrom timing information representative of cyclical points in the subject's cardiac cycle.
  • the timing information is provided to the physiology signal processor module 20 and to the ultrasound processor module 30 .
  • intracardiac signals obtained from EP catheters may provide the cardiac cycle detection signal.
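  • The timing information derived by the cardiac cycle detection module can be sketched as simple R-wave peak detection on the ECG trace. This is an illustrative threshold-plus-refractory detector, not the module's actual algorithm; production detectors (e.g. Pan-Tompkins) are more robust, and all parameter values here are assumptions.

```python
import numpy as np

def r_wave_peaks(ecg, fs, threshold_frac=0.6, refractory_s=0.2):
    """Locate R-wave peaks in an ECG trace to derive cardiac-cycle timing.

    A sample is a peak if it exceeds a fraction of the global maximum,
    is a local maximum, and falls outside the refractory window of the
    previous detection.
    """
    threshold = threshold_frac * np.max(ecg)
    refractory = int(refractory_s * fs)    # minimum samples between beats
    peaks, last = [], -refractory
    for n in range(1, len(ecg) - 1):
        if (ecg[n] > threshold and ecg[n] >= ecg[n - 1]
                and ecg[n] > ecg[n + 1] and n - last >= refractory):
            peaks.append(n)
            last = n
    return peaks
```

The peak indices, divided by the sampling rate, give the cyclical time points (e.g. "X seconds following the peak of the R-wave") used to gate acquisition.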
  • a position tracking module 32 is joined with a series of detectors 34 that may operate as transmitters and/or receivers.
  • the position tracking module 32 may also receive position information from one or more of the ultrasound probes 14 - 16 and/or physiology catheters 22 - 24 .
  • the ultrasound probes 14 - 16 are each provided with first and second reference point elements (denoted RP 1 and RP 2 on each probe and catheter).
  • the reference point elements may represent transmitters and/or receivers configured to transmit or receive acoustic energy, radio frequency energy, electromagnetic energy and the like.
  • only a single reference point element or sensor may be provided on one or more of the probes and catheters. Examples of conventional sensor configurations and detector systems are described in U.S.
  • the position tracking module 32 generates tracking information defining the position of each ultrasound probe and each physiology catheter with respect to a common reference coordinate system.
  • the position information may include XYZ coordinates for each reference point element within a common three-dimensional Cartesian coordinate system.
  • the position information may be defined in polar coordinate within a common three-dimensional polar coordinate system.
  • the tracking information may uniquely identify each reference point element, such as through a unique transmit signature and the like.
  • the position tracking module 32 may include a relational table containing an ID for each reference point element uniquely associated with probe/catheter descriptive information (e.g. the serial number, type, dimensions, shape and the like).
  • the tracking information may also include orientation information (e.g. pitch roll and yaw) describing the orientation of a reference axis 17 of a probe or catheter relative to the reference coordinate system.
  • the position tracking module 32 repeatedly monitors and tracks the reference point element, to generate a continuous stream of coordinate position data sets, wherein a single combination of XYZ values represent a single coordinate position data set.
  • the position tracking module 32 may record, with each coordinate position data set, a time stamp indicating a time at which the coordinate position data set was obtained.
  • the time stamp may be defined by a system clock 36 that also provides reference timing information to the physiology signal processor module 20 and ultrasound processor module 30 .
  • the time stamp may be defined with respect to the cardiac cycle of the patient (e.g. X seconds following/preceding the peak of the R-wave).
  • cardiac cycle timing information is provided by the cardiac cycle detection module 28 to each of the physiology signal processor module 20 , ultrasound processor module 30 and position tracking module 32 .
  • the position tracking module 32 may provide the position information, orientation information and timing information (collectively referred to as “tracking information”) to the physiology and ultrasound processor modules 20 and 30 .
  • tracking information When the tracking information is provided to the ultrasound processor module 30 , the ultrasound processor module 30 stores the tracking information with the ultrasound image in the ultrasound data memory 38 .
  • the tracking information uniquely identifies the time at which the ultrasound image was acquired, as well as the position and/or orientation of the ultrasound probe 14 - 16 at the time of acquisition.
  • the physiology processor module 20 records the tracking information with the physiology data in the physiology data memory 40 .
  • the tracking information uniquely identifies the time at which the physiology data was acquired, as well as the position and/or orientation of the physiology catheter(s) 22 - 24 at the time of acquisition.
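  • The tracking information recorded with each acquisition can be pictured as a small time-stamped record per reference point element. The structure and field names below are illustrative assumptions, not taken from the patent.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TrackingRecord:
    """One coordinate position data set with its time stamp.

    Illustrative sketch: element ID, XYZ position in the common
    reference coordinate system, orientation of the probe/catheter
    reference axis, and the acquisition time from the system clock.
    """
    element_id: str
    xyz: tuple          # (x, y, z) coordinates
    orientation: tuple  # (pitch, roll, yaw)
    timestamp: float = field(default_factory=time.time)
```

Streams of such records, one per reference point element, let later stages match ultrasound images and physiology data acquired at the same instant.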
  • a registration module 42 accesses the ultrasound and physiology data memories 38 and 40 to obtain one or more ultrasound images and related physiology data sets acquired at the same point(s) in time.
  • the ultrasound images and associated physiology data sets are identified from memories 38 and 40 based on the recorded time stamps.
  • the registration module 42 transforms one or both of the ultrasound image and physiology data into a common coordinate system and stores the results in a common data memory 44 .
  • the registration module 42 may map the physiology data set into the coordinate system defined by the ultrasound images as stored in the ultrasound data memory 38 .
  • the registration module 42 may map the ultrasound images into the coordinate system defined by the physiology data sets as stored in the physiology data memory 40 .
  • the registration module 42 may transform both the ultrasound images and physiology data sets into a new coordinate system.
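  • The coordinate mapping performed by the registration module can be sketched as a rigid-body transform applied to one data set's points. How the rotation and translation are obtained (e.g. from the tracked reference points) is assumed here; the function is an illustrative sketch, not the module's actual method.

```python
import numpy as np

def register_points(points, rotation, translation):
    """Map points from one coordinate system into another.

    Applies a rigid-body transform: each N x 3 point set is rotated by
    the 3 x 3 rotation matrix and shifted by the translation vector,
    e.g. mapping physiology catheter positions into the ultrasound
    image's coordinate system.
    """
    points = np.asarray(points, dtype=float)      # N x 3 XYZ coordinates
    return points @ np.asarray(rotation).T + np.asarray(translation)
```

Mapping both data sets into a third, new coordinate system is just two such calls with the respective transforms.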
  • a display processor module 46 accesses the common data memory 44 to obtain select combinations of ultrasound images and physiology data sets for presentation on display 48 .
  • the display processor module may form a display image combining the ultrasound image and physiology data set, such that the physiology data is mapped on to an anatomical structure contained in, and defined by, the ultrasound image.
  • the display processor module 46 may access a lookup table 50 that is stored as part of, or separate from, the common data memory 44 to define display characteristics, such as transparency, opacity, color, brightness and the like, for individual display pixels defining the resultant display image.
  • the lookup table 50 may be used to define data samples or voxels within the ultrasound image through one of gray scale and color information, and to define the physiology data through the other of gray scale and color information.
  • one combination or range of colors may be designated to denote ultrasound information, while a separate combination or range of colors may be designated to denote physiology data.
  • the brightness, intensity or opacity of each pixel in the display image may be varied in accordance with one or both of the value of the ultrasound information and the value of the physiology data.
  • the ultrasound image may be defined by B-mode data values for each data point or voxel, while the physiology data associated with the data point or voxel may be defined by one or more colors within a range of colors (e.g., ranging from blue to red, or ranging from light blue to dark blue, or ranging from light red to dark red).
  • the ultrasound image may be defined by non B-mode data values, such as anatomic M-mode, strain or strain rate characteristics of the anatomic structure, with the strain or strain rate being represented in the display image by discrete colors within a range of colors (e.g., ranging from blue to red, or ranging from light blue to dark blue, or ranging from light red to dark red).
  • the physiology data may be represented through variations of the brightness at each display pixel.
  • a user interface 52 to is provided to control the overall operation of the physiology system 10 .
  • the user interface 52 may include, among other things, a keyboard, mouse and/or trackball.
  • the user interface 52 may permit an operator to designate a portion of the ultrasound image, for which physiologic data is of interest.
  • the display processor module 46 and/or physiology signal processor module 20 may then generate a separate physiology graph to be displayed independent and distinct from the ultrasound image.
  • the display 48 may present an ultrasound image as a B-mode sector scan, with one or more points of interest on the B-mode sector scan designated.
  • a separate graph may be co-displayed on display 48 with the ultrasound B-mode image.
  • FIG. 2 illustrates an exemplary block diagram of the ultrasound processor module 33 of FIG. 1 formed in accordance with an embodiment of the present invention.
  • the operations of the modules illustrated in FIG. 2 may be controlled by a local ultrasound controller 87 or by the system controller 8 .
  • the modules 49 - 59 perform mid-processor operations.
  • the ultrasound processor module 30 obtains ultrasound data 21 from the ultrasound data memory 38 or the beam former module 12 ( FIG. 1 ).
  • the received ultrasound data 21 constitutes I, Q data pairs representing the real and imaginary components associated with each data sample.
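The I, Q pairs referenced above are typically produced by quadrature demodulation of the RF echo signal against the transducer center frequency. The text does not give the math, so the following is a minimal NumPy sketch of one common approach; the moving-average low-pass filter and the function name are illustrative choices, not the patent's implementation.

```python
import numpy as np

def demodulate_iq(rf, fs, f0, taps=64):
    """Quadrature-demodulate one RF A-line into I, Q baseband samples.

    rf : 1-D array of RF echo samples
    fs : sampling frequency (Hz)
    f0 : transducer center frequency (Hz)
    """
    n = np.arange(rf.size)
    # Mix down to baseband with quadrature carriers at f0.
    i_mix = rf * np.cos(2 * np.pi * f0 * n / fs)
    q_mix = -rf * np.sin(2 * np.pi * f0 * n / fs)
    # Simple moving-average low-pass filter to remove the 2*f0 component.
    lp = np.ones(taps) / taps
    i = np.convolve(i_mix, lp, mode="same")
    q = np.convolve(q_mix, lp, mode="same")
    return i, q
```

For a pure tone at the center frequency, the recovered I/Q magnitude settles at half the RF amplitude, which is the expected behaviour of this demodulation scheme.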
  • the I, Q data pairs are provided to an ARFI module 49 , a color-flow module 51 , a power Doppler module 53 , a B-mode module 55 , a spectral Doppler module 57 and M-mode module 59 .
  • other modules may be included such as a strain module, a strain rate module and the like.
  • Each of modules 49-59 processes the I, Q data pairs in a corresponding manner to generate ARFI data 60 , color-flow data 61 , power Doppler data 63 , B-mode data 65 , spectral Doppler data 67 , and M-mode data 69 , all of which are stored in ultrasound data memory 38 .
  • the ultrasound data memory 38 may be divided such that the raw I, Q data pairs are stored in raw data memory, while the processed image data is stored in separate image data memory.
  • the ARFI, color-flow, power Doppler, B-mode, spectral Doppler and M-mode data 60 - 69 may be stored as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system.
  • FIG. 3 illustrates an exemplary block diagram of the display processor module 46 of FIG. 1 formed in accordance with an embodiment of the present invention.
  • the operations of the modules illustrated in FIG. 3 may be controlled by the local ultrasound controller 87 or by the system controller 8 .
  • the modules 73 , 77 and 81 perform display-processor operations.
  • a scan converter module 73 reads from memory 44 the vector data values associated with one or more image frames and converts the set of vector data values to Cartesian coordinates to generate a display image frame 75 formatted for display.
  • the ultrasound image frames 75 generated by scan converter module 73 may be passed to a temporary area in memory 44 for subsequent processing or may be passed directly to the 2D video processor module 77 or the 3D processor module 81 .
  • the scan converter obtains B-mode vector data sets for images stored in memory 44 .
  • the B-mode vector data is interpolated where necessary and converted into the X,Y format for video display to produce ultrasound image frames.
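The polar-to-Cartesian scan conversion described above can be sketched as a nearest-neighbour lookup from each output pixel back into the (angle, depth) vector grid. This simplifies the interpolation the text mentions; the grid sizes and function name are illustrative.

```python
import numpy as np

def scan_convert(vectors, angles, depths, nx=200, ny=200):
    """Nearest-neighbour scan conversion of sector-scan vector data.

    vectors : 2-D array, shape (n_angles, n_samples), one row per beam
    angles  : beam steering angles in radians (centred on 0, increasing)
    depths  : sample depths along each beam (same units as the output grid)
    Returns an (ny, nx) Cartesian image; pixels outside the sector are 0.
    """
    x = np.linspace(-depths[-1], depths[-1], nx)
    y = np.linspace(0, depths[-1], ny)
    xx, yy = np.meshgrid(x, y)
    r = np.hypot(xx, yy)       # radius of each output pixel from the probe
    th = np.arctan2(xx, yy)    # angle measured from the probe axis
    # Map each output pixel back to the nearest (angle, depth) sample.
    ai = np.clip(np.round((th - angles[0]) / (angles[1] - angles[0])).astype(int),
                 0, len(angles) - 1)
    ri = np.clip(np.round((r - depths[0]) / (depths[1] - depths[0])).astype(int),
                 0, len(depths) - 1)
    img = vectors[ai, ri]
    # Blank pixels that fall outside the scanned sector.
    img[(th < angles[0]) | (th > angles[-1]) | (r > depths[-1])] = 0
    return img
```

A production scan converter would interpolate between neighbouring beams and samples rather than snapping to the nearest one, but the coordinate mapping is the same.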
  • the scan converted ultrasound image frames are passed to the video processor module 77 , which maps the video data to grey-scale values for video display.
  • the grey-scale map may represent a transfer function of the raw image data to displayed grey levels.
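One common form such a transfer function takes is logarithmic compression of the envelope amplitude over a fixed dynamic range; this specific mapping is an assumption for illustration, since the text only says the map is a transfer function from raw image data to displayed grey levels.

```python
import numpy as np

def grey_map(envelope, dynamic_range_db=60.0):
    """Map raw envelope amplitudes to 8-bit grey levels by log compression.

    Values more than `dynamic_range_db` below the peak map to 0; the peak
    maps to 255.  The log-compression form and the 60 dB default are
    illustrative choices, not mandated by the text.
    """
    env = np.asarray(envelope, dtype=float)
    db = 20.0 * np.log10(np.maximum(env, 1e-12) / env.max())
    grey = 255.0 * (db + dynamic_range_db) / dynamic_range_db
    return np.clip(grey, 0, 255).astype(np.uint8)
```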
  • the video processor module 77 controls the display 48 to display the image frame in real-time.
  • the B-mode image displayed in real-time is produced from an image frame of data in which each datum indicates the intensity or brightness of a respective pixel in the display.
  • the display image represents the tissue and/or blood flow in a plane through the region of interest being imaged.
  • the color-flow module 51 may be utilized to provide real-time two-dimensional images of blood velocity in the imaging plane.
  • the frequency of sound waves reflected from the inside of the blood vessels, heart cavities, etc. is shifted in proportion to the velocity of the blood cells: positively shifted for cells moving toward the transducer and negatively shifted for cells moving away from the transducer.
  • the blood velocity is calculated by measuring the phase shift from firing to firing at a specific range gate. Mean blood velocities from multiple vector positions and multiple range gates along each vector are calculated, and a two-dimensional image is made from this information.
  • the color-flow module 51 receives the complex I, Q data pairs from the beamformer module 12 and processes the I, Q data pairs to calculate the mean blood velocity, variance (representing blood turbulence) and total pre-normalized power for all sample volumes within the operator defined region.
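The mean velocity, variance, and power described above are conventionally obtained from the firing-to-firing I, Q ensemble with the lag-one autocorrelation (Kasai) estimator. The text does not name the estimator, so the following sketch assumes that standard technique; the function name and variance proxy are illustrative.

```python
import numpy as np

def color_flow_estimates(iq, prf, f0, c=1540.0):
    """Kasai autocorrelation estimates from an ensemble of I, Q samples.

    iq  : complex array, shape (n_firings,), samples at one range gate
    prf : pulse repetition frequency (Hz)
    f0  : transmit center frequency (Hz)
    c   : speed of sound (m/s)
    Returns (mean velocity in m/s, turbulence proxy in [0, 1], total power).
    """
    r0 = np.mean(np.abs(iq) ** 2)              # total power
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))    # lag-one autocorrelation
    phase = np.angle(r1)                       # mean phase shift per firing
    v = phase * prf * c / (4 * np.pi * f0)     # Doppler equation: fd = 2*v*f0/c
    var = 1.0 - np.abs(r1) / r0                # decorrelation -> turbulence proxy
    return v, var, r0
```

Positive phase shifts (cells moving toward the transducer) yield positive velocities, matching the sign convention stated above; velocities whose per-firing phase shift exceeds pi alias, which is the familiar Nyquist limit of color flow.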
  • the 2D video processor module 77 combines one or more of the frames generated from the different types of ultrasound information and physiologic data.
  • the 2D video processor module 77 may combine a B-mode image frame and a color representation of the physiologic data by mapping the B-mode data to a grey map and mapping the physiologic data to a color map for video display.
  • the color pixel data is superimposed on the grey scale pixel data to form a single multi-mode image frame 79 that may be re-stored in memory 44 or passed over bus 35 to the display 48 .
  • Successive frames of B-mode images, in combination with the associated physiology data, may be stored as a cine loop in memory 44 .
  • the cine loop represents a first in, first out circular image buffer to capture image data that is displayed in real-time to the user.
  • the user may freeze the cine loop by entering a freeze command at the user interface 52 .
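The first-in, first-out circular buffer and freeze behaviour described above can be sketched as follows; the class and method names are illustrative, not taken from the text.

```python
from collections import deque

class CineLoop:
    """First-in, first-out circular image buffer.

    New frames displace the oldest once `capacity` is reached; a freeze
    command stops acquisition so the stored loop can be reviewed.
    """

    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)   # deque enforces FIFO eviction
        self.frozen = False

    def add_frame(self, frame):
        if not self.frozen:
            self.frames.append(frame)

    def freeze(self):
        self.frozen = True

    def review(self):
        return list(self.frames)               # oldest to newest
```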
  • the user interface 52 may include a keyboard, mouse, and the other controls associated with an ultrasound system user interface.
  • the spectral Doppler module 57 ( FIG. 2 ) operates upon the I, Q data pairs by integrating (summing) the data pairs over a specified time interval and then sampling the data pairs.
  • the summing interval and the transmission burst length together define the length of the sample volume which is specified by the user at the user interface 52 .
  • the spectral Doppler module 57 may utilize a wall filter to reject any clutter in the signal which may correspond to stationary or very slow moving tissue.
  • the filter output is then fed into a spectrum analyzer, which may implement a Fast Fourier Transform over a moving time window of samples. Each FFT power spectrum is compressed and then output by the spectral Doppler module 57 to memory 44 .
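The moving-window FFT described above can be sketched as follows; the window length, hop size, Hanning taper, and log compression are illustrative choices standing in for the compression the text mentions.

```python
import numpy as np

def spectrogram_lines(iq, window=64, hop=16):
    """Compute successive FFT power spectra over a moving window of I, Q
    samples, producing one compressed spectral line per hop.

    Returns a 2-D array: rows are time points, columns are Doppler frequency
    bins (fftshift places negative shifts, flow away from the probe, on the
    left half).
    """
    lines = []
    for start in range(0, len(iq) - window + 1, hop):
        seg = iq[start:start + window] * np.hanning(window)
        power = np.abs(np.fft.fftshift(np.fft.fft(seg))) ** 2
        lines.append(np.log1p(power))          # log compression for display
    return np.array(lines)
```

Each row of the result corresponds to one grey-scale spectral line at a particular time point in the velocity-versus-time spectrogram described above.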
  • the 2D video processor module 77 then maps the compressed spectral Doppler data to grey scale values for display on the display 48 as a single spectral line at a particular time point in a Doppler velocity (frequency) versus time spectrogram.
  • the 2D video processor module 77 may similarly map the physiology data into a graph representing electrical potential fluctuation (along the vertical axis) and time (along the horizontal axis).
  • a 3D processor module 81 is also controlled by user interface 52 and accesses memory 44 to obtain spatially consecutive groups of ultrasound image frames and to generate a three-dimensional image representation thereof, such as through volume rendering or surface rendering algorithms.
  • the three dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection and the like.
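Maximum intensity projection, one of the techniques mentioned above, reduces to keeping the brightest voxel along each viewing ray. A minimal sketch for an axis-aligned view (general ray-casting with arbitrary view directions requires resampling, omitted here):

```python
import numpy as np

def maximum_intensity_projection(volume, axis=0):
    """Axis-aligned maximum-intensity projection of a 3-D ultrasound volume:
    each output pixel keeps the brightest voxel along its viewing ray.
    """
    return np.max(volume, axis=axis)
```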
  • FIG. 4 illustrates a processing sequence carried out by the physiology system 10 of FIG. 1 in connection with acquiring, tracking and combining ultrasound and physiology data in real-time or near real-time during an ablation procedure.
  • the position tracking module 32 registers the ultrasound probe 14 - 16 within the position tracking coordinate system.
  • the position tracking module 32 registers the physiology catheters within the position tracking coordinate system.
  • the beam former module 12 acquires RF echo signals from one or more scan planes of the ablation site and generates I, Q data pairs therefrom.
  • the ultrasound processor module 30 accesses the raw I, Q data pairs and forms ultrasound data images therefrom based upon the desired mode of operation (as discussed above in connection with FIG. 2 ).
  • the position tracking module 32 provides tracking information to the ultrasound processor module 30 .
  • the tracking information may include a unique time stamp and/or reference point data identifying the position and/or orientation of one or more reference point elements RP 1 , RP 2 on the corresponding ultrasound probe 14 - 16 .
  • the tracking information is stored in memory 38 by the ultrasound processor module 30 with the ultrasound image data.
  • the physiology signal processor module 20 acquires physiology data and forms a physiology image data set.
  • the position tracking module 32 provides tracking information (e.g. time stamps and reference point data) to the physiology signal processor module 20 .
  • the physiology image data set and tracking information are stored by the physiology signal processor module 20 in physiology data memory 40 .
  • the registration module 42 accesses the ultrasound and physiology data memories 38 and 40 , and transforms or maps the ultrasound and physiology image data into a common coordinate reference system. Once mapped to a common coordinate reference system, the ultrasound and physiology image data are stored in a common data memory 44 .
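Mapping both data sets into a common coordinate reference system amounts to applying each device's tracked pose to its local-frame data points. The sketch below assumes the position tracking module reports each pose as a rotation matrix plus translation; that representation, and the function name, are assumptions for illustration, as the text does not specify one.

```python
import numpy as np

def to_common_frame(points, rotation, translation):
    """Map 3-D data points from a device-local frame (ultrasound probe or
    physiology catheter) into the common tracking coordinate system.

    points      : (n, 3) array in the device-local frame
    rotation    : (3, 3) rotation matrix from local frame to common frame
    translation : (3,) position of the device origin in the common frame
    """
    # Row-vector convention: p_common = R @ p_local + t for each point.
    return points @ rotation.T + translation
```

Applying this transform to both the ultrasound image voxels and the physiology data points, using each device's own registered pose, places them in the shared reference system the registration module 42 stores in memory 44.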
  • the display processor module 46 performs display processing upon the ultrasound and physiology image data to form a combined ultrasound and physiology display image.
  • the display 48 presents the combined ultrasound and physiology image for viewing.
  • the previously displayed image is stored in the electrophysiology recording system.
  • the physiology system 10 detects a change in rhythm of the subject of interest produced by a change in cardiac dimensions.
  • the position tracking coordinate system is decoupled from the previously acquired combined image.
  • the beam former module 12 acquires RF echo signals from one or more scan planes of the ablation site and generates I, Q data pairs therefrom, creating a second set of ultrasound data points.
  • the ultrasound processor module 30 accesses the raw I, Q data pairs and forms a second ultrasound data image therefrom based upon the desired mode of operation (as discussed above in connection with FIG. 2 ).
  • the physiology signal processor module 20 acquires physiology data and forms a second physiology image data set.
  • the position tracking module 32 provides tracking information (e.g. time stamps and reference point data) to the physiology signal processor module 20 .
  • the second physiology image data set and tracking information are stored by the physiology signal processor module 20 in physiology data memory 40 .
  • the registration module 42 accesses the ultrasound and physiology data memories 38 and 40 , and transforms or maps the second ultrasound and second physiology image data into a common coordinate reference system. Once mapped to a common coordinate reference system, the second ultrasound and second physiology image data are stored in a common data memory 44 .
  • the display processor module 46 performs display processing upon the second ultrasound and physiology image data to form a combined ultrasound and physiology display image.
  • the display 48 presents the combined ultrasound and physiology image for viewing, wherein the first and second display images of the combined ultrasound image and physiology data, obtained in real-time or near real-time during the ablation procedure, may be compared to determine the efficacy of the ablation procedure.
  • FIG. 5 illustrates an exemplary application in which the above described embodiments may be utilized.
  • the graphical representation of a heart 500 is illustrated.
  • An ultrasound catheter 502 and EP catheter 504 have been inserted through the inferior vena cava (IVC) into the right atrium (RA).
  • the ultrasound and EP catheters 502 and 504 have passed through a punctured opening through the fossa ovalis into the left atrium (LA).
  • the ultrasound catheter 502 includes a series of spaced apart piezo transducers 506 that may be separately activated and controlled to transmit and receive ultrasound data for corresponding scan planes.
  • the ultrasound catheter 502 and EP catheter 504 are utilized to map the anatomical contour of, and electrical activity at, the interior wall of the left atrium, including proximate the openings to the pulmonary veins denoted at 508 and 510 . It should be understood that other areas of the heart can be monitored by the method described herein; for example, when imaging from the right side to the left, the ultrasound catheter would stay in the HRA and visualize across the septum to the LA, where the ablation catheter would be placed.
  • ARFI allows examination of the functionality of tissue subsets, such as in the heart, organs, tissue, vasculature and the like.
  • ARFI is a phenomenon associated with the propagation of acoustic waves through a dissipative medium. It is caused by a transfer of momentum from the wave to the medium, arising either from absorption or reflection of the wave. This momentum transfer results in the application of a force in the direction of wave propagation. The magnitude of this force is dependent upon both the tissue properties and the acoustic beam parameters. The duration of the force application is determined by the temporal profile of the acoustic wave.
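For an absorbing medium under plane-wave propagation, the momentum transfer described above is commonly written as a body force proportional to the absorption coefficient and the beam intensity. This specific form is a standard result from the acoustics literature, not given in the text:

```latex
% Radiation body force in an absorbing medium (plane-wave assumption):
F = \frac{W_{\mathrm{absorbed}}}{c} = \frac{2 \alpha I}{c}
```

where $F$ is the force per unit volume in the direction of wave propagation, $W_{\mathrm{absorbed}}$ the acoustic power absorbed per unit volume, $\alpha$ the tissue absorption coefficient, $I$ the temporal-average beam intensity, and $c$ the speed of sound. This is consistent with the statement above that the force magnitude depends on both tissue properties (through $\alpha$) and acoustic beam parameters (through $I$).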
  • ARFI images the response of tissue to acoustic radiation force for the purpose of characterizing the mechanical properties of the tissue.
  • ARFI imaging has many potential clinical applications, including: detecting and characterizing a wide variety of soft tissue lesions, and identifying and characterizing atherosclerosis, plaque, and thromboses.
  • the term "co-display" is not limited to displaying information on a common CRT or monitor, but also refers to the use of multiple monitors located immediately adjacent to one another to facilitate substantially simultaneous viewing by a single individual.
  • processor is not intended to be limited to a single processor or CPU.
  • the various blocks and modules are illustrated as conceptually functional units only, but may be implemented utilizing any combination of dedicated or non-dedicated hardware boards, DSPs, processors and the like.
  • the blocks and modules may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with the functional operations distributed between the processors.
  • the blocks and modules may be implemented utilizing a hybrid configuration in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the-shelf PC and the like.
  • the figures illustrate diagrams of the functional blocks of various embodiments.
  • the functional blocks are not necessarily indicative of the division between hardware circuitry.
  • one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware or distributed among multiple pieces of hardware.
  • the programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed imaging software package, and the like.
  • the term “real-time” refers to the changes in tissue being readily apparent to the clinician as the ablation procedure continues. For example, a physician creates a burn at the ablation site; at the completion of that burn, the physician sees an indicator of whether the burn was successful for the intended purpose, in other words, before completion of the ablation procedure. The indicator may also provide the dimensions of the burn as the ablation procedure continues.
  • the term “near real-time” refers to the system's ability to provide feedback to the clinician about the ablation procedure at the ablation site, but not immediately following or during the ablation procedure. For example, the clinician may wait for one to two minutes while the system provides the information, or the clinician may assess all of the burns performed at the end of the procedure, but prior to the catheter being removed. In the latter example, the data from the system may not be provided to the clinician for as long as ten minutes.
  • the term “coupled” means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another, or with the two components, or the two components and any additional member, being attached to one another. Such joining may be permanent in nature or alternatively may be removable or releasable in nature.

Abstract

There is provided a method and system for combining physiology information with ultrasound based anatomic structures during an ablation procedure. The embodiments relate to methods and systems that construct a 2D or 3D representation of an anatomical structure based on ultrasound data and superimpose thereon graphical information representative of physiologic characteristics of the anatomic structure in real-time or near real-time during an ablation procedure.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a Continuation-in-Part of application Ser. No. 11/204,711, filed Aug. 16, 2005 which is incorporated herein by this reference.
  • FIELD OF THE INVENTION
  • Embodiments of the present invention generally relate to methods and systems for combining physiology information with ultrasound based anatomic structures. More particularly, embodiments relate to methods and systems that construct a 2D or 3D representation of an anatomical structure based on ultrasound data and superimpose thereon graphical information representative of physiologic characteristics of the anatomic structure in real-time or near real-time during an ablation procedure.
  • BACKGROUND OF THE INVENTION
  • Ablation procedures are used to treat various conditions including atrial fibrillation, atrial flutter, A-V nodal re-entrant tachycardia, by-pass tract tachycardia, ventricular tachycardia and others. Clinicians use catheters which deliver radio frequency, cryo, laser and other forms of energy to destroy selected tissues. Ablation size is determined by various factors including, but not limited to, catheter contact, angle of the catheter in relation to the selected tissue, blood flow in the area, impedance, and the settings selected by the clinician on various ablation delivery systems. Ablation efficacy requires tissue destruction at the catheter-tissue interface. Ablation is either set to destroy tissue in a selected area or to permanently prevent electrical conduction by preventing a rhythm from crossing a line drawn by ablations, termed a line of block. It is known to provide mapping systems which can record the location of ablations, but such systems do not provide any feedback as to the status of the tissue at the catheter tip.
  • Various types of physiology workstations have been proposed such as electrophysiology (EP) workstations, hemo-dynamic (HD) workstations, and the like. Generally, EP, HD and ablation procedures are carried out through the use of, among other things, EP catheters, HD catheters and mapping sensors. The procedure room also includes a fluoroscopy system, a diagnostic ultrasound system, a patient monitoring device and an ablation system. The ultrasound system may utilize a variety of probes, such as ultrasound catheters, transesophageal probes, surface probes and the like. The ultrasound system may be used before, during or after an ablation procedure to monitor the position of the EP catheters and/or ablation catheters. The mapping system is utilized with physiology catheters to detect and record desired physiologic parameters. The mapping system includes equipment to monitor and track the position of a mapping catheter, from which a map is created of the region of interest.
  • Conventional electrophysiology mapping systems utilize a mapping catheter positioned in a heart chamber that may include passive and active electrode sites. The active electrode sites impose an electric field within the chamber. The blood volume and wall motion modulate the electric field that is detected by passive electrode sites on the catheter. Electrophysiology measurements and geometric measurements are taken from the catheter and used to construct a map and to display intrinsic heart activity. Another type of conventional mapping system utilizes an external imaging modality such as ultrasound, SPECT, PET, MRI, CT system that is positioned external to the patient to capture a 3D image of the heart. The diagnostic image is captured before the heart is mapped. The mapping system utilizes data obtained from the catheter to generate a geometric map, with which the diagnostic image is then registered.
  • Heretofore, physiology workstations have operated independent and distinct from the mapping, ablation and ultrasound equipment utilized during the physiology study. Also, conventional mapping, ablation and ultrasound equipment have operated independent and distinct from one another. The mapping, ablation, physiology and ultrasound systems include separate computers, monitors, and user interfaces, all of which are mounted on separate chassis.
  • Thus, there is a need for a method that provides the clinician real-time or near real-time feedback as to the efficacy of an ablation.
  • BRIEF DESCRIPTION OF THE INVENTION
  • In accordance with at least one embodiment, a method is provided for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site. The method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site. The method further includes generating an ultrasound image based on the ultrasound data. The ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane. The method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. The method further includes forming and saving a display image by combining the ultrasound image and physiologic data. The method further includes detecting a change in the subject of interest proximate the ablation site; receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon, producing second ultrasound data representative of the scan plane including the ablation site; generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of the portion of the ablation site contained in the scan plane; and receiving, in real-time or near real-time, the physiology signals from the physiology catheter located proximate the ablation site and, based thereon, producing second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. 
The method further includes forming, in real-time or near real-time, a second display image combining the second ultrasound image and the second physiology data, and saving the second display image in the electrophysiology recording system. Another embodiment includes comparing the first and second display images of the combined ultrasound image and physiology data, wherein the efficacy of the ablation procedure can be determined.
  • There is also provided a method for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site. The method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site. The method further includes generating an ultrasound image based on the ultrasound data. The ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane. The method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. The method further includes forming and saving a display image by combining the ultrasound image and physiologic data. The method further includes detecting a change in the subject of interest proximate the ablation site; receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon, producing second ultrasound data representative of the scan plane including the ablation site; generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of the portion of the ablation site contained in the scan plane; and receiving, in real-time or near real-time, the physiology signals from the physiology catheter located proximate the ablation site and, based thereon, producing second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. 
The method further includes forming, in real-time or near real-time, a second display image combining the second ultrasound image and the second physiology data, and saving the second display image in the electrophysiology recording system. The method further includes tracking a position of the ultrasound probe and the physiology catheter, generating tracking information denoting positions of the ultrasound probe and physiology catheter with respect to a common reference coordinate system, and registering the ultrasound image and physiology data within the common coordinate reference system.
  • There is further provided a method for obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site. The method includes receiving signals from an ultrasound probe located proximate the ablation site and, based upon the received signals, producing ultrasound data representative of a scan plane including the ablation site. The method further includes generating an ultrasound image based on the ultrasound data. The ultrasound image is representative of an anatomical structure of a portion of the ablation site contained in the scan plane. The method further includes receiving physiology signals from a physiology catheter located proximate the ablation site and, based on the physiology signals, producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. The method further includes forming and saving a display image by combining the ultrasound image and physiologic data. The method further includes detecting a change in the subject of interest proximate the ablation site; receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon, producing second ultrasound data representative of the scan plane including the ablation site; generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of the portion of the ablation site contained in the scan plane; and receiving, in real-time or near real-time, the physiology signals from the physiology catheter located proximate the ablation site and, based thereon, producing second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane. 
The method further includes forming, in real-time or near real-time, a second display image combining the second ultrasound image and the second physiology data, and saving the second display image in the electrophysiology recording system. The method further includes forming a volumetric ultrasound data set for a series of the scan planes, the display image constituting a three-dimensional representation of the ultrasound image and physiology data, wherein the ultrasound image and physiology data combined in the display image are obtained at a common time in a cyclical motion of the region of interest.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a block diagram of a physiology system formed in accordance with an embodiment of the present invention.
  • FIG. 2 illustrates a block diagram of the functional modules, within the ultrasound processor module, that are utilized to carry out ultrasound mid-processing operations in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates a block diagram of the functional modules, within the display processor module, that are utilized to carry out the display processing operations in accordance with an embodiment of the present invention.
  • FIG. 4 illustrates a flowchart of the process to acquire, register and display ultrasound images in combination with physiology data in real-time or near real-time during an ablation procedure.
  • FIG. 5 illustrates an exemplary application by which ultrasound data and physiology data may be acquired in connection with an electrophysiology procedure within a subject of interest.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Currently physicians cannot know if an ablation has been made, the depth of the ablation, or if the tissue is destroyed or merely stunned by the application of energy during an ablation procedure. Clinicians place ablation lesions immediately adjacent to one another in an attempt to ensure that all the desired tissue has been destroyed. Tissues typically begin to swell in the areas surrounding ablations within minutes of energy delivery. Swollen tissues may perform the same as destroyed tissues when tested with contact, pacing, etc. Clinicians may place more ablations than are required for the procedure to be successful in an attempt to prevent recurrence of the original condition; these are known as “insurance burns”. Clinicians desire to destroy the selected tissue and leave remaining tissues intact and unaffected. Complications such as complete heart block, myocardial infarction and pulmonary vein stenosis, among others, can occur when ablation lesions are larger in circumference or deeper in the tissue than intended. Conversely, if the lesions are not complete in critical areas, the patient may receive no benefit from the ablation procedure.
  • As discussed below, ultrasound systems have the capacity to perform tissue characterization. When conductive tissue is ablated, the character of the tissue changes from muscle, neuro fibers, etc. to non-conducting tissue that will eventually result in scar tissue. Giving the physician visual and/or auditory feedback as to the efficacy and/or size and depth of an ablation attempt would provide key clinical data, allowing the physician to adjust settings on ablation delivery systems, change catheter position, select a different catheter, etc.
  • As discussed below, clinical feedback of ablation efficacy during an ablation procedure can be provided by an ultrasound physiology system that gathers tissue characterization at the ablation site. The system analyzes the information and provides the clinician with the depth and diameter of fully ablated tissue as well as the area of injury surrounding the ablation. The data may be displayed to the clinician numerically and/or as a visual indicator. For instance, the clinician may see a notation in the log of the recording system stating that the ablation destroyed tissue. Alternatively, the system may color code the location of the ablation, with one assigned color designating success and a different color designating inefficacy, and with these indicators displayed over an image previously acquired using, for example, an MR or CT system, or overlaid onto a map created by a separate 3-D mapping system.
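The text does not specify how the depth and diameter of the fully ablated region are computed from the tissue characterization data. One illustrative possibility is thresholding a per-pixel tissue-characterization map; the threshold, pixel spacings, and function name below are hypothetical parameters introduced only for this sketch.

```python
import numpy as np

def lesion_extent(char_map, ablated_threshold, dx_mm, dz_mm):
    """Estimate diameter and depth of the fully ablated region from a 2-D
    tissue-characterization map (illustrative sketch only; the text does
    not specify the analysis the system performs).

    char_map          : 2-D array, rows = depth samples, cols = lateral samples
    ablated_threshold : characterization value at/above which tissue counts
                        as fully ablated (hypothetical parameter)
    dx_mm, dz_mm      : lateral / axial pixel spacing in millimetres
    """
    mask = char_map >= ablated_threshold
    if not mask.any():
        return 0.0, 0.0
    rows, cols = np.nonzero(mask)
    diameter = (cols.max() - cols.min() + 1) * dx_mm   # lateral extent
    depth = (rows.max() - rows.min() + 1) * dz_mm      # axial extent
    return diameter, depth
```

The same thresholding idea, with a second lower threshold, could delimit the surrounding area of injury the passage mentions.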
  • Current ablation procedure mapping systems allow the clinician to draw, on a 3-D rendering of the subject of interest (for example, a heart), where ablations have been performed. These marks are cartoon depictions of where energy has been delivered; however, they provide no information to the clinician other than the location. Providing the clinician with information on the properties (characterization) of tissues during and after ablation allows a clinician to know if energy delivery has created a complete line of block and/or destroyed the desired tissues, preventing complications and the need for repeat procedures, as well as negating the need for “insurance burns”, thereby decreasing procedure time.
  • FIG. 1 illustrates a physiology system 10 formed in accordance with an embodiment of the present invention. A system controller 8 manages the overall interaction and operation of the various modules, accessories and the like. The physiology system 10 includes a beam former module 12 configured to be joined with one or more ultrasound probes 14-16. Examples of ultrasound probes may include an intravascular ultrasound (IVUS) catheter 14, an intracardiac echocardiography (ICE) catheter, a transesophageal probe 15, an interventional probe, an ultrasound surface probe 16 and the like. The beam former module 12 controls transmit and receive operations to and from the probes 14-16. A physiology signal processing module 20 is provided and joined with one or more catheters 22-24. Examples of catheters include a basket catheter 22, a multi-pole electrophysiology catheter 23 (e.g. a 4-pole, 8-pole, 10-pole, 20-pole and the like), a hemodynamic catheter 24 and the like.
  • The beam former module 12 processes radio frequency (RF) echo signals from one or more of probes 14-16 and produces therefrom I, Q data pairs associated with each data sample within a scan plane through the region of interest. The beam former module 12 may supply the I, Q data pairs directly to the ultrasound processor module 30. Alternatively or in addition, the beam former module 12 may store the collection of I, Q data pairs defining the sample points within a single scan plane in the ultrasound data memory 38 as raw ultrasound data. The ultrasound data memory 38 stores the I, Q data pairs for individual scan planes as two dimensional data sets, or alternatively for collections of scan planes as three dimensional data sets.
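The quadrature demodulation step that turns RF echo samples into I, Q data pairs can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name, sampling parameters, and the simple moving-average stand-in for the low-pass filter are all assumptions.

```python
import numpy as np

def demodulate_rf(rf, fs, f0):
    """Mix one RF echo line down to baseband and return complex I/Q samples.

    rf : 1-D array of RF samples; fs : sampling rate (Hz);
    f0 : transducer center frequency (Hz).
    """
    t = np.arange(len(rf)) / fs
    # Quadrature mixing: multiply by cosine and negative sine at f0.
    i = rf * np.cos(2 * np.pi * f0 * t)
    q = -rf * np.sin(2 * np.pi * f0 * t)
    # A real system applies a proper low-pass filter here; a short
    # moving average stands in for that filter in this sketch.
    kernel = np.ones(8) / 8.0
    i = np.convolve(i, kernel, mode="same")
    q = np.convolve(q, kernel, mode="same")
    return i + 1j * q

# A pure tone at f0 demodulates to (approximately) constant magnitude 0.5.
fs, f0 = 40e6, 5e6
t = np.arange(1024) / fs
iq = demodulate_rf(np.cos(2 * np.pi * f0 * t), fs, f0)
```

The complex result corresponds to the I, Q pairs stored per data sample in the ultrasound data memory.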
  • The ultrasound processor module 30 processes the raw I, Q data pairs, as explained below in more detail, to form ultrasound images (2D or 3D). For example, the ultrasound processor module 30 may form B-mode images, color flow images, power Doppler images, spectral Doppler images, M-mode images, ARFI images, strain images, strain rate images and the like. The ultrasound images contain ultrasound image data representing voxels associated with data samples from the region of interest, where the ultrasound image data may be defined in Cartesian or polar coordinates. The ultrasound images may be stored individually as two dimensional data sets. Alternatively, collections of ultrasound images may be stored as three dimensional data sets. The beam former module 12 and ultrasound processor module 30 process the signals from the ultrasound probe in real-time during a physiology procedure so that the display 48 is able to display and continuously update the ultrasound image in real-time during the physiology procedure. By way of example, the ultrasound processor module may generate new ultrasound images at a frame rate of at least seven frames per second such that the display processor module is able to update the ultrasound image information within the displayed image at a frame rate of at least seven frames per second. Alternatively, the frame rate, at which new ultrasound images are generated and displayed, may be increased to 16, 32 or 64 frames per second or higher.
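B-mode image formation from the I, Q pairs, envelope detection followed by log compression to display values, might look like the sketch below. The function name and the 60 dB dynamic range are illustrative assumptions, not details from the disclosure.

```python
import numpy as np

def bmode_from_iq(iq, dynamic_range_db=60.0):
    """Turn complex I/Q samples into log-compressed 8-bit B-mode values.

    Envelope = |I + jQ|; log compression keeps the top `dynamic_range_db`
    decibels and scales the result to 0..255 display values.
    """
    envelope = np.abs(iq)
    envelope = envelope / envelope.max()               # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(envelope, 1e-6))   # avoid log(0)
    db = np.clip(db, -dynamic_range_db, 0.0)
    return ((db + dynamic_range_db) / dynamic_range_db * 255).astype(np.uint8)

# Strong echoes map near 255; echoes 60 dB down map to 0.
iq = np.array([1.0 + 0j, 0.1 + 0j, 0.001 + 0j])
pixels = bmode_from_iq(iq)
```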
  • The physiology signal processor 20 passively and/or actively operates upon one or more of the catheters 22-24 to measure physiology signals. The physiology signal processor module 20 receives physiology signals from one or more of the catheters 22-24 and produces physiology data representative of the physiologic activity of a portion of the regions of interest proximate the sensors on the corresponding catheter 22-24. The physiology data is stored in physiology data memory 40.
  • ECG leads 26 are provided on the surface of the subject and produce ECG signals that are received by the physiology signal processor module 20 and/or by a cardiac cycle detection module 28. The cardiac cycle detection module 28 monitors the cardiac activity denoted by the ECG signals and generates therefrom timing information representative of cyclical points in the subject's cardiac cycle. The timing information is provided to the physiology signal processor module 20 and to the ultrasound processor module 30. Alternatively, intracardiac signals obtained from EP catheters may provide the cardiac cycle detection signal.
  • A position tracking module 32 is joined with a series of detectors 34 that may operate as transmitters and/or receivers. The position tracking module 32, optionally, may also receive position information from one or more of the ultrasound probes 14-16 and/or physiology catheters 22-24. In the example of FIG. 1, the ultrasound probes 14-16 are each provided with first and second reference point elements (denoted RP1 and RP2 on each probe and catheter). The reference point elements may represent transmitters and/or receivers configured to transmit or receive acoustic energy, radio frequency energy, electromagnetic energy and the like. Alternatively, only a single reference point element or sensor may be provided on one or more of the probes and catheters. Examples of conventional sensor configurations and detector systems are described in U.S. Pat. No. 5,713,946 to Ben-Haim; U.S. Pat. No. 6,216,027 to Willis et al.; U.S. Pat. No. 5,662,108 to Budd et al.; U.S. Pat. No. 5,409,000 to Imran; U.S. Pat. No. 6,650,927 to Keidar; U.S. Pat. No. 6,019,725 to Vesely; U.S. Pat. No. 5,445,150 to Dumoulin, all of which are expressly incorporated herein in their entireties by reference.
  • The position tracking module 32 generates tracking information defining the position of each ultrasound probe and each physiology catheter with respect to a common reference coordinate system. By way of example, the position information may include XYZ coordinates for each reference point element within a common three-dimensional Cartesian coordinate system. Alternatively, the position information may be defined in polar coordinates within a common three-dimensional polar coordinate system. The tracking information may uniquely identify each reference point element, such as through a unique transmit signature and the like. The position tracking module 32 may include a relational table containing an ID for each reference point element uniquely associated with probe/catheter descriptive information (e.g. the serial number, type, dimensions, shape and the like). The tracking information may also include orientation information (e.g. pitch, roll and yaw) describing the orientation of a reference axis 17 of a probe or catheter relative to the reference coordinate system.
  • The position tracking module 32 repeatedly monitors and tracks the reference point elements, to generate a continuous stream of coordinate position data sets, wherein a single combination of XYZ values represents a single coordinate position data set. Optionally, the position tracking module 32 may record, with each coordinate position data set, a time stamp indicating a time at which the coordinate position data set was obtained. The time stamp may be defined by a system clock 36 that also provides reference timing information to the physiology signal processor module 20 and ultrasound processor module 30. Alternatively, the time stamp may be defined with respect to the cardiac cycle of the patient (e.g. X seconds following/preceding the peak of the R-wave). When the timing information is defined based on the cardiac cycle, cardiac cycle timing information is provided by the cardiac cycle detection module 28 to each of the physiology signal processor module 20, ultrasound processor module 30 and position tracking module 32.
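A coordinate position data set as described above, one XYZ fix for one reference point element plus a time stamp, could be represented with a record such as the following. The field names and the `track_stream` helper are hypothetical, chosen only to mirror the description.

```python
from dataclasses import dataclass

@dataclass
class CoordinatePosition:
    """One coordinate position data set: a single XYZ fix for one
    reference point element, stamped with its acquisition time."""
    element_id: str   # unique ID of the reference point element (RP1, RP2, ...)
    x: float
    y: float
    z: float
    timestamp: float  # system-clock time, or seconds relative to the R-wave peak

def track_stream(fixes):
    """Collect a continuous stream of stamped coordinate position data sets."""
    return [CoordinatePosition(element_id, x, y, z, t)
            for element_id, (x, y, z), t in fixes]

stream = track_stream([
    ("RP1", (10.0, 4.5, -2.0), 0.00),
    ("RP1", (10.2, 4.4, -2.1), 0.05),
    ("RP2", (11.0, 5.0, -2.0), 0.00),
])
```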
  • The position tracking module 32 may provide the position information, orientation information and timing information (collectively referred to as “tracking information”) to the physiology and ultrasound processor modules 20 and 30. When the tracking information is provided to the ultrasound processor module 30, the ultrasound processor module 30 stores the tracking information with the ultrasound image in the ultrasound data memory 38. The tracking information uniquely identifies the time at which the ultrasound image was acquired, as well as the position and/or orientation of the ultrasound probe 14-16 at the time of acquisition. When the tracking information is provided to the physiology processor module 20, the physiology processor module 20 records the tracking information with the physiology data in the physiology data memory 40. The tracking information uniquely identifies the time at which the physiology data was acquired, as well as the position and/or orientation of the physiology catheter(s) 22-24 at the time of acquisition.
  • A registration module 42 accesses the ultrasound and physiology data memories 38 and 40 to obtain one or more ultrasound images and related physiology data sets acquired at the same point(s) in time. The ultrasound images and associated physiology data sets are identified from memories 38 and 40 based on the recorded time stamps. The registration module 42 transforms one or both of the ultrasound image and physiology data into a common coordinate system and stores the results in a common data memory 44. By way of example, the registration module 42 may map the physiology data set into the coordinate system defined by the ultrasound images as stored in the ultrasound data memory 38. Alternatively, the registration module 42 may map the ultrasound images into the coordinate system defined by the physiology data sets as stored in the physiology data memory 40. As a further alternative, the registration module 42 may transform both the ultrasound images and physiology data sets into a new coordinate system.
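The mapping performed by the registration module can be illustrated as a rigid-body transform that carries physiology sample points into the ultrasound image coordinate system. The function below is an illustrative sketch under that assumption, not the patented registration method.

```python
import numpy as np

def register_points(points, rotation, translation):
    """Map physiology sample points into the ultrasound coordinate
    system with a rigid-body transform: p' = R @ p + t.

    points      : (N, 3) XYZ positions in the catheter tracking frame
    rotation    : (3, 3) rotation matrix between the two frames
    translation : (3,) offset of the tracking origin in image coordinates
    """
    points = np.asarray(points, dtype=float)
    return points @ np.asarray(rotation).T + np.asarray(translation)

# Example: a 90-degree rotation about Z plus a shift along X.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([5.0, 0.0, 0.0])
mapped = register_points([[1.0, 0.0, 0.0]], R, t)
```

Transforming both data sets into a new common system, the further alternative named above, would simply apply such a transform to each memory's data.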
  • A display processor module 46 accesses the common data memory 44 to obtain select combinations of ultrasound images and physiology data sets for presentation on display 48. The display processor module may form a display image combining the ultrasound image and physiology data set, such that the physiology data is mapped onto an anatomical structure contained in, and defined by, the ultrasound image. Optionally, the display processor module 46 may access a lookup table 50 that is stored as part of, or separate from, the common data memory 44 to define display characteristics, such as transparency, opacity, color, brightness and the like, for individual display pixels defining the resultant display image.
  • The lookup table 50 may be used to define data samples or voxels within the ultrasound image through one of gray scale and color information, and to define the physiology data through the other of gray scale and color information. Optionally, one combination or range of colors may be designated to denote ultrasound information, while a separate combination or range of colors may be designated to denote physiology data. As a further option, the brightness, intensity or opacity of each pixel in the display image may be varied in accordance with one or both of the value of the ultrasound information and the value of the physiology data. For example, the ultrasound image may be defined by B-mode data values for each data point or voxel, while the physiology data associated with the data point or voxel may be defined by one or more colors within a range of colors (e.g., ranging from blue to red, or ranging from light blue to dark blue, or ranging from light red to dark red). Alternatively, the ultrasound image may be defined by non B-mode data values, such as anatomic M-mode, strain or strain rate characteristics of the anatomic structure, with the strain or strain rate being represented in the display image by discrete colors within a range of colors (e.g., ranging from blue to red, or ranging from light blue to dark blue, or ranging from light red to dark red). When the anatomic structure is represented in the display image by discrete colors, the physiology data may be represented through variations of the brightness at each display pixel.
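One way to realize the lookup-table blending described above is to map the physiology datum to a blue-to-red color and alpha-blend it over the grey-scale B-mode pixel. The sketch below uses hypothetical names and a simple linear blend; the disclosure does not prescribe this particular formula.

```python
def blend_pixel(bmode_gray, physiology_value, alpha=0.5):
    """Blend one grey-scale ultrasound pixel with a color-coded
    physiology value, returning an (r, g, b) display pixel.

    bmode_gray       : 0..255 B-mode intensity
    physiology_value : 0..1 normalized physiology measurement
    alpha            : opacity of the physiology overlay
    """
    # Simple blue-to-red lookup for the physiology datum.
    r = int(255 * physiology_value)
    b = int(255 * (1.0 - physiology_value))
    overlay = (r, 0, b)
    # Alpha-blend the overlay color onto the grey-scale pixel.
    return tuple(int((1 - alpha) * bmode_gray + alpha * c) for c in overlay)

# A high physiology value tints a bright B-mode pixel toward red.
pixel = blend_pixel(bmode_gray=200, physiology_value=1.0, alpha=0.5)
```

Varying `alpha` per pixel corresponds to the brightness/opacity modulation option described above.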
  • A user interface 52 is provided to control the overall operation of the physiology system 10. The user interface 52 may include, among other things, a keyboard, mouse and/or trackball. The user interface 52 may permit an operator to designate a portion of the ultrasound image, for which physiologic data is of interest. The display processor module 46 and/or physiology signal processor module 20 may then generate a separate physiology graph to be displayed independent and distinct from the ultrasound image. For example, the display 48 may present an ultrasound image as a B-mode sector scan, with one or more points of interest on the B-mode sector scan designated. A separate graph may be co-displayed on display 48 with the ultrasound B-mode image.
  • FIG. 2 illustrates an exemplary block diagram of the ultrasound processor module 30 of FIG. 1 formed in accordance with an embodiment of the present invention. The operations of the modules illustrated in FIG. 2 may be controlled by a local ultrasound controller 87 or by the system controller 8. The modules 49-59 perform mid-processor operations.
  • The ultrasound processor module 30 obtains ultrasound data 21 from the ultrasound data memory 38 or the beam former module 12 (FIG. 1). The received ultrasound data 21 constitutes I, Q data pairs representing the real and imaginary components associated with each data sample. The I, Q data pairs are provided to an ARFI module 49, a color-flow module 51, a power Doppler module 53, a B-mode module 55, a spectral Doppler module 57 and an M-mode module 59. Optionally, other modules may be included, such as a strain module, a strain rate module and the like. Each of the modules 49-59 processes the I, Q data pairs in a corresponding manner to generate ARFI data 60, color-flow data 61, power Doppler data 63, B-mode data 65, spectral Doppler data 67, and M-mode data 69, all of which are stored in ultrasound data memory 38. Alternatively, the ultrasound data memory 38 may be divided such that the raw I, Q data pairs are stored in raw data memory, while the processed image data is stored in separate image data memory. The ARFI, color-flow, power Doppler, B-mode, spectral Doppler and M-mode data 60-69 may be stored as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system.
  • FIG. 3 illustrates an exemplary block diagram of the display processor module 46 of FIG. 1 formed in accordance with an embodiment of the present invention. The operations of the modules illustrated in FIG. 3 may be controlled by the local ultrasound controller 87 or by the system controller 8. The modules 73, 77 and 81 perform display-processor operations. A scan converter module 73 reads from memory 44 the vector data values associated with one or more image frames and converts the set of vector data values to Cartesian coordinates to generate a display image frame 75 formatted for display. The ultrasound image frames 75 generated by scan converter module 73 may be passed to a temporary area in memory 44 for subsequent processing or may be passed directly to one of the 2-D and 3-D processor modules 77 and 81. As an example, it may be desired to view a B-mode ultrasound image in real-time associated with the ultrasound signals detected by an ultrasound catheter. To do so, the scan converter obtains B-mode vector data sets for images stored in memory 44. The B-mode vector data is interpolated where necessary and converted into the X,Y format for video display to produce ultrasound image frames. The scan converted ultrasound image frames are passed to the video processor module 77 that maps the video to a grey-scale mapping for video display.
  • The grey-scale map may represent a transfer function of the raw image data to displayed grey levels. Once the video data is mapped to the grey-scale values, the video processor module 77 controls the display 48 to display the image frame in real-time. The B-mode image displayed in real-time is produced from an image frame of data in which each datum indicates the intensity or brightness of a respective pixel in the display. The display image represents the tissue and/or blood flow in a plane through the region of interest being imaged.
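The polar-to-Cartesian scan conversion performed by the scan converter module 73 can be sketched with nearest-neighbour painting. Real systems interpolate between beams and samples; the sample layout (probe at the top-center of the raster, depth in pixel units) is an assumption for illustration.

```python
import math

def scan_convert(samples, width, height):
    """Nearest-neighbour scan conversion of sector-scan vector data.

    samples : list of (angle_rad, depth_px, value) polar samples, with
              the probe at the top-center of the raster and the angle
              measured from the downward axis.
    Returns a height x width raster; unpainted pixels remain 0.
    """
    image = [[0] * width for _ in range(height)]
    cx = width // 2
    for angle, depth, value in samples:
        # Convert the polar sample to its Cartesian pixel location.
        x = int(round(cx + depth * math.sin(angle)))
        y = int(round(depth * math.cos(angle)))
        if 0 <= x < width and 0 <= y < height:
            image[y][x] = value
    return image

# A sample on the center beam lands in the middle column; an angled
# beam lands off to one side.
img = scan_convert([(0.0, 4, 255), (math.pi / 4, 3, 128)], width=9, height=6)
```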
  • The color-flow module 51 (FIG. 2) may be utilized to provide real-time two-dimensional images of blood velocity in the imaging plane. The frequency of sound waves reflected from the inside of the blood vessels, heart cavities, etc., is shifted in proportion to the velocity of the blood cells; positively shifted for cells moving toward the transducer and negatively shifted for cells moving away from the transducer. The blood velocity is calculated by measuring the phase shift from firing to firing at a specific range gate. Mean blood velocities from multiple vector positions and multiple range gates along each vector are calculated and a two-dimensional image is made from this information. The color-flow module 51 receives the complex I, Q data pairs from the beamformer module 12 and processes the I, Q data pairs to calculate the mean blood velocity, variance (representing blood turbulence) and total pre-normalized power for all sample volumes within the operator defined region.
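The firing-to-firing phase measurement described above is commonly implemented with the lag-one autocorrelation (Kasai) estimator; the sketch below assumes that method and illustrative parameter values, and its sign convention (positive toward the probe) depends on the demodulation convention.

```python
import cmath
import math

def mean_velocity(iq_ensemble, prf, f0, c=1540.0):
    """Estimate mean blood velocity at one range gate from an ensemble of
    complex I/Q samples (one per firing), via the lag-one autocorrelation
    phase (Kasai method).

    prf : pulse repetition frequency (Hz); f0 : center frequency (Hz);
    c   : assumed speed of sound in tissue (m/s).
    """
    # Lag-one autocorrelation across successive firings.
    r1 = sum(iq_ensemble[k + 1] * iq_ensemble[k].conjugate()
             for k in range(len(iq_ensemble) - 1))
    phase = cmath.phase(r1)   # mean firing-to-firing phase shift (rad)
    return c * prf * phase / (4 * math.pi * f0)

# Synthetic ensemble with a constant +pi/4 phase advance per firing.
prf, f0 = 4000.0, 5e6
ens = [cmath.exp(1j * math.pi / 4 * k) for k in range(8)]
v = mean_velocity(ens, prf, f0)   # about 0.077 m/s
```

The variance of the phase estimates across the ensemble would, analogously, yield the turbulence estimate mentioned above.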
  • The 2D video processor module 77 combines one or more of the frames generated from the different types of ultrasound information and physiologic data. For example, the 2D video processor module 77 may combine a B-mode image frame and a color representation of the physiologic data by mapping the B-mode data to a grey map and mapping the physiologic data to a color map for video display. In the final displayed image, the color pixel data is superimposed on the grey scale pixel data to form a single multi-mode image frame 79 that may be re-stored in memory 44 or passed over bus 35 to the display 48. Successive frames of B-mode images, in combination with the associated physiology data, may be stored as a cine loop in memory 44. The cine loop represents a first in, first out circular image buffer to capture image data that is displayed in real-time to the user. The user may freeze the cine loop by entering a freeze command at the user interface 52. The user interface 52 represents the keyboard, mouse and all other command inputs associated with the ultrasound system user interface.
  • The spectral Doppler module 57 (FIG. 2) operates upon the I, Q data pairs by integrating (summing) the data pairs over a specified time interval and then sampling the data pairs. The summing interval and the transmission burst length together define the length of the sample volume, which is specified by the user at the user interface 52. The spectral Doppler module 57 may utilize a wall filter to reject any clutter in the signal which may correspond to stationary or very slow moving tissue. The filter output is then fed into a spectrum analyzer, which may implement a Fast Fourier Transform over a moving time window of samples. Each FFT power spectrum is compressed and then output by the spectral Doppler module 57 to memory 44. The 2D video processor module 77 then maps the compressed spectral Doppler data to grey scale values for display on the display 48 as a single spectral line at a particular time point in the Doppler velocity (frequency) versus time spectrogram. The 2-D video processor module 77 may similarly map the physiology data into a graph representing electrical potential fluctuation (along the vertical axis) and time (along the horizontal axis).
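One spectral line of the Doppler spectrogram can be illustrated as the power spectrum of a moving window of I/Q samples. A plain DFT is written out below for clarity where a production system would use an FFT; the function name and window contents are illustrative.

```python
import cmath
import math

def spectral_line(iq_window, n_bins=None):
    """Compute one spectral line of the Doppler spectrogram: the power
    spectrum of a moving window of complex I/Q samples."""
    n = n_bins or len(iq_window)
    spectrum = []
    for k in range(n):
        # Direct DFT bin k (an FFT computes all bins at once).
        acc = sum(x * cmath.exp(-2j * math.pi * k * m / n)
                  for m, x in enumerate(iq_window))
        spectrum.append(abs(acc) ** 2)   # power in frequency bin k
    return spectrum

# A pure positive Doppler shift concentrates power in a single bin.
window = [cmath.exp(2j * math.pi * 2 * m / 8) for m in range(8)]
power = spectral_line(window)
```

Log-compressing each such line and painting it as one column of grey-scale pixels yields the velocity-versus-time spectrogram described above.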
  • A 3D processor module 81 is also controlled by user interface 52 and accesses memory 44 to obtain spatially consecutive groups of ultrasound image frames and to generate three dimensional image representations thereof, such as through volume rendering or surface rendering algorithms. The three dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection and the like.
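Maximum intensity pixel projection, one of the rendering techniques named above, keeps the brightest voxel along each ray through the volume. A minimal sketch, assuming parallel rays along the depth axis and a nested-list volume layout:

```python
def max_intensity_projection(volume):
    """Maximum intensity pixel projection along the depth axis of a
    volume stored as volume[z][y][x]: each output pixel keeps the
    brightest voxel on the ray through it."""
    depth = len(volume)
    height, width = len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(width)]
            for y in range(height)]

# Two 2x2 slices: the projection keeps the brighter voxel at each (y, x).
vol = [
    [[1, 9], [3, 4]],   # slice z = 0
    [[5, 2], [0, 7]],   # slice z = 1
]
mip = max_intensity_projection(vol)
```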
  • FIG. 4 illustrates a processing sequence carried out by the physiology system 10 of FIG. 1 in connection with acquiring, tracking and combining ultrasound and physiology data in real-time or near real-time during an ablation procedure. At 400, the position tracking module 32 registers the ultrasound probe 14-16 within the position tracking coordinate system. At 402, the position tracking module 32 registers the physiology catheters within the position tracking coordinate system. At 404, the beam former module 12 acquires RF echo signals from one or more scan planes of the ablation site and generates I, Q data pairs therefrom. At 406, the ultrasound processor module 30 accesses the raw I, Q data pairs and forms ultrasound data images therefrom based upon the desired mode of operation (as discussed above in connection with FIG. 2).
  • At 408, the position tracking module 32 provides tracking information to the ultrasound processor module 30. The tracking information may include a unique time stamp and/or reference point data identifying the position and/or orientation of one or more reference point elements RP1, RP2 on the corresponding ultrasound probe 14-16. The tracking information is stored in memory 38 by the ultrasound processor module 30 with the ultrasound image data.
  • At 410, the physiology signal processor module 20 acquires physiology data, and at 412, forms a physiology image data set. At 414, the position tracking module 32 provides tracking information (e.g. time stamps and reference point data) to the physiology signal processor module 20. The physiology image data set and tracking information are stored by the physiology signal processor module 20 in physiology data memory 40.
  • At 416, the registration module 42 accesses the ultrasound and physiology data memories 38 and 40, and transforms or maps the ultrasound and physiology image data into a common coordinate reference system. Once mapped to a common coordinate reference system, the ultrasound and physiology image data are stored in a common data memory 44. At 418, the display processor module 46 performs display processing upon the ultrasound physiology image data to form a combined ultrasound and physiology display image. At 420, the display 48 presents the combined ultrasound and physiology image for viewing.
  • At 422, the previously displayed image is stored in the electrophysiology recording system. At 424, the physiology system 10 detects a change in rhythm of the subject of interest produced by a change in cardiac dimensions. At 426, the tracking coordinate system of the position tracking module 32 is decoupled from the previously acquired combined image.
  • At 428, the beam former module 12 acquires RF echo signals from one or more scan planes of the ablation site and generates I,Q data pairs therefrom, creating a second set of ultrasound data points. At 430, the ultrasound processor module 30 accesses the raw I,Q data pairs and forms a second ultrasound data image therefrom based upon the desired mode of operation (as discussed above in connection with FIG. 2).
  • At 432, the physiology signal processor module 20 acquires physiology data and forms a second physiology image data set. At 434, the position tracking module 32 provides tracking information (e.g. time stamps and reference point data) to the physiology signal processor module 20. The second physiology image data set and tracking information are stored by the physiology signal processor module 20 in physiology data memory 40.
  • At 436, the registration module 42 accesses the ultrasound and physiology data memories 38 and 40, and transforms or maps the second ultrasound and second physiology image data into a common coordinate reference system. Once mapped to a common coordinate reference system, the second ultrasound and second physiology image data are stored in a common data memory 44. At 438, the display processor module 46 performs display processing upon the second ultrasound and physiology image data to form a combined ultrasound and physiology display image. At 440, the display 48 presents the combined ultrasound and physiology image for viewing, wherein the first and second display images of the combined ultrasound image and physiology data, obtained in real-time or near real-time during the ablation procedure, may be compared to determine the efficacy of the ablation procedure.
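The comparison of the first and second combined images could, for illustration, be reduced to flagging pixels whose values changed beyond a threshold between the pre- and post-ablation acquisitions. The function and threshold below are assumptions for a sketch, not the claimed tissue-characterization comparison.

```python
def ablation_change_map(pre, post, threshold=30):
    """Flag pixels whose value changed by more than `threshold` between
    the pre- and post-ablation combined images.

    pre, post : equal-size 2-D lists of pixel values
    Returns (change_mask, changed_fraction).
    """
    mask = [[abs(a - b) > threshold for a, b in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre, post)]
    total = sum(len(row) for row in mask)
    changed = sum(cell for row in mask for cell in row)
    return mask, changed / total

# One pixel of four brightens markedly after the burn.
pre  = [[100, 100], [100, 100]]
post = [[100, 180], [100, 100]]
mask, frac = ablation_change_map(pre, post)
```

The resulting mask could drive the color-coded success/inefficacy indicators described earlier.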
  • FIG. 5 illustrates an exemplary application in which the above described embodiments may be utilized. The graphical representation of a heart 500 is illustrated. An ultrasound catheter 502 and EP catheter 504 have been inserted through the inferior vena cava (IVC) into the right atrium (RA). The ultrasound and EP catheters 502 and 504 have passed through a punctured opening through the fossa ovalis into the left atrium (LA). The ultrasound catheter 502 includes a series of spaced apart piezo transducers 506 that may be separately activated and controlled to transmit and receive ultrasound data for corresponding scan planes. The ultrasound catheter 502 and EP catheter 504 are utilized to map the anatomical contour of, and electrical activity at, the interior wall of the left atrium, including proximate the openings to the pulmonary veins denoted at 508 and 510. It should be understood that other areas of the heart can be monitored by the method described herein; for example, when imaging from the right side to the left, the US catheter would stay in the HRA and visualize across the septum to the LA where the ablation catheter would be placed.
  • ARFI allows examination of the functionality of tissue subsets, such as in the heart, organs, tissue, vasculature and the like. ARFI is a phenomenon associated with the propagation of acoustic waves through a dissipative medium. It is caused by a transfer of momentum from the wave to the medium, arising either from absorption or reflection of the wave. This momentum transfer results in the application of a force in the direction of wave propagation. The magnitude of this force is dependent upon both the tissue properties and the acoustic beam parameters. The duration of the force application is determined by the temporal profile of the acoustic wave. ARFI images the response of tissue to acoustic radiation force for the purpose of characterizing the mechanical properties of the tissue. When the duration of the radiation force is short (less than 1 millisecond), the tissue mechanical impulse response can be observed. ARFI imaging has many potential clinical applications, including: detecting and characterizing a wide variety of soft tissue lesions, and identifying and characterizing atherosclerosis, plaque, and thromboses.
  • The term “co-displays” is not limited to displaying information on a common CRT or monitor, but instead refers also to the use of multiple monitors located immediately adjacent one another to facilitate substantially simultaneous viewing by a single individual. The term “processor” is not intended to be limited to a single processor or CPU.
  • The various blocks and modules are illustrated as conceptually functional units only, but may be implemented utilizing any combination of dedicated or non-dedicated hardware boards, DSPs, processors and the like. Alternatively, the blocks and modules may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with the functional operations distributed between the processors. As a further option, the blocks and modules may be implemented utilizing a hybrid configuration in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the shelf PC and the like.
  • It is understood that the operations illustrated in any processing sequences or flowcharts may be carried out in any order, including in parallel.
  • The figures illustrate diagrams of the functional blocks of various embodiments. The functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor, a block of random access memory, a hard disk, or the like). Similarly, the programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed imaging software package, and the like.
  • For purposes of this disclosure, the term “real-time” refers to the changes in tissue being readily apparent to the clinician as the ablation procedure continues. For example, a physician creates a burn at the ablation site, and at the completion of that burn the physician sees an indicator which indicates whether the burn was successful for the intended purpose; in other words, feedback is provided before completion of the ablation procedure. The indicator may also provide the dimensions of the burn as the ablation procedure continues. The term “near real-time” refers to the system's ability to provide feedback to the clinician about the ablation procedure at the ablation site, but not immediately following or during the ablation procedure. For example, the clinician may wait for one to two minutes while the system provides the information, or the clinician may assess all of the burns performed at the end of the procedure, but prior to the catheter being removed. In the latter example, the data from the system may not be provided to the clinician for as long as ten minutes.
  • For purposes of this disclosure, the term “coupled” means the joining of two components (electrical or mechanical) directly or indirectly to one another. Such joining may be stationary in nature or movable in nature. Such joining may be achieved with the two components (electrical or mechanical) and any additional intermediate members being integrally formed as a single unitary body with one another or with the two components or the two components and any additional member being attached to one another. Such joining may be permanent in nature or alternatively may be removable or releasable in nature.

Claims (50)

1. A method of obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site, the method comprising the steps of:
receiving signals from an ultrasound probe located proximate the ablation site and, based thereon producing ultrasound data representative of a scan plane including the ablation site;
generating an ultrasound image based on the ultrasound data, the ultrasound image being representative of an anatomical structure of a portion of the ablation site contained in the scan plane;
receiving physiology signals from a physiology catheter located proximate the ablation site and, based thereon producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane;
forming a display image combining the ultrasound image and physiology data and saving the image in an electrophysiology recording system;
detecting a change in the subject of interest proximate the ablation site;
receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon producing a second ultrasound data representative of the scan plane including the ablation site;
generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of a portion of the ablation site contained in the scan plane;
receiving, in real-time or near real-time, physiology signals from the physiology catheter located proximate the ablation site and, based thereon producing a second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane; and
forming, in real-time or near real-time, a second display image combining the second ultrasound image and second physiology data and saving the second image in the electrophysiology recording system.
2. The method of claim 1, further comprising the steps of comparing the first and second display images of the combined ultrasound image and physiology data, wherein the efficacy of the ablation procedure can be determined.
3. The method of claim 1, wherein detecting a change in the subject of interest includes determining tissue characterization as being one of conducting and non-conducting tissue.
4. The method of claim 1, wherein the display image includes physiology data mapped onto an anatomical structure contained in and defined by the ultrasound image.
5. The method of claim 1, wherein the ultrasound signal is received from an ultrasound probe constituting at least one of an intravascular ultrasound (IVUS) catheter, an intracardiac echocardiography (ICE) catheter, a transesophageal probe, an interventional probe and a surface probe.
6. The method of claim 1, wherein the physiology data is received from a physiology catheter constituting at least one of an electrophysiology (EP) catheter and a hemodynamic (HD) catheter.
7. The method of claim 1, further comprising receiving ECG signals from ECG leads provided on the surface of the subject; deriving cardiac cycle data based on one of the ECG signals and intracardiac signals obtained from an electrophysiology catheter; and utilizing the cardiac cycle data to synchronize the ultrasound images and physiology data.
8. The method of claim 1, further comprising tracking a position of an ultrasound probe and a physiology catheter, and generating tracking information denoting positions of the ultrasound probe and physiology catheter with respect to a common reference coordinate system.
9. The method of claim 1, further comprising registering the ultrasound image and physiology data within a common coordinate reference system.
10. The method of claim 1, further comprising receiving ECG signals from ECG leads placed on a subject, and generating timing information from the ECG signals, the timing information being representative of cyclical points in a subject's cardiac cycle.
11. The method of claim 1, further comprising generating and displaying new ultrasound images at a frame rate of at least seven frames per second.
12. The method of claim 1, further comprising forming a volumetric ultrasound data set for a series of the scan planes, the display image constituting a three-dimensional representation of the ultrasound image and physiology data.
13. The method of claim 1, wherein the ultrasound image and physiology data combined in the display image are obtained at a common time in a cyclical motion of the region of interest.
14. The method of claim 1, wherein the ultrasound image is representative of at least one of B-mode, power Doppler, color flow, M-mode, anatomic M-mode, ARFI mode, strain and strain rate information.
15. The method of claim 1, wherein the physiology data is denoted in the display image as at least one of gray scale information and color information combined with the ultrasound image.
16. The method of claim 1, further comprising accessing a lookup table based on the ultrasound image data and physiology data to define pixel values of the display image, the lookup table identifying pixel values to be used in the display image based on the ultrasound image data and physiology data.
17. The method of claim 1, further comprising presenting, in the display image, the ultrasound image data as gray scale information and the physiology data as color information.
18. The method of claim 1, further comprising providing a user interface that permits an operator to designate a point on the subject of interest and, in response to the user designation, presenting a graph of physiology data over a period of time associated with the designated point on the region of interest.
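Claims 15 through 17 above combine gray-scale ultrasound data with color-coded physiology data, and claim 16 does so through a lookup table that maps each (echo intensity, physiology value) pair to a display pixel. The following is a minimal sketch of that lookup-table idea only; the 8-bit value ranges, the red tint for physiologic activity, and all function names are illustrative assumptions, not details from the patent:

```python
import numpy as np

def build_lut():
    """256x256x3 lookup table: row = ultrasound echo intensity (0-255),
    column = normalized physiology value (0-255), entry = RGB display pixel."""
    lut = np.zeros((256, 256, 3), dtype=np.uint8)
    gray = np.arange(256, dtype=np.float32)[:, None]        # ultrasound axis
    phys = np.arange(256, dtype=np.float32)[None, :] / 255  # physiology axis, 0..1
    # Base layer is gray scale; the red channel rises with physiologic activity,
    # tinting active tissue while leaving quiescent tissue pure gray.
    lut[..., 0] = np.clip(gray + phys * 128, 0, 255)
    lut[..., 1] = gray
    lut[..., 2] = gray
    return lut

def combine(ultrasound, physiology, lut):
    """Map co-registered ultrasound and physiology arrays to a display image
    by indexing the lookup table with each pixel's value pair."""
    return lut[ultrasound, physiology]

lut = build_lut()
us = np.full((4, 4), 100, dtype=np.uint8)   # uniform echo intensity
ph = np.zeros((4, 4), dtype=np.uint8)       # no electrical activity
img = combine(us, ph, lut)                  # stays gray: R == G == B == 100
```

Precomputing the table keeps the per-frame cost to a single indexing operation, which matters for the real-time display rates the claims recite.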
19. A method of obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site, the method comprising the steps of:
receiving signals from an ultrasound probe located proximate the ablation site and, based thereon producing ultrasound data representative of a scan plane including the ablation site;
generating an ultrasound image based on the ultrasound data, the ultrasound image being representative of an anatomical structure of a portion of the ablation site contained in the scan plane;
receiving physiology signals from a physiology catheter located proximate the ablation site and, based thereon producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane;
forming a display image combining the ultrasound image and physiology data and saving the image in an electrophysiology recording system;
detecting a change in the subject of interest proximate the ablation site;
receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon producing a second ultrasound data representative of the scan plane including the ablation site;
generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of a portion of the ablation site contained in the scan plane;
receiving, in real-time or near real-time, physiology signals from the physiology catheter located proximate the ablation site and, based thereon producing a second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane;
forming, in real-time or near real-time, a second display image combining the second ultrasound image and second physiology data and saving the second image in the electrophysiology recording system; and
tracking a position of an ultrasound probe and a physiology catheter, and generating tracking information denoting positions of the ultrasound probe and physiology catheter with respect to a common reference coordinate system and registering the ultrasound image and physiology data within a common coordinate reference system.
20. The method of claim 19, further comprising the steps of comparing the first and second display images of the combined ultrasound image and physiology data, wherein the efficacy of the ablation procedure can be determined.
21. The method of claim 19, wherein detecting a change in the subject of interest includes determining tissue characterization as being one of conducting and non-conducting tissue.
22. The method of claim 19, wherein the display image includes physiology data mapped onto an anatomical structure contained in and defined by the ultrasound image.
23. The method of claim 19, wherein the ultrasound signal is received from an ultrasound probe constituting at least one of an intravascular ultrasound (IVUS) catheter, an intracardiac echocardiography (ICE) catheter, a transesophageal probe, an interventional probe and a surface probe.
24. The method of claim 19, wherein the physiology data is received from a physiology catheter constituting at least one of an electrophysiology (EP) catheter and a hemodynamic (HD) catheter.
25. The method of claim 19, further comprising receiving ECG signals from ECG leads provided on the surface of the subject; deriving cardiac cycle data based on one of the ECG signals and intracardiac signals obtained from an electrophysiology catheter; and utilizing the cardiac cycle data to synchronize the ultrasound images and physiology data.
26. The method of claim 19, further comprising receiving ECG signals from ECG leads placed on a subject, and generating timing information from the ECG signals, the timing information being representative of cyclical points in a subject's cardiac cycle.
27. The method of claim 19, further comprising generating and displaying new ultrasound images at a frame rate of at least seven frames per second.
28. The method of claim 19, further comprising forming a volumetric ultrasound data set for a series of the scan planes, the display image constituting a three-dimensional representation of the ultrasound image and physiology data.
29. The method of claim 19, wherein the ultrasound image and physiology data combined in the display image are obtained at a common time in a cyclical motion of the region of interest.
30. The method of claim 19, wherein the ultrasound image is representative of at least one of B-mode, power Doppler, color flow, M-mode, anatomic M-mode, ARFI mode, strain and strain rate information.
31. The method of claim 19, wherein the physiology data is denoted in the display image as at least one of gray scale information and color information combined with the ultrasound image.
32. The method of claim 19, further comprising accessing a lookup table based on the ultrasound image data and physiology data to define pixel values of the display image, the lookup table identifying pixel values to be used in the display image based on the ultrasound image data and physiology data.
33. The method of claim 19, further comprising presenting, in the display image, the ultrasound image data as gray scale information and the physiology data as color information.
34. The method of claim 19, further comprising providing a user interface that permits an operator to designate a point on the subject of interest and, in response to the user designation, presenting a graph of physiology data over a period of time associated with the designated point on the region of interest.
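The tracking-and-registration step recited in claim 19 (and separately in claims 8 and 9) amounts to expressing the ultrasound probe's data and the physiology catheter's data in one shared coordinate frame. A minimal sketch using 4x4 homogeneous transforms; the pose values, the choice of the tracking system's frame as reference, and the function names are illustrative assumptions:

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix and a
    translation vector (local device frame -> common reference frame)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def to_reference(points_local, device_pose):
    """Map an Nx3 array of points from a device's local frame into the
    common reference frame reported by the tracking system."""
    n = points_local.shape[0]
    homog = np.hstack([points_local, np.ones((n, 1))])  # append w=1
    return (device_pose @ homog.T).T[:, :3]

# Example: a physiology electrode at the catheter-tip origin, with the
# catheter translated 10 mm along x in the reference frame; the electrode
# therefore maps to (10, 0, 0), where it can be overlaid on ultrasound
# pixels registered into the same frame.
catheter_pose = pose_matrix(np.eye(3), [10.0, 0.0, 0.0])
electrode = np.array([[0.0, 0.0, 0.0]])
registered = to_reference(electrode, catheter_pose)
```

Once both devices report poses against the same tracker, registering physiology samples onto the ultrasound image reduces to applying each device's transform and comparing coordinates.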
35. A method of obtaining real-time or near real-time feedback as to the efficacy of an ablation procedure on a subject of interest at an ablation site, the method comprising the steps of:
receiving signals from an ultrasound probe located proximate the ablation site and, based thereon producing ultrasound data representative of a scan plane including the ablation site;
generating an ultrasound image based on the ultrasound data, the ultrasound image being representative of an anatomical structure of a portion of the ablation site contained in the scan plane;
receiving physiology signals from a physiology catheter located proximate the ablation site and, based thereon producing physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane;
forming a display image combining the ultrasound image and physiology data and saving the image in an electrophysiology recording system;
detecting a change in the subject of interest proximate the ablation site;
receiving, in real-time or near real-time, signals from the ultrasound probe located proximate the ablation site and, based thereon producing a second ultrasound data representative of the scan plane including the ablation site;
generating, in real-time or near real-time, a second ultrasound image based on the second ultrasound data, the second ultrasound image being representative of the change in the anatomical structure of a portion of the ablation site contained in the scan plane;
receiving, in real-time or near real-time, physiology signals from the physiology catheter located proximate the ablation site and, based thereon producing a second physiology data representative of physiologic activity of the portion of the ablation site contained in the scan plane;
forming, in real-time or near real-time, a second display image combining the second ultrasound image and second physiology data and saving the second image in the electrophysiology recording system; and
forming a volumetric ultrasound data set for a series of the scan planes, the display image constituting a three-dimensional representation of the ultrasound image and physiology data, wherein the ultrasound image and physiology data combined in the display image are obtained at a common time in a cyclical motion of the region of interest.
36. The method of claim 35, further comprising the steps of comparing the first and second display images of the combined ultrasound image and physiology data, wherein the efficacy of the ablation procedure can be determined.
37. The method of claim 35, wherein detecting a change in the subject of interest includes determining tissue characterization as being one of conducting and non-conducting tissue.
38. The method of claim 35, wherein the display image includes physiology data mapped onto an anatomical structure contained in and defined by the ultrasound image.
39. The method of claim 35, wherein the ultrasound signal is received from an ultrasound probe constituting at least one of an intravascular ultrasound (IVUS) catheter, an intracardiac echocardiography (ICE) catheter, a transesophageal probe, an interventional probe and a surface probe.
40. The method of claim 35, wherein the physiology data is received from a physiology catheter constituting at least one of an electrophysiology (EP) catheter and a hemodynamic (HD) catheter.
41. The method of claim 35, further comprising receiving ECG signals from ECG leads provided on the surface of the subject; deriving cardiac cycle data based on one of the ECG signals and intracardiac signals obtained from an electrophysiology catheter; and utilizing the cardiac cycle data to synchronize the ultrasound images and physiology data.
42. The method of claim 35, further comprising tracking a position of an ultrasound probe and a physiology catheter, and generating tracking information denoting positions of the ultrasound probe and physiology catheter with respect to a common reference coordinate system.
43. The method of claim 35, further comprising registering the ultrasound image and physiology data within a common coordinate reference system.
44. The method of claim 35, further comprising receiving ECG signals from ECG leads placed on a subject, and generating timing information from the ECG signals, the timing information being representative of cyclical points in a subject's cardiac cycle.
45. The method of claim 35, further comprising generating and displaying new ultrasound images at a frame rate of at least seven frames per second.
46. The method of claim 35, wherein the ultrasound image is representative of at least one of B-mode, power Doppler, color flow, M-mode, anatomic M-mode, ARFI mode, strain and strain rate information.
47. The method of claim 35, wherein the physiology data is denoted in the display image as at least one of gray scale information and color information combined with the ultrasound image.
48. The method of claim 35, further comprising accessing a lookup table based on the ultrasound image data and physiology data to define pixel values of the display image, the lookup table identifying pixel values to be used in the display image based on the ultrasound image data and physiology data.
49. The method of claim 35, further comprising presenting, in the display image, the ultrasound image data as gray scale information and the physiology data as color information.
50. The method of claim 35, further comprising providing a user interface that permits an operator to designate a point on the subject of interest and, in response to the user designation, presenting a graph of physiology data over a period of time associated with the designated point on the region of interest.
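Claims 7, 25, and 41 synchronize the ultrasound images and physiology data using cardiac-cycle timing derived from ECG or intracardiac signals, and claims 13, 29, and 35 pair data acquired at a common point in the cyclical motion. A minimal gating sketch of that synchronization; the R-wave timestamps, the 5% phase tolerance, and the simple list pairing are illustrative assumptions:

```python
import bisect

def cardiac_phase(t, r_times):
    """Fraction [0, 1) of the cardiac cycle elapsed at time t, given a
    sorted list of detected R-wave times; t must lie within r_times."""
    i = bisect.bisect_right(r_times, t) - 1
    cycle = r_times[i + 1] - r_times[i]
    return (t - r_times[i]) / cycle

def gated_pairs(frame_times, phys_times, r_times, target=0.0, tol=0.05):
    """Keep only ultrasound frames and physiology samples acquired near the
    target cardiac phase, then pair them in acquisition order."""
    frames = [t for t in frame_times
              if abs(cardiac_phase(t, r_times) - target) < tol]
    phys = [t for t in phys_times
            if abs(cardiac_phase(t, r_times) - target) < tol]
    return list(zip(frames, phys))

r_times = [0.0, 0.8, 1.6, 2.4]           # detected R waves (seconds)
frame_times = [0.02, 0.42, 0.81, 1.21]   # ultrasound frame timestamps
phys_times = [0.01, 0.40, 0.82, 1.20]    # physiology sample timestamps
pairs = gated_pairs(frame_times, phys_times, r_times)
# Only the acquisitions just after an R wave (phase near 0) survive gating.
```

Gating to a common phase ensures that the before and after display images compared in claims 2, 20, and 36 show the anatomy at the same point of its cyclical motion, so differences reflect the ablation rather than cardiac movement.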
US11/312,023 2005-08-16 2005-12-20 Clinical feedback of ablation efficacy during ablation procedure Abandoned US20070049827A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/312,023 US20070049827A1 (en) 2005-08-16 2005-12-20 Clinical feedback of ablation efficacy during ablation procedure

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/204,711 US7740584B2 (en) 2005-08-16 2005-08-16 Method and system for mapping physiology information onto ultrasound-based anatomic structure
US11/312,023 US20070049827A1 (en) 2005-08-16 2005-12-20 Clinical feedback of ablation efficacy during ablation procedure

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/204,711 Continuation-In-Part US7740584B2 (en) 2005-08-16 2005-08-16 Method and system for mapping physiology information onto ultrasound-based anatomic structure

Publications (1)

Publication Number Publication Date
US20070049827A1 true US20070049827A1 (en) 2007-03-01

Family

ID=37440973

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/204,711 Active 2026-09-12 US7740584B2 (en) 2005-08-16 2005-08-16 Method and system for mapping physiology information onto ultrasound-based anatomic structure
US11/312,023 Abandoned US20070049827A1 (en) 2005-08-16 2005-12-20 Clinical feedback of ablation efficacy during ablation procedure

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/204,711 Active 2026-09-12 US7740584B2 (en) 2005-08-16 2005-08-16 Method and system for mapping physiology information onto ultrasound-based anatomic structure

Country Status (3)

Country Link
US (2) US7740584B2 (en)
DE (1) DE112006002162T5 (en)
WO (1) WO2007021511A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090111724A1 (en) * 2007-06-20 2009-04-30 Kaaret Thomas W Natural Cleaning Compositions
US20090270731A1 (en) * 2008-04-24 2009-10-29 Boston Scientific Scimed, Inc Methods, systems, and devices for tissue characterization by spectral similarity of intravascular ultrasound signals
US20100268072A1 (en) * 2007-11-15 2010-10-21 Koninklijke Philips Electronics N.V. Method and apparatus for positional tracking of therapeutic ultrasound transducer
WO2011062681A1 (en) * 2009-11-20 2011-05-26 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US20140128735A1 (en) * 2012-11-02 2014-05-08 Cardiac Science Corporation Wireless real-time electrocardiogram and medical image integration
US20140301621A1 (en) * 2013-04-03 2014-10-09 Toshiba Medical Systems Corporation Image processing apparatus, image processing method and medical imaging device
WO2015077474A1 (en) * 2013-11-20 2015-05-28 The George Washington University Systems and methods for hyperspectral analysis of cardiac tissue
US9144461B2 (en) 2008-12-03 2015-09-29 Koninklijke Philips N.V. Feedback system for integrating interventional planning and navigation
US9549713B2 (en) 2008-04-24 2017-01-24 Boston Scientific Scimed, Inc. Methods, systems, and devices for tissue characterization and quantification using intravascular ultrasound signals
US20180140281A1 (en) * 2015-08-21 2018-05-24 Fujifilm Corporation Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus
US10076238B2 (en) 2011-09-22 2018-09-18 The George Washington University Systems and methods for visualizing ablated tissue
US10143517B2 (en) 2014-11-03 2018-12-04 LuxCath, LLC Systems and methods for assessment of contact quality
US10456105B2 (en) 2015-05-05 2019-10-29 Boston Scientific Scimed, Inc. Systems and methods with a swellable material disposed over a transducer of an ultrasound imaging system
CN111093516A (en) * 2017-11-21 2020-05-01 深圳迈瑞生物医疗电子股份有限公司 Ultrasound system and method for planning ablation
US10722301B2 (en) 2014-11-03 2020-07-28 The George Washington University Systems and methods for lesion assessment
US10736512B2 (en) 2011-09-22 2020-08-11 The George Washington University Systems and methods for visualizing ablated tissue
US10779904B2 (en) 2015-07-19 2020-09-22 460Medical, Inc. Systems and methods for lesion formation and assessment
US10881376B2 (en) 2017-11-08 2021-01-05 Biosense Webster (Israel) Ltd. System and method for providing auditory guidance in medical systems
CN112312840A (en) * 2018-06-22 2021-02-02 皇家飞利浦有限公司 Intravascular ultrasound location identification
US20210137488A1 (en) * 2019-11-12 2021-05-13 Biosense Webster (Israel) Ltd. Historical ultrasound data for display of live location data
US11096584B2 (en) 2013-11-14 2021-08-24 The George Washington University Systems and methods for determining lesion depth using fluorescence imaging
US11185720B2 (en) * 2014-10-17 2021-11-30 Koninklijke Philips N.V. Ultrasound patch for ultrasound hyperthermia and imaging
US20210393233A1 (en) * 2018-11-15 2021-12-23 Koninklijke Philips N.V. Simultaneous sensor tracking in medical interventions

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100569186C (en) * 2002-03-15 2009-12-16 比约恩·A·J·安杰尔森 Ultrasonic imaging method and system, ultrasound transducer array and probe
CA2659898C (en) 2006-08-03 2017-08-29 Christoph Scharf Method and device for determining and presenting surface charge and dipole densities on cardiac walls
US20080146942A1 (en) * 2006-12-13 2008-06-19 Ep Medsystems, Inc. Catheter Position Tracking Methods Using Fluoroscopy and Rotational Sensors
US7925068B2 (en) * 2007-02-01 2011-04-12 General Electric Company Method and apparatus for forming a guide image for an ultrasound image scanner
US9629571B2 (en) 2007-03-08 2017-04-25 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US8781193B2 (en) 2007-03-08 2014-07-15 Sync-Rx, Ltd. Automatic quantitative vessel analysis
WO2012176191A1 (en) 2011-06-23 2012-12-27 Sync-Rx, Ltd. Luminal background cleaning
US11197651B2 (en) 2007-03-08 2021-12-14 Sync-Rx, Ltd. Identification and presentation of device-to-vessel relative motion
US9968256B2 (en) 2007-03-08 2018-05-15 Sync-Rx Ltd. Automatic identification of a tool
US10716528B2 (en) 2007-03-08 2020-07-21 Sync-Rx, Ltd. Automatic display of previously-acquired endoluminal images
US9375164B2 (en) 2007-03-08 2016-06-28 Sync-Rx, Ltd. Co-use of endoluminal data and extraluminal imaging
US11064964B2 (en) 2007-03-08 2021-07-20 Sync-Rx, Ltd Determining a characteristic of a lumen by measuring velocity of a contrast agent
WO2014002095A2 (en) 2012-06-26 2014-01-03 Sync-Rx, Ltd. Flow-related image processing in luminal organs
JP5639764B2 (en) 2007-03-08 2014-12-10 シンク−アールエックス,リミティド Imaging and tools for use with moving organs
US9173638B2 (en) 2007-06-04 2015-11-03 Biosense Webster, Inc. Cardiac mechanical assessment using ultrasound
CN101820819A (en) * 2007-10-10 2010-09-01 皇家飞利浦电子股份有限公司 Supersonic communication via wave point and patient monitor
US8906011B2 (en) 2007-11-16 2014-12-09 Kardium Inc. Medical device for use in bodily lumens, for example an atrium
EP2252203A2 (en) 2008-01-17 2010-11-24 Christoph Scharf A device and method for the geometric determination of electrical dipole densities on the cardiac wall
ES2450391T3 (en) 2008-06-19 2014-03-24 Sync-Rx, Ltd. Progressive progress of a medical instrument
US8855744B2 (en) 2008-11-18 2014-10-07 Sync-Rx, Ltd. Displaying a device within an endoluminal image stack
US9974509B2 (en) 2008-11-18 2018-05-22 Sync-Rx Ltd. Image super enhancement
US11064903B2 (en) 2008-11-18 2021-07-20 Sync-Rx, Ltd Apparatus and methods for mapping a sequence of images to a roadmap image
US9101286B2 (en) 2008-11-18 2015-08-11 Sync-Rx, Ltd. Apparatus and methods for determining a dimension of a portion of a stack of endoluminal data points
US9095313B2 (en) 2008-11-18 2015-08-04 Sync-Rx, Ltd. Accounting for non-uniform longitudinal motion during movement of an endoluminal imaging probe
US10362962B2 (en) 2008-11-18 2019-07-30 Synx-Rx, Ltd. Accounting for skipped imaging locations during movement of an endoluminal imaging probe
US9144394B2 (en) 2008-11-18 2015-09-29 Sync-Rx, Ltd. Apparatus and methods for determining a plurality of local calibration factors for an image
KR101302384B1 (en) * 2010-07-19 2013-09-02 삼성메디슨 주식회사 Ultrasonic Diagnostic Apparatus and the Method thereof
US9486273B2 (en) 2011-01-21 2016-11-08 Kardium Inc. High-density electrode-based medical device system
CA2764494A1 (en) 2011-01-21 2012-07-21 Kardium Inc. Enhanced medical device for use in bodily cavities, for example an atrium
WO2012122517A2 (en) 2011-03-10 2012-09-13 Acutus Medical, Inc. Device and method for the geometric determination of electrical dipole densities on the cardiac wall
US11109835B2 (en) 2011-12-18 2021-09-07 Metritrack Llc Three dimensional mapping display system for diagnostic ultrasound machines
WO2013101562A2 (en) * 2011-12-18 2013-07-04 Metritrack, Llc Three dimensional mapping display system for diagnostic ultrasound machines
KR102114417B1 (en) 2013-06-11 2020-05-22 삼성메디슨 주식회사 Method and apparatus for image registration
AU2014318872B2 (en) 2013-09-13 2018-09-13 Acutus Medical, Inc. Devices and methods for determination of electrical dipole densities on a cardiac surface
WO2015148470A1 (en) 2014-03-25 2015-10-01 Acutus Medical, Inc. Cardiac analysis user interface system and method
CN107106124B (en) 2014-11-18 2021-01-08 C·R·巴德公司 Ultrasound imaging system with automatic image rendering
CN106999146B (en) 2014-11-18 2020-11-10 C·R·巴德公司 Ultrasound imaging system with automatic image rendering
US11304676B2 (en) 2015-01-23 2022-04-19 The University Of North Carolina At Chapel Hill Apparatuses, systems, and methods for preclinical ultrasound imaging of subjects
CN107847173B (en) * 2015-05-12 2022-08-30 阿库图森医疗有限公司 Ultrasonic sequencing system and method
WO2018017717A1 (en) 2016-07-19 2018-01-25 Shifamed Holdings, Llc Medical devices and methods of use
WO2018093911A1 (en) * 2016-11-16 2018-05-24 Kusumoto Walter Electrophysiology mapping with echo probe data
EP3599977A4 (en) 2017-03-30 2020-12-30 Shifamed Holdings, LLC Medical tool positioning devices, systems, and methods of use and manufacture
JP7404369B2 (en) 2018-08-23 2023-12-25 ヌベラ・メディカル・インコーポレイテッド Medical device positioning devices, systems, and methods of use and manufacture
WO2020102389A1 (en) 2018-11-13 2020-05-22 Shifamed Holdings, Llc Medical tool positioning devices, systems, and methods of use and manufacture
US20230329678A1 (en) * 2022-04-14 2023-10-19 Biosense Webster (Israel) Ltd. Augmented ultrasonic images

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5391199A (en) * 1993-07-20 1995-02-21 Biosense, Inc. Apparatus and method for treating cardiac arrhythmias
US5588432A (en) * 1988-03-21 1996-12-31 Boston Scientific Corporation Catheters for imaging, sensing electrical potentials, and ablating tissue
US5657760A (en) * 1994-05-03 1997-08-19 Board Of Regents, The University Of Texas System Apparatus and method for noninvasive doppler ultrasound-guided real-time control of tissue damage in thermal therapy
US5788636A (en) * 1997-02-25 1998-08-04 Acuson Corporation Method and system for forming an ultrasound image of a tissue while simultaneously ablating the tissue
US6575901B2 (en) * 2000-12-29 2003-06-10 Ge Medical Systems Information Technologies Distributed real time replication-based annotation and documentation system for cardiology procedures
US6643535B2 (en) * 1999-05-26 2003-11-04 Endocare, Inc. System for providing computer guided ablation of tissue
US20040078036A1 (en) * 2002-10-21 2004-04-22 Yaron Keidar Real-time monitoring and mapping of ablation lesion formation in the heart
US20040080336A1 (en) * 2002-06-28 2004-04-29 Nec Electronics Corporation Output buffer apparatus capable of adjusting output impedance in synchronization with data signal
US20040127798A1 (en) * 2002-07-22 2004-07-01 Ep Medsystems, Inc. Method and system for using ultrasound in cardiac diagnosis and therapy
US20050013473A1 (en) * 2003-07-18 2005-01-20 Furnas William J. Container inspection machine

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2509255A1 (en) * 1981-07-08 1983-01-14 Centre Nat Etd Spatiales METHOD FOR ALTITUDE STABILIZING A BALLOON, AND ATMOSPHERIC BALLOONS SUITABLE FOR THE IMPLEMENTATION OF SAID METHOD
JPH0787588B2 (en) * 1987-12-29 1995-09-20 富士写真フイルム株式会社 Method and device for automatic white balance adjustment
JPH0828877B2 (en) * 1988-07-18 1996-03-21 富士写真フイルム株式会社 Method and device for automatic white balance adjustment
US5200269A (en) 1990-06-01 1993-04-06 E. I. Du Pont De Nemours And Company Apparatus and method for baling cut fibers and product
US5329361A (en) * 1991-09-04 1994-07-12 Fuji Photo Film Co., Ltd. White balance control wherein valves of control signals are fixed or converged within a renewable, variable region
US5445150A (en) 1991-11-18 1995-08-29 General Electric Company Invasive system employing a radiofrequency tracking system
US5662108A (en) 1992-09-23 1997-09-02 Endocardial Solutions, Inc. Electrophysiology mapping system
US5687737A (en) 1992-10-09 1997-11-18 Washington University Computerized three-dimensional cardiac mapping with interactive visual displays
US5409000A (en) 1993-09-14 1995-04-25 Cardiac Pathways Corporation Endocardial mapping and ablation system utilizing separately controlled steerable ablation catheter with ultrasonic imaging capabilities and method
US5409007A (en) * 1993-11-26 1995-04-25 General Electric Company Filter to reduce speckle artifact in ultrasound imaging
JP3550440B2 (en) * 1995-04-13 2004-08-04 イーストマン・コダックジャパン株式会社 Auto white balance adjustment device
US6421083B1 (en) * 1996-03-29 2002-07-16 Sony Corporation Color imaging device and method
US6019725A (en) 1997-03-07 2000-02-01 Sonometrics Corporation Three-dimensional tracking and imaging system
US6490474B1 (en) 1997-08-01 2002-12-03 Cardiac Pathways Corporation System and method for electrode localization using ultrasound
US6086532A (en) 1997-09-26 2000-07-11 Ep Technologies, Inc. Systems for recording use of structures deployed in association with heart tissue
US6200269B1 (en) 1998-05-28 2001-03-13 Diasonics, Ultrasound, Inc. Forward-scanning ultrasound catheter probe
JP2000152018A (en) * 1998-08-31 2000-05-30 Canon Inc Method and device for picture processing and recording medium
US6556695B1 (en) 1999-02-05 2003-04-29 Mayo Foundation For Medical Education And Research Method for producing high resolution real-time images, of structure and function during medical procedures
US6413219B1 (en) 1999-03-31 2002-07-02 General Electric Company Three-dimensional ultrasound data display using multiple cut planes
US6447450B1 (en) 1999-11-02 2002-09-10 Ge Medical Systems Global Technology Company, Llc ECG gated ultrasonic image compounding
WO2001043640A2 (en) * 1999-12-15 2001-06-21 Koninklijke Philips Electronics N.V. Diagnostic imaging system with ultrasound probe
US8221402B2 (en) * 2000-01-19 2012-07-17 Medtronic, Inc. Method for guiding a medical device
US6650927B1 (en) 2000-08-18 2003-11-18 Biosense, Inc. Rendering of diagnostic imaging data on a three-dimensional map
US20040152974A1 (en) * 2001-04-06 2004-08-05 Stephen Solomon Cardiology mapping and navigation system
US7314446B2 (en) * 2002-07-22 2008-01-01 Ep Medsystems, Inc. Method and apparatus for time gating of medical images
US7599730B2 (en) * 2002-11-19 2009-10-06 Medtronic Navigation, Inc. Navigation system for cardiac therapies
JP4200824B2 (en) * 2003-06-18 2008-12-24 コニカミノルタエムジー株式会社 Color image output apparatus, image data control program, and storage medium storing the program
US20050096543A1 (en) * 2003-11-03 2005-05-05 Jackson John I. Motion tracking for medical imaging
US7486819B2 (en) * 2003-12-23 2009-02-03 Aptina Imaging Corporation Sampling images for color balance information
EP1838378B1 (en) * 2005-01-18 2017-03-22 Philips Electronics LTD Apparatus for guiding an instrument to a target in the lung
US7517318B2 (en) * 2005-04-26 2009-04-14 Biosense Webster, Inc. Registration of electro-anatomical map with pre-acquired image using ultrasound
US10143398B2 (en) * 2005-04-26 2018-12-04 Biosense Webster, Inc. Registration of ultrasound data with pre-acquired image

US9173611B2 (en) 2009-11-20 2015-11-03 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US20110125150A1 (en) * 2009-11-20 2011-05-26 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
WO2011062681A1 (en) * 2009-11-20 2011-05-26 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US11324550B2 (en) 2009-11-20 2022-05-10 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US8454589B2 (en) 2009-11-20 2013-06-04 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US10130419B2 (en) 2009-11-20 2018-11-20 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for assessing effective delivery of ablation therapy
US10736512B2 (en) 2011-09-22 2020-08-11 The George Washington University Systems and methods for visualizing ablated tissue
US11559192B2 (en) 2011-09-22 2023-01-24 The George Washington University Systems and methods for visualizing ablated tissue
US10076238B2 (en) 2011-09-22 2018-09-18 The George Washington University Systems and methods for visualizing ablated tissue
US10716462B2 (en) 2011-09-22 2020-07-21 The George Washington University Systems and methods for visualizing ablated tissue
US20140128735A1 (en) * 2012-11-02 2014-05-08 Cardiac Science Corporation Wireless real-time electrocardiogram and medical image integration
US10282631B2 (en) * 2013-04-03 2019-05-07 Toshiba Medical Systems Corporation Image processing apparatus, image processing method and medical imaging device
US20140301621A1 (en) * 2013-04-03 2014-10-09 Toshiba Medical Systems Corporation Image processing apparatus, image processing method and medical imaging device
US11096584B2 (en) 2013-11-14 2021-08-24 The George Washington University Systems and methods for determining lesion depth using fluorescence imaging
CN105744883A (en) * 2013-11-20 2016-07-06 乔治华盛顿大学 Systems and methods for hyperspectral analysis of cardiac tissue
US11457817B2 (en) 2013-11-20 2022-10-04 The George Washington University Systems and methods for hyperspectral analysis of cardiac tissue
WO2015077474A1 (en) * 2013-11-20 2015-05-28 The George Washington University Systems and methods for hyperspectral analysis of cardiac tissue
US11185720B2 (en) * 2014-10-17 2021-11-30 Koninklijke Philips N.V. Ultrasound patch for ultrasound hyperthermia and imaging
US10682179B2 (en) 2014-11-03 2020-06-16 460Medical, Inc. Systems and methods for determining tissue type
US10143517B2 (en) 2014-11-03 2018-12-04 LuxCath, LLC Systems and methods for assessment of contact quality
US11596472B2 (en) 2014-11-03 2023-03-07 460Medical, Inc. Systems and methods for assessment of contact quality
US11559352B2 (en) 2014-11-03 2023-01-24 The George Washington University Systems and methods for lesion assessment
US10722301B2 (en) 2014-11-03 2020-07-28 The George Washington University Systems and methods for lesion assessment
US10456105B2 (en) 2015-05-05 2019-10-29 Boston Scientific Scimed, Inc. Systems and methods with a swellable material disposed over a transducer of an ultrasound imaging system
US10779904B2 (en) 2015-07-19 2020-09-22 460Medical, Inc. Systems and methods for lesion formation and assessment
US20180140281A1 (en) * 2015-08-21 2018-05-24 Fujifilm Corporation Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus
US11666310B2 (en) * 2015-08-21 2023-06-06 Fujifilm Corporation Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus using predetermined imaging conditions for B-mode image generation
US10881376B2 (en) 2017-11-08 2021-01-05 Biosense Webster (Israel) Ltd. System and method for providing auditory guidance in medical systems
US11678863B2 (en) 2017-11-08 2023-06-20 Biosense Webster (Israel) Ltd. System and method for providing auditory guidance in medical systems
CN111093516A (en) * 2017-11-21 2020-05-01 深圳迈瑞生物医疗电子股份有限公司 Ultrasound system and method for planning ablation
CN112312840A (en) * 2018-06-22 2021-02-02 皇家飞利浦有限公司 Intravascular ultrasound location identification
US11660064B2 (en) * 2018-06-22 2023-05-30 Koninklijke Philips N.V. Intravascular ultrasound position identification
US20210393233A1 (en) * 2018-11-15 2021-12-23 Koninklijke Philips N.V. Simultaneous sensor tracking in medical interventions
US11944487B2 (en) * 2018-11-15 2024-04-02 Koninklijke Philips N.V. Simultaneous sensor tracking in medical interventions
US20210137488A1 (en) * 2019-11-12 2021-05-13 Biosense Webster (Israel) Ltd. Historical ultrasound data for display of live location data

Also Published As

Publication number Publication date
US20070055150A1 (en) 2007-03-08
WO2007021511A1 (en) 2007-02-22
DE112006002162T5 (en) 2008-07-03
US7740584B2 (en) 2010-06-22

Similar Documents

Publication Publication Date Title
US20070049827A1 (en) Clinical feedback of ablation efficacy during ablation procedure
US20070016029A1 (en) Physiology workstation with real-time fluoroscopy and ultrasound imaging
US7270634B2 (en) Guidance of invasive medical devices by high resolution three dimensional ultrasonic imaging
CN103889337B (en) Diagnostic ultrasound equipment and ultrasonic diagnosis apparatus control method
US11730446B2 (en) Ultrasound based three-dimensional lesion verification within a vasculature
Fenster et al. 3-D ultrasound imaging: A review
US7529393B2 (en) Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
US9697634B2 (en) Ultrasound image processing to render three-dimensional images from two-dimensional images
US7796789B2 (en) Guidance of invasive medical devices by three dimensional ultrasonic imaging
JP5566580B2 (en) Mechanical evaluation of the heart using ultrasound
US20060270934A1 (en) Guidance of invasive medical devices with combined three dimensional ultrasonic imaging system
WO2003077765A1 (en) Ultrasonographic system and ultrasonography
JP3946815B2 (en) Ultrasonic diagnostic equipment
JP2001518342A (en) Method and apparatus for calculating and displaying ultrasound imaging strain in real time
US20190231316A1 (en) Diagnosis and monitoring of myocardial infarction using ecg data for treatment with sonoreperfusion ultrasound
US20190247066A1 (en) Treatment of myocardial infarction using sonothrombolytic ultrasound
EP4108181A2 (en) Estimating strain on tissue using 4d ultrasound catheter
Dickie et al. A flexible research interface for collecting clinical ultrasound images
Pelissier et al. A Flexible Research Interface for Collecting Clinical Ultrasound Images
WO2004084736A1 (en) Guidance of invasive medical devices by three dimensional ultrasonic imaging

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DONALDSON, BRENDA L.;REEL/FRAME:017383/0101

Effective date: 20051216

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION