US20030065260A1 - Identification and quantification of needle and seed displacement departures from treatment plan - Google Patents


Info

Publication number
US20030065260A1
US20030065260A1 (application US10/193,718)
Authority
US
United States
Prior art keywords
image
needle
images
ultrasound
therapeutic agents
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/193,718
Inventor
Gang Cheng
Yan Yu
Haisong Liu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alpha Intervention Technology Inc
Original Assignee
Alpha Intervention Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alpha Intervention Technology Inc filed Critical Alpha Intervention Technology Inc
Priority to US10/193,718
Assigned to ALPHA INTERVENTION TECHNOLOGY, INC. Assignors: CHENG, GANG; LIU, HAISONG; YU, YAN (assignment of assignors' interest; see document for details)
Publication of US20030065260A1
Legal status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/103 Treatment planning systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2059 Mechanical position encoders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1001 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy using radiation sources introduced into or applied onto the body; brachytherapy
    • A61N5/1007 Arrangements or means for the introduction of sources into the body
    • A61N2005/1011 Apparatus for permanent insertion of sources
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1001 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy using radiation sources introduced into or applied onto the body; brachytherapy
    • A61N5/1027 Interstitial radiation therapy

Definitions

  • the present invention is directed to an improvement to treatment plans using brachytherapy or the like and more specifically to a technique for rapid and accurate identification and quantification of needle placement departures from such a treatment plan.
  • the present invention is also directed to improving the brachytherapy treatment quality by realizing intraoperative seed (or other radioactive source or therapeutic agent) localization, dosimetry evaluation and feedback optimization.
  • the present invention also relates to medical ultrasound imaging of the prostate, and more particularly to a method and apparatus for generating 3D orthogonal compound ultrasound (OCU) images for the diagnosis and treatment of prostate cancer.
  • the number of those holes that intersect the prostate cross section, and are therefore potentially usable, is typically about 60.
  • the number of mathematical combinations is therefore greatly in excess of 10^16, each of which is a potential treatment plan but is associated with different degrees of cancer control and a different likelihood of treatment complications [Y. Yu and M. C. Schell, “A genetic algorithm for the optimization of prostate implants,” Med. Phys. 23, 2085-2091 (1996)].
  • Such treatment planning systems enable the user to manually outline the prostate in relation to a template grid, to turn on or off any available needle positions and seed positions within each needle, and to examine the resultant dose distribution in two or three dimensions.
  • Examples of such planning systems include those offered by Multimedia Medical Systems (MMS) of Charlottesville, Va., SSGI Prowess, of Chico, Calif., Nucletron Plato, from Columbia, Md., Computerized Medical Systems (CMS) Focus, of St Louis, Mo., Radiation Oncology Computer Systems (ROCS), of Carlsbad, Calif., ADAC Laboratory's Pinnacle, of Milpitas, Calif. and Theraplan, available from Theratronics International Ltd. of Kanata, Ontario, Canada.
  • An implant planning engine plans implants for radiotherapy, e.g., prostate brachytherapy.
  • the system optimizes intraoperative treatment planning on a real-time basis using a synergistic formulation of a genetic algorithm, multi-objective decision theory and a statistical sensitivity analysis.
  • TRUS: transrectal ultrasound.
  • the current imaging modality is a single scan using either a transverse-imaging or a longitudinal-imaging ultrasound transducer; thus the image quality is inadequate for clearly visualizing tumors, prostate anatomy, and implanted radioactive sources because of the noise, speckle, and shadows in the TRUS images.
  • Siemens Medical Systems, Inc. holds a patent entitled “3-Dimensional compound ultrasound field of view” (U.S. Pat. No. 5,655,535).
  • their compound field of view ultrasound image is derived from correlated frames of ultrasound image data, while ours is uncorrelated. They used frames of sensed echo signals to detect probe motion without the use of a dedicated position sensor or motion sensor, which also differs from our sensor-based position and orientation detection method.
  • their purpose of compounding is to increase the limited field of view of the ultrasound transducer, not to improve the image quality of the ultrasound.
  • CT: computed tomography
  • MRI: magnetic resonance imaging
  • a first embodiment of the present invention is directed to a technique for identifying and quantifying needle and seed displacement departures from a placement plan for the placement of radioactive sources in a prostate or other internal organ for brachytherapy or the like.
  • the placement plan is made available to an intraoperative tracking interface, which also shows a live ultrasound image of the needle or catheter placement in the prostate.
  • the needle or catheter placement in live ultrasound images can be located manually by clicking a mouse or semi-automatically by image segmentation algorithms.
  • the manual or semi-automatic segmentation methods depend upon the contrast between the background image and the brightness of the needle tips.
  • the difference in the x-y plane between the planned and actual locations of the needle or catheter is calculated, and from that difference, the error in position of each seed is calculated from the implantation pattern of that needle.
  • the seeds are moved, or the operator changes the number of seeds, and the dose is recalculated.
  • a small three-dimensional column of ultrasound images is taken along the track of the needle, and each seed located in the column of images is automatically recognized and given a confidence level.
  • the confidence level shows the accuracy of the seed recognition in view of the strong noise interference in the transrectal ultrasound images.
  • the dosimetry is recalculated based on the recognized seeds whose confidence levels exceed a threshold preset by the operator.
  • fluoroscopic x-rays are taken, and the seed coordinates found in the transrectal ultrasound images are matched to the seed coordinates found in the fluoroscopic x-ray images. Seed locations with low confidence levels are adjusted based on the x-ray locations, because the x-ray images can provide an accurate planar projection of each seed, and the dosimetry is recalculated based on those revised seed positions.
  • the technique is carried out through the following steps.
  • the needle/catheter placement plan is made available to an intraoperative tracking interface. That interface contains an electronic worksheet of needle and seed coordinates, a live ultrasound image window into which real-time video image of needle/catheter placement is fed, and a series of isodose dosimetry panels reflecting the current state of dose coverage.
  • Each of the needles/catheters can be activated by highlighting the corresponding row in the coordinates worksheet, or by highlighting the corresponding grid location graphically.
  • a hyperechoic spot appears on the live ultrasound image. The location of the hyperechoic spot is identified manually by the operator or by a semi-automatic thresholding method. The difference in the x-y plane between the planned location and the actual location of the needle/catheter is calculated to give errors Δx and Δy. The errors Δx and Δy are then reflected on the grid location and worksheet.
  • the errors of each seed, Δx′ and Δy′, are calculated based on straight-line interpolation at the planned z location of the seed; the straight line is constructed by joining two known points: (a) the actual needle location shown on ultrasound at the known z plane, and (b) the template coordinate outside the patient body, through which the needle is inserted under precision template guidance (at that location Δx and Δy are therefore assumed to equal zero), as sketched below.
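To make the interpolation concrete, here is a minimal sketch in Python. It assumes the needle lies on a straight line between the template plane (zero error) and the imaged plane; the function and parameter names are hypothetical, not from the patent.

```python
import numpy as np

def seed_offsets(dx, dy, z_needle, z_template, z_seeds):
    """Interpolate per-seed placement errors along a straight needle track.

    dx, dy     : needle errors observed on ultrasound at plane z_needle
    z_template : z of the template outside the body, where the error is
                 assumed to be zero (precision template guidance)
    z_seeds    : planned z locations of the seeds on this needle
    Returns arrays of per-seed errors (dx', dy').
    """
    z_seeds = np.asarray(z_seeds, dtype=float)
    # fractional position of each seed between the template and imaged planes;
    # values beyond 1 extrapolate along the same straight line
    t = (z_seeds - z_template) / (z_needle - z_template)
    return dx * t, dy * t

# example: tip error (2.0, -1.0) mm seen at z = 60 mm, template at z = 0
dxp, dyp = seed_offsets(2.0, -1.0, 60.0, 0.0, [10.0, 25.0, 40.0, 55.0])
```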
  • the dose is then recalculated by moving the seeds along the activated needle/catheter in x and y by the amounts Δx′ and Δy′, which may be the same or different for each and every seed.
  • the dosimetry updated by such feedback of seed placement errors is redisplayed on the series of isodose panels.
  • the operator is permitted to change the number of seeds deposited by the needle/catheter in question. In that case, the operator is required to enter the new seed locations along the needle/catheter, which overrides the original treatment plan. Seed placement errors in such a case are tracked identically to the procedure described above.
  • a small 3D column of ultrasound images is acquired along the needle track as constructed above. That column can be perpendicular to the x-y plane, or in fact may often sustain an angle from each of the x and y planes.
  • the exact number of seeds as deposited is identified using image processing and recognition algorithms in that column of 3D ultrasound region of interest. Each seed identified in that manner is assigned a confidence level, which depicts the likelihood/uncertainty of seed localization.
  • the size of that column is initially set small; if the total number of seeds found in that manner is not equal to the number of seeds deposited by the given needle/catheter due to the misplacement of seeds, the width of the column is adaptively adjusted to find the lost seeds (e.g., the width is increased to find additional seeds).
  • the ultrasound step quantifies Δz′ for each seed and at the same time further corrects Δx′ and Δy′. If the confidence level of a given seed's localization exceeds a threshold value (to be set by the operator), the dosimetry is re-calculated yet again using the updated seed location and displayed in the same isodose panels. The isodose calculated is assigned a confidence level, which is a numerical composite of the individual confidence levels of the seeds and the dosimetric impact of positional uncertainties at each seed location (e.g., in a high dose region, positional uncertainty has low impact).
  • fluoroscopic x-ray images may be taken in the anterior-posterior direction and at up to ±45 degrees on either side of the anterior-posterior direction.
  • the seed coordinates as determined above with ultrasound images are two-dimensionally projected in the same orientations.
  • a best match to the x-ray seed projections is made based on multiple point matching using those seed identifications with the highest confidence levels.
  • the seed locations with low confidence levels are adjusted based on the x-ray locations. As a result, the confidence levels of those latter seeds are increased by an amount reflective of the best match quality.
  • the dosimetry is recalculated.
  • the confidence level of the dosimetry is updated using updated confidence levels of the seeds.
  • 3D TRUS images are usually created by collecting a series of 2D image slices at evenly spaced intervals in real time during a transducer scan, in combination with detection of the position and/or orientation of the transducer.
  • One approach is mounting the transducer in a computer-controlled motor-driven assembly.
  • Another approach is attaching a sensor to the transducer and measuring the relative position and orientation from the sensor to a transmitter by using non-contact sensing techniques.
  • EM: electromagnetic
  • Linear scanning uses a transducer with a curvilinear transverse array that is pulled back manually or mechanically in the cranio-caudal direction to acquire images of transverse (axial) view of the prostate, whereas the rotational scanning technique rotates a transducer with a side-firing array manually or mechanically to acquire the longitudinal images.
  • the linear scanning technique is now widely used because therapy guidance requires that the transverse positions of the needles are clearly visible.
  • that scanning method has two disadvantages.
  • One is that the edges of the base and apex of the prostate are nearly invisible on the transverse images because of the complexity of the human anatomy in those two areas; that leads to inaccuracies in volume estimates and dosimetric planning.
  • the other is that the implanted seeds cannot be fully localized because some seeds are invisible on the transverse image series due to the impact of the speckle noise, shadowing by other seeds, and calcification, bleeding and tissue heterogeneity of the gland.
  • the apex and base of the prostate can be more easily determined on longitudinal images from the rotational scan.
  • Ultrasound transducers that can electronically switch from longitudinal to transverse imaging (by pressing a button) are now commercially available and widely used in prostate imaging.
  • the basic equipment investment for OCU is already in place for most clinicians; in addition, both scan modes are routinely used by practitioners to visually assess needle placement. Adding orthogonal compounding therefore does not introduce complexity to the intervention, but may result in greatly improved imaging.
  • the present invention, the dual-scan orthogonal compound ultrasound, is a novel imaging modality aimed at overcoming those drawbacks of conventional TRUS imaging and becoming the next-generation ultrasound imaging method for the treatment (brachytherapy/thermal/cryo-therapies), diagnosis (biopsy), and screening of prostate cancer.
  • the present invention upgrades the conventional TRUS imaging from 2D to 3D, and compounds two orthogonal views of the prostate by dual-scan imaging so that a clearer visualization of the prostate anatomy and tumors can be obtained.
  • 3D adaptive compounding of transverse and longitudinal scan sets could improve both signal-to-noise and contrast-to-noise ratios and generate phantom images with clearly visible seeds.
  • the present invention also includes a neural network solution for detecting tumors directly from the 3D orthogonal compound ultrasound images.
  • ATL Ultrasound Company now offers a commercial ultrasound scanner with real-time 2D compound imaging using electronic beam steering (“SonoCT™”). Although they have been issued a series of patents related to ultrasonic diagnostic imaging systems with spatial compounding (U.S. Pat. Nos. 6,117,081; 6,126,598; 6,135,956; 6,210,328), there are two main differences from the present invention. The first is that their compounding target is a two-dimensional image, while the compounding target of the present invention is a three-dimensional image volume. The second is that they use a series of partially overlapping component image frames to compound, while the present invention uses orthogonal image volumes, which are fully independent, as the compounding components. Since 3D compounding can combine more independent views than 2D compounding and offers the possibility of out-of-plane refraction correction, the orthogonal 3D compound can realize more of the benefits of spatial compounding.
  • texture features can be used to identify radioactive seeds in TRUS images, such as first-order statistical parameters (mean, maximum, minimum, summation, etc.), second-order statistical parameters (standard deviation, covariance, correlation coefficient, etc.), higher-order statistical parameters (skewness, kurtosis, etc.), frequency domain parameters (high-pass filtering, low-pass filtering, band-pass filtering, etc.), and geometry parameters (area, Euler number, etc.), as sketched below.
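As an illustration of those feature families, the following sketch computes a feature vector for a candidate ROI. It is a hedged example, not the patent's implementation: the function name, the FFT band split, and the thresholded-area stand-in for the geometry features (the Euler number is omitted) are all assumptions.

```python
import numpy as np

def texture_features(roi, thresh=128):
    """Illustrative feature vector for a candidate seed ROI (names assumed)."""
    x = roi.astype(float).ravel()
    mu, sd = x.mean(), x.std()
    z = (x - mu) / (sd + 1e-9)
    skew, kurt = (z ** 3).mean(), (z ** 4).mean() - 3.0   # higher-order stats
    # frequency-domain energies: split the 2D spectrum into low/high bands
    f = np.abs(np.fft.fftshift(np.fft.fft2(roi)))
    cy, cx = f.shape[0] // 2, f.shape[1] // 2
    yy, xx = np.indices(f.shape)
    r = np.hypot(yy - cy, xx - cx)
    cutoff = min(cy, cx) / 2.0
    low, high = f[r <= cutoff].sum(), f[r > cutoff].sum()
    area = int((roi >= thresh).sum())   # crude stand-in for geometry features
    return np.array([mu, x.max(), x.min(), x.sum(), sd, skew, kurt,
                     low, high, area])
```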
  • Logit and Probit models, discriminant analysis, and neural network classification methods are applied to two test groups.
  • One group represents the seeds which are identified manually from the TRUS images; the other group represents the non-seeds group including typical background noises, calcifications, bleeding, air gaps, and shadows.
  • the selection of the two groups is confirmed by the day 0 CT (i.e., CT images taken on the same day as the actual implant): the positions of the radioactive seeds in the CT images are identified automatically or manually, and affine-transformed to match the TRUS image series. Some clearly visible seeds in the TRUS image series can be visually identified and used as markers to correct the registration.
  • the seed positions identified from CT are affine-transformed until they achieve a best match with those identified from TRUS.
  • FIG. 1 shows a schematic diagram of a system for carrying out any of the preferred embodiments of the present invention
  • FIGS. 2A-2C show a flowchart of a process according to a first preferred embodiment
  • FIG. 3 shows a user interface used in the first preferred embodiment
  • FIG. 4 shows the user interface of FIG. 3 after the calculation of a needle offset and also identifies certain process steps of FIG. 2A with certain components of the user interface;
  • FIG. 5 shows a flow chart of an image processing technique used to identify seeds in the ultrasound images
  • FIGS. 6A and 6B show an image with a desired grayscale distribution and a histogram of the desired grayscale distribution, respectively;
  • FIGS. 7A and 7B show an image with a typical grayscale distribution and a histogram of the typical grayscale distribution, respectively;
  • FIGS. 8A and 8B show the image of FIG. 7A after preprocessing and a histogram of the resulting grayscale distribution, respectively;
  • FIGS. 9A and 9B show a sequence of images taken in a column and an identification of those images having hyperechoic spots, respectively;
  • FIG. 10 shows a plot of a threshold used to locate the hyperechoic spots
  • FIGS. 11A and 11B show ideal and typical plots, respectively, of brightness along a needle path
  • FIGS. 12A-12C show three types of peaks which may occur in image data
  • FIGS. 13A-13D show the locations of seeds in image data
  • FIG. 14A shows a known technique for 2D spatial compounding
  • FIG. 14B shows a schematic of 3D orthogonal compounding according to another preferred embodiment of the present invention.
  • FIG. 15 is a diagram illustrating an imaging system with a rectal shell used in the preferred embodiment of FIG. 14B;
  • FIGS. 16A-16D show four steps in the process of 3D image acquisition and reconstruction according to the preferred embodiment of FIG. 14B;
  • FIG. 17 is a schematic diagram illustrating the geometry used to describe the reconstruction algorithm used in FIGS. 16A-16D;
  • FIG. 18 is a flow chart of the algorithm of 3D reconstruction of longitudinal images
  • FIG. 19 is a schematic diagram illustrating the construction of the 2D joint histogram
  • FIG. 20 is a flow chart of the registration algorithm
  • FIGS. 21A-21C are three images showing one of the sagittal images from the linear scan, from the rotational scan, and the compound image, respectively;
  • FIGS. 22A and 22B are illustrations of the intelligent compounding algorithm using global weighting and regional weighting, respectively;
  • FIG. 23 is a schematic diagram of the neural network architecture used in the normalized statistical tumor probability model (NSTPM);
  • FIG. 24 is a flow chart of the use of NSTPM during biopsy under the guidance of OCU imaging
  • FIGS. 25A-25C show seed templates used to identify seeds in TRUS images according to another preferred embodiment of the present invention.
  • FIG. 26 shows a flow chart of a process for finding optimized weight coefficients for identification of the seeds in the TRUS images
  • FIGS. 27A-27D show scans of the prostate in the manual adjustment of a needle pattern
  • FIG. 28A shows an original ultrasound image used for a global search in another preferred embodiment of the present invention
  • FIG. 28B shows a grayscale transformed image based on the image of FIG. 28A.
  • FIG. 28C shows a graph of the measurement of the gray-scale summation of each point in the image of FIG. 28B.
  • FIG. 1 shows a system 100 on which any of the preferred embodiments can be implemented.
  • the system 100 includes a computer 102 , which can be the same as the computer used in either of the above-cited Edmundson and Yu references or any other suitable device.
  • the computer uses a display 104 and a user input device or devices such as a keyboard 106 and a mouse 108 .
  • Other input devices can be used; for example, the mouse 108 can be replaced by a light pen for use with the display 104 .
  • the computer also receives input from an ultrasound device 110 and a fluoroscopic x-ray device 112 by image digitization.
  • the system also includes components for administering the brachytherapy to the patient.
  • Those components include needles 114 having radioactive seeds 116 (or other radioactive objects, or, even more generally, other therapeutic agents) spaced therealong in accordance with a treatment plan.
  • a template 118 having a grid of holes 120 is used to position the needles 114 for insertion into the patient's prostate.
  • the specifics of the needles 114 , the seeds 116 and the template 118 are known from the prior art cited above.
  • the needles 114 can be replaced by hollow needles or catheters in accordance with the treatment plan to be used.
  • a treatment plan is developed.
  • Such a treatment plan can be the one developed in the above-cited Yu reference and can be developed either on the computer 102 or on a different device.
  • the treatment plan is made available to an intraoperative tracking interface implemented on the computer 102 . If the treatment plan is not developed on the computer 102 , an appropriate communication medium can be provided to supply the treatment plan to the computer 102 .
  • the intraoperative tracking interface is displayed to the user on the display 104 .
  • the intraoperative tracking interface 300 includes the following components.
  • An electronic worksheet 302 shows needle and seed coordinates, based on the grid of holes 120 in the template 118 , and identifies needle locations with dots 304 .
  • a live ultrasound image window 306 shows a real-time image of a section of the prostate obtained from the ultrasound device 110 and allows a real-time view of needle placement in the prostate. From the placement of the seeds, the dosimetry is calculated, and a series of dosimetry panels 308 are shown, each showing the dosimetry in a respective slice of the prostate from the base to the apex.
  • the dosimetry in the panels 308 is shown by isodose lines 310 .
  • the electronic worksheet 302 further includes a spreadsheet 312 in which each row indicates the information of one of the needles.
  • the spreadsheet 312 includes a column 314 indicating a needle by a number, a column 316 identifying the hole 120 in the template 118 into which that needle is inserted by its coordinates (letter and number), a column 318 indicating an offset, a column 320 indicating the number of seeds on the needle, a column 322 indicating a ⁇ x offset of the needle from its planned position, a column 324 indicating a ⁇ y offset of the needle from its planned position, a column 326 indicating the number of currently selected seeds whose offsets have been calculated and a column 328 indicating a total number of seeds whose offsets have been calculated.
  • a needle position 304 which the operator has selected is shown on the interface 300 as flashing, as is the corresponding row 330 in the spreadsheet 312 .
  • the live ultrasound image 306 of the interface 300 displays a bright (hyperechoic) spot 332 in step 208 .
  • the operator manually identifies the spot 332 , e.g., by clicking on its center with the mouse 108 .
  • the difference in the x-y plane between the planned location and the actual location of the needle or catheter is calculated to give errors Δx and Δy, which are shown both on the grid 302 and on the highlighted row 330 of the spreadsheet.
  • the positional errors in the x-y plane of each seed, Δx′ and Δy′, are calculated in step 214 based on straight-line interpolation at the planned z location of the seed.
  • the straight line used in the interpolation is constructed by joining two known points: (a) the actual needle location shown on ultrasound at the known z plane and (b) the template coordinate outside the patient body through which the needle is inserted under precision template guidance.
  • at that point, Δx and Δy are assumed to equal zero.
  • the dose is then recalculated in step 216 by moving the seeds along the activated needle or catheter in the x and y directions by those amounts Δx′ and Δy′, which may be the same or different for every seed.
  • FIG. 4 shows the updated interface 300 and also identifies some of the above-mentioned method steps in association with the corresponding elements of the interface 300 .
  • the operator is permitted to change the number of seeds deposited by the needle or catheter in question in step 220 .
  • the operator is required to enter the new seed locations along the needle or catheter, which overrides the original treatment plan in step 222 . Seed placement errors in such a case are tracked identically to the procedure described above.
  • in step 224, a small 3D column of ultrasound images is acquired along the straight line constructed in step 214. That column can be perpendicular to the x-y plane or may be at an arbitrary angle from the x and/or the y planes.
  • the exact number of seeds as deposited is identified in step 226 , using image processing and recognition algorithms to be described below, in the 3D column of ultrasound images.
  • Each seed identified in the ultrasound images is assigned a confidence level in step 228 , which indicates the likelihood or uncertainty of seed localization.
  • the size of the 3D column is initially set small. If it is determined in step 230 that the total number of seeds found in step 226 is not equal to the number of seeds deposited by the given needle or catheter, the width of the column is adaptively adjusted in step 232 ; for instance, the width is increased to find additional seeds.
  • Δz′ is quantified for each seed, and at the same time, Δx′ and Δy′ are further corrected according to step 214. If it is determined in step 234 that the confidence level of a given seed's localization exceeds a threshold value (set by the operator), the dosimetry is re-calculated yet again in step 236 using the updated seed location and displayed in the same isodose panels. The isodose calculated is assigned a confidence level in step 238, which is a numerical composite of the individual confidence levels of the seeds and the dosimetric impact of positional uncertainties at each seed location. For example, in a high dose region, positional uncertainty has low impact.
  • fluoroscopic x-ray images may be taken in step 240 in the anterior-posterior direction and at up to ±45 degrees on either side of the anterior-posterior direction.
  • the seed coordinates as determined above are projected in the same orientations in step 242 .
  • a best match to the x-ray seed projections is made in step 244 based on multiple point matching using those seed identifications with the highest confidence levels.
  • the seed locations with low confidence levels are adjusted in step 246 based on the x-ray locations. As a result, the confidence levels of those latter seeds are increased by an amount reflective of the best match quality.
  • the dosimetry is recalculated, and the confidence level of the dosimetry is updated using the updated confidence levels of the seeds.
  • in step 502, which is a preprocessing step, the image brightness and contrast are adjusted to make the hyperechoic spots more distinct.
  • in step 504, the seed pathway is tracked to further correct the offsets Δx′ and Δy′ of the implanted seeds.
  • in step 506, the seeds are identified to correct Δz′ for each seed along the tracking pathway and to fine-tune Δx′ and Δy′ for each seed.
  • Step 502 involves executing a grayscale histogram transformation to each image in the ultrasound series from apex to base and is thus a pre-processing step.
  • the purpose of step 502 is to adjust the brightness and contrast of the images so that the hyper-echoic spots will be more distinct in the transformed images.
  • an image suitable for seed recognition processing has a grayscale histogram similar to that shown in FIGS. 6A and 6B, whereas in most cases, the images as taken have grayscale histograms similar to that shown in FIGS. 7A and 7B.
  • it is preferred that the background be very dark while the hyperechoic spots are very distinct.
  • in the desired distribution, 50% of the pixels have grayscale levels below 30, representing the background and dark tissues; 90% of the pixels have grayscale levels below 60, with grayscale levels between 30 and 60 most likely representing the brighter tissues of the gland; and 95% of the pixels have grayscale levels below 80, with levels between 60 and 80 most likely representing some much brighter tissues and some weaker air gaps.
  • the pixels with the highest grayscale levels are the hyper-echoic spots of seeds and some stronger air gaps.
  • the images are assumed to have an eight-bit grayscale depth, namely, with grayscale values from zero to 255 inclusive.
  • other grayscale depths can be used instead.
  • in the typical image of FIGS. 7A and 7B, the 50%, 90% and 95% grayscale levels are higher than the preferred ones set forth above; they are 60, 110 and 135, respectively. A transformation that maps the measured levels to the preferred ones is sketched below.
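The following is a minimal sketch of such a histogram transformation, assuming a piecewise-linear mapping of the image's own 50th/90th/95th percentile levels onto the preferred levels (30, 60, 80). The function name and the use of np.percentile/np.interp are illustrative assumptions, not the patent's stated method.

```python
import numpy as np

def normalize_grayscale(img, targets=(30.0, 60.0, 80.0)):
    """Piecewise-linear grayscale transformation (a sketch of step 502).

    Maps the image's measured 50th/90th/95th percentile levels onto the
    preferred levels so the background darkens and hyperechoic spots
    stand out; assumes an 8-bit grayscale image.
    """
    levels = np.percentile(img, [50, 90, 95])          # measured breakpoints
    xs = np.concatenate(([0.0], levels, [255.0]))      # source levels
    ys = np.concatenate(([0.0], np.asarray(targets), [255.0]))  # targets
    out = np.interp(img.astype(float), xs, ys)         # piecewise-linear map
    return out.astype(np.uint8)
```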
  • when the image of FIGS. 7A and 7B is subjected to such a transformation, the result is as shown in FIGS. 8A and 8B.
  • a comparison of FIGS. 7A and 7B with FIGS. 8A and 8B shows that the hyper-echoic spots in the transformed image of FIGS. 8A and 8B are more distinct than they are in the original image of FIGS. 7A and 7B.
  • it is thus easier for the subsequent algorithms to track and identify the seeds. More importantly, it is possible for the algorithms to use unified parameters to process cases with different brightness and contrast settings. At the same time, the gray-scale transformation does not lose the important information in the ultrasound images.
  • Step 504, automatic tracking of the seeds along the same needle, is used to correct Δz′ (displacement from the planned location) of the implanted seeds.
  • Step 504 involves tracking the pathway of the seeds, not just the seeds themselves. In other words, the air gaps are also included, and step 504 does not discriminate the seeds from the air gaps.
  • Step 504 uses the grayscale information of a region of interest (ROI), such as the maximum value of a hyper-echoic spot, the mean and the standard deviation of the ROI, the contrast defined by the maximum value divided by the mean, first/second order texture parameters, etc.
  • a center and the size of an ROI are preset. That operation can be done manually by the operator, by clicking the mouse on the hyper-echoic spots at any z-position, or automatically, by using the information from the treatment plan and the needle tracking method described above. Thresholding and analysis are then used to determine whether there is a hyper-echoic spot in the ROI. If there is, the ROI center of the next image in the series is switched to the current center position. If not, the previous center is kept.
  • FIG. 9A shows a column of images taken along the pathway corresponding to grid coordinates I2 in the grid 302 of FIG. 3.
  • FIG. 9B shows the same column of images, with boxes identifying the images in which hyperechoic spots have been identified. As shown in FIG. 9B, each hyperechoic spot occupies five consecutive images because of the dimensions of the seed relative to the interval at which the images are taken; an illustrative example of the relevant dimensions will be given below.
  • FIG. 10 shows only the maximum, mean, and contrast measurements because they can be shown in a 2-D plot.
  • FIG. 10 is not drawn to scale, and the parameters are examples only, used to make the illustration more intuitive.
  • an ROI whose grayscale features fall in the shaded area of FIG. 10 is identified as an ROI containing a hyper-echoic spot.
  • the four borders of the shaded area are represented by four lines a, b, c, and d, respectively.
  • the lines a and b indicate that the maximum value of the ROI should be between grayscale levels 75 and 255.
  • the line c indicates that the mean value of the ROI should be greater than 5.
  • the line d indicates that the contrast (the slope of the line in the 2-D coordinate system constructed by the mean and maximum) should be greater than 2.
  • the line d may be replaced by a curve e (the dotted curve in FIG. 10), which delimits the border more accurately. That is because variations of the mean and the contrast may result in different thresholds. Generally speaking, the greater the mean, the smaller the threshold. As a result, curve e is in the form as shown in the figure.
  • the curve e can be implemented as a curve equation or as a look-up table correlating the threshold to the mean. A sketch of the threshold test defined by the lines a-d follows.
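Here is a minimal sketch of that ROI test, using the illustrative thresholds from FIG. 10 (maximum in [75, 255], mean above 5, contrast above 2). The function name and defaults are assumptions, and the fixed contrast bound could be swapped for a mean-dependent lookup table implementing curve e.

```python
import numpy as np

def contains_hyperechoic_spot(roi,
                              max_lo=75, max_hi=255,  # lines a and b
                              mean_min=5.0,           # line c
                              contrast_min=2.0):      # line d (or curve e)
    """Threshold test of FIG. 10, with the illustrative example's values."""
    m = float(np.max(roi))
    mu = float(np.mean(roi))
    contrast = m / (mu + 1e-9)   # maximum divided by mean
    return (max_lo <= m <= max_hi) and (mu > mean_min) and (contrast > contrast_min)
```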
  • Step 506, detecting the real z-position of each seed placed along the needle track, is in fact a task of cutting the seed pathway into several segments by discriminating the spots representing seeds from any spots representing air gaps.
  • the grayscale information cannot be used to achieve that goal because some stronger air gaps have greater measurement values than weak seeds, as will be explained below with reference to FIG. 11B. Therefore, a wave form analysis method is used instead.
  • the second peak 1114 has the waveform of a peak, while each of the air gaps 1122 and 1124 has the waveform of a valley. That distinction between peaks and valleys can be used to discriminate the seeds from the air gaps.
  • a composite measurement value is formed from the ROI features, in the simplest case as the linear combination MV = Σ_i a_i v_i, where v_i represents each feature (such as maximum, contrast, standard deviation, etc.) and a_i represents the coefficient of each feature. The combination of those features is not constrained to the linear composition, which is the simplest one. Simple least-squares statistics will determine the value and the confidence interval for each coefficient.
  • the MV waveform should be smoothed before it can be processed because the raw signal may contain many noise peaks, as shown in FIG. 12A.
  • the false peaks are removed. For example, if two peaks have a distance less than 6 units along the z-axis, they most likely represent the same seed, so one of them will be absorbed by the other, stronger one, as shown in FIG. 12B. If a peak lies between two other higher peaks and has no distinct drop off before and after it, it is most likely noise, as shown in FIG. 12C.
  • the peaks are detected to determine how many peaks there are. If the number is greater than the implanted number N of seeds, only the highest N peaks are taken as the seeds, as explained above with reference to FIG. 11B. If the number is less than N, either seed identification is forced using second-tier peaks (with reduced confidence), or the preset transverse size of the ultrasound column is changed to process a larger needle track that includes the exact number of the implanted seeds.
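A minimal sketch of that waveform analysis follows: smooth the MV signal, find local maxima, absorb peaks closer than 6 z-units into the stronger neighbor, and keep the N highest. The function name, the moving-average smoother, and the window size are assumptions for illustration.

```python
import numpy as np

def find_seed_peaks(mv, n_seeds, min_sep=6, win=5):
    """Locate up to n_seeds peaks in the composite MV signal along z."""
    s = np.convolve(mv, np.ones(win) / win, mode="same")  # smooth noise peaks
    peaks = [i for i in range(1, len(s) - 1)
             if s[i - 1] < s[i] >= s[i + 1]]              # local maxima
    peaks.sort(key=lambda i: s[i], reverse=True)          # strongest first
    kept = []
    for i in peaks:
        # a weaker peak within min_sep of a kept one likely marks the same
        # seed, so it is absorbed by the stronger neighbor (cf. FIG. 12B)
        if all(abs(i - j) >= min_sep for j in kept):
            kept.append(i)
    return sorted(kept[:n_seeds])                         # top-N as seeds
```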
  • FIGS. 13A-13D show sample seed identifications along grid location I2.
  • in FIGS. 13A and 13C, the seeds are identified by black marks M, while in FIGS. 13B and 13D, they are left unmarked.
  • Each seed identified in that manner is assigned a confidence level according to the MV level and the fall-off characteristics of the peak. The greater the MV level and the fall off of the peak, the more likely it is a seed.
  • the locations of the seeds and their confidence values are convolved into subsequent dosimetry calculations, which result in a confidence level for each of the dosimetry parameters arising from the dose-volume histogram, including V100, D100, D95, D90, D80 and D50.
  • fluoroscopy x-ray imaging of the seeds after the implantation will be used to increase the localization confidence of the given seeds found in TRUS images.
  • Ultrasound-based seed identification of high confidence values will be used as “anchors” (fiducial marks) to register the ultrasound and x-ray spaces.
  • the stabilizing needles in the prostate can also be used as positioning marks.
  • the z-coordinates of the low confidence seed localizations will then be corrected using the x-ray projection(s).
  • the confidence values are increased by a variable based on the degree of seed overlap on the x-ray image, the quality of the overall registration, and the quality of the x-ray itself for seed localization.
  • the 3-dimensional coordinates of the TRUS-detected seeds will be projected onto the x-z plane to form a 2-dimensional pattern. That pattern will then be matched with the 2-dimensional x-ray pattern by using the affine transform and optimized to a global minimum of the distance differences. Seed pairs with distance differences less than a threshold, for example, 0.2 cm, will be considered matched pairs with high confidence and used as fiducial marks for registration.
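The pairing step can be sketched as follows, assuming the affine transform has already been applied. The nearest-neighbor matching, the coordinate layout, and the function name are illustrative assumptions; the 0.2 cm threshold is taken from the text.

```python
import numpy as np

def match_to_xray(seeds_3d, xray_2d, tol=0.2):
    """Pair TRUS seeds with x-ray seeds on the x-z projection (a sketch).

    seeds_3d : (n, 3) TRUS seed coordinates (x, y, z), already transformed
    xray_2d  : (m, 2) x-ray seed coordinates in the same (x, z) plane, cm
    Returns (trus_index, xray_index, distance) for pairs closer than tol.
    """
    proj = np.asarray(seeds_3d, float)[:, [0, 2]]   # drop y: (x, z) pattern
    xray = np.asarray(xray_2d, float)
    pairs = []
    for i, p in enumerate(proj):
        d = np.linalg.norm(xray - p, axis=1)        # distances to all x-ray seeds
        j = int(d.argmin())
        if d[j] < tol:                              # high-confidence fiducial pair
            pairs.append((i, j, float(d[j])))
    return pairs
```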
  • FIG. 14A shows a simple illustration of the known 2D spatial compounding. Two (or more) scans of the same plane are accurately registered and then combined together to produce a compounded image.
  • FIG. 14B shows a schematic of 3D orthogonal compounding according to the present embodiment. Stacks of images captured from both the transverse view and the longitudinal view are converted to image volume format in Cartesian coordinates and combined. A scan is a set of images captured in one continuous sequence. The image stack captured by a scan is reconstructed into a volume. A compound volume is made up of dual orthogonal scans (or, more generally, non-parallel scans) with accurate registration.
  • a personal computer installed with a video frame grabber and an electromagnetic, optical or mechanical position and orientation measurement system (e.g., miniBIRD™, Ascension Technology, Inc., Burlington, Vt.) is connected to the ultrasound probe and used to acquire a series of 2D images using both linear and rotational scanning methods.
  • the present inventors have verified that the EM sensor can provide a resolution of 0.5 mm in static position and 0.1° in orientation at 30 cm distance from the electromagnetic field transmitter.
  • the frame grabber can capture real-time images from the composite video output of the ultrasound machine at the rate of 30 frames per second.
  • a system to overcome that problem is shown in FIG. 15.
  • That system 1500 uses a rigid sheath (“rectal shell”) 1502 fitted around the TRUS transducer 1504 to avoid its direct contact with the rectal wall of the patient P so that prostate movement is eliminated during the pull back or rotation of the transducer.
  • the sheath 1502 is made of bio-compatible materials and safe for human tissues. The material should also have low attenuation of ultrasonic energy so that the ultrasound image quality will not decrease significantly. Acoustic coupling is ensured by applying a small amount of gel 1506 between the surface of the transducer 1504 and rectal shell 1502 .
  • the posterior side of the rectal shell 1502 is vented toward the tissue so that pressure does not build up during forward movement.
  • Fiducial markers 1508 visible in both the transverse and longitudinal scans can be embedded in the rectal shell 1502 and used to help register the image volumes.
  • a position sensor 1510 such as an electromagnetic sensing system or an optical or mechanical encoder, is also provided.
  • a bi-plane ultrasound transducer, which can easily switch between transverse imaging and longitudinal imaging, is used for imaging.
  • Transverse view images are obtained while the transducer is pulled back from base to apex at an even spacing (e.g., 0.5 mm).
  • Longitudinal images are obtained while the transducer is rotated slowly by free hand at even angular increments (e.g., 0.5°).
  • Image acquisition is triggered automatically when the EM sensor detects probe movement to each image interval.
  • the actual (rather than evenly-spaced) coordinates and angles of the probe at the instant of image acquisition are registered with the image; therefore, even when the probe movement is very rapid, there is no mis-registration (though there may be potential under-sampling).
  • the measurement rate of the EM sensor is set to 120 Hz.
  • the stream data reading mode and real-time filtering can be used to avoid the delay of the sensor. That technique has been extensively tested by the present inventors for both linear and rotational scans under realistic operative conditions. Careful error analysis indicates that sufficient accuracy and sampling may be achieved if some attention is paid to the EM transmitter positioning and the speed of the scans (no more than 1.5 cm/s for the linear scan and 15°/s for rotation).
  • the acquired 2D images are reconstructed into a 3D volume in Cartesian coordinates by reformatting, so that the reformatted image volume can be registered and compounded with the image volume acquired by the linear scan.
  • FIG. 16A shows a schematic diagram of the acquired images with respect to the transducer. The images are taken while the transducer is rotated and are thus arranged as a fan of a series of 2D images, radiating outward from the transducer and evenly spaced over an arc of a certain angle (typically 100°).
  • FIG. 16B shows the resulting stack of (r, z)_θ images. The data from the stack of images shown in FIG. 16B are rearranged by interpolation to form a stack of (r, θ)_z images.
  • such a stack is shown in FIG. 16C.
  • the data from that stack are reconstructed into a 3D (x, y, z) image such as that shown in FIG. 16D.
  • the reconstruction produces a 3D image data cube in which the relative orientation of the acquired images is preserved.
  • FIG. 17 is a schematic diagram illustrating the geometry used to describe the reconstruction algorithm.
  • the source image grid is in polar coordinates (r, ⁇ ), while the destination image grid is in Cartesian coordinates (x, y).
  • the gray scale value at any point p in the destination image is calculated from those of its neighbors a, b, c, and d in the source image in a manner which will now be described.
  • the polar coordinate (r, θ) = (√(x² + y²), arctan(x/y)) is computed, and the corresponding grayscale value is interpolated from its nearest neighbors in the source image.
  • P_a, P_b, P_c and P_d are the grayscale values of the neighbor points a, b, c and d of p in the source image; δθ is the angular separation between the successive 2D longitudinal images at θ_l and θ_l+1; Δθ is the angle between θ_l and p; δr is the radial separation between r_l and r_l+1; and Δr is the distance between r_l and p, as shown in FIG. 17. The dashed lines represent the polar coordinates in which the points a, b, c and d of the source image are located, and the solid lines represent the Cartesian coordinates in which the point p of the destination image grid is located.
  • in step 1802, a two-dimensional series of longitudinal images in the form of a stack of (r, z)_θ images is collected.
  • in step 1804, the stack of step 1802 is rearranged into a stack of (r, θ)_z images by interpolation.
  • in step 1806, using Equation 1 above, the grayscale value at each grid point (x, y) in the destination image is calculated for each z plane.
  • in step 1808, the 2D series of transformed images is collected in the form of an (x, y)_z stack to form an image volume (3D image) in Cartesian coordinates (x, y, z).
  • the total reconstruction time can be reduced by producing a look-up table that designates which source pixels correspond to a, b, c and d in Equation 1, and the associated values of Δθ/δθ and Δr/δr, for every destination grid point p in the volume.
  • the reconstructed 3D image is then built up by using that table repeatedly for each successive value of z.
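A minimal sketch of that lookup-table reconstruction for one z-plane follows. The grid spacing, angular origin, and function names are assumptions; the fractional offsets play the roles of Δr/δr and Δθ/δθ in Equation 1, and the source image is assumed to have shape (nr, ntheta).

```python
import numpy as np

def polar_lut(nx, ny, nr, ntheta, dr, dtheta, theta0):
    """Precompute the bilinear lookup table for one z-plane (cf. Equation 1)."""
    x = np.tile(np.arange(nx) - nx / 2.0, (ny, 1))         # lateral coordinate
    y = np.tile(np.arange(ny, dtype=float).reshape(-1, 1), (1, nx))
    r = np.hypot(x, y)
    theta = np.arctan2(x, y)            # the patent's arctan(x/y) convention
    ri = np.clip(r / dr, 0, nr - 1.001)                    # fractional r index
    ti = np.clip((theta - theta0) / dtheta, 0, ntheta - 1.001)
    i0, j0 = np.floor(ri).astype(int), np.floor(ti).astype(int)
    return i0, j0, ri - i0, ti - j0     # neighbor indices and weights

def reconstruct_plane(src, lut):
    """Bilinearly resample one (r, theta) source image onto the (x, y) grid."""
    i0, j0, fr, ft = lut
    pa, pb = src[i0, j0], src[i0 + 1, j0]                  # neighbors along r
    pc, pd = src[i0, j0 + 1], src[i0 + 1, j0 + 1]          # and along theta
    return (pa * (1 - fr) * (1 - ft) + pb * fr * (1 - ft)
            + pc * (1 - fr) * ft + pd * fr * ft)
```

Building the table once and reusing it for each successive z slice is what makes the reconstruction fast, as the text notes.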
  • the two image volumes to be registered are obtained from different scanning methods. Since position readings of each slice are available from the position sensor 1510 of FIG. 15, a preliminary registration based on geometry is used and significantly reduces the time of the subsequent registration.
  • the MMI method uses the concept of mutual information (MI), a basic concept from information theory for measuring the statistical dependence between two random variables; here it measures the statistical dependence between the image intensities of corresponding voxels in two images, which is assumed to be maximal if the images are geometrically aligned. Because no assumptions are made regarding the nature of that dependence and no limiting constraints are imposed on the image content, MMI is a very general and powerful criterion, allowing robust, fully automated affine registration of images with different contrast and resolution without the need for segmentation or other preprocessing.
  • the mutual information can be written as I(x, y) = H(x) + H(y) − H(x, y), where H(x) and H(y) are the entropies of data sets x and y, respectively, and H(x, y) is the joint entropy of the two data sets.
  • the entropy of a data set is defined as its average information content, while the joint entropy is the average information of two data sets.
  • equivalently, in terms of probability densities, I(x, y) = Σ_{x,y} p(x, y) log [ p(x, y) / ( p(x) p(y) ) ].
  • the gray-scale values of a pair of corresponding voxels in two images that are to be registered are considered as two random data sets x and y.
  • the probability densities can be simply approximated by the normalized gray scale histograms, and the joint probability density can be approximated by the normalized 2D joint histogram, with the two base axes corresponding to gray scale levels in the two data sets.
  • the 2D joint histogram is constructed by raster scanning through all voxels in one image volume and incrementing bin counts corresponding to the gray scale values from the geometrically mapped voxel pair in the other image volume.
  • the joint histogram is normalized by dividing by the total number of voxels in the image volume.
  • the histogram value p(g1, g2) is equal to the normalized count of pixels with value g1 in the first image volume that match a value g2 at the same location in the second image volume under the current mapping transformation.
  • the procedure of constructing the 2D joint histogram is illustrated in FIG. 19. For ultrasound images that are 8-bit grayscale (256 grayscale levels), the number of elements in the joint histogram is 256 × 256.
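A compact sketch of the joint-histogram MI computation follows, assuming the two volumes are already resampled onto the same grid. The function name and the use of numpy's histogram2d in place of an explicit raster scan are assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=256):
    """MI of two co-registered 8-bit image volumes via the 2D joint histogram."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(),
                             bins=bins, range=[[0, 256], [0, 256]])
    p = h / h.sum()                        # normalized joint density p(g1, g2)
    p1, p2 = p.sum(axis=1), p.sum(axis=0)  # marginal densities p(g1), p(g2)
    nz = p > 0                             # avoid log(0) terms
    return float((p[nz] * np.log(p[nz] / (p1[:, None] * p2[None, :])[nz])).sum())
```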
  • the two image volumes x and y are related through a geometric transformation T_α defined by the registration parameter α.
  • the MMI registration criterion states that the images are geometrically aligned by the transformation T_α for which the mutual information I(x, y) is maximal.
  • the registration optimization works by iteratively maximizing the mutual information I (x, y).
  • the only user interaction in the process is the definition of an approximate initial alignment of the two image volumes by placing three control points in each set.
  • the control points define a geometric transformation that is applied to match one image volume to the other.
  • the mutual information of the matched pair is calculated, and the optimization algorithms are used to modify the position of the control points until I (x, y) is maximized.
  • the Nelder-Mead downhill simplex algorithm can be used. That method is initialized with N+1 points, defining a non-degenerate simplex in N-dimensional parameter space. The simplex is then deformed iteratively by reflection, expansion or contraction in order to move its vertices towards the minimum of the objective function. Convergence is declared when the fractional difference between the lowest and the highest function values evaluated at the vertices of the simplex is smaller than a pre-defined threshold.
  • the optimization works in three stages, using increasing numbers of control points and thus increasingly complex geometrical transformations.
  • the first stage uses three control points (rotation and translation), the second stage four (full affine transform), and the third a number greater than four (thin plate spline warping), which is a trade-off between registration accuracy and computation time.
  • Each stage stops and passes its final transformation onto the next stage when the mutual information increment is less than a predefined threshold.
  • each stage uses a compressed version of the original data set and different parameters to confine variation of the control point positions and to determine convergence of each optimization cycle.
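For illustration, a stage-1 style (rotation plus translation) optimization could look like the sketch below, which reuses the mutual_information sketch above and minimizes its negative with scipy's Nelder-Mead. The parameterization (direct rotation about the array origin rather than moving control points) and all names are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.ndimage import affine_transform

def register_rigid_mmi(fixed, moving, x0=(0.0, 0.0, 0.0)):
    """Rigid 2D registration by Nelder-Mead on negative mutual information."""
    def cost(x):
        ang, ty, tx = x
        c, s = np.cos(ang), np.sin(ang)
        warped = affine_transform(moving, np.array([[c, -s], [s, c]]),
                                  offset=(ty, tx), order=1)  # linear interp
        return -mutual_information(fixed, warped)            # maximize MI
    res = minimize(cost, np.asarray(x0, float), method="Nelder-Mead",
                   options={"xatol": 1e-3, "fatol": 1e-4})
    return res.x   # (angle, row shift, column shift) at the MI maximum
```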
  • FIG. 20 shows a flow chart of the registration algorithm.
  • the algorithm includes two loops, the outer one of which controls the transition from stage to stage.
  • in step 2002, the control point placement is initiated; the user selects the approximate control points in both volumes.
  • in step 2004, the geometric mapping is computed. The first time step 2004 is performed, it is performed in stage 1, with rotation and translation only; the transition to stage 2 (full affine transformation, including scale) and stage 3 (thin plate spline warping) will be explained below.
  • in step 2006, interpolated mapped volumes are computed.
  • in step 2008, a 2D joint histogram is constructed.
  • in step 2010, the mutual information is computed.
  • in step 2012, it is determined whether I is maximized. If not, then in step 2014, the control points are moved according to the optimizer, and the algorithm loops back to step 2004. If I is maximized, then in step 2016, it is determined whether the last of stages 1-3 has been performed. If not, then in step 2018, the control points are created for the next stage, and the algorithm loops back to step 2004, which is carried out for the next stage. If the last stage has been performed, then in step 2020, the algorithm stops. After registering and transforming the reformatted rotational scan onto the linear scan, combining the two volumes into one creates a compounded image volume. The most common techniques for combining overlapping data are the maximum and the mean.
  • an intelligent compounding method is used.
  • the information content of each slice of one image volume is computed and compared with the information content of the corresponding slice of the other image volume.
  • the contributions of the single scan images to the compounded image are determined by the information content of each.
  • the intensity value of each pixel in the compounded image is a weighted mean of the intensity of corresponding pixels in both single scan images.
  • That method is different from the weighted spatial compounding proposed by Leotta and Martin (2000) in that their weighting is based on incidence angle of the ultrasound beam, which is an exterior determining factor, whereas the weighting of the present embodiment is based on the information content computed directly from the images, which is an interior determining factor.
  • the weighting can be global, as shown in FIG. 22A, or regional, as shown in FIG. 22B.
  • the information content can be estimated by entropy H, as described above.
  • weighted compounding can also be performed at the sub-image (or pixel) level, e.g., with a moving region-of-interest (ROI) that scans the image plane.
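A sketch of the global entropy-weighted variant follows. Per the text, the weights come from the information content of each slice, estimated here by Shannon entropy; regional weighting would repeat the same computation over a moving ROI. Function names are assumptions.

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon entropy of an 8-bit image, used as the information measure."""
    p, _ = np.histogram(img, bins=bins, range=(0, 256), density=True)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def compound_slices(lin_slice, rot_slice):
    """Global entropy-weighted mean of corresponding slices (a sketch)."""
    w1, w2 = entropy(lin_slice), entropy(rot_slice)   # per-slice weights
    return (w1 * lin_slice.astype(float)
            + w2 * rot_slice.astype(float)) / (w1 + w2)
```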
  • FIGS. 21A and 21B show, respectively, one of the sagittal images from the linear scan, and one of the sagittal images from the rotational scan. They are registered in the same sagittal plane by observing the whole series of images from both volumes and matching certain distinct marks, e.g. the urethra, the edges of the phantom, the bright line on the top of the images formed by the echoes from the margin of the phantom container, and some clearly visible seeds.
  • a simple compounding by averaging the images of FIGS. 21A and 21B forms the compounded image shown in FIG. 21C. All four seeds along that needle track are clearly visible on the compounded image of FIG. 21C, but not on the image of FIG. 21A or 21B individually.
  • the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) are used to measure the image statistics of the single scan images and the compounded image. SNR and CNR are defined as SNR = μ_o/σ_b and CNR = (μ_o − μ_b)/σ_b,
  • where μ_o and μ_b are the mean of object and background intensity, respectively, and σ_b is the standard deviation (STD) of the background.
  • the table below lists the image statistics of seeds, phantom, and background, respectively, and computes the CNR of seeds to phantom, CNR_s−p, and of phantom to background, CNR_p−b, respectively.
  • the table shows that the CNR_s−p, CNR_p−b, and SNR of the compounded image are 1.39, 1.36, and 1.41 times the average values of the two single scan images, respectively.
  • For compounding of uncorrelated views, SNR_N ≈ √N × SNR_0,
  • where SNR_N and SNR_0 are the SNR of the compounded image averaged from N views and of a single uncorrelated view, respectively. Since 1.41 ≈ √2, that indicates that the image volumes from the linear scan and the rotational scan are nearly uncorrelated, which is just what is expected for increasing the SNR of the compounded image and thus reducing the speckle noise.
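The statistics above can be computed as in the following sketch, which assumes manually delineated object and background regions and uses the definitions given above.

```python
import numpy as np

def snr(obj, bg):
    # SNR = mean object intensity / standard deviation of background.
    return obj.mean() / bg.std()

def cnr(region_a, region_b):
    # CNR between two regions, e.g. seeds vs. phantom or phantom vs. background.
    return abs(region_a.mean() - region_b.mean()) / region_b.std()

# For N = 2 nearly uncorrelated views, SNR_N ~= sqrt(2) * SNR_0 ~= 1.41 * SNR_0,
# matching the measured ratio of 1.41 reported above.
```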
  • a normalized statistical tumor probability model, aimed at aiding effective biopsy based on OCU imaging for tumor detection, can be used in the present embodiment.
  • the model is based on a large number (e.g., 100) of individual prostate specimens with localized cancers. It not only provides statistical information on a 3D spatial distribution of prostate cancers, but also shows the probability of cancer detection at each location in the prostate.
  • the probability model can be dynamically mapped into OCU images during in vivo TRUS-guided needle biopsy by dynamically registering the OCU images with the 3D model and overlaying them together. Since the new biopsy technique is based on statistical analysis of a quantitative database of digitized prostate specimens, it is expected to significantly improve the accuracy of prostate cancer detection.
  • a prostate gland is divided into different zones that are accessible by the urologists under the guidance of OCU imaging (rather than dealing with individual pixels, because pixel-based statistical analysis is easily disturbed by noise and its results cannot be used directly by urologists or by computer software).
  • the probability of cancer inside each of those zones is calculated statistically from the cases.
  • the result is a bounding box of a prostate gland with a certain number of zones each with information on spatial tumor probability in it.
  • a neural network is used to realize the adaptive sub-classification of the NSTPM, because the NSTPM is affected by other parameters such as the patient's age, PSA level, gland volume, etc.
  • neural network computational models contain layers of simple interconnected computing nodes that operate as nonlinear summing devices. Those nodes are organized into layers and interconnected by weighted connection lines. From each layer, every neuron is connected to all neurons in the next layer. The weight values on the connection lines are adjusted when data is presented to the network during a training process. Successful training can result in an artificial neural network that performs a classification.
  • the most popular neural network architecture is the multi-layer perceptron (MLP) with three layers of neurons. An input layer receives the data to be analyzed. The second layer, consisting of hidden units, organizes weighted combinations of the inputs, or "features," that aid in discriminating the different categories in a classification. The output layer then combines those features appropriately to perform the classification.
  • the present embodiment uses a neural network to construct a normalized statistical tumor probability model (NSTPM), which is aimed at aiding effective biopsy rather than at detecting tumors directly from the prostate by neural networks; that goal is more realistic and applicable. It is additionally aided by the improved imaging of the OCU modality.
  • FIG. 23 shows the neural network used for sub-classification of the NSTPM.
  • the neural network 2300 includes an input layer 2302, at least one hidden layer 2304, and an output layer 2306.
  • the neurons 2308 in the input layer 2302 receive the raw numbers from the related parameters, such as the patient's age, PSA level, gland volume, etc. Those numbers are weighted and combined by the hidden neurons 2310 in the hidden layer or layers 2304 of the network 2300, which extract features useful for classification. Then, values calculated by the hidden neurons 2310 are weighted and combined by the output neurons 2312 in the output layer 2306, to decide on the classification.
  • the present example uses four output neurons 2312, which represent different properties of the anatomical zones in pathological analysis. Those neurons 2312 represent benign, malignant, malignant Gleason 1-3, and malignant Gleason 4-5. Based on that NN model, corresponding network structures are obtained for various anatomical zones, and the final NN structure precisely reflects the relationship between the parameters and the tumor distribution probability in different zones.
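As a concrete illustration of the three-layer structure, a forward pass can be sketched as follows. The weight matrices, the tanh hidden activation, and the softmax output are illustrative assumptions rather than the patent's specific choices.

```python
import numpy as np

def mlp_forward(x, w_hid, b_hid, w_out, b_out):
    # Hidden neurons: weighted combinations of the inputs ("features").
    h = np.tanh(x @ w_hid + b_hid)
    # Output neurons: combine the hidden features into four class scores
    # (benign, malignant, malignant Gleason 1-3, malignant Gleason 4-5).
    z = h @ w_out + b_out
    e = np.exp(z - z.max())         # softmax yields one probability per class
    return e / e.sum()

# Illustrative input vector: age, PSA level (ng/mL), gland volume (cc).
x = np.array([65.0, 4.2, 38.0])
```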
  • the network is trained by supervised learning, in which the correct classification is presented to the output layer during training, and the weights are adjusted so as to make the network's output more closely match the correct classification at each point. After training, the network outputs the classification when presented with an input vector at its input layer.
  • the fundamentals of neural network training are known in the art and will therefore not be described in detail here. It is important to avoid over-training on the dataset. To accomplish that goal, we divide the original dataset into three parts. The first part is the training set, which is used to train the NN. The second part is the testing set, or verification set, which is used to determine when to stop training the network.
  • the network's error when applied to the testing set decreases during training until the point at which over-training begins, and at that point, the error on the testing set starts to increase.
  • training is stopped when the error on the testing set is minimized.
  • the third set is the validation set, which is used to measure the performance on the trained network.
  • the validation set is completely independent of both the training and testing sets and thus gives a suitable benchmark of the NN performance. If the validation set were used to train the network or to determine the stopping point for training, then the performance of the network on the validation set would no longer be a reliable predictor of performance on new, independent data.
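The stopping rule can be sketched as below; `net` is a hypothetical trainable-network interface introduced only for illustration, and the patience parameter is an assumption.

```python
def train_with_early_stopping(net, training_set, testing_set,
                              max_epochs=1000, patience=10):
    # Train on the training set; stop when the error on the testing
    # (verification) set stops improving, i.e. when over-training begins.
    best_err = float("inf")
    best_weights = net.get_weights()        # hypothetical accessor
    stalled = 0
    for _ in range(max_epochs):
        net.train_one_epoch(training_set)   # hypothetical method
        err = net.error(testing_set)        # hypothetical method
        if err < best_err:
            best_err, best_weights, stalled = err, net.get_weights(), 0
        else:
            stalled += 1
            if stalled >= patience:
                break
    net.set_weights(best_weights)           # restore the minimum-error state
    return net
```

The validation set is then applied once, after training, to benchmark the restored network.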
  • Performance of the trained network is then measured by a root-mean-square-error (RMS error) calculation, accuracy, sensitivity and selectivity calculations, receiver operating characteristics (ROC) curve, and prediction/confidence intervals.
  • the RMS error directly measures the difference between the desired output of the neural network and its actual output.
  • the prediction and/or confidence interval reflect the amount of variation inherent in the training and calculation methodologies of the neural network.
  • the sensitivity is the fraction of data items that the network classifies as containing a feature of interest taken out of all the items that have that feature of interest.
  • the selectivity is the fraction of data items that the network classifies as not containing a feature of interest, taken from the entire set of data that does not in fact contain that feature. It is important to choose an optimized tradeoff between sensitivity and selectivity.
  • a higher sensitivity causes a lower selectivity, and vice versa.
  • the ROC (receiver operating characteristics) curve can provide that tradeoff.
  • when a neural network is applied to a classification problem, an entire ROC curve can be calculated based on its output when the data set is presented to the network.
  • Each point along that ROC has a pair of values of sensitivity and selectivity. The best point along the curve can be selected, taking into account the clinical consequences of high or low sensitivity versus selectivity.
  • a high sensitivity is selected, and a lower selectivity is accepted.
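An entire ROC curve can be traced by sweeping a decision threshold over the network outputs. A minimal sketch follows, assuming binary labels (1 = feature present) and one score per data item.

```python
import numpy as np

def roc_points(scores, labels):
    # One (sensitivity, selectivity) pair per threshold value.
    points = []
    for t in np.unique(scores):
        pred = scores >= t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        sensitivity = tp / (tp + fn)   # fraction of true features found
        selectivity = tn / (tn + fp)   # fraction of non-features rejected
        points.append((t, sensitivity, selectivity))
    return points
```

The clinically preferred operating point, e.g. high sensitivity at some cost in selectivity, is then chosen from the returned list.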
  • the 3D NSTPM is built and saved in the computer, in which the successive slices of the prostate are put together into regions, such as prostate, suspicious areas, high Gleason areas, and low Gleason areas, and landmarks, such as the urethra.
  • the NSTPM is used during biopsy as shown in the flow chart of FIG. 24.
  • a dual-scan of the prostate is performed, and orthogonal compound image slices of the prostate are obtained.
  • segmentation of prostate boundary, rectum and urethra is performed on that OCU image series.
  • An advanced 3D overlay display ability is employed in the system.
  • the series of boundaries of the prostate segmented from the OCU images are reconstructed into a 3D surface model in step 2408 .
  • the 3D tumor probability model is registered with the 3D surface model in step 2410 .
  • each zone of the 3D tumor probability model is color-coded according to the value of its probability in step 2412 .
  • the zone with highest probability is coded as red, the second highest zone as pink and so on.
  • the zone with lowest probability can be coded as blue.
  • the urologists can easily navigate the whole prostate in step 2412 and pick up the best candidate areas for biopsy by dynamically moving and/or turning the virtual US beam and making it intersect with the candidates in 3D space.
  • any other suitable coding could be used.
  • the biopsy is optimized to sample the high Gleason area as much as possible.
  • Logit and Probit models are known in various fields, such as sociology, but have not yet been applied in the same manner as in the present embodiment.
  • One group represents the seeds, which are identified manually from the TRUS images;
  • the other group represents the non-seeds group, including typical background noise, calcifications, bleeding, air gaps, and shadows.
  • the selection of the two groups is confirmed by the day 0 CT (i.e., CT images taken within the same day of the actual implant): the positions of the radioactive seeds in the CT images are identified automatically or manually, and affine-transformed to match the TRUS image series. Some clearly visible seeds in the TRUS image series can be visually identified and used as markers to correct the registration.
  • the seed positions identified from CT are affine-transformed until they achieve a best match with those identified from TRUS.
  • a seed template is designed to correlate with the sub-image area in the TRUS image series, and the correlation coefficient is used as a numerical feature.
  • the seed template is a bright central part surrounded by a dark background. The whole area is 2.5 × 2.5 mm, while the central part is a seed-echo-shaped area, as shown in FIGS. 25A-25C. Since the seeds in different parts of the prostate may appear as bright echoes with different orientations, different seed templates may be created for different prostate positions.
  • FIGS. 25A-25C show only three different seed templates corresponding to the orientations of seeds in the left, middle, and right parts of the prostate, respectively. Those seed templates are used to compute correlation coefficients with sub-images. More templates can be used for more orientations; therefore, FIGS. 25A-25C are illustrative rather than limiting.
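The correlation-coefficient feature can be sketched as below; the sub-image and the template are assumed to have been cropped to the same pixel size (the 2.5 × 2.5 mm area at the image resolution in use).

```python
import numpy as np

def template_correlation(sub_image, template):
    # Normalized cross-correlation coefficient between a sub-image and a
    # seed template (bright center on dark background); a value near 1
    # indicates a strong seed-like echo.
    a = sub_image - sub_image.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```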
  • the process uses a neural network, whose structure and training method have already been disclosed with reference to FIG. 23.
  • In step 2608, those features are supplied to a neural network to find the optimized weight coefficients of those features.
  • the outputs of the neural network include true positives, false negatives, true negatives, and false positives in steps 2610, 2612, 2614, and 2616, respectively. From those outputs, the sensitivity, specificity, and efficiency of the weight coefficients are determined in steps 2618, 2620, and 2622, respectively. Those quantities are used in a feedback loop to optimize the weight coefficients; the efficiency is used directly, while the sensitivity and specificity are used through an ROC curve in step 2624.
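From the four output counts, the three feedback quantities can be computed as in the sketch below; interpreting "efficiency" as the overall fraction of correct classifications is an assumption.

```python
def feedback_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)                   # step 2618: true positive rate
    specificity = tn / (tn + fp)                   # step 2620: true negative rate
    efficiency = (tp + tn) / (tp + fn + tn + fp)   # step 2622 (assumed: accuracy)
    return sensitivity, specificity, efficiency
```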
  • N is the number of points.
  • the two-dimensional needle pattern can be manually adjusted before applying the automatic needle tracking algorithm. After capturing the final scan image series, the automatic needle tracking starting positions are important to the algorithm. Because of prostate swelling and seed migration, the needle tracking starting positions may differ from the positions previously clicked on the computer screen during the needle insertion procedure. So the needle pattern must be shifted and scaled as a group, and any single needle can be adjusted in its position as desired according to the current TRUS image series.
  • the user interface explained above, or any other suitable user interface, can be used.
  • FIGS. 27 A- 27 D illustrate the process.
  • FIG. 27A shows the original needle pattern.
  • FIG. 27B shows needle pattern expansion;
  • FIG. 27C shift;
  • FIG. 27D single needle movement.
  • the scale movement and the shift movement may be performed alternately several times for the best match.
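The group and single-needle adjustments of FIGS. 27A-27D can be sketched as plane transforms of the needle coordinates; scaling about the pattern centroid is an illustrative choice.

```python
import numpy as np

def adjust_pattern(points, scale=1.0, shift=(0.0, 0.0)):
    # Scale the whole 2D needle pattern about its centroid, then shift it;
    # scale and shift may be alternated several times for the best match.
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    return (pts - centroid) * scale + centroid + np.asarray(shift, dtype=float)

def move_needle(points, index, new_xy):
    # Adjust the position of any single needle separately (FIG. 27D).
    pts = np.asarray(points, dtype=float).copy()
    pts[index] = new_xy
    return pts
```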
  • a global search method can be used to identify radioactive seeds in TRUS image series.
  • the global search method can be used to identify radioactive seed candidates in TRUS image series and use their positions to correct minor bias of the seed positions identified by needle tracking/seed segmentation algorithm.
  • the global search method can be used to identify radioactive seed candidates in TRUS image series and use their positions and planned needle pattern to match out the final needle pattern.
  • FIGS. 28A-28C show an example: FIG. 28A is an original ultrasound image; FIG. 28B is the grayscale-transformed image, in which the radioactive seed echoes are more distinct; FIG. 28C is the measurement of the grayscale summation at each point, from which one can see the clear relationship between the radioactive seed echo bright points and the local maxima of the measurement matrix. That is only an example.
  • the measurement, in fact, can be a weighted summation of various image features.
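A minimal sketch of the global search follows, assuming the measurement matrix has already been formed (e.g., the grayscale summation of FIG. 28C); the 5 × 5 neighborhood and the threshold are illustrative parameters.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def seed_candidates(measurement, threshold):
    # A pixel is a seed candidate where the measurement is both a local
    # maximum and sufficiently bright.
    is_local_max = measurement == maximum_filter(measurement, size=5)
    return np.argwhere(is_local_max & (measurement > threshold))
```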
  • That method can identify most clearly visible radioactive seeds in TRUS image series. However, it can also mis-identify some other echo-bright points, such as calcifications, bleeding, air gaps, etc., as radioactive seeds. So the identified results may only be used as "seed candidates," not as radioactive seeds themselves.
  • the seed candidates can be used either to match out the final needle pattern with the comparison of planned needle pattern (or manually clicked needle pattern) or to correct minor bias of the seed positions identified by needle tracking/seed segmentation algorithm.
  • the 3D pattern of the seed candidates set is projected onto a 2D plane and matched by the 2D pattern of needles, which is drawn on the screen in real time when the needle is inserted. Since the 2D needle pattern can be adjusted as a whole for shift and scale changes, and each needle position can be adjusted separately, the final needle pattern may be obtained in reference to the 2D projection pattern of the seed candidates.
  • the final detected seed positions may not be exactly on the three-dimensional geometric center of the radioactive seeds. So we can compare their positions with the seed candidates; if any pair is within a distance threshold, for example, 0.15 cm in three dimensions, we can consider them as one radioactive seed and merge it to the position of the seed candidate. By such a method, we can correct some minor bias when using the needle-tracking-based identifying algorithm.
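The bias-correction step can be sketched as a nearest-candidate merge; positions are assumed to be expressed in centimeters so the 0.15 cm threshold applies directly.

```python
import numpy as np

def merge_with_candidates(seed_positions, candidates, threshold=0.15):
    # Snap each tracked seed to the nearest global-search candidate when
    # the pair lies within the 3D distance threshold (here 0.15 cm).
    cand = np.asarray(candidates, dtype=float)
    corrected = []
    for seed in np.asarray(seed_positions, dtype=float):
        d = np.linalg.norm(cand - seed, axis=1)
        i = int(np.argmin(d))
        corrected.append(cand[i] if d[i] < threshold else seed)
    return np.asarray(corrected)
```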

Abstract

A placement plan is developed for the placement of radioactive seeds or other therapeutic agents in a prostate for brachytherapy. The placement plan is made available to an intraoperative tracking interface which also shows a live ultrasound image of the needle or catheter placement in the prostate. Position errors can be detected and corrected. Techniques for merging of orthogonal ultrasound image sets, classification of malignancy through neural networks, and statistical detection of seeds in an ultrasound image can be used.

Description

    REFERENCE TO RELATED APPLICATION
  • The present application is a continuation-in-part of U.S. patent application Ser. No. 09/636,551, filed Aug. 11, 2000, which claims the benefit of U.S. Provisional Application No. 60/200,493, filed Apr. 28, 2000. The disclosures of both of the above-referenced applications are hereby incorporated by reference in their entireties into the present disclosure.[0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention is directed to an improvement to treatment plans using brachytherapy or the like and more specifically to a technique for rapid and accurate identification and quantification of needle placement departures from such a treatment plan. The present invention is also directed to improving the brachytherapy treatment quality by realizing intraoperative seed (or other radioactive source or therapeutic agent) localization, dosimetry evaluation and feedback optimization. The present invention also relates to medical ultrasound imaging of the prostate, and more particularly to a method and apparatus for generating 3D orthogonal compound ultrasound (OCU) images for the diagnosis and treatment of prostate cancer. [0003]
  • 2. Description of Related Art [0004]
  • (1) Background of Intraoperative Optimization and Quality Evaluation [0005]
  • In the treatment of prostate cancer, a method is often employed to implant numerous radioactive seeds in a carefully preplanned pattern in three dimensions within the prostate. That procedure serves to deliver an amount of radiation dosage concentrated around the prostate, while at the same time sparing radiation-sensitive tissues, such as the urethra, the bladder and the rectal wall. Customarily, 60 to 140 seeds are placed through 15 to 30 needles in the inferior (feet) to superior (head) direction. Those needle positions are selected from a 13×13 grid of 0.5 cm evenly spaced template holes [U.S. Pat. No. 5,957,935, Sep. 28, 1999, Guide and holding bracket for a prostate implant stabilization device], which are used to achieve precise needle insertion. The number of those holes that intersect with the prostate cross section, and therefore are potentially usable, is typically about 60. The number of mathematical combinations is therefore greatly in excess of 10^16, each of which is a potential treatment plan but is associated with different degrees of cancer control and a different likelihood of treatment complications [Y. Yu and M. C. Schell, "A genetic algorithm for the optimization of prostate implants," Med. Phys. 23, 2085-2091 (1996)]. [0006]
  • In current clinical practice, the design of a suitable seed configuration that is customized to the anatomy of a patient is achieved by a highly trained medical physicist or dosimetrist by using trial-and-error manual iterations. The practitioner usually starts with an initial needle configuration based on personal experience or rules of thumb, and then adjusts the radioactive strength per seed or the locations of certain needles or both, until the calculated dose intensity distribution satisfies a set of clinical considerations. That process requires between 15 minutes and 2 hours, depending on the experience of the treatment planner and the geometric complexity of the relationship between the prostate and the surrounding anatomical structures. [0007]
  • Currently, those known treatment planning processes are typically aided by one of several available commercial computerized treatment planning systems. Such treatment planning systems enable the user to manually outline the prostate in relation to a template grid, to turn on or off any available needle positions and seed positions within each needle, and to examine the resultant dose distribution in two or three dimensions. Examples of such planning systems include those offered by Multimedia Medical Systems (MMS) of Charlottesville, Va., SSGI Prowess, of Chico, Calif., Nucletron Plato, from Columbia, Md., Computerized Medical Systems (CMS) Focus, of St Louis, Mo., Radiation Oncology Computer Systems (ROCS), of Carlsbad, Calif., ADAC Laboratory's Pinnacle, of Milpitas, Calif. and Theraplan, available from Theratronics International Ltd. of Kanata, Ontario, Canada. [0008]
  • In a number of such known commercial treatment planning systems, for example, those available from MMS and SSGI, the initial needle configuration that otherwise would have to be turned on by the human treatment planner is automatically set up by the computer system. That initial setup of seed configuration is based on simple rules of thumb, such as uniform loading, peripheral loading or modified peripheral loading. In a number of instances, the manufacturer claims that its planning system offers “automatic planning,” “geometric optimization,” or “real-time dosimetry.” However, none of those commercial planning systems offer true optimization in that the automatically loaded seeds are not designed based on customized dosimetric calculations. Rather, they are designed to fill the space of the prostate in some predetermined manner. Therefore, such known automatic seed loading techniques are designed to save between 15 and 30 mouse clicks by the operator (or about 1 minute of operation). However, the user is still required to apply his or her expert knowledge to iteratively improve upon that initial design in order to achieve customized optimization planning for any individual patient. Thus, there are two significant drawbacks of the above-mentioned current techniques: first, the complete treatment planning process is under the manual guidance of a radiation planning expert using trial and error techniques; and second, the adjustment of the delivered dose is achieved by varying the radioactive strength per seed until an isodose surface with the desired shape and size is scaled up or down to the prescription dose, i.e., those techniques will suffer when the activity per seed is fixed, as at the time of surgical implantation in the operating suite. [0009]
  • Because of those two severe drawbacks, the currently available commercial treatment planning systems are not suitable for intraoperative treatment planning in the surgical suite, where the patient is placed under anesthesia in volatile conditions and where the cost per minute is very high. The variability of human performance, experience and stress, and the general inability of humans to manage large amounts of numerical data in 1 to 2 minutes are also factors that deter current practitioners from performing intraoperative treatment planning. [0010]
  • Because of those and other recent postimplant dosimetry reports, there is now a resurgence of research interest as well as commercial interest in developing methods for immediate dose verification based on intraoperative imaging of the prostate and implanted seeds on ultrasound during the procedure. Technology advances now permit intraoperative optimized planning of dosimetry, auto-segmentation of prostate anatomy, and software tracking of seed deposition in real time, given adequate image quality. Regions of sub-optimal dosimetry can be identified before the end of the procedure, when supplemental seed placement is still possible. [0011]
  • An optimization technique for treatment planning is taught by U.S. Pat. No. 5,391,139 to Edmundson. More specifically, Edmundson is intended for use with a high dose rate (HDR) source which is moved within a hollow needle implanted in a prostate or other anatomical portion. The medical personnel using the system of Edmundson select a needle location using empirically predetermined placement rules. An image is taken of the prostate with the hollow needles implanted in it, and the dwell time of the source at each dwell position in the needle is optimized. However, placement itself is not optimized, but must instead be determined by a human operator. [0012]
  • Another optimization technique is taught by WO 00/25865 to one of the inventors of the present invention. An implant planning engine plans implants for radiotherapy, e.g., prostate brachytherapy. The system optimizes intraoperative treatment planning on a real-time basis using a synergistic formulation of a genetic algorithm, multi-objective decision theory and statistical sensitivity analysis. [0013]
  • While the above techniques allow calculation of optimized dwell time, placement or the like, they do not provide for detection and correction of errors in needle or seed placement. The optimized pre-planning does not mean that the best plan can be realized in clinical practice. In fact, it is very difficult to insert the implantation needles to the precise three-dimensional positions manually. That is, the actual implantation results do not match with the plans. Furthermore, seeds are displaced from their planned locations by the elasticity of surrounding tissue as soon as they are dropped from the needle, thus introducing further errors. In those situations, the realized treatment dosimetry will be lower than the planned dosimetry so that the prostate will be underdosed and the patient will not receive the most effective therapy. To solve that problem, intraoperative detection and compensation of errors in needle and seed placements must be performed automatically, without halting the operation, both during the procedure and immediately after the completion of the implantation plan. [0014]
  • (2) Background of 3D Orthogonal Compound Ultrasound [0015]
  • Although transrectal ultrasound (TRUS) imaging is now widely used in the diagnosis and treatment of prostate cancer as an image guidance tool, the current imaging modality is a single scan using either a transverse-imaging or a longitudinal-imaging ultrasound transducer, and thus the image quality is inadequate for clearly visualizing tumors, prostate anatomy, and implanted radioactive sources due to the noise, speckles, and shadows existing in the TRUS images. [0016]
  • The effective accuracy of intraoperative optimization rests on the inherent information content of the TRUS image set. Research to date has carefully quantified the percentile detection rate of seeds in the prostate as well as prostate segmentation during brachytherapy, showing that the image content from standard axial or sagittal scans alone cannot consistently achieve 100% accuracy for dosimetry purposes. [0017]
  • The most common technique used in the diagnosis of prostate cancer is core biopsy under transrectal ultrasound (TRUS) image-guidance and histopathological analysis of the sampled tissues. Although TRUS guided core biopsy is routinely used clinically, the existing popular systematic sextant biopsy protocol is qualitative in nature and results in a positive predictive value of only 32% so that a significant number of patients with prostate cancer are not being diagnosed and treated. Currently, standard transrectal ultrasound, computer tomography, and magnetic resonance imaging with endo-rectal coils have failed to accurately determine the volume and stage of prostate tumors pre-operatively, and prostate cancer grade determined on biopsy commonly under-grades the actual tumor discovered in the final specimen. So there is an increasing need to improve the accuracy of assessing the volume, location, and stage of prostate tumors prior to treatment, especially since brachytherapy has gained popularity. Accurate knowledge of such tumor characteristics through an enhanced non-invasive imaging modality would result in more precise candidate selection and execution of that treatment. Other innovative treatment options would also benefit. However, most innovative imaging systems use methodologies that are not widely available to clinicians in day-to-day clinical practice. Those include magnetic resonance spectroscopy/angiography or PET scans. Such imaging studies, while initially promising, are not a significant improvement over existing methods and would not be expected to be widely available to practicing physicians in the near future. By contrast, virtually all urologists and radiologists are familiar with transrectal prostate ultrasonography and biopsy or implantation of radioactive implants. However, conventional TRUS imaging has two drawbacks: (1) an inability to ensure biopsy/treatment in a similar location to the original site and (2) images are two-dimensional and thus cannot be correlated reliably to histologic sections. [0018]
  • Siemens Medical Systems, Inc. owns a patent named "3-Dimensional compound ultrasound field of view" (U.S. Pat. No. 5,655,535). However, their compound field of view ultrasound image is derived from correlated frames of ultrasound image data, while ours is uncorrelated. They used frames of sensed echo signals to detect probe motion without the use of a dedicated position sensor or motion sensor, which is also different from our sensor-based position and orientation detection method. Their purpose of compounding is to increase the limited field of view of the ultrasound transducer, not to increase the image quality of ultrasound. [0019]
  • SUMMARY AND OBJECTIVES OF THE PRESENT INVENTION
  • It will be apparent from the above that a need exists in the art to detect and correct errors in implementation of a brachytherapy treatment plan. [0020]
  • It is therefore a primary object of the present invention to permit rapid and accurate identification and quantification of needle and seed placement departures from a treatment plan generated prior to a brachytherapy implant based on real-time ultrasound. [0021]
  • It is another primary object of the invention to allow a 3D orthogonal compound ultrasound (OCU) imaging system to obtain the real-time images of prostate anatomy and radioactive sources with higher signal-to-noise ratio than current ultrasound technologies. [0022]
  • It is another object of the invention to allow the 3D OCU imaging system to be used to improve the precision and correctness of radio-active source localization of intraoperative dosimetry evaluation and treatment feedback. [0023]
  • It is another object of the invention to allow the 3D OCU imaging system to be used as an imaging guidance in physical biopsy so that the tumor lesions and prostate anatomy can be more easily visualized by the doctors for better diagnosis. [0024]
  • It is another object of the invention to allow real-time correction to the brachytherapy dosimetry and iterative compensation of loss of dose coverage due to misplacement of the needles/catheters and seeds. [0025]
  • It is still another object of the invention to permit such identification, quantification and correction without the need for Computer Tomography (CT) or Magnetic Resonance Imaging (MRI) during the interval between needle/catheter placement in the target organ and final deposition of radioactive sources for irradiation of the target organ. [0026]
  • (1) Dynamic Dosimetry Evaluation and Treatment Feedback [0027]
  • To achieve the above and other objects, a first embodiment of the present invention is directed to a technique for identifying and quantifying needle and seed displacement departures from a placement plan for the placement of radioactive sources in a prostate or other internal organ for brachytherapy or the like. The placement plan is made available to an intraoperative tracking interface, which also shows a live ultrasound image of the needle or catheter placement in the prostate. The needle or catheter placement in live ultrasound images can be located manually by clicking a mouse or semi-automatically by image segmentation algorithms. The manual or semi-automatic segmentation methods depend upon the contrast between the background image and the brightness of the needle tips. The difference in the x-y plane between the planned and actual locations of the needle or catheter is calculated, and from that difference, the error in position of each seed is calculated from the implantation pattern of that needle. The seeds are moved, or the operator changes the number of seeds, and the dose is recalculated. After the seeds are implanted in the tissue, a small three-dimensional column of ultrasound images is taken along the track of the needle, and each seed located in the column of images is automatically recognized and given a confidence level. The confidence level shows the accuracy of the seed recognition in view of the large noise interference in the transrectal ultrasound images. The dosimetry is recalculated based on the recognized seeds whose confidence level exceeds a threshold preset by the operator. Periodically throughout the seed placement and at the end of seed placement, fluoroscopic x-rays are taken, and the seed coordinates found in the transrectal ultrasound images are matched to the seed coordinates found in the fluoroscopic x-ray images. Seed locations with low confidence levels are adjusted based on the x-ray locations because the x-ray images can provide an accurate planar projection of each seed, and the dosimetry is recalculated based on those revised seed positions. [0028]
  • In a preferred embodiment, the technique is carried out through the following steps. [0029]
  • 1. The needle/catheter placement plan is made available to an intraoperative tracking interface. That interface contains an electronic worksheet of needle and seed coordinates, a live ultrasound image window into which real-time video image of needle/catheter placement is fed, and a series of isodose dosimetry panels reflecting the current state of dose coverage. Each of the needles/catheters can be activated by highlighting the corresponding row in the coordinates worksheet, or by highlighting the corresponding grid location graphically. [0030]
  • 2. Following insertion of each needle/catheter, a hyperechoic (i.e., bright) spot appears on the live ultrasound image. That location of hyperechoic spot is manually identified by the operator or semi-automatic thresholding method. The difference in the x-y plane between the planned location and the actual location of the needle/catheter is calculated to give errors Δx and Δy. The errors Δx and Δy are then reflected on the grid location and worksheet. The errors of each seed, Δx′ and Δy′, are calculated based on straight line interpolation at the planned z location of the seed; the straight line is constructed by joining two known points: (a) the actual needle location shown on ultrasound at the known z plane, (b) the template coordinate outside the patient body, through which the needle is inserted under precision template guidance (therefore at that location Δx and Δy shall be assumed to equal zero). The dose is then recalculated by moving the seeds along the activated needle/catheter in x and y by amounts Δx′ and Δy′, which may be the same or different for each and every seed. The dosimetry updated by such feedback of seed placement errors is redisplayed on the series of isodose panels. [0031]
  • In addition, the operator is permitted to change the number of seeds deposited by the needle/catheter in question. In that case, the operator is required to enter the new seed locations along the needle/catheter, which overrides the original treatment plan. Seed placement errors in such a case are tracked identically to the procedure described above. [0032]
  • 3. A small 3D column of ultrasound images is acquired along the needle track as constructed above. That column can be perpendicular to the x-y plane, or in fact may often sustain an angle α and an angle β from the x and the y planes, respectively. The exact number of seeds as deposited is identified using image processing and recognition algorithms in that column of 3D ultrasound region of interest. Each seed identified in that manner is assigned a confidence level, which depicts the likelihood/uncertainty of seed localization. The size of that column is initially set small; if the total number of seeds found in that manner is not equal to the number of seeds deposited by the given needle/catheter due to the misplacement of seeds, the width of the column is adaptively adjusted to find the lost seeds (e.g., the width is increased to find additional seeds). [0033]
  • Whereas the previous step only quantifies the errors Δx′ and Δy′ for each seed, the ultrasound step quantifies Δz′ for each seed and at the same time further corrects Δx′ and Δy′. If the confidence level of a given seed's localization exceeds a threshold value (to be set by the operator), the dosimetry is re-calculated yet again using the updated seed location and displayed in the same isodose panels. The isodose calculated is assigned a confidence level, which is a numerical composite of the individual confidence levels of the seeds and the dosimetric impact of positional uncertainties at each seed location (e.g., in high dose region, positional uncertainty has low impact). [0034]
  • 4. Periodically throughout the seed placement procedure and at the end of seed placement, fluoroscopic x-ray images may be taken in the anterior-posterior direction and at up to ±45 degrees on either side of the anterior-posterior directions. The seed coordinates as determined above with ultrasound images are two-dimensionally projected in the same orientations. A best match to the x-ray seed projections is made based on multiple point matching using those seed identifications with the highest confidence levels. Subsequent to such matching, the seed locations with low confidence levels are adjusted based on the x-ray locations. As a result, the confidence levels of those latter seeds are increased by an amount reflective of the best match quality. The dosimetry is recalculated. The confidence level of the dosimetry is updated using updated confidence levels of the seeds. [0035]
  • (2) Structure of 3D Orthogonal Compound Ultrasound [0036]
  • Currently, 3D TRUS images are usually created by collecting a series of 2D image slices at evenly spaced intervals in real time during a transducer scan, in combination with detection of the position and/or orientation of the transducer. One approach is mounting the transducer in a computer-controlled motor-driven assembly. Another approach is attaching a sensor to the transducer and measuring the relative position and orientation from the sensor to a transmitter by using non-contact sensing techniques. Currently, the technique of using an electromagnetic (EM) sensing system for that purpose has already been successfully implemented by a number of groups including the present inventors. [0037]
  • Two scanning methods for 3D TRUS imaging are linear scanning and rotational scanning. Linear scanning uses a transducer with a curvilinear transverse array that is pulled back manually or mechanically in the cranio-caudal direction to acquire images of transverse (axial) view of the prostate, whereas the rotational scanning technique rotates a transducer with a side-firing array manually or mechanically to acquire the longitudinal images. [0038]
  • The linear scanning technique is now widely used because therapy guidance requires that the transverse positions of the needles be clearly visible. However, that scanning method has two disadvantages. One is that the edges of the base and apex of the prostate are nearly invisible on the transverse images because of the complexity of the human anatomical structure in those two areas; that leads to inaccuracies in volume estimates and dosimetric planning. The other is that the implanted seeds cannot be fully localized because some seeds are invisible on the transverse image series due to the impact of the speckle noise, shadowing by other seeds, and calcification, bleeding and tissue heterogeneity of the gland. On the other hand, the apex and base of the prostate can be more easily determined on longitudinal images from the rotational scan. Though seed visualization still suffers from the various artifacts listed above in rotational scans, the direction of the beam relative to the seeds as well as shadowing artifacts are uncorrelated with those of the transverse scan, i.e., a given seed may be invisible in one look direction but clearly visible in the other look direction. That is the basis of the rationale for orthogonal compounding. [0039]
  • Ultrasound transducers that can electronically switch from longitudinal to transverse imaging (by pressing a button) are now commercially available and widely used in prostate imaging. The basic equipment investment for OCU is already in place for most clinicians; in addition, both scan modes are routinely used by practitioners to visually assess needle placement. Adding orthogonal compounding therefore does not introduce complexity to the intervention, but may result in greatly improved imaging. [0040]
  • The present invention, the dual-scan orthogonal compound ultrasound, is a novel imaging modality aimed at overcoming those drawbacks of conventional TRUS imaging and becoming the next-generation ultrasound imaging method for the treatment (brachytherapy/thermal/cryo-therapies), diagnosis (biopsy), and screening of prostate cancer. The present invention upgrades the conventional TRUS imaging from 2D to 3D, and compounds two orthogonal views of the prostate by dual-scan imaging so that a clearer visualization of the prostate anatomy and tumors can be obtained. It has been discovered that 3D adaptive compounding of transverse and longitudinal scan sets could improve both signal-to-noise and contrast-to-noise ratios and generate phantom images with clearly visible seeds. Spatial compounding, the combination of individual ultrasound images or scans obtained from different view points, has been studied since the 1950s (Holmes et al. 1954). In gray-scale imaging, image compounding provides additional benefits such as reduction of speckle noise and shadows (Kossoff et al. 1976). Since array probes have become available, 2D compounding by electronic steering of the beam has been studied (Berson et al. 1981; Jespersen et al. 1998; Shattuck and von Ramm 1982). The present invention also includes a neural network solution for detecting tumors directly from the 3D orthogonal compound ultrasound images. [0041]
  • ATL Ultrasound Company now offers a commercial ultrasound scanner with real-time 2D compound imaging using electronic beam steering ("SonoCT™"). Although they have been issued a series of patents related to ultrasonic diagnostic imaging systems with spatial compounding (U.S. Pat. Nos. 6,117,081; 6,126,598; 6,135,956; 6,210,328), there are two main differences from the present invention. The first is that their compounding target is a two-dimensional image, while the compounding target of the present invention is a three-dimensional image volume. The second is that they use a series of partially overlapping component image frames to compound, while the present invention uses orthogonal image volumes as the compounding components, which are fully independent. Since 3D compounding can combine more independent views than 2D compounding and offers the possibility of out-of-plane refraction correction, the orthogonal 3D compound can utilize more benefits of spatial compounding. [0042]
  • (3) Statistical Analysis of Texture Features [0043]
  • Many kinds of texture features can be used to identify radioactive seeds in TRUS images, such as first-order statistical parameters (mean, maximum, minimum, summation, etc.), second-order statistical parameters (standard deviation, covariance, correlation coefficient, etc.), higher-order statistical parameters (skewness, kurtosis, etc.), frequency-domain parameters (high-pass filtering, low-pass filtering, band-pass filtering, etc.), and geometry parameters (area, Euler number, etc.). [0044]
  • In order to find effective features and their optimized combination for the purpose of identifying seeds, Logit and Probit models, discriminant analysis, and neural network classification methods are applied to two test groups. One group represents the seeds which are identified manually from the TRUS images; the other group represents the non-seeds group including typical background noise, calcifications, bleeding, air gaps, and shadows. The selection of the two groups is confirmed by the day 0 CT (i.e., CT images taken within the same day of the actual implant): the positions of the radioactive seeds in the CT images are identified automatically or manually, and affine-transformed to match the TRUS image series. Some clearly visible seeds in the TRUS image series can be visually identified and used as markers to correct the registration. The seed positions identified from CT are affine-transformed until they achieve a best match with those identified from TRUS. [0045]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Preferred embodiments of the present invention will be set forth in detail with reference to the drawings, in which: [0046]
  • FIG. 1 shows a schematic diagram of a system for carrying out any of the preferred embodiments of the present invention; [0047]
  • FIGS. 2A-2C show a flowchart of a process according to a first preferred embodiment; [0048]
  • FIG. 3 shows a user interface used in the first preferred embodiment; [0049]
  • FIG. 4 shows the user interface of FIG. 3 after the calculation of a needle offset and also identifies certain process steps of FIG. 2A with certain components of the user interface; [0050]
  • FIG. 5 shows a flow chart of an image processing technique used to identify seeds in the ultrasound images; [0051]
  • FIGS. 6A and 6B show an image with a desired grayscale distribution and a histogram of the desired grayscale distribution, respectively; [0052]
  • FIGS. 7A and 7B show an image with a typical grayscale distribution and a histogram of the typical grayscale distribution, respectively; [0053]
  • FIGS. 8A and 8B show the image of FIG. 7A after preprocessing and a histogram of the resulting grayscale distribution, respectively; [0054]
  • FIGS. 9A and 9B show a sequence of images taken in a column and an identification of those images having hyperechoic spots, respectively; [0055]
  • FIG. 10 shows a plot of a threshold used to locate the hyperechoic spots; [0056]
  • FIGS. 11A and 11B show ideal and typical plots, respectively, of brightness along a needle path; [0057]
  • FIGS. 12A-12C show three types of peaks which may occur in image data; [0058]
  • FIGS. 13A-13D show the locations of seeds in image data; [0059]
  • FIG. 14A shows a known technique for 2D spatial compounding; [0060]
  • FIG. 14B shows a schematic of 3D orthogonal compounding according to another preferred embodiment of the present invention; [0061]
  • FIG. 15 is a diagram illustrating an imaging system with a rectal shell used in the preferred embodiment of FIG. 14B; [0062]
  • FIGS. 16A-16D show four steps in the process of 3D image acquisition and reconstruction according to the preferred embodiment of FIG. 14B; [0063]
  • FIG. 17 is a schematic diagram illustrating the geometry used to describe the reconstruction algorithm used in FIGS. 16A-16D; [0064]
  • FIG. 18 is a flow chart of the algorithm of 3D reconstruction of longitudinal images; [0065]
  • FIG. 19 is a schematic diagram illustrating the construction of the 2D joint histogram; [0066]
  • FIG. 20 is a flow chart of the registration algorithm; [0067]
  • FIGS. 21A-21C are three images showing one of the sagittal images from the linear scan, from the rotational scan, and the compound image, respectively; [0068]
  • FIGS. 22A and 22B are illustrations of the intelligent compounding algorithm using global weighting and regional weighting, respectively; [0069]
  • FIG. 23 is a schematic diagram of the neural network architecture used in the normalized statistical tumor probability model (NSTPM); [0070]
  • FIG. 24 is a flow chart of the use of NSTPM during biopsy under the guidance of OCU imaging; [0071]
  • FIGS. 25A-25C show seed templates used to identify seeds in TRUS images according to another preferred embodiment of the present invention; [0072]
  • FIG. 26 shows a flow chart of a process for finding optimized weight coefficients for identification of the seeds in the TRUS images; [0073]
  • FIGS. 27A-27D show scans of the prostate in the manual adjustment of a needle pattern; [0074]
  • FIG. 28A shows an original ultrasound image used for a global search in another preferred embodiment of the present invention; [0075]
  • FIG. 28B shows a grayscale transformed image based on the image of FIG. 28A; and [0076]
  • FIG. 28C shows a graph of the measurement of the gray-scale summation of each point in the image of FIG. 28B. [0077]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Various preferred embodiments of the present invention will be set forth in detail with reference to the drawings, in which like reference numerals refer to like elements throughout. [0078]
  • FIG. 1 shows a system 100 on which any of the preferred embodiments can be implemented. The system 100 includes a computer 102, which can be the same as the computer used in either of the above-cited Edmundson and Yu references or any other suitable device. The computer uses a display 104 and a user input device or devices such as a keyboard 106 and a mouse 108. Other input devices can be used; for example, the mouse 108 can be replaced by a light pen for use with the display 104. The computer also receives input from an ultrasound device 110 and a fluoroscopic x-ray device 112 by image digitization. [0079]
  • The system also includes components for administering the brachytherapy to the patient. Those components include needles 114 having radioactive seeds 116 (or other radioactive objects, or, even more generally, other therapeutic agents) spaced therealong in accordance with a treatment plan. A template 118 having a grid of holes 120 is used to position the needles 114 for insertion into the patient's prostate. The specifics of the needles 114, the seeds 116 and the template 118 are known from the prior art cited above. The needles 114 can be replaced by hollow needles or catheters in accordance with the treatment plan to be used. [0080]
  • (1) Dynamic Dosimetry Feedback [0081]
  • The use of the system 100 in implementing the first preferred embodiment will now be explained with reference to the flow chart of FIGS. 2A-2C. In step 202, a treatment plan is developed. Such a treatment plan can be the one developed in the above-cited Yu reference and can be developed either on the computer 102 or on a different device. In step 204, the treatment plan is made available to an intraoperative tracking interface implemented on the computer 102. If the treatment plan is not developed on the computer 102, an appropriate communication medium can be provided to supply the treatment plan to the computer 102. [0082]
  • The intraoperative tracking interface is displayed to the user on the display 104. As shown in FIG. 3, the intraoperative tracking interface 300 includes the following components. An electronic worksheet 302 shows needle and seed coordinates, based on the grid of holes 120 in the template 118, and identifies needle locations with dots 304. A live ultrasound image window 306 shows a real-time image of a section of the prostate obtained from the ultrasound device 110 and allows a real-time view of needle placement in the prostate. From the placement of the seeds, the dosimetry is calculated, and a series of dosimetry panels 308 are shown, each showing the dosimetry in a respective slice of the prostate from the base to the apex. The dosimetry in the panels 308 is shown by isodose lines 310. The electronic worksheet 302 further includes a spreadsheet 312 in which each row indicates the information of one of the needles. The spreadsheet 312 includes a column 314 indicating a needle by a number, a column 316 identifying the hole 120 in the template 118 into which that needle is inserted by its coordinates (letter and number), a column 318 indicating an offset, a column 320 indicating the number of seeds on the needle, a column 322 indicating a Δx offset of the needle from its planned position, a column 324 indicating a Δy offset of the needle from its planned position, a column 326 indicating the number of currently selected seeds whose offsets have been calculated and a column 328 indicating a total number of seeds whose offsets have been calculated. A needle position 304 which the operator has selected is shown on the interface 300 as flashing, as is the corresponding row 330 in the spreadsheet 312. [0083]
  • Following the insertion of each needle or catheter in step 206, the live ultrasound image 306 of the interface 300 displays a bright (hyperechoic) spot 332 in step 208. In step 210, the operator manually identifies the spot 332, e.g., by clicking on its center with the mouse 108. In step 212, the difference in the x-y plane between the planned location and the actual location of the needle or catheter is calculated to give errors Δx and Δy, which are shown both on the grid 302 and on the highlighted row 330 of the spreadsheet. The positional errors in the x-y plane of each seed, Δx′ and Δy′, are calculated in step 214 based on straight line interpolation at the planned z location of the seed. The straight line used in the interpolation is constructed by joining two known points: (a) the actual needle location shown on ultrasound at the known z plane and (b) the template coordinate outside the patient body through which the needle is inserted under precision template guidance. At the template 118, Δx and Δy are assumed to equal zero. The dose is then recalculated in step 216 by moving the seeds along the activated needle or catheter in the x and y directions by those amounts Δx′ and Δy′, which may be the same or different for every seed. The dosimetry updated by such feedback of seed placement errors is redisplayed in step 218 on the series of isodose panels 308. FIG. 4 shows the updated interface 300 and also identifies some of the above-mentioned method steps in association with the corresponding elements of the interface 300. [0084]
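The straight-line interpolation of step 214 can be sketched as below; z is assumed to be measured from the template plane (where Δx = Δy = 0) toward the imaged plane.

```python
def seed_offset(dx_needle, dy_needle, z_needle, z_seed):
    # Linear interpolation between the template (z = 0, zero error) and
    # the needle position observed on ultrasound at z = z_needle.
    f = z_seed / z_needle
    return f * dx_needle, f * dy_needle

# Example: a needle observed 0.12 cm off in x at z = 4.0 cm implies that a
# seed planned at z = 2.0 cm is displaced by about 0.06 cm in x.
dx, dy = seed_offset(0.12, 0.0, 4.0, 2.0)
```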
  • In addition, the operator is permitted to change the number of seeds deposited by the needle or catheter in question in step 220. In that case, the operator is required to enter the new seed locations along the needle or catheter, which overrides the original treatment plan in step 222. Seed placement errors in such a case are tracked identically to the procedure described above. [0085]
  • In step 224, a small 3D column of ultrasound images is acquired along the straight line constructed in step 214. That column can be perpendicular to the x-y plane or may be at an arbitrary angle from the x and/or the y planes. The exact number of seeds as deposited is identified in step 226, using image processing and recognition algorithms to be described below, in the 3D column of ultrasound images. Each seed identified in the ultrasound images is assigned a confidence level in step 228, which indicates the likelihood or uncertainty of seed localization. [0086]
  • The size of the 3D column is initially set small. If it is determined in step 230 that the total number of seeds found in step 226 is not equal to the number of seeds deposited by the given needle or catheter, the width of the column is adaptively adjusted in step 232; for instance, the width is increased to find additional seeds. [0087]
  • Thus, Δz′ is quantified for each seed, and at the same time, Δx′ and Δy′ are further corrected according to step 214. If it is determined in step 234 that the confidence level of a given seed's localization exceeds a threshold value (set by the operator), the dosimetry is re-calculated yet again in step 236 using the updated seed location and displayed in the same isodose panels. The isodose calculated is assigned a confidence level in step 238, which is a numerical composite of the individual confidence levels of the seeds and the dosimetric impact of positional uncertainties at each seed location. For example, in a high dose region, positional uncertainty has low impact. [0088]
  • Periodically throughout the seed placement procedure, and at the end of seed placement, fluoroscopic x-ray images may be taken in [0089] step 240 in the anterior-posterior direction and at up to ±45 degrees on either side of the anterior-posterior direction. The seed coordinates as determined above are projected in the same orientations in step 242. A best match to the x-ray seed projections is made in step 244 based on multiple point matching using those seed identifications with the highest confidence levels. Subsequent to such matching, the seed locations with low confidence levels are adjusted in step 246 based on the x-ray locations. As a result, the confidence levels of those latter seeds are increased by an amount reflective of the best match quality. In step 248, the dosimetry is recalculated, and the confidence level of the dosimetry is updated using the updated confidence levels of the seeds.
  • The image processing algorithms used in carrying out [0090] step 226 will now be explained. As shown in the flow chart of FIG. 5, there are three basic steps. In step 502, which is a preprocessing step, the image brightness and contrast are adjusted to make the hyperechoic spots more distinct. In step 504, the seed pathway is tracked for further correcting the offsets Δx′ and Δy′ of the implanted seeds. In step 506, the seeds are identified for correcting Δz′ for each seed along the tracking pathway and for fine-tuning Δx′ and Δy′ for each seed.
  • [0091] Step 502 involves applying a grayscale histogram transformation to each image in the ultrasound series from apex to base and is thus a pre-processing step. The purpose of step 502 is to adjust the brightness and contrast of the images so that the hyper-echoic spots will be more distinct in the transformed images. According to experience acquired from many actual OR cases, an image suitable for seed recognition processing has a grayscale histogram similar to that shown in FIGS. 6A and 6B, whereas in most cases, the images as taken have grayscale histograms similar to that shown in FIGS. 7A and 7B.
  • As shown in FIG. 6B, it is preferred that the background be very dark while the hyperechoic spots be very distinct. For that preferred case, 50% of the pixels have grayscale levels below 30, representing the background and dark tissues; 90% of the pixels have grayscale levels below 60, with grayscale levels between 30 and 60 most likely representing the brighter tissues of the gland; and 95% of the pixels have grayscale levels below 80, with levels between 60 and 80 most likely representing some much brighter tissues and some weaker air gaps. The pixels with the highest grayscale levels (from 80 to 255) are the hyper-echoic spots of seeds and some stronger air gaps. [0092]
  • Here, the images are assumed to have an eight-bit grayscale depth, namely, with grayscale values from zero to 255 inclusive. Of course, other grayscale depths can be used instead. [0093]
  • In the images as taken, the 50%, 90% and 95% grayscale levels are higher than the preferred ones set forth above. In the example of FIGS. 7A and 7B, they are 60, 110 and 135, respectively. [0094]
  • To transform an image as taken into an image as preferred, the following grayscale transformation scheme can be used: [0095]
    Original image          Transformed image
    Below median (0˜50%)    1
    50%˜90%                 1˜50
    90%˜95%                 51˜75
    95%˜100%                76˜255
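• A minimal Python sketch of that piecewise-linear mapping (illustrative only; the percentile breakpoints come from the table above, everything else is an assumption):

```python
import numpy as np

def histogram_transform(img):
    """Piecewise-linear grayscale mapping driven by the 50/90/95th percentiles."""
    p50, p90, p95 = np.percentile(img, [50, 90, 95])
    top = max(float(img.max()), p95 + 1.0)      # guard against a flat top band
    eps = 1e-9                                  # avoid division by zero
    out = np.ones(img.shape, dtype=np.uint8)    # below-median pixels map to 1
    m = (img > p50) & (img <= p90)
    out[m] = 1 + (img[m] - p50) * 49.0 / (p90 - p50 + eps)
    m = (img > p90) & (img <= p95)
    out[m] = 51 + (img[m] - p90) * 24.0 / (p95 - p90 + eps)
    m = img > p95
    out[m] = 76 + (img[m] - p95) * 179.0 / (top - p95)
    return out
```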
  • When the image of FIGS. 7A and 7B is subjected to such a transformation, the result is as shown in FIGS. 8A and 8B. A comparison of FIGS. 7A and 7B with FIGS. 8A and 8B shows that the hyper-echoic spots in the transformed image of FIGS. 8A and 8B are more distinct than they are in the original image of FIGS. 7A and 7B. Thus, it is easier for the subsequent algorithms to track and identify the seeds. More importantly, it is possible for the algorithms to use unified parameters to process cases with different brightness and contrast settings. At the same time, that grayscale transformation does not lose the important information in the ultrasound images. [0096]
  • [0097] Step 504, automatic tracking of the seeds along a same needle, is used to correct Δz′ (displacement from the planned location) of the implanted seeds. Step 504 involves tracking the pathway of the seeds, not just the seeds themselves. In other words, the air gaps are also included, and step 504 does not discriminate the seeds from the air gaps. Step 504 uses the grayscale information of a region of interest (ROI), such as the maximum value of a hyper-echoic spot, the mean and the standard deviation of the ROI, the contrast defined by the maximum value divided by the mean, and first/second order texture parameters.
  • In [0098] step 504, the center and the size of an ROI are preset. That operation can be done manually by the operator by clicking the mouse on the hyper-echoic spots at any z-position, or automatically by using the information from the treatment plan and the needle tracking method described above. Thresholding and analysis are then used to determine whether there is a hyper-echoic spot in the ROI. If there is, the ROI center of the next image in the series is switched to the current center position. If not, the previous center is kept.
  • [0099] FIG. 9A shows a column of images taken along the pathway corresponding to grid coordinates I2 in the grid 302 of FIG. 3. FIG. 9B shows the same column of images, with boxes identifying the images in which hyperechoic spots have been identified. As shown in FIG. 9B, each hyperechoic spot occupies five consecutive images because of the dimensions of the seed relative to the interval at which the images are taken; an illustrative example of the relevant dimensions will be given below.
  • The threshold measurement based on the grayscale analysis of the ROI can be illustrated by FIG. 10. For the sake of clarity of illustration, FIG. 10 shows only the maximum, mean, and contrast measurements because they can be shown in a 2-D plot. FIG. 10 is not drawn to scale, and the parameters are examples only, used to make the illustration more intuitive. [0100]
  • [0101] The ROI whose grayscale features fall in the shadow area of FIG. 10 is identified as an ROI containing a hyper-echoic spot. In the figure, the four borders of the shadow area are represented with four lines a, b, c, and d, respectively. The lines a and b indicate that the maximum value of the ROI should be between grayscale levels 75 and 255. The line c indicates that the mean value of the ROI should be greater than 5. The line d indicates that the contrast (the slope of the line in the 2-D coordinate system constructed by the mean and maximum) should be greater than 2.
  • In practice, the line d may be replaced by a curve e (the dotted curve in FIG. 10), which delimits the border more accurately. That is because variations of the mean and the contrast may result in different thresholds. Generally speaking, the greater the mean, the smaller the threshold. As a result, curve e is in the form as shown in the figure. The curve e can be implemented as a curve equation or as a look-up table for correlating the threshold to the mean. [0102]
  • Extending the illustrative example of FIG. 10 to more measurement parameters results in a multi-dimensional space and a shadowed sub-space similar to the shadow area in the 2-D space in FIG. 10. [0103]
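• The threshold test of FIG. 10 might be sketched as follows (illustrative only; the limits 75, 255, 5 and 2 are the values given above, and the optional lookup table stands in for curve e):

```python
def roi_has_spot(roi, contrast_lut=None):
    """Test whether an ROI's grayscale features fall inside the shaded region."""
    mx = float(roi.max())
    mean = float(roi.mean())
    contrast = mx / mean if mean > 0 else 0.0
    # line d: fixed contrast threshold; curve e: a mean-dependent threshold
    c_min = contrast_lut(mean) if contrast_lut else 2.0
    return 75 <= mx <= 255 and mean > 5 and contrast > c_min
```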
  • [0104] Step 506, detecting the real z-position of each seed placed along the needle track, is in fact a task of cutting the seed pathway into several segments by discriminating the spots representing seeds from any spots representing air gaps. The grayscale information cannot be used to achieve that goal because some stronger air gaps have greater measurement values than weak seeds, as will be explained below with reference to FIG. 11B. Therefore, a waveform analysis method is used instead.
  • [0105] To simplify the illustration, it is assumed that the distance between two contiguous images is 0.5 mm, so that one seed can occupy at most 10 images in the series, and it usually occupies fewer than 10 due to its slant. Thus, in a case in which the gland has a length of 4.5 cm, the offset is 5 mm, and there are 5 seeds with special spacing (i.e., no spacers) at the apex, an ideal waveform of a needle track should have the appearance shown in FIG. 11A, having rectangular peaks 1102, 1104, 1106, 1108 and 1110 indicating the seeds. However, a real waveform is more likely to have the appearance shown in FIG. 11B, having irregular peaks 1112, 1114, 1116, 1118 and 1120 indicating the seeds.
  • [0106] It can be seen in FIG. 11B that although the measured value (MV) of the second peak 1114 is less than that of the air gap 1122 between the peaks 1116 and 1118 or that of the air gap 1124 between the peaks 1118 and 1120, the second peak 1114 has the waveform of a peak, while each of the air gaps 1122 and 1124 has the waveform of a valley. That distinction between peaks and valleys can be used to discriminate the seeds from the air gaps.
  • [0107] Since it is already known how many seeds are placed in the needle track, the positions of the top several peaks are identified as the centers of seeds. In the case of FIG. 11B, if the plan has four seeds, their positions are taken as the peaks 1112, 1116, 1118 and 1120, but not 1114.
  • That principle is simple; the difficulty lies in the representation of the MV. Since no single grayscale measurement can reflect the whole feature set of the ROI, it is natural to use a linear combination of the measurements as the final MV, i.e., [0108]
  • MV = ∑_i a_i v_i,
  • [0109] in which vi represents each feature, such as the maximum, contrast, and standard deviation, and ai represents the coefficient of each feature. Of course, the combination of those features is not constrained to the linear composition, which is simply the simplest one. Simple least squares statistics will determine the value and the confidence interval of each coefficient.
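• A sketch of the MV computation and the least-squares fit of the coefficients (illustrative only; posing the fit against 0/1 seed labels is an assumption about how the least-squares step could be set up):

```python
import numpy as np

def fit_mv_coefficients(V, y):
    """Least-squares fit of the coefficients a_i.

    V -- feature matrix, one row per ROI, columns = (maximum, contrast, std, ...)
    y -- target values (e.g., 1 for seed ROIs, 0 for air-gap ROIs)
    """
    a, *_ = np.linalg.lstsq(V, y, rcond=None)
    return a

def measurement_value(v, a):
    return float(np.dot(a, v))   # MV = sum_i a_i * v_i
```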
  • Of course, the MV waveform should be smoothed before it can be processed because the raw signal may contain many noise peaks, as shown in FIG. 12A. Next, the false peaks are removed. For example, if two peaks have a distance less than 6 units along the z-axis, they most likely represent the same seed, so one of them will be absorbed by the other, stronger one, as shown in FIG. 12B. If a peak lies between two other higher peaks and has no distinct drop off before and after it, it is most likely noise, as shown in FIG. 12C. [0110]
  • After those adjustments to the waveform, the peaks are detected to determine how many peaks there are. If the number is greater than the implanted number N of seeds, only the highest N peaks are taken as the seeds, as explained above with reference to FIG. 11B. If the number is less than N, either seed identification is forced using second-tier peaks (with reduced confidence), or the preset transverse size of the ultrasound column is changed to process a larger needle track that includes the exact number of the implanted seeds. [0111]
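• The smoothing, false-peak removal and top-N selection might be sketched with SciPy's peak finder (illustrative only; the distance and prominence arguments approximate the merging and drop-off rules of FIGS. 12B and 12C, and the smoothing width is an assumption):

```python
import numpy as np
from scipy.signal import find_peaks

def seed_centers(mv, n_seeds, min_separation=6, smooth=3):
    """Locate the z-indices of the n_seeds strongest peaks in the MV waveform."""
    kernel = np.ones(smooth) / smooth
    mv_s = np.convolve(mv, kernel, mode="same")        # suppress noise (FIG. 12A)
    # peaks closer than min_separation collapse onto the stronger one (FIG. 12B);
    # prominence rejects peaks with no distinct drop-off around them (FIG. 12C)
    peaks, _ = find_peaks(mv_s, distance=min_separation,
                          prominence=0.05 * float(mv_s.max()))
    strongest = peaks[np.argsort(mv_s[peaks])[::-1][:n_seeds]]
    return np.sort(strongest)                          # z-indices, apex to base
```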
  • FIGS. [0112] 13A-13D show sample seed identifications along grid location I2. In FIGS. 13A and 13C, the seeds are identified by black marks M, while in FIGS. 13B and 13D, they are left unmarked.
  • Each seed identified in that manner is assigned a confidence level according to the MV level and the fall-off characteristics of the peak. The greater the MV level and the fall-off of the peak, the more likely the spot is a seed. The locations of the seeds and their confidence values are folded into subsequent dosimetry calculations, which result in a confidence level for each of the dosimetry parameters arising from the dose-volume histogram, including V100, D100, D95, D90, D80 and D50. [0113]
  • If the confidence on the chosen dosimetry parameter (currently D90) is acceptably high, seed localization is said to be reliable enough for re-planning and re-optimization of dosimetry, in order to compensate for the dosimetric impact of the aggregate seed misplacements. If the confidence on the chosen dosimetry parameter is not sufficiently high, simple Bayesian statistics are used to determine which seed localizations require increased confidence to achieve acceptable confidence in dosimetry. Repeat ultrasound scans are acquired; imaging data for the given needle column(s) are fused using redundant information but with increased signal-to-noise ratio. The above-described process is repeated starting from active seed pathway tracking and ending with dosimetry confidence analysis. [0114]
  • If repeated application of the above process still cannot achieve acceptable dosimetry confidence, fluoroscopic x-ray imaging of the seeds after the implantation will be used to increase the localization confidence of the given seeds found in TRUS images. Ultrasound-based seed identifications of high confidence value will be used as “anchors” (fiducial marks) to register the ultrasound and x-ray spaces. The stabilizing needles in the prostate can also be used as positioning marks. The z-coordinates of the low confidence seed localizations will then be corrected using the x-ray projection(s). The confidence values are increased by a variable based on the degree of seed overlap on the x-ray image, the quality of the overall registration, and the quality of the x-ray itself for seed localization. The 3 dimensional coordinates of the TRUS-detected seeds will be projected onto the x-z plane to form a 2 dimensional pattern. That pattern will then be matched with the 2 dimensional x-ray pattern by using the affine transform, optimized to a global minimum of the distance differences. The seed pairs with distance differences less than a threshold, for example, 0.2 cm, will then be considered matched pairs with high confidence and used as fiducial marks for registration. Note that the 2 dimensional x-ray pattern is the distortion-corrected pattern; the geometry parameters used for correction are acquired when the x-ray is taken. [0115]
  • (2) 3D Orthogonal Compound Ultrasound [0116]
  • Another preferred embodiment of the present invention is directed to 3D spatial compounding, which will be compared with a known technique for 2D spatial compounding. FIG. 14A shows a simple illustration of the known 2D spatial compounding. Two (or more) scans of the same plane are accurately registered and then combined together to produce a compounded image. FIG. 14B shows a schematic of 3D orthogonal compounding according to the present embodiment. Stacks of images captured from both the transverse view and the longitudinal view are converted to image volume format in Cartesian coordinates and combined. A scan is a set of images captured in one continuous sequence. The image stack captured by a scan is reconstructed into a volume. A compound volume is made up of dual orthogonal scans (or, more generally, non-parallel scans) with accurate registration. [0117]
  • A personal computer installed with a video frame grabber and an electromagnetic, optical or mechanical position and orientation measurement system (e.g., miniBIRD™, Ascension Technology, Inc., Burlington, Vt.) is connected to the ultrasound probe and used to acquire a series of 2D images using both linear and rotational scanning methods. The present inventors have verified that the EM sensor can provide a resolution of 0.5 mm in static position and 0.1° in orientation at 30 cm distance from the electromagnetic field transmitter. The frame grabber can capture real-time images from the composite video output of the ultrasound machine at the rate of 30 frames per second. [0118]
  • Any movement of the prostate due to probe pullback during image acquisition can cause instantaneous deformation of the prostate and therefore mis-registration of different image series. A system to overcome that problem is shown in FIG. 15. That system [0119] 1500 uses a rigid sheath (“rectal shell”) 1502 fitted around the TRUS transducer 1504 to avoid its direct contact with the rectal wall of the patient P, so that prostate movement is eliminated during the pullback or rotation of the transducer. The sheath 1502 is made of bio-compatible materials and is safe for human tissues. The material should also have low attenuation of ultrasonic energy so that the ultrasound image quality does not decrease significantly. Acoustic coupling is ensured by applying a small amount of gel 1506 between the surface of the transducer 1504 and the rectal shell 1502. The posterior side of the rectal shell 1502 is vented to the tissue so that pressure does not build up during forward movement. Fiducial markers 1508 visible in both the transverse and longitudinal scans can be embedded in the rectal shell 1502 and used to help register the image volumes. A position sensor 1510, such as an electromagnetic sensing system or an optical or mechanical encoder, is also provided.
  • A bi-plane ultrasound transducer, which can easily switch between transverse imaging and longitudinal imaging, is used for imaging. Transverse view images are obtained while the transducer is pulled back from base to apex at even spacing (e.g., 0.5 mm). Longitudinal images are obtained while the transducer is rotated slowly by free hand at even angular increments (e.g., 0.5°). Image acquisition is triggered automatically when the EM sensor detects probe movement to each image interval. The actual (rather than evenly-spaced) coordinates and angles of the probe at the instant of image acquisition are registered with the image; therefore, even when the probe movement is very rapid, there is no mis-registration (though there may be potential under-sampling). The measurement rate of the EM sensor is set to 120 Hz. The stream data reading mode and real-time filtering can be used to avoid the delay of the sensor. That technique has been extensively tested by the present inventors for both linear and rotational scans under realistic operative conditions. Careful error analysis indicates that sufficient accuracy and sampling may be achieved if some attention is paid to the EM transmitter positioning and the speed of the scans (no more than 1.5 cm/s for the linear scan and 15°/s for rotation). The acquired 2D images are reconstructed into a 3D volume in Cartesian coordinates by reformatting, so that the reformatted image volume can be registered and compounded with the image volume acquired by the linear scan. [0120]
  • The 3D reconstruction of the image volume acquired by rotational scan is in fact a problem of coordinate transformation from the polar coordinates (r, θ) into Cartesian coordinates (x, y). FIG. 16A shows a schematic diagram of the acquired images with respect to the transducer. The images are taken while the transducer is rotated and are thus arranged as a fan of a series of 2D images, radiating outward from the transducer, and evenly spaced over an arc of a certain degree (typically 100°). FIG. 16B shows a resulting stack of (r, z)θ images. [0121] The data from the stack of images shown in FIG. 16B are rearranged by interpolation to form a stack of (r, θ)z images. Such a stack is shown in FIG. 16C. The data from that stack are reconstructed into a 3D (x, y, z) image such as that shown in FIG. 16D. The reconstruction produces a 3D image data cube in which the relative orientation of the acquired images is preserved.
  • FIG. 17 is a schematic diagram illustrating the geometry used to describe the reconstruction algorithm. The source image grid is in polar coordinates (r, θ), while the destination image grid is in Cartesian coordinates (x, y). The gray scale value at any point p in the destination image is calculated from those of its neighbors a, b, c, and d in the source image in a manner which will now be described. [0122]
  • If the rotation axis of the transducer is designated as the z-axis, all the 2D image slices are perpendicular to the x-y plane, so that the reconstruction for every x-y plane is the same. The 3D reconstruction problem is therefore reduced to a 2D problem of mapping the source data points, collected in cylindrical coordinates (r, θ, z), onto a regular grid of destination data points, in Cartesian coordinates (x, y, z), i.e., for each value of z, P (r, θ, z)→P (x, y, z), where P is the grayscale value of a data point. [0123]
  • We use the destination-oriented method for the conversion because the source-oriented method [i.e., for each grid point (r, θ) in the source image, the Cartesian coordinate (x, y) = (r sin θ, r cos θ) is computed, and the corresponding grayscale value is distributed to its nearest neighbors in the destination grid] suffers from under-sampling far from the axis, i.e., some destination grid points may not receive contributions from any source point. [0124]
  • [0125] In the destination-oriented method, for each grid point (x, y) in the destination image, the polar coordinate (r, θ) = (√(x² + y²), arctan(x/y)) is computed, and the corresponding grayscale value is interpolated from its nearest neighbors in the source image. We use the bilinear interpolation method, i.e., linear interpolation in both r and θ, for the reconstruction, in which the grayscale value P of a grid point p in the destination image can be calculated as:
  • P = (1 − δθ/Δθ)(δr/Δr)·Pa + (δθ/Δθ)(δr/Δr)·Pb + (δθ/Δθ)(1 − δr/Δr)·Pc + (1 − δθ/Δθ)(1 − δr/Δr)·Pd,   (Eq. 1)
  • [0126] where Pa, Pb, Pc and Pd are the grayscale values of the neighbor points a, b, c and d of p in the source image, respectively; Δθ is the angular separation between successive 2D longitudinal images at θi and θi+1; δθ is the angle between θi and p; Δr is the radial separation between ri and ri+1; and δr is the distance between ri and p, as shown in FIG. 17. The dashed lines represent the polar coordinates in which the points a, b, c and d of the source image are located, and the solid lines represent the Cartesian coordinates in which the point p of the destination image grid is located.
  • The above process will be summarized with reference to the flow chart shown in FIG. 18. In [0127] step 1802, a two-dimensional series of longitudinal images in the form of a stack of (r, z)θ images is collected. In step 1804, the stack of step 1802 is rearranged into a stack of (r, θ)z images by interpolation. In step 1806, using Equation 1 above, for each z plane, the grayscale value at each grid point (x, y) in the destination image is calculated. In step 1808, the 2D series of transformed images is collected in the form of an (x, y)z stack to form an image volume (3D image) in Cartesian coordinates (x, y, z).
  • Since the reconstruction algorithm for every x-y plane is the same, the total reconstruction time can be reduced by producing a look-up table that designates which source pixels correspond to a, b, c and d in Equation 1, and the associated values of δθ/Δθ and δr/Δr, for every destination grid point p in the volume. The reconstructed 3D image is then built up by using that table repeatedly for each successive value of z. [0128]
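• The destination-oriented mapping and the bilinear weights of Equation 1 could be sketched as follows (an illustrative Python sketch, not part of the original disclosure; since the mapping is identical for every z plane, the integer and fractional index arrays computed here are exactly what the look-up table above would cache):

```python
import numpy as np

def polar_to_cartesian(slice_rtheta, r0, dr, theta0, dtheta, xs, ys):
    """Destination-oriented resampling of one (r, theta) slice onto an (x, y) grid.

    slice_rtheta   -- 2D array indexed [angle index, radius index]
    r0, dr         -- radius of the first sample and the radial spacing
    theta0, dtheta -- angle of the first frame and the angular spacing (radians)
    xs, ys         -- 1D arrays of destination x and y coordinates
    """
    n_theta, n_r = slice_rtheta.shape
    X, Y = np.meshgrid(xs, ys)
    r = np.hypot(X, Y)
    theta = np.arctan2(X, Y)                  # arctan(x/y), as in the text
    fi = (theta - theta0) / dtheta            # fractional angular index
    fj = (r - r0) / dr                        # fractional radial index
    i = np.clip(np.floor(fi).astype(int), 0, n_theta - 2)
    j = np.clip(np.floor(fj).astype(int), 0, n_r - 2)
    di, dj = fi - i, fj - j                   # the delta/Delta ratios of Eq. 1
    out = ((1 - di) * (1 - dj) * slice_rtheta[i, j]
           + (1 - di) * dj * slice_rtheta[i, j + 1]
           + di * (1 - dj) * slice_rtheta[i + 1, j]
           + di * dj * slice_rtheta[i + 1, j + 1])
    # zero out destination points that fall outside the scanned fan
    out[(fi < 0) | (fi > n_theta - 1) | (fj < 0) | (fj > n_r - 1)] = 0.0
    return out
```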
  • Precise registration between separate scans obtained in the same volume of interest is the key for 3D spatial compounding to improve data quality. The present invention uses image-based non-rigid registration of the grayscale data sets because soft tissue movement and sound speed inhomogeneities make compounding based only on the position sensor readings inappropriate and inaccurate. [0129]
  • The two image volumes to be registered are obtained from different scanning methods. Since position readings of each slice are available from the [0130] position sensor 1510 of FIG. 15, a preliminary registration based on geometry is used, which significantly reduces the time required for the subsequent image-based registration.
  • Maximization of mutual information (MMI) of the histogram is a new approach to image-based registration that has been used for multimodality medical image registration by many groups (Collignon et al. 1995; Studholme et al. 1995; Viola and Wells 1997; Maes et al. 1997; Meyer et al. 1997). Recently, that method has also been used for registration of 3D ultrasound images from scanning at several different transducer tilt angles (Krücker et al. 2000). But it has never been used for registration of intraoperative 3D TRUS image volumes from linear scanning and rotational scanning, as proposed in the present invention. [0131]
  • The MMI method uses mutual information (MI), a basic concept from information theory for measuring the statistical dependence between two random variables; here, it measures the statistical dependence between the image intensities of corresponding voxels in two images, which is assumed to be maximal when the images are geometrically aligned. Because no assumptions are made regarding the nature of that dependence and no limiting constraints are imposed on the image content, MMI is a very general and powerful criterion, allowing robust, fully automated affine registration of different images with different contrast and resolution without the need for segmentation or other preprocessing. [0132]
  • The mutual information is defined as [0133]
  • I(x, y)=H(x)+H(y)−H(x, y)
  • where H(x) and H(y) are the entropies of data sets x and y, respectively, and H(x, y) is the joint entropy of the two data sets. The entropy of a data set is defined as its average information content, while the joint entropy is the average information of two data sets. Entropy H is a measure of the amount of uncertainty about the data set and can be computed using the probability densities p(x), p(y) and the joint probability density p(x, y) by [0134]
  • H(x) = −∑_x p(x) log p(x)
  • H(y) = −∑_y p(y) log p(y)
  • H(x, y) = −∑_{x,y} p(x, y) log p(x, y)
  • So the mutual information I(x, y) can be computed by [0135]
  • I(x, y) = ∑_{x,y} p(x, y) log [ p(x, y) / (p(x) p(y)) ]
  • The gray-scale values of a pair of corresponding voxels in the two images to be registered are considered as two random data sets x and y. The probability densities can be simply approximated by the normalized gray scale histograms, and the joint probability density can be approximated by the normalized 2D joint histogram, with the two base axes corresponding to gray scale levels in the two data sets. The 2D joint histogram is constructed by raster scanning through all voxels in one image volume and incrementing the bin counts corresponding to the gray scale values of the geometrically mapped voxel pairs in the other image volume. The joint histogram is normalized by dividing by the total number of voxels in the image volume. So the histogram value p(g1, g2) [0136] is equal to the normalized count of pixels with value g1 in the first image volume that match a value g2 at the same location in the second image volume under the current mapping transformation. The procedure of constructing the 2D joint histogram is illustrated in FIG. 19. For ultrasound images that are 8-bit grayscale (256 grayscale levels), the number of elements in the joint histogram is 256×256.
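• A sketch of the joint-histogram construction and the MI computation (illustrative only; 8-bit volumes are assumed, and the bins parameter anticipates the bin reduction discussed next):

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=256):
    """MI from the normalized 2D joint grayscale histogram of two aligned volumes."""
    joint, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()                 # normalized joint histogram
    px = pxy.sum(axis=1, keepdims=True)       # marginal of the first volume
    py = pxy.sum(axis=0, keepdims=True)       # marginal of the second volume
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```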
  • Although typical ultrasound images have 256 grayscale levels, fewer bins can be used (such as 64 bins, thus requiring 1/16 of the computation) to form the 2D joint histogram, because most radioactive seeds implanted into the prostate have large grayscale values while tissues have small values. That feature can be used to reduce the number of grayscale bins. It also means that at least some of the implanted radioactive seeds play an important role in the MMI registration. [0137]
  • The two image volumes x and y are related through a geometric transformation Tα [0138] defined by the registration parameter α. The MMI registration criterion states that the images are geometrically aligned by the transformation Tα for which the mutual information I(x, y) is maximal. The registration optimization works by iteratively maximizing the mutual information I(x, y). The only user interaction in the process is the definition of an approximate initial alignment of the two image volumes by placing three control points in each set. The control points define a geometric transformation that is applied to match one image volume to the other. The mutual information of the matched pair is calculated, and the optimization algorithms are used to modify the position of the control points until I(x, y) is maximized.
  • The Nelder-Mead downhill simplex algorithm can be used. That method is initialized with N+1 points, defining a non-degenerate simplex in N-dimensional parameter space. That simplex is then deformed iteratively by reflection, expansion or contraction in order to move the vertices of the simplex towards the minimum of the objective function. Convergence is declared when the fractional difference between the lowest and the highest function values evaluated at the vertices of the simplex is smaller than a pre-defined threshold. [0139]
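• Using SciPy's Nelder-Mead implementation, the inner optimization loop might be sketched as follows; transform_volume (which applies the control-point mapping) is hypothetical glue, and mutual_information is the sketch given earlier:

```python
from scipy.optimize import minimize

def register(vol_fixed, vol_moving, x0, transform_volume, mutual_information):
    """Maximize MI over the control-point parameter vector starting from x0."""
    def neg_mi(params):
        # minimizing the negative MI is equivalent to maximizing MI
        return -mutual_information(vol_fixed, transform_volume(vol_moving, params))
    res = minimize(neg_mi, x0, method="Nelder-Mead",
                   options={"xatol": 1e-3, "fatol": 1e-4})
    return res.x
```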
  • The optimization works in three stages, using increasing numbers of control points and thus increasingly complex geometrical transformations. The first stage uses three control points (rotation and translation), the second stage four (full affine transform), and the third a number greater than four (thin plate spline warping), which is a trade-off between registration accuracy and computation time. Each stage stops and passes its final transformation onto the next stage when the mutual information increment is less than a predefined threshold. For efficient optimization, each stage uses a compressed version of the original data set and different parameters to confine variation of the control point positions and to determine convergence of each optimization cycle. [0140]
  • FIG. 20 shows a flow chart of the registration algorithm. As seen in that flow chart, the algorithm includes two loops, the outer one of which controls the transition from stage to stage. In [0141] step 2002, the control point placement is initiated; the user selects the approximate control points in both volumes. In step 2004, the geometric mapping is computed. The first time step 2004 is performed, it is performed in stage 1, with rotation and translation only; the transition to stage 2 (full affine transformation, including scale) and stage 3 (thin plate spline warping) will be explained below. In step 2006, interpolated mapped volumes are computed. In step 2008, a 2D joint histogram is constructed. In step 2010, the mutual information is computed. In step 2012, it is determined whether I is maximized. If not, then in step 2014, the control points are moved according to the optimizer, and the algorithm loops back to step 2004. If I is maximized, then in step 2016, it is determined whether the last stage out of stages 1-3 has been performed. If not, then in step 2018, the control points are created for the next stage, and the algorithm loops back to step 2004, which is carried out for the next stage. If the last stage has been performed, then in step 2020, the algorithm stops.
  • After registering and transforming the reformatted rotational scan onto the linear scan, combining the two volumes into one creates a compounded image volume. The most common techniques for combining overlapping data are the maximum and the mean. However, those methods do not take into account the relative confidence in the data, which may vary between scans. So in the present invention, an intelligent compounding method is used. In that method, the information content of each slice of one image volume is computed and compared with the information content of the corresponding slice of the other image volume. The contributions of the single scan images to the compounded image are determined by the information content of each. The intensity value of each pixel in the compounded image is a weighted mean of the intensities of the corresponding pixels in both single scan images. That method differs from the weighted spatial compounding proposed by Leotta and Martin (2000) in that their weighting is based on the incidence angle of the ultrasound beam, which is an exterior determining factor, whereas the weighting of the present embodiment is based on the information content computed directly from the images, which is an interior determining factor. In the present embodiment, the weighting can be global, as shown in FIG. 22A, or regional, as shown in FIG. 22B. The information content can be estimated by the entropy H, as described above. Weighted compounding at the sub-image (or pixel) level, e.g., a moving region-of-interest (ROI) that scans the image plane, can be used in the entropy formalism, so that the compounded image plane contains differential weighting of the two original image sets based on regional information content.
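• The entropy-based weighting might be sketched per slice as follows (illustrative only; global weighting is shown, whereas the regional, moving-ROI variant would compute H over local windows instead; all names are assumptions):

```python
import numpy as np

def entropy(img, bins=256):
    """Information content H of one slice, from its normalized histogram."""
    p, _ = np.histogram(img, bins=bins, range=(0, 256), density=True)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def compound_slices(slice_a, slice_b):
    """Weighted mean; each slice's weight is its own information content."""
    ha, hb = entropy(slice_a), entropy(slice_b)
    w = ha / (ha + hb)
    return w * slice_a.astype(float) + (1.0 - w) * slice_b.astype(float)
```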
  • FIGS. 21A and 21B show, respectively, one of the sagittal images from the linear scan and one of the sagittal images from the rotational scan. They are registered in the same sagittal plane by observing the whole series of images from both volumes and matching certain distinct marks, e.g., the urethra, the edges of the phantom, the bright line at the top of the images formed by the echoes from the margin of the phantom container, and some clearly visible seeds. A simple compounding by averaging the images of FIGS. 21A and 21B forms the compounded image shown in FIG. 21C. All four seeds along that needle track are clearly visible on the compounded image of FIG. 21C, but not on the image of FIG. 21A or [0142] 21B individually. The signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) are used to measure the image statistics of the single scan images and the compounded image. SNR is defined as
  • SNR=μ/σ
  • where μ and σ are the mean and standard deviation (STD) of the intensity, respectively. The CNR of object to background, CNRo−b, is defined as [0143]
  • CNRo−b = (μo − μb)/σb
  • where μo and μb [0144] are the means of the object and background intensities, respectively, and σb is the standard deviation (STD) of the background. The table below lists the image statistics of seeds, phantom, and background, respectively, and computes the CNR of seeds to phantom, CNRs−p, and of phantom to background, CNRp−b, respectively. The table shows that the CNRs−p, CNRp−b, and SNR of the compounded image are 1.39, 1.36, and 1.41 times the average values of the two single scan images, respectively. Those numbers are very close to the theoretical estimation SNR_N = √N · SNR_0, where SNR_N and SNR_0 are the SNR of the compounded image averaged from N views and of the single uncorrelated views, respectively. That indicates that the image volumes from the linear scan and the rotational scan are nearly uncorrelated, which is just as expected for increasing the SNR of the compounded image and thus reducing the speckle noise.
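• For reference, those definitions reduce to a few lines (the region arguments are assumed to be arrays of pre-segmented seed, phantom or background pixels):

```python
import numpy as np

def snr(region):
    return float(region.mean() / region.std())          # SNR = mu / sigma

def cnr(obj, background):
    # CNR(o-b) = (mu_o - mu_b) / sigma_b
    return float((obj.mean() - background.mean()) / background.std())
```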
  • List of the image statistics of seeds, phantom and background of linear scan, rotational scan, and orthogonally compounded image. [0145]
                              Linear    Rotational    Com-       Ratio of the CNR/SNR of the
                              scan      scan          pounded    compounded image divided by
                                                                 the average values of the
                                                                 two single scan images
    Mean of seeds μs          198       128           164
    Mean of phantom μp        103        56            80
    Mean of background μb      18        15.7          16
    STD of phantom σp          18        15            12
    STD of background σb        7         3.5           4
    CNRs-p                      5.3       4.8           7.0      1.39
    CNRp-b                     12.1      11.5          16.0      1.36
    SNR of phantom              5.72      3.73          6.67     1.41
  • That experiment strongly validates the principle that the orthogonally compounded image volume is better than any single scan image volume for both prostate segmentation and the detection of heterogeneities such as tumor nodules or seeds. It suggests that tissue typing by OCU in heterogeneous tissues (focal lesions in the prostate) may be more efficient than conventional TRUS. The dual-scan OCU imaging can be used for improving the image quality in prostate brachytherapy and biopsy. [0146]
  • A normalized statistical tumor probability model (NSTPM), aimed at helping effective biopsy based on OCU imaging for tumor detection, can be used in the present embodiment. The model is based on a large number (e.g., 100) of individual prostate specimens with localized cancers. It not only provides statistical information on the 3D spatial distribution of prostate cancers, but also shows the probability of cancer detection at each location in the prostate. During the biopsy procedure, the probability model can be dynamically mapped into OCU images during in vivo TRUS-guided needle biopsy by dynamically registering the OCU images with the 3D model and overlaying them. Since the new biopsy technique is based on statistical analysis of a quantitative database of digitized prostate specimens, it is expected to significantly improve the accuracy of prostate cancer detection. [0147]
  • To construct the statistical probability model of prostate tumor distribution, we first collect pathological information about the tumor distributions of real cases. Sample distributions are obtained immediately after the surgical removal of the gland (radical prostatectomy). We perform whole-mounted pathological analysis of step-sectioned prostatectomy samples. The prostate specimens are sectioned at even intervals (e.g., 2.25 mm). Pathologists identify and grade all areas of cancer using Gleason's method of scoring. A high resolution image of each whole mount slide is captured using a microscopy system. Each digitized slice is then segmented to identify key pathological structures including the surgical margin, capsule, urethra, and seminal vesicles as well as tumors. The pathologists provide independently reviewed whole-mount slices of the prostate, trace the borders of the prostate and identify the circumference of each prostate cancer, which is assigned a Gleason's score. [0148]
  • Based on the current clinical conventions for TRUS-guided prostate biopsy, a prostate gland is divided into different zones (instead of dealing with individual pixel points, because pixel-based statistical analysis is easily disturbed by noise and the final results cannot be directly used by urologists or computer software) that are accessible by the urologists under the guidance of OCU imaging. The cancer probability inside each of those zones is calculated statistically from the cases. The result is a bounding box of a prostate gland with a certain number of zones, each with information on the spatial tumor probability in it. [0149]
  • A neural network is used to realize the adaptive sub-classification of the NSTPM because it is affected by other parameters such as the patient's age, PSA level, gland volume, etc. As a general nonlinear approach to pattern recognition and classification, neural network computational models contain layers of simple interconnected computing nodes that operate as nonlinear summing devices. Those nodes are organized into layers and interconnected by weighted connection lines. From each layer, every neuron is connected to all neurons in the next layer. The weight values on the connection lines are adjusted when data is presented to the network during a training process. Successful training can result in an artificial neural network that performs a classification. The most popular neural network architecture is the multi-layer perceptron (MLP) with three layers of neurons. An input layer receives the data to be analyzed. The second layer, consisting of hidden units, organizes weighted combinations of the inputs, or “features,” that aid in discriminating the different categories in a classification. The output layer then combines those features appropriately to perform the classification. [0150]
  • Many applications of neural networks in medicine and medical decision making have been reported. For example, in U.S. Pat. No. 6,025,128, titled “Prediction of prostate cancer progression by analysis of selected predictive parameters,” a neural network was used to analyze and interpret cell morphology data. Utilizing Markovian factors and other biomarkers as parameters, the network is first trained with a set of cell data from known progressors and known non-progressors, and is then used to predict prostate cancer progression in patient samples. In U.S. Pat. No. 6,004,267, titled “Method for diagnosing and staging prostate cancer,” a neural network was used for providing prostate cancer stage information for a patient based upon input data which includes the patient's preoperative serum PSA, biopsy Gleason score, and systematic biopsy-based information. [0151]
  • Unlike those patents, the present embodiment uses a neural network to construct a normalized statistical tumor probability model (NSTPM), which is aimed at helping effective biopsy rather than at detecting tumors directly from the prostate by neural networks; that approach is more realistic and applicable. It is additionally aided by the improved imaging of the OCU modality. [0152]
  • FIG. 23 shows the neural network used for sub-classification of the NSTPM. As noted above and as known in the art of neural networks, the neural network [0153] 2300 includes an input layer 2302, at least one hidden layer 2304, and an output layer 2306. The neurons 2308 in the input layer 2302 receive the raw numbers of the related parameters, such as the patient's age, PSA level, gland volume, etc. Those numbers are weighted and combined by the hidden neurons 2310 in the hidden layer or layers 2304 of the network 2300, which extract features useful for classification. Then, the values calculated by the hidden neurons 2310 are weighted and combined by the output neurons 2312 in the output layer 2306 to decide on the classification. The present example uses four output neurons 2312, which represent different properties of the anatomical zones in pathological analysis. Those neurons 2312 represent benign, malignant, malignant Gleason 1-3 and malignant Gleason 4-5. Based on that NN model, corresponding network structures are obtained for the various anatomical zones, and the final NN structure precisely reflects the relationship between the parameters and the tumor distribution probability in different zones.
  • The network is trained by supervised learning, in which the correct classification is presented to the output layer during training, and the weights are adjusted so as to make the network's output more closely match the correct classification at each point. After training, the network outputs the classification when presented with an input vector at its input layer. The fundamentals of neural network training are known in the art and will therefore not be described in detail here. It is important to avoid “over-training” on the dataset. To accomplish that goal, we divide the original dataset into three parts. The first part is the training set, which is used to train the NN. The second part is the testing set, or verification set, which is used to determine when to stop training the network. Typically, the network's error on the testing set decreases during training until the point at which over-training begins, at which point the error on the testing set starts to increase. In a well-controlled study, training is stopped when the error on the testing set is minimized. The third set is the validation set, which is used to measure the performance of the trained network. The validation set is completely independent of both the training and testing sets and thus gives a suitable benchmark of the NN performance. If the validation set were used to train or to determine the stopping point for training, then the performance of the network on the validation set would no longer be a reliable predictor of performance on new, independent data. [0154]
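• A minimal sketch of the three-way split described above (the split fractions and the RNG seed are arbitrary assumptions; the training loop itself is omitted):

```python
import numpy as np

def split_three_way(X, y, f_train=0.6, f_test=0.2, seed=0):
    """Shuffle and split into training, testing (verification) and validation sets."""
    idx = np.random.default_rng(seed).permutation(len(X))
    n1 = int(f_train * len(X))
    n2 = int((f_train + f_test) * len(X))
    train, test, val = idx[:n1], idx[n1:n2], idx[n2:]
    # the testing set steers early stopping; the validation set is held out
    return (X[train], y[train]), (X[test], y[test]), (X[val], y[val])
```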
  • Performance of the trained network is then measured by a root-mean-square error (RMS error) calculation, accuracy, sensitivity and selectivity calculations, a receiver operating characteristic (ROC) curve, and prediction/confidence intervals. The RMS error shows literally the differences between the desired output of the neural network and its actual output. The prediction and/or confidence intervals reflect the amount of variation inherent in the training and calculation methodologies of the neural network. The sensitivity is the fraction of data items that the network classifies as containing a feature of interest out of all the items that have that feature of interest. The selectivity is the fraction of data items that the network classifies as not containing a feature of interest out of the entire set of data that does not in fact contain that feature. It is important to set the optimized tradeoff between sensitivity and selectivity. Usually, a higher sensitivity causes a lower selectivity, and vice versa. The ROC curve can provide that tradeoff. When a neural network is applied to a classification problem, an entire ROC curve can be calculated based on its output when the data set is presented to the network. Each point along that ROC curve has a pair of values of sensitivity and selectivity. The best point along the curve can be selected, taking into account the clinical consequences of high or low sensitivity versus selectivity. Usually, a high sensitivity is selected, and a lower selectivity is accepted. [0155]
  • The 3D NSTPM is built and saved in the computer, in which the successive slices of the prostate are put together into regions such as prostate, suspicious areas, high Gleason areas, and low Gleason areas, and landmarks such as urethra. [0156]
  • The NSTPM is used during biopsy as shown in the flow chart of FIG. 24. At the beginning of the biopsy, in [0157] step 2402, a dual-scan of the prostate is performed, and orthogonal compound image slices of the prostate are obtained. Then, in step 2404, segmentation of the prostate boundary, rectum and urethra is performed on that OCU image series. Based on the structure information of the prostate boundaries and urethra, we register the NSTPM onto that real prostate and obtain the tumor probability distribution in that prostate in step 2406. An advanced 3D overlay display capability is employed in the system. The series of boundaries of the prostate segmented from the OCU images are reconstructed into a 3D surface model in step 2408. The 3D tumor probability model is registered with the 3D surface model in step 2410. For an easy visual interface, each zone of the 3D tumor probability model is color-coded according to the value of its probability in step 2412. For example, the zone with the highest probability is coded red, the second highest zone pink, and so on. The zone with the lowest probability can be coded blue. As such, the urologists can easily navigate the whole prostate in step 2412 and pick the best candidate areas for biopsy by dynamically moving and/or turning the virtual US beam and making it intersect with the candidates in 3D space. Of course, any other suitable coding could be used. Then the biopsy is optimized to sample the high Gleason area as much as possible.
  • (3) Statistical Analysis of Texture Features [0158]
  • In determining needle and seed displacement, it is helpful to be able to identify the seeds quickly and accurately. To that end, another preferred embodiment of the present invention provides a statistical technique for doing so. [0159]
  • In order to find effective features and their optimized combination for the purpose of identifying seeds, we applied Logit and Probit models, discriminant analysis, and neural network classification methods to two test groups. Logit and Probit models are known in various fields, such as sociology, but have not yet been applied in the same manner as in the present embodiment. One group is the seeds group, which is identified manually from the TRUS images; the other group is the non-seeds group, including typical background noise, calcifications, bleeding, air gaps, and shadows. The selection of the two groups is confirmed by the [0160] day 0 CT (i.e., CT images taken within the same day as the actual implant): the positions of the radioactive seeds in the CT images are identified automatically or manually, and affine-transformed to match the TRUS image series. Some clearly visible seeds in the TRUS image series can be visually identified and used as markers to correct the registration. The seed positions identified from CT are affine-transformed until they achieve a best match with those identified from TRUS.
  • For each test element in both groups, we compute some numerical features from its 3 dimensional neighborhood (a 2.5 mm×2.5 mm×3.5 mm block) and some features from its 2 dimensional neighborhood (a 2.5 mm×2.5 mm area of the slice). A seed template is designed to correlate with the sub-image area in the TRUS image series, and the correlation coefficient is used as a numerical feature. The seed template consists of a bright central part surrounded by a dark background. The whole area is also 2.5 mm×2.5 mm, while the central part is an area shaped like a seed echo, as shown in FIGS. [0161] 25A-25C. Since the seeds in different parts of the prostate may appear as bright echoes with different orientations, different seed templates may be created for different prostate positions.
  • FIGS. [0162] 25A-25C show only three different seed templates, corresponding to the orientations of seeds in the left, middle, and right parts of the prostate, respectively. Those seed templates are used to compute correlation coefficients with the sub-images. More templates can be used for more orientations; therefore, FIGS. 25A-25C are illustrative rather than limiting.
  • All numerical features of both groups are computed and used as regressors in the logit and probit models. A binary dependent variable Y is created by assigning 1 to the seeds group and 0 to the non-seeds group respectively. The regression coefficients, standard errors, t-statistics, and probability are computed. Normally, a probability lower than 0.05 is taken as strong evidence of rejection of a hypothesis that the true coefficient is zero. So, we keep those features with probability lower than 0.05. Those features are: maximum, maximum minus mean, standard deviation, correlation coefficient, skewness, kurtosis, and area divided by Euler number. [0163]
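• Using statsmodels, the feature screening might be sketched as follows (illustrative only; the probit variant would substitute sm.Probit for sm.Logit, and the 0.05 cutoff is the one stated above):

```python
import statsmodels.api as sm

def significant_features(X, y, names, alpha=0.05):
    """Keep features whose logit coefficients are significant at level alpha.

    X -- rows are test elements, columns are the candidate numerical features
    y -- binary dependent variable: 1 for the seeds group, 0 for the non-seeds group
    """
    result = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    return [n for n, p in zip(names, result.pvalues[1:]) if p < alpha]
```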
  • FIG. 26 shows a flow chart of the process of finding the optimized weight coefficients ai [0164] (i=1, . . . , N) of the numerical features vi (i=1, . . . , N) to identify a manually selected seeds group and a manually selected non-seeds group. The process uses a neural network, whose structure and training method have already been disclosed with reference to FIG. 23. The data from the seeds group (Y=1) and the non-seeds group (Y=0) are supplied in steps 2602 and 2604, respectively. In step 2606, a logit or probit model is used for selecting the proper features vi, i=1, . . . , N, which may be useful for identifying seeds and non-seeds. In step 2608, those features are supplied to a neural network to find the optimized weight coefficients of those features. The outputs of the neural network include true positives, false negatives, true negatives and false positives in steps 2610, 2612, 2614 and 2616, respectively. From those outputs, the sensitivity, specificity and efficiency of the weight coefficients are determined in steps 2618, 2620 and 2622, respectively. Those quantities are used in a feedback loop to optimize the weight coefficients; the efficiency is used directly, while the sensitivity and specificity are used through an ROC curve in step 2624.
  • Then we apply discriminant analysis and a neural network as non-linear classification methods to the hyper-space constructed by the multiple features. We calculate the receiver operating characteristic (ROC) curve and then select the best combination of those features. The network is trained by supervised learning, in which the correct classification is presented to the output layer during training, and the weights are adjusted so as to make the network's output more closely match the correct classification at each point. After training, the network outputs the classification when presented with an input vector at its input layer. Performance of the trained network is then measured by the ROC curve. When a neural network is applied to a classification problem, an entire ROC curve can be calculated based on its output when the data set is presented to the network. Each point along that ROC curve has a pair of values of sensitivity and 1−specificity. The best point along the curve can be selected, taking into account the consequences of high or low sensitivity versus specificity. [0165]
  • Finally, since the exact weights of those features may be different for various degrees of TRUS image quality due to different ultrasound machines, different machine settings, etc., we optimize the weights of the features by finding a best match to the [0166] day 0 CT seed identification results. Up to now, the features we have used are: maximum minus mean, standard deviation, correlation coefficient, area divided by Euler number, skewness, kurtosis, and band pass filtering by using the FFT.
  • The relevant statistical quantities will now be defined. It will be seen that each of the quantities builds on previous quantities. [0167]
  • Definition of Mean: For univariate data Y1, Y2, . . . , YN, the formula for the mean is: [0168]
  • Ȳ = (∑_{i=1}^{N} Yi) / N
  • where N is the number of points. [0169]
  • Definition of standard deviation: For univariate data Y1, Y2, . . . , YN, the formula for the standard deviation is: [0170]
  • s = √( ∑_{i=1}^{N} (Yi − Ȳ)² / N )
  • Definition of Skewness: For univariate data Y1, Y2, . . . , YN, the formula for skewness is: [0171]
  • skewness = ∑_{i=1}^{N} (Yi − Ȳ)³ / ((N − 1) s³)
  • Definition of Kurtosis: For univariate data Y1, Y2, . . . , YN, the formula for kurtosis is: [0172]
  • kurtosis = ∑_{i=1}^{N} (Yi − Ȳ)⁴ / ((N − 1) s⁴)
  • Definition of correlation coefficient: For two univariate data sets X and Y, the formula for the correlation coefficient is: [0173]
  • r = (∑XY − (∑X)(∑Y)/N) / √( (∑X² − (∑X)²/N) (∑Y² − (∑Y)²/N) )
  • Definition of area divided by Euler number: the sub-image is first converted to a binary image by thresholding; the area and Euler number of that binary image are then computed. The area is the number of “on” pixels in the image, and the Euler number is the number of objects in the image minus the total number of holes in those objects. [0174]
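• The features listed above might be computed per sub-image as follows (illustrative only; the fixed threshold is an assumption, SciPy's Pearson skewness/kurtosis differ slightly from the (N−1)-normalized formulas given above, and the template is assumed to have the same pixel count as the sub-image):

```python
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.measure import euler_number

def sub_image_features(sub, template, threshold=128):
    """Numerical features of one sub-image for the seeds/non-seeds classification."""
    v = sub.ravel().astype(float)
    binary = sub > threshold
    area = int(binary.sum())                 # number of "on" pixels
    euler = int(euler_number(binary))        # objects minus holes
    return {
        "maximum": v.max(),
        "max_minus_mean": v.max() - v.mean(),
        "std": v.std(),
        "skewness": float(skew(v)),
        "kurtosis": float(kurtosis(v, fisher=False)),
        "correlation": float(np.corrcoef(v, template.ravel().astype(float))[0, 1]),
        "area_over_euler": area / euler if euler != 0 else float("nan"),
    }
```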
  • The two-dimensional needle pattern can be manually adjusted before applying the automatic needle tracking algorithm. After capturing the final scan image series, the starting positions for automatic needle tracking are important to the algorithm. Because of prostate swelling and seed migration, the needle tracking starting positions may differ from the positions previously clicked on the computer screen during the needle insertion procedure. So the needle pattern must be shifted and scaled as a group, and any single needle can be adjusted in its position as desired according to the current TRUS image series. The user interface explained above, or any other suitable user interface, can be used. [0175]
  • FIGS. [0176] 27A-27D illustrate the process. FIG. 27A shows the original needle pattern. FIG. 27B shows needle pattern expansion; FIG. 27C, shift; and FIG. 27D, single needle movement. There is no required order of the scale and shift movements; the scale movement and the shift movement may be performed alternately several times for the best match.
  • A global search method can be used to identify radioactive seeds in the TRUS image series. The method identifies radioactive seed candidates in the TRUS image series; their positions can be used either to correct minor bias in the seed positions identified by the needle tracking/seed segmentation algorithm or, together with the planned needle pattern, to match out the final needle pattern. [0177]
  • After the prostate boundaries are drawn, either manually or automatically, a 3D global search is executed within the prostate boundaries. It searches for the local maxima in the 3D data cube, which is composed of the measurements at each 3D coordinate. (For a different imaging technology, the local minima could be used instead.) The measurement is a weighted summation of various image features in a small elongated region 4.5 mm long, 1 mm wide, and 1 mm high, just like the geometry of a seed. FIGS. [0178] 28A-28C show an example: FIG. 28A is an original ultrasound image; FIG. 28B is the grayscale transformed image, in which the radioactive seed echoes are more distinct; FIG. 28C is the measurement of the grayscale summation at each point, from which one can see the clear relationship between the bright radioactive seed echo points and the local maxima of the measurement matrix. That is only an example; the measurement, in fact, can be a weighted summation of various image features.
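• A sketch of the 3D global search (illustrative only; the grayscale-summation measurement of FIG. 28C is realized here as a seed-sized moving average, and the kernel size in voxels and the minimum score are assumptions):

```python
import numpy as np
from scipy.ndimage import maximum_filter, uniform_filter

def seed_candidates(volume, prostate_mask, size=(9, 3, 3), min_score=80.0):
    """Local maxima of a seed-sized moving-average score inside the prostate."""
    score = uniform_filter(volume.astype(float), size=size)  # seed-shaped window
    local_max = score == maximum_filter(score, size=size)
    peaks = local_max & prostate_mask & (score > min_score)
    return np.argwhere(peaks)    # (z, y, x) voxel coordinates of the candidates
```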
  • That method can identify most clearly visible radioactive seeds in the TRUS image series. However, it can also mis-identify other echo-bright points, such as calcifications, bleeding, and air gaps, as radioactive seeds. The identified results should therefore be used only as “seed candidates,” not as radioactive seeds themselves. The seed candidates can be used either to match out the final needle pattern by comparison with the planned needle pattern (or a manually clicked needle pattern) or to correct minor bias in the seed positions identified by the needle tracking/seed segmentation algorithm. [0179]
  • For the former use, the 3D pattern of the seed candidate set is projected onto a 2D plane and matched against the 2D pattern of needles, which is drawn on the screen in real time as each needle is inserted. Since the 2D needle pattern can be adjusted as a whole for shift and scale changes, and each needle position can be adjusted separately, the final needle pattern may be obtained in reference to the 2D projection pattern of the seed candidates. [0180]
  • For the latter use, since there may be some minor bias when tracking needles, the final detected seed positions may not lie exactly at the three-dimensional geometric centers of the radioactive seeds. Their positions can therefore be compared with the seed candidates; if any pair lies within a distance threshold, for example 0.15 cm in three dimensions, the two are considered one radioactive seed and merged to the position of the seed candidate. In this way, minor bias in the needle-tracking-based identification algorithm can be corrected. [0181]
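A minimal sketch of that merging step, assuming the tracked seed positions and the seed candidates are given as arrays of 3D coordinates in centimeters; the nearest-candidate search and the 0.15 cm default follow the example above, while the function name is illustrative.

```python
import numpy as np

def merge_with_candidates(tracked, candidates, threshold_cm=0.15):
    tracked = np.asarray(tracked, dtype=float)        # shape (N, 3), in cm
    candidates = np.asarray(candidates, dtype=float)  # shape (M, 3), in cm
    corrected = tracked.copy()
    if candidates.size == 0:
        return corrected
    for i, p in enumerate(tracked):
        d = np.linalg.norm(candidates - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= threshold_cm:
            # Within the threshold: treat the pair as one radioactive
            # seed and snap to the candidate position.
            corrected[i] = candidates[j]
    return corrected
```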
  • While various preferred embodiments of the present invention have been set forth above in detail, those skilled in the art who have reviewed the present disclosure will readily appreciate that other embodiments can be realized within the scope of the present invention. For example, the numerical values and statistical parameters set forth above should be construed as illustrative rather than limiting. The same is true of the arrangement of the user interface of FIG. 3. Also, whenever radioactive seeds are specified, it will be understood that other radioactive sources or other therapeutic agents can be used instead. Furthermore, two sets of images taken in orthogonal directions can be replaced, whenever feasible, by two sets of images taken at a large angle, or any non-zero angle, to each other. Moreover, even though various preferred embodiments are taught separately, it will be understood that they can be used together as needed. Therefore, the present invention should be construed as limited only by the appended claims. [0182]

Claims (80)

We claim:
1. A method for identifying and quantifying departures in placement of needles or catheters from intended placements in a treatment plan for treating a bodily organ, the needles or catheters carrying therapeutic agents for insertion into the bodily organ for use in the treatment plan, the method comprising:
(a) inputting the intended placements into an intraoperative tracking interface;
(b) for at least one needle or catheter, calculating a difference in an x-y plane between the intended placement of that needle or catheter and an actual placement of that needle or catheter;
(c) calculating, from the difference calculated in step (b), a position error for each of the therapeutic agents;
(d) adjusting positions of therapeutic agents along each needle or catheter by an amount determined in accordance with the position error calculated in step (c); and
(e) determining positions of a deposited number of therapeutic agents through imaging.
2. The method of claim 1, further comprising:
(f) assigning a confidence level to each of the positions determined in step (e); and
(g) for each therapeutic agent whose position is assigned a confidence level exceeding a predetermined threshold, recalculating a dosimetry associated with that therapeutic agent.
3. The method of claim 2, further comprising:
(h) adjusting positions of the therapeutic agents having low confidence values assigned in step (f) in accordance with x-ray imaging;
(i) recalculating a dosimetry associated with all of the therapeutic agents; and
(j) assigning a confidence level to the dosimetry recalculated in step (i).
4. The method of claim 1, wherein the intraoperative tracking interface permits an operator to select a needle or catheter for step (b).
5. The method of claim 1, wherein the intraoperative tracking interface displays at least one isodose plot of the bodily organ.
6. The method of claim 1, wherein step (e) is performed through real-time ultrasound imaging of the bodily organ.
7. The method of claim 6, wherein:
the intraoperative tracking interface permits an operator to select a needle or catheter; and
the real-time ultrasound imaging is performed in a direction of the needle or catheter selected by the operator.
8. The method of claim 7, wherein the real-time ultrasound imaging results in a column of ultrasound images along the needle or catheter selected by the operator.
9. The method of claim 8, wherein the column of ultrasound images undergoes gray-scale preprocessing.
10. The method of claim 9, wherein the gray-scale preprocessing corrects a gray-scale histogram of the column of ultrasound images to produce gray-scale corrected ultrasound images.
11. The method of claim 10, wherein the gray-scale corrected ultrasound images are used to find locations of the therapeutic agents along the needle or catheter selected by the operator.
12. The method of claim 11, wherein the locations of the therapeutic agents found in the gray-scale corrected ultrasound images are used for further correction of the positions of the therapeutic agents.
13. The method of claim 11, wherein a number of therapeutic agents whose locations are found in the gray-scale corrected ultrasound images is compared to a number of therapeutic agents which are actually along the needle or catheter selected by the operator.
14. The method of claim 13, wherein, when the numbers of therapeutic agents are not equal, a column width of the column of ultrasound images is changed, and the column of ultrasound images is taken again.
15. The method of claim 1, wherein the bodily organ is a prostate.
16. A system for identifying and quantifying departures in the placement of needles or catheters from intended placements in a treatment plan for treating a bodily organ, the needles or catheters carrying therapeutic agents for insertion into the bodily organ for use in the treatment plan, the system comprising:
an imaging device for imaging the bodily organ;
an intraoperative tracking interface comprising a display; and
a computing device, in electronic communication with the intraoperative tracking interface, for:
(a) inputting the intended placements into the intraoperative tracking interface;
(b) for at least one needle or catheter, calculating a difference in an x-y plane between the intended placement of that needle or catheter and an actual placement of that needle or catheter;
(c) calculating, from the difference calculated in step (b), a position error for each of the therapeutic agents;
(d) adjusting positions of therapeutic agents along each needle or catheter by an amount determined in accordance with the position error calculated in step (c); and
(e) determining positions of a deposited number of therapeutic agents through imaging carried out through the imaging device.
17. The system of claim 16, wherein the computing device also performs the following:
(f) assigning a confidence level to each of the positions determined in step (e); and
(g) for each therapeutic agent whose position is assigned a confidence level exceeding a predetermined threshold, recalculating a dosimetry associated with that therapeutic agent.
18. The system of claim 16, wherein the imaging device comprises an x-ray imaging device, and wherein the computing device further performs the following:
(h) adjusting positions of the therapeutic agents having low confidence values assigned in step (f) in accordance with x-ray imaging carried out through the x-ray imaging device;
(i) recalculating a dosimetry associated with all of the therapeutic agents; and
(j) assigning a confidence level to the dosimetry recalculated in step (i).
19. The system of claim 16, wherein the intraoperative tracking interface comprises an input device which permits the operator to select a needle or catheter for step (b).
20. The system of claim 16, wherein the intraoperative tracking interface displays at least one isodose plot of the bodily organ.
21. The system of claim 16, wherein:
the imaging device comprises a real-time ultrasound imaging device; and
the computing device performs step (e) through real-time ultrasound imaging of the bodily organ by the real-time ultrasound imaging device.
22. The system of claim 21, wherein:
the intraoperative tracking interface permits an operator to select a needle or catheter; and
the real-time ultrasound imaging is performed in a direction of the needle or catheter selected by the operator.
23. The system of claim 22, wherein the real-time ultrasound imaging results in a column of ultrasound images along the needle or catheter selected by the operator.
24. The system of claim 23, wherein the computing device performs gray-scale preprocessing on the column of ultrasound images.
25. The system of claim 24, wherein the gray-scale preprocessing corrects a gray-scale histogram of the column of ultrasound images to produce gray-scale corrected ultrasound images.
26. The system of claim 25, wherein the gray-scale corrected ultrasound images are used to find locations of the therapeutic agents along the needle or catheter selected by the operator.
27. The system of claim 26, wherein the locations of the therapeutic agents found in the gray-scale corrected ultrasound images are used for further correction of the positions of the therapeutic agents.
28. The system of claim 26, wherein a number of therapeutic agents whose locations are found in the gray-scale corrected ultrasound images is compared to a number of therapeutic agents which are actually along the needle or catheter selected by the operator.
29. The system of claim 28, wherein, when the numbers of therapeutic agents are not equal, a column width of the column of ultrasound images is changed, and the column of ultrasound images is taken again.
30. A method for generating a 3D orthogonal compound ultrasound image volume of a patient prostate, with high contrast and detailed information of prostate anatomy, tumor lesions, and implanted radioactive sources if any, comprising the steps of:
(a) providing a bi-plane ultrasound transducer having a transverse imaging function and a longitudinal imaging function;
(b) using the transverse imaging function of the bi-plane ultrasound transducer, moving the transducer linearly within the patient rectum in the cranio-caudal direction to acquire a first image set of images of transverse view of the prostate;
(c) using the longitudinal imaging function of the bi-plane transducer, rotating the transducer within the rectum to acquire a second image set of images of sagittal-coronal views of the prostate, which are orthogonal to the transverse view images obtained in step (b);
(d) measuring the positions and orientations of the ultrasound transducer in real-time while the images are taken in steps (b) and (c);
(e) reconstructing the first and second image sets to 3D image volumes with a common voxel size;
(f) registering the two 3D image volumes reconstructed in step (e); and
(g) combining the two 3D image volumes which have been registered in step (f) by using an intelligent compounding method.
31. The method of claim 30, wherein step (d) is performed by using an electromagnetic sensing system.
32. The method of claim 30, wherein step (d) is performed by using a mechanical encoder.
33. The method of claim 30, wherein step (d) is performed by using an optical positioning system.
34. The method of claim 30, in which the transducer comprises a switch for switching the transducer between the transverse imaging function and the longitudinal imaging function, and wherein at least one of steps (b) and (c) comprises pressing the switch.
35. The method of claim 30, wherein step (d) comprises registering actual coordinates and orientations of the transducer with each image at the instant of acquisition of said each image, whereby mis-registration is eliminated even when the transducer movement is very rapid.
36. The method of claim 30, wherein the longitudinal images are reconstructed into a 3D image volume by using the destination-oriented method for every x-y plane, and then built up by using a look-up table repeatedly for each successive plane.
37. The method of claim 30, wherein the two 3D image volumes are registered by using non-rigid image registration methodology of grayscale data sets.
38. The method of claim 30, wherein the two 3D image volumes are registered by using a maximization of mutual information (MMI) of a joint histogram formed between the two 3D image volumes.
39. The method of claim 38, wherein a feature of high image intensity is used to reduce the number of grayscale bins when computing the joint histogram.
40. The method of claim 30, wherein step (g) is performed by use of a compounding weight, the compounding weight being based on regional image information content.
41. A method for generating a 3D orthogonal compound ultrasound image volume of a human organ, the method comprising:
(a) providing an ultrasound transducer capable of taking a first set of images in a first direction and a second set of images in a second direction which is orthogonal or at an angle to the first direction;
(b) taking the first set of images of the organ;
(c) taking the second set of images of the organ;
(d) registering the first set of images with the second set of images; and
(e) combining the first and second sets of images which have been registered in step (d) to produce the ultrasound image volume.
42. An ultrasound imaging system for generating 3D orthogonal compound ultrasound images for the diagnosis and treatment of prostate cancer, comprising:
(a) a bi-plane ultrasound transducer capable of transverse imaging and longitudinal imaging of the prostate to produce a transverse image set and a longitudinal image set;
(b) a measuring system for measuring the position and orientation of the transducer;
(c) a rigid bio-compatible sheath which covers the ultrasound transducer to isolate motion of the transducer from the prostate; and
(d) an image processing work station for acquiring the transverse image set and the longitudinal image set and for producing an image of the prostate from the transverse image set and the longitudinal image set.
43. A normalized statistical tumor probability model (NSTPM) aimed to help effective biopsy for tumor detection in an organ of a patient, the model comprising:
an input layer for receiving raw data concerning the patient;
at least one hidden layer for extracting features from the raw data; and
an output layer for weighting and combining the features extracted by the at least one hidden layer to produce a classification of the organ for the biopsy.
44. The normalized statistical tumor probability model (NSTPM) of claim 43, wherein the organ is a prostate.
45. A method to re-optimize a dosimetry plan for implantation of therapeutic agents into an organ of a patient after partial or complete implantation of an original implant plan, the method comprising:
(a) determining locations of therapeutic agents which have been implanted in the partial or complete implantation;
(b) evaluating a dosimetry of the locations determined in step (a); and
(c) re-optimizing the dosimetry plan in accordance with the dosimetry evaluated in step (b).
46. The method of claim 45, wherein the locations are determined in step (a) such that each of the locations has an uncertainty level.
47. The method of claim 46, wherein step (a) is performed automatically using a computer.
48. The method of claim 46, wherein step (a) is performed by taking an ultrasound image and performing a human inspection of the ultrasound image.
49. The method of claim 45, wherein step (c) is performed by a computer algorithm.
50. The method of claim 45, wherein step (c) is performed by human inspection through trial-and-error.
51. A method of programming an artificial neural network to recognize objects in images, the method comprising:
(a) providing first and second sets of image data, the objects being present in at least one of the first and second sets of image data;
(b) calculating numerical values of features in the first and second sets of image data;
(c) supplying the numerical values of the features to the artificial neural network and controlling the artificial neural network to recognize the objects;
(d) identifying correct and incorrect identifications of the objects made by the artificial neural network in step (c); and
(e) using the correct and incorrect identifications to train the artificial neural network through feedback.
52. The method of claim 51, wherein the correct and incorrect identifications comprise true positives, false negatives, true negatives and false positives.
53. The method of claim 51, wherein the objects are therapeutic agents in an organ of a patient.
54. The method of claim 53, wherein the numerical values of the features comprise a correlation coefficient of one of the objects with a template.
55. The method of claim 54, wherein a plurality of templates are provided for different orientations of the objects.
56. The method of claim 53, wherein the numerical values of the features comprise statistical parameters of selected sub-images in the first and second sets of image data.
57. The method of claim 56, wherein the statistical parameters comprise frequency domain parameters.
58. The method of claim 56, wherein the statistical parameters comprise geometry parameters.
59. The method of claim 1, wherein step (a) comprises receiving a manual adjustment of a position of at least one of the needles or catheters.
60. The method of claim 59, wherein the manual adjustment comprises an expansion of a pattern of the intended placements.
61. The method of claim 59, wherein the manual adjustment comprises a shift of a pattern of the intended placements.
62. The method of claim 59, wherein the manual adjustment comprises an adjustment of a single one of the needles or catheters.
63. The method of claim 1, wherein step (e) is performed using a global search method.
64. The method of claim 63, wherein the global search method is used to provide candidate locations of the therapeutic agents to correct a bias in a tracking or segmentation algorithm.
65. The method of claim 63, wherein the global search method is used to provide candidate locations of the therapeutic agents, and wherein the candidate locations of the therapeutic agents are used with an original plan of the needles or catheters to locate actual positions of the needles or catheters.
66. The method of claim 63, wherein the global search method is performed on a grayscale transform of an ultrasound image.
67. A method for adjusting a treatment plan for treating a bodily organ, the treatment plan comprising insertion of needles or catheters carrying therapeutic agents into the bodily organ, the method comprising:
(a) providing a computing device with a user interface;
(b) entering positions of the needles or catheters into the computing device;
(c) taking at least one image of the bodily organ and inputting the at least one image into the computing device; and
(d) using the user interface to adjust at least one of the positions entered in step (b) in accordance with the at least one image input in step (c).
68. The method of claim 67, wherein the at least one image comprises an ultrasound image.
69. The method of claim 67, wherein step (d) comprises expanding a pattern of the positions.
70. The method of claim 67, wherein step (d) comprises shifting a pattern of the positions.
71. The method of claim 67, wherein step (d) comprises adjusting a single one of the positions.
72. The method of claim 67, further comprising:
(e) after step (d), performing an automatic tracking of the needles or catheters in accordance with the at least one position adjusted in step (d).
73. A method for locating a therapeutic agent which has been inserted into a bodily organ, the method comprising:
(a) providing a computing device with a user interface;
(b) taking at least one image of the bodily organ and inputting the at least one image into the computing device;
(c) drawing a boundary of the bodily organ;
(d) performing a global search in a portion of the at least one image contained within the boundary for at least one local extremum; and
(e) determining a location of the therapeutic agent in accordance with the at least one local extremum.
74. The method of claim 73, wherein step (c) is performed manually.
75. The method of claim 73, wherein step (c) is performed automatically.
76. The method of claim 73, wherein the at least one image comprises at least one ultrasound image.
77. The method of claim 73, wherein the at least one local extremum is a local maximum.
78. The method of claim 73, wherein step (e) comprises:
(i) determining a candidate location of the therapeutic agent; and
(ii) using the candidate location to correct a bias in a location of the therapeutic agent determined by another technique.
79. The method of claim 78, wherein the other technique comprises a needle tracking and segmentation algorithm.
80. The method of claim 73, wherein step (e) comprises:
(i) determining a candidate location of the therapeutic agent; and
(ii) using the candidate location and a planned needle pattern to determine a final needle pattern.
US10/193,718 2000-04-28 2002-07-12 Identification and quantification of needle and seed displacement departures from treatment plan Abandoned US20030065260A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/193,718 US20030065260A1 (en) 2000-04-28 2002-07-12 Identification and quantification of needle and seed displacement departures from treatment plan

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US20049300P 2000-04-28 2000-04-28
US09/636,551 US6438401B1 (en) 2000-04-28 2000-08-11 Indentification and quantification of needle displacement departures from treatment plan
US10/193,718 US20030065260A1 (en) 2000-04-28 2002-07-12 Identification and quantification of needle and seed displacement departures from treatment plan

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/636,551 Continuation-In-Part US6438401B1 (en) 2000-04-28 2000-08-11 Indentification and quantification of needle displacement departures from treatment plan

Publications (1)

Publication Number Publication Date
US20030065260A1 true US20030065260A1 (en) 2003-04-03

Family

ID=26895809

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/636,551 Expired - Lifetime US6438401B1 (en) 2000-04-28 2000-08-11 Indentification and quantification of needle displacement departures from treatment plan
US10/193,718 Abandoned US20030065260A1 (en) 2000-04-28 2002-07-12 Identification and quantification of needle and seed displacement departures from treatment plan

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/636,551 Expired - Lifetime US6438401B1 (en) 2000-04-28 2000-08-11 Indentification and quantification of needle displacement departures from treatment plan

Country Status (5)

Country Link
US (2) US6438401B1 (en)
EP (1) EP1278560A2 (en)
AU (1) AU2001251554A1 (en)
CA (1) CA2407577A1 (en)
WO (1) WO2001082995A2 (en)

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040066957A1 (en) * 2002-10-03 2004-04-08 Miller Edward A. System and method for comparing ultrasound images corresponding to two user-selected data points
US20040092786A1 (en) * 2002-11-08 2004-05-13 Memorial Sloan-Kettering Cancer Center Intraoperative dynamic dosimetry for prostate implants
US20040152981A1 (en) * 2000-11-24 2004-08-05 Shih-Ping Wang Method and system for instant biopsy specimen analysis
US20040225174A1 (en) * 2003-05-06 2004-11-11 Fuller Donald B. Method for computed tomography-ultrasound interactive prostate brachytherapy
US20040228509A1 (en) * 2003-05-15 2004-11-18 Beth Israel Deaconess Medical Center, Inc. Automatic spacial identification of tissue implanted linear sources using medical imaging
US6846282B1 (en) * 2000-06-09 2005-01-25 Varian Medical Systems, Inc. Brachytherapy apparatus and methods
US20050080328A1 (en) * 2002-06-04 2005-04-14 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
US20050102316A1 (en) * 2003-09-10 2005-05-12 Lawson Phillip W.Jr. Spherical modeling tool
US20050143777A1 (en) * 2003-12-19 2005-06-30 Sra Jasbir S. Method and system of treatment of heart failure using 4D imaging
US20050171428A1 (en) * 2003-07-21 2005-08-04 Gabor Fichtinger Registration of ultrasound to fluoroscopy for real time optimization of radiation implant procedures
US20050197568A1 (en) * 2002-03-15 2005-09-08 General Electric Company Method and system for registration of 3d images within an interventional system
US20050240882A1 (en) * 2004-04-21 2005-10-27 Ge Medical Systems Information Technologies, Inc. Method and system for displaying regions of pathological interest
US20050256402A1 (en) * 2002-09-27 2005-11-17 Olympus Corporation Ultrasonograph
US20060002631A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. ROI selection in image registration
US20060002601A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. DRR generation using a non-linear attenuation model
US20060002615A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US20060002630A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US20060002632A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Motion field generation for non-rigid image registration
US20060030777A1 (en) * 2004-07-30 2006-02-09 Liang David H T-statistic method for suppressing artifacts in blood vessel ultrasonic imaging
US20060078195A1 (en) * 2004-10-13 2006-04-13 Regis Vaillant Method and system for registering 3D models of anatomical regions with projection images of the same
US20060079759A1 (en) * 2004-10-13 2006-04-13 Regis Vaillant Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system
US20060274885A1 (en) * 2005-06-02 2006-12-07 Hongwu Wang Treatment planning software and corresponding user interface
US20070053490A1 (en) * 2005-06-20 2007-03-08 Hongwu Wang Automatic generation of an envelope of constraint points for inverse planning
US20070055129A1 (en) * 2005-08-24 2007-03-08 Siemens Aktiengesellschaft Method and device for displaying a surgical instrument during placement thereof in a patient during a treatment
US20070078327A1 (en) * 2005-07-18 2007-04-05 Nucletron B.V. Method for identifying the location at least one treatment channel from a group of a plurality of treatment channels as well as a system for effecting radiation treatment on a pre-selected anatomical portion of an animal body
US20070167801A1 (en) * 2005-12-02 2007-07-19 Webler William E Methods and apparatuses for image guided medical procedures
US20070230756A1 (en) * 2006-02-10 2007-10-04 Xu Wen Method and apparatus for processing ultrasound image signals
US20080027354A1 (en) * 2006-07-28 2008-01-31 Lutz Gundel Method for improving the accuracy of a bronchial biopsy, and device for carrying out a method for improving the accuracy of a bronchial biopsy
US7330578B2 (en) 2005-06-23 2008-02-12 Accuray Inc. DRR generation and enhancement using a dedicated graphics device
US20080161687A1 (en) * 2006-12-29 2008-07-03 Suri Jasjit S Repeat biopsy system
US20080186378A1 (en) * 2007-02-06 2008-08-07 Feimo Shen Method and apparatus for guiding towards targets during motion
US20090009512A1 (en) * 2004-08-31 2009-01-08 Koninklijke Philips Electronics N.V. Imaging system
US20090022380A1 (en) * 2006-06-29 2009-01-22 Meide Zhao Sub-voxel motion correction for phase-contrast magnetic resonance imaging
US20090035218A1 (en) * 2007-04-13 2009-02-05 The Regents Of The University Of Michigan Systems and methods for tissue imaging
WO2009055813A1 (en) * 2007-10-25 2009-04-30 The Research Foundation Of State University Of New York Apparatus and method of detection and localized treatment of abnormal conditions
US20090214120A1 (en) * 2007-02-23 2009-08-27 Hiroshi Hashimoto Volumetric data connecting apparatus and method
US20090234237A1 (en) * 2008-02-29 2009-09-17 The Regents Of The University Of Michigan Systems and methods for imaging changes in tissue
US20090253987A1 (en) * 2008-04-02 2009-10-08 Medison Co., Ltd. Formation of an elastic image in an ultrasound system
US20100019918A1 (en) * 2006-05-02 2010-01-28 Galil Medical Ltd. Probe Insertion Guide with User-Directing Features
WO2010028379A2 (en) * 2008-09-05 2010-03-11 The Research Foundation Of State University Of New York Bladder wall thickness mapping for tumor detection
US20100086183A1 (en) * 2007-03-30 2010-04-08 Koninklijke Philips Electronics N.V. Treatment plan evaluation in radiotherapy by stochastic analysis of delineation uncertainty
US20100172559A1 (en) * 2008-11-11 2010-07-08 Eigen, Inc System and method for prostate biopsy
WO2010102119A1 (en) * 2009-03-04 2010-09-10 Imricor Medical Systems, Inc. Combined field location and mri tracking
US20110064189A1 (en) * 2009-09-14 2011-03-17 Siemens Corporation Method and System for Needle Tracking in Fluoroscopic Image Sequences
US20120057767A1 (en) * 2007-02-23 2012-03-08 General Electric Company Method and apparatus for generating variable resolution medical images
US20120071757A1 (en) * 2010-09-17 2012-03-22 University Of British Columbia Ultrasound Registration
US20120116219A1 (en) * 2010-11-10 2012-05-10 Miller Nathan D System and method of ultrasound image processing
US20120207372A1 (en) * 2010-07-15 2012-08-16 Toshiba Medical Systems Corporation Radiotherapy system and control method for radiotherapy system
US20120228493A1 (en) * 2011-03-07 2012-09-13 Bernard Gottschalk Water equivalent depth measurement
US20120245455A1 (en) * 2009-08-31 2012-09-27 Universite Joseph Fournier Control System and Method for Precisely Guiding a Percutaneous Needle Toward the Prostate
US20130102891A1 (en) * 2010-06-30 2013-04-25 Koninklijke Philips Electronics N.V. System and method for guided adaptive brachytherapy
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20140193051A1 (en) * 2013-01-10 2014-07-10 Samsung Electronics Co., Ltd. Lesion diagnosis apparatus and method
US20140357977A1 (en) * 2011-05-12 2014-12-04 William Beaumont Hospital Catheter Placement Detection System and Method for Surgical Procedures
WO2015028641A1 (en) * 2013-08-30 2015-03-05 Koninklijke Philips N.V. A system and method for visualizing information in a procedure of placing sources
US9053534B2 (en) 2011-11-23 2015-06-09 The Regents Of The University Of Michigan Voxel-based approach for disease detection and evolution
US20150313571A1 (en) * 2014-04-30 2015-11-05 Emory University Systems, methods and computer readable storage media storing instructions for generating planning images based on hdr applicators
WO2015181632A1 (en) * 2014-05-28 2015-12-03 Nucletron Operations B.V. Methods and systems for brachytherapy planning based on imaging data
US20160000519A1 (en) * 2013-03-28 2016-01-07 Koninklijke Philips N.V. Instrument localization in guided high dose rate brachytherapy
WO2017037060A1 (en) * 2015-09-01 2017-03-09 Koninklijke Philips N.V. Device and method for assisting in a brachytherapy procedure
US9700737B2 (en) 2012-05-15 2017-07-11 Koninklijke Philips N.V. Brachytherapy apparatus
US20170262982A1 (en) * 2016-03-09 2017-09-14 EchoNous, Inc. Ultrasound image recognition systems and methods utilizing an artificial intelligence network
WO2017157974A1 (en) 2016-03-16 2017-09-21 Koninklijke Philips N.V. System for assisting in performing an interventional procedure
US20170265840A1 (en) * 2014-12-01 2017-09-21 Koninklijke Philips N.V. Registration of optical shape sensing tool
US9773311B2 (en) 2011-06-29 2017-09-26 The Regents Of The University Of Michigan Tissue phasic classification mapping system and method
US9851426B2 (en) 2012-05-04 2017-12-26 The Regents Of The University Of Michigan Error analysis and correction of MRI ADC measurements for gradient nonlinearity
US9867998B2 (en) 2012-12-06 2018-01-16 Koninkljke Philips N.V. Calibration apparatus
JP2018518277A (en) * 2015-06-12 2018-07-12 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Radiation planning system
US20180263593A1 (en) * 2017-03-14 2018-09-20 Clarius Mobile Health Corp. Systems and methods for detecting and enhancing viewing of a needle during ultrasound imaging
US10083518B2 (en) * 2017-02-28 2018-09-25 Siemens Healthcare Gmbh Determining a biopsy position
US10152643B2 (en) * 2004-05-06 2018-12-11 Christopher L. Kavanau Method and system for aligning and classifying images
EP3466487A1 (en) * 2017-10-03 2019-04-10 Koninklijke Philips N.V. Robustness evaluation of brachytherapy treatment plan
US10413749B2 (en) * 2014-09-26 2019-09-17 Koninklijke Philips N.V. Dual localization of brachytherapy radioactive seeds
US20190378274A1 (en) * 2018-06-06 2019-12-12 International Business Machines Corporation Joint registration and segmentation of images using deep learning
US10549123B2 (en) 2012-12-11 2020-02-04 Koninklijke Philips N.V. Assisting apparatus for assisting in performing a brachytherapy
US10650512B2 (en) 2016-06-14 2020-05-12 The Regents Of The University Of Michigan Systems and methods for topographical characterization of medical image data
US10786310B2 (en) * 2014-03-24 2020-09-29 Koninklijke Philips N.V. Quality assurance and data coordination for electromagnetic tracking systems
US11247070B2 (en) 2018-03-21 2022-02-15 Koninklijke Philips N.V. Brachytherapy treatment planning system
US11291863B2 (en) * 2016-12-16 2022-04-05 Koninklijke Philips N.V. Positioning assistance device for focal radiation therapy
US11580677B2 (en) * 2017-09-29 2023-02-14 General Electric Company Systems and methods for deep learning-based image reconstruction

Families Citing this family (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6438401B1 (en) * 2000-04-28 2002-08-20 Alpha Intervention Technology, Inc. Indentification and quantification of needle displacement departures from treatment plan
US6430525B1 (en) 2000-06-05 2002-08-06 Masimo Corporation Variable mode averager
US8565860B2 (en) 2000-08-21 2013-10-22 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system
US7826889B2 (en) * 2000-08-21 2010-11-02 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8909325B2 (en) 2000-08-21 2014-12-09 Biosensors International Group, Ltd. Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8489176B1 (en) 2000-08-21 2013-07-16 Spectrum Dynamics Llc Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8036731B2 (en) 2001-01-22 2011-10-11 Spectrum Dynamics Llc Ingestible pill for diagnosing a gastrointestinal tract
US20030139700A1 (en) * 2000-11-10 2003-07-24 Mentor Corporation User interface for an automated radioisotope system
US7776310B2 (en) 2000-11-16 2010-08-17 Microspherix Llc Flexible and/or elastic brachytherapy seed or strand
CN1310617C (en) 2001-01-22 2007-04-18 V-目标技术有限公司 Ingestible pill
US6549802B2 (en) * 2001-06-07 2003-04-15 Varian Medical Systems, Inc. Seed localization system and method in ultrasound by fluoroscopy and ultrasound fusion
US7853312B2 (en) * 2001-06-07 2010-12-14 Varian Medical Systems, Inc. Seed localization system for use in an ultrasound system and method of using the same
AU2002322662A1 (en) * 2001-07-31 2003-02-17 University Of Rochester Method and device for optimization of preloaded brachytherapy needles
US7060020B2 (en) * 2001-11-02 2006-06-13 Ideamatrix, Inc. Delivery system and method for interstitial radiation therapy
US6761680B2 (en) 2001-11-02 2004-07-13 Richard A. Terwilliger Delivery system and method for interstitial radiation therapy using seed strands constructed with preformed strand housing
US7094198B2 (en) * 2001-11-02 2006-08-22 Worldwide Medical Technologies, Llc Delivery system and method for interstitial radiation therapy using seed elements with ends having one of projections and indentations
US7074291B2 (en) * 2001-11-02 2006-07-11 Worldwide Medical Technologies, L.L.C. Delivery system and method for interstitial radiation therapy using strands constructed with extruded strand housings
US6786858B2 (en) * 2001-11-02 2004-09-07 Ideamatrix, Inc. Delivery system and method for interstitial radiotherapy using hollow seeds
EP1316330A1 (en) * 2001-11-29 2003-06-04 Nucletron B.V. Quality assurance for brachytherapy using a dose measuring system
DE60208009T2 (en) 2002-06-17 2006-08-24 Nucletron B.V. System for real-time planning of radiotherapy
US7289599B2 (en) * 2002-10-04 2007-10-30 Varian Medical Systems Technologies, Inc. Radiation process and apparatus
US6961405B2 (en) * 2002-10-07 2005-11-01 Nomos Corporation Method and apparatus for target position verification
US7797057B2 (en) * 2002-10-23 2010-09-14 Medtronic, Inc. Medical paddle lead and method for spinal cord stimulation
AU2003276658A1 (en) * 2002-11-04 2004-06-07 V-Target Technologies Ltd. Apparatus and methods for imaging and attenuation correction
US6997862B2 (en) * 2003-05-13 2006-02-14 Ideamatrix, Inc. Delivery system and method for interstitial radiation therapy using seed strands with custom end spacing
US7252090B2 (en) 2003-09-15 2007-08-07 Medtronic, Inc. Selection of neurostimulator parameter configurations using neural network
US20050080314A1 (en) * 2003-10-09 2005-04-14 Terwilliger Richard A. Shielded transport for multiple brachytheapy implants with integrated measuring and cutting board
CA2457376C (en) * 2003-10-14 2015-09-15 The University Of British Columbia Method for imaging the mechanical properties of tissue
WO2007010537A2 (en) 2005-07-19 2007-01-25 Spectrum Dynamics Llc Reconstruction stabilizer and active vision
WO2005067383A2 (en) 2004-01-13 2005-07-28 Spectrum Dynamics Llc Multi-dimensional image reconstruction
US8571881B2 (en) 2004-11-09 2013-10-29 Spectrum Dynamics, Llc Radiopharmaceutical dispensing, administration, and imaging
US8586932B2 (en) 2004-11-09 2013-11-19 Spectrum Dynamics Llc System and method for radioactive emission measurement
US9470801B2 (en) 2004-01-13 2016-10-18 Spectrum Dynamics Llc Gating with anatomically varying durations
WO2008010227A2 (en) 2006-07-19 2008-01-24 Spectrum Dynamics Llc Imaging protocols
US7968851B2 (en) 2004-01-13 2011-06-28 Spectrum Dynamics Llc Dynamic spect camera
US8280124B2 (en) * 2004-06-01 2012-10-02 Spectrum Dynamics Llc Methods of view selection for radioactive emission measurements
EP1778957A4 (en) 2004-06-01 2015-12-23 Biosensors Int Group Ltd Radioactive-emission-measurement optimization to specific body structures
US9943274B2 (en) 2004-11-09 2018-04-17 Spectrum Dynamics Medical Limited Radioimaging using low dose isotope
US8423125B2 (en) * 2004-11-09 2013-04-16 Spectrum Dynamics Llc Radioimaging
US8615405B2 (en) 2004-11-09 2013-12-24 Biosensors International Group, Ltd. Imaging system customization using data from radiopharmaceutical-associated data carrier
US9316743B2 (en) 2004-11-09 2016-04-19 Biosensors International Group, Ltd. System and method for radioactive emission measurement
EP1827505A4 (en) 2004-11-09 2017-07-12 Biosensors International Group, Ltd. Radioimaging
US7872235B2 (en) 2005-01-13 2011-01-18 Spectrum Dynamics Llc Multi-dimensional image reconstruction and analysis for expert-system diagnosis
US8837793B2 (en) 2005-07-19 2014-09-16 Biosensors International Group, Ltd. Reconstruction stabilizer and active vision
US8644910B2 (en) 2005-07-19 2014-02-04 Biosensors International Group, Ltd. Imaging protocols
US8187159B2 (en) 2005-07-22 2012-05-29 Biocompatibles, UK Therapeutic member including a rail used in brachytherapy and other radiation therapy
US7736293B2 (en) 2005-07-22 2010-06-15 Biocompatibles Uk Limited Implants for use in brachytherapy and other radiation therapy that resist migration and rotation
US7705316B2 (en) 2005-11-09 2010-04-27 Spectrum Dynamics Llc Dynamic SPECT camera
US8204500B2 (en) 2005-12-28 2012-06-19 Starhome Gmbh Optimal voicemail deposit for roaming cellular telephony
US8306624B2 (en) 2006-04-28 2012-11-06 Medtronic, Inc. Patient-individualized efficacy rating
US7715920B2 (en) 2006-04-28 2010-05-11 Medtronic, Inc. Tree-based electrical stimulator programming
US8380300B2 (en) 2006-04-28 2013-02-19 Medtronic, Inc. Efficacy visualization
US20070265487A1 (en) * 2006-05-09 2007-11-15 Worldwide Medical Technologies Llc Applicators for use in positioning implants for use in brachytherapy and other radiation therapy
US7988611B2 (en) 2006-05-09 2011-08-02 Biocompatibles Uk Limited After-loader for positioning implants for needle delivery in brachytherapy and other radiation therapy
US8894974B2 (en) 2006-05-11 2014-11-25 Spectrum Dynamics Llc Radiopharmaceuticals for diagnosis and therapy
US7601966B2 (en) * 2006-06-28 2009-10-13 Spectrum Dynamics Llc Imaging techniques for reducing blind spots
US7878964B1 (en) 2006-09-07 2011-02-01 Biocompatibles Uk Limited Echogenic spacers and strands
US7874976B1 (en) 2006-09-07 2011-01-25 Biocompatibles Uk Limited Echogenic strands and spacers therein
US8610075B2 (en) 2006-11-13 2013-12-17 Biosensors International Group Ltd. Radioimaging applications of and novel formulations of teboroxime
WO2008075362A2 (en) 2006-12-20 2008-06-26 Spectrum Dynamics Llc A method, a system, and an apparatus for using and processing multidimensional data
US20080269540A1 (en) * 2007-04-27 2008-10-30 Worldwide Medical Technologies Llc Seed cartridge adaptor and methods for use therewith
US8521253B2 (en) 2007-10-29 2013-08-27 Spectrum Dynamics Llc Prostate imaging
WO2009156893A1 (en) * 2008-06-25 2009-12-30 Koninklijke Philips Electronics N.V. Method and system for brachytherapy
US20100063400A1 (en) * 2008-09-05 2010-03-11 Anne Lindsay Hall Method and apparatus for catheter guidance using a combination of ultrasound and x-ray imaging
US8338788B2 (en) 2009-07-29 2012-12-25 Spectrum Dynamics Llc Method and system of optimized volumetric imaging
DK2753243T3 (en) 2011-09-06 2016-07-04 Ezono Ag Imaging probe and method for obtaining positional and / or information information
US10245447B2 (en) * 2011-10-18 2019-04-02 Koninklijke Philips N.V. Magnetic resonance imaging guided brachytherapy with displaying the catheter placement position
US9529080B2 (en) 2012-12-06 2016-12-27 White Eagle Sonic Technologies, Inc. System and apparatus having an application programming interface for flexible control of execution ultrasound actions
US10076313B2 (en) 2012-12-06 2018-09-18 White Eagle Sonic Technologies, Inc. System and method for automatically adjusting beams to scan an object in a body
US9773496B2 (en) 2012-12-06 2017-09-26 White Eagle Sonic Technologies, Inc. Apparatus and system for adaptively scheduling ultrasound system actions
US10499884B2 (en) 2012-12-06 2019-12-10 White Eagle Sonic Technologies, Inc. System and method for scanning for a second object within a first object using an adaptive scheduler
US9983905B2 (en) 2012-12-06 2018-05-29 White Eagle Sonic Technologies, Inc. Apparatus and system for real-time execution of ultrasound system actions
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
GB201303917D0 (en) 2013-03-05 2013-04-17 Ezono Ag System for image guided procedure
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
US10456595B2 (en) 2013-07-17 2019-10-29 Koninklijke Philips N.V. Portal imaging for brachytherapy
US9700284B2 (en) * 2013-11-13 2017-07-11 Siemens Medical Solutions Usa, Inc. Three-dimensional ultrasound reconstruction with confidence information
JP2017520316A (en) 2014-07-14 2017-07-27 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Grid calibration system for interventional procedures
CN104248454B (en) * 2014-10-09 2016-06-29 哈尔滨工程大学 A kind of two-dimensional ultrasonic image and the coplanar determination methods of puncture needle
US11386556B2 (en) 2015-12-18 2022-07-12 Orthogrid Systems Holdings, Llc Deformed grid based intra-operative system and method of use
US10991070B2 (en) 2015-12-18 2021-04-27 OrthoGrid Systems, Inc Method of providing surgical guidance
WO2020056086A1 (en) 2018-09-12 2020-03-19 Orthogrid Systems, Inc. An artificial intelligence intra-operative surgical guidance system and method of use
US11540794B2 (en) 2018-09-12 2023-01-03 Orthogrid Systesm Holdings, LLC Artificial intelligence intra-operative surgical guidance system and method of use

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4976680A (en) * 1988-10-07 1990-12-11 Hayman Michael H Apparatus for in situ radiotherapy
US5107839A (en) * 1990-05-04 1992-04-28 Pavel V. Houdek Computer controlled stereotaxic radiotherapy system and method
US6256529B1 (en) * 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US6234951B1 (en) * 1996-02-29 2001-05-22 Scimed Life Systems, Inc. Intravascular radiation delivery system
DE19612668C1 (en) * 1996-03-29 1997-08-21 Martin Dr Med Busch Radiation data optimisation method for radiation therapy
US6032066A (en) * 1997-02-07 2000-02-29 Jcrt Radiation Oncology Support Services Method and apparatus for virtual radiotherapy beam projection localization in real space

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5391139A (en) * 1992-09-03 1995-02-21 William Beaumont Hospital Real time radiation treatment planning system
US5655535A (en) * 1996-03-29 1997-08-12 Siemens Medical Systems, Inc. 3-Dimensional compound ultrasound field of view
US6129670A (en) * 1997-11-24 2000-10-10 Burdette Medical Systems Real time brachytherapy spatial registration and visualization system
US6083167A (en) * 1998-02-10 2000-07-04 Emory University Systems and methods for providing radiation therapy and catheter guides
US5957935A (en) * 1998-04-15 1999-09-28 Brown; Samuel D. Guide and holding bracket for a prostate implant stabilization device
US6311084B1 (en) * 1998-05-04 2001-10-30 Robert A. Cormack Radiation seed implant method and apparatus
US6200255B1 (en) * 1998-10-30 2001-03-13 University Of Rochester Prostate implant planning engine for radiotherapy
US6546787B1 (en) * 1999-03-25 2003-04-15 Regents Of The University Of Minnesota Means and method for modeling and treating specific tissue structures
US6390982B1 (en) * 1999-07-23 2002-05-21 Univ Florida Ultrasonic guidance of target structures for medical procedures
US6610013B1 (en) * 1999-10-01 2003-08-26 Life Imaging Systems, Inc. 3D ultrasound-guided intraoperative prostate brachytherapy
US6505065B1 (en) * 1999-10-29 2003-01-07 Koninklijke Philips Electronics, N.V. Methods and apparatus for planning and executing minimally invasive procedures for in-vivo placement of objects
US6438401B1 (en) * 2000-04-28 2002-08-20 Alpha Intervention Technology, Inc. Indentification and quantification of needle displacement departures from treatment plan

Cited By (152)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6846282B1 (en) * 2000-06-09 2005-01-25 Varian Medical Systems, Inc. Brachytherapy apparatus and methods
US7018333B2 (en) * 2000-11-24 2006-03-28 U-Systems, Inc. Method and system for instant biopsy specimen analysis
US20040152981A1 (en) * 2000-11-24 2004-08-05 Shih-Ping Wang Method and system for instant biopsy specimen analysis
US20050197568A1 (en) * 2002-03-15 2005-09-08 General Electric Company Method and system for registration of 3d images within an interventional system
US7499743B2 (en) 2002-03-15 2009-03-03 General Electric Company Method and system for registration of 3D images within an interventional system
US7778686B2 (en) 2002-06-04 2010-08-17 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
US20050080328A1 (en) * 2002-06-04 2005-04-14 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
US20050256402A1 (en) * 2002-09-27 2005-11-17 Olympus Corporation Ultrasonograph
US7775977B2 (en) * 2002-09-27 2010-08-17 Olympus Corporation Ultrasonic tomographic diagnostic apparatus
US20040066957A1 (en) * 2002-10-03 2004-04-08 Miller Edward A. System and method for comparing ultrasound images corresponding to two user-selected data points
US7593554B2 (en) * 2002-10-03 2009-09-22 Koninklijke Philips Electronics N.V. System and method for comparing ultrasound images corresponding to two user-selected data points
US20040092786A1 (en) * 2002-11-08 2004-05-13 Memorial Sloan-Kettering Cancer Center Intraoperative dynamic dosimetry for prostate implants
US20040225174A1 (en) * 2003-05-06 2004-11-11 Fuller Donald B. Method for computed tomography-ultrasound interactive prostate brachytherapy
US20040228509A1 (en) * 2003-05-15 2004-11-18 Beth Israel Deaconess Medical Center, Inc. Automatic spacial identification of tissue implanted linear sources using medical imaging
US20050171428A1 (en) * 2003-07-21 2005-08-04 Gabor Fichtinger Registration of ultrasound to fluoroscopy for real time optimization of radiation implant procedures
US7408554B2 (en) * 2003-09-10 2008-08-05 Lawson Jr Phillip W Spherical modeling tool
US20050102316A1 (en) * 2003-09-10 2005-05-12 Lawson Phillip W.Jr. Spherical modeling tool
US20050143777A1 (en) * 2003-12-19 2005-06-30 Sra Jasbir S. Method and system of treatment of heart failure using 4D imaging
US20050240882A1 (en) * 2004-04-21 2005-10-27 Ge Medical Systems Information Technologies, Inc. Method and system for displaying regions of pathological interest
US10152643B2 (en) * 2004-05-06 2018-12-11 Christopher L. Kavanau Method and system for aligning and classifying images
US11450082B2 (en) 2004-05-06 2022-09-20 Christopher L. Kavanau Method and system for aligning and classifying images
US20080159612A1 (en) * 2004-06-30 2008-07-03 Dongshan Fu DRR generation using a non-linear attenuation model
US20060002615A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7522779B2 (en) 2004-06-30 2009-04-21 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US20090091567A1 (en) * 2004-06-30 2009-04-09 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7505617B2 (en) 2004-06-30 2009-03-17 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7231076B2 (en) * 2004-06-30 2007-06-12 Accuray, Inc. ROI selection in image registration
US20060002631A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. ROI selection in image registration
US20060002601A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. DRR generation using a non-linear attenuation model
US7840093B2 (en) 2004-06-30 2010-11-23 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US20060002630A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US20060002632A1 (en) * 2004-06-30 2006-01-05 Accuray, Inc. Motion field generation for non-rigid image registration
US7426318B2 (en) 2004-06-30 2008-09-16 Accuray, Inc. Motion field generation for non-rigid image registration
US7366278B2 (en) 2004-06-30 2008-04-29 Accuray, Inc. DRR generation using a non-linear attenuation model
US20080101673A1 (en) * 2004-06-30 2008-05-01 Dongshan Fu Fiducial-less tracking with non-rigid image registration
US20060030777A1 (en) * 2004-07-30 2006-02-09 Liang David H T-statistic method for suppressing artifacts in blood vessel ultrasonic imaging
US20090009512A1 (en) * 2004-08-31 2009-01-08 Koninklijke Philips Electronics N.V. Imaging system
US20060078195A1 (en) * 2004-10-13 2006-04-13 Regis Vaillant Method and system for registering 3D models of anatomical regions with projection images of the same
US7327872B2 (en) 2004-10-13 2008-02-05 General Electric Company Method and system for registering 3D models of anatomical regions with projection images of the same
US8515527B2 (en) * 2004-10-13 2013-08-20 General Electric Company Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system
US20060079759A1 (en) * 2004-10-13 2006-04-13 Regis Vaillant Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system
US8077936B2 (en) * 2005-06-02 2011-12-13 Accuray Incorporated Treatment planning software and corresponding user interface
US20060274885A1 (en) * 2005-06-02 2006-12-07 Hongwu Wang Treatment planning software and corresponding user interface
US9993662B2 (en) 2005-06-02 2018-06-12 Accuray Incorporated Treatment planning software and corresponding user interface
US7801349B2 (en) 2005-06-20 2010-09-21 Accuray Incorporated Automatic generation of an envelope of constraint points for inverse planning
US20070053490A1 (en) * 2005-06-20 2007-03-08 Hongwu Wang Automatic generation of an envelope of constraint points for inverse planning
US20080069422A1 (en) * 2005-06-23 2008-03-20 Bai Wang DRR generation and enhancement using a dedicated graphics device
US7330578B2 (en) 2005-06-23 2008-02-12 Accuray Inc. DRR generation and enhancement using a dedicated graphics device
US20070078327A1 (en) * 2005-07-18 2007-04-05 Nucletron B.V. Method for identifying the location of at least one treatment channel from a group of a plurality of treatment channels as well as a system for effecting radiation treatment on a pre-selected anatomical portion of an animal body
US9364685B2 (en) 2005-07-18 2016-06-14 Nucletron Operations B.V. Method for identifying the location of at least one treatment channel from a group of a plurality of treatment channels as well as a system for effecting radiation treatment on a pre-selected anatomical portion of an animal body
US10279195B2 (en) 2005-07-18 2019-05-07 Nucletron B.V. Method for identifying the location of at least one treatment channel from a group of a plurality of treatment channels as well as a system for effecting radiation treatment on a pre-selected anatomical portion of an animal body
US8145290B2 (en) * 2005-07-18 2012-03-27 Nucletron B.V. Method for identifying the location of at least one treatment channel that is one of a plurality of treatment channels, as well as a system for effecting radiation treatment on a pre-selected anatomical portion of an animal body
US20070055129A1 (en) * 2005-08-24 2007-03-08 Siemens Aktiengesellschaft Method and device for displaying a surgical instrument during placement thereof in a patient during a treatment
US20070167801A1 (en) * 2005-12-02 2007-07-19 Webler William E Methods and apparatuses for image guided medical procedures
US8303505B2 (en) 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US20070230756A1 (en) * 2006-02-10 2007-10-04 Xu Wen Method and apparatus for processing ultrasound image signals
US8029445B2 (en) 2006-02-10 2011-10-04 The University Of British Columbia Method and apparatus for processing ultrasound image signals
US20100019918A1 (en) * 2006-05-02 2010-01-28 Galil Medical Ltd. Probe Insertion Guide with User-Directing Features
US7848554B2 (en) * 2006-06-29 2010-12-07 Vassol Inc. Sub-voxel motion correction for phase-contrast magnetic resonance imaging
US20090022380A1 (en) * 2006-06-29 2009-01-22 Meide Zhao Sub-voxel motion correction for phase-contrast magnetic resonance imaging
US8577102B2 (en) * 2006-07-28 2013-11-05 Siemens Aktiengesellschaft Method for improving the accuracy of a bronchial biopsy, and device for carrying out a method for improving the accuracy of a bronchial biopsy
US20080027354A1 (en) * 2006-07-28 2008-01-31 Lutz Gundel Method for improving the accuracy of a bronchial biopsy, and device for carrying out a method for improving the accuracy of a bronchial biopsy
WO2008083225A3 (en) * 2006-12-29 2008-08-21 Eigen Repeat biopsy system
US20080161687A1 (en) * 2006-12-29 2008-07-03 Suri Jasjit S Repeat biopsy system
US20080186378A1 (en) * 2007-02-06 2008-08-07 Feimo Shen Method and apparatus for guiding towards targets during motion
US8824754B2 (en) * 2007-02-23 2014-09-02 General Electric Company Method and apparatus for generating variable resolution medical images
US7961928B2 (en) * 2007-02-23 2011-06-14 GE Medical Systems Global Technology Company, LLC Volumetric data connecting apparatus and method
US20090214120A1 (en) * 2007-02-23 2009-08-27 Hiroshi Hashimoto Volumetric data connecting apparatus and method
US20120057767A1 (en) * 2007-02-23 2012-03-08 General Electric Company Method and apparatus for generating variable resolution medical images
US9616247B2 (en) * 2007-03-30 2017-04-11 Koninklijke Philips N.V. Treatment plan evaluation in radiotherapy by stochastic analysis of delineation uncertainty
US20100086183A1 (en) * 2007-03-30 2010-04-08 Koninklijke Philips Electronics N.V. Treatment plan evaluation in radiotherapy by stochastic analysis of delineation uncertainty
US8768431B2 (en) 2007-04-13 2014-07-01 The Regents Of The University Of Michigan Systems and methods for tissue imaging
US8185186B2 (en) * 2007-04-13 2012-05-22 The Regents Of The University Of Michigan Systems and methods for tissue imaging
US20090035218A1 (en) * 2007-04-13 2009-02-05 The Regents Of The University Of Michigan Systems and methods for tissue imaging
US20100256610A1 (en) * 2007-10-25 2010-10-07 Basil Rigas Apparatus and method of detection and localized treatment of abnormal conditions
WO2009055813A1 (en) * 2007-10-25 2009-04-30 The Research Foundation Of State University Of New York Apparatus and method of detection and localized treatment of abnormal conditions
US9289140B2 (en) 2008-02-29 2016-03-22 The Regents Of The University Of Michigan Systems and methods for imaging changes in tissue
US20090234237A1 (en) * 2008-02-29 2009-09-17 The Regents Of The University Of Michigan Systems and methods for imaging changes in tissue
US8864668B2 (en) * 2008-04-02 2014-10-21 Medison Co., Ltd. Formation of an elastic image in an ultrasound system
US20090253987A1 (en) * 2008-04-02 2009-10-08 Medison Co., Ltd. Formation of an elastic image in an ultrasound system
WO2010028379A3 (en) * 2008-09-05 2010-05-14 The Research Foundation Of State University Of New York Bladder wall thickness mapping for tumor detection
WO2010028379A2 (en) * 2008-09-05 2010-03-11 The Research Foundation Of State University Of New York Bladder wall thickness mapping for tumor detection
US20110237929A1 (en) * 2008-09-05 2011-09-29 Zhengrong Liang Bladder wall thickness mapping for tumor detection
US20100172559A1 (en) * 2008-11-11 2010-07-08 Eigen, Inc. System and method for prostate biopsy
WO2010102119A1 (en) * 2009-03-04 2010-09-10 Imricor Medical Systems, Inc. Combined field location and mri tracking
US9271664B2 (en) 2009-03-04 2016-03-01 Imricor Medical Systems, Inc. Combined field location and MRI tracking
US9265442B2 (en) 2009-03-04 2016-02-23 Imricor Medical Systems, Inc. Method of calibrating combined field location and MRI tracking
US8702579B2 (en) * 2009-08-31 2014-04-22 Koelis Control system and method for precisely guiding a percutaneous needle toward the prostate
US20120245455A1 (en) * 2009-08-31 2012-09-27 Universite Joseph Fournier Control System and Method for Precisely Guiding a Percutaneous Needle Toward the Prostate
US20110064189A1 (en) * 2009-09-14 2011-03-17 Siemens Corporation Method and System for Needle Tracking in Fluoroscopic Image Sequences
US8675914B2 (en) * 2009-09-14 2014-03-18 Siemens Aktiengesellschaft Method and system for needle tracking in fluoroscopic image sequences
US9256947B2 (en) * 2010-03-19 2016-02-09 Koninklijke Philips N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20130102891A1 (en) * 2010-06-30 2013-04-25 Koninklijke Philips Electronics N.V. System and method for guided adaptive brachytherapy
US10850126B2 (en) * 2010-06-30 2020-12-01 Koninklijke Philips N.V. System and method for guided adaptive brachytherapy
US20120207372A1 (en) * 2010-07-15 2012-08-16 Toshiba Medical Systems Corporation Radiotherapy system and control method for radiotherapy system
US9192781B2 (en) * 2010-07-15 2015-11-24 Kabushiki Kaisha Toshiba Radiotherapy system and control method for radiotherapy system
US20120071757A1 (en) * 2010-09-17 2012-03-22 University Of British Columbia Ultrasound Registration
US20120116219A1 (en) * 2010-11-10 2012-05-10 Miller Nathan D System and method of ultrasound image processing
US8840555B2 (en) * 2010-11-10 2014-09-23 Echometrix, Llc System and method of ultrasound image processing
US9480862B2 (en) * 2011-03-07 2016-11-01 Ion Beam Applications S.A. Water equivalent depth measurement for providing a continuous calibration range based on discrete measurements
US20120228493A1 (en) * 2011-03-07 2012-09-13 Bernard Gottschalk Water equivalent depth measurement
US20140357977A1 (en) * 2011-05-12 2014-12-04 William Beaumont Hospital Catheter Placement Detection System and Method for Surgical Procedures
US9773311B2 (en) 2011-06-29 2017-09-26 The Regents Of The University Of Michigan Tissue phasic classification mapping system and method
US9053534B2 (en) 2011-11-23 2015-06-09 The Regents Of The University Of Michigan Voxel-based approach for disease detection and evolution
US9851426B2 (en) 2012-05-04 2017-12-26 The Regents Of The University Of Michigan Error analysis and correction of MRI ADC measurements for gradient nonlinearity
US9700737B2 (en) 2012-05-15 2017-07-11 Koninklijke Philips N.V. Brachytherapy apparatus
US9867998B2 (en) 2012-12-06 2018-01-16 Koninklijke Philips N.V. Calibration apparatus
US10549123B2 (en) 2012-12-11 2020-02-04 Koninklijke Philips N.V. Assisting apparatus for assisting in performing a brachytherapy
US9773305B2 (en) * 2013-01-10 2017-09-26 Samsung Electronics Co., Ltd. Lesion diagnosis apparatus and method
US20140193051A1 (en) * 2013-01-10 2014-07-10 Samsung Electronics Co., Ltd. Lesion diagnosis apparatus and method
JP2016514531A (en) * 2013-03-28 2016-05-23 Koninklijke Philips N.V. Device location in guided high-dose rate brachytherapy
US20160000519A1 (en) * 2013-03-28 2016-01-07 Koninklijke Philips N.V. Instrument localization in guided high dose rate brachytherapy
CN105492072A (en) * 2013-08-30 2016-04-13 皇家飞利浦有限公司 A system and method for visualizing information in a procedure of placing sources
WO2015028641A1 (en) * 2013-08-30 2015-03-05 Koninklijke Philips N.V. A system and method for visualizing information in a procedure of placing sources
US10183180B2 (en) 2013-08-30 2019-01-22 Koninklijke Philips N.V. System and method for visualizing information in a procedure of placing sources
US10786310B2 (en) * 2014-03-24 2020-09-29 Koninklijke Philips N.V. Quality assurance and data coordination for electromagnetic tracking systems
US9877697B2 (en) * 2014-04-30 2018-01-30 Emory University Systems, methods and computer readable storage media storing instructions for generating planning images based on HDR applicators
US20180161000A1 (en) * 2014-04-30 2018-06-14 Emory University Systems, Methods and Computer Readable Storage Media Storing Instructions for Generating Planning Images Based on HDR Applicators
US20150313571A1 (en) * 2014-04-30 2015-11-05 Emory University Systems, methods and computer readable storage media storing instructions for generating planning images based on HDR applicators
US10758745B2 (en) * 2014-05-28 2020-09-01 Nucletron Operations B.V. Methods and systems for brachytherapy planning based on imaging data
US20170120072A1 (en) * 2014-05-28 2017-05-04 Nucletron Operations B.V. Methods and systems for brachytherapy planning based on imaging data
WO2015181632A1 (en) * 2014-05-28 2015-12-03 Nucletron Operations B.V. Methods and systems for brachytherapy planning based on imaging data
CN106132484A (en) * 2014-05-28 2016-11-16 Method and system for brachytherapy planning based on imaging data
EP3197554B1 (en) * 2014-09-26 2020-09-09 Koninklijke Philips N.V. Dual localization of brachytherapy radioactive seeds
US10413749B2 (en) * 2014-09-26 2019-09-17 Koninklijke Philips N.V. Dual localization of brachytherapy radioactive seeds
US20170265840A1 (en) * 2014-12-01 2017-09-21 Koninklijke Philips N.V. Registration of optical shape sensing tool
US11259774B2 (en) * 2014-12-01 2022-03-01 Koninklijke Philips N.V. Registration of optical shape sensing tool
JP2018518277A (en) * 2015-06-12 2018-07-12 Koninklijke Philips N.V. Radiation planning system
US10933254B2 (en) 2015-09-01 2021-03-02 Koninklijke Philips N.V. Device and method for assisting in a brachytherapy procedure
WO2017037060A1 (en) * 2015-09-01 2017-03-09 Koninklijke Philips N.V. Device and method for assisting in a brachytherapy procedure
JP2018526108A (en) * 2015-09-01 2018-09-13 Koninklijke Philips N.V. Devices and methods for supporting brachytherapy procedures
US20170262982A1 (en) * 2016-03-09 2017-09-14 EchoNous, Inc. Ultrasound image recognition systems and methods utilizing an artificial intelligence network
US10964424B2 (en) * 2016-03-09 2021-03-30 EchoNous, Inc. Ultrasound image recognition systems and methods utilizing an artificial intelligence network
CN112957074A (en) * 2016-03-09 2021-06-15 安科诺思公司 Ultrasound image recognition system and method using artificial intelligence network
US11571589B2 (en) 2016-03-16 2023-02-07 Koninklijke Philips N.V. System for assisting in performing an interventional procedure
RU2746151C2 (en) * 2016-03-16 2021-04-07 Конинклейке Филипс Н.В. System for supporting the performance of interventional procedures
WO2017157974A1 (en) 2016-03-16 2017-09-21 Koninklijke Philips N.V. System for assisting in performing an interventional procedure
JP2019508163A (en) * 2016-03-16 2019-03-28 Koninklijke Philips N.V. System to support the implementation of interventional procedures
CN108778421A (en) * 2016-03-16 2018-11-09 System for assisting in performing an interventional procedure
US10650512B2 (en) 2016-06-14 2020-05-12 The Regents Of The University Of Michigan Systems and methods for topographical characterization of medical image data
US11291863B2 (en) * 2016-12-16 2022-04-05 Koninklijke Philips N.V. Positioning assistance device for focal radiation therapy
US10083518B2 (en) * 2017-02-28 2018-09-25 Siemens Healthcare Gmbh Determining a biopsy position
US10588596B2 (en) * 2017-03-14 2020-03-17 Clarius Mobile Health Corp. Systems and methods for detecting and enhancing viewing of a needle during ultrasound imaging
US20180263593A1 (en) * 2017-03-14 2018-09-20 Clarius Mobile Health Corp. Systems and methods for detecting and enhancing viewing of a needle during ultrasound imaging
US11580677B2 (en) * 2017-09-29 2023-02-14 General Electric Company Systems and methods for deep learning-based image reconstruction
WO2019068525A1 (en) * 2017-10-03 2019-04-11 Koninklijke Philips N.V. Robustness evaluation of brachytherapy treatment plan
EP3466487A1 (en) * 2017-10-03 2019-04-10 Koninklijke Philips N.V. Robustness evaluation of brachytherapy treatment plan
US11247070B2 (en) 2018-03-21 2022-02-15 Koninklijke Philips N.V. Brachytherapy treatment planning system
US10726555B2 (en) * 2018-06-06 2020-07-28 International Business Machines Corporation Joint registration and segmentation of images using deep learning
US20190378274A1 (en) * 2018-06-06 2019-12-12 International Business Machines Corporation Joint registration and segmentation of images using deep learning

Also Published As

Publication number Publication date
US6438401B1 (en) 2002-08-20
CA2407577A1 (en) 2001-11-08
AU2001251554A1 (en) 2001-11-12
EP1278560A2 (en) 2003-01-29
WO2001082995A2 (en) 2001-11-08
WO2001082995A3 (en) 2002-04-04
WO2001082995B1 (en) 2002-06-27

Similar Documents

Publication number Publication date Title
US20030065260A1 (en) Identification and quantification of needle and seed displacement departures from treatment plan
US8444543B2 (en) Apparatus and computing device for performing brachytherapy and methods of imaging using the same
US6561980B1 (en) Automatic segmentation of prostate, rectum and urethra in ultrasound imaging
US8948471B2 (en) Image registration of multiple medical imaging modalities using a multiple degree-of-freedom-encoded fiducial device
CN103402453B (en) Auto-initiation and the system and method for registration for navigation system
Uherčík et al. Model fitting using RANSAC for surgical tool localization in 3-D ultrasound images
CN101410066B (en) Targeting method, targeting device, computer readable medium and program element
Blackall et al. Alignment of sparse freehand 3-D ultrasound with preoperative images of the liver using models of respiratory motion and deformation
JP4126318B2 (en) Radiotherapy apparatus control apparatus and radiotherapy apparatus control method
Ding et al. Automatic needle segmentation in three‐dimensional ultrasound images using two orthogonal two‐dimensional image projections
US20080037843A1 (en) Image segmentation for DRR generation and image registration
US20070270687A1 (en) Ultrasound Imaging System and Methods Of Imaging Using the Same
US20090014015A1 (en) Intraoperative dosimetry for prostate brachytherapy using transrectal ultrasound and x-ray fluoroscopy
Wen et al. Detection of brachytherapy seeds using 3-D transrectal ultrasound
US20040049109A1 (en) Seed localization system for use in an ultrasound system and method of using the same
Munbodh et al. Automated 2D-3D registration of a radiograph and a cone beam CT using line-segment enhancement
US6027446A (en) Pubic arch detection and interference assessment in transrectal ultrasound guided prostate cancer therapy
Wang et al. Deep learning applications in automatic needle segmentation in ultrasound‐guided prostate brachytherapy
Fallavollita et al. Registration between ultrasound and fluoroscopy or CT in prostate brachytherapy
Karnik et al. Evaluation of intersession 3D‐TRUS to 3D‐TRUS image registration for repeat prostate biopsies
Wei et al. Automated localization of implanted seeds in 3D TRUS images used for prostate brachytherapy
US20040092786A1 (en) Intraoperative dynamic dosimetry for prostate implants
Mu et al. A probabilistic framework based on hidden Markov model for fiducial identification in image-guided radiation treatments
Vidakovic et al. Post‐implant computed tomography–magnetic resonance prostate image registration using feature line parallelization and normalized mutual information
Ding et al. Automated seed localization for intraoperative prostate brachytherapy based on 3D line segment patterns

Legal Events

Date Code Title Description
AS Assignment

Owner name: ALPHA INTERVENTION TECHNOLOGY, INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHENG, GANG;YU, YAN;LIU, HAISONG;REEL/FRAME:013391/0864;SIGNING DATES FROM 20020814 TO 20020923

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION