WO2006092594A2 - 3D ultrasound registration


Info

Publication number
WO2006092594A2
WO2006092594A2 (Application PCT/GB2006/000735)
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
image
representation
images
parameters
Application number
PCT/GB2006/000735
Other languages
French (fr)
Other versions
WO2006092594A3 (en)
Inventor
David Hawkes
Graeme Penney
Dean Barratt
Original Assignee
Kings College London
Priority claimed from GB0504175A
Application filed by Kings College London
Publication of WO2006092594A2
Publication of WO2006092594A3


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/32: Image registration using correlation-based methods
    • G06T 7/35: Image registration using statistical methods
    • G06T 7/38: Registration of image sequences
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/24: Aligning, centring, orientation detection or correction of the image
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30004: Biomedical image processing

Definitions

  • The present invention relates to a computer implemented method of registering a 3D ultrasound image to another 3D image of a body part.
  • Ultrasound imaging techniques can be used to generate 3D images of body parts.
  • Known techniques involve tracking the location and orientation of an ultrasound transducer while a patient is being scanned. Information on the location and shape of the patient's body parts can be derived from the image data and the data from tracking the transducer. Tracking techniques which can be used in relation to the transducer are well known.
  • The 3D image of a body part can be created by extracting points from the scanned images that lie on the surface of the body part and calculating their positions relative to a reference coordinate system.
  • The bone surface is then represented in the coordinate system by a sparsely sampled collection of surface points that are derived from the ultrasound scan data.
  • The surface points might be registered to a preoperative model using a standard point to surface matching algorithm.
  • The transformation which is calculated in this way provides an estimate of the physical to model transformation.
  • Localisation errors can arise from ultrasonic effects which can limit the accuracy of the registration of scan data to an image of patient tissue. Such errors can arise due to assumptions which are made in relation to ultrasound imaging techniques, for example concerning the speed with which sound travels through tissue, the path taken by the sound waves between the transmitter, the tissue and the receiver, and the effect of beam thickness. These errors can restrict the accuracy with which images can be created, and prevent the use of ultrasound techniques for image creation for applications such as preoperative planning in orthopaedic and other procedures.
  • the present invention provides a technique for registering an ultrasound image with a representation of a body part by optimising the fit of intensity values in the ultrasound image. Accordingly, in one aspect, the invention provides a computer implemented method for registering a 3D ultrasound image with a 3D representation of a body part, the method comprising optimising the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
  • the method of the invention should preferably perform the registration between ultrasound image data and a representation of the volume of the body part. This can be contrasted with existing techniques in which a registration is performed with a representation of the surface of a body part.
  • A further advantage of the method of the present invention when compared with existing techniques based on surface analysis is that it can provide enhanced accuracy.
  • The method of the invention involves altering the six rigid body parameters (location and orientation data in six degrees of freedom) and the or each relevant ultrasound scaling parameter so as to optimise the value of a similarity measure between the ultrasound image and the representation of the body part.
  • The image and the representation are then converted into probabilistic images where the intensity of the pixels is proportional to the probability of that pixel containing a bone-to-soft tissue interface.
  • The conversion to probabilistic images can be carried out using probability density functions. These functions map two features extracted from the image and the representation (such as the local image gradient) into a probability. The functions are calculated by a process which only needs to be carried out once.
  • The probability density function used for converting the ultrasound images can be based on at least one function intended to highlight bone edges in the ultrasound images.
  • The function can be intended to identify image features indicative of a bone to soft tissue boundary or a bone shadow region of low US intensity. More than one image feature can be used and preferably at least two image features are used. More than two image features can be used.
  • The probability density function used for converting the representation images can be based on at least one function intended to highlight edges in the representation.
  • The probability density function used for converting the representation images can also be based on a function intended to identify different types of boundaries in the representation, preferably bone to soft tissue boundaries and skin to air boundaries. More than one image feature can be used and preferably at least two image features are used. More than two image features can be used.
  • Ultrasound scaling parameters can relate to, for example, the speed with which sound travels through tissue, the path taken by the sound waves between the transmitter, the tissue and the receiver, the effect of beam thickness, and ultrasound image scaling (for example in mm·pixel⁻¹) in the lateral and axial directions relative to the ultrasound beam.
  • The ultrasound image data is subjected to an artefact removal step, to remove regions of the ultrasound image that are caused by artefacts such as shadowing behind boundaries between different body tissue components.
  • The similarity measure can be calculated as follows. An estimate of the rigid body parameters and the or each relevant ultrasound scaling parameter is used to reslice the probabilistic image derived from the representation in the plane of each ultrasound slice. The pixel values in these reformatted slices and in the probabilistic image derived from the ultrasound images are then compared using the normalised cross-correlation similarity measure.
  • The algorithm proceeds by altering each of the rigid body parameters and the or each relevant ultrasound scaling parameter by a given step size, and calculating the value of the similarity measure. The algorithm then moves the current estimate of the solution in the direction which produces the greatest improvement in the similarity measure.
  • The fit of the intensity is optimised using at least two separate stages, and more preferably at least three separate stages. More than three stages can be used.
  • The or each optimisation stage can include altering a step size between parameters used during the optimisation stage.
  • The step size can be reduced, and is preferably halved, between iterations of the optimisation during the or each optimisation stage.
  • The ultrasound scaling parameter or parameters can be used to optimise the fit in less than all of the stages.
  • The ultrasound scaling parameters are used to optimise the fit in only one stage, and more preferably in only the last stage.
  • The optimisation stages can have different resolutions.
  • The first stage has a lower resolution than subsequent stages.
  • The lower resolution stage or stages can include blurring the representation data.
  • The 3D representation of the body part can be derived using scanning techniques such as magnetic resonance, X-rays etc.
  • The use of CT scanning techniques is particularly preferred, especially when the method of the invention is practised in relation to a body part which comprises bone tissue.
  • A data processing apparatus for registering a 3D ultrasound image with a 3D representation of a body part is also provided, the data processing apparatus including a data processor and a storage device storing computer program instructions which can configure the data processor to optimise the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
  • Embodiments of the present invention employ various processes involving data stored in or transferred through one or more computer systems.
  • Embodiments of the present invention also relate to an apparatus for performing these operations.
  • This apparatus may be specially constructed for the required purposes, or it may be a general-purpose computer selectively activated or reconfigured by a computer program and/or data structure stored in the computer.
  • The processes presented herein are not inherently related to any particular computer or other apparatus.
  • Various general-purpose machines may be used with programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required method steps. A particular structure for a variety of these machines will appear from the description given below.
  • Embodiments of the present invention relate to computer readable media or computer program products that include program instructions and/or data (including data structures) for performing various computer-implemented operations.
  • Examples of computer-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media; semiconductor memory devices, and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM).
  • The data and program instructions of this invention may also be embodied on a carrier wave or other transport medium.
  • Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
  • Figure 1 shows a flow chart illustrating activities relating to a computer assisted surgical procedure in which the invention can be used
  • Figure 2 shows an ultrasound probe which can be used to capture ultrasound images used in the invention
  • Figure 3 shows a schematic illustration of a surgical setup which can be used to allow the invention to be carried out
  • Figure 4 shows a schematic block diagram of apparatus including the probe shown in Figure 2 and set up as shown in Figure 3 including data processing apparatus according to the invention
  • Figure 5 shows a process flow chart illustrating a first embodiment of a registration method of the invention
  • Figure 6 shows a process flow chart illustrating a method for creating a probability density function for CT images used in the invention
  • Figure 7 shows a process flow chart illustrating a method for creating a probability density function for US images used in the invention
  • Figure 8 shows a process flow chart illustrating a second embodiment of the registration method of the invention.
  • Figure 9 shows a schematic block diagram of a data processing apparatus according to the invention for carrying out the registration method illustrated in Figures 5 and 8.
  • With reference to Figure 1, there is shown a flow chart illustrating at a high level various activities related to a surgical procedure 100 in which the registration method of the present invention can be used. The registration method will be described within the context of a surgical procedure being carried out on a femur but the invention is not limited to that particular application.
  • In a first pre-operative step 101, probability density functions are created for ultrasound (US) and computed tomography (CT) images, using US and CT images as training data, and these functions are used subsequently during the registration method to create probabilistic images.
  • In a probabilistic image, the intensity of any pixel, or voxel, is indicative of the probability of that pixel, or voxel, being at a bone to soft tissue interface.
  • The probability density functions only need to be created once and can be reused subsequently.
  • The probability density functions can be created entirely independently of the subsequent activities illustrated in Figure 1 and also in a sequence different to that shown in Figure 1. All that is required is that the probability density functions are available for use in the registration step 106. Step 101 will be described in greater detail below with reference to Figures 6 and 7.
  • In a further pre-operative step 102, images of the patient's body part, in this instance the femur, are captured by carrying out a CT scan of the femur.
  • The digitized CT scan data is then transferred, e.g. over a network, or otherwise made available to the computer of the computer assisted or aided surgery (CAS) system that will carry out the registration method of the invention.
  • A marker 4, trackable by a tracking system part 1 of the CAS system 10, is attached to the patient's body part to allow the position of the patient's body part 6 in the reference frame of the tracking system to be dynamically determined.
  • This trackable marker will be referred to herein as a Dynamic Reference Object or DRO.
  • This intra-operative step is not illustrated in Figure 1.
  • Then at step 104, US images 5 of the patient's body part are taken intra-operatively and the US scan data is processed and made available to the computer part 12 of the CAS system 10.
  • Then at step 106, the registration method of the invention is used to register the patient model 2 in the reference frame of the tracking system of the CAS system.
  • Once the model 2 of the patient's body part has been registered, the registered patient model can be used to navigate various instruments, implants and other entities during a navigated surgical procedure 108. For example, if a pre-operative surgical plan has been defined relative to the CT model 2, then registration of the CT model will register the surgical plan in the reference frame of the tracking system.
  • The navigated surgical procedure may include various general operations such as planning the surgical procedure or image guided surgical steps with which a person of ordinary skill in the art will be familiar. Subsequent steps have not been described in greater detail so as not to obscure the present invention.
  • Fig. 3 shows a schematic diagram illustrating the set-up used for the registration method, together with the rigid-body transformations (T) that relate the 3D coordinate systems of the various elements indicated.
  • The model of the patient's body part in its frame of reference is illustrated by element 2, and Figure 3 shows the ultrasound probe 8 being used to capture an US image 5 through the patient's femur 6.
  • The apparatus includes a standard clinical US scanner 14 (such as a Philips ATL HDI-5000 as provided by Philips Medical Systems, Bothell, WA) with a high-frequency linear-array scan-probe 8 (such as an L12-5 probe with a 5-12 MHz broadband transducer) which are used to obtain the US images.
  • The scan-probe can be tracked by a suitable trackable marker 3 such as an array of LEDs, as shown in Figs. 2 and 3.
  • The 3D position of each LED can be measured by the optical localiser 1, part of the tracking system, and these positions used to calculate the position and orientation of the marker 3 relative to the dynamic reference object marker 4.
  • Suitable acquisition software, for example written in C++ using Visual Studio 6 (Microsoft Corp., USA), is used to synchronize US image capture with tracking measurements.
  • US images are captured using an analogue-to-digital converter 16 (such as a Canopus ADVC-100, as provided by Canopus UK, Berkshire, UK) connected between the composite video output of the US scanner 14 and the computer 12 of the CAS system.
  • Computer 12 includes or has access to a mass storage device 18 which can store various data required by the CAS system, including the US image data and the pre-operatively captured CT data or other data defining the model 2.
  • The scan-probe 8 can be covered with a plastic sheath containing a small quantity of US gel to maintain acoustic coupling.
  • The scan-probe should be swept across the skin surface slowly during step 104, to minimize motion artefacts and ensure that captured images are closely synchronized with the tracking data.
  • The rigid-body calibration transformation for the 3D US system can be determined using a point-target phantom as is known in the art and as described, for example, in "Accuracy of an Electromagnetic Three-Dimensional Ultrasound System for Carotid Artery Imaging", Ultrasound Med Biol, Vol. 27, pp. 1421-1425, 2001, which is incorporated herein by reference for all purposes.
  • The calibration procedure involves obtaining tracked images of a pinhead immersed in a water-glycerol solution from many different views. A plurality of images captured from different directions are used to calculate the calibration transformation. The coordinates of the pinhead in each image are identified manually using custom-written software and the pixel scaling can be determined using electronic calliper measurements provided by the US scanner.
  • As illustrated by step 104, tracked US images of the surface of the femur are acquired from many different views of the anterior and the posterior femoral shaft, the greater trochanter, the femoral neck, and the epicondyles. US images can also be captured of the pelvis if the pelvis is to be registered instead of the femur. US images can be acquired in 2D compounding mode ('SonoCT™' with the level of compounding set to 'Survey') with a maximum penetration depth of 6 cm and a single focal zone. Before images are acquired, the level of the focal zone can be adjusted to approximately correspond to the average depth of the bone surface in order to maximise the image resolution at that depth.
  • In compound mode, a number of images are formed by electronically steering the US beam through several preset angles. These images are then combined to produce a single compounded image. Compounding generally reduces speckle artefacts, but has also been found to improve the echo intensity from regions of a curved bone surface not perpendicular to the axial direction of the image.
  • The optical localiser 1 (such as an Optotrak 3020 as provided by Northern Digital Inc., Ontario, Canada) is used to track the US probe 8 and the dynamic reference object (DRO) 4 which is rigidly attached to the femur 6.
  • The aim is to calculate the registration transformation, T_CT←DRO, which transforms physical positions in the coordinate system or reference frame of the DRO into voxel positions within the preoperative CT volume 2.
  • The calibration matrix, T_PROBE←US, which transforms positions from the US image to positions relative to the infrared-emitting diodes (IREDs) attached to the probe, can be calculated as described above by scanning a pin-head in a water-glycerol solution.
  • This calibration consists of eight parameters: six rigid-body parameters (three translations and three rotations) and two scaling parameters, the longitudinal or axial scaling parameter s_y (which is related to the speed of US) and the transverse or lateral scaling parameter s_x.
  • The transformation T_OPT←DRO transforms positions relative to the IREDs on the dynamic reference object (DRO) to positions relative to the cameras on the Optotrak localiser.
  • Similarly, the transformation T_OPT←PROBE transforms positions relative to the coordinate system defined by IRED positions on the US probe to the Optotrak coordinate system.
  • The method 50 of the invention calculates the transformation T by altering the six rigid-body parameters which define transformation T_CT←DRO and the probe calibration scaling parameters in the vertical (axial or y) US image direction, s_y, and the horizontal (lateral or x) direction, s_x, in order to optimise the value of a similarity measure between the US and CT images.
  • The rigid-body parameters, or the equivalent transformation T_CT←DRO, can be used to determine the position and orientation of any tracked and calibrated object (such as a surgical drill) in the CT scan, and so relate the intraoperative positions of such instruments to a preoperative plan described with reference to the CT image 2.
  • The method 50 is an intensity based method and so does not require segmentation of the US images or CT volume. Further, the method includes optimising at least one US scaling parameter in order to improve the registration accuracy.
  • The US and CT images are converted into probabilistic images using probability density functions that have previously been created using US and CT training data 54 which was manually segmented.
  • In the probabilistic images, the pixel or voxel intensity is proportional to the probability of the pixel or voxel containing a bone to soft tissue interface.
  • A current estimate of the six rigid body parameters for T_CT←DRO and the two ultrasound scaling parameters is used to reslice the CT probabilistic image in the plane of each US slice.
  • The similarity of the pixel values in the reformatted CT slices and the US slices is then assessed.
  • The similarity measure for the method is calculated as follows.
  • The current estimate of the rigid-body parameters, s_y and s_x, is used to reslice the CT probability image in the plane of each US slice.
  • The pixel values in these reformatted slices and in the US probability images are then compared using a normalised cross-correlation similarity measure (also known as linear correlation, or "Pearson's r").
  • The rigid body parameters and US parameters are incremented or decremented and the similarity measure is re-calculated.
  • The current estimate of the rigid body parameters and the US parameters is then changed at step 62 to values that improve the similarity measure.
  • It is then determined, as illustrated by return loop 64, whether further changing the parameters will increase the similarity measure, i.e. improve the registration. If so, processing returns to step 58 in which the similarity measure is again calculated with the CT slices reformatted using the current estimate of the rigid body and US scaling parameters.
  • Otherwise, loop 64 stops, the method ends, and data 66 is output indicating the values for the six rigid body transformation parameters and the values for the two US scaling parameters corresponding to the maximum similarity measure. Hence, registration has been achieved.
  • Figure 6 shows a flow chart illustrating a method 70 for creating a probability density function from training data in the form of a set of CT images, and which corresponds generally to a part of step 101 of Figure 1.
  • The method 70 allows a function to be created which can convert the pre-operatively captured CT images of the patient into probabilistic images.
  • A set of CT images is captured as training data.
  • An initial CT image is selected at step 74 and at step 76 the bone is segmented. This can be carried out semi-automatically, for example, by delineating the bone boundary in each transverse slice using the software package Analyze 6.0 as provided by the Mayo Foundation, Rochester, MN.
  • A binary image of the bone is formed by setting voxels outside the delineated boundary to zero and voxels within the boundary to one.
  • All those voxels lying on a bone to soft tissue boundary, S_edgeCT, are identified. This can be done by applying vertical and horizontal Sobel gradient operators to each slice of the binary volume.
  • All the voxels in the training data, S_CT, are also identified.
  • A next CT image from the training data is then processed in the same way, as indicated by loop 84, until all the training data CT images have been processed.
  • The probability density function p_CT(a,b) is then determined using:

        p_CT(a,b) = C({x ∈ S_edgeCT : f_CT1(x) = a, f_CT2(x) = b}) / C({x ∈ S_CT : f_CT1(x) = a, f_CT2(x) = b})

  • The function C calculates the cardinality (i.e. the number of elements) of the set.
  • The first CT feature, f_CT1(x), is the intensity of a gradient image G_2D(x).
  • The gradient image is set to be the magnitude of the vertical and horizontal gradient images calculated by convolving each 2D CT slice I_2D(x) with 3 × 3 vertical and horizontal Sobel operators, as sketched below.
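  • By way of illustration only, the following minimal Python sketch computes this Sobel gradient-magnitude feature for a 2D CT slice; the function and variable names are illustrative assumptions, not taken from the patent:

      import numpy as np
      from scipy.ndimage import convolve

      SOBEL_X = np.array([[-1, 0, 1],
                          [-2, 0, 2],
                          [-1, 0, 1]], dtype=float)  # horizontal gradient kernel
      SOBEL_Y = SOBEL_X.T                             # vertical gradient kernel

      def gradient_feature(ct_slice):
          """f_CT1: magnitude of the 3 x 3 Sobel gradients of a 2D CT slice."""
          gx = convolve(ct_slice.astype(float), SOBEL_X, mode="nearest")
          gy = convolve(ct_slice.astype(float), SOBEL_Y, mode="nearest")
          return np.hypot(gx, gy)  # G_2D(x) = sqrt(gx^2 + gy^2)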
  • Figure 7 shows a flow chart illustrating a method 90 for creating a probability density function from training data in the form of a set of US images, and which corresponds generally to a part of step 101 of Figure 1.
  • The method 90 allows a function to be created which can convert the intra-operatively obtained US images of the patient into probabilistic images.
  • Artifact removal uses the fact that most strong reflections cause a loss of signal intensity along the direction of the beam.
  • Artifact removal 94 begins by starting at the bottom of each column of pixels in the US image and moving upward toward the transducer face. The pixels of the image are labelled as artifact until a threshold value of pixel intensity, T_art, is reached. For example, a threshold value of 40, in arbitrary intensity units, can be used. Pixels in a small region immediately adjacent to the transducer face are also labelled as artifacts, for example those pixels in a region within approximately 3 mm of the transducer face. This helps to remove artifacts caused by imperfect coupling at the skin surface. Image regions labelled as artifacts are not used in subsequent image processing. This step effectively produces a mask so that only pixels in the relevant region of the image are further processed, as sketched below.
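  • A minimal sketch of this column-wise masking follows, assuming the image is stored with the transducer face at row 0; the threshold default matches the example value above, while the near-field row count stands in for the approximately 3 mm region and is an assumption:

      import numpy as np

      def artifact_mask(us_image, t_art=40.0, near_field_rows=10):
          """Return a boolean mask that is True for pixels kept for processing.

          us_image: 2D array, row 0 at the transducer face, beam along columns.
          t_art: intensity threshold; deeper pixels below it are artifact.
          near_field_rows: rows next to the transducer face (~3 mm) to discard.
          """
          keep = np.ones(us_image.shape, dtype=bool)
          n_rows, n_cols = us_image.shape
          for col in range(n_cols):
              # Walk up from the bottom of the column towards the transducer,
              # labelling pixels as artifact until the threshold is reached.
              for row in range(n_rows - 1, -1, -1):
                  if us_image[row, col] >= t_art:
                      break
                  keep[row, col] = False
          keep[:near_field_rows, :] = False  # imperfect skin-coupling region
          return keep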
  • A first US image is selected at step 96 and a number of points are manually picked along the centre of the bone edge at step 98. Then the points are connected by straight lines at step 100. Then at step 102, the closest pixel in each column of pixels to the line defining the bone edge is identified. These pixels define the set S_edgeUS, being the pixels defining the bone to soft tissue boundary in the US image. The pixel set S_US is also determined, being all of the non-artifact pixels. Then a next US image from the training set is selected at step 104 and processing loops, as indicated by line 106, until S_edgeUS and S_US have been determined for all the training data.
  • The probability density function for US images, p_US, is then calculated at step 108, using:

        p_US(a,b) = C({x ∈ S_edgeUS : f_US1(x) = a, f_US2(x) = b}) / C({x ∈ S_US : f_US1(x) = a, f_US2(x) = b})

  • The US probability density function p_US is also based on two image features.
  • The first US feature, f_US1(x), is the intensity of the US image I_US(x).
  • The second US feature, f_US2(x), is the number of pixels between x and the bottom of the image which were not labelled as an artifact. Both of these features help to highlight bone edges due to the large change in acoustic impedance between soft tissue and bone, which results in the high US intensity values at the boundary (detected by f_US1(x)) and a bone shadow region of very low US intensities (detected by f_US2(x)). A sketch of how such a function could be estimated from the training data follows below.
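  • As an illustration of how such a probability density function could be estimated from the manually segmented training pixels, the following Python sketch bins the two features into a joint histogram; the binning scheme and names are assumptions, not details from the patent:

      import numpy as np

      def estimate_pdf(features, edge_labels, bins=64):
          """Estimate p(a, b) = P(edge | f1 = a, f2 = b) from training pixels.

          features: (N, 2) array of (f1, f2) values for all non-artifact pixels.
          edge_labels: (N,) boolean array, True for pixels on the bone edge.
          Returns the binned PDF and the bin edges for later lookup.
          """
          # Joint histogram over all pixels (the set S in the denominator).
          denom, ex, ey = np.histogram2d(features[:, 0], features[:, 1], bins=bins)
          # Joint histogram over edge pixels only (the set S_edge in the numerator).
          num, _, _ = np.histogram2d(features[edge_labels, 0],
                                     features[edge_labels, 1], bins=[ex, ey])
          with np.errstate(invalid="ignore", divide="ignore"):
              pdf = np.where(denom > 0, num / denom, 0.0)
          return pdf, (ex, ey)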
  • Figure 8 shows a process flow chart illustrating a further embodiment of the invention corresponding generally to step 106 of Figure 1.
  • The registration method 110 is generally similar to that shown in Figure 5 above.
  • The computer implemented registration method 110 begins at step 112, where artifacts are removed from each US image slice through the patient in the same manner as described above in connection with creation of the US probability density function.
  • The non-artifact part of each US image slice is converted from an intensity image I(x) into a corresponding probabilistic image P_US(x), where x represents the image position and P_US(x) represents the probability of the pixel containing a bone to soft tissue interface.
  • The CT image is also converted into a probabilistic image in a similar manner.
  • The CT images are converted from an intensity image I(x) into a corresponding probabilistic image P_CT(x), where x represents the image position and P_CT(x) represents the probability of the voxel containing a bone to soft tissue interface.
  • The probability images are calculated using the CT probability density function p_CT which, given a set of image features F_CT(x), returns an estimate of the probability that position x is a bone edge, i.e.

        P_CT(x) = p_CT(F_CT(x))
  • A current estimate of the six rigid body parameters and s_y is used to reslice the CT probability image in the plane of each US slice, as sketched below.
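  • A minimal sketch of this reslicing step is given below, assuming a single 4 × 4 matrix mapping homogeneous US pixel coordinates to CT voxel coordinates; this matrix, which would bundle the rigid-body estimate, the scaling parameters and the calibration chain, and the helper names are illustrative assumptions:

      import numpy as np
      from scipy.ndimage import map_coordinates

      def reslice_ct(ct_prob, t_us_to_ct_vox, shape):
          """Sample the CT probability volume in the plane of one US slice.

          ct_prob: 3D CT probability volume indexed [z, y, x].
          t_us_to_ct_vox: 4x4 matrix mapping homogeneous US pixel coordinates
              (u, v, 0, 1) to CT voxel coordinates (x, y, z, 1).
          shape: (rows, cols) of the US image.
          """
          v, u = np.mgrid[0:shape[0], 0:shape[1]]
          pix = np.stack([u.ravel(), v.ravel(),
                          np.zeros(u.size), np.ones(u.size)])  # homogeneous
          vox = t_us_to_ct_vox @ pix                            # CT voxel coords
          # map_coordinates expects coordinates in (z, y, x) index order.
          resliced = map_coordinates(ct_prob, vox[2::-1], order=1, mode="constant")
          return resliced.reshape(shape)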
  • The method includes three different sets of optimisation parameters which are used in three iterations of the general method.
  • The US scaling parameter s_y is only used in the final iteration of the method.
  • The similarity of the resliced CT data and US data is compared at step 120.
  • The pixel values in the reformatted CT slices and in the US probability images are compared using the normalised cross-correlation similarity measure, also known as linear correlation, or Pearson's r.
  • The method uses a three stage optimisation approach: a low resolution stage, a high resolution stage, and then a further high resolution stage which also includes optimisation of s_y.
  • The following table summarises the optimisation parameters used for each stage.
  • The blurring and subsampling apply to the CT volume, and σ denotes the standard deviation of a Gaussian blurring kernel.
  • The step sizes indicate the maximum and minimum step sizes, and the percentage change for s_y, used in the method at each stage of the optimisation of the registration.
  • The US image resolution is not altered at each stage, but it is reduced by a factor of four prior to registration by Gaussian blurring and subsampling to give pixel sizes of 0.486 × 0.486 mm², as sketched below.
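  • For illustration, a blur-then-subsample resolution reduction of this kind could look as follows; the σ value here is an assumption, since the kernel widths belong to the table of stage parameters:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def downsample(image, factor=4, sigma=2.0):
          """Reduce resolution by Gaussian blurring followed by subsampling.

          sigma is illustrative: it should suppress frequencies above the new
          Nyquist limit before every `factor`-th pixel is kept.
          """
          blurred = gaussian_filter(np.asarray(image, dtype=float), sigma=sigma)
          return blurred[::factor, ::factor]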
  • At step 122 it is determined whether the similarity has been maximised, that is, whether the fit has been optimised. Initially the fit will not be optimised, and so processing proceeds to step 124, at which at least one or some of the parameters are varied, and processing returns to steps 118 and 120 where the similarity value is calculated again. The method then updates the current optimum value of the registration parameters by moving in the direction which produced the greatest improvement in the similarity measure. Processing continues to loop until no improvement in the similarity measure for the optimum parameters is found.
  • Processing then proceeds to step 126, at which, for the current parameter set, the parameter step size is reduced by a factor of two. Processing then returns to step 118 and the preceding steps are repeated as above. When no improvement in similarity occurs, then again at step 126 the parameter step size is reduced by a factor of two and processing returns to step 118. Processing proceeds in this way until the minimum step size has been reached for the parameter set (in the first stage, the low resolution parameters in the table).
  • When the minimum step size has been reached at step 130, processing proceeds to step 132, at which the next set of optimisation parameters is used in a second optimisation stage which has a higher resolution than the first. Steps 118 to 132 are repeated as indicated above until finally a third optimisation is carried out using a third set of optimisation parameters in which the value of s_y is also optimised.
  • The output of the method is the six rigid body parameters and the US scaling parameter used to register the CT data in the reference frame of the tracking system.
  • Figure 9 illustrates a typical computer system that, when appropriately configured or designed, can serve as the data processing apparatus or computer of the CAS system according to the invention.
  • The data processing apparatus or computer 500 includes any number of processors 502 (also referred to as central processing units, or CPUs) that are coupled to storage devices including primary storage 506 (typically a random access memory, or RAM) and primary storage 504 (typically a read only memory, or ROM).
  • CPU 502 may be of various types including microcontrollers and microprocessors such as programmable devices (e.g., CPLDs and FPGAs) and unprogrammable devices such as gate array ASICs or general purpose microprocessors.
  • Primary storage 504 acts to transfer data and instructions uni-directionally to the CPU and primary storage 506 is used typically to transfer data and instructions in a bi-directional manner.
  • Mass storage device 508 is also coupled bi-directionally to CPU 502 and provides additional data storage capacity and may include any of the computer-readable media described above. Mass storage device 508 may be used to store programs, data and the like and is typically a secondary storage medium such as a hard disk. It will be appreciated that the information retained within the mass storage device 508, may, in appropriate cases, be incorporated in standard fashion as part of primary storage 506 as virtual memory. A specific mass storage device such as a CD-ROM 514 may also pass data uni-directionally to the CPU.
  • CPU 502 is also coupled to an interface 510 that connects to one or more input/output devices such as video monitors, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, or other well-known input devices such as, of course, other computers.
  • CPU 502 optionally may be coupled to an external device such as a database or a computer or telecommunications network using an external connection as shown generally at 512. With such a connection, it is contemplated that the CPU might receive information from the network, or might output information to the network in the course of performing the method steps described herein.
  • The present invention has a much broader range of applicability.
  • Aspects of the present invention are not limited to any particular kind of orthopaedic procedure and can be applied to virtually any method in which US images are sufficient to allow the anatomical structure of interest to be distinguished.
  • The techniques of the present invention could be used to register non-rigid anatomical structures, i.e. non-bony structures, such as organs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Processing (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Methods, apparatus and computer program code for registering a 3D ultrasound image with a 3D representation of a body part are described. The fit of the intensity values in the ultrasound image to the representation is optimised using at least one ultrasound scaling parameter as well as position and orientation parameters. The fit of the intensity can be optimised using at least two separate stages. The image and the representation can be converted into probabilistic images where the intensity of the pixels is proportional to the probability of that pixel containing a bone-to-soft tissue interface.

Description

3D ULTRASOUND REGISTRATION
The present invention relates to a computer implemented method of registering a 3D ultrasound image to another 3D image of a body part.
Ultrasound imaging techniques can be used to generate 3D images of body parts. Known techniques involve tracking the location and orientation of an ultrasound transducer while a patient is being scanned. Information on the location and shape of the patient's body parts can be derived from the image data and the data from tracking the transducer. Tracking techniques which can be used in relation to the transducer are well known.
The 3D image of a body part (for example a bone) can be created by extracting points from the scanned images that lie on the surface of the body part and calculating their positions relative to a reference coordinate system. The bone surface is then represented in the coordinate system by a sparsely sampled collection of surface points that are derived from the ultrasound scan data. The surface points might be registered to a preoperative model using a standard point to surface matching algorithm. The transformation which is calculated in this way provides an estimate of the physical to model transformation.
Localisation errors can arise from ultrasonic effects which can limit the accuracy of the registration of scan data to an image of patient tissue. Such errors can arise due to assumptions which are made in relation to ultrasound imaging techniques, for example concerning the speed with which sound travels through tissue, the path taken by the sound waves between the transmitter, the tissue and the receiver, and the effect of beam thickness. These errors can restrict the accuracy with which images can be created, and prevent the use of ultrasound techniques for image creation for applications such as preoperative planning in orthopaedic and other procedures.
The present invention provides a technique for registering an ultrasound image with a representation of a body part by optimising the fit of intensity values in the ultrasound image. Accordingly, in one aspect, the invention provides a computer implemented method for registering a 3D ultrasound image with a 3D representation of a body part, the method comprising optimising the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
Advantages arise from the method of the invention in that, by optimising the fit of intensity values, the need to perform a segmentation step on scan data is avoided. This can be a time consuming step in the analysis of scan data. The elimination of a segmentation step can allow scan data to be analysed in preparation for a procedure more quickly, and with less user intervention than has been required previously.
The method of the invention should preferably perform the registration between ultrasound image data and a representation of the volume of the body part. This can be contrasted with existing techniques in which a registration is performed with a representation of the surface of a body part. A further advantage of the method of the present invention when compared with existing techniques based on surface analysis is that it can provide enhanced accuracy.
Preferably, the method of the invention involves altering the six rigid body parameters (location and orientation data in six degrees of freedom) and the or each relevant ultrasound scaling parameter so as to optimise the value of a similarity measure between the ultrasound image and the representation of the body part. Preferably, the image and the representation are then converted into probabilistic images where the intensity of the pixels is proportional to the probability of that pixel containing a bone-to-soft tissue interface. The conversion to probabilistic images can be carried out using probability density functions. These functions map two features extracted from the image and the representation (such as the local image gradient) into a probability. The functions are calculated by a process which only needs to be carried out once. This process requires sets of training data relating to the ultrasound image and the representation, in which the bone-to-soft tissue interface has been manually segmented. The probability density function used for converting the ultrasound images can be based on at least one function intended to highlight bone edges in the ultrasound images. For example, the function can be intended to identify image features indicative of a bone to soft tissue boundary or a bone shadow region of low US intensity. More than one image feature can be used and preferably at least two image features are used. More than two image features can be used.
The probability density function used for converting the representation images can be based on at least one function intended to highlight edges in the representation. The probability density function used for converting the representation images can also be based on a function intended to identify different types of boundaries in the representation, preferably bone to soft tissue boundaries and skin to air boundaries. More than one image feature can be used and preferably at least two image features are used. More than two image features can be used.
Ultrasound scaling parameters can relate to, for example, the speed with which sound travels through tissue, the path taken by the sound waves between the transmitter, the tissue and the receiver, the effect of beam thickness, and ultrasound image scaling (for example in mm·pixel⁻¹) in the lateral and axial directions relative to the ultrasound beam.
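To illustrate why the axial scaling parameter is tied to the assumed speed of sound, consider the sketch below; the 1540 m/s figure is the conventional soft-tissue assumption used by scanners, not a value stated in this document, and the names are illustrative:

    # Scanners convert echo time to depth assuming a nominal speed of sound,
    # so the axial mm-per-pixel scale is proportional to that assumed speed.
    C_ASSUMED = 1540.0  # m/s, conventional soft-tissue value (assumption)

    def corrected_axial_scale(s_y_nominal, c_actual):
        """Rescale the axial mm/pixel value for a different true speed of sound."""
        return s_y_nominal * (c_actual / C_ASSUMED)

    # Example: if sound actually travels at 1580 m/s, true depths are ~2.6% larger.
    print(corrected_axial_scale(0.1, 1580.0))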
Preferably, the ultrasound image data is subjected to an artefact removal step, to remove regions of the ultrasound image that are caused by artefacts such as shadowing behind boundaries between different body tissue components.
The similarity measure can be calculated as follows. An estimate of the rigid body parameters and the or each relevant ultrasound scaling parameter is used to reslice the probabilistic image derived from the representation in the plane of each ultrasound slice. The pixel values in these reformatted slices and in the probabilistic image derived from the ultrasound images are then compared using the normalised cross-correlation similarity measure.
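A minimal sketch of this normalised cross-correlation (Pearson's r) over the non-artifact pixels of one slice is shown below; the function and argument names are illustrative:

    import numpy as np

    def ncc(us_prob, ct_resliced, mask):
        """Pearson's r between the US and resliced CT probability pixels."""
        a = us_prob[mask].astype(float)
        b = ct_resliced[mask].astype(float)
        a -= a.mean()
        b -= b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0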
The algorithm proceeds by altering each of the rigid body parameters and the or each relevant ultrasound scaling parameter by a given step size, and calculating the value of the similarity measure. The algorithm then moves the current estimate of the solution in the direction which produces the greatest improvement in the similarity measure.
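The following sketch shows one way such a step-size hill climb could be structured; the parameter vector, step schedule, stopping scale and similarity callback are illustrative stand-ins, not the patent's exact procedure:

    import numpy as np

    def hill_climb(params, steps, similarity, min_scale=0.125):
        """Greedy search: take the best +/- single-parameter step; halve the
        step sizes whenever no candidate move improves the similarity."""
        params = np.asarray(params, dtype=float).copy()
        steps = np.asarray(steps, dtype=float)
        best = similarity(params)
        scale = 1.0
        while scale >= min_scale:
            candidates = []
            for i in range(params.size):
                for sign in (+1.0, -1.0):
                    trial = params.copy()
                    trial[i] += sign * steps[i] * scale
                    candidates.append((similarity(trial), trial))
            top_value, top_params = max(candidates, key=lambda c: c[0])
            if top_value > best:
                best, params = top_value, top_params  # greatest improvement
            else:
                scale *= 0.5  # no improvement: halve the step size
        return params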
Preferably the fit of the intensity is optimised using at least two separate stages, and more preferably at least three separate stages. More than three stages can be used.
The or each optimisation stage can include altering a step size between parameters used during the optimisation stage. The step size can be reduced, and is preferably halved, between iterations of the optimisation during the or each optimisation stage.
The ultrasound scaling parameter or parameters can be used to optimise the fit in less than all of the stages. Preferably, the ultrasound scaling parameters are used to optimise the fit in only one stage, and more preferably in only the last stage.
The optimisation stages can have different resolutions. Preferably the first stage has a lower resolution than subsequent stages. The lower resolution stage or stages can include blurring the representation data.
The 3D representation of the body part can be derived using scanning techniques such as magnetic resonance, X-rays etc. The use of CT scanning techniques is particularly preferred, especially when the method of the invention is practised in relation to a body part which comprises bone tissue.
Examples of surgical procedures to which the technique of the present invention is applicable include orthopaedic joint procedures (for example replacement of hip, knee, shoulder, ankle and elbow joints), peri-acetabular osteotomy, tibial osteotomy, distal radius osteotomy, anterior cruciate ligament reconstruction, osteoid osteoma excision, bone tumour resection, spinal procedures (for example in the placement of pedicle screws), and fracture surgery.
According to a further aspect of the invention, there is provided a data processing apparatus for registering a 3D ultrasound image with a 3D representation of a body part, the data processing apparatus including a data processor and a storage device storing computer program instructions which can configure the data processor to optimise the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
Preferred features of the method indicated above are also preferred counterpart features of the apparatus aspect of the invention.
Generally, embodiments of the present invention employ various processes involving data stored in or transferred through one or more computer systems. Embodiments of the present invention also relate to an apparatus for performing these operations. This apparatus may be specially constructed for the required purposes, or it may be a general- purpose computer selectively activated or reconfigured by a computer program and/or data structure stored in the computer. The processes presented herein are not inherently related to any particular computer or other apparatus. In particular, various general-purpose machines may be used with programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required method steps. A particular structure for a variety of these machines will appear from the description given below.
According to further aspects of the invention, there are provided computer program code for carrying out the method of the invention, and any preferred features, and a computer readable medium bearing such computer program code.
In addition, embodiments of the present invention relate to computer readable media or computer program products that include program instructions and/or data (including data structures) for performing various computer-implemented operations. Examples of computer-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media; semiconductor memory devices, and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). The data and program instructions of this invention may also be embodied on a carrier wave or other transport medium. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
Embodiments of the invention will now be described in detail, by way of example only, and with reference to the accompanying drawings, in which:
Figure 1 shows a flow chart illustrating activities relating to a computer assisted surgical procedure in which the invention can be used;
Figure 2 shows an ultrasound probe which can be used to capture ultrasound images used in the invention;
Figure 3 shows a schematic illustration of a surgical setup which can be used to allow the invention to be carried out; Figure 4 shows a schematic block diagram of apparatus including the probe shown in Figure 2 and set up as shown in Figure 3 including data processing apparatus according to the invention;
Figure 5 shows a process flow chart illustrating a first embodiment of a registration method of the invention; Figure 6 shows a process flow chart illustrating a method for creating a probability density function for CT images used in the invention;
Figure 7 shows a process flow chart illustrating a method for creating a probability density function for US images used in the invention;
Figure 8 shows a process flow chart illustrating a second embodiment of the registration method of the invention; and
Figure 9 shows a schematic block diagram of a data processing apparatus according to the invention for carrying out the registration method illustrated in Figures 5 and 8.
Like items in different figures share common reference numerals unless indicated otherwise. With reference to Figure 1, there is shown a flow chart illustrating at a high level various activities related to a surgical procedure 100 in which the registration method of the present invention can be used. The registration method will be described within the context of a surgical procedure being carried out on a femur but the invention is not limited to that particular application.
In a first pre-operative step 101, probability density functions are created for ultrasound (US) and computed tomography (CT) images, using US and CT images as training data; these functions are used subsequently during the registration method to create probabilistic images. In a probabilistic image, the intensity of any pixel, or voxel, is indicative of the probability of that pixel, or voxel, being at a bone to soft tissue interface. It will be appreciated that the probability density functions only need to be created once and can be reused subsequently. It will also be appreciated that the probability density functions can be created entirely independently of the subsequent activities illustrated in Figure 1 and also in a sequence different to that shown in Figure 1. All that is required is that the probability density functions are available for use in the registration step 106. Step 101 will be described in greater detail below with reference to Figures 6 and 7.
In a further pre-operative step 102, images of the patient's body part, in this instance the femur, are captured by carrying out a CT scan of the femur. The digitized CT scan data is then transferred, e.g. over a network, or otherwise made available to the computer of the computer assisted or aided surgery (CAS) system that will carry out the registration method of the invention.
After imaging step 102, a marker 4 trackable by a tracking system part 1 of the CAS system 10 is attached to the patient's body part to allow the position of the patient's body part 6 in the reference frame of the tracking system to be dynamically determined. This trackable marker will be referred to herein as a Dynamic Reference Object or DRO. This intra-operative step is not illustrated in Figure 1.
Then at step 104 US images 5 of the patient's body part are taken intra-operatively and the US scan data is processed and made available to the computer part 12 of the CAS system 10. Then at step 106 the registration method of the invention is used to register the patient model 2 in the reference frame of the tracking system of the CAS system. Then, once the model 2 of the patient's body part has been registered, the registered patient model can be used to navigate various instruments, implants and other entities during a navigated surgical procedure 108. For example, if a pre-operative surgical plan has been defined relative to the CT model 2, then registration of the CT model will register the surgical plan in the reference frame of the tracking system. The navigated surgical procedure may include various general operations such as planning the surgical procedure or image guided surgical steps with which a person of ordinary skill in the art will be familiar. Subsequent steps have not been described in greater detail so as not to obscure the present invention.
Having given an overview of how the US based registration method of the present invention fits into the overall CAS procedure, the surgical set up and associated apparatus will be described in greater detail with reference to Figures 2, 3 and 4.
Fig. 3 shows a schematic diagram illustrating the set-up used for the registration method, together with the rigid-body transformations (T) that relate the 3D coordinate systems of the various elements indicated. The model of the patient's body part in its frame of reference is illustrated by element 2, and Figure 3 shows the ultrasound probe 8 being used to capture an US image 5 through the patient's femur 6.
The apparatus includes a standard clinical US scanner 14 (such as a Philips ATL HDI-5000 as provided by Philips Medical Systems, Bothell, WA) with a high-frequency linear-array scan-probe 8 (such as an L12-5 probe with a 5-12 MHz broadband transducer) which are used to obtain the US images. The scan-probe can be tracked by a suitable trackable marker 3 such as an array of LEDs, as shown in Figs. 2 and 3. The 3D position of each LED can be measured by the optical localiser 1, part of the tracking system, and these positions used to calculate the position and orientation of the marker 3 relative to the dynamic reference object marker 4. Suitable acquisition software, for example written in C++ using Visual Studio 6 (Microsoft Corp., USA), is used to synchronize US image capture with tracking measurements. US images are captured using an analogue-to-digital converter 16 (such as a Canopus ADVC-100, as provided by Canopus UK, Berkshire, UK) connected between the composite video output of the US scanner 14 and the computer 12 of the CAS system. Computer 12 includes or has access to a mass storage device 18 which can store various data required by the CAS system, including the US image data and the pre-operatively captured CT data or other data defining the model 2. Although elements are shown as separate in Figure 4 for the sake of clarity of explanation, it will be appreciated that the elements can be integrated into a single custom unit.
During US scanning, the scan-probe 8 can be covered with a plastic sheath containing a small quantity of US gel to maintain acoustic coupling. In order to minimize motion artefacts and ensure that captured images are closely synchronized with the tracking data, the scan-probe should be swept across the skin surface slowly during step 104.
The rigid-body calibration transformation for the 3D US system, indicated as T_PROBE←US in Fig. 2, can be determined using a point-target phantom, as is known in the art and as described, for example, in "Accuracy of an Electromagnetic Three-Dimensional Ultrasound System for Carotid Artery Imaging", Ultrasound Med Biol, Vol. 27, pp. 1421-1425, 2001, which is incorporated herein by reference for all purposes. The calibration procedure involves obtaining tracked images of a pinhead immersed in a water-glycerol solution from many different views. A plurality of images captured from different directions is used to calculate the calibration transformation. The coordinates of the pinhead in each image are identified manually using custom-written software, and the pixel scaling can be determined using electronic calliper measurements provided by the US scanner.
As illustrated by step 104, tracked US images of the surface of the femur are acquired from many different views of the anterior and posterior femoral shaft, the greater trochanter, the femoral neck, and the epicondyles. US images of the pelvis can also be captured if the pelvis is to be registered instead of the femur. US images can be acquired in 2D compounding mode ('SonoCT™' with the level of compounding set to 'Survey') with a maximum penetration depth of 6 cm and a single focal zone. Before images are acquired, the level of the focal zone can be adjusted to correspond approximately to the average depth of the bone surface in order to maximise the image resolution at that depth. In compound mode, a number of images are formed by electronically steering the US beam through several preset angles. These images are then combined to produce a single compounded image. Compounding generally reduces speckle artefacts, but has also been found to improve the echo intensity from regions of a curved bone surface that are not perpendicular to the axial direction of the image.
The optical localiser 1 (such as an Optotrak 3020, as provided by Northern Digital Inc., Ontario, Canada) is used to track the US probe 8 and the dynamic reference object (DRO) 4, which is rigidly attached to the femur 6. The aim is to calculate the registration transformation, T_CT←DRO, which transforms physical positions in the coordinate system or reference frame of the DRO into voxel positions within the preoperative CT volume 2.
To calculate T_CT←DRO, 3D freehand US images of the bone surface are acquired at step 104. These images are then registered to the preoperative CT volume 2 to obtain a transformation, T^i, which maps pixel positions, x_US, in the i-th US image to voxel positions, x, within the CT volume 2. The transformation T^i is computed from four separate transformations:

$$T^i = T_{CT \leftarrow DRO}\,\left(T^i_{OPT \leftarrow DRO}\right)^{-1}\,T^i_{OPT \leftarrow PROBE}\,T_{PROBE \leftarrow US}$$
The calibration matrix, T_PROBE←US, which transforms positions in the US image to positions relative to the infrared-emitting diodes (IREDs) attached to the probe, can be calculated as described above by scanning a pin-head in a water-glycerol solution. This calibration comprises eight parameters: six rigid-body parameters (three translations and three rotations) and two scaling parameters, namely the longitudinal or axial scaling parameter s_y (which is related to the speed of ultrasound in tissue) and the transverse or lateral scaling parameter s_x. The transformation T^i_OPT←DRO transforms positions relative to the IREDs on the dynamic reference object (DRO) to positions relative to the cameras of the Optotrak localiser. Similarly, the transformation T^i_OPT←PROBE transforms positions relative to the coordinate system defined by the IRED positions on the US probe to the Optotrak coordinate system.
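By way of illustration, this transformation chain can be composed as a product of 4 × 4 homogeneous matrices. The following is a minimal sketch in Python with NumPy (the description does not prescribe a language or library); the function names, the z·y·x rotation order, and the placement of the two scale factors within the calibration matrix are illustrative assumptions rather than the method as specified.

```python
import numpy as np

def rigid_body_matrix(tx, ty, tz, rx, ry, rz):
    """4x4 homogeneous matrix from three translations (mm) and three
    rotations (radians), composed here in z*y*x order (an assumption)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [tx, ty, tz]
    return T

def probe_calibration_matrix(rigid_params, s_x, s_y):
    """T_PROBE<-US: six rigid-body parameters plus the lateral (s_x) and
    axial (s_y) scale factors, in mm per pixel, applied to US pixels."""
    scale = np.diag([s_x, s_y, 1.0, 1.0])
    return rigid_body_matrix(*rigid_params) @ scale

def us_pixel_to_ct(x_us, T_ct_dro, T_opt_dro, T_opt_probe, T_probe_us):
    """Map a homogeneous US pixel (u, v, 0, 1) to a CT voxel position via
    T = T_CT<-DRO . (T_OPT<-DRO)^-1 . T_OPT<-PROBE . T_PROBE<-US."""
    T = T_ct_dro @ np.linalg.inv(T_opt_dro) @ T_opt_probe @ T_probe_us
    return T @ np.asarray(x_us, dtype=float)
```

Applying the composed matrix to the homogeneous pixel position (u, v, 0, 1) of a US pixel then yields the corresponding CT voxel position.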
An embodiment of the method of the invention is shown by way of example in the flow chart in Figure 5. An overview of an embodiment of the method of the invention will be described with reference to Figure 5, and a further embodiment of the invention will be described in greater detail with reference to Figures 6 to 8.
As illustrated in Figure 5, the method 50 of the invention calculates the transformation T by altering the six rigid-body parameters which define the transformation T_CT←DRO, together with the probe calibration scaling parameters in the vertical (axial or y) US image direction, s_y, and the horizontal (lateral or x) direction, s_x, in order to optimise the value of a similarity measure between the US and CT images. After optimisation, the rigid-body parameters (or the equivalent transformation, i.e. T_CT←DRO) can be used to determine the position and orientation of any tracked and calibrated object (such as a surgical drill) in the CT scan, and so relate the intra-operative positions of such instruments to a pre-operative plan described with reference to the CT image 2.
The method 50 is an intensity based method and so does not require segmentation of the US images or CT volume. Further, the method includes optimising at least one US scaling parameter in order to improve the registration accuracy.
At step 52, the US and CT images are converted into probabilistic images using probability density functions that have previously been created from manually segmented US and CT training data 54. In the probabilistic images created at step 52, the pixel or voxel intensity is proportional to the probability of that pixel or voxel containing a bone to soft tissue interface. At step 56, a current estimate of the six rigid body parameters for T_CT←DRO and the two ultrasound scaling parameters is used to reslice the CT probabilistic image in the plane of each US slice.
Then at step 58, the similarity of the pixel values in the reformatted CT slices and the US slices is assessed. The similarity measure is calculated as follows. The current estimate of the rigid-body parameters, s_y and s_x, is used to reslice the CT probability image in the plane of each US slice, and the pixel values in these reformatted slices and in the US probability images are then compared using a normalised cross-correlation similarity measure (also known as linear correlation, or "Pearson's r"). At step 60, the rigid body parameters and US parameters are incremented or decremented and the similarity measure is re-calculated. The current estimate of the rigid body parameters and the US parameters is then changed at step 62 to the values that improve the similarity measure. It is then determined, as illustrated by return loop 64, whether further changing the parameters will increase the similarity measure, i.e. improve the registration. If so, then processing returns to step 58, in which the similarity measure is again calculated with the CT slices reformatted using the current estimate of the rigid body and US scaling parameters.
When it is determined that the similarity measure is no longer increasing, the best fit has been found: loop 64 stops, the method ends, and data 66 is output indicating the values of the six rigid body transformation parameters and the two US scaling parameters corresponding to the maximum similarity measure. Registration has then been achieved.
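As a concrete illustration of the similarity measure, the following is a minimal sketch of normalised cross-correlation (Pearson's r) between one US probability slice and the corresponding reformatted CT probability slice, assuming Python with NumPy; the restriction to non-artifact pixels and the pooling of per-slice values by their mean are assumptions consistent with, but not mandated by, the description above.

```python
import numpy as np

def normalised_cross_correlation(us_prob, ct_prob, mask):
    """Pearson's r between a US probability slice and the reformatted CT
    probability slice, evaluated over non-artifact pixels only."""
    a = us_prob[mask].astype(np.float64)
    b = ct_prob[mask].astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def total_similarity(us_slices, ct_slices, masks):
    """Combine per-slice values over all tracked US slices (the mean is
    an assumed pooling; the description does not fix one)."""
    return float(np.mean([normalised_cross_correlation(u, c, m)
                          for u, c, m in zip(us_slices, ct_slices, masks)]))
```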
The invention will now be described in greater detail with reference to Figures 6 to 8.
Figure 6 shows a flow chart illustrating a method 70 for creating a probability density function from training data in the form of a set of CT images, and which corresponds generally to a part of step 101 of Figure 1. The method 70 allows a function to be created which can convert the pre-operatively captured CT images of the patient into probabilistic images. At step 72 a set of CT images is captured as training data. An initial CT image is selected at step 74 and at step 76 the bone is segmented. This can be carried out semi-automatically, for example by delineating the bone boundary in each transverse slice using the software package Analyze 6.0 as provided by the Mayo Foundation, Rochester, MN. Then at step 78 a binary image of the bone is formed by setting voxels outside the delineated boundary to zero and voxels within the boundary to one. Then at step 80, all those voxels lying on a bone to soft tissue boundary, S_edgeCT, are identified. This can be done by applying vertical and horizontal Sobel gradient operators to each slice of the binary volume. All the voxels in the training data, S_CT, are also identified. A next CT image from the training data is then processed in the same way, as indicated by loop 84, until all the training data CT images have been processed. Then at step 86 the probability density function p_CT(a, b) is determined using:
$$p_{CT}(a,b) = \frac{C(\{x \in S_{edgeCT} \mid f_{CT1}(x) = a,\; f_{CT2}(x) = b\})}{C(\{x \in S_{CT} \mid f_{CT1}(x) = a,\; f_{CT2}(x) = b\})}$$

where the function C calculates the cardinality (i.e. the number of elements) of the set.
In p_CT, two image features are used. Both CT features are calculated from two-dimensional image slices by applying operators to each CT slice in turn. The first CT feature, f_CT1(x), is the intensity of a gradient image G_2D(x). The gradient image is set to be the magnitude of the vertical and horizontal gradient images calculated by convolving each 2D CT slice I_2D(x) with 3 × 3 vertical and horizontal Sobel operators. The second CT feature, f_CT2(x), is set equal to the maximum value under a 3 × 3 mask centred on pixel x. The effect of the first feature is to highlight edges within the CT volume, while the second feature helps distinguish between bone to soft tissue edges and skin to air edges.
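The two CT features and the density estimate of the preceding paragraphs might be sketched as follows, assuming Python with NumPy and SciPy (none of which the description prescribes); the bin count and the quantisation of the feature values onto a finite grid are illustrative choices needed to make the cardinality counts computable.

```python
import numpy as np
from scipy import ndimage

def ct_features(ct_slice):
    """f_CT1: magnitude of 3x3 Sobel gradients of the 2D slice;
    f_CT2: maximum intensity under a 3x3 mask centred on each pixel."""
    s = ct_slice.astype(np.float64)
    f1 = np.hypot(ndimage.sobel(s, axis=0), ndimage.sobel(s, axis=1))
    f2 = ndimage.maximum_filter(s, size=3)
    return f1, f2

def joint_pdf(f1, f2, edge_mask, n_bins=64):
    """p(a, b): fraction of voxels with quantised feature pair (a, b)
    that lie on the segmented bone edge (edge_mask), over all voxels
    pooled from the training set. n_bins is an illustrative choice."""
    a = np.clip(np.digitize(f1.ravel(),
                            np.linspace(f1.min(), f1.max(), n_bins + 1)) - 1,
                0, n_bins - 1)
    b = np.clip(np.digitize(f2.ravel(),
                            np.linspace(f2.min(), f2.max(), n_bins + 1)) - 1,
                0, n_bins - 1)
    num = np.zeros((n_bins, n_bins))
    den = np.zeros((n_bins, n_bins))
    np.add.at(den, (a, b), 1.0)
    np.add.at(num, (a, b), edge_mask.ravel().astype(np.float64))
    return np.divide(num, den, out=np.zeros_like(num), where=den > 0)
```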
Figure 7 shows a flow chart illustrating a method 90 for creating a probability density function from training data in the form of a set of US images, and which corresponds generally to a part of step 101 of Figure 1. The method 90 allows a function to be created which can convert the intra-operatively obtained US images of the patient into probabilistic images.
Initially US image data is collected 92 to provide a training set. The captured images are then processed to remove artifacts. Artifact removal exploits the fact that most strong reflections cause a loss of signal intensity along the direction of the beam. Artifact removal 94 begins at the bottom of each column of pixels in the US image, moving upward toward the transducer face. The pixels of the image are labelled as artifact until a threshold value of pixel intensity, T_art, is reached; for example, a threshold value of 40, in arbitrary intensity units, can be used. Pixels in a small region immediately adjacent to the transducer face, for example within approximately 3 mm of it, are also labelled as artifacts. This helps to remove artifacts caused by imperfect coupling at the skin surface. Image regions labelled as artifacts are not used in subsequent image processing. This step effectively produces a mask so that only pixels in the relevant region of the image are further processed.
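A minimal sketch of this column-wise artifact labelling, assuming Python with NumPy, is given below; the threshold default of 40 follows the example above, while near_field_px is a hypothetical parameter standing in for the roughly 3 mm near-field band (its value in pixels depends on the pixel size of the scanner output).

```python
import numpy as np

def artifact_mask(us_image, t_art=40, near_field_px=25):
    """Return a boolean mask, True for usable pixels. Scanning each
    column from the bottom of the image upward, pixels are labelled
    artifact until the intensity first reaches t_art; a band of
    near_field_px rows next to the transducer face is also excluded."""
    h, w = us_image.shape
    mask = np.ones((h, w), dtype=bool)
    for col in range(w):
        for row in range(h - 1, -1, -1):      # bottom of column upward
            if us_image[row, col] >= t_art:   # strong echo: stop labelling
                break
            mask[row, col] = False
        mask[:near_field_px, col] = False     # imperfect skin coupling
    return mask
```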
A first US image is then selected at step 96 and a number of points are manually picked along the centre of the bone edge at step 98. The points are connected by straight lines at step 100. Then at step 102, the closest pixel in each column of pixels to the line defining the bone edge is identified. These pixels define the set S_edgeUS, being the pixels on the bone to soft tissue boundary in the US image. The pixel set S_US is also determined, being all of the non-artifact pixels. A next US image from the training set is then selected at step 104 and processing loops, as indicated by line 106, until S_edgeUS and S_US have been determined for all the training data.
The probability density function for US images, p_US, is then calculated at step 108, using:
$$p_{US}(a,b) = \frac{C(\{x \in S_{edgeUS} \mid f_{US1}(x) = a,\; f_{US2}(x) = b\})}{C(\{x \in S_{US} \mid f_{US1}(x) = a,\; f_{US2}(x) = b\})}$$
where again C calculates the cardinality of the set. The US probability density function p_US is also based on two image features. The first US feature, f_US1(x), is the intensity of the US image I_US(x). The second US feature, f_US2(x), is the number of pixels between x and the bottom of the image which were not labelled as artifact. Both of these features help to highlight bone edges, owing to the large change in acoustic impedance between soft tissue and bone, which results in high US intensity values at the boundary (detected by f_US1(x)) and a bone shadow region of very low US intensities (detected by f_US2(x)).
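The two US features lend themselves to a vectorised sketch, again assuming Python with NumPy; whether the count for f_US2 includes the pixel itself is not specified above, so the exclusive count used here is an assumption.

```python
import numpy as np

def us_features(us_image, mask):
    """f_US1: raw US intensity; f_US2: number of non-artifact pixels
    between each pixel and the bottom of its column (exclusive count)."""
    f1 = us_image.astype(np.float64)
    # Cumulative count of usable pixels from the bottom row upward.
    from_bottom = np.cumsum(mask[::-1, :], axis=0)[::-1, :]
    f2 = (from_bottom - mask).astype(np.float64)
    return f1, f2
```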
Figure 8 shows a process flow chart illustrating a further embodiment of the invention, corresponding generally to step 106 of Figure 1. The registration method 110 is generally similar to that shown in Figure 5 above. The computer implemented registration method 110 begins and at step 112 artifacts are removed from each US image slice through the patient in the same manner as described above in connection with creation of the US probability density function. Then at step 114, the non-artifact part of each US image slice is converted from an intensity image I(x) into a corresponding probabilistic image P(x)_US, where x represents the image position and P(x)_US represents the probability of the pixel containing a bone to soft tissue interface. The probability images are calculated using the US probability density function p_US which, given a set of image features F(x)_US, returns an estimate of the probability that position x is a bone edge, i.e. P(x)_US = p_US(F(x)_US). The probability density function for the US image slices, p_US, is based on the two image features F(x)_US = (f_US1(x), f_US2(x)) described above.
Then at step 116 the CT image is also converted into a probabilistic image in a similar manner. The CT images are converted from an intensity image I(x) into a corresponding probabilistic image P(x)_CT, where x represents the image position and P(x)_CT represents the probability of the voxel containing a bone to soft tissue interface. The probability images are calculated using the CT probability density function p_CT which, given a set of image features F(x)_CT, returns an estimate of the probability that position x is a bone edge, i.e. P(x)_CT = p_CT(F(x)_CT). The probability density function for the CT image slices, p_CT, is based on the two image features F(x)_CT = (f_CT1(x), f_CT2(x)) described above.
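Given a trained density table, converting an intensity image into a probabilistic image reduces to a per-pixel table lookup on the quantised feature pair. A minimal sketch, assuming Python with NumPy and the binning convention of the training sketch above (the function name and range arguments are hypothetical):

```python
import numpy as np

def to_probability_image(f1, f2, pdf, f1_range, f2_range):
    """P(x) = p(f1(x), f2(x)): quantise each pixel's feature pair onto
    the training grid and read the bone-edge probability from the table."""
    n_bins = pdf.shape[0]
    a = np.clip(((f1 - f1_range[0]) / (f1_range[1] - f1_range[0])
                 * (n_bins - 1)).astype(int), 0, n_bins - 1)
    b = np.clip(((f2 - f2_range[0]) / (f2_range[1] - f2_range[0])
                 * (n_bins - 1)).astype(int), 0, n_bins - 1)
    return pdf[a, b]
```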
Then at step 118, a current estimate of the six rigid body parameters and s_y is used to reslice the CT probability image in the plane of each US slice. The method uses three different sets of optimisation parameters in three iterations of the general procedure; the US scaling parameter s_y is only actually optimised in the final iteration. After reslicing, the similarity of the resliced CT data and the US data is assessed at step 120. In particular, the pixel values in the reformatted CT slices and in the US probability images are compared using the normalised cross-correlation similarity measure, also known as linear correlation, or Pearson's r.
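Reslicing the CT probability volume in the plane of a US slice can be sketched with trilinear interpolation, assuming Python with NumPy and SciPy; the (z, y, x) voxel index order and the mapping of the US pixel grid through a single combined matrix T_CT←US are assumptions about conventions the description leaves open.

```python
import numpy as np
from scipy import ndimage

def reslice_ct(ct_prob, T_ct_us, shape):
    """Sample the CT probability volume on the pixel grid of one US
    slice. T_ct_us maps homogeneous US pixel coordinates (u, v, 0, 1)
    to CT voxel coordinates (x, y, z); the volume is assumed to be
    indexed [z, y, x]. Trilinear interpolation, zero outside the CT."""
    h, w = shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u.ravel(), v.ravel(),
                    np.zeros(u.size), np.ones(u.size)])
    vox = T_ct_us @ pix                 # 4 x N voxel positions
    coords = vox[[2, 1, 0], :]          # reorder (x, y, z) -> (z, y, x)
    vals = ndimage.map_coordinates(ct_prob, coords, order=1, cval=0.0)
    return vals.reshape(h, w)
```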
As indicated above, the method uses a three stage optimisation approach: a low resolution stage, a high resolution stage, and a further high resolution stage which also optimises s_y. The following table summarises the optimisation parameters used for each stage.
[Table of stage-by-stage optimisation parameters (CT blurring and subsampling, maximum and minimum step sizes, and the % change for s_y); the table is reproduced only as images in the source document, so its values are not recoverable here.]
Except where specified, all measurements are in mm. The blurring and subsampling apply to the CT volume, and σ denotes the standard deviation of the Gaussian blurring kernel. The step sizes indicate the maximum and minimum step sizes (and the % change for s_y) used at each stage of the optimisation of the registration. The US image resolution is not altered between stages, but it is reduced by a factor of four prior to registration by Gaussian blurring and subsampling, to give pixel sizes of 0.486 × 0.486 mm².
At step 122 it is determined whether the similarity has been maximised, that is whether the fit has been optimised. Initially the fit will not be optimised and so processing proceeds to step 124 at which at least one or some of the parameters are varied and processing returns to steps 118 and 120 and the similarity value is calculated again. The method then updates the current optimum value of the registration parameters by moving in the direction which produced the greatest improvement in the similarity measure. Processing continues to loop until no improvement in the similarity measure for the optimum parameters is found.
Processing then proceeds to step 126, at which, for the current parameter set, the parameter step size is reduced by a factor of two. Processing then returns to step 118 and the subsequent steps are repeated as above. When no improvement in similarity occurs, the parameter step size is again reduced by a factor of two at step 126 and processing returns to step 118. Processing proceeds in this way until the minimum step size has been reached for the parameter set (in the first stage, the low resolution parameters in the table).
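The loop of steps 118 to 126 amounts to a best-neighbour hill climb with step halving, which might be sketched as follows, assuming Python with NumPy; here cost stands for the similarity evaluation of steps 118 to 120, and for simplicity all parameters, including s_y, are perturbed additively, whereas the description above uses a percentage change for s_y.

```python
import numpy as np

def hill_climb(params, step, min_step, cost):
    """Best-neighbour ascent: perturb each parameter by +/- its step,
    keep the move giving the largest improvement, and halve all step
    sizes when no move improves the similarity; stop once every step
    has fallen below its minimum."""
    params = np.asarray(params, dtype=np.float64)
    step = np.asarray(step, dtype=np.float64)
    best = cost(params)
    while np.any(step >= np.asarray(min_step, dtype=np.float64)):
        candidates = []
        for i in range(params.size):
            for delta in (step[i], -step[i]):
                trial = params.copy()
                trial[i] += delta
                candidates.append((cost(trial), trial))
        top_val, top_params = max(candidates, key=lambda c: c[0])
        if top_val > best:
            best, params = top_val, top_params   # move uphill
        else:
            step = step / 2.0                    # refine the search
    return params, best
```

In the three-stage scheme, this routine would be run once per stage, starting from the previous stage's result with that stage's initial and minimum step sizes.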
When it is determined at step 130 that the minimum step size has been reached, processing proceeds to step 132, at which the next set of optimisation parameters is used in a second optimisation stage having a higher resolution than the first. Steps 118 to 132 are repeated as indicated above until finally a third optimisation is carried out using a third set of optimisation parameters, in which the value of s_y is also optimised. After the optimisation has been completed, the output of the method is the six rigid body parameters and the US scaling parameter used to register the CT data in the reference frame of the tracking system.

Figure 9 illustrates a typical computer system that, when appropriately configured or designed, can serve as the data processing apparatus or computer of the CAS system according to the invention. The data processing apparatus or computer 500 includes any number of processors 502 (also referred to as central processing units, or CPUs) that are coupled to storage devices including primary storage 506 (typically a random access memory, or RAM) and primary storage 504 (typically a read only memory, or ROM). CPU 502 may be of various types including microcontrollers and microprocessors such as programmable devices (e.g., CPLDs and FPGAs) and unprogrammable devices such as gate array ASICs or general purpose microprocessors. As is well known in the art, primary storage 504 acts to transfer data and instructions uni-directionally to the CPU and primary storage 506 is used typically to transfer data and instructions in a bi-directional manner. Both of these primary storage devices may include any suitable computer-readable media such as those described above. A mass storage device 508 is also coupled bi-directionally to CPU 502 and provides additional data storage capacity and may include any of the computer-readable media described above. Mass storage device 508 may be used to store programs, data and the like and is typically a secondary storage medium such as a hard disk. It will be appreciated that the information retained within the mass storage device 508 may, in appropriate cases, be incorporated in standard fashion as part of primary storage 506 as virtual memory. A specific mass storage device such as a CD-ROM 514 may also pass data uni-directionally to the CPU.
CPU 502 is also coupled to an interface 510 that connects to one or more input/output devices such as video monitors, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, or other well-known input devices, including, of course, other computers. Finally, CPU 502 may optionally be coupled to an external device such as a database or a computer or telecommunications network using an external connection as shown generally at 512. With such a connection, it is contemplated that the CPU might receive information from the network, or might output information to the network, in the course of performing the method steps described herein.

Although the above has generally described the present invention according to specific processes and apparatus, the present invention has a much broader range of applicability. In particular, aspects of the present invention are not limited to any particular kind of orthopaedic procedure and can be applied to virtually any method in which US images are sufficient to allow the anatomical structure of interest to be distinguished. Thus, in some embodiments, the techniques of the present invention could be used to register non-rigid anatomical structures, i.e. non-bony structures, such as organs. One of ordinary skill in the art would recognize other variants, modifications and alternatives in light of the foregoing discussion.

Claims

CLAIMS:
1. A computer implemented method for registering a 3D ultrasound image with a 3D representation of a body part, the method comprising optimising the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
2. A method as claimed in claim 1, which includes a step of altering the six rigid body parameters and the or each relevant ultrasound scaling parameter so as to optimise the value of a similarity measure between the ultrasound image and the representation of the body part.
3. A method as claimed in claim 2, which includes the subsequent step of converting the image and the representation into probabilistic images where the intensity of the pixels is proportional to the probability of that pixel containing a bone-to-soft tissue interface.
4. A method as claimed in claim 3, in which the conversion step is carried out using probability density functions.
5. A method as claimed in claim 4, wherein the probability density function used for converting the ultrasound images is based on at least one function intended to highlight bone edges in the ultrasound images.
6. A method as claimed in claim 4 or 5, wherein the probability density function used for converting the representation images is based on at least one function intended to highlight edges in the representation.
7. A method as claimed in claim 6, wherein the probability density function used for converting the representation images is also based on a function intended to identify different types of boundaries in the representation.
8. A method as claimed in claim 2, in which the similarity measure is calculated by a method which comprises estimating the rigid body parameters and the or each relevant ultrasound scaling parameter, reslicing the probabilistic image derived from the representation in the plane of each ultrasound slice, and comparing the pixel values in these reformatted slices and in the probabilistic image derived from the ultrasound images using the normalised cross-correlation similarity measure.
9. A method as claimed in claim 1, in which the ultrasound scaling parameters relate to at least one of the speed with which sound travels through tissue, the path taken by the sound waves between the transmitter, the tissue and the receiver, the effect of beam thickness, and ultrasound image scaling in the lateral and axial directions relative to the ultrasound beam.
10. A method as claimed in claim 1, which includes a step of removing regions of the ultrasound image that are caused by artefacts.
11. A method as claimed in claim 1, wherein the fit of the intensity is optimised using at least two separate stages.
12. A method as claimed in claim 11, wherein the ultrasound scaling parameter is used to optimise the fit in only one of the stages.
13. A method as claimed in claim 11 or 12, wherein the optimisation stages have different resolutions.
14. Computer program code executable by a data processing device to provide the method of claim 1.
15. A computer program product comprising a computer readable medium bearing computer program code as claimed in claim 14.
16. A data processing apparatus for registering a 3D ultrasound image with a 3D representation of a body part, the data processing apparatus including a data processor and a storage device storing computer program instructions which can configure the data processor to optimise the fit of the intensity values in the ultrasound image to the representation using at least one ultrasound scaling parameter as well as position and orientation parameters.
PCT/GB2006/000735 2005-03-01 2006-03-01 3d ultrasound registration WO2006092594A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB0504175A GB0504175D0 (en) 2005-03-01 2005-03-01 3D ultrasound registration
GB0504175.1 2005-03-01
GB0521825A GB0521825D0 (en) 2005-03-01 2005-10-26 3D ultrasound registration
GB0521825.0 2005-10-26

Publications (2)

Publication Number Publication Date
WO2006092594A2 true WO2006092594A2 (en) 2006-09-08
WO2006092594A3 WO2006092594A3 (en) 2007-04-05

Family

ID=36676003

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/000735 WO2006092594A2 (en) 2005-03-01 2006-03-01 3d ultrasound registration

Country Status (1)

Country Link
WO (1) WO2006092594A2 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2914466A1 (en) * 2007-04-02 2008-10-03 Inst Nat Rech Inf Automat Bidimensional ultrasonic or three dimensational magnetic resonance image data processing device for e.g. cerebrum of human, has processing module resetting structures of one card with respect to that of other using transformation
WO2010015957A1 (en) * 2008-08-04 2010-02-11 Koninklijke Philips Electronics, N.V. Automatic pre-alignment for registration of medical images
WO2014106153A1 (en) * 2012-12-31 2014-07-03 Mako Surgical Corp. Systems and methods of registration using an ultrasound probe
WO2014134188A1 (en) * 2013-02-28 2014-09-04 Rivanna Medical, LLC Systems and methods for ultrasound imaging
WO2016033065A1 (en) * 2014-08-26 2016-03-03 Rational Surgical Solutions, Llc Image registration for ct or mr imagery and ultrasound imagery using mobile device
EP3027116A1 (en) * 2013-07-31 2016-06-08 Qview Medical, Inc. Reduced image reading time and improved patient flow in automated breast ultrasound using enhanced, whole-breast navigator overview images
US9486291B2 (en) 2012-06-21 2016-11-08 Rivanna Medical Llc Target region identification for imaging applications
US10548564B2 (en) 2015-02-26 2020-02-04 Rivanna Medical, LLC System and method for ultrasound imaging of regions containing bone structure
US11147536B2 (en) 2013-02-28 2021-10-19 Rivanna Medical Llc Localization of imaging target regions and associated systems, devices and methods
US11446090B2 (en) * 2017-04-07 2022-09-20 Orthosoft Ulc Non-invasive system and method for tracking bones

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6775404B1 (en) * 1999-03-18 2004-08-10 University Of Washington Apparatus and method for interactive 3D registration of ultrasound and magnetic resonance images based on a magnetic position sensor

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6775404B1 (en) * 1999-03-18 2004-08-10 University Of Washington Apparatus and method for interactive 3D registration of ultrasound and magnetic resonance images based on a magnetic position sensor

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
BASS W A ET AL: "Surface-based registration of physical space with CT images using A-mode ultrasound localization of the skull" PROCEEDINGS OF THE SPIE, SPIE, BELLINGHAM, VA, US, vol. 3335, February 1998 (1998-02), pages 228-238, XP002376997 ISSN: 0277-786X *
BLACKALL J M ET AL: "An Image Registration Approach to Automated Calibration for Freehand 3D Ultrasound" MICCAI 2000, LECTURE NOTES ON COMPUTER SCIENCE, vol. 1935, 2000, pages 462-471, XP019001288 Springer-Verlag Berlin Heidelberg *
PENNEY G P ET AL: "Registration of freehand 3D ultrasound and magnetic resonance liver images." MEDICAL IMAGE ANALYSIS. MAR 2004, vol. 8, no. 1, March 2004 (2004-03), pages 81-91, XP008070483 ISSN: 1361-8415 *
PENNEY G P, BARRATT D C ET AL: "Cadaver Validation of Intensity-Based Ultrasound to CT Registration" MICCAI 2005 - LECTURE NOTES ON COMPUTER SCIENCE, vol. 3750, 27 September 2005 (2005-09-27), pages 1000-1007, XP019021741 Springer-Verlag Berlin Heidelberg *
PLUIM J P W ET AL: "MUTUAL-INFORMATION-BASED REGISTRATION OF MEDICAL IMAGES: A SURVEY" IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 22, no. 8, August 2003 (2003-08), pages 986-1004, XP001233412 ISSN: 0278-0062 *
RAJ SHEKHAR ET AL: "Mutual Information-Based Rigid and Nonrigid Registration of Ultrasound Volumes" IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 20, no. 1, January 2002 (2002-01), XP011036191 ISSN: 0278-0062 *
ZITOVA B ET AL: "IMAGE REGISTRATION METHODS: A SURVEY" IMAGE AND VISION COMPUTING, GUILDFORD, GB, vol. 21, no. 11, October 2003 (2003-10), pages 977-1000, XP001189327 ISSN: 0262-8856 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008135668A3 (en) * 2007-04-02 2009-05-07 Inst Nat Rech Inf Automat Image processing device for matching images of a same portion of a body obtained by magnetic resonance and ultrasounds
US8755581B2 (en) 2007-04-02 2014-06-17 Inria Institut National De Recherche En Informatique Et En Automatique Image processing device for matching images of a same portion of a body obtained by magnetic resonance and ultrasounds
FR2914466A1 (en) * 2007-04-02 2008-10-03 Inst Nat Rech Inf Automat Bidimensional ultrasonic or three dimensational magnetic resonance image data processing device for e.g. cerebrum of human, has processing module resetting structures of one card with respect to that of other using transformation
WO2010015957A1 (en) * 2008-08-04 2010-02-11 Koninklijke Philips Electronics, N.V. Automatic pre-alignment for registration of medical images
US8553961B2 (en) 2008-08-04 2013-10-08 Koninklijke Philips N.V. Automatic pre-alignment for registration of medical images
US9486291B2 (en) 2012-06-21 2016-11-08 Rivanna Medical Llc Target region identification for imaging applications
US10441314B2 (en) 2012-06-21 2019-10-15 Rivanna Medical Llc Target region identification for imaging application
WO2014106153A1 (en) * 2012-12-31 2014-07-03 Mako Surgical Corp. Systems and methods of registration using an ultrasound probe
CN104902839A (en) * 2012-12-31 2015-09-09 玛口外科股份有限公司 Systems and methods of registration using ultrasound probe
US10134125B2 (en) 2013-02-28 2018-11-20 Rivanna Medical Llc Systems and methods for ultrasound imaging
WO2014134188A1 (en) * 2013-02-28 2014-09-04 Rivanna Medical, LLC Systems and methods for ultrasound imaging
US10679347B2 (en) 2013-02-28 2020-06-09 Rivanna Medical Llc Systems and methods for ultrasound imaging
US11147536B2 (en) 2013-02-28 2021-10-19 Rivanna Medical Llc Localization of imaging target regions and associated systems, devices and methods
EP3027116A1 (en) * 2013-07-31 2016-06-08 Qview Medical, Inc. Reduced image reading time and improved patient flow in automated breast ultrasound using enhanced, whole-breast navigator overview images
EP3027116A4 (en) * 2013-07-31 2017-04-05 Qview Medical, Inc. Reduced image reading time and improved patient flow in automated breast ultrasound using enhanced, whole-breast navigator overview images
EP3552551A3 (en) * 2013-07-31 2019-10-23 Qview Medical, Inc. Reduced image reading time and improved patient flow in automated breast ultrasound using enhanced, whole-breast navigator overview image
WO2016033065A1 (en) * 2014-08-26 2016-03-03 Rational Surgical Solutions, Llc Image registration for ct or mr imagery and ultrasound imagery using mobile device
US10966688B2 (en) 2014-08-26 2021-04-06 Rational Surgical Solutions, Llc Image registration for CT or MR imagery and ultrasound imagery using mobile device
US10548564B2 (en) 2015-02-26 2020-02-04 Rivanna Medical, LLC System and method for ultrasound imaging of regions containing bone structure
US11446090B2 (en) * 2017-04-07 2022-09-20 Orthosoft Ulc Non-invasive system and method for tracking bones

Also Published As

Publication number Publication date
WO2006092594A3 (en) 2007-04-05

Similar Documents

Publication Publication Date Title
US10166002B2 (en) Ultrasonic bone motion tracking system
US11123040B2 (en) Real-time 3-D ultrasound reconstruction of knee and its implications for patient specific implants and 3-D joint injections
AU2018316801B2 (en) Ultrasound bone registration with learning-based segmentation and sound speed calibration
WO2006092594A2 (en) 3d ultrasound registration
EP3213682B1 (en) Method and apparatus for three dimensional reconstruction of a joint using ultrasound
Taylor et al. Computer-integrated revision total hip replacement surgery: concept and preliminary results
JP5882743B2 (en) Customized orthopedic implants and related methods and intelligent cartilage systems
Penney et al. Cadaver validation of intensity-based ultrasound to CT registration
WO2019020048A1 (en) Spinal image generation system based on ultrasonic rubbing technique and navigation positioning system for spinal surgery
US10512451B2 (en) Method and apparatus for three dimensional reconstruction of a joint using ultrasound
Kowal et al. Automated bone contour detection in ultrasound B‐mode images for minimally invasive registration in computer‐assisted surgery—an in vitro evaluation
Okada et al. Computer-assisted preoperative planning for reduction of proximal femoral fracture using 3-D-CT data
WO2006092602A1 (en) 3d ultrasound imaging
Hacihaliloglu et al. Non-iterative partial view 3D ultrasound to CT registration in ultrasound-guided computer-assisted orthopedic surgery
Schumann State of the art of ultrasound-based registration in computer assisted orthopedic interventions
Hacihaliloglu 3D ultrasound for orthopedic interventions
Penney et al. Cadaver validation of intensity-based ultrasound to CT registration
Muratore et al. Vertebral surface extraction from ultrasound images for technology-guided therapy
US20230368465A1 (en) Methods and apparatus for three-dimensional reconstruction
Fakhfakh et al. Automatic registration of pre-and intraoperative data for long bones in Minimally Invasive Surgery
Hassenpflug et al. Generation of attributed relational vessel graphs from three-dimensional freehand ultrasound for intraoperative registration in image-guided liver surgery
Peterhans et al. A method for frame-by-frame US to CT registration in a joint calibration and registration framework
Wang et al. Automatic bone segmentation and ultrasound—CT registration for robotic assisted femoral shaft fracture reduction
Masson-Sibut et al. A new automatic landmark extraction framework on ultrasound images of femoral condyles
Krivonos et al. From planning of complex bone deformities correction to computer aided operation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

WWW Wipo information: withdrawn in national office

Country of ref document: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06709959

Country of ref document: EP

Kind code of ref document: A2

WWW Wipo information: withdrawn in national office

Ref document number: 6709959

Country of ref document: EP