US20120016269A1 - Registration of Anatomical Data Sets - Google Patents


Info

Publication number
US20120016269A1
US 2012/0016269 A1 (application US 12/835,384)
Authority
US
United States
Prior art keywords
data set
volume data
reference frame
bone
anatomical structure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/835,384
Other versions
US8675939B2
Inventor
Jose Luis Moctezuma De La Barrera
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stryker European Operations Holdings LLC
Original Assignee
Stryker Leibinger GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stryker Leibinger GmbH and Co KG filed Critical Stryker Leibinger GmbH and Co KG
Priority to US12/835,384 (US8675939B2)
Assigned to STRYKER LEIBINGER GMBH & CO., KG. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MOCTEZUMA DE LA BARRERA, JOSE LUIS
Priority to DE201110106812 (DE102011106812A1)
Priority to JP2011154736A (JP2012020133A)
Publication of US20120016269A1
Priority to US14/100,055 (US9572548B2)
Publication of US8675939B2
Application granted
Assigned to STRYKER EUROPEAN HOLDINGS VI, LLC. NUNC PRO TUNC ASSIGNMENT (SEE DOCUMENT FOR DETAILS). Assignors: STRYKER LEIBINGER GMBH & CO. KG
Assigned to STRYKER EUROPEAN HOLDINGS I, LLC. NUNC PRO TUNC ASSIGNMENT (SEE DOCUMENT FOR DETAILS). Assignors: STRYKER EUROPEAN HOLDINGS VI, LLC
Assigned to STRYKER EUROPEAN HOLDINGS III, LLC. NUNC PRO TUNC ASSIGNMENT (SEE DOCUMENT FOR DETAILS). Assignors: STRYKER EUROPEAN HOLDINGS I, LLC
Assigned to STRYKER EUROPEAN OPERATIONS HOLDINGS LLC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: STRYKER EUROPEAN HOLDINGS III, LLC
Legal status: Active (adjusted expiration)


Classifications

    • G16H 30/20: ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 5/1073: Measuring volume, e.g. of limbs
    • A61B 5/1128: Measuring movement of the entire body or parts thereof using image analysis
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/037: Emission tomography
    • A61B 8/0875: Ultrasonic detection of organic movements or changes for diagnosis of bone
    • A61B 8/12: Ultrasonic diagnosis in body cavities or body tracts, e.g. by using catheters
    • A61B 8/4254: Determining the position of the probe using sensors mounted on the probe
    • A61B 8/5223: Extracting a diagnostic or physiological parameter from ultrasonic diagnostic data
    • A61B 8/5292: Ultrasonic image processing using additional data, e.g. patient information, image labeling, acquisition parameters
    • A61B 90/30: Devices for illuminating a surgical field
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G16H 10/60: ICT for patient-specific data, e.g. electronic patient records
    • G16H 20/40: ICT for therapies relating to mechanical, radiation or invasive therapies, e.g. surgery
    • A61B 2090/363: Use of fiducial points
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound
    • G06T 2207/10072: Tomographic images
    • G06T 2207/30008: Bone (biomedical image processing)

Definitions

  • the present disclosure relates to systems and methods for registering anatomical image data sets and relating anatomical information between anatomical image data sets.
  • patient related data can include, for example, anatomical information or image data obtained using a variety of imaging techniques or modalities, such as ultrasound, magnetic resonance imaging (“MRI”), computed tomography (“CT”), single photon emission computed tomography, positron emission tomography, etc.
  • One technique to register patient related data across different modalities is a “point to point” or “paired point” matching technique, wherein landmarks or fiducials that can be identified across different modalities are used to determine a transformation matrix and establish a spatial relationship between the different modalities.
  • landmarks or fiducials are placed on a patient prior to an image scan, for example, an MRI or CT scan, and such landmarks or fiducials are identified in the image scan and on the patient during a surgical procedure to establish the registration between the patient and patient related data from the image scan.
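The paired-point technique can be sketched as a least-squares rigid fit between corresponding fiducial coordinates. The sketch below (function name and data are illustrative, not from the patent) uses the SVD-based Kabsch method to recover the rotation and translation that make up the transformation matrix:

```python
import numpy as np

def paired_point_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src fiducials onto dst.

    src, dst: (N, 3) arrays of corresponding fiducial coordinates in the
    two modalities. Returns rotation R (3x3) and translation t (3,) such
    that dst ~= src @ R.T + t.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With three or more non-collinear fiducial pairs, the recovered (R, t) fixes the spatial relationship between the two modalities.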
  • surface registration is used, wherein multiple surface points of a structure or region of interest are used to establish a registration surface.
  • the surface points are identified independently in different modalities often using different techniques.
  • the surface registration technique is used in ear-nose-throat surgery where a face of a patient is used as a registration surface. For example, a CT or MRI scan of the face of the patient is obtained and the surface of the skin is identified in the scan and then matched to digitized points on the face of the patient during surgery. Such digitized points can either be collected directly with a digitization device, such as a pointer, or indirectly via a registration mask.
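Surface registration of the kind used in the face-matching example is commonly implemented with an iterative closest point (ICP) scheme. The toy sketch below (illustrative names; brute-force nearest-neighbour search, no outlier handling) alternates correspondence search with a rigid least-squares update:

```python
import numpy as np

def icp_surface_match(points, surface, iterations=20):
    """Toy iterative-closest-point alignment of digitized points onto a
    surface point cloud. Returns R, t mapping `points` onto `surface`.
    A real system would add k-d trees, outlier rejection, and a
    convergence check."""
    R = np.eye(3)
    t = np.zeros(3)
    for _ in range(iterations):
        moved = points @ R.T + t
        # nearest surface point for every digitized point (brute force)
        d2 = ((moved[:, None, :] - surface[None, :, :]) ** 2).sum(axis=2)
        matched = surface[d2.argmin(axis=1)]
        # one rigid least-squares (Kabsch) step on the correspondences
        pc = moved - moved.mean(axis=0)
        qc = matched - matched.mean(axis=0)
        U, _, Vt = np.linalg.svd(pc.T @ qc)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T
        dt = matched.mean(axis=0) - dR @ moved.mean(axis=0)
        R, t = dR @ R, dR @ t + dt          # compose with running estimate
    return R, t
```

ICP only needs an initial alignment close enough that nearest neighbours are the true correspondences, which is why digitized points are usually collected after a coarse paired-point fit.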
  • the above registration techniques generally serve only to register patient related data from one modality to a different modality. Most commonly, the registration techniques register pre-operative image data of a patient to the anatomy of the patient during surgery for localization purposes of surgical instruments used to perform the surgery.
  • biomechanical and functional information of joints play an important role in determining the extent or cause of a disease.
  • Such information is generally captured through a motion analysis.
  • fiducials are placed on the skin of a body part to be analyzed.
  • a navigation system tracks the fiducials as the body part is moved and the movement of the fiducials is analyzed to establish a biomechanical model of the body part.
  • An obvious downside of this technology is that the fiducials do not directly relate to the underlying bony structures and that shifts in skin or soft tissue occur during motion. Such shifts can contribute to relatively large motion artifacts and inaccuracies in the results of the motion analysis and the established biomechanical model.
  • a technique that overcomes soft tissue shift is the direct implantation of fiducials, such as small tantalum beads, onto the bones of the subject, wherein the fiducials are tracked using stereo-radiography techniques during movement of the body part of the patient.
  • a motion analysis may not adequately capture functional information of the joints if the motion of the limb is passive. For example, when a surgeon moves the limbs of a patient, or if the patient is anesthetized and lying on an operating room table, no voluntary muscular forces are active to counter the effects of gravity on the body masses.
  • a computer-implemented method of registering information associated with a first data set to a second data set comprises the steps of collecting a first data set of an anatomical structure with an imaging device, developing additional information for the first data set, wherein the additional information has a unique identifiable spatial relationship to the structure of the first data set, and establishing a first arbitrary reference frame for the first data set.
  • the first reference frame is established without reference to any pre-selected landmark on the structure, and the first reference frame has a unique spatial relationship to the first data set.
  • the method also comprises the steps of collecting a second data set of an anatomical structure with an imaging device, establishing a second arbitrary reference frame for the second data set, transforming the first reference frame to the second reference frame by matching a unique spatial parameter of the first data set with the same unique spatial parameter of the second data set, and registering the additional information with the second data set.
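One way to picture the frame-to-frame transformation is with 4x4 homogeneous matrices: if the pose of the same structure (the shared unique spatial parameter) is known in both arbitrary frames, the transform from the first frame to the second follows by composition. The function below is an illustrative sketch, not the patent's implementation:

```python
import numpy as np

def frame_transfer(T_struct_f1, T_struct_f2, points_f1):
    """Map annotation points expressed in an arbitrary first reference
    frame into a second arbitrary reference frame, using the pose of the
    same anatomical structure observed in both frames.

    T_struct_f1, T_struct_f2: 4x4 homogeneous poses of the structure in
    frames 1 and 2. points_f1: (N, 3) annotation coordinates in frame 1.
    """
    # frame-1 -> frame-2 transform by composition
    T_12 = T_struct_f2 @ np.linalg.inv(T_struct_f1)
    homog = np.hstack([points_f1, np.ones((len(points_f1), 1))])
    return (homog @ T_12.T)[:, :3]
```

Because both frames are arbitrary, neither needs any pre-selected anatomical landmark; only the structure's pose in each frame matters.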
  • a computer-implemented method of associating spatial information related to a first volume data set of an anatomical structure with a second volume data set of the anatomical structure includes the steps of obtaining a first volume data set of the anatomical structure with a computer surgical navigation system, assigning a first arbitrary reference frame to the first volume data set, calculating an inherent feature in the first volume data set, correlating the inherent feature to the first arbitrary reference frame, and associating additional spatial information with the first volume data set.
  • the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position, and the additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame.
  • the method further includes the steps of obtaining a second volume data set of the anatomical structure with a computer surgical navigation system, assigning a second arbitrary reference frame to the second volume data set, identifying the inherent feature in the second volume data set, and correlating the inherent feature to the second arbitrary reference frame.
  • the method also includes the steps of registering the first volume data set with the second volume data set based on the inherent feature, correlating the additional spatial information to the second volume data set in registration therewith, and displaying the additional spatial information in registration with the second volume data set on a display device.
  • the registering step is performed by a computer
  • a system for collecting and manipulating a volume data set of an anatomical structure includes means for obtaining a first volume data set of an anatomical structure of a patient and a second volume data set of the anatomical structure, and means for calculating an inherent feature of the first volume data set and the second volume data set.
  • the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position.
  • the system further includes means for assigning a first arbitrary reference frame to the first volume data set and a second arbitrary reference frame to the second volume data set, means for correlating the inherent feature to the first arbitrary reference frame, and means for associating additional spatial information with the first volume data set.
  • the additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame.
  • the system also includes means for registering the first volume data set with the second volume data set based on the inherent feature, and means for correlating the additional spatial information to the second volume data set in registration therewith.
  • a method of establishing a position of a portion of a bone that has been altered from a normal shape includes the step of collecting a first volume data set for a first bone that is unaltered, wherein the first volume data set includes volume data for first and second portions of the first bone.
  • the method also includes the steps of identifying a first unique spatial characteristic of the volume data for the first portion of the first bone, establishing a first arbitrary reference frame for the first volume data set correlated with the first unique spatial characteristic, and identifying a unique spatial relation between the first arbitrary reference frame and the second portion of the first bone.
  • the method further includes the step of identifying a second bone that normally mirrors the first bone about a centerline, wherein the second bone includes a first portion and a second portion that correspond as substantially mirror structures to the first and second portions of the first bone, respectively, and wherein the second bone has been altered from a normal shape such that the first portion of the second bone is in an altered position with regard to the second portion of the second bone.
  • the method further includes the steps of collecting a second volume data set of the first portion of the second bone, identifying a second unique spatial characteristic of the second volume data set, wherein the second unique spatial characteristic substantially mirrors the first unique spatial characteristic, registering in mirrored correlation the first volume data set with the second volume data set by correlating the first unique spatial characteristic with the second unique spatial characteristic, and re-establishing the normal position of the second portion of the second bone to coincide with the position of the second portion of the first bone as related to the registered position of the first portion of the first bone.
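The mirrored registration can be sketched as a reflection across the centerline plane followed by a rigid fit. The example below assumes known point-to-point correspondences between the two bones, an assumption the patent does not make; names are illustrative:

```python
import numpy as np

def mirror_register(healthy_pts, altered_pts, plane_normal=(1.0, 0.0, 0.0)):
    """Mirror a healthy contralateral bone point set across a plane
    through the origin, then rigidly fit it onto the corresponding
    points of the altered side. Returns the mirrored-and-registered
    points, which serve as a template for the normal shape."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    M = np.eye(3) - 2.0 * np.outer(n, n)      # Householder reflection
    mirrored = healthy_pts @ M.T
    # rigid least-squares (Kabsch) fit of the mirrored set onto the altered side
    pc = mirrored - mirrored.mean(axis=0)
    qc = altered_pts - altered_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd(pc.T @ qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = altered_pts.mean(axis=0) - R @ mirrored.mean(axis=0)
    return mirrored @ R.T + t
```

Once the unaltered first portions are registered this way, the healthy bone's second portion carries over as an estimate of the altered bone's normal geometry.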
  • FIG. 1 is a schematic view of an embodiment of a surgical navigation system adapted to implement methods of the present disclosure;
  • FIGS. 2 and 2A are flowcharts of registration procedures according to the present disclosure;
  • FIGS. 3A, 3B, and 3C illustrate the development of additional spatial information for an anatomical structure represented by a first data set and the relation of such information to a second data set;
  • FIG. 4A is an example screen shot and FIGS. 4B and 4C are visual representations that depict data set collection and registration;
  • FIG. 5 illustrates the development of additional spatial information for a first anatomical structure and the relation of such information to a second anatomical structure that mirrors the first anatomical structure;
  • FIGS. 6A and 6B show an example of an anatomical reference frame defined pre-operatively that is not accessible during a surgical procedure; and
  • FIGS. 7A and 7B show an example of determining functional motion parameters of a hip of a patient.
  • Systems and methods of the present disclosure may be used to register different data sets related to one or more structures of a patient and/or to relate additional information from one such data set to another such data set, wherein the additional information may not be available or practically obtainable for the other data set.
  • positional information of non-contiguous regions of a body is tied together without the need to identify or relate to local anatomical reference frames based on pre-defined anatomical landmarks.
  • functional information from one data set is related to another data set to facilitate the performance of a functional assessment of a structure.
  • the structure can be an anatomical structure, such as a bone or joint of a patient
  • the volume data set can be an image data set of the bone or bones obtained using an ultrasound probe or other known imaging techniques or modalities.
  • the functional information that is developed for the anatomical structure from a pre-operative image data set can be related to an intra-operative image data set to aid in the planning and execution of surgical procedures and/or to facilitate early identification and prevention of certain diseases or harmful conditions.
  • other information can be utilized, for example, to re-establish an anatomical reference frame that is accessible in one data set but not another.
  • a further aspect of the present disclosure is the ability to register different data sets for a structure without a need for a predefined landmark or fiducial on the structure. Instead, arbitrary reference frames are established for different data sets and used to register such data sets.
  • FIG. 1 is a schematic view of a surgical navigation system 20 that is adapted to implement the steps of the procedure(s) disclosed herein.
  • the surgical navigation system 20 includes a display unit 22 , a computer system 24 , and a camera array 26 .
  • the computer system 24 is housed in a moveable cart 28 .
  • the computer system 24 may be, for example, any type of personal computer having a memory unit, a CPU, and a storage unit (all not shown), as would be apparent to one of ordinary skill in the art.
  • the display unit 22 can be any conventional display usable with the computer system 24 , such as a standard computer monitor or television.
  • An exemplary surgical navigation system is the Stryker Navigation system available from Stryker Corporation.
  • the surgical navigation system 20 is adapted to receive image data of a patient 30 .
  • image data is obtained by an ultrasound probe 32 manipulated by a user 34 , such as a surgeon or a nurse, and transmitted wirelessly to the computer system 24 .
  • a system that uses wires to transmit data between the ultrasound probe 32 and the computer system 24 can be used.
  • the ultrasound probe 32 provides a non-invasive, non-ionizing, and portable imaging modality to obtain image data of the patient 30 .
  • the ultrasound probe 32 provides image data for underlying bones to overcome skin shift related motion artifacts.
  • image data can be collected using any other acceptable imaging technique or modality, such as magnetic resonance imaging (“MRI”), computed tomography (“CT”), single photon emission computed tomography, positron emission tomography, and the like.
  • the camera array 26 is adapted to detect the position of a sensor 36 coupled to the ultrasound probe 32 to track the position and orientation of such ultrasound probe 32 .
  • the sensor 36 can be one or more light emitting diodes (“LEDs”)
  • the camera array 26 can include a first camera 38 , a second camera 40 , and a third camera 42
  • the first, second, and third cameras 38 , 40 , 42 , respectively, can be three CCD cameras that are adapted to detect infrared (“IR”) signals generated by the sensor 36 .
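A camera array of this kind recovers each LED's 3D position by intersecting the cameras' sight rays. The sketch below (illustrative only, not the Stryker localizer's actual algorithm) solves the least-squares triangulation in closed form:

```python
import numpy as np

def triangulate_marker(origins, directions):
    """Least-squares 3D position of an LED marker from the sight rays of
    several tracking cameras.

    origins: (K, 3) camera centres; directions: (K, 3) vectors pointing
    from each camera toward the detected marker. Returns the point that
    minimizes the summed squared distance to all rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)
```

Tracking several LEDs of known relative geometry on one rigid body then yields the full position and orientation of the probe.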
  • the user 34 can use other surgical tools and instruments that are capable of being tracked by the camera array 26 in the same manner as the ultrasound probe 32 .
  • These additional surgical tools and instruments may have sensors 36 that comprise, for example, LEDs, either built into the tool or instrument or physically associated therewith in a known or determinable position and orientation sufficient for tracking the position of the instruments.
  • the camera array 26 is mounted on a rotatable arm 44 attached to the movable cart 28 so that the camera array 26 has a sufficient line of sight to a relevant field where a procedure is to take place.
  • the camera array 26 may be mounted onto an operating room wall (not shown) or onto another convenient surface or location.
  • the surgical navigation system 20 can be an active optical system that includes at least one infrared transceiver that is used to communicate data to and from the sensor 36 .
  • the camera array includes a first transceiver 46 and a second transceiver 48 located apart from each other. While the present disclosure is described using an active optical surgical navigation system, the systems and methods of the present disclosure can also be used with other surgical navigation technologies and systems, such as passive optical systems, magnetic based systems, inertial navigation based systems, and the like. Other computer-assisted systems also can be used including RFID based systems, video imaging based systems, and the like.
  • the camera array 26 is connected via a cable 50 to a localizer (not shown) or in some instances directly to the computer system 24 .
  • the localizer cooperates with the camera array 26 to identify the location and orientation of the sensor 36 on the ultrasound probe 32 within the line of sight of the camera array 26 .
  • the localizer converts raw position data of the LEDs into the orientation of individual LEDs of a plurality of LEDs that make up the sensor 36 and transmits this information to the computer system 24 .
  • the localizer converts raw position data of the LEDs into the position and orientation of the ultrasound probe 32 and transmits this information to the computer system 24 .
  • a software program executed by the computer system 24 can convert the raw data into the orientation of the ultrasound probe 32 .
  • the conversion of the raw position data is well known to one skilled in the art.
  • the computer system 24 may optionally be controlled remotely by control buttons (not visible) located on the ultrasound probe 32 or otherwise easily accessible to the user 34 .
  • the computer system 24 also includes one or more input devices, such as a keyboard 52 , a mouse 54 , or any other input devices for operating the computer system 24 .
  • methods of registering a first volume data set of an anatomical structure to a second volume data set of the anatomical structure are disclosed.
  • the methods are preferably performed using a computer surgical navigation system as disclosed herein, wherein the navigation system can track the position of one or more gathering devices for gathering the volume data sets, storing position data and extracting information therefrom, and correlating the position data with the volume data sets.
  • FIG. 2 describes a broad view of a method of registration, in which control initiates at a block 80 that collects a first data set for a structure, such as an anatomical structure of a patient. Following the block 80 , control passes to a block 82 that develops additional positional information related to the first data set, and a block 84 that determines or establishes a first reference frame for the first data set.
  • the first reference frame is preferably an arbitrary reference frame.
  • a block 86 collects a second data set, and a block 88 determines a second reference frame for the second volume data set. Thereafter, control passes to a block 90 that transforms the first reference frame into the second reference frame, and a block 92 relates the additional spatial information from the first reference frame, such as functional information for the anatomical structure, to the second reference frame.
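The flow of blocks 80-92 can be sketched in code. The following is a minimal illustration only, assuming each data set is a point cloud and each arbitrary reference frame is simply the cloud's centroid; a real navigation system would derive much richer frames, and the helper names are hypothetical.

```python
import numpy as np

def establish_frame(points):
    """Blocks 84/88 (sketch): derive an arbitrary reference frame from
    the data set itself; here simply the centroid of a point cloud."""
    return points.mean(axis=0)

def frame_transform(frame1, frame2):
    """Block 90 (sketch): transformation taking the first frame to the
    second; translation-only in this simplified model."""
    return frame2 - frame1

def relate_spatial_info(info_point, transform):
    """Block 92 (sketch): carry additional spatial information (e.g. a
    tagged point) into the second reference frame."""
    return info_point + transform
```

In this toy model, registering the two centroids yields the translation that also carries any attached spatial information into the second frame.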
  • FIG. 2A describes a more detailed view of a method embodying the method of FIG. 2 , wherein control initiates a block 100 that collects a first data set for a structure, such as an anatomical structure of a patient.
  • the first data set can include a first volume data set for the anatomical structure.
  • the term anatomical structure can include an entire anatomical structural unit, such as a complete bone, or a smaller portion of that structural unit, such as just a small portion of the bone.
  • the volume data set includes information about the position and orientation of the anatomical structure, such as a bone or a joint.
  • the first volume data set is collected using a subcutaneous imaging device, such as the tracked ultrasound probe 32 of FIG. 1 , wherein the volume data set includes an image of a bone or other subcutaneous structure on a patient.
  • Other image capturing devices and modalities for capturing the first image data set may be used also, such as CT scan, MRI, X-rays, etc.
  • the first volume data set may be obtained pre-operatively, for example.
  • the first volume data set preferably includes image data regarding the anatomical structure, and may include two-dimensional (2D) image data and/or three-dimensional (3D) image data.
  • One exemplary capturing device and modality for capturing the first image data set is a 2D or 3D ultrasound imaging device. If a 2D ultrasound probe is used, the probe can collect volume data by acquiring multiple slices of a region that includes the anatomical structure.
  • the position of the capturing device is tracked by the surgical navigation system while capturing the volume data set.
  • the anatomical structure is in a fixed position while the volume data set is captured, which dispenses with a need to track the position of the anatomical structure separately during the capturing.
  • the anatomical structure optionally may be tracked during the capturing, in which case the anatomical structure may move during the capturing and/or additional robustness may be incorporated into position data for the acquired volume data set encompassing the anatomical structure.
  • the first reference frame is preferably an arbitrary reference frame established without reference to any pre-defined landmark, such as a fiducial or particular anatomical landmark, on the anatomical structure.
  • the first reference frame can be of a camera assembly, an ultrasound probe, or the first volume data set itself, such as a center of the first volume data set.
  • Other ways to establish an arbitrary reference frame, as distinguished from other types of reference frames, can also be used.
  • the reference frames may be established by any known or commonly used image processing algorithms.
  • the arbitrary reference frame preferably has a unique spatial relationship to the volume of the subject anatomical structure, such as a bone, and the arbitrary reference frame remains fixed in the same position relative thereto.
  • Control passes to a block 104 that identifies an inherent feature of the first volume data set, such as a spatially unique physical aspect of the volume data set.
  • the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position.
  • the computer system 24 is adapted with appropriate command routines to calculate an image moment of inertia of the volume data set, which is constant for the volume regardless of the point of view from which the volume is acquired or viewed.
  • Using the moment of inertia can be advantageous because any given volume has a constant moment of inertia that bears a unique, fixed spatial relation to the data set representing that volume, regardless of the point of view from which the volume is viewed.
  • the moment of inertia of a volume data set of a particular portion of a bone will be in the same relative position to that portion of the bone regardless of from what position or point of view the volume data set is obtained.
  • Other methods of determining a spatially unique physical aspect of the volume data set may be used and may obtain alternative or additional uniquely defined spatial information about the anatomical structure, such as surface contour information, point landmarks, etc., that could be used to define the arbitrary reference frame.
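The view-independence of the image moment of inertia can be illustrated concretely. The sketch below, an assumption-laden illustration rather than the patented implementation, computes the inertia tensor of a voxel volume about its own centroid; because the tensor is taken about the centroid, its eigenvalues do not change when the volume is acquired from a rotated point of view.

```python
import numpy as np

def image_moment_of_inertia(volume):
    """Inertia tensor of a voxel volume about its own centroid.

    Taking the tensor about the centroid makes its eigenvalues
    independent of where the structure sits in the scan and of the
    point of view from which the volume was acquired, which is what
    makes the image moment of inertia usable as a view-independent
    feature for registration.
    """
    coords = np.argwhere(volume > 0).astype(float)
    weights = volume[volume > 0].astype(float)
    centroid = np.average(coords, axis=0, weights=weights)
    d = coords - centroid
    r2 = (d * d).sum(axis=1)
    # I = sum_i w_i * (|d_i|^2 * Id - d_i d_i^T)
    return (np.einsum('i,jk->jk', weights * r2, np.eye(3))
            - np.einsum('i,ij,ik->jk', weights, d, d))
```

Rotating the voxel grid by a rigid 90° turn leaves the sorted eigenvalues of the tensor unchanged, mirroring the text's claim that the moment of inertia is constant regardless of viewpoint.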
  • a block 106 correlates the first arbitrary reference frame and the image moment of inertia by a known unique spatial relationship therebetween, such as the xyz Cartesian coordinates of the image moment of inertia within the first arbitrary reference frame.
  • the first arbitrary reference frame is assigned such that the moment of inertia defines an axis of the arbitrary reference frame.
  • Other alternative and/or equivalent methods or systems for correlating the arbitrary reference frame to the spatially unique physical aspect may be used.
  • Control passes to a block 108 that develops and/or associates additional positional information with the first volume data set, wherein the additional spatial information has a unique spatial orientation relative to the first arbitrary reference frame.
  • the additional positional information can include functional information for the anatomical structure, such as a gravity vector that acts on the anatomical structure and/or orientations of parts of the anatomical structure with respect to each other.
  • the first data set is a pre-operative volume data set and the functional information is obtained for the anatomical structure when a patient is in a generally standing or upright position.
  • the functional information can be obtained by other methods, as would be apparent to those skilled in the art.
  • the additional spatial information may be contiguous with the first volume data set.
  • the additional spatial information may include gravity vector information that defines a gravity vector through the first volume data set at the time the volume data set was obtained.
  • the gravity vector may be obtained by any known method.
  • One such method includes having a gravity sensing device, such as an accelerometer, installed on the camera of the surgical navigation system, wherein the gravity sensing device identifies the local gravity vector while the first volume data set is being gathered.
  • the gravity vector information is then associated with the volume data set such that the gravity vector can be uniquely located with respect to the first arbitrary reference frame.
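Associating the gravity vector with the arbitrary reference frame amounts to re-expressing a direction measured in one frame (e.g. the camera frame) in another. A minimal sketch, assuming the arbitrary frame's axes are known as columns of a rotation matrix in camera coordinates:

```python
import numpy as np

def gravity_in_frame(g_camera, frame_axes):
    """Express a gravity direction measured in the camera frame in an
    arbitrary reference frame whose axes are the columns of frame_axes
    (given in camera coordinates). A pure direction needs only the
    rotation part of the frame, not its origin."""
    return frame_axes.T @ g_camera
```

Once stored this way, the gravity vector stays uniquely located with respect to the arbitrary reference frame no matter how the patient or camera later moves.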
  • the additional spatial information set may be non-contiguous to the first volume data set.
  • the additional spatial information may include a vector that identifies the location and orientation of another reference frame.
  • the other reference frame may be another arbitrary reference frame for a non-contiguous volume data set that does not overlap with the first volume data set, wherein the non-contiguous volume data set relates to another anatomical structure or another portion of the same anatomical structure.
  • the first volume data set may be of a first portion of a bone and the non-contiguous volume data set may be of a second portion of the same bone.
  • the other reference frame may also include a global reference frame that is common to several volume data sets, such as a camera reference frame of a camera of the surgical navigation system.
  • the specific locations and/or orientations of reference frames of each of one or more non-contiguous volume data sets may be interlinked with the first volume data set such that the location and orientation of any one or more of the volume data sets may be used to identify the location and orientation of one or more of the other non-contiguous volume data sets even though the volume data sets are not specifically overlapping and not presently viewed.
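Interlinking non-contiguous volume data sets is, in effect, composing rigid transforms: if set A is currently tracked and the stored interlink gives A-to-B, the pose of B follows even though B is not presently viewed. A hedged sketch using 4x4 homogeneous transforms (the representation is an assumption; the source does not specify one):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and
    translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def locate_linked_set(T_global_a, T_a_b):
    """Pose of a non-contiguous volume data set B in the global (e.g.
    camera) frame, given the tracked pose of set A and the stored
    interlink transform from A to B."""
    return T_global_a @ T_a_b
```

The same composition chains through any number of interlinked sets, which is how the location of one set can identify the locations of the others.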
  • additional spatial information that may have unique spatial characteristics in relation to the anatomical structure represented by the volume data set may also be identified with the volume data set.
  • additional spatial information could include location and orientation vector(s) of the arbitrary reference frame of the volume data set with respect to other local anatomical reference frames of the patient, such as a pelvic plane, femoral mechanical axis, femoral anatomical axis, and other relevant local reference points and/or frame such as commonly used in the art.
  • the disclosure contemplates that any type of spatial information that has a unique identifiable spatial characteristic in relation to the anatomical structure represented by the volume data set may be associated with the volume data set as considered necessary or expedient for various and different specific applications.
  • Control passes to a block 110 that collects a second data set.
  • the second volume data set may be of the same anatomical structure as for the first volume data set or at least have significant overlap therewith. There is sufficient overlap between the first data set and the second data set so that the first and second data sets can be registered in a subsequent step described hereinafter. In a preferred embodiment, there is at least about a seventy percent overlap of the anatomical structure captured in the first volume data set and the anatomical structure captured in the second volume data set. More than seventy percent overlap may be even more preferable in some instances, and less than seventy percent overlap may be sufficient in other instances.
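The roughly seventy percent overlap criterion can be checked directly once both data sets are expressed as masks in a common grid. The overlap measure below (intersection over the smaller structure) is one plausible reading of the criterion, not a definition taken from the source:

```python
import numpy as np

def overlap_fraction(mask_a, mask_b):
    """Fraction of the smaller structure's voxels that are shared by
    both volume data sets, with both given as boolean masks sampled on
    a common grid."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return inter / min(mask_a.sum(), mask_b.sum())
```

A registration pipeline might warn or fall back to another matching method whenever this fraction drops below the preferred threshold.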
  • the second volume data set may be obtained from a same point of view as the first volume data set or it may be obtained from a different point of view.
  • the second volume data set may be of a different anatomical structure that has some known or determinable spatial relationship to the first anatomical structure.
  • the first volume data set may be of a bone or portion thereof on one side of a patient's body and the second volume data may be of a bone or portion thereof on the opposite side of the patient's body that corresponds as a substantially mirror image of the first bone.
  • the first volume data set may include image data of a left femoral head and the second volume data set may include image data of a right femoral head.
  • the left femoral head is assumed to be mathematically equivalent to a mirror image of the right femoral head in relation to a centerline of the body.
  • the left femoral head and the right femoral head have a known or determinable spatial relationship to each other about the centerline of the body.
  • a prominent standard anatomical feature of the femur, such as the lesser trochanter, may be identified as a landmark that is assumed to be in the same location on each of the left and right femurs but in mirror image relationship to each other about a centerline of the body.
  • Other identifiable relationships between different anatomical structures can also be identified and used in a similar manner as described herein.
  • the second volume data set is obtained during a different portion of a procedure than the first volume data set, such as intra-operatively.
  • the second data set is an intra-operative volume data set of the same bone or portion thereof that is obtained while the patient is anesthetized and lying on an operating table.
  • the first and second volume data sets could be collected during the same portion of a surgical procedure, such as both being collected intra-operatively or both being collected pre-operatively.
  • one or more volume data sets may be collected post-operatively, such as to aid in post-operative diagnostics, for example.
  • the second data set can be collected using the same modality or a different modality than the first data set.
  • both the first and second volume data sets are obtained using a 3D ultrasound imaging system having a tracking device attached thereto for being tracked by the surgical navigation system.
  • the second data set is also collected using the ultrasound probe 32 tracked by the surgical navigation system 20 .
  • the second reference frame can be an arbitrary reference frame and can be assigned or determined in a similar manner as described herein.
  • the second arbitrary reference frame preferably is defined uniquely by the second volume data set.
  • the second arbitrary reference frame is defined by the anatomical structure in the same manner as described previously herein with respect to the first anatomical structure.
  • the second reference frame can be the same as or different from the first reference frame.
  • Control also passes to a block 114 that identifies an inherent feature of the second volume data set, such as a spatially unique physical aspect of the volume data set.
  • the inherent feature of both volume data sets is preferably the same because the same anatomical feature would have the same unique spatial aspect, such as the image moment of inertia, for both the first and second volume data sets.
  • the moment of inertia of the structure is constant and substantially unique relative to a given structure regardless of the point of view from which the structure is viewed.
  • the moment of inertia of the same anatomical feature is uniquely identifiable in different volume data sets of the anatomical feature taken from different points of view.
  • the second arbitrary reference frame may also be uniquely spatially associated with the image moment of inertia.
  • a block 116 correlates the second reference frame to the inherent feature of the second volume data set in the same manner as in block 106 or any sufficient manner.
  • Control passes to a block 118 that registers the first reference frame into the second reference frame.
  • the block 118 performs data set matching, such as by finding the unique moment of inertia of each image and correlating the first and second arbitrary reference frames to each other based on matching the moment of inertia of the first volume data set to the moment of inertia of the second volume data set and calculating an appropriate transformation matrix therefrom.
  • Another possible method of data set matching may include conducting a different volume data match of the first and second volume data sets, whereby first and second volume data sets are virtually overlaid and correlated to each other using any suitable or commonly known method.
  • the registration may also be performed using other methods, such as volume-volume matching, surface-surface matching, and/or point-to-point matching.
  • a mathematical transformation including rotation, translation, and scale is calculated preferably by the computer system 24 that will register a common or assumed common spatially unique feature in the two volume data sets.
  • the computer system 24 then transforms one or both of the arbitrary reference frames to bring the spatially unique feature, and thus the volume data sets, into registration with each other.
  • Other equally efficacious methods of calculating and performing a suitable transformation and registration may be used as would be known to a person skilled in the art.
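One concrete way to calculate the rotation-and-translation transformation mentioned above is the Kabsch/SVD solution for point-to-point matching, which is among the matching approaches the text names. This is a sketch of that standard algorithm, not the patent's own implementation, and it assumes corresponding point pairs are already available:

```python
import numpy as np

def rigid_registration(P, Q):
    """Least-squares rotation R and translation t with R @ p + t ~= q
    for corresponding points P[i] <-> Q[i] (Kabsch algorithm).

    The sign correction via the determinant keeps R a proper rotation
    (no reflection)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)           # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

On noiseless correspondences the recovered transformation is exact; with real volume data the same machinery gives the least-squares best fit.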
  • Control passes to a block 120 that relates the additional spatial information from the first reference frame, such as functional information for the anatomical structure, to the second reference frame.
  • the additional spatial information from the first volume data set is related to the second arbitrary reference frame of the second volume data set after (or contemporaneously as) the first and second volume data sets have been brought into registration.
  • the additional spatial information is associated with the second arbitrary reference frame in correct registration therewith even though the additional spatial information is not directly available when the second volume data set is acquired.
  • the additional spatial information includes the gravity vector as described above, the gravity vector is associated with and brought into registration with the second volume data set in proper orientation to the anatomical structure even when the anatomical structure is in a different orientation.
  • the additional spatial information includes a vector that identifies the location and orientation of another reference frame as described above
  • the location and orientation of the non-contiguous volume data sets not part of the second volume data set may be identified based on the association and registration of the vector information comprising the additional spatial information.
  • control may then pass to a block 122 that displays the additional spatial information in registration with the second volume data set on a display device. Additional manipulations and uses of the additional spatial information may also be performed as desired.
  • the blocks 100 - 122 described above can be rearranged, reordered, or modified by combining steps into fewer steps or breaking steps down into further sub-steps, as would be apparent to one of skill in the art. As shown in FIG. 2A , the steps 100 - 122 in some instances correlate with the steps 80 - 92 shown in FIG. 2 , such as by being considered sub-steps thereof. Further, the logic represented by the flowcharts of FIGS. 2 and 2A can be implemented by any computer system that is adapted to implement the blocks 80 - 92 and/or 100 - 122 , such as the surgical navigation system 20 of FIG. 1 . In one embodiment, the computer system 24 includes appropriately programmed software and hardware to perform the blocks and processes of FIGS. 2 and 2A .
  • FIGS. 3A , 3 B, and 3 C illustrate an example where the additional spatial information includes functional information for an anatomical structure that is obtained from a pre-operative volume data set and related to an intra-operative volume data set so that the functional information can be used during a surgical procedure on a patient.
  • a gravity vector that acts on the various bones that make up the hip joint can be an important factor for the precise positioning of components of a prosthetic hip implant based on certain functional motion characteristics of the patient.
  • functional information may be obtained for the anatomical structure in a plurality of different positions or over a period of time.
  • functional information can be obtained for a knee joint in various positions to collect extension/flexion, varus/valgus, and other information for use during a surgical procedure.
  • a prosthetic component is placed accurately and effectively using the arbitrary reference frames discussed herein instead of relying on local biomechanical/anatomical references, such as a femur mechanical axis, pelvic frontal plane, or other standard local anatomical reference frames.
  • a “local” reference frame is a reference frame based on specific accepted or pre-defined anatomical features of a patient, such as specific skeletal landmarks.
  • an “arbitrary reference frame” refers to a reference frame that is identified uniquely based solely on the feature being looked at, such as the specific volume data set being viewed.
  • the arbitrary reference frame is not dependent on locations of one or more specific pre-defined anatomical landmarks with respect to other portions of the anatomy but is correlated to and identifiable from unique spatial characteristics of only the anatomy of interest.
  • Relating the functional information, such as the gravity vector G, to the intra-operative procedure facilitates placement of correct prosthetic components in an optimal position and alignment based on the natural motion and movement patterns of the patient.
  • the position and alignment of the prosthetic can further be optimized using a plurality of parameters that include, for example, joint specific anatomical and kinematic constraints, patient life-style specific activities, and prosthetic design specific geometrical and kinematic constraints. Still further optimization can be realized through incorporation of other relevant factors that arise or become visible intra-operatively, such as after the preparation of a joint surface to accept a prosthetic component.
  • FIG. 3C illustrates steps of one specific example method, performed on a computer implemented surgical navigation system such as the surgical navigation system 20 , that utilizes additional spatial information, including the functional information shown in FIGS. 3A and 3B .
  • a block 170 acquires a pre-operative image volume data set 150 of a hip 152 or of parts of the hip of a patient 154 while the patient is standing.
  • the image volume data set 150 is acquired using the ultrasound probe 32 while being tracked by the camera array 26 to gather image data of the bones of interest in the hip joint, and the image data is stored in a suitable electronic memory available to the surgical navigation system.
  • a block 172 defines an arbitrary axis, such as an axis of the camera array, to the image volume data set 150 , identifies a unique spatial parameter of the image volume data set, and correlates the arbitrary axis to the spatial parameter.
  • the unique spatial parameter preferably includes the image moment of inertia of the image volume data set of the hip bones, calculated as discussed previously herein.
  • the arbitrary axis is optionally also correlated to a local anatomical parameter of the hip 152 , such as a frontal plane 156 of the hip 152 as shown in FIG. 3A .
  • a block 174 assigns additional spatial information including functional information, such as a gravity vector G, to the image volume data set 150 .
  • the gravity vector G is shown pointing downwards to the floor in relation to the hip 152 in FIG. 3A because the patient is standing upright while the image volume data set 150 is obtained.
  • the gravity vector is acquired using an inertial system with an accelerometer, but can be acquired by any sufficient system known in the art, such as a liquid level measurement system or other systems.
  • the gravity vector G is spatially assigned by determining an orientation of the gravity vector G with respect to the anatomical parameter and/or arbitrary axis, such as a specific tilt angle with respect to the frontal plane 156 .
  • Other anatomical parameters can also be defined, for example, a functional plane 158 , an iliac crest, a pubic symphysis, and the like, and various planes and angles defined thereby, wherein the gravity vector G can be related to such anatomical parameters.
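The tilt angle between the gravity vector G and a reference plane such as the frontal plane 156 reduces to a small vector computation. A minimal sketch, assuming the plane is represented by its normal:

```python
import numpy as np

def tilt_angle_deg(gravity, plane_normal):
    """Tilt of the gravity vector with respect to a plane (e.g. the
    frontal plane): the angle between the vector and the plane itself,
    in degrees. With unit vectors, angle-to-plane = arcsin(|g . n|)."""
    g = gravity / np.linalg.norm(gravity)
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(np.degrees(np.arcsin(abs(g @ n))))
```

A gravity vector perpendicular to the plane gives 90 degrees; one lying in the plane gives 0 degrees.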
  • a block 176 collects an intra-operative image volume data set 160 of the same general area while the patient 154 is lying in a generally prone position in a similar manner as with the block 170 .
  • a block 178 performs a data match to register the pre-operative image volume data set 150 and the intra-operative image volume data set 160 using any suitable data matching technique, such as image inertia matching or the volume data matching techniques discussed previously. Due to such registration, the gravity vector G is simultaneously or subsequently transferred to the intra-operative volume data set 160 for use during the procedure.
  • a block 180 displays the image volume data set 160 on the display unit 22 with the gravity vector G shown in registration with the bones of the hip.
  • a replacement prosthesis can then be aligned with the bones of the hip using the surgical navigation system 20 so as to have a relationship to the bones that has been preselected based on the position of the bones with respect to the gravity vector G.
  • The method shown and described in relation to FIGS. 3A-3C is preferably implemented on the surgical navigation system 20 , and blocks 172 , 174 , 178 , and 180 are preferably performed by appropriate computer software routines associated with and preferably controlling the computer system 24 in any available manner known to one skilled in the art.
  • FIGS. 4A-4C illustrate an example of a volume data match registration procedure, wherein three data set collection screen shots are shown.
  • a pre-operative volume data set 190 of a hip of a patient is collected by an ultrasound probe and shown on a display screen as shown in FIG. 4A , and a first arbitrary reference frame 192 is assigned thereto.
  • additional spatial information such as a gravity vector G, is associated in a unique spatial location with respect to the volume data set 190 .
  • An intra-operative volume data set 194 of the hip is then collected, and a second arbitrary reference frame 196 is assigned thereto.
  • the anatomical parameter of the hip includes a right iliac crest 198 , a left iliac crest 200 , and a pubic symphysis 202 in the pre-operative volume data set.
  • the same right and left iliac crests and pubic symphysis are also identified in the intra-operative volume data set 194 , wherein such structures in the pre-operative and intra-operative volume data sets can establish a unique arbitrary reference frame for the data sets.
  • the image moment of inertia is calculated for each of the pre-operative and intra-operative volume data sets 190 , 194 such that the image moment of inertia of the pre-operative volume data set 190 is identical or sufficiently close to the same as the image moment of inertia of the intra-operative volume data set 194 .
  • FIG. 4B shows the pre-operative volume data set 190 and the intra-operative volume data set 194 of the hip overlapping before registration, such as with the first and second arbitrary reference frames 192 , 196 aligned and without being registered.
  • the pre-operative data set 190 and intra-operative data set 194 are then registered by a reference frame transfer by overlaying and correlating the reference frames of the respective data sets, as illustrated, for example, in FIG. 4C , which shows the pre-operative and intra-operative volume data sets 190 , 194 of the hip overlapping after registration.
  • the transformation between the data sets is used to relate the additional spatial information, such as the gravity vector, from the pre-operative volume data set 190 to the intra-operative volume data set 194 .
  • the gravity vector G is determined with respect to an anatomical parameter of the hip in the pre-operative volume data set and assigned to the volume data set in unique spatial orientation thereto.
  • The methods of FIGS. 3A-3C and 4A-4C provide an improvement over prior surgical procedures, which typically used the functional plane 158 as an approximation for the gravity vector G.
  • the functional plane 158 can be determined using known methods, for example, by determining an anterior superior iliac spine and by integration of a longitudinal body axis. While the functional plane 158 has provided a rather good approximation of the gravity vector, there is generally a difference of about 5° to about 10° between the orientation of the functional plane 158 and the gravity vector G. Therefore, detecting and assigning the gravity vector G is a much more accurate and reliable method for considering functional aspects of the patient during normal activity and movement parameters.
  • an unaffected area is used to mirror an affected area to provide symmetrical biomechanical parameters, such as spatial relationships and angles, to repair the affected area.
  • As shown in FIG. 5 , in the case of a patient with a broken left femur, image data and functional information can be obtained for a healthy right femur of the patient.
  • the functional information for the right femur can be related to the broken left femur using the procedure of FIG. 2 to provide symmetrical biomechanical parameters to repair the broken left femur.
  • data sets for both the right and left femurs normally are obtained intra-operatively due to the circumstances of typical trauma surgery, as would be apparent to one of ordinary skill in the art.
  • a diagrammatic image data set 210 of a left femur 212 and a right femur 214 shows that the left femur 212 has suffered some trauma, such as a severe fracture 216 across the femur neck and separating the left femur head 218 , while the right femur 214 is unaffected.
  • the image data set 210 can be obtained all at once by any suitable modality, such as with the ultrasound probe 32 and the surgical navigation system 20 as described earlier, and stored on the computer system 24 .
  • the computer system 24 then develops information for the left and right femurs 212 , 214 from the image data set 210 .
  • the information includes a first volume data set 220 including the left femur head 218 and a second volume data set 222 including a part of the left femur body 224 .
  • the volume data sets 220 and 222 are preferably not contiguous to each other and/or are mathematically isolated from each other on opposite sides of the fracture 216 .
  • the information also includes a third volume data set 226 including the unaffected right femur 214 .
  • the third volume data set 226 includes a volume data set 228 of the head of the right femur 214 and a noncontiguous volume data set 230 of a part of the right femur body.
  • the right femur 214 is preferably held immobile while the entirety of the volume data set 226 , including the volume data sets 228 and 230 , is obtained.
  • a tracking device (not shown) may be attached to the femur 214 while the volume data set 226 is obtained in order to provide additional robustness to the position information of the volume data sets 228 and 230 relative to each other and/or correct for movement of the right femur while the volume data set 226 is obtained.
  • Each volume data set 220 , 222 , 228 , and 230 is assigned an arbitrary reference frame, 232 , 234 , 236 , and 238 , respectively.
  • Each reference frame 232 , 234 , 236 , and 238 preferably has a location and orientation that is correlated to a uniquely identifiable aspect of the volume data set, such as the image moment of inertia described herein.
  • Additional spatial information comprising a vector 240 that uniquely defines the spatial relation, including position and orientation, of the volume data sets 222 and 230 to each other is established.
  • the reference frames 236 and 238 are spatially correlated with each other in a global reference frame 242 , such as of the camera array 26 , by the vector 240 even though the two volume data sets are not contiguous with each other.
  • Calculation of the additional vector 240 may not be necessary or may be used to provide additional mathematical robustness by providing redundant measurements in an example where the volume data set 228 is contiguous with the volume data set 230 .
  • the first volume data set 220 and the second volume data set 222 of the affected left femur 212 are matched and registered with corresponding portions 228 and 230 , respectively, of the third volume data set 226 of the unaffected right femur 214 in order to determine a reconstructed position of the left femur head 218 and left femur body 224 that will match corresponding portions of the unaffected right femur 214 .
  • the registration process includes performing a reference frame transfer from the reference frame of the unaffected right femur 214 to the reference frame of both volumes 220 and 222 of the affected left femur 212 .
  • it is assumed that the shape and position of the left femur 212 should be identical to, and a mirror image of, the shape and position of the right femur 214 about a centerline therebetween. It is also assumed that the shapes of portions of the right femur 214 captured in the volume data sets 228 and 230 correspond substantially to the shapes of corresponding portions of the left femur 212 captured in the respective volume data sets 220 and 222 . With these assumptions, the reference frames 236 and 238 and corresponding volume data sets 228 and 230 and the vector 240 of the right femur 214 are mathematically mirrored about a centerline 244 to be in position to match the left femur 212 .
  • Each of the volume data sets 220 and 222 of the left femur 212 is then matched with the corresponding mirrored volume data set 228 or 230 of the right femur 214 .
  • the volume data sets 222 and 230 may both include an easily identifiable three-dimensional feature, such as the lesser trochanter 246 , which can be used to register the volume data set 222 with the mirrored volume data set 230 .
  • an image moment of inertia is calculated for both volume data sets 222 and 230 , and the image moments of inertia are then matched after mirroring the right femur information.
  • Other methods of registering mirrored corresponding volume data sets may also be used.
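The image-moment matching described above can be illustrated with a minimal sketch: an arbitrary reference frame is derived from a voxel volume's intensity-weighted centroid and the principal axes of its second-moment (inertia-like) tensor. This is an illustrative reconstruction, not the patented implementation; a practical version must also resolve the sign and ordering ambiguities of the eigenvectors before two frames can be matched:

```python
import numpy as np

def moment_frame(volume):
    """Derive an arbitrary reference frame from a voxel volume's image moments:
    origin at the intensity-weighted centroid, axes along the principal axes
    of the weighted second-moment tensor."""
    idx = np.argwhere(volume > 0).astype(float)   # voxel coordinates
    w = volume[volume > 0].astype(float)          # voxel intensities as weights
    centroid = (idx * w[:, None]).sum(axis=0) / w.sum()
    d = idx - centroid
    cov = (d * w[:, None]).T @ d / w.sum()        # weighted second moments
    eigvals, eigvecs = np.linalg.eigh(cov)        # principal axes (columns, ascending)
    return centroid, eigvecs

# Toy "bone" volume: an elongated block, so the major principal axis
# should run along the long (z) dimension.
vol = np.zeros((8, 8, 32))
vol[2:6, 2:6, 4:28] = 1.0
centroid, axes = moment_frame(vol)
major_axis = axes[:, -1]                          # largest-eigenvalue direction
```

Matching two volumes then reduces to aligning their centroids and principal axes, which is what lets the frames be assigned without any pre-defined anatomical landmark.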
  • the mirrored vector 240 ′ and volume data set 228 define the theoretically correct position of the left femur head 218 in relation to the left femur body 224 .
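The mirroring of the reference frames and the vector 240 about the centerline 244 amounts to reflecting points about a plane. A minimal sketch, with hypothetical coordinates and a mid-sagittal plane assumed at x = 0; note that reflecting a full reference frame also flips handedness, which a real registration must account for:

```python
import numpy as np

def mirror_about_plane(points, plane_point, plane_normal):
    """Reflect 3-D points about a plane (e.g. the body centerline plane 244)."""
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    p = np.asarray(points, float)
    d = (p - plane_point) @ n                  # signed distance of each point to plane
    return p - 2.0 * d[:, None] * n

# Hypothetical frame origins on the unaffected right femur (data sets 228, 230),
# mirrored about a mid-sagittal plane at x = 0 to overlay the left femur.
right_head = np.array([[120.0, 0.0, 900.0]])
right_body = np.array([[110.0, 5.0, 600.0]])
left_head_est = mirror_about_plane(right_head, np.zeros(3), [1.0, 0.0, 0.0])
left_body_est = mirror_about_plane(right_body, np.zeros(3), [1.0, 0.0, 0.0])
```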
  • tracking devices 246 and 248 are attached to each of the left femur body 224 and the left femur head 218 and independently tracked by the surgical navigation system 20 during the entire procedure. A surgeon is then able to adjust the pieces 218 and 224 of the left femur to align with the theoretically derived locations based on the registration to the mirrored volume data sets 228 , 230 of the right femur 214 .
  • volume data sets 220 , 222 , 226 , 228 , and 230 can be chosen arbitrarily. Therefore, time and computing resources are saved because there is no need to establish a local reference frame.
  • the method is not limited to work done on a femur as described here, but can be used with minor modifications for any anatomical structure that is substantially mirrored on opposite sides of a centerline of a body, such as arms, ribs, feet, hands, hip, etc.
  • the present disclosure also contemplates relating spatial information, such as an anatomical reference frame, from one data set of an anatomical structure at a first time to another data set of the same anatomical structure at another time.
  • An exemplary situation shown in FIGS. 6A and 6B is the ability to define an anatomical reference frame pre-operatively that is not accessible during a surgical procedure due to the positioning and draping of the patient on a surgical table.
  • a pre-operative volume data set of a forearm 250 , including image data of the underlying bone(s), is collected using the tracked ultrasound probe 32 and surgical navigation system 20 at a stage when the forearm is accessible.
  • a first volume data set 252 at one location of the forearm 250 is obtained and a second volume data set 254 at another location of the forearm 250 is obtained when both locations are accessible, as shown in FIG. 6A .
  • the pre-operative data set preferably includes regions of the forearm 250 that will remain accessible after the area of interest has been prepared and draped, as well as regions that will otherwise be inaccessible.
  • An anatomical reference frame or other spatial information, such as a global reference frame 256 of the camera assembly 26 , is defined at a stage when the anatomy is accessible, and arbitrary reference frames 258 and 260 are defined for volume data sets 252 and 254 , respectively.
  • Each reference frame 258 , 260 is uniquely identifiable from the associated volume data set, such as by having a known relation to the image moment of inertia of the respective volume data set 252 , 254 as discussed previously.
  • the anatomical reference frame or other global reference frame 256 is geometrically associated with the arbitrary reference frames 258 , 260 for the pre-operative data set, and a vector 262 is determined that associates the arbitrary reference frames 258 and 260 and the respective volume data sets 252 and 254 in a unique spatial position relative to each other.
  • the portion of the forearm 250 corresponding to volume data set 254 may not be accessible to the ultrasound probe 32 , such as due to draping as shown in FIG. 6B , or any other reason.
  • a subsequent volume data set 252 ′ having substantial overlap with the volume data set 252 is collected using the ultrasound probe 32 and surgical navigation system 20 , and the arbitrary reference frame 258 is re-established to relate the anatomical or global reference frame 256 to the subsequent volume data set 252 ′.
  • the arbitrary reference frame 258 can be re-established in any sufficient manner, such as by matching the image moment of inertia of the volume data sets 252 and 252 ′ in a substantially similar manner as described previously herein, by other volume matching or surface matching methods, etc.
  • the computer system 24 re-establishes the location of the second volume data set 254 based on the vector 262 using appropriate program routines even though that area of the forearm 250 is not accessible.
  • the surgical navigation system 20 is able to re-establish the locations of portions of the bones of the forearm 250 based on being able to view just one volume portion of the bone that was previously identified without having to either view the other volume portions that were previously identified or define local anatomical landmarks as discussed above.
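The re-establishment step can be sketched as a chain of transforms: once the arbitrary reference frame of the accessible volume is re-found intra-operatively, composing it with the stored relative transform (the analog of the vector 262) yields the pose of the inaccessible region. All numbers and the matrix convention below are hypothetical:

```python
import numpy as np

def reestablish_hidden_frame(T_global_frame258_new, T_258_to_260):
    """Once frame 258 is re-found from the accessible volume (252'), the stored
    relative transform (analog of vector 262) yields the pose of the
    now-inaccessible frame 260 without re-imaging it."""
    return T_global_frame258_new @ T_258_to_260

# Pre-operatively stored relation: frame 260 sits 180 mm along frame 258's z-axis.
T_258_260 = np.eye(4)
T_258_260[:3, 3] = [0.0, 0.0, 180.0]

# Intra-operative pose of the re-established frame 258 (patient has been
# repositioned and draped); rotated 90 degrees about z and translated.
T_new_258 = np.eye(4)
T_new_258[:3, :3] = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
T_new_258[:3, 3] = [50.0, 20.0, 0.0]

T_new_260 = reestablish_hidden_frame(T_new_258, T_258_260)
hidden_origin = T_new_260[:3, 3]       # location of the inaccessible region
```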
  • the example provided herein may be applied similarly to any anatomical feature that maintains a relatively stable structural geometry over time, such as any bone, and may be extended to apply to any number of spatially inter-connectable volume data sets.
  • Functional information about an anatomical structure may also be developed by collecting a plurality of volume data sets of the same anatomical structure in a plurality of different positions at corresponding different times without the need to identify a local anatomical reference frame.
  • One example of this application is shown in FIGS. 7A and 7B , wherein functional motion parameters of a hip of a patient 270 are determined.
  • a first volume data set 272 is gathered of a portion 274 of the patient's femur and a first volume data set 276 is gathered of a portion 278 of the pelvis, both with the patient's leg in an extended position.
  • a second volume data set 272 ′ is gathered of substantially the same portion of the patient's femur 274 and a second volume data set 276 ′ is gathered of substantially the same portion 278 of the patient's pelvis, both with the patient's leg in a flexed position.
  • Each volume data set 272 , 276 , 272 ′, and 276 ′ is assigned an arbitrary reference frame 280 , 282 , 284 , and 286 , respectively, that is correlated to a known position in relation to a uniquely identifiable feature of the respective volume data set.
  • each arbitrary reference frame 280 , 282 , 284 , and 286 is correlated to an image moment of inertia of each volume data set 272 , 276 , 272 ′, and 276 ′, although other identifiable unique attributes of a particular volume data set could be used, as discussed herein.
  • Additional positional information, including a vector 288 between the volume data sets 272 and 276 and a vector 288 ′ between the volume data sets 272 ′ and 276 ′, is calculated based on the relation of the volume data sets to a global reference frame 290 , such as a reference frame of the camera array 26 .
  • a gravity vector G may be correlated to one or more of the arbitrary reference frames 280 , 282 , 284 , and 286 , as described earlier.
  • the reference frames of different volume data sets of the same volume, such as 272 and 272 ′, are registered with each other based on the uniquely identifiable feature of the volume in any suitable manner such as already discussed herein.
  • the process of obtaining volume data sets of the same portion of the femur and the same portion of the pelvis can be repeated in several additional different positions, such as to define a movement cone of the hip under regular use conditions.
  • the volume data sets 272 , 276 , 272 ′, and 276 ′ preferably are obtained using one or more ultrasound probes 32 that are tracked by the camera array 26 of the surgical navigation system 20 in a manner as described previously.
  • each of the volume data sets 272 , 276 , 272 ′, and 276 ′ is obtained using only a single tracked ultrasound probe 32 .
  • multiple tracked ultrasound probes 32 are used simultaneously to continuously obtain simultaneous volume data sets of each anatomical structure as the patient's leg, for example, is moved in different positions.
  • Functional motion parameters of the hip joint that are spatially related to the various volume data sets, such as a range of motion cone and the gravity vector G, may then be calculated based on the various volume data sets without the necessity of defining and/or determining local anatomical reference frames based on predefined anatomical features.
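One hedged illustration of such a functional calculation: expressing the femur volume's frame in the pelvis volume's frame for each pose, then extracting the rotation between poses as a flexion angle. The poses are invented for the example and stand in for the tracked frames 280, 282, 284, and 286:

```python
import numpy as np

def femur_in_pelvis(T_global_pelvis, T_global_femur):
    """Pose of the femur volume's frame expressed in the pelvis volume's frame."""
    return np.linalg.inv(T_global_pelvis) @ T_global_femur

def rotation_angle_deg(R):
    """Rotation angle of a 3x3 rotation matrix, from its trace."""
    c = (np.trace(R) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def rot_x(deg):
    """Homogeneous transform rotating by deg about the x-axis."""
    a = np.radians(deg)
    R = np.eye(4)
    R[1, 1], R[1, 2] = np.cos(a), -np.sin(a)
    R[2, 1], R[2, 2] = np.sin(a), np.cos(a)
    return R

# Hypothetical camera-frame poses: pelvis fixed, femur flexed 60 degrees
# between the extended and flexed leg positions.
T_pelvis = np.eye(4)
T_femur_ext = np.eye(4)          # leg extended
T_femur_flex = rot_x(60.0)       # leg flexed

rel_ext = femur_in_pelvis(T_pelvis, T_femur_ext)
rel_flex = femur_in_pelvis(T_pelvis, T_femur_flex)
motion = np.linalg.inv(rel_ext) @ rel_flex
flexion = rotation_angle_deg(motion[:3, :3])
```

Repeating this over many poses yields the samples from which a range-of-motion cone could be fit.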
  • one or more computer systems 24 associated with the surgical navigation system 20 perform the required calculations and store all associated data in memory associated therewith in a manner known in the art.
  • the same or similar functional motion analyses may be performed on other portions of the body as well in a similar manner.
  • the methods and systems described herein can facilitate the relation of information from one data set to another data set, wherein the information would not otherwise be available or easily obtainable in the other data set.
  • the methods and systems disclosed herein in many aspects advantageously utilize arbitrarily defined unique reference frames in different data sets to easily register the data sets and relate the information from one to another without requiring identification and use of specific landmarks that can be compared and/or matched across two or more data sets.
  • Specific procedures that may benefit from the teachings disclosed herein include surgical procedures, such as joint arthroplasty to perform functional assessments during surgery and trauma surgery to mirror information from an unaffected anatomical structure to an affected anatomical structure.

Abstract

Methods and systems are disclosed for relating additional spatial information associated with one volume data set of an anatomical structure with another volume data set of the anatomical structure where the spatial information is not available. A unique spatial characteristic of the volume data set is identified, such as an image moment of inertia, and an arbitrary reference frame is assigned to the volume data set and correlated with the unique spatial characteristic. The additional spatial information is also correlated with the arbitrary reference frame. The additional spatial information is then correlated to a second volume data set of the anatomical structure by registering the first and second volume data sets based on the unique spatial characteristic. The methods and systems allow registration of different volume data sets of the same anatomical structure and transfer of the additional spatial information without establishing a local reference frame based on predefined landmarks.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • Not applicable
  • REFERENCE REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • Not applicable
  • SEQUENCE LISTING
  • Not applicable
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present disclosure relates to systems and methods for registering anatomical image data sets and relating anatomical information between anatomical image data sets.
  • 2. Description of the Background of the Invention
  • There exist various techniques in computer assisted surgical procedures to register patient related data across different modalities and/or different time frames. Such patient related data can include, for example, anatomical information or image data obtained using a variety of imaging techniques or modalities, such as ultrasound, magnetic resonance imaging (“MRI”), computed tomography (“CT”), single photon emission computed tomography, positron emission tomography, etc. One technique to register patient related data across different modalities is a “point to point” or “paired point” matching technique, wherein landmarks or fiducials that can be identified across different modalities are used to determine a transformation matrix and establish a spatial relationship between the different modalities. In one example, landmarks or fiducials are placed on a patient prior to an image scan, for example, an MRI or CT scan, and such landmarks or fiducials are identified in the image scan and on the patient during a surgical procedure to establish the registration between the patient and patient related data from the image scan.
  • In another technique, surface registration is used, wherein multiple surface points of a structure or region of interest are used to establish a registration surface. The surface points are identified independently in different modalities often using different techniques. In one embodiment, the surface registration technique is used in ear-nose-throat surgery where a face of a patient is used as a registration surface. For example, a CT or MRI scan of the face of the patient is obtained and the surface of the skin is identified in the scan and then matched to digitized points on the face of the patient during surgery. Such digitized points can either be collected directly with a digitization device, such as a pointer, or indirectly via a registration mask.
  • The above registration techniques generally serve only to register patient related data from one modality to a different modality. Most commonly, the registration techniques register pre-operative image data of a patient to the anatomy of the patient during surgery for localization purposes of surgical instruments used to perform the surgery.
  • In some types of procedures, such as procedures related to musculo-skeletal ailments, biomechanical and functional information of joints plays an important role in determining the extent or cause of a disease. Such information is generally captured through a motion analysis. In one example of a motion analysis, fiducials are placed on the skin of a body part to be analyzed. A navigation system tracks the fiducials as the body part is moved and the movement of the fiducials is analyzed to establish a biomechanical model of the body part. An obvious downside of this technology is that the fiducials do not directly relate to the underlying bony structures and that shifts in skin or soft tissue occur during motion. Such shifts can contribute to relatively large motion artifacts and inaccuracies in the results of the motion analysis and the established biomechanical model.
  • A technique that overcomes soft tissue shift is the direct implantation of fiducials, such as small tantalum beads, onto the bones of the subject, wherein the fiducials are tracked using stereo-radiography techniques during movement of the body part of the patient. Some of the obvious disadvantages of this technique are that a surgical procedure is required for bead implantation and that the motion analysis utilizes ionizing energy.
  • Further, during a surgical procedure, a motion analysis may not adequately capture functional information of the joints if the motion of the limb is passive. For example, when a surgeon moves the limbs of a patient, or if the patient is anesthetized and lying on an operating room table, no voluntary muscular forces are active to counter the effects of gravity on the body masses.
  • As surgical procedures around musculo-skeletal ailments start to shift away from pure static standing considerations to a more functional assessment of the joints and towards early intervention, the ability to capture joint related functional information and easily relate such information to the planning and execution of surgical procedures becomes increasingly important.
  • SUMMARY OF THE INVENTION
  • According to some aspects, a computer-implemented method of registering information associated with a first data set to a second data set is disclosed. The method comprises the steps of collecting a first data set of an anatomical structure with an imaging device, developing additional information for the first data set, wherein the additional information has a unique identifiable spatial relationship to the structure of the first data set, and establishing a first arbitrary reference frame for the first data set. The first reference frame is established without reference to any pre-selected landmark on the structure, and the first reference frame has a unique spatial relationship to the first data set. The method also comprises the steps of collecting a second data set of an anatomical structure with an imaging device, establishing a second arbitrary reference frame for the second data set, transforming the first reference frame to the second reference frame by matching a unique spatial parameter of the first data set with the same unique spatial parameter of the second data set, and registering the additional information with the second data set.
  • According to other aspects, a computer-implemented method of associating spatial information related to a first volume data set of an anatomical structure with a second volume data set of the anatomical structure is disclosed. The method includes the steps of obtaining a first volume data set of the anatomical structure with a computer surgical navigation system, assigning a first arbitrary reference frame to the first volume data set, calculating an inherent feature in the first volume data set, correlating the inherent feature to the first arbitrary reference frame, and associating additional spatial information with the first volume data set. The inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position, and the additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame. The method further includes the steps of obtaining a second volume data set of the anatomical structure with a computer surgical navigation system, assigning a second arbitrary reference frame to the second volume data set, identifying the inherent feature in the second volume data set, and correlating the inherent feature to the second arbitrary reference frame. The method also includes the steps of registering the first volume data set with the second volume data set based on the inherent feature, correlating the additional spatial information to the second volume data set in registration therewith, and displaying the additional spatial information in registration with the second volume data set on a display device. The registering step is performed by a computer.
  • According to additional aspects, a system for collecting and manipulating a volume data set of an anatomical structure includes means for obtaining a first volume data set of an anatomical structure of a patient and a second volume data set of the anatomical structure, and means for calculating an inherent feature of the first volume data set and the second volume data set. The inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position. The system further includes means for assigning a first arbitrary reference frame to the first volume data set and a second arbitrary reference frame to the second volume data set, means for correlating the inherent feature to the first arbitrary reference frame, and means for associating additional spatial information with the first volume data set. The additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame. The system also includes means for registering the first volume data set with the second volume data set based on the inherent feature, and means for correlating the additional spatial information to the second volume data set in registration therewith.
  • According to further aspects, a method of establishing a position of a portion of a bone that has been altered from a normal shape includes the step of collecting a first volume data set for a first bone that is unaltered, wherein the first volume data set includes volume data for first and second portions of the first bone. The method also includes the steps of identifying a first unique spatial characteristic of the volume data for the first portion of the first bone, establishing a first arbitrary reference frame for the first volume data set correlated with the first unique spatial characteristic, and identifying a unique spatial relation between the first arbitrary reference frame and the second portion of the first bone. The method further includes the step of identifying a second bone that normally mirrors the first bone about a centerline, wherein the second bone includes a first portion and a second portion that correspond as substantially mirror structures to the first and second portions of the first bone, respectively, and wherein the second bone has been altered from a normal shape such that the first portion of the second bone is in an altered position with regard to the second portion of the second bone. The method further includes the steps of collecting a second volume data set of the first portion of the second bone, identifying a second unique spatial characteristic of the second volume data set, wherein the second unique spatial characteristic substantially mirrors the first unique spatial characteristic, registering in mirrored correlation the first volume data set with the second volume data set by correlating the first unique spatial characteristic with the second unique spatial characteristic, and re-establishing the normal position of the second portion of the second bone to coincide with the position of the second portion of the first bone as related to the registered position of the first portion of the first bone.
  • Other aspects and advantages of the present invention will become apparent upon consideration of the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic view of an embodiment of a surgical navigation system adapted to implement methods of the present disclosure;
  • FIGS. 2 and 2A are flowcharts of registration procedures according to the present disclosure;
  • FIGS. 3A, 3B, and 3C illustrate the development of additional spatial information for an anatomical structure represented by a first data set and the relation of such information to a second data set;
  • FIG. 4A is an example screen shot and FIGS. 4B and 4C are visual representations that depict data set collection and registration;
  • FIG. 5 illustrates the development of additional spatial information for a first anatomical structure and the relation of such information to a second anatomical structure that mirrors the first anatomical structure;
  • FIGS. 6A and 6B show an example of an anatomical reference frame defined pre-operatively that is not accessible during a surgical procedure; and
  • FIGS. 7A and 7B show an example of determining functional motion parameters of a hip of a patient.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • Systems and methods of the present disclosure may be used to register different data sets related to one or more structures of a patient and/or to relate additional information from one such data set to another such data set, wherein the additional information may not be available or practically obtainable for the other data set. In many instances, positional information of non-contiguous regions of a body is tied together without the need to identify or relate to local anatomical reference frames based on pre-defined anatomical landmarks. In one application, functional information from one data set is related to another data set to facilitate the performance of a functional assessment of a structure. For example, the structure can be an anatomical structure, such as a bone or joint of a patient, and the volume data set can be an image data set of the bone or bones obtained using an ultrasound probe or other known imaging techniques or modalities. The functional information that is developed for the anatomical structure from a pre-operative image data set can be related to an intra-operative image data set to aid in the planning and execution of surgical procedures and/or to facilitate early identification and prevention of certain diseases or harmful conditions. In another embodiment, other information can be utilized, for example, to re-establish an anatomical reference frame that is accessible in one data set but not another. A further aspect of the present disclosure is the ability to register different data sets for a structure without a need for a predefined landmark or fiducial on the structure. Instead, arbitrary reference frames are established for different data sets and used to register such data sets.
  • Turning now to the drawings, FIG. 1 is a schematic view of a surgical navigation system 20 that is adapted to implement the steps of the procedure(s) disclosed herein. The surgical navigation system 20 includes a display unit 22, a computer system 24, and a camera array 26. In the present embodiment, the computer system 24 is housed in a moveable cart 28. The computer system 24 may be, for example, any type of personal computer having a memory unit, a CPU, and a storage unit (all not shown), as would be apparent to one of ordinary skill in the art. The display unit 22 can be any conventional display usable with the computer system 24, such as a standard computer monitor or television. An exemplary surgical navigation system is the Stryker Navigation system available from Stryker Corporation.
  • The surgical navigation system 20 is adapted to receive image data of a patient 30. In one embodiment, image data is obtained by an ultrasound probe 32 manipulated by a user 34, such as a surgeon or a nurse, and transmitted wirelessly to the computer system 24. Alternatively or additionally, a system that uses wires to transmit data between the ultrasound probe 32 and the computer system 24 can be used. In the present embodiment, the ultrasound probe 32 provides a non-invasive, non-ionizing, and portable imaging modality to obtain image data of the patient 30. Further, the ultrasound probe 32 provides image data for underlying bones to overcome skin shift related motion artifacts. However, in other embodiments, image data can be collected using any other acceptable imaging technique or modality, such as magnetic resonance imaging (“MRI”), computed tomography (“CT”), single photon emission computed tomography, positron emission tomography, and the like.
  • The camera array 26 is adapted to detect the position of a sensor 36 coupled to the ultrasound probe 32 to track the position and orientation of such ultrasound probe 32. By way of non-limiting examples, the sensor 36 can be one or more light emitting diodes (“LEDs”), the camera array 26 can include a first camera 38, a second camera 40, and a third camera 42, and the first, second, and third cameras 38, 40, 42, respectively, can be three CCD cameras that are adapted to detect infrared (“IR”) signals generated by the sensor 36. Although not shown, the user 34 can use other surgical tools and instruments that are capable of being tracked by the camera array 26 in the same manner as the ultrasound probe 32. These additional surgical tools and instruments may have sensors 36 that comprise, for example, LEDs, either built into the tool or instrument or physically associated therewith in a known or determinable position and orientation sufficient for tracking the position of the instruments.
  • The camera array 26 is mounted on a rotatable arm 44 attached to the movable cart 28 so that the camera array 26 has a sufficient line of sight to a relevant field where a procedure is to take place. In other embodiments, the camera array 26 may be mounted onto an operating room wall (not shown) or onto another convenient surface or location.
  • The surgical navigation system 20 can be an active optical system that includes at least one infrared transceiver that is used to communicate data to and from the sensor 36. For example, in the present embodiment, the camera array includes a first transceiver 46 and a second transceiver 48 located apart from each other. While the present disclosure is described using an active optical surgical navigation system, the systems and methods of the present disclosure can also be used with other surgical navigation technologies and systems, such as passive optical systems, magnetic based systems, inertial navigation based systems, and the like. Other computer-assisted systems also can be used including RFID based systems, video imaging based systems, and the like.
  • The camera array 26 is connected via a cable 50 to a localizer (not shown) or in some instances directly to the computer system 24. The localizer cooperates with the camera array 26 to identify the location and orientation of the sensor 36 on the ultrasound probe 32 within the line of sight of the camera array 26. In one embodiment, the localizer converts raw position data of the LEDs into the orientation of individual LEDs of a plurality of LEDs that make up the sensor 36 and transmits this information to the computer system 24. In another embodiment, the localizer converts raw position data of the LEDs into the position and orientation of the ultrasound probe 32 and transmits this information to the computer system 24. In a further embodiment, a software program executed by the computer system 24 can convert the raw data into the orientation of the ultrasound probe 32. The conversion of the raw position data is well known to one skilled in the art. The computer system 24 may optionally be controlled remotely by control buttons (not visible) located on the ultrasound probe 32 or otherwise easily accessible to the user 34. The computer system 24 also includes one or more input devices, such as a keyboard 52, a mouse 54, or any other input devices for operating the computer system 24.
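The conversion of raw LED positions into a probe pose is, as noted, well known; one standard approach (assumed here for illustration, not necessarily the localizer's actual method) is a least-squares rigid fit of the sensor's known marker geometry to the measured marker positions, often called the Kabsch algorithm:

```python
import numpy as np

def rigid_fit(model_pts, measured_pts):
    """Least-squares rigid transform (Kabsch) mapping the known model marker
    geometry onto measured marker positions: measured ~= R @ model + t."""
    cm = model_pts.mean(axis=0)
    cs = measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cs - R @ cm
    return R, t

# Hypothetical LED geometry on the sensor, and the same LEDs as seen by the
# camera array after a pure translation of [5, -2, 300] mm.
model = np.array([[0., 0., 0.], [40., 0., 0.], [0., 30., 0.], [0., 0., 20.]])
measured = model + np.array([5.0, -2.0, 300.0])
R, t = rigid_fit(model, measured)
```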
  • Referring next to FIGS. 2 and 2A, methods of registering a first volume data set of an anatomical structure to a second volume data set of the anatomical structure are disclosed. The methods are preferably performed using a computer surgical navigation system as disclosed herein, wherein the navigation system can track the position of one or more gathering devices used to gather the volume data sets, store position data and extract information therefrom, and correlate the position data with the volume data sets.
  • FIG. 2 describes a broad view of a method of registration, in which control initiates at a block 80 that collects a first data set for a structure, such as an anatomical structure of a patient. Following the block 80, control passes to a block 82 that develops additional positional information related to the first data set, and a block 84 that determines or establishes a first reference frame for the first data set. The first reference frame is preferably an arbitrary reference frame. A block 86 collects a second data set, and a block 88 determines a second reference frame for the second volume data set. Thereafter, control passes to a block 90 that transforms the first reference frame into the second reference frame, and a block 92 relates the additional spatial information from the first reference frame, such as functional information for the anatomical structure, to the second reference frame.
  • FIG. 2A describes a more detailed view of a method embodying the method of FIG. 2, wherein control initiates at a block 100 that collects a first data set for a structure, such as an anatomical structure of a patient. The first data set can include a first volume data set for the anatomical structure. As used herein, the term anatomical structure can refer to an entire anatomical structural unit, such as a complete bone, or to a smaller portion of the entire anatomical structural unit, such as just a small portion of the bone. Preferably, the volume data set includes information about the position and orientation of the anatomical structure, such as a bone or a joint. The first volume data set is collected using a subcutaneous imaging device, such as the tracked ultrasound probe 32 of FIG. 1, wherein the volume data set includes an image of a bone or other subcutaneous structure of a patient. Other image capturing devices and modalities for capturing the first image data set may also be used, such as CT scan, MRI, X-rays, etc. The first volume data set may be obtained pre-operatively, for example. The first volume data set preferably includes image data regarding the anatomical structure, and may include two-dimensional (2D) image data and/or three-dimensional (3D) image data. One exemplary capturing device and modality for capturing the first image data set includes a 2D or 3D ultrasound imaging device. If a 2D ultrasound probe is used, such a probe can collect volume data by collecting multiple slices of a region that includes the anatomical structure. The position of the capturing device is tracked by the surgical navigation system while capturing the volume data set. Preferably, the anatomical structure is in a fixed position while the volume data set is captured, which dispenses with a need to track the position of the anatomical structure separately during the capturing. 
The anatomical structure optionally may be tracked during the capturing, in which case the anatomical structure may move during the capturing and/or additional robustness may be incorporated into position data for the acquired volume data set encompassing the anatomical structure.
  • After block 100, control passes to a block 102 that assigns or determines a first reference frame for the first data set. The first reference frame is preferably an arbitrary reference frame established without reference to any pre-defined landmark, such as a fiducial or particular anatomical landmark, on the anatomical structure. For example, the first reference frame can be that of a camera assembly, an ultrasound probe, or the first volume data set itself, such as a center of the first volume data set. Other ways to establish the arbitrary reference frame to distinguish it from other types of reference frames can also be used. The reference frames may be established by any known or commonly used image processing algorithms. Once established, the arbitrary reference frame preferably has a unique spatial relationship to the volume of the subject anatomical structure, such as a bone, and the arbitrary reference frame remains fixed in the same position relative thereto.
  • Control passes to a block 104 that identifies an inherent feature of the first volume data set, such as a spatially unique physical aspect of the volume data set. Preferably, the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position. In a preferred method, the computer system 24 is adapted with appropriate command routines to calculate an image moment of inertia of the volume data set, which is constant with regard to the volume regardless of the point of view from which the volume is acquired or viewed. Using the moment of inertia can be advantageous because any given volume has a constant moment of inertia that has a unique fixed spatial relation to the data set representing that volume, regardless of the point of view from which the volume is viewed. Therefore, for example, the moment of inertia of a volume data set of a particular portion of a bone will be in the same relative position to that portion of the bone regardless of the position or point of view from which the volume data set is obtained. Other methods of determining a spatially unique physical aspect of the volume data set may be used and may obtain alternative or additional uniquely defined spatial information about the anatomical structure, such as surface contour information, point landmarks, etc., that could be used to define the arbitrary reference frame.
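As a sketch of this idea (not the patent's specific implementation), the image moment of inertia of a voxel volume can be computed as an inertia tensor whose eigenvalues are invariant under rigid motion of the scan. The function name and the intensity-as-mass weighting are illustrative assumptions:

```python
import numpy as np

def inertia_descriptor(volume, spacing=(1.0, 1.0, 1.0)):
    """Centroid, principal moments, and principal axes of a voxel volume.

    Voxel intensities are treated as mass (an illustrative assumption).
    The eigenvalues of the inertia tensor do not change under rigid
    motion of the scan, so they give a view-independent signature.
    """
    coords = np.argwhere(volume > 0).astype(float) * np.asarray(spacing, float)
    weights = volume[volume > 0].astype(float)
    centroid = np.average(coords, axis=0, weights=weights)
    r = coords - centroid                       # positions relative to centroid
    r2 = np.sum(r * r, axis=1)
    # Inertia tensor: I = sum_i w_i * (|r_i|^2 * Id - r_i r_i^T)
    inertia = np.eye(3) * np.sum(weights * r2) - (r.T * weights) @ r
    moments, axes = np.linalg.eigh(inertia)    # eigenvalues in ascending order
    return centroid, moments, axes
```

Because the principal moments are a set of scalars, the same structure acquired from a different point of view yields the same values, which is what makes the descriptor usable for matching.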
  • Thereafter, control passes to a block 106 that correlates the first arbitrary reference frame to the spatially unique physical aspect of the volume data set. In one example, the block 106 correlates the first arbitrary reference frame and the image moment of inertia by a known unique spatial relationship therebetween, such as the xyz Cartesian coordinates of the image moment of inertia within the first arbitrary reference frame. In one exemplary method, the first arbitrary reference frame is assigned such that the moment of inertia defines an axis of the arbitrary reference frame. Other alternative and/or equivalent methods or systems for correlating the arbitrary reference frame to the spatially unique physical aspect may be used.
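One way to realize this correlation can be sketched under the assumption that the principal inertia axes serve directly as the frame axes; the helper name is hypothetical:

```python
import numpy as np

def frame_from_inertia(centroid, axes):
    """Build a 4x4 homogeneous reference frame whose origin is the
    volume centroid and whose columns are the principal inertia axes.

    An eigenvector matrix may be left-handed; one axis is flipped if
    needed so the frame is a proper right-handed rotation.
    """
    rotation = np.array(axes, dtype=float)
    if np.linalg.det(rotation) < 0:
        rotation[:, 2] = -rotation[:, 2]   # restore right-handedness
    frame = np.eye(4)
    frame[:3, :3] = rotation
    frame[:3, 3] = np.asarray(centroid, float)
    return frame
```

Because the centroid and principal axes are derived entirely from the volume data set itself, the resulting frame is "arbitrary" in the patent's sense: it needs no pre-defined landmark.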
  • Control passes to a block 108 that develops and/or associates additional positional information with the first volume data set, wherein the additional spatial information has a unique spatial orientation relative to the first arbitrary reference frame. The additional positional information can include functional information for the anatomical structure, such as a gravity vector that acts on the anatomical structure and/or orientations of parts of the anatomical structure with respect to each other. In one embodiment, the first data set is a pre-operative volume data set and the functional information is obtained for the anatomical structure when a patient is in a generally standing or upright position. In other embodiments, the functional information can be obtained by other methods, as would be apparent to those skilled in the art.
  • The additional spatial information may be contiguous with the first volume data set. For example, the additional spatial information may include gravity vector information that defines a gravity vector through the first volume data set at the time the volume data set was obtained. The gravity vector may be obtained by any known method. One such method includes having a gravity sensing device, such as an accelerometer, installed on the camera of the surgical navigation system, wherein the gravity sensing device identifies the local gravity vector while the first volume data set is being gathered. The gravity vector information is then associated with the volume data set such that the gravity vector can be uniquely located with respect to the first arbitrary reference frame.
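A minimal sketch of associating the measured gravity vector with the arbitrary reference frame, assuming the frame is stored as a 4x4 matrix mapping frame-local coordinates to world coordinates:

```python
import numpy as np

def gravity_in_frame(frame, gravity_world):
    """Express a world-space gravity direction in an arbitrary
    reference frame (4x4 homogeneous, local -> world).

    Directions carry no translation, so only the inverse rotation
    (the transpose, for an orthonormal rotation) is applied.
    """
    g = np.asarray(gravity_world, dtype=float)
    g = g / np.linalg.norm(g)
    return frame[:3, :3].T @ g
```

Once stored in frame-local coordinates, the gravity direction stays uniquely located with respect to the frame no matter how the patient or data set is later reoriented.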
  • In another example, the additional spatial information set may be non-contiguous to the first volume data set. For example, the additional spatial information may include a vector that identifies the location and orientation of another reference frame. The other reference frame may be another arbitrary reference frame for a non-contiguous volume data set that does not overlap with the first volume data set, wherein the non-contiguous volume data set relates to another anatomical structure or another portion of the same anatomical structure. In one instance, the first volume data set may be of a first portion of a bone and the non-contiguous volume data set may be of a second portion of the same bone. The other reference frame may also include a global reference frame that is common to several volume data sets, such as a camera reference frame of a camera of the surgical navigation system. In this manner, the specific locations and/or orientations of reference frames of each of one or more non-contiguous volume data sets may be interlinked with the first volume data set such that the location and orientation of any one or more of the volume data sets may be used to identify the location and orientation of one or more of the other non-contiguous volume data sets even though the volume data sets are not specifically overlapping and not presently viewed.
  • Other types of additional spatial information that may have unique spatial characteristics in relation to the anatomical structure represented by the volume data set may also be identified with the volume data set. By way of non-limiting examples, other types of additional spatial information could include location and orientation vector(s) of the arbitrary reference frame of the volume data set with respect to other local anatomical reference frames of the patient, such as a pelvic plane, femoral mechanical axis, femoral anatomical axis, and other relevant local reference points and/or frames such as those commonly used in the art. However, the disclosure contemplates that any type of spatial information that has a unique identifiable spatial characteristic in relation to the anatomical structure represented by the volume data set may be associated with the volume data set as considered necessary or expedient for various and different specific applications.
  • Control passes to a block 110 that collects a second data set. In one example, the second volume data set may be of the same anatomical structure as for the first volume data set or at least have significant overlap therewith. There is sufficient overlap between the first data set and the second data set so that the first and second data sets can be registered in a subsequent step described hereinafter. In a preferred embodiment, there is at least about a seventy percent overlap of the anatomical structure captured in the first volume data set and the anatomical structure captured in the second volume data set. More than seventy percent overlap may be even more preferable in some instances, and less than seventy percent overlap may be sufficient in other instances. The second volume data set may be obtained from a same point of view as the first volume data set or it may be obtained from a different point of view.
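The roughly seventy percent overlap criterion could be checked, for example, on binary occupancy masks resampled to a common voxel grid; this is a simplifying assumption, and the names are illustrative:

```python
import numpy as np

def overlap_fraction(mask_a, mask_b):
    """Fraction of the first data set's occupied voxels that are also
    occupied in the second, with both masks on a common voxel grid."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    return np.count_nonzero(a & b) / np.count_nonzero(a)

def sufficient_overlap(mask_a, mask_b, threshold=0.7):
    """True when the overlap meets the preferred ~70% criterion."""
    return overlap_fraction(mask_a, mask_b) >= threshold
```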
  • In another example, the second volume data set may be of a different anatomical structure that has some known or determinable spatial relationship to the first anatomical structure. In one aspect, the first volume data set may be of a bone or portion thereof on one side of a patient's body and the second volume data set may be of a bone or portion thereof on the opposite side of the patient's body that corresponds as a substantially mirror image of the first bone. By way of non-limiting example, the first volume data set may include image data of a left femoral head and the second volume data set may include image data of a right femoral head. The left femoral head is assumed to be mathematically equivalent to a mirror image of the right femoral head in relation to a centerline of the body. In this manner, the left femoral head and the right femoral head have a known or determinable spatial relationship to each other about the centerline of the body. For example, a prominent standard anatomical feature of the femur, such as the lesser trochanter, may be identified as a landmark that is assumed to be in the same location on each of the left and right femurs but in mirror image relationship to each other about a centerline of the body. Other identifiable relationships between different anatomical structures can also be identified and used in a similar manner as described herein.
  • In some applications, the second volume data set is obtained during a different portion of a procedure than the first volume data set, such as intra-operatively. In one example, the second data set is an intra-operative volume data set of the same bone or portion thereof that is obtained while the patient is anesthetized and lying on an operating table. In other applications, the first and second volume data sets could be collected during the same portion of a surgical procedure, such as both being collected intra-operatively or both being collected pre-operatively. Additionally, one or more volume data sets may be collected post-operatively, such as to aid in post-operative diagnostics, for example.
  • The second data set can be collected using the same modality or a different modality than the first data set. In one example, both the first and second volume data sets are obtained using a 3D ultrasound imaging system having a tracking device attached thereto for being tracked by the surgical navigation system. In the present example, the second data set is also collected using the ultrasound probe 32 tracked by the surgical navigation system 20.
  • Thereafter, control passes to a block 112 that assigns a second reference frame to the second volume data set. Like the first reference frame, the second reference frame can be an arbitrary reference frame and can be assigned or determined in a similar manner as described herein. The second arbitrary reference frame preferably is defined uniquely by the second volume data set. In one example, the second arbitrary reference frame is defined by the anatomical structure in the same manner as described previously herein with respect to the first anatomical structure. The second reference frame can be the same as or different from the first reference frame.
  • Control also passes to a block 114 that identifies an inherent feature of the second volume data set, such as a spatially unique physical aspect of the volume data set. Where the first and second volume data sets are of substantially the same anatomical feature, the inherent feature of both volume data sets is preferably the same because the same anatomical feature would have the same unique spatial aspect, such as the image moment of inertia, for both the first and second volume data sets. For example, in a system that calculates an image moment of inertia of an ultrasound image, the moment of inertia of the structure is constant and substantially unique relative to a given structure regardless of the point of view from which the structure is viewed. Therefore, the moment of inertia of the same anatomical feature is uniquely identifiable in different volume data sets of the anatomical feature taken from different points of view. In such an example, the second arbitrary reference frame may also be uniquely spatially associated with the image moment of inertia.
  • A block 116 correlates the second reference frame to the inherent feature of the second volume data set in the same manner as in block 106 or any sufficient manner.
  • Control then passes to a block 118 that registers the first reference frame into the second reference frame. In one embodiment, the block 118 performs data set matching, such as by finding the unique moment of inertia of each image and correlating the first and second arbitrary reference frames to each other based on matching the moment of inertia of the first volume data set to the moment of inertia of the second volume data set and calculating an appropriate transformation matrix therefrom. Another possible method of data set matching may include conducting a different volume data match of the first and second volume data sets, whereby the first and second volume data sets are virtually overlaid and correlated to each other using any suitable or commonly known method. The registration may also be performed using other methods, such as volume-volume matching, surface-surface matching, and/or point-to-point matching. Under any registration technique, a mathematical transformation, including rotation, translation, and scale, is calculated, preferably by the computer system 24, that will register a common or assumed common spatially unique feature in the two volume data sets. The computer system 24 then transforms one or both of the arbitrary reference frames to bring the spatially unique feature, and thus the volume data sets, into registration with each other. Other equally efficacious methods of calculating and performing a suitable transformation and registration may be used as would be known to a person skilled in the art.
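When each volume's arbitrary frame has been tied to the same inherent feature, the transformation matrix reduces to a composition of the two frames. This sketch assumes each frame is a 4x4 matrix mapping feature-local coordinates to that scan's world coordinates:

```python
import numpy as np

def registration_transform(frame_first, frame_second):
    """Rigid transform carrying points expressed in the first scan's
    world coordinates into the second scan's world coordinates.

    Both frames map the shared inherent feature's local coordinates to
    their own world space, so T = F2 @ inv(F1) aligns the feature.
    """
    return frame_second @ np.linalg.inv(frame_first)
```

In the moment-of-inertia case, `frame_first` and `frame_second` would each be built from the centroid and principal axes of the respective volume data set.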
  • Control then passes to a block 120 that relates the additional spatial information from the first reference frame, such as functional information for the anatomical structure, to the second reference frame. The additional spatial information from the first volume data set is related to the second arbitrary reference frame of the second volume data set after (or contemporaneously as) the first and second volume data sets have been brought into registration. In this manner, the additional spatial information is associated with the second arbitrary reference frame in correct registration therewith even though the additional spatial information is not directly available when the second volume data set is acquired. For example, when the additional spatial information includes the gravity vector as described above, the gravity vector is associated with and brought into registration with the second volume data set in proper orientation to the anatomical structure even when the anatomical structure is in a different orientation. When the additional spatial information includes a vector that identifies the location and orientation of another reference frame as described above, the location and orientation of the non-contiguous volume data sets not part of the second volume data set may be identified based on the association and registration of the vector information comprising the additional spatial information.
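Relating a direction such as the gravity vector (as opposed to a point) across the registration then uses only the rotational part of the transform; a sketch under the same 4x4 convention:

```python
import numpy as np

def transfer_direction(T, direction_first):
    """Carry a direction (e.g. the gravity vector) recorded with the
    first volume data set into the second data set's coordinates.

    Directions are unaffected by translation, so only the 3x3
    rotational block of the registration transform T is applied.
    """
    d = np.asarray(direction_first, dtype=float)
    return T[:3, :3] @ d
```

This is what makes it possible to show the gravity vector in correct orientation on the intra-operative data set even though gravity was never measured relative to that data set directly.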
  • Optionally, control may then pass to a block 122 that displays the additional spatial information in registration with the second volume data set on a display device. Additional manipulations and uses of the additional spatial information may also be performed as desired.
  • The blocks 100-122 described above can be rearranged, reordered, or modified by combining blocks to include fewer steps or by breaking blocks down into further additional steps, as would be apparent to one of skill in the art. As shown in FIG. 2A, the steps 100-122 in some instances correlate with the steps 80-92 shown in FIG. 2, such as by being considered sub-steps thereof. Further, the logic represented by the flowcharts of FIGS. 2 and 2A can be implemented by any computer system that is adapted to implement the blocks 80-92 and/or 100-122, such as the surgical navigation system 20 of FIG. 1. In one embodiment, the computer system 24 includes appropriately programmed software and hardware to perform the blocks and processes of FIGS. 2 and 2A.
  • FIGS. 3A, 3B, and 3C illustrate an example where the additional spatial information includes functional information for an anatomical structure that is obtained from a pre-operative volume data set and related to an intra-operative volume data set so that the functional information can be used during a surgical procedure on a patient. In one such surgical procedure, for example a hip joint arthroplasty procedure, a gravity vector that acts on the various bones that make up the hip joint can be an important factor for the precise positioning of components of a prosthetic hip implant based on certain functional motion characteristics of the patient. In other embodiments, functional information may be obtained for the anatomical structure in a plurality of different positions or over a period of time. For example, functional information can be obtained for a knee joint in various positions to collect extension/flexion, varus/valgus, and other information for use during a surgical procedure.
  • In a joint arthroplasty procedure, a prosthetic component is placed accurately and effectively using the arbitrary reference frames discussed herein instead of relying on local biomechanical/anatomical references, such as a femur mechanical axis, pelvic frontal plane, or other standard local anatomical reference frames. As used herein, a “local” reference frame is a reference frame based on specific accepted or pre-defined anatomical features of a patient, such as specific skeletal landmarks. In contrast, an “arbitrary reference frame” refers to a reference frame that is identified uniquely based solely on the feature being looked at, such as the specific volume data set being viewed. Thus, the arbitrary reference frame is not dependent on locations of one or more specific pre-defined anatomical landmarks with respect to other portions of the anatomy but is correlated to and identifiable from unique spatial characteristics of only the anatomy of interest.
  • Relating the functional information, such as the gravity vector G, to the intra-operative procedure facilitates placement of correct prosthetic components in an optimal position and alignment based on the natural motion and movement patterns of the patient. The position and alignment of the prosthetic can further be optimized using a plurality of parameters that include, for example, joint specific anatomical and kinematic constraints, patient life-style specific activities, and prosthetic design specific geometrical and kinematic constraints. Still further optimization can be realized through incorporation of other relevant factors that arise or become visible intra-operatively, such as after the preparation of a joint surface to accept a prosthetic component.
  • FIG. 3C illustrates steps in one method and specific example that utilizes additional spatial information, including functional information, on a computer implemented surgical navigation system, such as the surgical navigation system 20, with reference to the anatomy shown in FIGS. 3A and 3B. A block 170 acquires a pre-operative image volume data set 150 of a hip 152 or of parts of the hip of a patient 154 while the patient is standing. Preferably, the image volume data set 150 is acquired using the ultrasound probe 32 while being tracked by the camera array 26 to gather image data of the bones of interest in the hip joint, and the image data is stored in a suitable electronic memory available to the surgical navigation system.
  • A block 172 assigns an arbitrary axis, such as an axis of the camera array, to the image volume data set 150, identifies a unique spatial parameter of the image volume data set, and correlates the arbitrary axis to the spatial parameter. In this particular example, the unique spatial parameter preferably includes the image moment of inertia of the image volume data set of the hip bones, calculated as discussed previously herein. The arbitrary axis is optionally also correlated to a local anatomical parameter of the hip 152, such as a frontal plane 156 of the hip 152 as shown in FIG. 3A.
  • A block 174 assigns additional spatial information including functional information, such as a gravity vector G, to the image volume data set 150. In this example, the gravity vector G is shown pointing downwards to the floor in relation to the hip 152 in FIG. 3A because the patient is standing upright while the image volume data set 150 is obtained. The gravity vector is acquired using an inertial system with an accelerometer, but can be acquired by any sufficient system known in the art, such as a liquid level measurement system or other systems. The gravity vector G is spatially assigned by determining an orientation of the gravity vector G with respect to the anatomical parameter and/or arbitrary axis, such as a specific tilt angle β with respect to the frontal plane 156. In further examples, other anatomical parameters can be defined, for example, a functional plane 158, an iliac crest, a pubic symphysis, and the like, and various planes and angles defined thereby, wherein the gravity vector G can be related to such anatomical parameters.
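The tilt angle β can be computed as the angle between the gravity vector and the frontal plane, i.e. the complement of the angle to the plane's normal; the function and parameter names are illustrative:

```python
import numpy as np

def tilt_angle_deg(gravity, plane_normal):
    """Angle in degrees between a vector and a plane, measured from the
    plane: 0 means the vector lies in the plane, 90 means it is
    perpendicular to the plane."""
    g = np.asarray(gravity, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    s = abs(g @ n) / (np.linalg.norm(g) * np.linalg.norm(n))
    return float(np.degrees(np.arcsin(np.clip(s, 0.0, 1.0))))
```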
  • A block 176 collects an intra-operative image volume data set 160 of the same general area while the patient 154 is lying in a generally prone position in a similar manner as with the block 170.
  • A block 178 performs a data match to register the pre-operative image volume data set 150 and the intra-operative image volume data set 160 using any suitable data matching technique, such as image inertia matching or the volume data matching techniques discussed previously. Due to such registration, the gravity vector G is simultaneously or subsequently transferred to the intra-operative volume data set 160 for use during the procedure.
  • A block 180 displays the image volume data set 160 on the display unit 22 with the gravity vector G shown in registration with the bones of the hip. A replacement prosthesis can then be aligned with the bones of the hip using the surgical navigation system 20 so as to have a relationship to the bones that has been preselected based on the position of the bones with respect to the gravity vector G.
  • The method shown and described in relation to FIGS. 3A-3C is preferably implemented on the surgical navigation system 20, and blocks 172, 174, 178, and 180 are preferably performed by appropriate computer software routines associated with and preferably controlling the computer system 24 in any available manner known to one skilled in the art.
  • FIGS. 4A-4C illustrate an example of a volume data match registration procedure, wherein three data set collection screen shots are shown. In a first step, a pre-operative volume data set 190 of a hip of a patient is collected by an ultrasound probe and shown on a display screen as shown in FIG. 4A, and a first arbitrary reference frame 192 is assigned thereto. Optionally, additional spatial information, such as a gravity vector G, is associated in a unique spatial location with respect to the volume data set 190. An intra-operative volume data set 194 of the hip is then collected, and a second arbitrary reference frame 196 is assigned thereto. In the present example, the anatomical parameter of the hip includes a right iliac crest 198, a left iliac crest 200, and a pubic symphysis 202 in the pre-operative volume data set. The same right and left iliac crests and pubic symphysis are also identified in the intra-operative volume data set 194, wherein such structures in the pre-operative and intra-operative volume data sets can establish a unique arbitrary reference frame for the data sets. Additionally or alternatively, the image moment of inertia is calculated for each of the pre-operative and intra-operative volume data sets 190, 194 such that the image moment of inertia of the pre-operative volume data set 190 is identical or sufficiently close to the image moment of inertia of the intra-operative volume data set 194. Preferably, there is about seventy percent or more overlap between the two volume data sets 190, 194. FIG. 4B shows the pre-operative volume data set 190 and the intra-operative volume data set 194 of the hip overlapping before registration, such as with the first and second arbitrary reference frames 192, 196 aligned and without being registered. 
The pre-operative data set 190 and intra-operative data set 194 are then registered by a reference frame transfer by overlaying and correlating the reference frames of the respective data sets, as illustrated, for example, in FIG. 4C, which shows the pre-operative and intra-operative volume data sets 190, 194 of the hip overlapping after registration. Once the pre-operative and intra-operative volume data sets are registered, the transformation between the data sets is used to relate the additional spatial information, such as the gravity vector, from the pre-operative volume data set 190 to the intra-operative volume data set 194. As discussed above, the gravity vector G is determined with respect to an anatomical parameter of the hip in the pre-operative volume data set and assigned to the volume data set in unique spatial orientation thereto.
  • The embodiments of FIGS. 3A-3C and 4A-4C provide an improvement over prior surgical procedures, which typically used the functional plane 158 as an approximation for the gravity vector G. The functional plane 158 can be determined using known methods, for example, by determining an anterior superior iliac spine and by integration of a longitudinal body axis. While the functional plane 158 has provided a rather good approximation of the gravity vector, there is generally a difference of about 5° to about 10° between the orientation of the functional plane 158 and the gravity vector G. Therefore, detecting and assigning the gravity vector G is a much more accurate and reliable method for considering functional aspects of the patient during normal activity and movement parameters.
  • The concepts disclosed herein can also be utilized during orthopedic, reconstructive, or trauma surgery procedures, during which procedures it is important to re-establish function and anatomy of an affected area. In some cases, an unaffected area is used to mirror an affected area to provide symmetrical biomechanical parameters, such as spatial relationships and angles, to repair the affected area. By way of illustration only with reference to FIG. 5, in the case of a patient with a broken left femur, image data and functional information can be obtained for a healthy right femur of the patient. The functional information for the right femur can be related to the broken left femur using the procedure of FIG. 2 to provide symmetrical biomechanical parameters to repair the broken left femur. In this example, data sets for both the right and left femurs normally are obtained intra-operatively due to the circumstances of typical trauma surgery, as would be apparent to one of ordinary skill in the art.
  • Referring to the example of FIG. 5, a diagrammatic image data set 210 of a left femur 212 and a right femur 214 shows that the left femur 212 has suffered some trauma, such as a severe fracture 216 across the femur neck and separating the left femur head 218, while the right femur 214 is unaffected. The image data set 210 can be obtained all at once by any suitable modality, such as with the ultrasound probe 32 and the surgical navigation system 20 as described earlier, and stored on the computer system 24. The computer system 24 then develops information for the left and right femurs 212, 214 from the image data set 210. The information includes a first volume data set 220 including the left femur head 218 and a second volume data set 222 including a part of the left femur body 224. The volume data sets 220 and 222 are preferably not contiguous to each other and/or are mathematically isolated from each other on opposite sides of the fracture 216. Further, the information also includes a third volume data set 226 including the unaffected right femur 214. The third volume data set 226 includes a volume data set 228 of the head of the right femur 214 and a noncontiguous volume data set 230 of a part of the right femur body. The right femur 214 is preferably held immobile while the entirety of the volume data set 226, including the volume data sets 228 and 230, is obtained. Alternatively, a tracking device (not shown) may be attached to the femur 214 while the volume data set 226 is obtained in order to provide additional robustness to the position information of the volume data sets 228 and 230 relative to each other and/or correct for movement of the right femur while the volume data set 226 is obtained.
  • Each volume data set 220, 222, 228, and 230 is assigned an arbitrary reference frame, 232, 234, 236, and 238, respectively. Each reference frame 232, 234, 236, and 238 preferably has a location and orientation that is correlated to a uniquely identifiable aspect of the volume data set, such as the image moment of inertia described herein.
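By way of a non-limiting numerical illustration (not part of the original disclosure), the assignment of a uniquely identifiable reference frame from an image moment of inertia can be sketched as follows. This sketch assumes a volume data set is represented as intensity-weighted sample points; the helper name `inertia_frame` and the sign conventions are assumptions of the sketch, not the patent's implementation:

```python
import numpy as np

def inertia_frame(points, weights=None):
    """Derive a reproducible reference frame (origin + axes) from the image
    moments of a volume represented as weighted sample points.  The origin is
    the first moment (centroid); the axes are the principal directions of the
    second central moment tensor, with a fixed sign convention so the same
    volume always yields the same frame regardless of the scan coordinates."""
    points = np.asarray(points, dtype=float)
    if weights is None:
        weights = np.ones(len(points))
    w = weights / weights.sum()
    origin = w @ points                          # first moment: centroid
    centered = points - origin
    # Second central moment (inertia-like) tensor of the intensity distribution.
    cov = (centered * w[:, None]).T @ centered
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]              # principal axes, largest first
    axes = evecs[:, order].T
    # Sign convention: make the largest-magnitude component of each axis positive.
    for i in range(3):
        if axes[i, np.argmax(np.abs(axes[i]))] < 0:
            axes[i] = -axes[i]
    if np.linalg.det(axes) < 0:                  # enforce a right-handed frame
        axes[2] = -axes[2]
    return origin, axes
```

The sign convention matters: without it, eigenvectors are only defined up to sign, and the "arbitrary" frame would not be uniquely recoverable from a second scan of the same volume.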
  • Additional spatial information comprising a vector 240 that uniquely defines the spatial relation, including position and orientation, of the volume data sets 222 and 230 to each other is established. Thus, the reference frames 236 and 238 are spatially correlated with each other in a global reference frame 242, such as of the camera array 26, by the vector 240 even though the two volume data sets are not contiguous with each other. Of course, other global reference frames may also be used. Calculation of the additional vector 240 may not be necessary or may be used to provide additional mathematical robustness by providing redundant measurements in an example where the volume data set 228 is contiguous with the volume data set 230.
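The vector 240 tying two noncontiguous volume data sets together can be modeled as a rigid transform between their reference frames, computed via the global (camera) frame. The following sketch is illustrative only; the function name `relative_transform` and the frame representation (rotation matrix plus origin in the global frame) are assumptions of this sketch:

```python
import numpy as np

def relative_transform(R_a, t_a, R_b, t_b):
    """Given two reference frames expressed in the same global (camera) frame,
    each as a 3x3 rotation R and an origin t, return the rigid transform
    (R_ab, t_ab) that maps coordinates in frame B into frame A.  This plays
    the role of the 'vector' relating two noncontiguous volume data sets."""
    R_ab = R_a.T @ R_b
    t_ab = R_a.T @ (t_b - t_a)
    return R_ab, t_ab
```

A useful property, matching the text's point that no local anatomical frame is needed: the relation is invariant under any common motion of both frames in the global frame, so it depends only on the anatomy, not on where the patient or camera happens to be.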
  • Next, the first volume data set 220 and the second volume data set 222 of the affected left femur 212 are matched and registered with the corresponding portions 228 and 230, respectively, of the third volume data set 226 of the unaffected right femur 214 in order to determine a reconstructed position of the left femur head 218 and left femur body 224 that will match corresponding portions of the unaffected right femur 214. In one exemplary method, the registration process includes performing a reference frame transfer from the reference frame of the unaffected right femur 214 to the reference frames of both volumes 220 and 222 of the affected left femur 212. To do this, it is assumed that the shape and position of the left femur 212 should be identical to, and a mirror image of, the shape and position of the right femur 214 about a centerline therebetween. It is also assumed that the shapes of portions of the right femur 214 captured in the volume data sets 228 and 230 correspond substantially to the shapes of corresponding portions of the left femur 212 captured in the respective volume data sets 220 and 222. With these assumptions, the reference frames 236 and 238, the corresponding volume data sets 228 and 230, and the vector 240 of the right femur 214 are mathematically mirrored about a centerline 244 to be in position to match the left femur 212. One of the volume data sets 220 or 222 of the left femur 212 is then matched with the corresponding mirrored volume data set 228 or 230 of the right femur 214. For example, the volume data sets 222 and 230 may both include an easily identifiable three-dimensional feature, such as the lesser trochanter 246, which can be used to register the volume data set 222 with the mirrored volume data set 230. In another example, an image moment of inertia is calculated for both volume data sets 222 and 230, and the image moments of inertia are then matched after mirroring the right femur information.
Other methods of registering mirrored corresponding volume data sets may also be used.
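The mirroring and registration steps above can be sketched numerically. In this illustrative sketch (not from the patent), the mid-sagittal centerline is modeled as a plane, and a Kabsch least-squares alignment with known point correspondences stands in for the moment-of-inertia or landmark matching described in the text; `mirror_points` and `kabsch` are hypothetical helper names:

```python
import numpy as np

def mirror_points(points, plane_point, plane_normal):
    """Reflect sample points about the mid-sagittal plane (the 'centerline')."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = (points - plane_point) @ n         # signed distance of each point
    return points - 2.0 * np.outer(d, n)   # reflect across the plane

def kabsch(src, dst):
    """Least-squares rigid transform (R, t) aligning src points to dst points,
    assuming known one-to-one correspondences (Kabsch algorithm)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, dc - R @ sc
```

In use, the unaffected side would be mirrored with `mirror_points` and then rigidly aligned to the corresponding affected-side volume with `kabsch`, yielding the transform that carries the mirrored vector 240′ into the affected femur's coordinates.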
  • After the volume data set 222 is registered with the mirrored volume data set 230, information about the orientation of the other parts of the unaffected right femur 214 can be related to the affected left femur 212 and used by a surgeon to properly position the broken parts of the affected left femur 212. For example, the mirrored vector 240′ and volume data set 228 define the theoretically correct position of the left femur head 218 in relation to the left femur body 224. Preferably, tracking devices 246 and 248 are attached to the left femur body 224 and the left femur head 218, respectively, and independently tracked by the surgical navigation system 20 during the entire procedure. A surgeon is then able to adjust the pieces 218 and 224 of the left femur to align with the theoretically derived locations based on the registration to the mirrored volume data sets 228, 230 of the right femur 214.
  • Because only the relative positions of portions of an unaffected bone with respect to the corresponding portions of an affected bone are of interest, there is no need to calculate the absolute positions of the volume data sets 220, 222, 226, 228, and 230 relative to the rest of the body of the patient by using local reference frames. Rather, the volume data sets and attendant reference frames can be chosen arbitrarily. Therefore, time and computing resources are saved because there is no need to establish a local reference frame. Of course, the method is not limited to work done on a femur as described here, but can be used with minor modifications for any anatomical structure that is substantially mirrored on opposite sides of a centerline of a body, such as arms, ribs, feet, hands, hips, etc.
  • The present disclosure also contemplates the relation of spatial information, such as an anatomical reference frame, from one data set of an anatomical structure at a first time to another data set of the same anatomical structure at another time. An exemplary situation shown in FIGS. 6A and 6B is the ability to define an anatomical reference frame pre-operatively that is not accessible during a surgical procedure due to the positioning and draping of the patient on a surgical table. In this example, a pre-operative volume data set of a forearm 250, including image data of the underlying bone(s), is collected using the tracked ultrasound probe 32 and surgical navigation system 20 at a stage when the forearm is accessible. For example, a first volume data set 252 at one location of the forearm 250 and a second volume data set 254 at another location of the forearm 250 are obtained when both locations are accessible, as shown in FIG. 6A. The pre-operative data set preferably includes regions of the forearm 250 that will remain accessible after the area of interest has been prepared and draped or will otherwise be inaccessible. An anatomical reference frame or other spatial information, such as a global reference frame 256 of the camera assembly 26, is defined at a stage when the anatomy is accessible, and arbitrary reference frames 258 and 260 are defined for the volume data sets 252 and 254, respectively. Each reference frame 258, 260 is uniquely identifiable from the associated volume data set, such as by having a known relation to the image moment of inertia of the respective volume data set 252, 254 as discussed previously.
Once defined, the anatomical reference frame or other global reference frame 256 is geometrically associated with the arbitrary reference frames 258, 260 for the pre-operative data set, and a vector 262 is determined that associates the arbitrary reference frames 258 and 260 and the respective volume data sets 252 and 254 in a unique spatial position relative to each other. During a subsequent surgery, the portion of the forearm 250 corresponding to the volume data set 254 may not be accessible to the ultrasound probe 32, such as due to draping as shown in FIG. 6B, or for any other reason. In such a case, a subsequent volume data set 252′ having substantial overlap with the volume data set 252 is collected using the ultrasound probe 32 and surgical navigation system 20, and the arbitrary reference frame 258 is re-established to relate the anatomical or global reference frame 256 to the subsequent volume data set 252′. The arbitrary reference frame 258 can be re-established in any suitable manner, such as by matching the image moments of inertia of the volume data sets 252 and 252′ in a substantially similar manner as described previously herein, by other volume matching or surface matching methods, etc. The computer system 24 re-establishes the location of the second volume data set 254 based on the vector 262 using appropriate program routines even though that area of the forearm 250 is not accessible. In this manner, the surgical navigation system 20 is able to re-establish the locations of portions of the bones of the forearm 250 based on being able to view just one volume portion of the bone that was previously identified, without having to either view the other volume portions that were previously identified or define local anatomical landmarks as discussed above.
The example provided herein may be applied similarly to any anatomical feature that maintains a relatively stable structural geometry over time, such as any bone, and may be extended to apply to any number of spatially inter-connectable volume data sets.
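The re-establishment step above reduces to a simple composition of rigid transforms. In this illustrative sketch (not from the patent), the stored pre-operative vector 262 is modeled as a rigid transform from the accessible volume's frame A to the hidden volume's frame B, and `recover_hidden_frame` is a hypothetical helper name:

```python
import numpy as np

def recover_hidden_frame(R_a2, t_a2, R_ab, t_ab):
    """Given the re-established pose (R_a2, t_a2) of the accessible volume's
    arbitrary frame in the intra-operative global frame, and the stored
    pre-operative relation A->B (R_ab, t_ab), recover the pose of the
    inaccessible (e.g., draped) volume's frame by composing the transforms."""
    R_b2 = R_a2 @ R_ab
    t_b2 = R_a2 @ t_ab + t_a2
    return R_b2, t_b2
```

This relies only on the assumption stated in the text: that the anatomy between the two volumes (here, the forearm bone) keeps a stable geometry, so the stored A-to-B relation is still valid at surgery time.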
  • In another application, functional information about an anatomical structure is developed by collecting a plurality of volume data sets of the same anatomical structure in a plurality of different positions at corresponding different times without the need to identify a local anatomical reference frame. One example of this application is shown in FIGS. 7A and 7B, wherein functional motion parameters of a hip of a patient 270 are determined. In this example, a first volume data set 272 is gathered of a portion 274 of the patient's femur, and a first volume data set 276 is gathered of a portion 278 of the pelvis, both with the patient's leg in an extended position. A second volume data set 272′ is gathered of substantially the same portion 274 of the patient's femur, and a second volume data set 276′ is gathered of substantially the same portion 278 of the patient's pelvis, both with the patient's leg in a flexed position. Each volume data set 272, 276, 272′, and 276′ is assigned an arbitrary reference frame 280, 282, 284, and 286, respectively, that is correlated to a known position in relation to a uniquely identifiable feature of the respective volume data set. Preferably, each arbitrary reference frame 280, 282, 284, and 286 is correlated to an image moment of inertia of the respective volume data set 272, 276, 272′, and 276′, although other identifiable unique attributes of a particular volume data set could be used, as discussed herein. Additional positional information, including a vector 288 between the volume data sets 272 and 276 and a vector 288′ between the volume data sets 272′ and 276′, is calculated based on the relation of the volume data sets to a global reference frame 290, such as a reference frame of the camera array 26. In some applications, a gravity vector G may be correlated to one or more of the arbitrary reference frames 280, 282, 284, and 286, as described earlier.
The reference frames of different volume data sets of the same volume, such as 272 and 272′, are registered with each other based on the uniquely identifiable feature of the volume in any suitable manner such as already discussed herein. The process of obtaining volume data sets of the same portion of the femur and the same portion of the pelvis can be repeated in several additional different positions, such as to define a movement cone of the hip under regular use conditions. The volume data sets 272, 276, 272′, and 276′ preferably are obtained using one or more ultrasound probes 32 that are tracked by the camera array 26 of the surgical navigation system 20 in a manner as described previously. In one method and system, each of the volume data sets 272, 276, 272′, and 276′ is obtained using only a single tracked ultrasound probe 32. In such a system, preferably there is no movement of the femur and pelvis while the volume data sets are obtained at each position. In an alternative method and system, multiple tracked ultrasound probes 32 are used simultaneously to continuously obtain simultaneous volume data sets of each anatomical structure as the patient's leg, for example, is moved in different positions. Functional motion parameters of the hip joint that are spatially related to the various volume data sets, such as a range of motion cone and the gravity vector G, may then be calculated based on the various volume data sets without the necessity of defining and/or determining local anatomical reference frames based on predefined anatomical features. Preferably, one or more computer systems 24 associated with the surgical navigation system 20 perform the required calculations and store all associated data in memory associated therewith in a manner known in the art. The same or similar functional motion analyses may be performed on other portions of the body as well in a similar manner.
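One functional motion parameter derivable from such pose-to-pose data is the angle swept by the femur relative to the pelvis between two leg positions. The following sketch is illustrative only and not from the patent; it assumes each arbitrary frame's orientation is available as a rotation matrix in the camera frame, and `joint_motion_angle` is a hypothetical helper name:

```python
import numpy as np

def joint_motion_angle(R_fem1, R_pel1, R_fem2, R_pel2):
    """Angle (degrees) of femur motion relative to the pelvis between two
    poses, computed from the orientations of the volumes' arbitrary reference
    frames in the camera's global frame.  Because only relative orientations
    enter, the result is independent of how the patient lies in the camera
    frame, i.e., no local anatomical reference frame is required."""
    rel1 = R_pel1.T @ R_fem1            # femur expressed in pelvis frame, pose 1
    rel2 = R_pel2.T @ R_fem2            # femur expressed in pelvis frame, pose 2
    dR = rel1.T @ rel2                  # change of relative orientation
    cos_theta = (np.trace(dR) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```

Repeating this over many leg positions would trace out the range-of-motion cone mentioned in the text, one relative orientation per position.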
  • Other embodiments comprising various combinations of the individual features of each of the foregoing described embodiments are specifically included herein.
  • INDUSTRIAL APPLICABILITY
  • The methods and systems described herein can facilitate the relation of information from one data set to another data set, wherein the information would not otherwise be available or easily obtainable in the other data set. In many aspects, the methods and systems disclosed herein advantageously utilize arbitrarily defined unique reference frames in different data sets to easily register the data sets and relate the information from one to another without requiring identification and use of specific landmarks that can be compared and/or matched across two or more data sets. Specific procedures that may benefit from the teachings disclosed herein include surgical procedures, such as joint arthroplasty, to perform functional assessments during surgery, and trauma surgery, to mirror information from an unaffected anatomical structure to an affected anatomical structure.
  • Numerous modifications to the present invention will be apparent to those skilled in the art in view of the foregoing description. Accordingly, this description is to be construed as illustrative only and is presented for the purpose of enabling those skilled in the art to make and use the invention and to teach the best mode of carrying out same. The exclusive rights to all modifications that come within the scope of the appended claims are reserved. All patents and patent applications referred to herein are incorporated herein in the entireties thereof.

Claims (38)

1. A computer-implemented method of registering information associated with a first data set to a second data set, the method comprising the steps:
collecting a first data set of an anatomical structure with an imaging device;
developing additional information for the first data set, wherein the additional information has a unique identifiable spatial relationship to the structure of the first data set;
establishing a first arbitrary reference frame for the first data set, wherein the first reference frame is established without reference to any pre-selected landmark on the structure and the first reference frame has a unique spatial relationship to the first data set;
collecting a second data set of the anatomical structure with an imaging device;
establishing a second arbitrary reference frame for the second data set;
transforming the first reference frame to the second reference frame by matching a unique spatial parameter of the first data set with the same unique spatial parameter of the second data set; and
registering the additional information with the second data set.
2. The method of claim 1, wherein the step of establishing a first reference frame includes the steps:
calculating an inherent feature in the first data set, wherein the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position; and
correlating the inherent feature to the first arbitrary reference frame.
3. The method of claim 2, wherein the step of establishing a second arbitrary reference frame includes the steps:
identifying the inherent feature in the second data set; and
correlating the inherent feature to the second arbitrary reference frame.
4. The method of claim 3, wherein the step of transforming includes the step:
registering the first data set with the second data set based on the inherent feature, wherein the registering step is performed by a computer surgical navigation system.
5. The method of claim 4, further comprising the step of displaying the additional spatial information in registration with the second data set on a display device; and
wherein the collecting steps include collecting the data sets with a computer surgical navigation system.
6. A computer-implemented method of associating spatial information related to a first volume data set of an anatomical structure with a second volume data set of the anatomical structure, comprising the steps of:
obtaining a first volume data set of the anatomical structure with a computer surgical navigation system;
assigning a first arbitrary reference frame to the first volume data set;
calculating an inherent feature in the first volume data set, wherein the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position;
correlating the inherent feature to the first arbitrary reference frame;
associating additional spatial information with the first volume data set, wherein the additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame;
obtaining a second volume data set of the anatomical structure with a computer surgical navigation system;
assigning a second arbitrary reference frame to the second volume data set;
identifying the inherent feature in the second volume data set;
correlating the inherent feature to the second arbitrary reference frame;
registering the first volume data set with the second volume data set based on the inherent feature, wherein the registering step is performed by a computer;
correlating the additional spatial information to the second volume data set in registration therewith; and
displaying the additional spatial information in registration with the second volume data set on a display device.
7. The method of claim 6, wherein the steps of calculating the inherent feature include the step of calculating the image moment of inertia of each of the first volume data set and the second volume data set.
8. The method of claim 7, wherein the first volume data set and the second volume data set are obtained with the same modality.
9. The method of claim 8, wherein the modality comprises an ultrasound device that is tracked by the computer surgical navigation system.
10. The method of claim 6, wherein the step of associating additional spatial information includes the steps of identifying a gravity vector and correlating the gravity vector with the first arbitrary reference frame.
11. The method of claim 10, further comprising the step of displaying the gravity vector in registration with the second volume data set on the display.
12. The method of claim 6, wherein the step of associating additional spatial information includes the steps of:
obtaining a third volume data set of an other anatomical structure;
assigning a third arbitrary reference frame to the third volume data set;
calculating an inherent feature in the third volume data set, wherein the inherent feature has a unique position and orientation in relation to the other anatomical structure that can be identified from any reference position;
correlating the inherent feature of the third volume data set to the third arbitrary reference frame; and
calculating a unique position and orientation of the third arbitrary reference frame with respect to the first arbitrary reference frame.
13. The method of claim 12, wherein the third volume data set is not contiguous with the first volume data set.
14. The method of claim 13, wherein the other anatomical structure has a constant position relative to the first said anatomical structure.
15. The method of claim 14, wherein the first said anatomical structure and the other anatomical structure are part of a single bone.
16. The method of claim 14, wherein the first said anatomical structure is part of a first bone and the other anatomical structure is part of a second bone.
17. The method of claim 16, wherein the first bone is a femur and the second bone is a pelvis.
18. The method of claim 16, further comprising the step of calculating functional information about a joint between the first bone and the second bone based on the additional spatial information.
19. The method of claim 12, wherein the first volume data set is obtained at a first time and the second volume data set is obtained at a second time.
20. The method of claim 19, wherein the first volume data set and the third volume data set are obtained while the first said anatomical structure and the other anatomical structure are in a first fixed global position.
21. The method of claim 20, wherein the second volume data set is obtained while the anatomical structures are in a second global position.
22. The method of claim 21, wherein the first and third volume data sets are obtained pre-operatively and the second volume data set is obtained intra-operatively.
23. A system for collecting and manipulating a volume data set of an anatomical structure, comprising:
means for obtaining a first volume data set of an anatomical structure of a patient and a second volume data set of the anatomical structure;
means for calculating an inherent feature of the first volume data set and the second volume data set, wherein the inherent feature has a unique position and orientation in relation to the anatomical structure that can be identified from any reference position;
means for assigning a first arbitrary reference frame to the first volume data set and a second arbitrary reference frame to the second volume data set;
means for correlating the inherent feature to the first arbitrary reference frame;
means for associating additional spatial information with the first volume data set, wherein the additional spatial information has a unique spatial relationship correlated with the first arbitrary reference frame;
means for registering the first volume data set with the second volume data set based on the inherent feature; and
means for correlating the additional spatial information to the second volume data set in registration therewith.
24. The system of claim 23, further comprising means for determining the orientation of the anatomical structure or parts thereof with respect to the gravity vector in the first volume data set.
25. The system of claim 23, wherein the means for obtaining comprises a computer surgical navigation system and an ultrasound probe.
26. The system of claim 23, wherein the means for calculating comprises a computer implemented routine for calculating an image moment of inertia of the first and second volume data sets.
27. The system of claim 23, wherein the means for associating comprises means for identifying functional information in relation to the first volume data set.
28. The system of claim 27, wherein the means for identifying functional information comprises means for identifying a gravity vector.
29. The system of claim 27, wherein the means for identifying comprises means for identifying movement parameters of a joint.
30. The system of claim 23, wherein the means for associating comprises means for calculating a unique position and orientation of a third arbitrary reference frame with respect to the first arbitrary reference frame.
31. The system of claim 23, wherein the means for assigning assigns the first and second reference frames without reference to any pre-defined landmark on the anatomical structure.
32. A method of establishing a position of a portion of a bone that has been altered from a normal shape, comprising the steps of:
collecting a first volume data set for a first bone that is unaltered, wherein the first volume data set includes volume data for first and second portions of the first bone;
identifying a first unique spatial characteristic of the volume data for the first portion of the first bone;
establishing a first arbitrary reference frame for the first volume data set correlated with the first unique spatial characteristic;
identifying a unique spatial relation between the first arbitrary reference frame and the second portion of the first bone;
identifying a second bone that normally mirrors the first bone about a centerline, wherein the second bone includes a first portion and a second portion that correspond as substantially mirror structures to the first and second portions of the first bone, respectively, and wherein the second bone has been altered from a normal shape such that the first portion of the second bone is in an altered position with regard to the second portion of the second bone;
collecting a second volume data set of the first portion of the second bone;
identifying a second unique spatial characteristic of the second volume data set, wherein the second unique spatial characteristic substantially mirrors the first unique spatial characteristic;
registering in mirrored correlation the first volume data set with the second volume data set by correlating the first unique spatial characteristic with the second unique spatial characteristic; and
re-establishing the normal position of the second portion of the second bone to coincide with the position of the second portion of the first bone as related to the registered position of the first portion of the first bone.
33. The method of claim 32, wherein the unique spatial relation comprises a positional vector between the volume data for the first portion of the first bone and the volume data for the second portion of the first bone.
34. The method of claim 32, wherein the first portion of the second bone is separated from the second portion of the second bone by a trauma.
35. The method of claim 32, wherein the first and second volume data sets are collected using an ultrasound probe tracked by a computer surgical navigation system.
36. The method of claim 35, further comprising tracking positions of each of the first and second portions of the second bone with a tracking device that is tracked by the computer surgical navigation system.
37. The method of claim 32, wherein the first arbitrary reference frame is established without reference to any pre-defined landmark on the anatomical structure.
38. The method of claim 32, wherein the steps of identifying the first and second unique spatial characteristics comprise the step of calculating an image moment of inertia of the volume data sets.
US12/835,384 2010-07-13 2010-07-13 Registration of anatomical data sets Active 2032-09-23 US8675939B2 (en)

AU2010300677B2 (en) 2009-09-29 2014-09-04 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
WO2011150358A1 (en) 2010-05-28 2011-12-01 C.R. Bard, Inc. Insertion guidance system for needles and medical components
WO2011150376A1 (en) 2010-05-28 2011-12-01 C.R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9241657B2 (en) * 2010-06-30 2016-01-26 Brainlab Ag Medical image registration using a rigid inner body surface
KR101856267B1 (en) 2010-08-20 2018-05-09 씨. 알. 바드, 인크. Reconfirmation of ecg-assisted catheter tip placement
WO2012058461A1 (en) 2010-10-29 2012-05-03 C.R.Bard, Inc. Bioimpedance-assisted placement of a medical device
CN103402450A (en) * 2010-12-17 2013-11-20 阿韦尼尔医药公司 Method and system for aligning a prosthesis during surgery
WO2012131660A1 (en) 2011-04-01 2012-10-04 Ecole Polytechnique Federale De Lausanne (Epfl) Robotic system for spinal and other surgeries
WO2013006817A1 (en) 2011-07-06 2013-01-10 C.R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9483678B2 (en) 2011-09-16 2016-11-01 Gearbox, Llc Listing instances of a body-insertable device being proximate to target regions of interest
GB201117811D0 (en) * 2011-10-14 2011-11-30 Siemens Medical Solutions Registration of cardiac CTA to PET/SPECT
EP2771712B1 (en) 2011-10-28 2023-03-22 Decision Sciences International Corporation Spread spectrum coded waveforms in ultrasound imaging
KR101423923B1 (en) * 2012-02-06 2014-08-13 삼성메디슨 주식회사 Apparatus and method for obtainning a symmetry information of objects
JP5908790B2 (en) * 2012-05-28 2016-04-26 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic image display device and control program thereof
US10136954B2 (en) 2012-06-21 2018-11-27 Globus Medical, Inc. Surgical tool systems and method
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US11116576B2 (en) 2012-06-21 2021-09-14 Globus Medical Inc. Dynamic reference arrays and methods of use
US11395706B2 (en) 2012-06-21 2022-07-26 Globus Medical Inc. Surgical robot platform
US10758315B2 (en) 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US10231791B2 (en) 2012-06-21 2019-03-19 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
JP2015528713A (en) 2012-06-21 2015-10-01 グローバス メディカル インコーポレイティッド Surgical robot platform
US10350013B2 (en) 2012-06-21 2019-07-16 Globus Medical, Inc. Surgical tool systems and methods
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11607149B2 (en) 2012-06-21 2023-03-21 Globus Medical Inc. Surgical tool systems and method
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
JP6081301B2 (en) * 2012-06-27 2017-02-15 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and image data correction method
WO2014008613A1 (en) * 2012-07-12 2014-01-16 Ao Technology Ag Method for generating a graphical 3d computer model of at least one anatomical structure in a selectable pre-, intra-, or postoperative status
US9402637B2 (en) 2012-10-11 2016-08-02 Howmedica Osteonics Corporation Customized arthroplasty cutting guides and surgical methods using the same
KR101993384B1 (en) * 2012-10-24 2019-06-26 삼성전자주식회사 Method, Apparatus and system for correcting medical image by patient's pose variation
US9247998B2 (en) 2013-03-15 2016-02-02 Intellijoint Surgical Inc. System and method for intra-operative leg position measurement
US9844359B2 (en) 2013-09-13 2017-12-19 Decision Sciences Medical Company, LLC Coherent spread-spectrum coded waveforms in synthetic aperture image formation
US9283048B2 (en) 2013-10-04 2016-03-15 KB Medical SA Apparatus and systems for precise guidance of surgical tools
KR101547098B1 (en) * 2014-01-08 2015-09-04 삼성전자 주식회사 Apparatus and method for generating image
WO2015107099A1 (en) 2014-01-15 2015-07-23 KB Medical SA Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
EP3073910B1 (en) 2014-02-06 2020-07-15 C.R. Bard, Inc. Systems for guidance and placement of an intravascular device
US10039605B2 (en) 2014-02-11 2018-08-07 Globus Medical, Inc. Sterile handle for controlling a robotic surgical system from a sterile field
WO2015162256A1 (en) 2014-04-24 2015-10-29 KB Medical SA Surgical instrument holder for use with a robotic surgical system
US10357257B2 (en) 2014-07-14 2019-07-23 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
EP3212110B1 (en) * 2014-10-29 2024-02-28 Intellijoint Surgical Inc. Systems for anatomical registration and surgical localization
US10504252B2 (en) * 2014-12-15 2019-12-10 Canon Medical Systems Corporation Method of, and apparatus for, registration and segmentation of medical imaging data
US10973584B2 (en) 2015-01-19 2021-04-13 Bard Access Systems, Inc. Device and method for vascular access
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
WO2016131903A1 (en) 2015-02-18 2016-08-25 KB Medical SA Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
JP6835744B2 (en) 2015-02-25 2021-02-24 ディスィジョン サイエンシズ メディカル カンパニー,エルエルシー Kaplant device
WO2016210325A1 (en) 2015-06-26 2016-12-29 C.R. Bard, Inc. Connector interface for ecg-based catheter positioning system
US10058394B2 (en) 2015-07-31 2018-08-28 Globus Medical, Inc. Robot arm and methods of use
US10646298B2 (en) 2015-07-31 2020-05-12 Globus Medical, Inc. Robot arm and methods of use
US10080615B2 (en) 2015-08-12 2018-09-25 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
JP6894431B2 (en) 2015-08-31 2021-06-30 ケービー メディカル エスアー Robotic surgical system and method
US10034716B2 (en) 2015-09-14 2018-07-31 Globus Medical, Inc. Surgical robotic systems and methods thereof
CN108366775B (en) 2015-10-08 2022-06-14 决策科学医疗有限责任公司 Acoustic surgical tracking system and method
US9771092B2 (en) 2015-10-13 2017-09-26 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US11000207B2 (en) 2016-01-29 2021-05-11 C. R. Bard, Inc. Multiple coil system for tracking a medical device
US10117632B2 (en) 2016-02-03 2018-11-06 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US10866119B2 (en) 2016-03-14 2020-12-15 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
JP6951116B2 (en) * 2016-05-11 2021-10-20 キヤノンメディカルシステムズ株式会社 Medical image processing equipment, medical diagnostic imaging equipment, and image processing methods
WO2018045086A1 (en) * 2016-08-30 2018-03-08 Mako Surgical Corp. Systems and methods for intra-operative pelvic registration
JP7233841B2 (en) 2017-01-18 2023-03-07 ケービー メディカル エスアー Robotic Navigation for Robotic Surgical Systems
US11071594B2 (en) 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
WO2018189725A1 (en) 2017-04-14 2018-10-18 Stryker Corporation Surgical systems and methods for facilitating ad-hoc intraoperative planning of surgical procedures
AU2018279732B2 (en) 2017-06-09 2024-03-07 Mako Surgical Corp. Systems and tools for positioning workpieces with surgical robots
US10675094B2 (en) 2017-07-21 2020-06-09 Globus Medical Inc. Robot surgical platform
EP3445048A1 (en) 2017-08-15 2019-02-20 Holo Surgical Inc. A graphical user interface for a surgical navigation system for providing an augmented reality image during operation
US11357548B2 (en) 2017-11-09 2022-06-14 Globus Medical, Inc. Robotic rod benders and related mechanical and motor housings
US11794338B2 (en) 2017-11-09 2023-10-24 Globus Medical Inc. Robotic rod benders and related mechanical and motor housings
EP3492032B1 (en) 2017-11-09 2023-01-04 Globus Medical, Inc. Surgical robotic systems for bending surgical rods
US11134862B2 (en) 2017-11-10 2021-10-05 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US11272985B2 (en) 2017-11-14 2022-03-15 Stryker Corporation Patient-specific preoperative planning simulation techniques
US11234775B2 (en) 2018-01-26 2022-02-01 Mako Surgical Corp. End effectors, systems, and methods for impacting prosthetics guided by surgical robots
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US10573023B2 (en) 2018-04-09 2020-02-25 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
WO2020081373A1 (en) 2018-10-16 2020-04-23 Bard Access Systems, Inc. Safety-equipped connection systems and methods thereof for establishing electrical connections
US11337742B2 (en) 2018-11-05 2022-05-24 Globus Medical Inc Compliant orthopedic driver
US11278360B2 (en) 2018-11-16 2022-03-22 Globus Medical, Inc. End-effectors for surgical robotic systems having sealed optical components
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
EP3692939B1 (en) 2019-02-07 2021-07-14 Stryker European Operations Limited Surgical systems for facilitating tissue treatment
US11317978B2 (en) 2019-03-22 2022-05-03 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11571265B2 (en) 2019-03-22 2023-02-07 Globus Medical Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11806084B2 (en) 2019-03-22 2023-11-07 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11419616B2 (en) 2019-03-22 2022-08-23 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
WO2020219705A1 (en) 2019-04-23 2020-10-29 Allan Wegner Semi-rigid acoustic coupling articles for ultrasound diagnostic and treatment applications
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical, Inc. Robot-mounted retractor system
US11628023B2 (en) 2019-07-10 2023-04-18 Globus Medical, Inc. Robotic navigational system for interbody implants
US11571171B2 (en) 2019-09-24 2023-02-07 Globus Medical, Inc. Compound curve cable chain
US11864857B2 (en) 2019-09-27 2024-01-09 Globus Medical, Inc. Surgical robot with passive end effector
US11890066B2 (en) 2019-09-30 2024-02-06 Globus Medical, Inc Surgical robot with passive end effector
US11426178B2 (en) 2019-09-27 2022-08-30 Globus Medical Inc. Systems and methods for navigating a pin guide driver
AU2020357877A1 (en) 2019-10-01 2022-05-19 Mako Surgical Corp. Surgical systems for guiding robotic manipulators
EP4037595A1 (en) 2019-10-06 2022-08-10 Universität Bern System and method for computation of coordinate system transformations
US11510684B2 (en) 2019-10-14 2022-11-29 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11871998B2 (en) 2019-12-06 2024-01-16 Stryker European Operations Limited Gravity based patient image orientation detection
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11793574B2 (en) 2020-03-16 2023-10-24 Stryker Australia Pty Ltd Automated cut planning for removal of diseased regions
US11253216B2 (en) 2020-04-28 2022-02-22 Globus Medical Inc. Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11317973B2 (en) 2020-06-09 2022-05-03 Globus Medical, Inc. Camera tracking bar for computer assisted navigation during surgery
US11382713B2 (en) 2020-06-16 2022-07-12 Globus Medical, Inc. Navigated surgical system with eye to XR headset display calibration
US11877807B2 (en) 2020-07-10 2024-01-23 Globus Medical, Inc Instruments for navigated orthopedic surgeries
US11793588B2 (en) 2020-07-23 2023-10-24 Globus Medical, Inc. Sterile draping of robotic arms
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11523785B2 (en) 2020-09-24 2022-12-13 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US11911112B2 (en) 2020-10-27 2024-02-27 Globus Medical, Inc. Robotic navigational system
CN116685847A (en) 2020-11-13 2023-09-01 决策科学医疗有限责任公司 System and method for synthetic aperture ultrasound imaging of objects
US11717350B2 (en) 2020-11-24 2023-08-08 Globus Medical Inc. Methods for robotic assistance and navigation in spinal surgery and related systems
US11857273B2 (en) 2021-07-06 2024-01-02 Globus Medical, Inc. Ultrasonic robotic surgical navigation
US11439444B1 (en) 2021-07-22 2022-09-13 Globus Medical, Inc. Screw tower and rod reduction tool
US11911115B2 (en) 2021-12-20 2024-02-27 Globus Medical Inc. Flat panel registration fixture and method of using same

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5572999A (en) * 1992-05-27 1996-11-12 International Business Machines Corporation Robotic system for positioning a surgical instrument relative to a patient's body
US20040111024A1 (en) * 2001-02-07 2004-06-10 Guoyan Zheng Method for establishing a three-dimensional representation of a bone from image data
US20060120583A1 (en) * 2004-11-10 2006-06-08 Agfa-Gevaert Method of performing measurements on digital images
US20060133694A1 (en) * 2004-11-10 2006-06-22 Agfa-Gevaert Display device for displaying a blended image
US20080267483A1 (en) * 2007-04-30 2008-10-30 Siemens Medical Solutions Usa, Inc. Registration of Medical Images Using Learned-Based Matching Functions
US20080287781A1 (en) * 2004-03-05 2008-11-20 Depuy International Limited Registration Methods and Apparatus
US20090076371A1 (en) * 1998-09-14 2009-03-19 The Board Of Trustees Of The Leland Stanford Junior University Joint and Cartilage Diagnosis, Assessment and Modeling
US20090124890A1 (en) * 2005-02-18 2009-05-14 Raymond Derycke Method and a System for Assisting Guidance of a Tool for Medical Use
US7570791B2 (en) * 2003-04-25 2009-08-04 Medtronic Navigation, Inc. Method and apparatus for performing 2D to 3D registration
US20090232369A1 (en) * 2004-12-20 2009-09-17 Koninklijke Philips Electronics N.V. Method, a system and a computer program for integration of medical diagnostic information and a geometric model of a movable body
US20100080415A1 (en) * 2008-09-29 2010-04-01 Restoration Robotics, Inc. Object-tracking systems and methods
US7929745B2 (en) * 2005-03-24 2011-04-19 Optasia Medical Limited Method and system for characterization of knee joint morphology
US7949179B2 (en) * 2004-02-25 2011-05-24 The University Of Tokyo Shape measurement device and method thereof
US7968851B2 (en) * 2004-01-13 2011-06-28 Spectrum Dynamics Llc Dynamic spect camera
US20110249882A1 (en) * 2007-08-01 2011-10-13 Depuy Orthopädie GmbH Image processing
US20110305405A1 (en) * 2010-06-11 2011-12-15 Fujifilm Corporation Method, apparatus, and program for aligning images
US8280482B2 (en) * 2004-04-19 2012-10-02 New York University Method and apparatus for evaluating regional changes in three-dimensional tomographic images

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5690106A (en) 1995-06-30 1997-11-25 Siemens Corporate Research, Inc. Flexible image registration for rotational angiography
AU2928097A (en) 1996-04-29 1997-11-19 Government Of The United States Of America, As Represented By The Secretary Of The Department Of Health And Human Services, The Iterative image registration process using closest corresponding voxels
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
NZ513919A (en) 1999-03-17 2001-09-28 Synthes Ag Imaging and planning device for ligament graft placement
US6625607B1 (en) * 1999-07-22 2003-09-23 Parametric Technology Corporation Method of comparing parts
AU2003230845A1 (en) 2002-04-10 2003-10-27 Stereotaxis, Inc. Systems and methods for interventional medicine
US20050256398A1 (en) 2004-05-12 2005-11-17 Hastings Roger N Systems and methods for interventional medicine
ES2246029T3 (en) 2003-04-11 2006-02-01 Medcom Gesellschaft Fur Medizinische Bildverarbeitung Mbh COMBINATION OF THE DATA OF THE PRIMARY AND SECONDARY IMAGES OF AN OBJECT.
DE10333543A1 (en) 2003-07-23 2005-02-24 Siemens Ag A method for the coupled presentation of intraoperative as well as interactive and iteratively re-registered preoperative images in medical imaging
US7315644B2 (en) * 2003-07-31 2008-01-01 The Boeing Company Investigation of destroyed assemblies and identification of components thereof
US7724943B2 (en) 2004-04-21 2010-05-25 Siemens Medical Solutions Usa, Inc. Rapid and robust 3D/3D registration technique
US8090429B2 (en) 2004-06-30 2012-01-03 Siemens Medical Solutions Usa, Inc. Systems and methods for localized image registration and fusion
US8989349B2 (en) 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
DE102005023167B4 (en) 2005-05-19 2008-01-03 Siemens Ag Method and device for registering 2D projection images relative to a 3D image data set
DE102005036322A1 (en) 2005-07-29 2007-02-15 Siemens Ag Intraoperative registration method for intraoperative image data sets, involves spatial calibration of optical three-dimensional sensor system with intraoperative imaging modality
CN103251455B (en) 2005-10-20 2016-04-27 直观外科手术操作公司 Assistant images display on computer display in medical robotic system and manipulation
US8184909B2 (en) * 2008-06-25 2012-05-22 United Technologies Corporation Method for comparing sectioned geometric data representations for selected objects

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6201984B1 (en) * 1991-06-13 2001-03-13 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5572999A (en) * 1992-05-27 1996-11-12 International Business Machines Corporation Robotic system for positioning a surgical instrument relative to a patient's body
US20090076371A1 (en) * 1998-09-14 2009-03-19 The Board Of Trustees Of The Leland Stanford Junior University Joint and Cartilage Diagnosis, Assessment and Modeling
US7117027B2 (en) * 2001-02-07 2006-10-03 Synthes (Usa) Method for establishing a three-dimensional representation of a bone from image data
US20040111024A1 (en) * 2001-02-07 2004-06-10 Guoyan Zheng Method for establishing a three-dimensional representation of a bone from image data
US7570791B2 (en) * 2003-04-25 2009-08-04 Medtronic Navigation, Inc. Method and apparatus for performing 2D to 3D registration
US20090290771A1 (en) * 2003-04-25 2009-11-26 Surgical Navigation Technologies, Inc. Method and Apparatus for Performing 2D to 3D Registration
US8036441B2 (en) * 2003-04-25 2011-10-11 Medtronic Navigation, Inc. Method and apparatus for performing 2D to 3D registration
US7968851B2 (en) * 2004-01-13 2011-06-28 Spectrum Dynamics Llc Dynamic spect camera
US7949179B2 (en) * 2004-02-25 2011-05-24 The University Of Tokyo Shape measurement device and method thereof
US20080287781A1 (en) * 2004-03-05 2008-11-20 Depuy International Limited Registration Methods and Apparatus
US8280482B2 (en) * 2004-04-19 2012-10-02 New York University Method and apparatus for evaluating regional changes in three-dimensional tomographic images
US20060120583A1 (en) * 2004-11-10 2006-06-08 Agfa-Gevaert Method of performing measurements on digital images
US8014625B2 (en) * 2004-11-10 2011-09-06 Agfa Healthcare Method of performing measurements on digital images
US20060133694A1 (en) * 2004-11-10 2006-06-22 Agfa-Gevaert Display device for displaying a blended image
US20090232369A1 (en) * 2004-12-20 2009-09-17 Koninklijke Philips Electronics N.V. Method, a system and a computer program for integration of medical diagnostic information and a geometric model of a movable body
US20090124890A1 (en) * 2005-02-18 2009-05-14 Raymond Derycke Method and a System for Assisting Guidance of a Tool for Medical Use
US7929745B2 (en) * 2005-03-24 2011-04-19 Optasia Medical Limited Method and system for characterization of knee joint morphology
US20080267483A1 (en) * 2007-04-30 2008-10-30 Siemens Medical Solutions Usa, Inc. Registration of Medical Images Using Learned-Based Matching Functions
US8121362B2 (en) * 2007-04-30 2012-02-21 Siemens Medical Solutions Usa, Inc. Registration of medical images using learned-based matching functions
US20110249882A1 (en) * 2007-08-01 2011-10-13 Depuy Orthopädie GmbH Image processing
US20100080415A1 (en) * 2008-09-29 2010-04-01 Restoration Robotics, Inc. Object-tracking systems and methods
US20110305405A1 (en) * 2010-06-11 2011-12-15 Fujifilm Corporation Method, apparatus, and program for aligning images

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120203140A1 (en) * 2011-02-08 2012-08-09 Henrik Malchau Patient Positioning Systems and Methods
US20170135610A1 (en) * 2011-02-08 2017-05-18 The General Hospital Corporation Patient positioning systems and methods
US9554731B2 (en) * 2011-02-08 2017-01-31 The General Hospital Corporation Patient positioning systems and methods
US20120288173A1 (en) * 2011-05-13 2012-11-15 Broncus Technologies, Inc. Surgical assistance planning method using lung motion analysis
US9020229B2 (en) * 2011-05-13 2015-04-28 Broncus Medical, Inc. Surgical assistance planning method using lung motion analysis
US20150228074A1 (en) * 2011-05-13 2015-08-13 Broncus Technologies Surgical assistance planning method using lung motion analysis
US9652845B2 (en) * 2011-05-13 2017-05-16 Broncus Medical Inc. Surgical assistance planning method using lung motion analysis
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US10219811B2 (en) 2011-06-27 2019-03-05 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US10080617B2 (en) 2011-06-27 2018-09-25 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US10896388B2 (en) 2011-07-25 2021-01-19 Prevedere, Inc. Systems and methods for business analytics management and modeling
US10740772B2 (en) * 2011-07-25 2020-08-11 Prevedere, Inc. Systems and methods for forecasting based upon time series data
US20160328726A1 (en) * 2011-07-25 2016-11-10 Prevedere, Inc Systems and methods for forecasting based upon time series data
US20130135312A1 (en) * 2011-11-10 2013-05-30 Victor Yang Method of rendering and manipulating anatomical images on mobile computing device
US8933935B2 (en) * 2011-11-10 2015-01-13 7D Surgical Inc. Method of rendering and manipulating anatomical images on mobile computing device
AU2017276281B2 (en) * 2012-12-31 2019-12-12 Mako Surgical Corp. Systems and methods of registration using an ultrasound probe
EP2956046A4 (en) * 2013-02-18 2016-11-16 Orthogrid Systems S A R L Alignment plate apparatus and system and method of use
WO2014127354A1 (en) 2013-02-18 2014-08-21 Orthogrid Systems, Llc Alignment plate apparatus and system and method of use
US11399734B2 (en) 2013-03-13 2022-08-02 DePuy Synthes Products, Inc. Methods, systems, and devices for guiding surgical instruments using radio frequency technology
US9675272B2 (en) 2013-03-13 2017-06-13 DePuy Synthes Products, Inc. Methods, systems, and devices for guiding surgical instruments using radio frequency technology
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
WO2015018804A1 (en) * 2013-08-05 2015-02-12 Fiagon Gmbh System for the reconstruction of symmetrical body parts
US11413096B2 (en) 2013-08-05 2022-08-16 Intersect Ent Gmbh System for the reconstruction of symmetrical body parts
US20160175056A1 (en) * 2013-08-05 2016-06-23 Fiagon Gmbh System for the reconstruction of symmetrical body parts
CN105764441A (en) * 2013-08-05 2016-07-13 菲亚戈股份有限公司 System for the reconstruction of symmetrical body parts
US9875544B2 (en) 2013-08-09 2018-01-23 Broncus Medical Inc. Registration of fluoroscopic images of the chest and corresponding 3D image data based on the ribs and spine
US10433914B2 (en) * 2014-02-25 2019-10-08 JointPoint, Inc. Systems and methods for intra-operative image analysis
US20150238271A1 (en) * 2014-02-25 2015-08-27 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US11642174B2 (en) 2014-02-25 2023-05-09 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11534127B2 (en) 2014-02-25 2022-12-27 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US20160100909A1 (en) * 2014-02-25 2016-04-14 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US10758198B2 (en) * 2014-02-25 2020-09-01 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US10765384B2 (en) 2014-02-25 2020-09-08 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US20220354590A1 (en) * 2015-05-29 2022-11-10 Smith & Nephew, Inc. Method for registering articulated anatomical structures
US11826112B2 (en) * 2015-05-29 2023-11-28 Smith & Nephew, Inc. Method for registering articulated anatomical structures
GB2559717B (en) * 2015-11-19 2021-12-29 Synaptive Medical Inc Neurosurgical MRI-guided ultrasound via multi-modal image registration and multi-sensor fusion
WO2017085532A1 (en) * 2015-11-19 2017-05-26 Synaptive Medical (Barbados) Inc. Neurosurgical mri-guided ultrasound via multi-modal image registration and multi-sensor fusion
GB2559717A (en) * 2015-11-19 2018-08-15 Synaptive Medical Barbados Inc Neurosurgical MRI-guided ultrasound via multi-modal image registration and multi-sensor fusion
AU2016371212B2 (en) * 2015-12-18 2021-05-20 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US10959782B2 (en) 2016-05-22 2021-03-30 DePuy Synthes Products, Inc. Systems and methods for intra-operative image acquisition and calibration
US10610305B2 (en) 2016-05-22 2020-04-07 DePuy Synthes Products, Inc. Systems and methods for intra-operative image acquisition and calibration
CN109381192A (en) * 2017-08-10 2019-02-26 韦伯斯特生物官能(以色列)有限公司 Method and apparatus for executing face registration
US20190046155A1 (en) * 2017-08-10 2019-02-14 Biosense Webster (Israel) Ltd. Method and apparatus for performing facial registration
US11887306B2 (en) 2021-08-11 2024-01-30 DePuy Synthes Products, Inc. System and method for intraoperatively determining image alignment
US20230138451A1 (en) * 2021-10-28 2023-05-04 Canon Medical Systems Corporation Image processing method and apparatus

Also Published As

Publication number Publication date
US8675939B2 (en) 2014-03-18
US9572548B2 (en) 2017-02-21
JP2012020133A (en) 2012-02-02
US20140094694A1 (en) 2014-04-03
DE102011106812A1 (en) 2012-01-19

Similar Documents

Publication Publication Date Title
US9572548B2 (en) Registration of anatomical data sets
US11839436B2 (en) Methods and kit for a navigated procedure
US10973580B2 (en) Method and system for planning and performing arthroplasty procedures using motion-capture data
JP6040151B2 (en) Method for determining access region from 3D patient image
US20190090955A1 (en) Systems and methods for position and orientation tracking of anatomy and surgical instruments
US20100030231A1 (en) Surgical system and method
JP2004527286A (en) System and method for total knee arthroplasty
US20230141368A1 (en) Assessment of Soft Tissue Tension In Hip Procedures

Legal Events

Date Code Title Description

AS Assignment
Owner name: STRYKER LEIBINGER GMBH & CO., KG., GERMANY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MOCTEZUMA DE LA BARRERA, JOSE LUIS;REEL/FRAME:024698/0885
Effective date: 20100713

STCF Information on status: patent grant
Free format text: PATENTED CASE

AS Assignment
Owner name: STRYKER EUROPEAN HOLDINGS VI, LLC, MICHIGAN
Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER LEIBINGER GMBH & CO. KG;REEL/FRAME:037152/0910
Effective date: 20151008
Owner name: STRYKER EUROPEAN HOLDINGS I, LLC, MICHIGAN
Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER EUROPEAN HOLDINGS VI, LLC;REEL/FRAME:037153/0391
Effective date: 20151008

MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)
Year of fee payment: 4

AS Assignment
Owner name: STRYKER EUROPEAN OPERATIONS HOLDINGS LLC, MICHIGAN
Free format text: CHANGE OF NAME;ASSIGNOR:STRYKER EUROPEAN HOLDINGS III, LLC;REEL/FRAME:052860/0716
Effective date: 20190226
Owner name: STRYKER EUROPEAN HOLDINGS III, LLC, DELAWARE
Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:STRYKER EUROPEAN HOLDINGS I, LLC;REEL/FRAME:052861/0001
Effective date: 20200519

MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8