WO2012001550A1 - Method and system for creating physician-centric coordinate system - Google Patents

Method and system for creating physician-centric coordinate system

Info

Publication number
WO2012001550A1
Authority
WO
WIPO (PCT)
Prior art keywords
physician
centric
data
image
tracking
Prior art date
Application number
PCT/IB2011/052339
Other languages
French (fr)
Inventor
Neil David Glossop
Thomas Shu Yin Tang
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2012001550A1 publication Critical patent/WO2012001550A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/368 Correlation of different images or relation of image positions in respect to the body: changing the image on a display according to the operator's position


Abstract

An image-guided system (11) employing an imaging device (20), an instrument tracking device (30), a physician-centric display module (70) and a registration device (80). The imaging device generates imaging data (21) indicative of an image (23) of an anatomical region (24) of a patient within an imaging coordinate system (22). The instrument tracking device generates tracking data (31) indicative of a location of an instrument (33) within a tracking coordinate system (32). The physician-centric display module generates physician-centric data (71) indicative of a physician centric viewpoint (73) within a physician coordinate system (72). The registration device generates a physician-centric display (81) illustrating a physician-centric viewpoint of a graphic icon (34) of the instrument relative to the image of the anatomical region as a function of a transformation application of the physician-centric data to a registration of the imaging data and the tracking data. An illustration of the physician-centric viewpoint of the graphic icon of the instrument relative to the image of the anatomical region includes the graphic icon replicating motion of the instrument within the tracking coordinate system from a viewpoint of a physician.

Description

METHOD AND SYSTEM
FOR CREATING PHYSICIAN-CENTRIC COORDINATE SYSTEM
The present invention generally relates to an image guided intervention involving a graphic icon of an instrument on a pre-operative scan or an intra-operative scan. The present invention specifically relates to an image guided intervention involving a display of the graphic icon of the instrument within the scan from the viewpoint of a physician.
During a minimally invasive intervention where it is not possible for a physician to directly view a tip of an instrument within a body of a patient (e.g., a needle biopsy, a radiofrequency ablation, etc.), the physician normally makes use of frequent imaging of the patient's body to help properly position the instrument. Particularly, the instrument is inserted in the patient's body, and a location and an orientation of the instrument are adjusted within the images while the physician is viewing the images. A recent advance in this procedure is known as computer assisted image guided intervention ("IGI"), which makes use of instruments that contain embedded position indicating elements enabling their location and orientation to be tracked using a position sensor. A tracked instrument (e.g., a needle, a guidewire, a catheter, etc.) may be registered to a pre-operative scan and/or an intra-operative scan of the patient (e.g., an ultrasound scan, a CT scan, an MRI scan, etc.) enabling the instrument's position and orientation to be superimposed on top of these images as a graphic icon. The tracked instrument may also be displayed as a graphic icon indicating its position and orientation relative to another tracked instrument or device. As the tracked instrument is manipulated in the patient, an image of the instrument's position and orientation is overlaid on the pre-operative scan and/or intra-operative scan. The display on the computer monitor may display the physician's movements in real-time, highlighting the instrument's location, trajectory or other information relative to the target.
IGI systems typically use a device known as a "position sensor" to determine the position and orientation of instruments. Position indicating elements attached to instruments and the position sensor electronics enable the position sensor to track the position and orientation of the position-indicating element and therefore the instrument. Common position sensor technology includes electromagnetic and optical technologies.
Because the IGI system and position sensors normally have no way of knowing where the physician is standing relative to the patient or the display monitor, motions of the instrument may sometimes be displayed in a counterintuitive form. For example, moving a probe to the left might be displayed on the monitor as a motion to the right. If the physician changes position, or moves the display device, the perception may reverse. This is counterintuitive, disorienting and distracting to the physician. Although the physician may eventually "learn" the motions required to produce the correct motions on the screen, doing so takes time and reduces the usability of the system. This learning time varies and can also lead to incorrect instrument placement. In addition to counterintuitive positioning of the instrument, the same problem can affect rotations. For example, a yaw action may result in a pitch action on the screen.
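As a toy numeric illustration of the reversal described above (not taken from the patent): if the physician faces the display from across the patient, the rendering in effect views the tracking frame through a 180-degree rotation about the vertical axis, so a leftward probe motion is drawn as a rightward one:

```python
import numpy as np

# 180-degree rotation about the vertical (z) axis: how the tracking frame
# appears when viewed from the opposite side of the patient.
flip = np.diag([-1.0, -1.0, 1.0])

probe_motion = np.array([-1.0, 0.0, 0.0])  # physician moves the probe left
print(flip @ probe_motion)                 # [1. 0. 0.] -> drawn moving right
```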
For these and other reasons, the present invention provides systems, devices and methods for assisting the hand-eye coordination of the physician so that manipulations of the instruments concur with the expected action of the graphic icon of the instrument on the display.
One form of the present invention is an image-guided system employing an imaging device, an instrument tracking device, a physician-centric display module and a registration device. In operation, the imaging device generates imaging data indicative of an image of an anatomical region of a patient within an imaging coordinate system, the instrument tracking device generates tracking data indicative of a location (i.e., a position and/or an orientation) of an instrument within a tracking coordinate system, and the physician-centric display module generates physician-centric data indicative of a physician centric viewpoint within a physician coordinate system. In response to the data, the registration device generates a display illustrating a physician-centric viewpoint of a graphic icon of the instrument relative to the image of the anatomical region as a function of a transformation application of the physician-centric data to a registration of the imaging data and the tracking data. An illustration of the physician-centric viewpoint of the graphic icon of the instrument relative to the image of the anatomical region includes the graphic icon replicating motion of the instrument within the tracking coordinate system from a viewpoint of a physician.
The foregoing form and other forms of the present invention as well as various features and advantages of the present invention will become further apparent from the following detailed description of various exemplary embodiments of the present invention read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present invention rather than limiting, the scope of the present invention being defined by the appended claims and equivalents thereof.
FIG. 1 illustrates an exemplary embodiment of an image-guided system as known in the art.
FIG. 2 illustrates an exemplary embodiment of a data registration executed by the image-guided system shown in FIG. 1 as known in the art.
FIG. 3 illustrates an exemplary embodiment of an image-guided system in accordance with the present invention.
FIG. 4 illustrates an exemplary data registration executed by the image-guided surgical system shown in FIG. 3 in accordance with the present invention.
FIG. 5 illustrates an exemplary embodiment of a pre-operative physician-centric viewpoint computation in accordance with the present invention.
FIG. 6 illustrates an exemplary embodiment of an intra-operative physician-centric viewpoint computation in accordance with the present invention.
FIG. 7 illustrates an exemplary embodiment of tracking indicators in accordance with the present invention.
FIG. 8 illustrates an exemplary embodiment of tracking patches in accordance with the present invention.
FIG. 9 illustrates an exemplary embodiment of a volume orientation display in accordance with the present invention.
FIG. 1 illustrates an image-guided system 10 employing an imaging device 20, an instrument tracking device 30, a registration device 40 and a display device 50.
For purposes of the present invention, imaging device 20 is broadly defined herein as any device structurally configured for generating imaging data indicative of an image of an anatomical region of a patient (e.g., brain, heart, lungs, abdomen, etc.) within an imaging coordinate system, such as, for example, a generation of imaging data ("ID") indicative of image 23 of an anatomical region of a patient within an imaging coordinate system 22 as shown in FIG. 2. Examples of imaging device 20 include, but are not limited to, any known type of magnetic resonance imaging device, any known type of X-ray imaging device, any known type of ultrasound imaging device and any known type of computed tomography imaging device.
For purposes of the present invention, instrument tracking device 30 is broadly defined herein as any device structurally configured for generating tracking data indicative of tracking an instrument of any type within a tracking coordinate system, such as, for example, a generation of tracking data ("TD") 31 indicative of a tracking of an instrument 33 within a tracking coordinate system 32 as shown in FIG. 2. Examples of instrument tracking device 30 include, but are not limited to, any known type of electromagnetic tracking device and any known type of optical tracking device. Examples of instrument 33 include, but are not limited to, surgical instruments/tools, imaging instruments/tools and therapeutic instruments/tools.
For purposes of the present invention, registration device 40 is broadly defined herein as any device structurally configured for registering image data 21 and tracking data 31 to thereby generate a transformation matrix that facilitates a registered display ("RGD") 41 via display device 50 of a graphic icon of the instrument relative to the image of the anatomical region, such as, for example, a registration of imaging data 21 and tracking data 31 via a registration algorithm 42 as shown in FIG. 2 to thereby generate a transformation matrix 43 that facilitates a registered display 44 of a graphic icon 34 of instrument 33 relative to image 23 of the anatomical region as shown in FIG. 2. Examples of registration algorithm 42 include, but are not limited to, an iterative closest point ("ICP") algorithm and a singular value decomposition ("SVD") algorithm as known in the art.
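The SVD-based step named above can be sketched as the closed-form rigid alignment of paired fiducial points (the solve that an ICP loop repeats after each correspondence update). A minimal sketch in Python/NumPy, assuming corresponding point pairs are already available; the function name and conventions are illustrative, not the patent's:

```python
import numpy as np

def rigid_register(image_pts, tracker_pts):
    """Closed-form rigid registration via SVD (Kabsch/Arun method).

    image_pts, tracker_pts: (N, 3) arrays of corresponding fiducial
    positions in the imaging and tracking coordinate systems.
    Returns a 4x4 homogeneous transform mapping tracker -> image space,
    i.e. a stand-in for transformation matrix 43.
    """
    ci = image_pts.mean(axis=0)                    # centroid, image space
    ct = tracker_pts.mean(axis=0)                  # centroid, tracker space
    H = (tracker_pts - ct).T @ (image_pts - ci)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # guard against reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = ci - R @ ct
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

An ICP loop would alternate this closed-form solve with a nearest-point correspondence update until the alignment converges.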
Prior to the present invention, system 10 was not structurally configured to ascertain the physician's viewpoint of the image guided intervention; consequently, more often than not, motions of graphic icon 34 of instrument 33 were not displayed as a replication of the actual motions of instrument 33 relative to the anatomical region of the patient from the viewpoint of the physician. To overcome this problem, the present invention provides a physician-centric display module for ascertaining the physician's viewpoint of the image-guided intervention, whereby graphic icon 34 of instrument 33 may consistently be displayed as a replication of the actual motions of instrument 33 relative to the anatomical region of the patient from the viewpoint of the physician.
Specifically, FIG. 3 illustrates an image-guided intervention system 11 employing imaging device 20, instrument tracking device 30, a physician-centric display module 70, a registration device 80 and display device 50.
For purposes of the present invention, physician-centric display module 70 is broadly defined herein as any software, firmware and/or hardware structurally configured for generating physician-centric data ("PCD") 71 indicative of a physician-centric viewpoint within a physician coordinate system (i.e., the viewpoint of the physician is explicitly or implicitly centered within the physician coordinate system), such as, for example, a generation of physician-centric data 71 indicative of a physician-centric viewpoint 73 within a physician coordinate system 72 as shown in FIG. 4. In practice, physician-centric data 71 may include an exact computation, an estimation or an approximation of physician-centric viewpoint 73 within the physician coordinate system 72.
Various embodiments of module 70 are subsequently provided herein in connection with the description of FIGS. 5-9.
For purposes of the present invention, registration device 80 is broadly defined herein as any device structurally configured for processing image data 21, tracking data 31 and physician-centric data 71 via a mathematical execution 82 of a registration algorithm and a transformation application that facilitates a display of a physician-centric viewpoint of the graphic icon of the instrument. Furthermore, the "physician-centric viewpoint of the graphic icon of the instrument" is broadly defined herein as a display of the graphic icon of the instrument that replicates motion of the instrument from the viewpoint of the physician as the physician navigates the instrument within the tracking coordinate system. More particularly, the displayed motion of the graphic icon of the instrument will mimic the actual motion of the instrument in terms of direction, orientation and proportional degree of movement from the viewpoint of the physician. For example, if the physician moves the instrument to his/her left from a first location on an anatomical object to a second location on the anatomical object, then the displayed motion of the graphic icon of the instrument from the first location on a displayed image of the anatomical object to the second location on the displayed image of the anatomical object will mimic this physician-navigated motion of the instrument from the first location on the anatomical object to the second location on the anatomical object.
In one exemplary embodiment as shown in FIG. 4, registration device 80 registers image data 21 and tracking data 31 as previously described herein to generate a base transformation matrix 83. Physician-centric data 71 includes a physician-centric transformation matrix 84 corresponding to a physician-centric viewpoint 73 within physician coordinate system 72, and registration device 80 applies transformation matrix 84 to transformation matrix 83 as known in the art to generate a display 85 of graphic icon 34 relative to image 23 of the anatomical region that replicates motion of surgical instrument 33 as navigated by a physician 74 within tracking coordinate system 32. This is highlighted by the arrows respectively pointing from graphic icon 34 to image 23 in display 85 and from instrument 33 to anatomical region 24 shown in coordinate system 32. The subsequent description of FIG. 6 herein provides more detail of this embodiment.
In a second exemplary embodiment shown in FIG. 4, registration device 80 applies physician-centric transformation matrix 84 to imaging data 21 as known in the art, and thereafter registers the transformed imaging data 21 to tracking data 31 as known in the art to generate base transformation matrix 83, thereby facilitating display 85 of graphic icon 34 relative to image 23 of the anatomical region that again replicates motion of surgical instrument 33 as navigated by physician 74 within tracking coordinate system 32. The subsequent description of FIGS. 5 and 7-9 herein provides more detail of this embodiment.
In a third exemplary embodiment, registration device 80 applies physician-centric transformation matrix 84 to tracking data 31 as known in the art, and thereafter registers the transformed tracking data 31 to imaging data 21 as known in the art to generate base transformation matrix 83, thereby facilitating display 85 of graphic icon 34 relative to image 23 of the anatomical region that again replicates motion of surgical instrument 33 as navigated by physician 74 within tracking coordinate system 32. The subsequent description of FIGS. 7 and 8 herein provides more detail of this embodiment.
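Schematically, the three embodiments differ only in where physician-centric transformation matrix 84 enters the chain of transforms. A sketch assuming 4x4 homogeneous matrices, with the hypothetical names T_base (matrix 83) and T_pc (matrix 84); as the next paragraph notes, the actual ordering depends on how coordinate systems 22 and 32 are constructed:

```python
import numpy as np

def apply(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-D point p."""
    return (T @ np.append(p, 1.0))[:3]

T_base = np.eye(4)  # tracker -> image registration (matrix 83), e.g. from rigid_register
T_pc = np.eye(4)    # physician-centric correction (matrix 84); identity as placeholder
p = np.zeros(3)     # instrument tip reported in tracking coordinate system 32

# First embodiment: register first, then correct the registered result.
p_display = apply(T_pc @ T_base, p)

# Second and third embodiments: pre-transform the imaging data or the
# tracking data before registration; with pre-transformed tracking data the
# tip is corrected first and the re-computed registration applied second.
p_display = apply(T_base @ T_pc, p)
```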
In practice, for all three embodiments, those having ordinary skill in the art will appreciate that the specific transformation application(s) utilized by registration device 80 depend upon many variables including, but not limited to, the actual construction of coordinate systems 22 and 32, the order of multiplication, and the required application of the resultant transformation to display the graphical data. Thus, there are numerous operational modes of module 70 and device 80 in practice, and any particular operational mode of module 70 and registration device 80 is dependent on the specific application of an image-guided system of the present invention.
FIGS. 5-9 will now be described with an emphasis on the physician-centric transformation matrix aspect of the present invention.
FIG. 5 illustrates a virtual plan of a surgical room 90 that is annotated to determine a patient location 92 (e.g., prone, decubitus right, supine, etc.) relative to a physician-centric position 91 and a display location 93 relative to the physician-centric position 91. From this information, it is possible to "correct" the physician's motions so that they accurately reflect the expected movements. Using this information, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination. In practice, surgical room 90 may be planned in advance, whereby the physician is required to place the patient in a given location and orientation and to stand in a particular location relative to the display device.
FIG. 6 illustrates module 70 being "taught" physician-centric transformation matrix 84 in a manner that enables efficient hand-eye coordination by leading the physician through a series of movements and recording his reactions. Specifically, the physician may be instructed to move a coordinating instrument 101 that has been equipped with a position indicating element in a direction indicated on a display device 100. By recording the physician's response to moving the instrument according to the supplied instructions, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
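One way to realize this teaching step (the patent does not prescribe a fitting method, so this is an assumption) is to fit the rotation that best maps the directions the position sensor actually recorded onto the directions the display requested, again via the SVD:

```python
import numpy as np

def teach_pc_rotation(commanded_dirs, measured_dirs):
    """Fit the rotation mapping measured tracker directions onto the
    directions the display asked for (orientation part of matrix 84).

    commanded_dirs: (N, 3) unit directions shown on display device 100.
    measured_dirs:  (N, 3) unit directions of coordinating instrument 101
                    as reported by the position sensor.
    """
    H = measured_dirs.T @ commanded_dirs           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T                          # proper rotation only
```

Two or more non-parallel prompted directions determine the rotation; additional prompts average out the physician's imprecision.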
FIG. 7 illustrates how the relative locations of the patient and the physician may be determined automatically by a set 110 of special indicators 111-113, detected via imaging of the anatomical region or via tracking of the instrument. More particularly, indicator 111 may be placed on the physician or on the side of the patient nearest the physician, indicator 112 may be placed on the patient's head, and indicator 113 may be placed on the side of the patient opposite the physician. Using this information during a pre-operative scan/intra-operative scan of the patient or a tracking of the instrument, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
In practice, indicators 111-113 are distinguishable from one another through geometry, selection of materials or by some other physical property that may manifest itself as "brightness" in an MR or CT scan. Additionally, indicators 111-113 may contain position indicating elements detectable by a position sensor arranged so as to also render an orientation of indicators 111-113 and purpose (i.e. to identify the patient's right hand side, physician side, etc.) known to the position sensor.
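One plausible construction (an assumption; the patent leaves the geometry open) turns the three indicator positions into a physician coordinate system by building an orthonormal frame whose first axis points toward the physician's side and whose second axis points toward the patient's head:

```python
import numpy as np

def frame_from_indicators(p_physician, p_head, p_far):
    """Build a physician-centric frame from indicators 111-113 (FIG. 7).

    p_physician: position of indicator 111 (physician's side).
    p_head:      position of indicator 112 (patient's head).
    p_far:       position of indicator 113 (side opposite the physician).
    Returns a 4x4 matrix whose inverse re-expresses tracker-space points
    in the physician-centric frame.
    """
    origin = (p_physician + p_far) / 2.0
    x = p_physician - p_far
    x /= np.linalg.norm(x)            # toward the physician's side
    y = p_head - origin
    y -= x * (y @ x)                  # orthogonalize against x
    y /= np.linalg.norm(y)            # toward the patient's head
    z = np.cross(x, y)                # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, origin
    return T
```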
FIG. 8 illustrates a set 120 of a physician patch 121 to be applied to the physician and a patient patch 122 to be applied to the patient to assist with proper display and reaction of the graphic display. Application of the patches can be done prior to, during or after a diagnostic scan (e.g., a CT scan) to help indicate where the physician is located relative to the patient and the orientation of the patient. In practice, patches 121 and 122 may be individually distinguishable by the position sensing device alone, which is able to tell one from the other along with its orientation. From this information, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
FIG. 9 illustrates a graphical view of three (3) base displays 130-132 of an image of the anatomical region that facilitates the physician in rotating a 3D display 133 of the image to an orientation that mirrors the physician's view of the patient on the procedure table. More particularly, the physician uses an interface (e.g., a mouse or a trackball) to rotate the 3D display 133 so that what is displayed on the screen duplicates his view of the image from his current location. Once this has been done, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
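A sketch of how the FIG. 9 interaction could yield matrix 84 (the accumulation scheme and sign convention are assumptions): each trackball or mouse event composes a small rotation onto the displayed volume, and once the physician declares the view correct, the accumulated rotation is taken as the orientation part of the physician-centric transformation matrix:

```python
import numpy as np

def rotation_about(axis, angle_rad):
    """Rodrigues formula: rotation matrix about a unit axis."""
    a = np.asarray(axis, float)
    a /= np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)

R_view = np.eye(3)                                            # base display orientation
R_view = rotation_about([0, 0, 1], np.deg2rad(30)) @ R_view   # one trackball event
# ... further events compose onto R_view until the view matches ...

T_pc = np.eye(4)
T_pc[:3, :3] = R_view    # orientation part of physician-centric matrix 84
```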
From the description of FIGS. 1-9, those having skill in the art will further appreciate how to implement an image guided intervention for various applications in accordance with the present invention.
While various exemplary embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that the exemplary embodiments of the present invention as described herein are illustrative, and various changes and modifications may be made and equivalents may be substituted for elements thereof without departing from the true scope of the present invention. In addition, many modifications may be made to adapt the teachings of the present invention without departing from its central scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed as the best mode contemplated for carrying out the present invention, but that the present invention includes all embodiments falling within the scope of the appended claims.

Claims

Claims
1. An image-guided system (11), comprising:
an imaging device (20) operable for generating imaging data (21) indicative of an image (23) of an anatomical region (24) of a patient within an imaging coordinate system (22);
an instrument tracking device (30) operable for generating tracking data (31) indicative of a location of an instrument (33) within a tracking coordinate system (32);
a physician-centric display module (70) operable for generating physician-centric data (71) indicative of a physician centric viewpoint (73) within a physician coordinate system (72); and
a registration device (80) operable for generating a physician-centric display (81) illustrating a physician-centric viewpoint (73) of a graphic icon (34) of the instrument (33) relative to the image (23) of the anatomical region as a function of a transformation application of the physician-centric data (71) to a registration of the imaging data (21) and the tracking data (31),
wherein an illustration of the physician-centric viewpoint (73) of the graphic icon (34) of the instrument (33) relative to the image (23) of the anatomical region includes the graphic icon (34) replicating motion of the instrument (33) within the tracking coordinate system (32) from a viewpoint of a physician (74).
2. The image-guided system (11) of claim 1,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83); and
applying the physician-centric transformation matrix (84) to the base transformation matrix (83).
3. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the imaging data
(21); and
subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
4. The image-guided system (11) of claim 1,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the tracking data
(31); and
subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
5. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes at least one of a location (92) of the patient relative to a centric position (91) of the physician within the physician coordinate system (72) and a location (93) of a display device relative to the centric position (91) of the physician within the physician coordinate system (72).
6. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes at least one displayed location of a coordinating instrument (101) relative to the physician centric viewpoint (73) within the physician coordinate system (72).
7. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a location of a physician indicator (111) within the physician coordinate system (72) as tracked by the tracking device (30), the physician indicator (111) representing the physician centric viewpoint (73).
8. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a location of a patient indicator (112) within the physician coordinate system (72) as tracked by the tracking device (30), the patient indicator (112) representing a location of a particular feature of the patient within the physician coordinate system (72).
9. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a location of a supplemental indicator (113) within the physician coordinate system (72) as tracked by the tracking device (30), the supplemental indicator (113) representing a location within the physician coordinate system (72) remote from the physician centric viewpoint (73).
10. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a location of a physician patch (121) within the physician coordinate system (72) as tracked by the tracking device (30), the physician patch (121) being attached to the physician and representing the physician centric viewpoint (73).
11. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a location of a patient patch (122) within the physician coordinate system (72) as tracked by the tracking device (30), the patient patch (122) being attached to the patient and representing a location of a particular feature of the patient within the physician coordinate system (72).
12. The image-guided system (11) of claim 1, wherein the physician-centric data (71) includes a displayed orientation (133) of the image (23) of the anatomical region (24) representing the physician centric viewpoint (73).
13. An image-guided system (11), comprising:
a physician-centric display module (70) operable for generating physician-centric data (71) indicative of a physician centric viewpoint (73) within a physician coordinate system (72); and a registration device (80) operable for generating a physician-centric display (81) illustrating a physician-centric viewpoint (73) of a graphic icon (34) of an instrument (33) relative to an image (23) of an anatomical region of a patient as a function of a transformation application of the physician-centric data (71) to a registration of imaging data (21) and tracking data (31),
wherein the imaging data (21) is indicative of the image (23) of the anatomical region (24) of the patient within the imaging coordinate system (22),
wherein the tracking data (31) is indicative of the location of the instrument (33) within the tracking coordinate system (32), and
wherein an illustration of the physician-centric viewpoint (73) of the graphic icon (34) of the instrument (33) relative to the image (23) of the anatomical region includes the graphic icon (34) replicating motion of the instrument (33) within the tracking coordinate system (32) from a viewpoint of a physician (74).
14. The image-guided system (11) of claim 13,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83); and
applying the physician-centric transformation matrix (84) to the base transformation matrix (83).
15. The image-guided system (11) of claim 13,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the imaging data
(21); and subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
16. The image-guided system (11) of claim 13,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the tracking data
(31); and
subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
17. An image-guided method, comprising:
generating imaging data (21) indicative of an image (23) of an anatomical region (24) of a patient within an imaging coordinate system (22);
generating tracking data (31) indicative of a location of an instrument (33) within a tracking coordinate system (32);
generating physician-centric data (71) indicative of a physician centric viewpoint (73) within a physician coordinate system (72); and
generating a physician-centric display (81) illustrating a physician-centric viewpoint (73) of a graphic icon (34) of the instrument (33) relative to the image (23) of the anatomical region as a function of a transformation application of the physician-centric data (71) to a registration of the imaging data (21) and the tracking data (31),
wherein an illustration of the physician-centric viewpoint (73) of the graphic icon (34) of the instrument (33) relative to the image (23) of the anatomical region includes the graphic icon (34) replicating motion of the instrument (33) within the tracking coordinate system (32) from a viewpoint of a physician (74).
18. The image-guided method of claim 17, wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83); and
applying the physician-centric transformation matrix (84) to the base transformation matrix (83).
19. The image-guided method of claim 17,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the imaging data
(21); and
subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
20. The image-guided method of claim 17,
wherein the physician-centric data (71) includes a physician-centric transformation matrix (84) derived from the physician centric viewpoint (73) within the physician coordinate system (72); and
wherein the transformation application of the physician-centric data (71) to the registration of the imaging data (21) and the tracking data (31) includes:
applying the physician-centric transformation matrix (84) to the tracking data
(31); and
subsequently registering the imaging data (21) and the tracking data (31) including a generation of a base transformation matrix (83).
PCT/IB2011/052339 2010-06-30 2011-05-27 Method and system for creating physician-centric coordinate system WO2012001550A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36020510P 2010-06-30 2010-06-30
US61/360,205 2010-06-30

Publications (1)

Publication Number Publication Date
WO2012001550A1 (en) 2012-01-05

Family

ID=44544053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/052339 WO2012001550A1 (en) 2010-06-30 2011-05-27 Method and system for creating physician-centric coordinate system

Country Status (1)

Country Link
WO (1) WO2012001550A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015075720A1 (en) * 2013-11-21 2015-05-28 Elbit Systems Ltd. A medical optical tracking system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5776050A (en) * 1995-07-24 1998-07-07 Medical Media Systems Anatomical visualization system
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data
US20040138556A1 (en) * 1991-01-28 2004-07-15 Cosman Eric R. Optical object tracking system
WO2009094646A2 (en) * 2008-01-24 2009-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation


Similar Documents

Publication Publication Date Title
US10575755B2 (en) Computer-implemented technique for calculating a position of a surgical device
US10342575B2 (en) Apparatus for use with needle insertion guidance system
US20210401456A1 (en) Apparatus for Use with Needle Insertion Guidance System
US7466303B2 (en) Device and process for manipulating real and virtual objects in three-dimensional space
Zhang et al. Electromagnetic tracking for abdominal interventions in computer aided surgery
EP2096523B1 (en) Location system with virtual touch screen
US8213693B1 (en) System and method to track and navigate a tool through an imaged subject
US6996430B1 (en) Method and system for displaying cross-sectional images of a body
JP2023053108A (en) Image registration and guidance using concurrent x-plane imaging
US20190209241A1 (en) Systems and methods for laparoscopic planning and navigation
US8977342B2 (en) Medical intervention device
US20110282188A1 (en) Insertion guidance system for needles and medical components
US20060173269A1 (en) Integrated skin-mounted multifunction device for use in image-guided surgery
US20240008846A1 (en) System for tracking and imaging a treatment probe
US20120059220A1 (en) Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
Andrews et al. Registration techniques for clinical applications of three-dimensional augmented reality devices
WO2015188393A1 (en) Human organ motion monitoring method, surgical navigation system, and computer-readable media
US20140206994A1 (en) Accurate visualization of soft tissue motion on x-ray
WO2012172474A1 (en) System and method for guided injection during endoscopic surgery
CN110192917B (en) System and method for performing percutaneous navigation procedures
JP2008126075A (en) System and method for visual verification of ct registration and feedback
EP3544538B1 (en) System for navigating interventional instrumentation
WO2008035271A2 (en) Device for registering a 3d model
Traub et al. Advanced display and visualization concepts for image guided surgery
WO2016108110A1 (en) Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11738287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11738287

Country of ref document: EP

Kind code of ref document: A1