US20040138556A1 - Optical object tracking system - Google Patents
- Publication number
- US20040138556A1 (application US10/752,118)
- Authority
- US
- United States
- Prior art keywords
- instrument
- camera
- optically detectable
- data
- patient
- Prior art date
- Legal status
- Abandoned
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T3/00—Geometric image transformation in the plane of the image
- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
        - A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
          - A61B2034/107—Visualisation of planned trajectories or target regions
        - A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
          - A61B2034/2046—Tracking techniques
            - A61B2034/2065—Tracking using image or pattern recognition
          - A61B2034/2068—using pointers, e.g. pointers having reference marks for determining coordinates of body points
          - A61B2034/2072—Reference field transducer attached to an instrument or patient
      - A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
        - A61B90/10—for stereotaxic surgery, e.g. frame-based stereotaxis
        - A61B90/20—Surgical microscopes characterised by non-optical aspects
        - A61B90/36—Image-producing devices or illumination devices not otherwise provided for
          - A61B2090/363—Use of fiducial points
          - A61B2090/364—Correlation of different images or relation of image positions in respect to the body
            - A61B2090/365—augmented reality, i.e. correlating a live optical image with another image
            - A61B2090/368—changing the image on a display according to the operator's position
          - A61B90/37—Surgical systems with images on a monitor during operation
            - A61B2090/371—with simultaneous use of two cameras
            - A61B2090/378—using ultrasound
        - A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
          - A61B2090/3937—Visible markers
            - A61B2090/3945—Active visible markers, e.g. light emitting diodes
          - A61B2090/397—electromagnetic markers other than visible, e.g. microwave
            - A61B2090/3975—active
            - A61B2090/3979—active infrared
          - A61B2090/3983—Reference marker arrangements for use with image guided surgery
        - A61B90/50—Supports for surgical instruments, e.g. articulated arms
          - A61B2090/502—Headgear, e.g. helmet, spectacles
    - A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
      - A61N5/00—Radiation therapy
        - A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
          - A61N5/1048—Monitoring, verifying, controlling systems and methods
            - A61N5/1049—for verifying the position of the patient with respect to the radiation beam
              - A61N2005/105—using a laser alignment system
Definitions
- the invention relates generally to medical equipment used in the surgical treatment of disease, and more particularly to a system and method for medical instrument navigation by optically tracking the positions of instruments used during surgery or other treatments in relation to a patient's anatomy.
- Image guided stereotaxy is widely used in the field of neurosurgery. It involves the quantitative determination of anatomical positions based on scan data taken from a CT, MRI or other scanning procedures to obtain three-dimensional scan data. Typically, the image scan data is placed in a computer to provide a three-dimensional database that may be variously used to provide graphic information. Essentially, such information is useful in surgical procedures and enables viewing a patient's anatomy in a graphics display.
- a mechanically linked space pointer (analogous to a pencil) attached to the end of an encoded mechanical linkage might be directed at a patient's anatomy and its position quantified relative to the stereotactic scan data.
- the space pointer might be oriented to point at an anatomical target and so displayed using computer graphics techniques.
- Such apparatus has been proposed, using an articulated space pointer with a mechanical linkage.
- the head frame and the articulated space pointer are mechanically connected to an apparatus used to measure and calculate the position of the probe or pointer. Consequently, although a relatively high number of degrees of freedom can be provided to the pointer (or other tool coupled to the pointer), the mechanical linkage may still restrict the possible ranges of motion available to the clinician. Furthermore, the linkages may be large and obtrusive, and can be difficult to sterilize.
- although the apparatus tracks the position of the space pointer in relation to the patient's anatomy, the clinician is still free to move about the patient and operate from any desired position, and this freedom is not reflected in the data produced by the device. Accordingly, although a “pointer's eye” view of the surgical field can be provided, if the clinician is operating from any of various other angles, then any graphical representation of the surgical field may be disorienting, confusing, or not representative of the “surgeon's eye” view. Although the system's point-of-view might be selected and altered manually, this is not an optimum solution, as it requires additional steps to be taken by the clinician or an assistant.
- the need for relating external treatment apparatus or surgical viewing directions to a specific target arises in several aspects.
- the need arises in relation to the treatment of internal anatomical targets, specifically to position and maintain such targets with respect to a surgical instrument such as a probe, a microscope with a specific direction and orientation of view, or an X-ray treatment beam associated with a large external apparatus.
- a need exists for methods for aligning a surgical instrument, probe, or beam not attached by any mechanical linkage, to impact specific anatomical targets via a path selected to avoid injury to other critical anatomical structures.
- an optical camera apparatus functions in cooperation with a computer system and a specially configured surgical instrument.
- the camera system is positioned to detect a clinical field of view and to detect index markers on a surgical instrument, a patient, and/or a surgeon.
- the markers are tracked by the camera apparatus.
- the image scan data (such as from a CT or MR scan of the patient's anatomy) and data specifying the position of the instrument and the surgeon are transformed relative to the patient's anatomy and the camera coordinate system, thereby aligning the scan data, patient position and orientation data, instrument position and orientation data, and surgeon position and orientation data for selectable simultaneous viewing on a computer display.
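The coordinate alignment described above is, computationally, a matter of composing rigid transformations. The following is a minimal numpy sketch of that bookkeeping, assuming each tracked pose has already been recovered as a 4×4 homogeneous matrix in camera coordinates; the names and example values are illustrative, not taken from the patent.

```python
import numpy as np

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses, each expressed in the camera coordinate system.
T_cam_patient = make_pose(np.eye(3), [0.0, 0.0, 1.5])       # patient markers
T_cam_instrument = make_pose(np.eye(3), [0.1, 0.0, 1.4])    # instrument markers

# Instrument pose relative to the patient: invert one transform, then compose.
T_patient_instrument = np.linalg.inv(T_cam_patient) @ T_cam_instrument

# A point on the instrument (e.g., its tip), expressed in patient coordinates.
tip_instrument = np.array([0.0, 0.0, 0.12, 1.0])  # homogeneous coordinates
tip_patient = T_patient_instrument @ tip_instrument
print(tip_patient[:3])
```

The same composition extends to the surgeon's pose and to the registered scan data, since homogeneous transforms chain by simple matrix multiplication.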
- Various exemplary embodiments are given of the use of lines, arrays of points, geometric patterns and figures, lines of light, and other optically detectable marker configurations to identify the position and orientation of a surgical instrument, a patient, and a surgeon.
- the disclosed embodiments have the advantage of being wireless and optically coupled to the camera tracking system. Moreover, they can be relatively economical and lightweight in comparison to the mechanically coupled tracking devices described in the background section above.
- FIG. 1 schematically illustrates a system for optically tracking instruments and other objects in a surgical field in accordance with the present invention
- FIG. 2, which includes FIGS. 2A, 2B, 2C, 2D, 2E, 2F, 2G, and 2H, illustrates various configurations of optically detectable geometric objects and patterns associated with objects to be tracked in accordance with the system of FIG. 1;
- FIG. 3, which includes FIGS. 3A, 3B, 3C, 3D, 3E, and 3F, illustrates various optically detectable objects attached to instruments in accordance with the present invention;
- FIG. 4, which includes FIGS. 4A, 4B, and 4C, illustrates additional alternative embodiments of optically detectable objects in accordance with the present invention;
- FIG. 5 schematically shows several combinations of graphics, video, and reconstructed representations derived from optical tracking of a surgical field;
- FIG. 6 schematically shows a battery-powered optically tracked instrument for use in accordance with the present invention
- FIG. 7 illustrates the functions performed in the combined processing of tracking, videos, and/or image data in a display in accordance with the present invention
- FIG. 8 is a flowchart showing the sequence of steps performed in tracking an optically detectable object.
- FIG. 9 is a flowchart illustrating the sequence of steps performed in generating a display when a surgical instrument, a patient, and a surgeon are all tracked by a system in accordance with the invention.
- an embodiment of a system according to the invention is shown schematically as including a camera system 10 that has a field of view that includes multiple elements.
- the elements can include a surgical field for surgical application or a treatment field for therapy applications.
- Part of the patient's body 22 may or may not be in the camera field.
- Mounted to the patient within the camera field are several optically detectable objects such as markers 24 , 26 , and 28 , which are mounted directly on the patient, or alternatively, identifiers 30 , 32 , 34 , and 36 connected to a structure 38 that is rigidly connected to the patient's body 22 .
- the markers 24 , 26 , and 28 or the identifiers 30 , 32 , 34 , and 36 may be light-emitting, light-reflecting, or otherwise optically differentially detectable geometric structures, patterns, or elements. They may comprise, for example, light-emitting diodes (“LEDs”) capable of emitting infrared, visible, or other wavelengths of light; reflectors, such as mirrors, reflective paint, reflective sheeting or tape, reflective dispersions, and so on.
- the markers or identifiers may be fabricated in any of various shapes including discs, annular plates or rings, domes, hemispheres, spheres, triangles, squares, cubes, diamonds, or combinations thereof. It has been found that stick-down discs, domes, or spheres are particularly usable in this application.
- the identifier 36 may include a reflective surface of triangular shape, for example, that is detectable in spatial position and orientation by the camera system 10 . In this way, the patient's position and orientation can be detected with respect to the coordinate system of the camera system 10 ; this procedure will be discussed in further detail below.
- the camera system 10 comprises one or more cameras, each of which can be selected from optical cameras of various known types. In FIG. 1, three cameras are shown as part of the camera system 10. In the disclosed embodiment, a right-mounted camera 12 and a left-mounted camera 16 are capable of resolving two-dimensional images. The dashed lines 40 illustrate the field of view of the right-mounted camera 12; the left-mounted camera 16 has a similar (but displaced) field of view. The cameras provide optical camera data to the processor 42 related to optically detectable objects in the common field-of-view of the cameras included in the camera system 10.
- stereoscopic or three-dimensional position data on the optically detectable object positions in the coordinate camera system can be derived by the processor 42 .
- the positions and orientations of objects within the camera system field of view can be determined rapidly by the processor 42 and sent to a computer 44 .
- the computer 44 has software to represent the positions and orientations of those objects in camera coordinates and display the objects in various representations on a display means 46 as desired by the clinician.
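As a concrete illustration of how two-dimensional images from two cameras yield three-dimensional marker positions, here is a minimal sketch of linear (DLT) triangulation for a calibrated stereo pair; the projection matrices and pixel coordinates are placeholders, not calibration values from the patent.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one marker seen by two calibrated cameras.

    P1 and P2 are 3x4 projection matrices; x1 and x2 are the (u, v) pixel
    coordinates of the same marker in the left and right images.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize to a 3-D point in camera coordinates

# Placeholder projection matrices for a left/right camera pair (20 cm baseline).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

print(triangulate(P_left, P_right, (320, 240), (160, 240)))  # ~ [0, 0, 1]
```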
- a lateral support 18 for the cameras 12 and 16 is fixed by a coupler 20 to a rigid reference R, such as the ceiling, wall, or floor of a room.
- also provided are light sources 50 and 52, which in the disclosed embodiment are mounted in proximity to the cameras 12 and 16, respectively. These light sources can send light outward, for example along a path represented by a dashed line 54, to be reflected off of a reflective optically detectable object such as the marker 24 on the patient's body 22. Reflected light then returns along a path such as that represented by a dashed line 56, and is detected by the camera 12.
- if the marker 24 and the other markers and identifiers in the field include reflective surfaces, points, lines, or regions, then these structures can be represented as camera data in a three-dimensional coordinate system fixed with respect to the camera system 10.
- in one embodiment, the light sources 50 and 52 may be pulsed clusters of LEDs in the infrared (IR) frequency range, and the cameras 12 and 16 may have selective IR filters matched to the IR source wavelength.
- alternatively, ambient lighting conditions can be used to enable the cameras 12 and 16 to detect the markers and identifiers.
- if the marker 24, for example, is a brightly colored (white, green, red, etc.) disc, sphere, or other shape that stands out in contrast to whatever is visible in the background, then the marker's position can be detected by the cameras.
- similarly, if the identifier 30 is bright white, and the surface of the head clamp structure 38 is dark or black, then the identifier 30 can be discriminated by the camera system 10.
- one or more cameras may be used in the camera system 10 .
- two or more cameras will yield stereoscopic data on objects in the clinical field of view in relation to the camera frame of reference or in camera coordinates.
- some or all of the optically detectable identifiers may comprise light sources themselves.
- the identifiers may be LEDs or other powered light sources such as lamps, possibly enclosed in diffusing globes.
- the light elements of identifiers 30 , 32 , and 34 can be triggered by and synchronized with cameras 12 and 16 .
- electronic shutters in the cameras can be used to enable the camera detectors at just the time when elements 30 , 32 , and 34 illuminate, thereby increasing the signal-to-noise ratio.
- Also shown in FIG. 1 is a surgical instrument 60.
- the instrument can be of any known surgical type, including but not limited to probes, cutting devices, suction tubes, endoscopes, electronic probes, and other tools.
- Attached to the instrument 60 is at least one optically detectable element 62, which can comprise various geometric structures that are detectable and recognizable by the cameras 12 and 16.
- a rod indicator 64 is shown in a fixed relationship with a spherical indicator 66 .
- these indicators 64 and 66 can comprise reflective material, bright or colored surfaces, or light-emitting elements which are detected by cameras 12 and 16 .
- the three-dimensional position and orientation of the element 62 can then be calculated using the camera data processor 42 and the computer 44 .
- the orientation and position of the instrument 60 can thereby be determined.
- a calibration of, or a pre-fixed position for, the element 62 with respect to the instrument 60 may be established before surgery or intraoperatively (see, for example, several of the products of Radionics, Burlington, Mass.).
- if the indicators 64 and 66 are light-emitting, they can be connected to the processor 42 (dashed line) and synchronized to the strobing of the camera system 10.
- light-detectable indicators 70 , 72 , and 74 are shown on a surgeon 76 .
- the indicators 70 , 72 , and 74 are attached to a headband 78 worn by the surgeon 76 .
- This optically detectable array can then be tracked by the camera system 10 along with the patient's body 22 and the instrument 60.
- the camera data processed in the processor 42 and assimilated in the computer 44 can thereby be used to track, in three-dimensional space relative to the camera system 10, the positions of all of these elements and their relative orientations.
- the processor 42 can be connected to the surgeon's headband 78 (dashed line) to synchronize the indicators' signals.
- image data can be provided to the surgeon 76 via an optical headset 80 worn by the surgeon.
- in the disclosed embodiment, the optical headset 80 is a binocular magnifier with built-in image-splitting elements.
- Graphic data from the processor 42, originating from image scan data 48 pre-scanned from the patient, can be sent into the viewing elements of the headset 80 to update the surgeon 76 with location data correlated to the surgeon's viewing position.
- a reconstructed image of CT or MRI data taken previously and provided to the computer 44 can be displayed via the headset 80 , thereby permitting the surgeon to see a “reconstructed” view from the direction of his physical perspective.
- the computer 44 can assimilate historic image data 48 and convert it to reconstructed planar images and send that information to a display element 46 , which thereafter can be “piped” or transmitted to the headset 80 for the surgeon's use.
- the headset 80 can comprise at least one video camera 82 capable of viewing the surgical field from the surgeon's direction.
- Information from the video camera 82 can be sent (via the dashed line) to the processor 42 and the computer 44 and onto the display 46 .
- that information can then be reconstructed and displayed via a split screen prism in the surgeon's field-of-view via his headset 80 .
- the surgeon's view information can be oriented in a suitable direction by the tracking of the indicators 70 , 72 , and 74 with the camera system 10 , as discussed above.
- the video information displayed in the headset 80 can be rendered from stereotactic camera coordinates.
- the processor 42, in one embodiment of the invention, is a dedicated processor for electronic data from the camera system 10.
- the processor 42 is also capable of synchronously controlling the light emitters 50 and 52 , if needed to illuminate the optically detectable markers or indicators on the patient 22 , the head holder structure 38 , the instrument 60 , or the surgeon 76 .
- Data from the processor 42 is sent to the computer 44 , where it is then analyzed in three-dimensional camera-based coordinates.
- Image data 48 can be held in the memory of the computer 44 or transferred to it by, for example, optical disk, magnetic tape, etc.
- the visualization of camera data and image scan data is accomplished via the display 46 , which in various embodiments can be a CRT, liquid crystal display, heads-up display, or other display device.
- the visual image presented by the display 46 represents the position of the instrument 60 in terms of orientation, tip position, and other characteristics with respect to the image scan data 48 in a variety of ways. For examples, see the documentation for the OTS product of Radionics, Burlington, Mass. Specifically, cataloging slices, probe view, in-probe reconstructions, three-dimensional wedge views, and other views of the instrument 60 relative to the patient 22 can be represented on the display 46. The surgeon's view, via registration of the visual headset 80 (by identifying the indicators 70, 72, and 74 as described above), can also be shown on the display 46. Although the instrument 60 is schematically shown as a pointed instrument in FIG. 1, an instrument for use with the present invention can be nearly any surgical instrument or device, such as a microscope, an endoscope, a cutting instrument, an ultrasonic imaging probe, or a treatment device such as an X-ray collimation device for a linear accelerator (LINAC).
- the objects in this field of view of the camera system 10 can be tracked in the three-dimensional coordinate space of the camera system 10 .
- the instrument 60 can be calibrated relative to the patient 22 in a variety of ways (see the OTS Tracking System of Radionics, Burlington, Mass. for examples).
- for example, the instrument 60 may be touched to a plurality of fiducial markers placed on the patient 22 (for example, the markers 24, 26, and 28) or to natural landmarks on the patient's skin; calibration can also proceed by surface swabbing of the patient's anatomy, by reference to real-time imaging data (for example ultrasound, MRI, CT, etc.) in the situation where the structure 38 is connected to or associated with such an imaging apparatus, and so on.
- the processor 42 uses such data in a calibration step so that the position of the instrument 60 is in a known position and orientation relative to the patient 22 or the structure 38 affixed to the patient 22 , or even with respect to apparatus elsewhere in the room such as a linear accelerator, an image scanner, or an apparatus on a surgeon (the headband 78 , for example).
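The calibration step just described is, in essence, a paired-point rigid registration. Below is a minimal sketch using the Kabsch (SVD) method, assuming the same fiducials are known both in camera coordinates (as touched by the tracked instrument) and in image-scan coordinates; the point data here are synthetic.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (Kabsch method) mapping src onto dst.

    src, dst: (N, 3) arrays of paired fiducial positions, e.g. markers
    touched by the tracked probe (camera coordinates) and the same markers
    identified in the CT/MR scan (image coordinates).
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic fiducials in camera coordinates and their scan-space counterparts.
cam = np.array([[0.0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]])
ang = np.radians(30)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                   [np.sin(ang),  np.cos(ang), 0],
                   [0.0, 0.0, 1.0]])
scan = cam @ R_true.T + np.array([0.05, -0.02, 0.30])

R, t = rigid_fit(cam, scan)
print(np.allclose(R, R_true), t)  # True, and the recovered translation
```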
- FIG. 2 shows various embodiments of patterns, shapes, and objects for the optically detectable elements that can be used on, for example, the instrument 60 (FIG. 1), the patient 22, the surgeon 76, a microscope, or another surgical device (not shown).
- the surgical instrument 60 is rendered schematically.
- although the instrument 60 is depicted in the embodiment set forth in FIG. 2, it should be noted that similar or identical configurations can be used on the patient 22, the structure 38, the surgeon 76, or any other implement to be tracked.
- the instrument 60 has a surgical axis (dashed line 84 ) and a focal point, end point, isocenter, or other characteristic point 86 .
- a geometric object 88 is attached to the instrument 60 by a connector 90 .
- the connector 90 is a rigid coupling and is in a predetermined relationship with the instrument 60 ; alternatively, it could be in an arbitrary relationship with the instrument 60 and subject to calibration.
- the geometric object 88 bears a bright portion 92 (the hatched area) on its surface.
- the bright portion 92 of the surface of the geometric object 88 may comprise reflective paint, reflective film, a brightly colored surface in a particular color spectrum, or an illuminated field.
- the camera system 10 is represented here only schematically, but could comprise the elements described in FIG. 1, including cameras, light sources, a processor, a computer, image data, and a display, among other items. Further, it should be noted that although the geometric object 88 and its bright portion 92 are specifically described and shown as triangular in configuration, many other shapes are possible and equally operative in the context of the invention, which is not so limited.
- the position and orientation of the instrument 60 can be determined by tracking the position and orientation of the geometric object 88 .
- the instrument 60 may be a rigid body of complex shape. Its position, for example, may be characterized by axes such as 84 , 85 , and 87 , and its orientation around an axis 84 may be characterized by a rotation angle indicated by an arrow 83 .
- this rotation angle 83 and the position and orientation of the axes 84 , 85 , and 87 may be tracked relative to the coordinate system of the camera system 10 . This can be done by rigid body transformations which are well known to those skilled in matrix mathematics.
- if the instrument 60 is an endoscope or a microscope for which the axis 84 represents a viewing direction, the characteristic point 86 is a point desired to be viewed in the surgical field, and the axes 85 and 87 represent the orientation of the viewing field relative to the patient's coordinate system or the coordinate system of image scan data, then tracking the geometric object 88 will provide position and orientation tracking of the endoscopic or microscopic field of view.
- Detecting the edges of the bright portion 92 in the three-dimensional coordinate system relative to the camera system 10 enables the direction and orientation of the geometric object 88 to be determined.
- the camera system 10, the processor 42, and the computer 44 (FIG. 1) can detect edges, such as a line 94 between the bright portion 92 and the remainder of the geometric object 88, as well as the other respective edges of the triangle or geometric shape. This may be accomplished by differential detection of the shaded area of the triangle versus the perimeter band, which may not be of reflective, brightly colored, or illuminating optically detectable material. Edge detection of geometric shapes can be done by well-known segmentation or detection algorithms in the processor 42 or the computer 44. Three non-collinear points define a plane; additional data can be used to define position and orientation within the plane.
- in FIG. 2B, the index structure comprises a four-sided geometric shape 96 having a shaded band 98, which may be of reflective or bright material. Inside the band is a relatively dark area 100, which may be of non-reflective material. Alternatively, the roles of the shaded band 98 and the dark area 100 could be reversed.
- the camera system 10 detects this object and the linear edges of the band 98 or the dark area 100 . This establishes the position and orientation of the shape 96 .
- the shape 96 is attached by a connector 102 to the instrument 60 .
- Such a shape 96 could be easily made.
- the differentially reflective areas (i.e., the shaded band 98 and the dark area 100) can be sprayed on, etched, or deposited by a masking process; any of these procedures would be inexpensive and would produce very sharp linear borders between the two regions. These borders can then be detected by the camera system 10 via linear discrimination algorithms in the processor 42 and the computer 44 (FIG. 1).
- if the shape 96 is a parallelogram or a square, the orientation of the plane of the shape 96 can easily be determined by vector cross-product calculations on the linear positions of the borders and edges of the object in three-dimensional space.
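A minimal sketch of that cross-product calculation follows, assuming three corner points of the marker have already been triangulated in camera coordinates; the coordinates are illustrative.

```python
import numpy as np

def marker_frame(p0, p1, p2):
    """Orientation of a planar marker from three detected corner points.

    p0 is one corner; p1 and p2 are its adjacent corners, so (p1 - p0) and
    (p2 - p0) are two edges of the parallelogram. Returns a 3x3 rotation
    whose columns are two in-plane axes and the plane normal.
    """
    e1, e2 = p1 - p0, p2 - p0
    n = np.cross(e1, e2)            # plane normal via the cross product
    x = e1 / np.linalg.norm(e1)
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)              # completes a right-handed frame
    return np.column_stack([x, y, z])

# Hypothetical corner positions triangulated in camera coordinates (meters).
p0, p1, p2 = (np.array([0.00, 0.00, 1.0]),
              np.array([0.04, 0.00, 1.0]),
              np.array([0.00, 0.04, 1.0]))
print(marker_frame(p0, p1, p2))
```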
- the connector 102 is optional; if the shape 96 is integrally part of the tool or instrument 60 , viz. part of its handle, then an explicit connector 102 would not be needed.
- the instrument 60 has attached to it an optically detectable shape 104 in the form of a solid or a plate.
- on the shape 104 are various geometric patterns 106, 108, and 110, which may be, for example, reflective patches or painted areas on a black background. These structures, by their respective shapes and orientations, encode the position and orientation of the shape 104.
- the patterns can be circles, domes, spheres, or ellipsoids which are detectable by the camera system 10 .
- the shape 104 may be flat or curved, according to needs. In an embodiment of the invention, one of the patterns, e.g. the pattern 110, has a more linear structure which is distinguishable from curvilinear shapes, such as the shapes 106 and 108, that are also identifiable by the camera system 10.
- the pattern 108 has an annular shape with a hole 112 in the middle to distinguish it from a dot-shaped pattern 106 .
- the combination can uniquely identify and locate the shape 104 , and therefore the instrument 60 , in its orientation and position.
- the various patterns 106, 108, and 110 can be distinguished from each other, from the background, and from other types of surgical instruments by their reflectivity, color, position, and geometry to give a unique signature to the instrument 60.
- the tool could be a special forceps, and the shape 104, with its distinguishing optical characteristics, could be known to the camera system 10 and its associated computer system 44 to be a particular type of forceps.
- other specific tools can have different optically detectable signature structures.
- a flat detectable shape 114 is shown.
- the shape 114 has orthogonal bar patterns 116 and 118, which could again be reflective tape on a black background of the shape 114. These patterns are recognizable and distinguishable by detecting the borders, such as a line 120 between the patterns 116 and 118 and the background. Linear structures are easily detectable by camera systems and pattern recognition software.
- the camera system 10 can easily scan such a geometric linear pattern, distinguishing the linear bar patterns and thereby determining that the patterns 116 and 118 are orthogonal and in a given spatial three-dimensional position.
- the orientation of the shape 114 and its position in space can be determined in the coordinates of the camera system 10 .
- a fixed relationship between the instrument 60 and the shape 114 via a connector 122 can then be used to identify the position and orientation of the instrument 60 in all of its movements within the field of view of the camera system 10 .
- FIG. 2E shows yet another embodiment of the present invention, in which a linear rod 124 and a spherical object 126 are coupled together.
- a reflective surface 128 on the rod 124 could be taped or painted onto the rod 124 .
- the spherical object 126 bearing reflective tape or paint is, in the disclosed embodiment, coaxial with the painted surface 128 of the rod 124 .
- the camera system 10 is capable of recognizing the linear form of the rod 124 and the center of the spherical object 126. Accordingly, a detection algorithm in the computer 44 (FIG. 1) can determine the axis of the rod 124 and the centroid of the spherical object 126, and thus the position and orientation of the instrument to which they are attached.
- in FIG. 2F, another example of the present invention comprises a longitudinal rod 130 with a reflective linear surface 132 (shaded) and an orthogonal rod 134 with two reflective segments 136 and 138 (shaded).
- these linear structures again are detectable by the camera system 10, thereby determining the orientation of the plane defined by the longitudinal rod 130 and the orthogonal rod 134. As described above, this information is then used to determine the orientation and movement in three-dimensional space of the instrument 60, which is coupled to the rods 130 and 134 via a connector 139.
- FIG. 2G shows yet another example of rod-like structures in a triangle 140 .
- the shaded linear segments 142 , 144 , and 146 lie at the edges of the triangle 140 and define the plane and orientation of the triangle 140 .
- the triangle 140 is attached to the instrument 60 by a connector 148 , and the instrument is tracked as described above.
- in FIG. 2H, a similar V-shaped structure 150 comprising identifiable leg segments 152 and 154 (shaded) provides similar position and orientation vectors, analogous to the previous examples.
- FIG. 3 presents several further embodiments of the present invention that are useful in certain applications.
- a plate 160 or similar structure has detectable areas 162 , 164 , and 166 (shaded).
- a connector 168 couples the plate 160 to the instrument 60 .
- the plate 160, with its identifiable multiple areas, is a disposable, sterile-packed device which can be detachably coupled to the connector 168.
- the detectable areas 162 , 164 , and 166 can be, for example, reflective disks that are adhesively affixed to the plate 160 in particular positions that are recognizable and indexed by the camera system 10 in conjunction with the processor 42 and the computer 44 (FIG. 1).
- the concept of a disposable, single-use, sterile-packed, optically detectable index marker such as that shown in FIG. 3A has several advantages over non-disposable, more expensive devices.
- the plate 160 can be coupled to the connector 168 in a pre-calibrated or a non-precalibrated orientation. If calibrated, it will have a known relationship to the instrument 60 and any focal points, features, or directions thereof. If non-precalibrated, the plate 160 could simply be “stuck” onto the connector 168 and used in an intraoperative calibration procedure to determine translations, rotations, and other transformations of the plate 160 and instrument 60 prior to defining the movement and relative orientation of the instrument 60 . The process of intraoperatively calibrating positions, directions, and orientations of the instrument 60 is facilitated by an intraoperative calibration holder (not shown; see the products of Radionics, Burlington, Mass.).
- FIG. 3B another plate-like index structure is shown.
- a plate 170 is attached to the instrument 60 by a connector 172 .
- the dome-shaped structures 174 and 176 comprise embedded illumination devices (e.g., LEDs).
- alternatively, the dome-shaped structures can include surface-mounted illumination devices, or can simply be made from reflective material. The dome-shaped structures 174 and 176 are then detectable by the camera system 10, as described above.
- the camera system 10 can detect their surfaces and average the three-dimensional positions of the surface points to identify a centroid which may, for example, be the center of a sphere or a hemisphere. Accordingly, there can be several of these spherical or dome-shaped structures on the plate 170 in a pattern or array.
- the structures can be in a linear array, on the corners of a triangle, on the corners of a square, or in a multiple indexed array to provide position, orientation, and transformation information to a system according to the invention.
- FIG. 3C yet another plate-like index structure in accordance with the present invention is shown.
- a plate 180 is attached to the instrument 60 in a similar fashion to that described above.
- on the plate 180 are reflective patterns 182 and 184, here in the form of diamonds or other multi-sided objects.
- Such patterns are identifiable by the camera system 10 and its analysis system to discriminate them from other objects in the field, just as is done in all the previous examples.
- the patterns 182 and 184 are square or diamond-shaped patches of reflective paint or tape; alternatively, they could be brightly colored surfaces with different colors to be detected by the camera system 10 .
- a background surface 186 on the plate 180 may be of opaque, black character so that the linear edges between the patterns 182 and 184 and that surface 186, for example, have a sharp optical delineation. This makes it simpler for the camera system 10, its processor 42, and the computer 44 to detect such an edge. If the edge is straight, then detection along the line can readily be performed by well-known analysis methods. This can give precise linear directions, which in turn can define the vector and positional orientation of the entire plate 180, and thus the orientation of the instrument 60, with high accuracy.
- in FIG. 3D, a plate 190 is shown in a somewhat triangular or trapezoidal shape. It has on it linear structures 191 and 192, which may be reflective edges or other patterns laid down on or fastened to the surface of the plate 190.
- the linear structures 191 and 192 provide contrast for optical discrimination by being highly reflective or very brightly colored surfaces that are detectable by and analyzable by the camera system 10 , as described above.
- the linear borders on both sides of the structures 191 and 192 make possible linear discrimination analysis of these surfaces and also, by mutual information theory, an easily recognizable pattern.
- the pattern is a non-parallel linear or V-shaped pattern of the elements 191 and 192 .
- Such a V-shaped pattern corresponds to and defines two vectors, which in turn can define the plane and orientation of the plate 190 , and thus the instrument 60 .
- in FIG. 3E, the instrument 60 is provided with three spherical elements 193, 194, and 195 in a linear configuration, each of which is made to be reflective or light-emitting. Three centroids corresponding to the spherical elements 193, 194, and 195 can then be determined, and the position and orientation of the instrument 60 follows.
- in FIG. 3F, the instrument 60 bears three spherical elements 196, 197, and 198 in a triangular configuration, each of which is reflective, light-emitting, or otherwise optically detectable.
- the centroids of the three spherical elements 196 , 197 , and 198 are determinable by the system; the centroids define a plane that specifies the orientation of the instrument 60 .
- in FIG. 4A, a solid three-dimensional optically detectable structure is attached to the instrument 60 or comprises part of the instrument 60 itself.
- the structure includes a rod 200 which is attached by coupler 202 to a sphere 204 .
- the rod 200 and the sphere 204 comprise reflective or distinctly colored material detectable by the camera system 10 .
- the reflective rod 200 has the advantage that from all directions it has a similar linear shape, the edges of which are discriminated by the camera system 10 and detected by linear edge detection.
- a centroid axis 206 can therefore be calculated for the rod 200 by the processor 42 and the computer 44 (FIG. 1).
- the reflective sphere 204 defines a centroid 208 which can be detected by spherical edge detection of the sphere 204 and appropriate centroid calculation in the processor 42 and the computer 44 .
- the combination of the axis 206 and the centroid 208 determines the plane defined by the sphere 204 and the rod 200 , and thus the orientation and position of the instrument 60 .
- in FIG. 4B, a solid prism-shaped object 210 is coupled by a connector 212 to the instrument 60.
- On the sides of the object 210, namely a right side 214 and a left side 216, there are respective reflective areas 218 and 220 (shaded), which can be polished or painted surfaces, reflective paint, or reflective tape. Their position and direction determine the orientation of the object 210, and therefore, by transformation, the orientation and position of the instrument 60.
- in FIG. 4C, a solid prismoidal structure 222 has distinguishing optically detectable markings which serve as a signature of the instrument 60 to which it is attached.
- On one face of the structure 222, there is a shaded area 224 having a distinct shape.
- On another face, there are two separate shaded areas 226 and 228 having distinguishable size and shape characteristics.
- the camera system 10 can determine by the size and shape characteristics of the shaded areas 224 , 226 , and 228 the orientation and position of the structure 222 , and thus the orientation and position of the instrument 60 .
- a large number of different and identifiable objects such as the structure 222 can be used to distinguish one tool from another.
- the detectable faces on different sides of the structure 222 will ensure that the structure 222 is identifiable from nearly any direction of view by the camera system 10 .
- Patterns such as bar codes or distinguishable line or object orientations can be used to encode the structure 222 (and thereby the instrument 60 ), allowing each different type of instrument to be recognizable via pattern recognition algorithms implemented in the processor 42 and the computer 44 .
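One simple way to realize such a signature, sketched below under the assumption that the marker centroids have already been triangulated, is to reduce each constellation to a scale-invariant descriptor (sorted ratios of pairwise distances) and look it up in a catalog of known instruments; the catalog entries and names here are hypothetical.

```python
import numpy as np
from itertools import combinations

def signature(points, digits=2):
    """Scale-invariant signature of a marker constellation: its sorted
    pairwise distances, normalized by the largest distance."""
    d = sorted(np.linalg.norm(a - b) for a, b in combinations(points, 2))
    return tuple(round(x / d[-1], digits) for x in d)

# Hypothetical catalog mapping marker signatures to instrument types.
forceps = [np.array(p) for p in [(0, 0, 0), (0.03, 0, 0), (0, 0.05, 0)]]
pointer = [np.array(p) for p in [(0, 0, 0), (0.02, 0, 0), (0.06, 0, 0)]]
catalog = {signature(forceps): "forceps", signature(pointer): "pointer probe"}

# Constellation detected somewhere in the surgical field (same geometry,
# translated), which should be recognized as the forceps.
seen = [np.array(p) for p in [(0.5, 0.1, 1.2), (0.53, 0.1, 1.2), (0.5, 0.15, 1.2)]]
print(catalog.get(signature(seen), "unknown instrument"))
```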
- FIGS. 2, 3, and 4 While most of the embodiments described above (in FIGS. 2, 3, and 4 ) include a connector to couple an optically detectable structure to the surgical instrument 60 , it should be noted that the objects, shapes, and patterns in the above examples can generally be built integrally into the instrument 60 itself. The very shape of the instrument may be optically detectable and classified and tracked by the camera system 10 and other processing elements, as described above.
- the embodiments of FIGS. 1, 2, 3, and 4 have the advantage of providing optically coupled, non-mechanically coupled, wireless tracking.
- the marker objects of FIGS. 2, 3, and 4 can be made simple, economical, and lightweight, and can be sterilized or sterilely packaged.
- Each embodiment has practical advantages relative to the frame-based or mechanically-linked space pointers given as examples in the background section above.
- FIG. 5 illustrates the operative functionality of a system according to FIG. 1.
- the surgical instrument 60 has an optically detectable index structure 230 .
- a dynamic referencing head clamp 232 with index marks 234 , 236 , and 238 is present; the clamp 232 further includes an additional index marker 240 .
- a processor 242 and a computer 244 convert camera data from the camera system 10 for an image display 246 , which shows a representation of the position of the instrument 60 as a dashed line 248 relative to an anatomical structure 250 .
- a predetermined point on the instrument 60, such as a tip or a focal point, is indicated relative to the anatomical structure 250 as a point 252. Examples of such a coordinated display of probe orientation and image data are given in the OTS product of Radionics, Burlington, Mass.
- the processor 242 and the computer 244 are also capable of generating a separate representation 254 of the position of the instrument 60 .
- the separate representation 254 displays in a two- or three-dimensional form 256 the position of the instrument 60 in comparison to an anatomical rendering 258 , along with other optional representations of probe, anatomy, or target points such as a target point 260 .
- the separate representation 254 is reconstructed from two-dimensional or three-dimensional image data such as CT or MR scans taken of the patient previously or contemporaneously in a real-time image scanner during surgery or treatment.
- three-dimensional analysis of the position of the instrument 60 can be accomplished by the stereoscopic cameras 12 and 16, together with the processor 42 and the computer 44. This can be done based on LED or reflective infrared light processing, or alternatively based on direct visible-light video processing of information from the two cameras 12 and 16. It can be advantageous to provide the cameras 12 and 16 with infrared optical filters. If the optically detectable objects used in the system are infrared LEDs, or if the cameras have pulsed infrared light sources near them, then filtering will increase the signal-to-noise ratio of the tracking signal and reduce the effect of any ambient light background.
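To make the benefit concrete: with the background suppressed by filtering, each marker reduces to a bright blob whose sub-pixel image position can be found with an intensity-weighted centroid. The sketch below is a minimal illustration on a synthetic 8-bit frame; the threshold is an assumed value, and a real system would label and track multiple blobs.

```python
import numpy as np

def blob_centroid(frame, threshold=200):
    """Sub-pixel centroid of the brightest blob in an IR-filtered frame.

    With an IR bandpass filter matched to the source wavelength, the marker
    dominates the image, so a single global threshold suffices here.
    """
    mask = frame >= threshold
    if not mask.any():
        return None
    v, u = np.nonzero(mask)               # row (y) and column (x) indices
    w = frame[mask].astype(float)         # pixel intensities as weights
    return float(np.sum(u * w) / w.sum()), float(np.sum(v * w) / w.sum())

# Synthetic frame: a bright Gaussian spot centered near (100.5, 60.25).
y, x = np.mgrid[0:120, 0:160]
spot = 255.0 * np.exp(-((x - 100.5) ** 2 + (y - 60.25) ** 2) / 8.0)
print(blob_centroid(spot.astype(np.uint8)))  # ~ (100.5, 60.25)
```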
- a third camera 14 is provided (see also FIG. 1).
- the third camera 14 is preferably a standard video camera which views the surgical field.
- the processor 42 and the computer 44 further display the view from the third video camera 14 in an additional display 262 .
- a direct video view of the patient 264 is available.
- a view of the instrument 60 (seen as an instrument image 266 with an index marker image 268 ) is seen from actual video.
- a virtual extrapolation of the probe, shown as a dashed line 270 with a tip or target point 272, can be determined from the analysis shown in the alternative representation 254.
- this virtual extrapolation is overlaid directly onto the additional display 262 so that the reconstructed three-dimensional navigation image of the alternative representation 254 can be compared directly to an actual video image on the additional display 262.
- Correspondence and registration between a reconstructed image and an actual image in this way confirms the correctness of the probe orientation, and consequently the virtual position of unseen elements such as probe tip and probe position, for example in the depths of the surgical wound.
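Such an overlay amounts to projecting the tracked three-dimensional tip position through the video camera's calibrated pinhole model into pixel coordinates. A minimal sketch follows; the intrinsic matrix and pose are placeholder calibration values, not figures from the patent.

```python
import numpy as np

def project(K, R, t, X):
    """Project a 3-D point X (tracking coordinates) into video pixels.

    K is the video camera's intrinsic matrix; (R, t) map tracking
    coordinates into that camera's frame. Both come from calibration.
    """
    u, v, w = K @ (R @ X + t)
    return u / w, v / w

# Placeholder calibration for the visible-light video camera.
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.zeros(3)

tip = np.array([0.02, -0.01, 0.9])   # tracked (virtual) probe-tip position
print(project(K, R, t, tip))          # pixel at which to draw the overlay
```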
- a hybrid of reconstructed stereoscopic tracking by one set of cameras (e.g., the cameras 12 and 16) can thus be displayed and coordinated with respect to video imaging from another set of cameras (e.g., the video camera 14).
- All cameras may be of the visible video type, or some may be filtered infrared (or other spectral filtered types) used with others of the visible video type.
- in one embodiment, the cameras 12 and 16 used for tracking are infrared-filtered cameras, while the additional video camera 14 observes the visible spectrum. Accordingly, a comparison between the views provided by the separate cameras offers a useful quality assurance check of the integrity of the entire tracking system.
- in FIG. 6, another embodiment of the present invention involves a battery-powered optically detectable index structure 280 associated with an instrument 282.
- a camera system 284 comprises three cameras 286 , 288 , and 290 , which in the disclosed embodiment are linear infrared CCD cameras (see for example the IGT product, Boulder, Colo.). Data signals are processed by a processor 292 , and these can be sent to a computer system, as described above (see FIG. 1).
- the instrument 282 is shown generically; the optical index structure 280 comprises LED emitters 294 , 296 , and 298 which in a preferred embodiment are of an infrared-emitting type.
- the emitters 294 , 296 , and 298 define a plane of light which can be transformed to specify the position of the instrument 282 to which they are attached.
- the emitters 294 , 296 , and 298 are coupled to a circuit 300 which distributes energy to the LEDs for their illumination.
- the circuit 300 controls the sequence and synchronization of LED lighting.
- a battery 302 is provided to supply power to the circuit 300 and to the emitters 294 , 296 , and 298 .
- the LED emitters 294 , 296 , and 298 are flashed in a coded sequence controlled by the circuit 300 that is detectable by the processor 292 so as to recognize the instrument 282 and the index structure 280 .
- the pattern of positions of the emitters 294 , 296 , and 298 can be used to allow the processor 292 to discriminate what specific instrument 282 is being used.
- a coding scheme can be sent from a transmitter 304 to a receiver 306 coupled to the instrument 282 .
- the receiver 306 accepts light or radio wave signals from the transmitter 304 , which is connected to the processor 292 .
- a synchronization signal representative of the shutter operation from the cameras 286 , 288 , and 290 is sent via the transmitter 304 (as shown by a dashed line 308 ) to the receiver 306 .
- the receiver 306 and the circuit 300 then cause the sequential flashing of the emitters 294 , 296 , and 298 detected by the cameras.
- An optional return signal (represented by a dashed line 310 ) from the receiver 306 to the transmitter 304 can be used to confirm the synchronization of the emitters to the cameras.
- a patient 312 may be in the surgical field with attached optically detectable index elements 314 , 316 , and 318 , plus others as described above.
- These light emitters may also be battery powered or wire powered from either batteries or another source.
- the LED emitters 294 , 296 , and 298 do not consume much power if they are flashed intermittently, and thus the battery 302 comprises a standard type of battery, such as one that might be used to operate a flashlight, camera, or other small appliance. Such batteries can easily be replaced or sterilized at the time of surgery.
- the use of batteries in a surgical instrument is advantageous in that the system is wireless and mechanically de-coupled from the camera system and its processor.
- light sources may be used near to the cameras to produce reflected light from reflecting optically-detectable objects.
- the optically detectable objects can alternatively have bright, colored, or shiny surfaces or have contrasting patterns of light and dark or alternately colored shapes and patterns to be detectable by cameras in ambient light.
- the ambient light By arranging the ambient light to shine appropriately on a surgical, diagnostic, or therapeutic setting, objects can be recognized directly by the camera system 10 as shown in FIG. 1.
- the use of additional lights near the cameras can enhance the reflection from optically detectable objects in certain clinical settings where ambient light may not be sufficient, or where high degrees of light contrast, such as from surgical head holders, microscope lights, or operating theatre lights may cause difficulty in discriminating light levels from the detectable objects.
- various illumination possibilities can easily be devised in accordance with the present invention to facilitate detection and data processing of the camera and video information to suit the clinical context.
- a camera and light reflection processing function specifies that the camera system 10 (FIG. 1) detects an instrument with an optically detectable object attached to it. This is done with a camera system as described above, wherein camera data from infrared filtered cameras of various kinds and/or video cameras is provided to a pattern data processing function (block 322 ).
- the pattern data processing function 322 receives data from the camera and light reflection processing function 320 , allowing the instrument is recognized by pattern recognition algorithms operating on stereoscopic data received from the camera system 10 .
- the nature of the instrument can also be recognized by way of time or geometric sequencing or arrangements of light-emitting or light reflecting objects or patterns on the instrument, as described above.
- a visible video camera may also be used in combination with filtered or unfiltered cameras as a confirmational step.
- a video processing function (block 324 ) is provided to carry out the reception and processing of such visible video data.
- the output data from such video processing is sent to a computer processing function (block 326 ), along with the instrument tracking data from the pattern data processing function 322 .
- the computer processing function 326 may also accept image scan data which is either taken prior to operation or contemporaneously during the operation, as illustrated by an image data processing function (block 328 ).
- the computer processing function 326 can then combine or merge a combination of pattern recognition and tracking data (from the pattern data processing function 322 ), the visible video data (from the video processing function 324 ), and image scan data (from the image data processing function 328 ), so as to display it in various forms via a display function (block 330 ).
- a useful quality assurance check would be, for example, to overlay visible video data onto the combined representations of the image scan data and of the surgical instrument as it moves relative to the anatomy.
- the video data shows in real time the position of an instrument relative to the anatomy, or the relative position of instruments relative to each other, within the field of surgical view.
- a rendering of the reconstructed position of a surgical instrument relative to the overlaid anatomy, or compared side-by-side to the actual visible video view of the instrument relative to the anatomy is a strong confirmational step to show that the tracking is being done properly.
- FIG. 7 can apply to camera and video detection in the surgical setting, a diagnostic suite, or in connection with treatment planning process and instrumentation.
- a real time diagnostic or intraoperative imaging machine such as a CT, MR, PET, X-ray, or other scanner would be another context for the process in FIG. 7.
- a patient registration data processing function (block 332 ), which represents the step of registering or calibrating instrumentation or apparatus relative to a patient, prior to performing a procedure with the tracked instrument.
- the registration step may be predetermined or determined during the clinical setting in a variety of ways, as described above.
- a set of multiple camera images (stereoscopic images for the case of two or more two-dimensional cameras) is acquired (step 340 ) from the camera system 10 (FIG. 1). Any markers present in the stereoscopic images are then detected (step 342 ) as described above. For example, when two two-dimensional CCD cameras are used, there are two frames in a set of stereoscopic images, namely a left frame (from the left camera 16 ) and a right frame (from the right camera 12 ).
- the detected markers will appear in slightly different positions in the two frames, so the positions are then correlated (step 344 ).
- the difference in a marker's position between the two frames is used to determine depth (i.e., distance from the camera system 10 ) in three dimensions.
- depth i.e., distance from the camera system 10
- more than two cameras may be used in the present invention; the additional cameras can be used to verify the stereoscopic images or to provide further accuracy or definition.
- the images are further processed to determine the positions of the markers in three-dimensional space by transforming the markers (step 346 ) into a coordinate system defined by the camera system 10 .
- this step is performed in varying ways depending on the nature of the markers in the field of view. For example, a spherical marker will define a centroid, while a rod-shaped or flat marker will define an axis. Accordingly, the unique set of centroids, axes, and other characteristics in the coordinate system of the cameras can be used to identify the position of the object being tracked (step 348 ). This information is used in the operation of the system as described below.
- FIG. 9 illustrates, in one exemplary embodiment, how the various objects are tracked by the system to generate one or more displays, as described above.
- the location of the surgical instrument 60 (FIG. 1) is identified with respect to the camera system 10 , as described in conjunction with FIG. 8. A set of coordinates is generated thereby. Those coordinates specify the position of the instrument 60 , and further specify a transformation between the coordinate system of the camera system 10 and a coordinate system associated with the instrument. This may involve, for example, index point registrations from the patient's physical anatomy to image scan data, as described previously.
- the location of the patient 22 is identified (step 352 ) with respect to the camera system 10 . Again, the coordinates specify the position of the patient 22 and a coordinate transformation between the camera system and the patient.
- the location of the surgeon 76 is identified (step 354 ), as above.
- a desired view is selected (step 356 ) by the surgeon or other operator.
- a “surgeon's eye” view is possible by transforming the instrument position and the patient position into the surgeon's coordinate system.
- An “instrument's eye” view is possible by transforming the patient position into the instrument's coordinate system.
- a patient-centered system is possible by transforming the instrument position into the patient's coordinate system.
- the desired transformations of the instrument position (step 358 ) and the patient position (step 360 ) are then performed.
- a display is generated (step 362 ) based on the transformed positions (see FIG. 5).
- the display can comprise only a reproduction of the instrument in relation to a reproduction of the patient's anatomical structures (for example, based on reconstructions from image scan data from CT, MR, or other types of scans), or can include an overlaid video view from a video camera 14 on the camera system 10 or a video camera 82 on the surgeon 76 .
- the patient's anatomical data can be manipulated in various ways well known in the art to provide slice, cutaway, or contour views, among others.
- further coordinate transformations can optionally be provided to allow operator control over the views on the display, for example to slightly displace a view from a true “instrument's eye” view.
- Steps 350 - 362 are repeated as necessary to update the display with the various object positions in real time or close to real time.
Abstract
A camera system, in combination with data processors, image scan data, computers, and an associated graphic display, provides tracking of instruments, objects, patients, and apparatus in a surgical, diagnostic, or treatment setting. Optically detectable objects are connected to instrumentation, a patient, or a clinician so that their positions in space can be tracked by optical detection systems and methods. The recognition of instruments by patterns of optically detectable structures provides data on three-dimensional position, orientation, and instrument type. Passive or active optical detection is possible via various light sources, reflectors, and pattern structures applicable in various clinical contexts.
Description
- This is a continuation of application Ser. No. 09/014,840, filed on Jan. 28, 1998, which is a continuation-in-part of application Ser. No. 08/475,681, filed on Jun. 7, 1995, U.S. Pat. No. 6,006,126, which is a continuation-in-part of application Ser. No. 08/441,788, filed on May 16, 1995, U.S. Pat. No. 5,662,111, which is a continuation of application Ser. No. 08/299,987, filed on Sep. 1, 1994, now abandoned, which is a continuation of application Ser. No. 08/047,879, filed on Apr. 15, 1993, now abandoned, which is a continuation of application Ser. No. 07/941,863, filed on Sep. 8, 1992, now abandoned, which is a continuation of application Ser. No. 07/647,463, filed on Jan. 28, 1991, now abandoned.
- The invention relates generally to medical equipment used in the surgical treatment of disease, and more particularly to a system and method for medical instrument navigation by optically tracking the positions of instruments used during surgery or other treatments in relation to a patient's anatomy.
- Image guided stereotaxy is widely used in the field of neurosurgery. It involves the quantitative determination of anatomical positions based on scan data taken from CT, MRI, or other scanning procedures to obtain three-dimensional scan data. Typically, the image scan data is placed in a computer to provide a three-dimensional database that may be variously used to provide graphic information. Such information is useful in surgical procedures and enables viewing of a patient's anatomy in a graphics display.
- The use of image guided stereotactic head frames is commonplace. For example, see U.S. Pat. No. 4,608,977 issued Sep. 2, 1986 and entitled, System Using Computed Tomography as for Selective Body Treatment. Such structures employ a head fixation device typically with some form of indexing to acquire referenced data representative of scan slices through the head. The scan data so acquired is quantified relative to the head frame to identify individual slices. A probe or surgical instrument may then be directed to an anatomical feature in the head by mechanical connection to the head frame based on scan data representations. Three-dimensional scan data has been employed to relate positions in a patient's anatomy to other structures so as to provide a composite graphics display. For example, a mechanically linked space pointer (analogous to a pencil) attached to the end of an encoded mechanical linkage might be directed at a patient's anatomy and its position quantified relative to the stereotactic scan data. The space pointer might be oriented to point at an anatomical target and so displayed using computer graphics techniques. Such apparatus has been proposed, using an articulated space pointer with a mechanical linkage. In that regard, see an article entitled “An Articulated Neurosurgical Navigational System Using MRI and CT Images,” IEEE Transactions on Biomedical Engineering, Volume 35, No. 2, February 1988 (Kosugi, et al.) incorporated by reference herein.
- The above-described systems have at least two disadvantages of note. First, the head frame and the articulated space pointer are mechanically connected to an apparatus used to measure and calculate the position of the probe or pointer. Consequently, although a relatively high number of degrees of freedom can be provided to the pointer (or other tool coupled to the pointer), the mechanical linkage may still restrict the possible ranges of motion available to the clinician. Furthermore, the linkages may be large and obtrusive, and can be difficult to sterilize.
- Second, although the apparatus tracks the position of the space pointer in relation to the patient's anatomy, the clinician is still free to move about the patient and operate from any desired position. This is not reflected by the data produced by the device. Accordingly, although a “pointer's eye” view of the surgical field can be provided, if the clinician is operating from any of various other angles, then any graphical representation of the surgical field may be disorienting, confusing, or not representative of the “surgeon's eye” view. Although the system's point-of-view might be selected and altered manually, this is not an optimum solution, as it requires additional steps to be taken by the clinician or an assistant.
- In light of the above considerations, the need for relating external treatment apparatus or surgical viewing directions to a specific target arises in several aspects. For example, the need arises in relation to the treatment of internal anatomical targets, specifically to position and maintain such targets with respect to a surgical instrument such as a probe, a microscope with a specific direction and orientation of view, or an X-ray treatment beam associated with a large external apparatus. Thus, a need exists for methods for aligning a surgical instrument, probe, or beam not attached by any mechanical linkage, to impact specific anatomical targets via a path selected to avoid injury to other critical anatomical structures. A further need exists for the capability to show the operating clinician a view of the patient's anatomy and the surgical tool from a perspective that is natural to the clinician, and not disorienting or confusing. Further, there is a need for an economic, compact, and wireless system and method to track instruments in clinical applications.
- Generally, in accordance herewith, an optical camera apparatus functions in cooperation with a computer system and a specially configured surgical instrument. In an embodiment of the invention, the camera system is positioned to detect a clinical field of view and to detect index markers on a surgical instrument, a patient, and/or a surgeon. The markers are tracked by the camera apparatus. The image scan data (such as from a CT or MR scan of the patient's anatomy) and data specifying the position of the instrument and the surgeon are transformed relative to the patient's anatomy and the camera coordinate system, thereby aligning the scan data, patient position and orientation data, instrument position and orientation data, and surgeon position and orientation data for selectable simultaneous viewing on a computer display.
- Various exemplary embodiments are given of the use of lines, arrays of points, geometric patterns and figures, lines of light, and other optically detectable marker configurations to identify the position and orientation of a surgical instrument, a patient, and a surgeon. The disclosed embodiments have the advantage of being wireless and optically coupled to the camera tracking system. Moreover, they can be relatively economical and lightweight in comparison to the mechanically coupled tracking devices described in the background section above. Once the positions of the instrument, patient, and surgeon have been determined with respect to a common coordinate system, a simulated view of the instrument and the patient can be provided on a display device in a manner that is comfortable and convenient to the surgeon. In an embodiment of the invention, the simulated view is overlaid with an actual live video display to further orient the surgeon.
- In the drawings, which constitute a part of this specification, embodiments are exhibited in various forms, and are set forth specifically:
- FIG. 1 schematically illustrates a system for optically tracking instruments and other objects in a surgical field in accordance with the present invention;
- FIG. 2, which includes FIGS. 2A, 2B, 2C, 2D, 2E, 2F, 2G, and 2H, illustrates various configurations of optically detectable geometric objects and patterns associated with objects to be tracked in accordance with the system of FIG. 1;
- FIG. 3, which includes FIGS. 3A, 3B, 3C, 3D, 3E, and 3F, illustrates various optically detectable objects attached to instruments in accordance with the present invention;
- FIG. 4, which includes FIGS. 4A, 4B, and 4C, illustrates additional alternative embodiments of optically detectable objects in accordance with the present invention;
- FIG. 5 schematically shows several combinations of graphics, video, and reconstructed representations derived from optical tracking of a surgical field;
- FIG. 6 schematically shows a battery-powered optically tracked instrument for use in accordance with the present invention;
- FIG. 7 illustrates the functions performed in the combined processing of tracking, video, and/or image data for a display in accordance with the present invention;
- FIG. 8 is a flowchart showing the sequence of steps performed in tracking an optically detectable object; and
- FIG. 9 is a flowchart illustrating the sequence of steps performed in generating a display when a surgical instrument, a patient, and a surgeon are all tracked by a system in accordance with the invention.
- Referring initially to FIG. 1, an embodiment of a system according to the invention is shown schematically as including a camera system 10 that has a field of view that includes multiple elements. The elements can include a surgical field for surgical applications or a treatment field for therapy applications. Part of the patient's body 22 may or may not be in the camera field. Mounted to the patient within the camera field are several optically detectable objects, such as markers (the marker 24, for example) and identifiers (such as the identifiers 30 and 36), the latter carried on a head clamp structure 38 that is rigidly connected to the patient's body 22.
- The markers and identifiers may take any of the various optically detectable forms described below.
- The identifier 36 may include a reflective surface of triangular shape, for example, that is detectable in spatial position and orientation by the camera system 10. In this way, the patient's position and orientation can be detected with respect to the coordinate system of the camera system 10; this procedure will be discussed in further detail below.
- The camera system 10 comprises one or more cameras, each of which can be selected from optical cameras of various known types. In FIG. 1, three cameras are shown as part of the camera system 10. In the disclosed embodiment, a right-mounted camera 12 and a left-mounted camera 16 are capable of resolving two-dimensional images. The dashed lines 40 illustrate the field of view of the right-mounted camera 12; the left-mounted camera 16 has a similar (but displaced) field of view. The cameras provide optical camera data, related to optically detectable objects in the common field of view of the cameras included in the camera system 10, to a processor 42. For example, for the multiple-camera system 10 including the cameras 12, 14, and 16, the camera data from each camera is sent to the processor 42. Thus, in accordance with the invention, the positions and orientations of objects within the camera system field of view can be determined rapidly by the processor 42 and sent to a computer 44. As will be discussed in further detail below, the computer 44 has software to represent the positions and orientations of those objects in camera coordinates and display the objects in various representations on a display means 46 as desired by the clinician.
- Considering now the structure of the camera system 10, a lateral support 18 for the cameras 12 and 16 is attached by a coupler 20 to a rigid reference R, such as the ceiling, wall, or floor of a room. Also shown in FIG. 1 are light sources near the cameras; these can send light, as represented by a dashed line 54, to be reflected off of a reflective optically detectable object such as the marker 24 on the patient's body 22. Reflected light then returns along a path such as that represented by a dashed line 56, and is detected by the camera 12.
- If the marker 24 and other markers and identifiers in the field include reflective surfaces, points, lines, or regions, then these structures can be represented as camera data in a three-dimensional coordinate system fixed with respect to the camera system 10. For example, in one embodiment of the invention, the light sources near the cameras emit infrared light, and the markers and identifiers carry infrared-reflective surfaces so that they return strong, easily discriminated signals to the cameras.
- Alternatively, ambient lighting conditions can be used to enable the cameras to detect the markers and identifiers directly. If the marker 24, for example, is a brightly colored (white, green, red, etc.) disc, sphere, or other shape that stands out in contrast to whatever is visible in the background, then the marker's position can be detected by the cameras. For example, if the identifier 30 is bright white, and the surface of the head clamp structure 38 is dark or black, then the identifier 30 can be discriminated by the camera system 10.
- As stated above, one or more cameras may be used in the camera system 10. As is well known in the art, two or more cameras will yield stereoscopic data on objects in the clinical field of view in relation to the camera frame of reference, i.e., in camera coordinates.
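- By way of illustration, the stereoscopic determination of a marker's position reduces, for each detected marker, to intersecting two viewing rays in the camera coordinate system. The sketch below shows one standard approach (the midpoint method); the camera positions, ray directions, and millimeter units are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def triangulate_midpoint(origin_left, dir_left, origin_right, dir_right):
    """Return the 3D point closest to two viewing rays (midpoint method).

    Each ray is given by a camera center and a direction expressed in the
    common (camera-system) coordinate frame.
    """
    d1 = dir_left / np.linalg.norm(dir_left)
    d2 = dir_right / np.linalg.norm(dir_right)
    w0 = origin_left - origin_right
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 when the rays are nearly parallel
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; marker depth is ill-defined")
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = origin_left + t1 * d1     # closest point on the left ray
    p2 = origin_right + t2 * d2    # closest point on the right ray
    return 0.5 * (p1 + p2)         # midpoint of the shortest segment

# Example: two cameras 200 mm apart, both sighting a marker ~1 m away.
point = triangulate_midpoint(np.array([-100.0, 0.0, 0.0]), np.array([0.1, 0.0, 1.0]),
                             np.array([100.0, 0.0, 0.0]), np.array([-0.1, 0.0, 1.0]))
print(point)  # -> approximately [0, 0, 1000]
```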
- In an alternative embodiment of the invention, some or all of the optically detectable identifiers (such as the identifiers 30 and 36) comprise light-emitting elements, for example light-emitting diodes (LEDs), whose light is detected directly by the cameras. Such elements can be lit continuously or flashed in sequence so that the camera system 10 can distinguish them individually.
- Also shown in FIG. 1 is a surgical instrument 60. The instrument can be of any known surgical type, including but not limited to probes, cutting devices, suction tubes, endoscopes, electronic probes, and other tools. Attached to the instrument 60 is at least one optically detectable element 62, which can comprise various geometric structures that are detectable and recognizable by the cameras. In the disclosed embodiment, a rod indicator 64 is shown in a fixed relationship with a spherical indicator 66.
- As discussed above, these indicators 64 and 66 can be detected by the cameras, and the position and orientation of the element 62 can then be calculated using the camera data processor 42 and the computer 44. The orientation and position of the instrument 60 can thereby be determined. A calibration or pre-fixed position of the element 62 with respect to the instrument 60 may be performed before surgery or intraoperatively (see, for example, several of the products of Radionics, Burlington, Mass.). As with the other markers and indicators, if the indicators 64 and 66 are reflective, they can be illuminated and tracked by the camera system 10.
- In addition, light-detectable indicators can be attached to a surgeon 76. In the disclosed embodiment, the indicators are mounted on a headband 78 worn by the surgeon 76. This optically detectable array can then be tracked by the camera system 10 along with the patient's body 22 and the instrument 60. The camera data processed in the processor 42 and assimilated in the computer 44 can thereby track, in three-dimensional space relative to the camera system 10, the positions of all of these elements and their relative orientations. Thus, for example, when the indicators are LEDs, a line from the processor 42 can be connected to the surgeon's headband 78 (dashed line) to synchronize the indicators' signals.
- By tracking the surgeon via the headband 78, image data can be provided to the surgeon 76 via an optical headset 80 worn by the surgeon. For example, in the disclosed embodiment, the optical headset 80 is a binocular magnifier with built-in image-splitting elements. Graphic data from the processor 42, originating from image scan data 48 pre-scanned from the patient, can be sent into the viewing elements of the headset 80 to update the surgeon 76 with location data correlated to the surgeon's viewing position. For example, from the surgeon's eye view, as represented by the position defined by the indicators on the headband 78, a reconstructed image from the computer 44 can be displayed via the headset 80, thereby permitting the surgeon to see a "reconstructed" view from the direction of his physical perspective. The computer 44 can assimilate historic image data 48, convert it to reconstructed planar images, and send that information to a display element 46, from which it can thereafter be "piped" or transmitted to the headset 80 for the surgeon's use.
- Alternatively, the headset 80 can comprise at least one video camera 82 capable of viewing the surgical field from the surgeon's direction. Information from the video camera 82 can be sent (via the dashed line) to the processor 42 and the computer 44 and on to the display 46. Once again, that information can then be reconstructed and displayed via a split-screen prism in the surgeon's field of view via his headset 80. The surgeon's view information can be oriented in a suitable direction by the tracking of the indicators on the headband 78 by the camera system 10, as discussed above. Thus, the video information displayed in the headset 80 can be rendered in stereotactic camera coordinates.
- The processor 42, in one embodiment of the invention, is a dedicated processor for electronic data from the camera system 10. The processor 42 is also capable of synchronously controlling the light emitters associated with the patient 22, the head holder structure 38, the instrument 60, or the surgeon 76. Data from the processor 42 is sent to the computer 44, where it is then analyzed in three-dimensional camera-based coordinates. Image data 48 can reside in the memory of the computer 44 or be otherwise transferred to the computer 44, for example by optical disk, magnetic tape, etc. The visualization of camera data and image scan data (CT, MR, PET, ultrasound, etc.) is accomplished via the display 46, which in various embodiments can be a CRT, liquid crystal display, heads-up display, or other display device.
- The visual image presented by the display 46 represents the position of the instrument 60 in terms of orientation, tip position, and other characteristics with respect to the image scan data 48 in a variety of ways. For examples, see the documentation for the OTS product of Radionics, Burlington, Mass. Specifically, cataloging slices, probe views, in-probe reconstructions, three-dimensional wedge views, and other views of the instrument 60 relative to the patient 22 can be represented on the display 46. Also, the surgeon's view, via registration of the visual headset 80 (by identifying the indicators on the headband 78), can be shown on the display 46. Although the instrument 60 is schematically shown as a pointed instrument in FIG. 1, it should be noted that an instrument 60 for use with the present invention can be nearly any surgical instrument or device, such as a microscope, an endoscope, a cutting instrument, an ultrasonic imaging probe, or a treatment device such as an X-ray collimation device for a linear accelerator (LINAC). There are many other possibilities as well.
- The objects in the field of view of the camera system 10 can be tracked in the three-dimensional coordinate space of the camera system 10. The instrument 60 can be calibrated relative to the patient 22 in a variety of ways (see the OTS Tracking System of Radionics, Burlington, Mass. for examples). In one embodiment of the invention, during a calibration procedure, the instrument 60 is touched to a plurality of fiducial markers placed on the patient 22 (for example, the marker 24 and its counterparts), whose positions are also known in the image scan data 48; in other embodiments, the structure 38 is connected to or associated with an imaging apparatus, and so on. As stated, the processor 42 (or the computer 44) uses such data in a calibration step so that the instrument 60 is in a known position and orientation relative to the patient 22 or the structure 38 affixed to the patient 22, or even with respect to apparatus elsewhere in the room, such as a linear accelerator, an image scanner, or an apparatus on a surgeon (the headband 78, for example).
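- The calibration step described above amounts to a rigid point-set registration between patient space and image space. The sketch below shows one well-known way to compute such a registration (a least-squares fit via singular value decomposition); it is an illustrative example under assumed paired fiducial coordinates, not the algorithm of any product named above.

```python
import numpy as np

def register_rigid(points_patient, points_image):
    """Least-squares rigid transform (R, t) mapping patient-space fiducial
    positions onto their image-scan counterparts (Kabsch/SVD method)."""
    P = np.asarray(points_patient, dtype=float)   # N x 3, touched by the probe
    Q = np.asarray(points_image, dtype=float)     # N x 3, from the scan data
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical example: four fiducials measured in both coordinate systems.
patient = [[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]]
image = [[10, 20, 30], [10, 20, 130], [10, 120, 30], [-90, 20, 30]]
R, t = register_rigid(patient, image)
print(np.round(R @ np.array([50.0, 50.0, 0.0]) + t, 1))  # a new probe point in scan coords
```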
- Referring now to FIG. 2, various embodiments of patterns, shapes, and objects are shown for the optically detectable elements that can be used on, for example, the instrument 60 (FIG. 1), the patient 22, the surgeon 76, a microscope, or another surgical device (not shown). In FIG. 2A, the surgical instrument 60 is rendered schematically. Although the instrument 60 is depicted in the embodiment set forth in FIG. 2, it should be noted that similar or identical configurations can be used on the patient 22, the structure 38, the surgeon 76, or any other implement to be tracked. In the disclosed embodiment, the instrument 60 has a surgical axis (dashed line 84) and a focal point, end point, isocenter, or other characteristic point 86. It can have other independent axes, such as those illustrated by the dashed lines 85 and 87. A geometric object 88, specifically a triangle, is attached to the instrument 60 by a connector 90. In the illustrated embodiment, the connector 90 is a rigid coupling and is in a predetermined relationship with the instrument 60; alternatively, it could be in an arbitrary relationship with the instrument 60 and subject to calibration. The geometric object 88 bears a bright portion 92 (the hatched area) on its surface. The bright portion 92 of the surface of the geometric object 88 may comprise reflective paint, reflective film, a brightly colored surface in a particular color spectrum, or an illuminated field. The camera system 10 is represented here only schematically, but could comprise the elements described in FIG. 1, including cameras, light sources, a processor, a computer, image data, and a display, among other items. Further, it should be noted that although the geometric object 88 and its bright portion 92 are specifically described and shown as triangular in configuration, many other shapes are possible and equally operative in the context of the invention, which is not so limited.
- The position and orientation of the instrument 60 can be determined by tracking the position and orientation of the geometric object 88. In various forms, the instrument 60 may be a rigid body of complex shape. Its position, for example, may be characterized by axes such as 84, 85, and 87, and its orientation around the axis 84 may be characterized by a rotation angle indicated by an arrow 83. By calibrating the geometric object 88 to the instrument 60, this rotation angle 83 and the position and orientation of the axes 84, 85, and 87 can be tracked by the camera system 10. This can be done by rigid body transformations, which are well known to those skilled in matrix mathematics. Thus, for example, if the instrument 60 is an endoscope or a microscope for which the axis 84 represents a viewing direction and the characteristic point 86 is a point desired to be viewed in the surgical field, then tracking the geometric object 88 will provide position and orientation tracking of the endoscopic or microscopic field of view.
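- By way of illustration, once the tracked pose of the geometric object 88 is known in camera coordinates, applying the calibrated rigid-body transformation is a short computation. In the sketch below, the tip offset and axis direction are hypothetical calibration values, not data from the disclosure.

```python
import numpy as np

# Pre-calibrated geometry of the instrument, expressed in the coordinate
# frame of its attached index object (illustrative values only).
TIP_IN_OBJECT = np.array([0.0, 0.0, -150.0])   # characteristic point, e.g. a tip
AXIS_IN_OBJECT = np.array([0.0, 0.0, -1.0])    # surgical axis as a unit vector

def instrument_pose_in_camera(R_obj, t_obj):
    """Map the calibrated tip and axis into camera coordinates, given the
    tracked pose (rotation R_obj, translation t_obj) of the index object."""
    tip = R_obj @ TIP_IN_OBJECT + t_obj    # points transform with R and t
    axis = R_obj @ AXIS_IN_OBJECT          # directions transform with R only
    return tip, axis

# Example: the object rotated 90 degrees about the camera x-axis, 0.5 m away.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])
tip, axis = instrument_pose_in_camera(R, np.array([0.0, 0.0, 500.0]))
print(tip, axis)
```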
- Detecting the edges of the bright portion 92 in the three-dimensional coordinate system relative to the camera system 10 enables the direction and orientation of the geometric object 88 to be determined. By calibrating or precalibrating the orientation of the geometric object 88 relative to the instrument 60, specifically its axis 84 and characteristic point 86 (and other axes such as the axes 85 and 87, if desired), tracking of the instrument 60 can be accomplished (see, for example, the OTS Optical Tracking System of Radionics, Burlington, Mass.). The camera system 10, the processor 42, and the computer 44 (FIG. 1) are adapted to detect edges, such as a line 94 between the bright portion 92 and the remainder of the geometric object 88, as well as the other respective edges of the triangle or geometric shape. This may be accomplished by differential detection of the shaded area of the triangle versus the perimeter band, which may not be of reflective, brightly colored, or illuminating optically detectable material. Edge detection of geometric shapes can be done by well-known segmentation or detection algorithms in the processor 42 or the computer 44. Three non-collinear points define a plane; additional data can be used to define position and orientation within the plane.
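- The observation that three non-collinear points define a plane can be made concrete with a vector cross product, as in the following sketch; the corner coordinates are illustrative.

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Unit normal and centroid of the plane through three non-collinear
    3D points, e.g. the detected corners of a triangular index object."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)        # perpendicular to both edges
    norm = np.linalg.norm(normal)
    if norm < 1e-9:
        raise ValueError("points are collinear; no unique plane")
    return normal / norm, (p1 + p2 + p3) / 3.0

# Example: triangle corners reconstructed in camera coordinates (mm).
n, center = plane_from_points([0, 0, 1000], [60, 0, 1000], [0, 60, 1010])
print(n, center)   # the normal encodes the object's orientation
```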
- Referring now to FIG. 2B, another type of index structure is shown. The index structure comprises a four-sided geometric shape 96 having a shaded band 98 which may be of reflective or bright material. Inside is a relatively dark area 100 which may be of non-reflective material. Alternatively, the roles of the shaded band 98 and the dark area 100 could be reversed. The camera system 10 detects this object and the linear edges of the band 98 or the dark area 100. This establishes the position and orientation of the shape 96. As with the other index structures disclosed herein, the shape 96 is attached by a connector 102 to the instrument 60.
- Such a shape 96 could be easily made. The differentially reflective areas (i.e., the shaded band 98 and the dark area 100) can be sprayed on, etched, or deposited by a masking process; any of these procedures would be inexpensive and lead to very sharp linear borders between the two regions. These borders can then be detected by the camera system 10 via linear discrimination algorithms in the processor 42 and the computer 44 (FIG. 1). If the shape 96 is a parallelogram or a square, the orientation of the plane of the shape 96 can easily be determined by vector cross-product calculations on the linear positions of the borders in three-dimensional space, together with the edges of the object. As with all the examples in FIG. 2, the connector 102 is optional; if the shape 96 is integrally part of the tool or instrument 60, viz. part of its handle, then an explicit connector 102 would not be needed.
- Referring to FIG. 2C, the instrument 60 has attached to it an optically detectable shape 104 in the form of a solid or a plate. On it are various geometric patterns 106, 108, and 110 on the surface of the shape 104. The patterns can be circles, domes, spheres, or ellipsoids which are detectable by the camera system 10. The shape 104 may be flat or curved, according to needs. In an embodiment of the invention, one of the patterns, e.g., the pattern 110, has a more linear structure which is distinguishable by the camera system 10 from curvilinear shapes such as the patterns 106 and 108. In this embodiment, the pattern 108 has an annular shape with a hole 112 in the middle to distinguish it from the dot-shaped pattern 106. The combination can uniquely identify and locate the shape 104, and therefore the instrument 60, in its orientation and position. The various patterns can also serve to identify the type of the instrument 60. For example, the tool could be a special forceps, and the shape 104, with its distinguishing optical characteristics, could be known to the camera system 10 and its associated computer system 44 to be a particular type of forceps. Similarly, other specific tools can have different optically detectable signature structures.
- Referring to FIG. 2D, a flat detectable shape 114 is shown. The shape 114 has orthogonal bar patterns on its surface. These patterns are recognizable and distinguishable by detecting the borders, such as a line 120 between the patterns. The camera system 10 can easily scan such a geometric linear pattern, distinguishing the linear bar patterns and thereby determining their orientation. In that way, the orientation of the shape 114 and its position in space can be determined in the coordinates of the camera system 10. A fixed relationship between the instrument 60 and the shape 114 via a connector 122 can then be used to identify the position and orientation of the instrument 60 in all of its movements within the field of view of the camera system 10.
- FIG. 2E shows yet another embodiment of the present invention, with a linear rod 124 and a spherical object 126 coupled together. For instance, a reflective surface 128 on the rod 124 (shaded in the drawing) could be taped or painted onto the rod 124. On the end of the rod, the spherical object 126, bearing reflective tape or paint, is, in the disclosed embodiment, coaxial with the painted surface 128 of the rod 124. The camera system 10 is capable of recognizing the linear form of the rod 124 and the center of the spherical object 126. Accordingly, a detection algorithm in the computer 44 (FIG. 1) can determine the linear configuration and central axis of the rod 124, and the centroid point of the spherical object 126, thereby determining a vector direction along the axis of the rod 124 and a uniquely identified endpoint at the spherical object 126. The rod 124 and the spherical object 126 are joined by a connector to the instrument 60, thereby specifying the position and orientation of the instrument 60 with respect to the camera system 10.
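- As an illustration of the rod-and-sphere analysis, the sketch below fits an axis to points sampled along a rod and takes the centroid of the sphere's surface points as the endpoint. The point data and the line-fitting method (a principal-component fit) are assumptions chosen for the example, not details from the disclosure.

```python
import numpy as np

def rod_axis_and_tip(rod_points, sphere_points):
    """Fit a direction vector to 3D points sampled along a reflective rod
    (principal-component line fit) and take the centroid of points on the
    reflective sphere as a uniquely identified endpoint."""
    rod = np.asarray(rod_points, dtype=float)
    centered = rod - rod.mean(axis=0)
    # The dominant right-singular vector is the least-squares rod axis.
    _, _, Vt = np.linalg.svd(centered)
    axis = Vt[0] / np.linalg.norm(Vt[0])
    tip = np.asarray(sphere_points, dtype=float).mean(axis=0)  # sphere centroid
    # Orient the axis so that it points from the rod toward the sphere end.
    if axis @ (tip - rod.mean(axis=0)) < 0:
        axis = -axis
    return axis, tip

# Example: noisy points along a rod, plus surface points on the end sphere.
rod_pts = [[0, 0, 0], [10, 0.1, 0], [20, -0.1, 0], [30, 0, 0.1]]
sphere_pts = [[40, 5, 0], [40, -5, 0], [45, 0, 0], [35, 0, 0], [40, 0, 5], [40, 0, -5]]
axis, tip = rod_axis_and_tip(rod_pts, sphere_pts)
print(axis, tip)   # axis ~ [1, 0, 0], tip ~ [40, 0, 0]
```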
- Referring to FIG. 2F, another example of the present invention comprises a longitudinal rod 130 with a reflective linear surface 132 (shaded) and an orthogonal rod 134 with two reflective segments 136 and 138 (shaded). These linear structures again are detectable by the camera system 10, thereby determining the orientation of the plane defined by the longitudinal rod 130 and the orthogonal rod 134. As described above, this information is then used to determine the orientation and movement of the instrument 60, which is coupled to the rods by a connector 139, in three-dimensional space.
- FIG. 2G shows yet another example of rod-like structures, here arranged in a triangle 140. The shaded linear segments lie along the sides of the triangle 140 and define the plane and orientation of the triangle 140. The triangle 140 is attached to the instrument 60 by a connector 148, and the instrument is tracked as described above.
- Referring to FIG. 2H, a similar V-shaped structure 150, comprising identifiable leg segments 152 and 154 (shaded), provides a similar position and orientation vector, analogous to the previous examples.
- FIG. 3 presents several further embodiments of the present invention that are useful in certain applications. In FIG. 3A, a plate 160 or similar structure has detectable areas 162, 164, and 166 (shaded). A connector 168 couples the plate 160 to the instrument 60. In one embodiment of the invention, the plate 160, with its identifiable multiple areas, is a disposable sterile-packed device which can be detachably coupled to the connector 168. The detectable areas 162, 164, and 166 can be placed on the plate 160 in particular positions that are recognizable and indexed by the camera system 10 in conjunction with the processor 42 and the computer 44 (FIG. 1). The concept of a disposable, single-use, sterile-packed, optically detected index marker such as that shown in FIG. 3A has several advantages over non-disposable, more expensive devices. The plate 160 can be coupled to the connector 168 in a pre-calibrated or a non-precalibrated orientation. If calibrated, it will have a known relationship to the instrument 60 and any focal points, features, or directions thereof. If non-precalibrated, the plate 160 could simply be "stuck" onto the connector 168 and used in an intraoperative calibration procedure to determine translations, rotations, and other transformations of the plate 160 and the instrument 60 prior to defining the movement and relative orientation of the instrument 60. The process of intraoperatively calibrating positions, directions, and orientations of the instrument 60 is facilitated by an intraoperative calibration holder (not shown; see the products of Radionics, Burlington, Mass.).
- Referring to FIG. 3B, another plate-like index structure is shown. A plate 170 is attached to the instrument 60 by a connector 172. On the surface of the plate 170 there are dome-shaped structures; these structures are detectable by the camera system 10, as described above. If the dome-shaped structures have spherical or convex surfaces, then the camera system 10 can detect their surfaces and average the three-dimensional positions of the surface points to identify a centroid, which may, for example, be the center of a sphere or a hemisphere. Accordingly, there can be several of these spherical or dome-shaped structures on the plate 170 in a pattern or array. The structures can be in a linear array, on the corners of a triangle, on the corners of a square, or in a multiple indexed array to provide position, orientation, and transformation information to a system according to the invention.
- Referring to FIG. 3C, yet another plate-like index structure in accordance with the present invention is shown. A plate 180 is attached to the instrument 60 in a similar fashion to that described above. On the surface of the plate 180 are reflective patterns that enable the camera system 10 and its analysis system to discriminate them from other objects in the field, just as is done in all the previous examples. For example, in the disclosed embodiment, the patterns are diamond-shaped and readily detected by the camera system 10. Multiple arrays or groups of such diamond-shaped patterns with differential reflective and non-reflective areas are possible to facilitate discrimination by the camera system 10. For example, a background surface 186 on the plate 180 may be of an opaque, black character so that the linear edges between the patterns and the surface 186 have a sharp optical delineation. This makes it simpler for the camera system 10, its processor 42, and the computer 44 to detect such an edge. If the edge is straight, then detection along the lined contour can readily be performed by well-known analysis methods. This can give precise linear directions which in turn can define the vector and positional orientation of the entire plate 180, and thus the orientation of the instrument 60, with high accuracy.
- Referring now to FIG. 3D, yet another plate-like index structure is shown. A plate 190 is shown in a somewhat triangular or trapezoidal shape. It has on it linear structures that contrast with the background surface of the plate 190. The linear structures are detectable by the camera system 10, as described above. The linear borders on both sides of these structures provide edge information from which the orientation and position of the plate 190, and thus of the instrument 60, can be determined.
- In FIG. 3E, the instrument 60 is provided with three spherical elements. The positions of the spherical elements are detected by the camera system 10, and tracking of the instrument 60 follows.
- In the embodiment of FIG. 3F, the instrument 60 bears three spherical elements in another arrangement; again, detection of the spherical elements provides tracking of the instrument 60.
- Turning now to FIG. 4, in FIG. 4A a solid three-dimensional optically detectable structure is attached to the instrument 60 or comprises part of the instrument 60 itself. The structure includes a rod 200 which is attached by a coupler 202 to a sphere 204. The rod 200 and the sphere 204 comprise reflective or distinctly colored material detectable by the camera system 10. The reflective rod 200 has the advantage that from all directions it has a similar linear shape, the edges of which are discriminated by the camera system 10 and detected by linear edge detection. A centroid axis 206 can therefore be calculated for the rod 200 by the processor 42 and the computer 44 (FIG. 1). The reflective sphere 204 defines a centroid 208 which can be detected by spherical edge detection of the sphere 204 and an appropriate centroid calculation in the processor 42 and the computer 44. The combination of the axis 206 and the centroid 208 determines the plane defined by the sphere 204 and the rod 200, and thus the orientation and position of the instrument 60.
- In FIG. 4B, a solid prism-shaped object 210 is coupled by a connector 212 to the instrument 60. On the sides of the object 210, namely a right side 214 and a left side 216, there are respective reflective areas 218 and 220 (shaded), which can be polished surfaces, painted surfaces, reflective paint, or reflective tape. Their position and direction determine the orientation of the object 210, and therefore, by transformation, the orientation and position of the instrument 60.
- Referring to FIG. 4C, a solid prismoidal structure 222 has distinguishing optically detectable markings which serve as a signature of the instrument 60 to which it is attached. On one face of the structure 222 there is a shaded area 224 having a distinct shape. On another face there are two separate shaded areas. When viewing the structure 222, the camera system 10 can determine, from the size and shape characteristics of the shaded areas, the orientation and position of the structure 222, and thus the orientation and position of the instrument 60. As described above, a large number of different and identifiable objects such as the structure 222 can be used to distinguish one tool from another. The detectable faces on different sides of the structure 222 ensure that the structure 222 is identifiable from nearly any direction of view by the camera system 10. Patterns such as bar codes or distinguishable line or object orientations can be used to encode the structure 222 (and thereby the instrument 60), allowing each different type of instrument to be recognized via pattern recognition algorithms implemented in the processor 42 and the computer 44.
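- One simple way to realize such signature-based recognition is to compare the rotation-invariant set of inter-marker distances against a library of known instruments. The library contents, tolerance, and function names in the sketch below are hypothetical.

```python
import numpy as np
from itertools import combinations

# Hypothetical instrument library: each tool is identified by the sorted
# pairwise distances (mm) between its optically detectable markers, which
# are invariant under rotation and translation.
INSTRUMENT_LIBRARY = {
    "pointer probe": [50.0, 80.0, 94.3],
    "forceps":       [40.0, 95.0, 110.0],
}

def classify_instrument(marker_positions, tolerance=2.0):
    """Match tracked marker positions against the library of signatures."""
    pts = np.asarray(marker_positions, dtype=float)
    dists = sorted(np.linalg.norm(a - b) for a, b in combinations(pts, 2))
    for name, signature in INSTRUMENT_LIBRARY.items():
        if len(signature) == len(dists) and \
           all(abs(d - s) <= tolerance for d, s in zip(dists, signature)):
            return name
    return None

# Example: three markers reconstructed in camera coordinates.
print(classify_instrument([[0, 0, 0], [50, 0, 0], [50, 80, 0]]))  # -> "pointer probe"
```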
- While most of the embodiments described above (in FIGS. 2, 3, and 4) include a connector to couple an optically detectable structure to the surgical instrument 60, it should be noted that the objects, shapes, and patterns in the above examples can generally be built integrally into the instrument 60 itself. The very shape of the instrument may be optically detectable, and it can be classified and tracked by the camera system 10 and the other processing elements, as described above.
- The embodiments of FIGS. 1, 2, 3, and 4 have the advantage of providing optically coupled, non-mechanically coupled, wireless tracking. The marker objects of FIGS. 2, 3, and 4 can be made simple, economical, lightweight, and sterilizable or sterilely packaged. Each embodiment has practical advantages relative to the frame-based or mechanically linked space pointers given as examples in the background section above.
surgical instrument 60 has an opticallydetectable index structure 230. A dynamic referencinghead clamp 232 with index marks 234, 236, and 238 is present; theclamp 232 further includes anadditional index marker 240. Aprocessor 242 and acomputer 244 convert camera data from thecamera system 10 for animage display 246, which shows a representation of the position of theinstrument 60 as a dashedline 248 relative to ananatomical structure 250. A predetermined point on theinstrument 60, such as a tip or a focal point, is indicated relative to theanatomical structure 250 as apoint 252. Examples of such coordinated display of probe orientation and image data is given in the product of OTS by Radionics, Burlington, Mass. - The
processor 242 and thecomputer 244 are also capable of generating aseparate representation 254 of the position of theinstrument 60. Theseparate representation 254 displays in a two- or three-dimensional form 256 the position of theinstrument 60 in comparison to ananatomical rendering 258, along with other optional representations of probe, anatomy, or target points such as a target point 260. In the disclosed embodiment, theseparate representation 254 is reconstructed from two-dimensional or three-dimensional image data such as CT or MR scans taken of the patient previously or contemporaneously in a real-time image scanner during surgery or treatment. - As with the system set forth in FIG. 1, three-dimensional analysis of the position of the
instrument 60 can be accomplished by determined by thestereoscopic cameras processor 42 and thecomputer 44. This can be done based on LED or reflective infrared light processing, or alternatively based on direct visible-light video processing of information from the twocameras cameras - In an alternative embodiment of the invention, a
third camera 14 is provided (see also FIG. 1). Thethird camera 14 is preferably a standard video camera which views the surgical field. Theprocessor 42 and thecomputer 44 further display the view from thethird video camera 14 in anadditional display 262. In this way, a direct video view of thepatient 264 is available. In addition, a view of the instrument 60 (seen as aninstrument image 266 with an index marker image 268) is seen from actual video. - A virtual extrapolation of the probe, shown as a dashed
line 270 with a tip ortarget point 272, can be determined from the analysis shown on thealternative representation 254. In an embodiment of the invention, this virtual extrapolation is overlaid directly onto theadditional display 262 so that direct comparison of the reconstructed three-dimensional navigation image of thealternative representation 254 can be compared to an actual video image on theadditional display 262. Correspondence and registration between a reconstructed image and an actual image in this way confirms the correctness of the probe orientation, and consequently the virtual position of unseen elements such as probe tip and probe position, for example in the depths of the surgical wound. Thus, a hybrid of reconstructed stereoscopic tracking by one set of cameras (e.g., thecameras 12 and 16) can be displayed and coordinated with respect to video imaging from another set of cameras (e.g., the video camera 14). - All cameras may be of the visible video type, or some may be filtered infrared (or other spectral filtered types) used with others of the visible video type. For example, in the embodiment of FIG. 5, the
cameras additional video camera 14 observes the visual spectrum. Accordingly, offering a comparison between the views provided by the separate cameras is a useful quality assurance check of the integrity of the entire tracking system. - Referring now to FIG. 6, another embodiment of the present invention involves a battery-powered optically detectable index structure280 associated with an
instrument 282. A camera system 284 comprises threecameras processor 292, and these can be sent to a computer system, as described above (see FIG. 1). Theinstrument 282 is shown generically; the optical index structure 280 comprises LEDemitters emitters instrument 282 to which they are attached. Theemitters circuit 300 which distributes energy to the LEDs for their illumination. Thecircuit 300 controls the sequence and synchronization of LED lighting. Abattery 302 is provided to supply power to thecircuit 300 and to theemitters - In an embodiment of the invention, the
LED emitters circuit 300 that is detectable by theprocessor 292 so as to recognize theinstrument 282 and the index structure 280. Alternatively, the pattern of positions of theemitters processor 292 to discriminate whatspecific instrument 282 is being used. - As an alternative, a coding scheme can be sent from a
transmitter 304 to areceiver 306 coupled to theinstrument 282. Thereceiver 306 accepts light or radio wave signals from thetransmitter 304, which is connected to theprocessor 292. A synchronization signal representative of the shutter operation from thecameras receiver 306. Thereceiver 306 and thecircuit 300 then cause the sequential flashing of theemitters receiver 306 to thetransmitter 304 can be used to confirm the synchronization of the emitters to the cameras. - Again a
patient 312 may be in the surgical field with attached opticallydetectable index elements - The
LED emitters battery 302 comprises a standard type of battery, such as one that might be used to operate a flashlight, camera, or other small appliance. Such batteries can easily be replaced or sterilized at the time of surgery. The use of batteries in a surgical instrument is advantageous in that the system is wireless and mechanically de-coupled from the camera system and its processor. - Referring again to FIG. 1, light sources may be used near to the cameras to produce reflected light from reflecting optically-detectable objects. In various embodiments of the invention, the optically detectable objects can alternatively have bright, colored, or shiny surfaces or have contrasting patterns of light and dark or alternately colored shapes and patterns to be detectable by cameras in ambient light. By arranging the ambient light to shine appropriately on a surgical, diagnostic, or therapeutic setting, objects can be recognized directly by the
camera system 10 as shown in FIG. 1. However, the use of additional lights near the cameras can enhance the reflection from optically detectable objects in certain clinical settings where ambient light may not be sufficient, or where high degrees of light contrast, such as from surgical head holders, microscope lights, or operating theatre lights may cause difficulty in discriminating light levels from the detectable objects. Thus, various illumination possibilities can easily be devised in accordance with the present invention to facilitate detection and data processing of the camera and video information to suit the clinical context. - Referring now to FIG. 7, a block diagram is provided to illustrate the relationship among the various functional steps performed by a system according to the invention. A camera and light reflection processing function (block200) specifies that the camera system 10 (FIG. 1) detects an instrument with an optically detectable object attached to it. This is done with a camera system as described above, wherein camera data from infrared filtered cameras of various kinds and/or video cameras is provided to a pattern data processing function (block 322). The pattern
data processing function 322 receives data from the camera and lightreflection processing function 320, allowing the instrument is recognized by pattern recognition algorithms operating on stereoscopic data received from thecamera system 10. The nature of the instrument can also be recognized by way of time or geometric sequencing or arrangements of light-emitting or light reflecting objects or patterns on the instrument, as described above. - As part of cameras, as shown for example in FIG. 1 and FIG. 5, a visible video camera may also be used in combination with filtered or unfiltered cameras as a confirmational step. A video processing function (block324) is provided to carry out the reception and processing of such visible video data. The output data from such video processing is sent to a computer processing function (block 326), along with the instrument tracking data from the pattern
data processing function 322. Thecomputer processing function 326 may also accept image scan data which is either taken prior to operation or contemporaneously during the operation, as illustrated by an image data processing function (block 328). Thecomputer processing function 326 can then combine or merge a combination of pattern recognition and tracking data (from the pattern data processing function 322), the visible video data (from the video processing function 324), and image scan data (from the image data processing function 328), so as to display it in various forms via a display function (block 330). - Various examples of combination displays have been described in connection with FIG. 5. A useful quality assurance check would be, for example, to overlay visible video data onto the combined representations of the image scan data and of the surgical instrument as it moves relative to the anatomy. The video data shows in real time the position of an instrument relative to the anatomy, or the relative position of instruments relative to each other, within the field of surgical view. Seen on a display, a rendering of the reconstructed position of a surgical instrument relative to the overlaid anatomy, or compared side-by-side to the actual visible video view of the instrument relative to the anatomy, is a strong confirmational step to show that the tracking is being done properly. In certain clinical situations such as surgery, X-ray treatment on a treatment planning machine such as a linear accelerator, or patient positioning on a diagnostic machine, such a confirmational step could be very important. Thus, the process of FIG. 7 can apply to camera and video detection in the surgical setting, a diagnostic suite, or in connection with treatment planning process and instrumentation. Use, for example, together with a real time diagnostic or intraoperative imaging machine such as a CT, MR, PET, X-ray, or other scanner would be another context for the process in FIG. 7.
- Also shown in FIG. 7 is a patient registration data processing function (block 332), which represents the step of registering or calibrating instrumentation or apparatus relative to a patient, prior to performing a procedure with the tracked instrument. The registration step may be predetermined or determined in the clinical setting in a variety of ways, as described above.
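One common way to realize a registration step such as block 332 is paired-point rigid registration: corresponding index points are touched on the patient's anatomy and identified in the image scan data, and a least-squares rigid transform between the two point sets is computed. The SVD-based (Kabsch) solution below is a standard sketch of that computation, offered under those assumptions rather than as the patent's specific method.

```python
import numpy as np


def rigid_registration(camera_pts: np.ndarray, image_pts: np.ndarray) -> np.ndarray:
    """Least-squares rigid transform taking camera-space fiducial points to
    their paired image-scan-space positions (SVD/Kabsch method).

    camera_pts, image_pts: (N, 3) arrays of corresponding index points, N >= 3.
    Returns a 4x4 homogeneous transform T with image_pt ~= T @ [camera_pt, 1].
    """
    ca, ci = camera_pts.mean(axis=0), image_pts.mean(axis=0)
    A, B = camera_pts - ca, image_pts - ci
    U, _, Vt = np.linalg.svd(A.T @ B)       # 3x3 cross-covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = ci - R @ ca                  # translation component
    return T
```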
- The steps performed in tracking an object (for example, the instrument 60, the patient 22, or the surgeon 76) according to the invention are set forth in FIG. 8. First, a set of multiple camera images (stereoscopic images in the case of two or more two-dimensional cameras) is acquired (step 340) from the camera system 10 (FIG. 1). Any markers present in the stereoscopic images are then detected (step 342) as described above. For example, when two two-dimensional CCD cameras are used, there are two frames in a set of stereoscopic images, namely a left frame (from the left camera 16) and a right frame (from the right camera 12). The detected markers will appear in slightly different positions in the two frames, so the positions are then correlated (step 344). The difference in a marker's position between the two frames is used to determine depth (i.e., distance from the camera system 10) in three dimensions. It should be noted that more than two cameras may be used in the present invention; the additional cameras can be used to verify the stereoscopic images or to provide further accuracy or definition.
- After the markers have been correlated between the stereoscopic frames, the images are further processed to determine the positions of the markers in three-dimensional space by transforming the markers (step 346) into a coordinate system defined by the camera system 10. As described above, this step is performed in varying ways depending on the nature of the markers in the field of view. For example, a spherical marker will define a centroid, while a rod-shaped or flat marker will define an axis. Accordingly, the unique set of centroids, axes, and other characteristics in the coordinate system of the cameras can be used to identify the position of the object being tracked (step 348). This information is used in the operation of the system as described below.
- FIG. 9 illustrates, in one exemplary embodiment, how the various objects are tracked by the system to generate one or more displays, as described above. First, the location of the surgical instrument 60 (FIG. 1) is identified (step 350) with respect to the camera system 10, as described in conjunction with FIG. 8. A set of coordinates is generated thereby. Those coordinates specify the position of the instrument 60, and further specify a transformation between the coordinate system of the camera system 10 and a coordinate system associated with the instrument. This may involve, for example, index point registrations from the patient's physical anatomy to image scan data, as described previously. Next, or concurrently, the location of the patient 22 is identified (step 352) with respect to the camera system 10. Again, the coordinates specify the position of the patient 22 and a coordinate transformation between the camera system and the patient. Finally, or concurrently, the location of the surgeon 76 is identified (step 354), as above.
- With all of the positional data having been generated, a desired view is selected (step 356) by the surgeon or other operator. Several possible views have been described above, but there are alternatives. For example, a "surgeon's eye" view is possible by transforming the instrument position and the patient position into the surgeon's coordinate system. An "instrument's eye" view is possible by transforming the patient position into the instrument's coordinate system. A patient-centered view is possible by transforming the instrument position into the patient's coordinate system. These transformations involve simple matrix manipulations and trigonometric calculations; they would be well known to a person of ordinary skill in the mathematical arts.
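Steps 340 through 348 can be illustrated with the classic rectified-stereo relationship: once a marker has been correlated between the left and right frames (step 344), its depth is inversely proportional to the disparity between its two image positions, and three triangulated sphere centroids suffice to define an object's coordinate frame (step 348). The focal length and baseline below are hypothetical parameters, and the code assumes rectified cameras; it is a sketch under those assumptions, not the patent's algorithm.

```python
import numpy as np


def triangulate_marker(left_px, right_px, focal_px=1000.0, baseline_mm=500.0):
    """Steps 344-346: take one marker's (x, y) pixel positions in rectified
    left/right frames (measured from the principal point) and transform it
    into the camera coordinate system. focal_px and baseline_mm are
    hypothetical camera parameters.

    Returns (x, y, z) in millimetres, with z along the optical axis.
    """
    disparity = left_px[0] - right_px[0]
    if disparity <= 0:
        raise ValueError("marker must appear shifted between the two frames")
    z = focal_px * baseline_mm / disparity  # depth from disparity
    x = left_px[0] * z / focal_px
    y = left_px[1] * z / focal_px
    return np.array([x, y, z])


def object_pose_from_spheres(centroids_3d: np.ndarray) -> np.ndarray:
    """Step 348: three non-collinear sphere centroids define a rigid frame."""
    origin = centroids_3d[0]
    x_axis = centroids_3d[1] - origin
    x_axis /= np.linalg.norm(x_axis)
    normal = np.cross(x_axis, centroids_3d[2] - origin)
    z_axis = normal / np.linalg.norm(normal)
    y_axis = np.cross(z_axis, x_axis)
    T = np.eye(4)
    T[:3, :3] = np.column_stack([x_axis, y_axis, z_axis])
    T[:3, 3] = origin
    return T
```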
- The desired transformations of the instrument position (step 358) and the patient position (step 360) are then performed. A display is generated (step 362) based on the transformed positions (see FIG. 5). As described above, the display can comprise only a reproduction of the instrument in relation to a reproduction of the patient's anatomical structures (for example, based on reconstructions from image scan data from CT, MR, or other types of scans), or can include an overlaid video view from a video camera 14 on the camera system 10 or a video camera 82 on the surgeon 76. Moreover, the patient's anatomical data can be manipulated in various ways well known in the art to provide slice, cutaway, or contour views, among others. Further coordinate transformations can optionally be provided to allow operator control over the views on the display, for example to displace a view slightly from a true "instrument's eye" view.
- Steps 350-362 are repeated as necessary to update the display with the various object positions in real time or close to real time.
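The view selection of steps 356 through 362 amounts to composing rigid transforms: a "surgeon's eye" view, for instance, re-expresses the instrument and patient poses in the surgeon's coordinate system by multiplying by the inverse of the surgeon's camera-space pose. A minimal sketch follows, with `locate()` and `render()` assumed as hypothetical helpers.

```python
import numpy as np


def to_viewer_frame(viewer_pose: np.ndarray, object_pose: np.ndarray) -> np.ndarray:
    """Re-express an object's camera-space pose in a viewer's coordinate system.

    Both arguments are 4x4 homogeneous transforms from object (or viewer)
    coordinates into the camera coordinate system, as produced by tracking.
    """
    return np.linalg.inv(viewer_pose) @ object_pose


# Hypothetical update loop for steps 350-362:
#
# while tracking:
#     instrument = locate(instrument_markers)            # step 350
#     patient    = locate(patient_markers)               # step 352
#     surgeon    = locate(surgeon_markers)               # step 354 (optional)
#     view_instr = to_viewer_frame(surgeon, instrument)  # step 358
#     view_pat   = to_viewer_frame(surgeon, patient)     # step 360
#     render(view_instr, view_pat)                       # step 362
```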
- Forms and embodiments of optical object tracking systems and methods are provided involving various geometries, detection methods, pattern recognition methods, display methods, system components, and process steps. However, it should be recognized that forms varying from the embodiments specifically set forth herein may be used in accordance with the present invention. In particular, it should be noted that although various functional components have been set forth and described herein, many of these functional components can be integrated (into a single general-purpose digital computer, for example) or performed by separate processing devices; any such embodiment is intended to be within the scope of the invention. Moreover, although sequences of process steps are set forth herein as though performed in a certain order, it is recognized that the invention will be equally operative if the steps are rearranged or otherwise performed in a different order. In addition, it has been noted that certain steps are optional, such as identifying the surgeon's position (step 354) if it is not desired to track the surgeon.
- In view of these considerations, as would be apparent to persons skilled in the art, the implementation of a system in accordance with the invention should be considered broadly and with respect to the claims set forth below.
Claims (20)
1. A system for optically tracking an instrument relative to the anatomy of a patient in a clinical field of view, comprising:
a camera system including at least two spatially separated cameras, capable of viewing the clinical field of view to provide camera data in a first coordinate system defined by the camera system;
an instrument comprising an optically detectable object that is detectable by the camera system to provide instrument data representative of the position of the instrument in the first coordinate system;
data storage comprising image data representative of the anatomy of the patient received from an imaging machine;
a computer to accept the camera data, the instrument data, and the image data;
a software program running on the computer and capable of transforming the image data, the camera data, and the instrument data into a second coordinate system, thereby generating tracking data representative of the position of the instrument in relation to the anatomy of the patient.
2. The system of claim 1 , further comprising a display to display the tracking data.
3. The system of claim 1 , wherein the first coordinate system is identical to the second coordinate system.
4. The system of claim 1 , wherein the camera system comprises at least two two-dimensional CCD cameras.
5. The system of claim 1 , wherein the camera system comprises at least three linear CCD cameras.
6. The system of claim 3 , wherein:
each camera in the camera system has a filter passing the infrared optical spectrum; and
the optically detectable object is visible in the infrared spectrum.
7. The system of claim 6 , wherein said optically detectable object comprises an emitter of infrared light.
8. The system of claim 6 , further comprising at least one infrared light source, and wherein the optically detectable object comprises a reflective object; whereby infrared light emitted from the infrared light source is reflected from the optically detectable object toward the camera system.
9. The system of claim 1 , wherein the optically detectable object comprises an arrangement of geometric objects identifiable by said camera system to yield position data representative of the position of the optically detectable object.
10. The system of claim 9 , wherein the arrangement of geometric objects comprises a pattern of light-emitting diodes (LEDs).
11. The system of claim 9 , wherein the arrangement of geometric objects comprises at least one optically detectable rod.
12. The system of claim 9 , wherein the arrangement of geometric objects comprises at least one optically detectable rod and at least one optically detectable sphere.
13. The system of claim 1 , wherein the optically detectable object comprises a pattern of optically detectable geometric forms disposed on a surface.
14. The system of claim 13, wherein the surface comprises a substantially planar plate and the geometric forms comprise a plurality of linear shapes defining an orientation of the optically detectable object.
15. The system of claim 13, wherein the geometric forms comprise at least one circular shape.
16. The system of claim 9 , wherein the arrangement of geometric objects comprises at least one sphere.
17. The system of claim 16 , wherein the arrangement of geometric objects comprises three spheres.
18. The system of claim 9 , wherein the arrangement of geometric objects comprises a plurality of surfaces bearing reflective material.
19. The system of claim 9 , wherein the arrangement of geometric objects comprises a plurality of surfaces bearing brightly colored material.
20. The system of claim 9 , wherein the arrangement of geometric objects comprises a plurality of illuminated surfaces.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/752,118 US20040138556A1 (en) | 1991-01-28 | 2004-01-05 | Optical object tracking system |
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US64746391A | 1991-01-28 | 1991-01-28 | |
US94186392A | 1992-09-08 | 1992-09-08 | |
US4787993A | 1993-04-15 | 1993-04-15 | |
US29998794A | 1994-09-01 | 1994-09-01 | |
US08/441,788 US5662111A (en) | 1991-01-28 | 1995-05-16 | Process of stereotactic optical navigation |
US08/475,681 US6006126A (en) | 1991-01-28 | 1995-06-07 | System and method for stereotactic registration of image scan data |
US1484098A | 1998-01-28 | 1998-01-28 | |
US09/491,502 US6675040B1 (en) | 1991-01-28 | 2000-01-26 | Optical object tracking system |
US10/752,118 US20040138556A1 (en) | 1991-01-28 | 2004-01-05 | Optical object tracking system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/491,502 Continuation US6675040B1 (en) | 1991-01-28 | 2000-01-26 | Optical object tracking system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040138556A1 true US20040138556A1 (en) | 2004-07-15 |
Family
ID=32719787
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/491,502 Expired - Fee Related US6675040B1 (en) | 1991-01-28 | 2000-01-26 | Optical object tracking system |
US10/752,118 Abandoned US20040138556A1 (en) | 1991-01-28 | 2004-01-05 | Optical object tracking system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/491,502 Expired - Fee Related US6675040B1 (en) | 1991-01-28 | 2000-01-26 | Optical object tracking system |
Country Status (1)
Country | Link |
---|---|
US (2) | US6675040B1 (en) |
Cited By (86)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030133602A1 (en) * | 2002-01-15 | 2003-07-17 | Ali Bani-Hashemi | Patient positioning by video imaging |
US20030187360A1 (en) * | 2001-02-12 | 2003-10-02 | Milton Waner | Infrared assisted monitoring of a catheter |
US20050020909A1 (en) * | 2003-07-10 | 2005-01-27 | Moctezuma De La Barrera Jose Luis | Display device for surgery and method for using the same |
US20050033108A1 (en) * | 2003-08-05 | 2005-02-10 | Sawyer Timothy E. | Tumor treatment identification system |
WO2006008300A1 (en) * | 2004-07-20 | 2006-01-26 | Politecnico Di Milano | Apparatus for navigation and for fusion of ecographic and volumetric images of a patient which uses a combination of active and passive optical markers |
WO2006016290A1 (en) | 2004-08-09 | 2006-02-16 | Koninklijke Philips Electronics N.V. | Processing of images of interventional instruments with markers |
US20060072124A1 (en) * | 2004-10-01 | 2006-04-06 | Smetak Edward C | System and tracker for tracking an object, and related methods |
US20060122502A1 (en) * | 2004-12-06 | 2006-06-08 | Scherch John D | System for analyzing the geometry of a radiation treatment apparatus, software and related methods |
US20060215813A1 (en) * | 2005-03-23 | 2006-09-28 | Scherch John D | System for monitoring the geometry of a radiation treatment apparatus, trackable assembly, program product, and related methods |
US20060285641A1 (en) * | 2005-06-16 | 2006-12-21 | Nomos Corporation | System, tracker, and program product to facilitate and verify proper target alignment for radiation delivery, and related methods |
US20070066899A1 (en) * | 2005-09-22 | 2007-03-22 | Siemens Aktiengesellschaft | Medical treatment device and associated method of operation |
US20070066880A1 (en) * | 2005-09-09 | 2007-03-22 | Warren Lee | Image-based probe guidance system |
JP2007209531A (en) * | 2006-02-09 | 2007-08-23 | Hamamatsu Univ School Of Medicine | Surgery supporting system, method and program |
US7266175B1 (en) | 2003-07-11 | 2007-09-04 | Nomos Corporation | Planning method for radiation therapy |
US20070270690A1 (en) * | 2006-05-18 | 2007-11-22 | Swen Woerlein | Non-contact medical registration with distance measuring |
US20080161682A1 (en) * | 2007-01-02 | 2008-07-03 | Medtronic Navigation, Inc. | System and method for tracking positions of uniform marker geometries |
US20090109240A1 (en) * | 2007-10-24 | 2009-04-30 | Roman Englert | Method and System for Providing and Reconstructing a Photorealistic Three-Dimensional Environment |
US20100061608A1 (en) * | 2008-09-10 | 2010-03-11 | Galant Adam K | Medical Image Data Processing and Interventional Instrument Identification System |
US20100094085A1 (en) * | 2007-01-31 | 2010-04-15 | National University Corporation Hamamatsu Universi Ty School Of Medicine | Device for Displaying Assistance Information for Surgical Operation, Method for Displaying Assistance Information for Surgical Operation, and Program for Displaying Assistance Information for Surgical Operation |
US20100149213A1 (en) * | 2006-04-12 | 2010-06-17 | Nassir Navab | Virtual Penetrating Mirror Device for Visualizing of Virtual Objects within an Augmented Reality Environment |
US20100168763A1 (en) * | 2008-12-31 | 2010-07-01 | Intuitive Surgical, Inc. | Configuration marker design and detection for instrument tracking |
US20100168562A1 (en) * | 2008-12-31 | 2010-07-01 | Intuitive Surgical, Inc. | Fiducial marker design and detection for locating surgical instrument in images |
US20100185100A1 (en) * | 2009-01-19 | 2010-07-22 | Alexander Urban | Identifying and localizing tissue using light analysis |
US20100268071A1 (en) * | 2007-12-17 | 2010-10-21 | Imagnosis Inc. | Medical imaging marker and program for utilizing same |
WO2010124672A1 (en) * | 2009-04-27 | 2010-11-04 | Phacon Gmbh | Video-based mono-camera navigation system |
US7831289B2 (en) | 2003-10-07 | 2010-11-09 | Best Medical International, Inc. | Planning system, method and apparatus for conformal radiation therapy |
US20110190637A1 (en) * | 2008-08-18 | 2011-08-04 | Naviswiss Ag | Medical measuring system, method for surgical intervention as well as use of a medical measuring system |
WO2012001550A1 (en) * | 2010-06-30 | 2012-01-05 | Koninklijke Philips Electronics N.V. | Method and system for creating physician-centric coordinate system |
WO2011113441A3 (en) * | 2010-03-18 | 2012-02-02 | Rigshospitalet | Optical motion tracking of an object |
US20120078236A1 (en) * | 2010-09-29 | 2012-03-29 | Hans Schoepp | Surgical Navigation System |
US20120082342A1 (en) * | 2010-10-04 | 2012-04-05 | Korea Institute Of Science And Technology | 3 dimension tracking system for surgery simulation and localization sensing method using the same |
US20120280910A1 (en) * | 2009-11-18 | 2012-11-08 | Elmiro Business Development B.V. | Control system and method for controlling a plurality of computer devices |
US20130108979A1 (en) * | 2011-10-28 | 2013-05-02 | Navident Technologies, Inc. | Surgical location monitoring system and method |
US8435033B2 (en) | 2010-07-19 | 2013-05-07 | Rainbow Medical Ltd. | Dental navigation techniques |
EP2641561A1 (en) * | 2012-03-21 | 2013-09-25 | Covidien LP | System and method for determining camera angles by using virtual planes derived from actual images |
US20130261433A1 (en) * | 2012-03-28 | 2013-10-03 | Navident Technologies, Inc. | Haptic simulation and surgical location monitoring system and method |
US8666476B2 (en) | 2009-03-01 | 2014-03-04 | National University Corporation Hamamatsu University School Of Medicine | Surgery assistance system |
US20140078517A1 (en) * | 2007-09-26 | 2014-03-20 | Elbit Systems Ltd. | Medical wide field of view optical tracking system |
WO2014048448A1 (en) * | 2012-09-25 | 2014-04-03 | Brainlab Ag | Modular navigation reference |
US20140128727A1 (en) * | 2012-11-08 | 2014-05-08 | Navident Technologies, Inc. | Surgical location monitoring system and method using natural markers |
US8792614B2 (en) | 2009-03-31 | 2014-07-29 | Matthew R. Witten | System and method for radiation therapy treatment planning using a memetic optimization algorithm |
WO2014120909A1 (en) * | 2013-02-01 | 2014-08-07 | Sarment David | Apparatus, system and method for surgical navigation |
US20140228675A1 (en) * | 2011-10-28 | 2014-08-14 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method |
WO2014139022A1 (en) | 2013-03-15 | 2014-09-18 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
US20140276943A1 (en) * | 2013-03-13 | 2014-09-18 | Stryker Corporation | Systems and Methods for Establishing Virtual Constraint Boundaries |
US20140320600A1 (en) * | 2013-04-26 | 2014-10-30 | Navigate Surgical Technologies, Inc. | System and method for tracking non-visible structure of a body |
US8908918B2 (en) | 2012-11-08 | 2014-12-09 | Navigate Surgical Technologies, Inc. | System and method for determining the three-dimensional location and orientation of identification markers |
US9008757B2 (en) | 2012-09-26 | 2015-04-14 | Stryker Corporation | Navigation system including optical and non-optical sensors |
WO2015075720A1 (en) * | 2013-11-21 | 2015-05-28 | Elbit Systems Ltd. | A medical optical tracking system |
US20150209118A1 (en) * | 2014-01-27 | 2015-07-30 | Align Technology, Inc. | Adhesive objects for improving image registration of intraoral images |
US9198737B2 (en) | 2012-11-08 | 2015-12-01 | Navigate Surgical Technologies, Inc. | System and method for determining the three-dimensional location and orientation of identification markers |
US20150367961A1 (en) * | 2014-06-18 | 2015-12-24 | Airbus Operations (S.A.S.) | Computer-assisted methods of quality control and corresponding quality control systems |
US9456122B2 (en) | 2013-08-13 | 2016-09-27 | Navigate Surgical Technologies, Inc. | System and method for focusing imaging devices |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
JP2016536076A (en) * | 2013-08-13 | 2016-11-24 | ナビゲート サージカル テクノロジーズ インク | Method for determining the position and orientation of a reference reference |
US9511243B2 (en) | 2012-04-12 | 2016-12-06 | University Of Florida Research Foundation, Inc. | Prevention of setup errors in radiotherapy |
US9545188B2 (en) | 2010-12-02 | 2017-01-17 | Ultradent Products, Inc. | System and method of viewing and tracking stereoscopic video images |
US9554763B2 (en) | 2011-10-28 | 2017-01-31 | Navigate Surgical Technologies, Inc. | Soft body automatic registration and surgical monitoring system |
US9566123B2 (en) | 2011-10-28 | 2017-02-14 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method |
US9585721B2 (en) | 2011-10-28 | 2017-03-07 | Navigate Surgical Technologies, Inc. | System and method for real time tracking and modeling of surgical site |
WO2017075085A1 (en) * | 2015-10-28 | 2017-05-04 | Endochoice, Inc. | Device and method for tracking the position of an endoscope within a patient's body |
US9659367B2 (en) | 2014-04-04 | 2017-05-23 | International Business Machines Corporation | Head mounted video and touch detection for healthcare facility hygiene |
US20170245945A1 (en) * | 2014-11-21 | 2017-08-31 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
WO2018055950A1 (en) * | 2016-09-23 | 2018-03-29 | Sony Corporation | Control device, control method, and medical system |
CN107970060A (en) * | 2018-01-11 | 2018-05-01 | 上海联影医疗科技有限公司 | Surgical robot system and its control method |
US10021351B2 (en) | 2012-06-01 | 2018-07-10 | Ultradent Products, Inc. | Stereoscopic video imaging |
US10034713B2 (en) | 2012-07-03 | 2018-07-31 | 7D Surgical Inc. | Attachments for tracking handheld implements |
US10105149B2 (en) | 2013-03-15 | 2018-10-23 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US10219811B2 (en) | 2011-06-27 | 2019-03-05 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
CN109765992A (en) * | 2017-11-09 | 2019-05-17 | 波音公司 | By the system of virtual content and physical environment spatial registration, Method and kit for |
CN110464462A (en) * | 2019-08-29 | 2019-11-19 | 中国科学技术大学 | The image-guidance registration arrangement and relevant apparatus of abdominal surgery intervention operation |
EP3610771A1 (en) * | 2018-08-16 | 2020-02-19 | Fujifilm Corporation | Control device for an endoscope with means for detecting a pattern and for identifying whether the endoscope is in a non-use state or in a use state and an associated method and a program |
WO2020109903A1 (en) * | 2018-11-26 | 2020-06-04 | Augmedics Ltd. | Tracking system for image-guided surgery |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
JP2021519186A (en) * | 2018-03-30 | 2021-08-10 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Surveillance of moving objects in the operating room |
US11103315B2 (en) | 2015-12-31 | 2021-08-31 | Stryker Corporation | Systems and methods of merging localization and vision data for object avoidance |
US11103314B2 (en) * | 2017-11-24 | 2021-08-31 | Synaptive Medical Inc. | Methods and devices for tracking objects by surgical navigation systems |
US11116574B2 (en) | 2006-06-16 | 2021-09-14 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
EP3922203A1 (en) * | 2020-06-09 | 2021-12-15 | Globus Medical, Inc. | Surgical object tracking in visible light via fiducial seeding and synthetic image registration |
US11304777B2 (en) | 2011-10-28 | 2022-04-19 | Navigate Surgical Technologies, Inc | System and method for determining the three-dimensional location and orientation of identification markers |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US20220241013A1 (en) * | 2014-03-28 | 2022-08-04 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional visualization of instruments in a field of view |
US11750794B2 (en) | 2015-03-24 | 2023-09-05 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
US11801115B2 (en) | 2019-12-22 | 2023-10-31 | Augmedics Ltd. | Mirroring in image guided surgery |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
US11911117B2 (en) | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
Families Citing this family (122)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6675040B1 (en) * | 1991-01-28 | 2004-01-06 | Sherwood Services Ag | Optical object tracking system |
US6381485B1 (en) * | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies, Inc. | Registration of human anatomy integrated for electromagnetic localization |
US20010034530A1 (en) * | 2000-01-27 | 2001-10-25 | Malackowski Donald W. | Surgery system |
US6725080B2 (en) * | 2000-03-01 | 2004-04-20 | Surgical Navigation Technologies, Inc. | Multiple cannula image guided tool for image guided procedures |
WO2001067979A1 (en) * | 2000-03-15 | 2001-09-20 | Orthosoft Inc. | Automatic calibration system for computer-aided surgical instruments |
US20020063225A1 (en) * | 2000-09-27 | 2002-05-30 | Payton David W. | Distributed sensing apparatus and method of use therefor |
WO2002036018A1 (en) * | 2000-11-03 | 2002-05-10 | Synthes Ag Chur | Determination of deformation of surgical tools |
US6757416B2 (en) * | 2000-12-04 | 2004-06-29 | Ge Medical Systems Global Technology Company, Llc | Display of patient image data |
JP4337266B2 (en) * | 2001-01-31 | 2009-09-30 | コニカミノルタセンシング株式会社 | Three-dimensional measurement method and three-dimensional measurement system |
US20050113846A1 (en) * | 2001-02-27 | 2005-05-26 | Carson Christopher P. | Surgical navigation systems and processes for unicompartmental knee arthroplasty |
US7547307B2 (en) * | 2001-02-27 | 2009-06-16 | Smith & Nephew, Inc. | Computer assisted knee arthroplasty instrumentation, systems, and processes |
WO2002067784A2 (en) * | 2001-02-27 | 2002-09-06 | Smith & Nephew, Inc. | Surgical navigation systems and processes for unicompartmental knee |
US7251352B2 (en) * | 2001-08-16 | 2007-07-31 | Siemens Corporate Research, Inc. | Marking 3D locations from ultrasound images |
DE10202125A1 (en) * | 2002-01-22 | 2003-07-31 | Leica Microsystems | Transmission device for an operating microscope |
AU2003217389B2 (en) * | 2002-02-11 | 2008-10-30 | Smith & Nephew, Inc. | Image-guided fracture reduction |
JP2003294416A (en) * | 2002-04-03 | 2003-10-15 | Eng Kk | Stereoscopic image processor |
DE50201006D1 (en) * | 2002-04-16 | 2004-10-21 | Brainlab Ag | Marker for an instrument and method for locating a marker |
US20040215057A1 (en) * | 2002-09-27 | 2004-10-28 | Wellman Parris S. | Portable, reusable visualization system |
US7869861B2 (en) * | 2002-10-25 | 2011-01-11 | Howmedica Leibinger Inc. | Flexible tracking article and method of using the same |
US7616801B2 (en) * | 2002-11-27 | 2009-11-10 | Hologic, Inc. | Image handling and display in x-ray mammography and tomosynthesis |
US6713092B1 (en) * | 2002-12-03 | 2004-03-30 | Natreon Inc. | Withania Somnifera composition, method for obtaining same and pharmaceutical, nutritional and personal care formulations thereof |
US7862570B2 (en) | 2003-10-03 | 2011-01-04 | Smith & Nephew, Inc. | Surgical positioners |
US7764985B2 (en) * | 2003-10-20 | 2010-07-27 | Smith & Nephew, Inc. | Surgical navigation system component fault interfaces and related processes |
US20050085822A1 (en) * | 2003-10-20 | 2005-04-21 | Thornberry Robert C. | Surgical navigation system component fault interfaces and related processes |
US20050085718A1 (en) * | 2003-10-21 | 2005-04-21 | Ramin Shahidi | Systems and methods for intraoperative targetting |
US20060030985A1 (en) * | 2003-10-24 | 2006-02-09 | Active Recognition Technologies Inc., | Vehicle recognition using multiple metrics |
US7794467B2 (en) * | 2003-11-14 | 2010-09-14 | Smith & Nephew, Inc. | Adjustable surgical cutting systems |
US20050109855A1 (en) * | 2003-11-25 | 2005-05-26 | Mccombs Daniel | Methods and apparatuses for providing a navigational array |
US20050113659A1 (en) * | 2003-11-26 | 2005-05-26 | Albert Pothier | Device for data input for surgical navigation system |
DE50304977D1 (en) * | 2003-12-05 | 2006-10-19 | Moeller Wedel Gmbh | Method and device for observing objects with a microscope |
US7771436B2 (en) * | 2003-12-10 | 2010-08-10 | Stryker Leibinger Gmbh & Co. Kg. | Surgical navigation tracker, system and method |
JP2007523696A (en) * | 2004-01-16 | 2007-08-23 | スミス アンド ネフュー インコーポレーテッド | Computer-aided ligament balancing in total knee arthroplasty |
US20050159759A1 (en) * | 2004-01-20 | 2005-07-21 | Mark Harbaugh | Systems and methods for performing minimally invasive incisions |
JP2007518540A (en) * | 2004-01-22 | 2007-07-12 | スミス アンド ネフュー インコーポレーテッド | Method, system and apparatus for providing a surgical navigation sensor attached to a patient |
FR2867376B1 (en) * | 2004-03-12 | 2007-01-05 | Tornier Sa | DEVICE AND ASSEMBLY FOR DETERMINING THE POSITION OF A PORTION OF A HUMAN BODY |
US20050234466A1 (en) * | 2004-03-31 | 2005-10-20 | Jody Stallings | TLS adjustable block |
US20050234465A1 (en) * | 2004-03-31 | 2005-10-20 | Mccombs Daniel L | Guided saw with pins |
AU2005231404B9 (en) * | 2004-03-31 | 2012-04-26 | Smith & Nephew, Inc. | Methods and apparatuses for providing a reference array input device |
US20050228404A1 (en) * | 2004-04-12 | 2005-10-13 | Dirk Vandevelde | Surgical navigation system component automated imaging navigation and related processes |
US20070287910A1 (en) * | 2004-04-15 | 2007-12-13 | Jody Stallings | Quick Disconnect and Repositionable Reference Frame for Computer Assisted Surgery |
US7300432B2 (en) * | 2004-04-21 | 2007-11-27 | Depuy Products, Inc. | Apparatus for securing a sensor to a surgical instrument for use in computer guided orthopaedic surgery |
WO2005104978A1 (en) * | 2004-04-21 | 2005-11-10 | Smith & Nephew, Inc. | Computer-aided methods, systems, and apparatuses for shoulder arthroplasty |
US20050279368A1 (en) * | 2004-06-16 | 2005-12-22 | Mccombs Daniel L | Computer assisted surgery input/output systems and processes |
EP1615170A1 (en) * | 2004-07-10 | 2006-01-11 | Evotec Technologies GmbH | Image segmentation algorithms for applications in cellular biology |
US20060034535A1 (en) * | 2004-08-10 | 2006-02-16 | Koch Roger D | Method and apparatus for enhancing visibility to a machine operator |
EP1645241B1 (en) * | 2004-10-05 | 2011-12-28 | BrainLAB AG | Position marker system with point light sources |
US9216015B2 (en) | 2004-10-28 | 2015-12-22 | Vycor Medical, Inc. | Apparatus and methods for performing brain surgery |
EP1835967A1 (en) * | 2004-12-02 | 2007-09-26 | Smith and Nephew, Inc. | Systems for providing a reference plane for mounting an acetabular cup |
DE102004058122A1 (en) * | 2004-12-02 | 2006-07-13 | Siemens Ag | Medical image registration aid for landmarks by computerized and photon emission tomographies, comprises permeable radioactive substance is filled with the emission tomography as radiation permeable containers, a belt and patient body bowl |
US20060161051A1 (en) * | 2005-01-18 | 2006-07-20 | Lauralan Terrill-Grisoni | Method of computer-assisted ligament balancing and component placement in total knee arthroplasty |
JP2008531091A (en) | 2005-02-22 | 2008-08-14 | スミス アンド ネフュー インコーポレーテッド | In-line milling system |
US8031227B2 (en) * | 2005-03-07 | 2011-10-04 | The Regents Of The University Of Michigan | Position tracking system |
US8295909B2 (en) * | 2005-06-16 | 2012-10-23 | Brainlab Ag | Medical tracking system with infrared data transfer |
EP1733693B1 (en) * | 2005-06-16 | 2008-04-23 | BrainLAB AG | Tracking system for medical equipment with infrared transmission |
US20060287583A1 (en) | 2005-06-17 | 2006-12-21 | Pool Cover Corporation | Surgical access instruments for use with delicate tissues |
KR101251944B1 (en) * | 2005-08-04 | 2013-04-08 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Apparatus for monitoring a person having an interest to an object, and method thereof |
DE502005004417D1 (en) * | 2005-10-12 | 2008-07-24 | Brainlab Ag | Marker for a navigation system and method for detecting a marker |
AU2006308766B2 (en) * | 2005-11-03 | 2012-04-12 | Orthosoft Ulc | Multifaceted tracker device for computer-assisted surgery |
US20070118055A1 (en) * | 2005-11-04 | 2007-05-24 | Smith & Nephew, Inc. | Systems and methods for facilitating surgical procedures involving custom medical implants |
JP4772540B2 (en) * | 2006-03-10 | 2011-09-14 | 株式会社東芝 | Ultrasonic diagnostic equipment |
US7557710B2 (en) * | 2006-03-17 | 2009-07-07 | Med Wave, Llc | System for tracking surgical items in an operating room environment |
US20080123910A1 (en) * | 2006-09-19 | 2008-05-29 | Bracco Imaging Spa | Method and system for providing accuracy evaluation of image guided surgery |
US7256899B1 (en) * | 2006-10-04 | 2007-08-14 | Ivan Faul | Wireless methods and systems for three-dimensional non-contact shape sensing |
EP1952779B1 (en) * | 2007-02-01 | 2012-04-04 | BrainLAB AG | Method and system for Identification of medical instruments |
US20080287805A1 (en) * | 2007-05-16 | 2008-11-20 | General Electric Company | System and method to guide an instrument through an imaged subject |
EP2157929A4 (en) | 2007-06-15 | 2017-11-15 | Orthosoft, Inc. | Computer-assisted surgery system and method |
US20090287094A1 (en) * | 2008-05-15 | 2009-11-19 | Seacrete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US8636670B2 (en) | 2008-05-13 | 2014-01-28 | The Invention Science Fund I, Llc | Circulatory monitoring systems and methods |
US9717896B2 (en) | 2007-12-18 | 2017-08-01 | Gearbox, Llc | Treatment indications informed by a priori implant information |
US20090287101A1 (en) * | 2008-05-13 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287109A1 (en) * | 2008-05-14 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287120A1 (en) * | 2007-12-18 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090292212A1 (en) * | 2008-05-20 | 2009-11-26 | Searete Llc, A Limited Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20090287191A1 (en) * | 2007-12-18 | 2009-11-19 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Circulatory monitoring systems and methods |
US20110004224A1 (en) * | 2008-03-13 | 2011-01-06 | Daigneault Emmanuel | Tracking cas system |
US20090318773A1 (en) * | 2008-06-24 | 2009-12-24 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Involuntary-response-dependent consequences |
US20100056873A1 (en) * | 2008-08-27 | 2010-03-04 | Allen Paul G | Health-related signaling via wearable items |
US8284046B2 (en) | 2008-08-27 | 2012-10-09 | The Invention Science Fund I, Llc | Health-related signaling via wearable items |
US8125331B2 (en) * | 2008-08-27 | 2012-02-28 | The Invention Science Fund I, Llc | Health-related signaling via wearable items |
US8130095B2 (en) * | 2008-08-27 | 2012-03-06 | The Invention Science Fund I, Llc | Health-related signaling via wearable items |
US8094009B2 (en) * | 2008-08-27 | 2012-01-10 | The Invention Science Fund I, Llc | Health-related signaling via wearable items |
JP5553672B2 (en) * | 2010-04-26 | 2014-07-16 | キヤノン株式会社 | Acoustic wave measuring apparatus and acoustic wave measuring method |
WO2011134083A1 (en) * | 2010-04-28 | 2011-11-03 | Ryerson University | System and methods for intraoperative guidance feedback |
DE112011102348A5 (en) * | 2010-07-15 | 2013-04-18 | Naviswiss Ag | Method for determining spatial coordinates |
WO2012055071A1 (en) * | 2010-10-28 | 2012-05-03 | 医百科技股份有限公司 | Dental injection simulation system and method |
JP5959150B2 (en) * | 2011-01-12 | 2016-08-02 | オリンパス株式会社 | Endoscope system |
JP5752945B2 (en) * | 2011-01-24 | 2015-07-22 | オリンパス株式会社 | Endoscope system |
US8900126B2 (en) * | 2011-03-23 | 2014-12-02 | United Sciences, Llc | Optical scanning device |
US9572539B2 (en) * | 2011-04-08 | 2017-02-21 | Imactis | Device and method for determining the position of an instrument in relation to medical images |
US8687172B2 (en) | 2011-04-13 | 2014-04-01 | Ivan Faul | Optical digitizer with improved distance measurement capability |
US10540479B2 (en) * | 2011-07-15 | 2020-01-21 | Stephen B. Murphy | Surgical planning system and method |
US10722318B2 (en) * | 2011-08-24 | 2020-07-28 | Mako Surgical Corp. | Surgical tools for selectively illuminating a surgical volume |
TWI454246B (en) * | 2011-09-30 | 2014-10-01 | Mackay Memorial Hospital | Immediate monitoring of the target location of the radiotherapy system |
US8900125B2 (en) | 2012-03-12 | 2014-12-02 | United Sciences, Llc | Otoscanning with 3D modeling |
US9024462B2 (en) | 2012-09-19 | 2015-05-05 | Jeff Thramann | Generation of electrical energy in a ski or snowboard |
US9554880B2 (en) * | 2012-10-25 | 2017-01-31 | Zfx Gmbh | Reference member for determining a position of an implant analog |
US20150097937A1 (en) * | 2013-10-08 | 2015-04-09 | Ali Kord | Single-camera motion capture system |
WO2015130124A1 (en) * | 2014-02-28 | 2015-09-03 | 주식회사 엠에스피 | Helmet-type low-intensity focused ultrasound stimulation device and system |
US10912523B2 (en) | 2014-03-24 | 2021-02-09 | Intuitive Surgical Operations, Inc. | Systems and methods for anatomic motion compensation |
WO2015175635A1 (en) | 2014-05-13 | 2015-11-19 | Vycor Medical, Inc. | Guidance system mounts for surgical introducers |
EP2944283B1 (en) | 2014-05-14 | 2018-08-15 | Stryker European Holdings I, LLC | Navigation system for tracking the position of a work target |
CA2964512C (en) * | 2014-10-14 | 2018-04-24 | Synaptive Medical (Barbados) Inc. | Patient reference tool |
US10314523B2 (en) | 2014-11-14 | 2019-06-11 | Synaptive Medical (Barbados) Inc. | Method, system and apparatus for image capture and registration in image-guided surgery |
US11129691B2 (en) * | 2014-12-16 | 2021-09-28 | Koninklijke Philips N.V. | Pulsed-light emitting marker device |
US20160278864A1 (en) * | 2015-03-19 | 2016-09-29 | Medtronic Navigation, Inc. | Apparatus And Method For Instrument And Gesture Based Image Guided Surgery |
US20170086941A1 (en) | 2015-09-25 | 2017-03-30 | Atracsys | Marker for Optical Tracking System |
US10828125B2 (en) | 2015-11-03 | 2020-11-10 | Synaptive Medical (Barbados) Inc. | Dual zoom and dual field-of-view microscope |
KR102488295B1 (en) | 2016-05-23 | 2023-01-16 | 마코 서지컬 코포레이션 | Systems and methods for identifying and tracking physical objects during robotic surgical procedures |
US10706565B2 (en) * | 2016-07-27 | 2020-07-07 | Seikowave, Inc. | Method and apparatus for motion tracking of an object and acquisition of three-dimensional data over large areas |
GB2568425B (en) * | 2016-08-17 | 2021-08-18 | Synaptive Medical Inc | Wireless active tracking fiducials |
US10543016B2 (en) | 2016-11-07 | 2020-01-28 | Vycor Medical, Inc. | Surgical introducer with guidance system receptacle |
US10376258B2 (en) | 2016-11-07 | 2019-08-13 | Vycor Medical, Inc. | Surgical introducer with guidance system receptacle |
US11612307B2 (en) | 2016-11-24 | 2023-03-28 | University Of Washington | Light field capture and rendering for head-mounted displays |
EP3494903B1 (en) * | 2017-12-07 | 2023-11-01 | Augmedics Ltd. | Spinous process clamp |
WO2019209725A1 (en) | 2018-04-23 | 2019-10-31 | Mako Surgical Corp. | System, method and software program for aiding in positioning of a camera relative to objects in a surgical environment |
DE102018206406B3 (en) * | 2018-04-25 | 2019-09-12 | Carl Zeiss Meditec Ag | Microscopy system and method for operating a microscopy system |
US10825563B2 (en) * | 2018-05-14 | 2020-11-03 | Novarad Corporation | Aligning image data of a patient with actual views of the patient using an optical code affixed to the patient |
US11291507B2 (en) | 2018-07-16 | 2022-04-05 | Mako Surgical Corp. | System and method for image based registration and calibration |
US20220338886A1 (en) * | 2019-06-19 | 2022-10-27 | Think Surgical, Inc. | System and method to position a tracking system field-of-view |
WO2022055371A1 (en) * | 2020-09-08 | 2022-03-17 | Weta Digital Limited | Motion capture calibration using a wand |
US11295460B1 (en) | 2021-01-04 | 2022-04-05 | Proprio, Inc. | Methods and systems for registering preoperative image data to intraoperative image data of a scene, such as a surgical scene |
CN112971986A (en) * | 2021-03-31 | 2021-06-18 | 南京逸动智能科技有限责任公司 | Tracer for navigation operation and positioning method |
Family Cites Families (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3821469A (en) | 1972-05-15 | 1974-06-28 | Amperex Electronic Corp | Graphical data device |
DE2443558B2 (en) | 1974-09-11 | 1979-01-04 | Siemens Ag, 1000 Berlin Und 8000 Muenchen | Device for puncturing internal organs and vessels |
US3983474A (en) | 1975-02-21 | 1976-09-28 | Polhemus Navigation Sciences, Inc. | Tracking and determining orientation of object using coordinate transformation means, system and process |
US4068556A (en) | 1977-02-03 | 1978-01-17 | Bei Electronics, Inc. | Ammunition identification and firing system having electrical identification means |
US4068156A (en) | 1977-03-01 | 1978-01-10 | Martin Marietta Corporation | Rate control system for manipulator arms |
DE2718804C3 (en) | 1977-04-27 | 1979-10-31 | Karlheinz Prof. Dr. 3000 Hannover Renner | Device for positioning control of patients and / or radiation sources |
US4182312A (en) | 1977-05-20 | 1980-01-08 | Mushabac David R | Dental probe |
DE7805301U1 (en) | 1978-02-22 | 1978-07-06 | Howmedica International, Inc. Zweigniederlassung Kiel, 2300 Kiel | Distal aiming device for locking nailing |
US4341220A (en) | 1979-04-13 | 1982-07-27 | Pfizer Inc. | Stereotactic surgery apparatus and method |
US4608977A (en) | 1979-08-29 | 1986-09-02 | Brown Russell A | System using computed tomography as for selective body treatment |
DE2948986C2 (en) | 1979-12-05 | 1982-10-28 | Siemens AG, 1000 Berlin und 8000 München | Medical examination facility |
US4638798A (en) | 1980-09-10 | 1987-01-27 | Shelden C Hunter | Stereotactic method and apparatus for locating and treating or removing lesions |
US4358856A (en) | 1980-10-31 | 1982-11-09 | General Electric Company | Multiaxial x-ray apparatus |
AU7986682A (en) | 1981-02-12 | 1982-08-19 | New York University | Apparatus for stereotactic surgery |
NL8101722A (en) | 1981-04-08 | 1982-11-01 | Philips Nv | CONTOUR METER. |
US4465069A (en) | 1981-06-04 | 1984-08-14 | Barbier Jean Y | Cranial insertion of surgical needle utilizing computer-assisted tomography |
US4407298A (en) | 1981-07-16 | 1983-10-04 | Critikon Inc. | Connector for thermodilution catheter |
US4473074A (en) | 1981-09-28 | 1984-09-25 | Xanar, Inc. | Microsurgical laser device |
US4645343A (en) | 1981-11-11 | 1987-02-24 | U.S. Philips Corporation | Atomic resonance line source lamps and spectrophotometers for use with such lamps |
US4457311A (en) | 1982-09-03 | 1984-07-03 | Medtronic, Inc. | Ultrasound imaging system for scanning the human back |
US4506676A (en) | 1982-09-10 | 1985-03-26 | Duska Alois A | Radiographic localization technique |
US4701407A (en) | 1982-11-24 | 1987-10-20 | Baylor College Of Medicine | Diagnosis of Alzheimer disease |
US4961422A (en) | 1983-01-21 | 1990-10-09 | Marchosky J Alexander | Method and apparatus for volumetric interstitial conductive hyperthermia |
US4651732A (en) | 1983-03-17 | 1987-03-24 | Frederick Philip R | Three-dimensional light guidance system for invasive procedures |
JPS59218513A (en) | 1983-05-26 | 1984-12-08 | Fanuc Ltd | Arc control method of industrial robot |
NL8302228A (en) | 1983-06-22 | 1985-01-16 | Optische Ind De Oude Delft Nv | MEASURING SYSTEM FOR USING A TRIANGULAR PRINCIPLE, CONTACT-FREE MEASURING A DISTANCE GIVEN BY A SURFACE CONTOUR TO AN OBJECTIVE LEVEL. |
DE3342675A1 (en) | 1983-11-25 | 1985-06-05 | Fa. Carl Zeiss, 7920 Heidenheim | METHOD AND DEVICE FOR CONTACTLESS MEASUREMENT OF OBJECTS |
US4753528A (en) | 1983-12-13 | 1988-06-28 | Quantime, Inc. | Laser archery distance device |
US4841967A (en) | 1984-01-30 | 1989-06-27 | Chang Ming Z | Positioning device for percutaneous needle insertion |
US4674057A (en) | 1984-02-14 | 1987-06-16 | Lockheed Corporation | Ultrasonic ranging control system for industrial robots |
US4571834A (en) | 1984-02-17 | 1986-02-25 | Orthotronics Limited Partnership | Knee laxity evaluator and motion module/digitizer arrangement |
US4583538A (en) | 1984-05-04 | 1986-04-22 | Onik Gary M | Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization |
JPS6149205A (en) | 1984-08-16 | 1986-03-11 | Seiko Instr & Electronics Ltd | Robot control system |
US4705395A (en) | 1984-10-03 | 1987-11-10 | Diffracto Ltd. | Triangulation data integrity |
US4821206A (en) | 1984-11-27 | 1989-04-11 | Photo Acoustic Technology, Inc. | Ultrasonic apparatus for positioning a robot hand |
US4592352A (en) | 1984-11-30 | 1986-06-03 | Patil Arun A | Computer-assisted tomography stereotactic system |
US4706665A (en) | 1984-12-17 | 1987-11-17 | Gouda Kasim I | Frame for stereotactic surgery |
SE447848B (en) | 1985-06-14 | 1986-12-15 | Anders Bengtsson | INSTRUMENTS FOR SEATING SURFACE TOPOGRAPHY |
US4743771A (en) | 1985-06-17 | 1988-05-10 | View Engineering, Inc. | Z-axis height measurement system |
US4805615A (en) | 1985-07-02 | 1989-02-21 | Carol Mark P | Method and apparatus for performing stereotactic surgery |
US4686997A (en) | 1985-07-03 | 1987-08-18 | The United States Of America As Represented By The Secretary Of The Air Force | Skeletal bone remodeling studies using guided trephine sample |
US4737032A (en) | 1985-08-26 | 1988-04-12 | Cyberware Laboratory, Inc. | Surface mensuration sensor |
US4705401A (en) | 1985-08-12 | 1987-11-10 | Cyberware Laboratory Inc. | Rapid three-dimensional surface digitizer |
IL76517A (en) | 1985-09-27 | 1989-02-28 | Nessim Igal Levy | Distance measuring device |
US4709156A (en) | 1985-11-27 | 1987-11-24 | Ex-Cell-O Corporation | Method and apparatus for inspecting a surface |
US4794262A (en) | 1985-12-03 | 1988-12-27 | Yukio Sato | Method and apparatus for measuring profile of three-dimensional object |
US4742815A (en) | 1986-01-02 | 1988-05-10 | Ninan Champil A | Computer monitoring of endoscope |
US4722056A (en) | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope |
US4776749A (en) | 1986-03-25 | 1988-10-11 | Northrop Corporation | Robotic device |
EP0239409A1 (en) | 1986-03-28 | 1987-09-30 | Life Technology Research Foundation | Robot for surgical operation |
SE469321B (en) | 1986-04-14 | 1993-06-21 | Joenkoepings Laens Landsting | SET AND DEVICE TO MAKE A MODIFIED THREE-DIMENSIONAL IMAGE OF AN ELASTIC DEFORMABLE PURPOSE |
US5078140A (en) | 1986-05-08 | 1992-01-07 | Kwoh Yik S | Imaging device - aided robotic stereotaxis system |
US4822163A (en) | 1986-06-26 | 1989-04-18 | Robotic Vision Systems, Inc. | Tracking vision sensor |
US4723544A (en) | 1986-07-09 | 1988-02-09 | Moore Robert R | Hemispherical vectoring needle guide for discolysis |
US4791934A (en) | 1986-08-07 | 1988-12-20 | Picker International, Inc. | Computer tomography assisted stereotactic surgery system and method |
US4733969A (en) | 1986-09-08 | 1988-03-29 | Cyberoptics Corporation | Laser probe for determining distance |
US4743770A (en) | 1986-09-22 | 1988-05-10 | Mitutoyo Mfg. Co., Ltd. | Profile-measuring light probe using a change in reflection factor in the proximity of a critical angle of light |
US4761072A (en) | 1986-09-30 | 1988-08-02 | Diffracto Ltd. | Electro-optical sensors for manual control |
US4933843A (en) | 1986-11-06 | 1990-06-12 | Storz Instrument Company | Control system for ophthalmic surgical instruments |
US4750487A (en) | 1986-11-24 | 1988-06-14 | Zanetti Paul H | Stereotactic frame |
DE3703422A1 (en) | 1987-02-05 | 1988-08-18 | Zeiss Carl Fa | OPTOELECTRONIC DISTANCE SENSOR |
US4753128A (en) | 1987-03-09 | 1988-06-28 | Gmf Robotics Corporation | Robot with spring pivot balancing mechanism |
US4745290A (en) | 1987-03-19 | 1988-05-17 | David Frankel | Method and apparatus for use in making custom shoes |
US4762016A (en) | 1987-03-27 | 1988-08-09 | The Regents Of The University Of California | Robotic manipulator having three degrees of freedom |
US4875478A (en) | 1987-04-10 | 1989-10-24 | Chen Harry H | Portable compression grid & needle holder |
US4733661A (en) | 1987-04-27 | 1988-03-29 | Palestrant Aubrey M | Guidance device for C.T. guided drainage and biopsy procedures |
US4809694A (en) | 1987-05-19 | 1989-03-07 | Ferrara Vincent L | Biopsy guide |
DE3717871C3 (en) | 1987-05-27 | 1995-05-04 | Georg Prof Dr Schloendorff | Method and device for reproducible visual representation of a surgical intervention |
US4836778A (en) | 1987-05-26 | 1989-06-06 | Vexcel Corporation | Mandibular motion monitoring system |
US4835710A (en) | 1987-07-17 | 1989-05-30 | Cincinnati Milacron Inc. | Method of moving and orienting a tool along a curved path |
US4829373A (en) | 1987-08-03 | 1989-05-09 | Vexcel Corporation | Stereo mensuration apparatus |
US4931056A (en) | 1987-09-04 | 1990-06-05 | Neurodynamics, Inc. | Catheter guide apparatus for perpendicular insertion into a cranium orifice |
US5099836A (en) | 1987-10-05 | 1992-03-31 | Hudson Respiratory Care Inc. | Intermittent oxygen delivery system and cannula |
CA1288176C (en) * | 1987-10-29 | 1991-08-27 | David C. Hatcher | Method and apparatus for improving the alignment of radiographic images |
US4991579A (en) | 1987-11-10 | 1991-02-12 | Allen George S | Method and apparatus for providing related images over time of a portion of the anatomy using fiducial implants |
JP2538953B2 (en) | 1987-11-17 | 1996-10-02 | 三菱重工業株式会社 | Balance mechanism of industrial robot |
US5027818A (en) | 1987-12-03 | 1991-07-02 | University Of Florida | Dosimetric technique for stereotactic radiosurgery same |
US5251127A (en) | 1988-02-01 | 1993-10-05 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
EP0326768A3 (en) | 1988-02-01 | 1991-01-23 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US5050608A (en) | 1988-07-12 | 1991-09-24 | Medirand, Inc. | System for indicating a position to be operated in a patient's body |
US4896673A (en) | 1988-07-15 | 1990-01-30 | Medstone International, Inc. | Method and apparatus for stone localization using ultrasound imaging |
US5197476A (en) | 1989-03-16 | 1993-03-30 | Christopher Nowacki | Locating target in human body |
DE69026196T2 (en) | 1989-11-08 | 1996-09-05 | George S Allen | Mechanical arm for an interactive, image-controlled, surgical system |
US5047036A (en) | 1989-11-17 | 1991-09-10 | Koutrouvelis Panos G | Stereotactic device |
US5080662A (en) | 1989-11-27 | 1992-01-14 | Paul Kamaljit S | Spinal stereotaxic device and method |
US5224049A (en) | 1990-04-10 | 1993-06-29 | Mushabac David R | Method, system and mold assembly for use in preparing a dental prosthesis |
US5107839A (en) | 1990-05-04 | 1992-04-28 | Pavel V. Houdek | Computer controlled stereotaxic radiotherapy system and method |
US5086401A (en) | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
US5295483A (en) * | 1990-05-11 | 1994-03-22 | Christopher Nowacki | Locating target in human body |
US5017139A (en) | 1990-07-05 | 1991-05-21 | Mushabac David R | Mechanical support for hand-held dental/medical instrument |
US5193106A (en) | 1990-08-28 | 1993-03-09 | Desena Danforth | X-ray identification marker |
US5207223A (en) | 1990-10-19 | 1993-05-04 | Accuray, Inc. | Apparatus for and method of performing stereotaxic surgery |
DE69133603D1 (en) | 1990-10-19 | 2008-10-02 | Univ St Louis | System for localizing a surgical probe relative to the head |
US5389101A (en) | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization |
IL109939A (en) | 1993-06-21 | 1997-09-30 | Gen Electric | Display system for enhancing visualization of body structures during medical procedures |
US5526812A (en) * | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
US5436542A (en) * | 1994-01-28 | 1995-07-25 | Surgix, Inc. | Telescopic camera mount with remotely controlled positioning |
GB9405299D0 (en) | 1994-03-17 | 1994-04-27 | Roke Manor Research | Improvements in or relating to video-based systems for computer assisted surgery and localisation |
DE29505318U1 (en) | 1995-03-29 | 1995-05-18 | Zeiss Carl Fa | Device for marking points to be measured optically |
US5617857A (en) | 1995-06-06 | 1997-04-08 | Image Guided Technologies, Inc. | Imaging system having interactive medical instruments and methods |
DE19709960A1 (en) | 1997-03-11 | 1998-09-24 | Aesculap Ag & Co Kg | Method and device for preoperatively determining the position data of endoprosthesis parts |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4859181A (en) * | 1986-09-11 | 1989-08-22 | Stefan Neumeyer | Method and apparatus for measuring relative jaw movement |
US4797736A (en) * | 1987-09-02 | 1989-01-10 | Luxtec Corporation | Head mounted illumination and camera assembly |
US5622170A (en) * | 1990-10-19 | 1997-04-22 | Image Guided Technologies, Inc. | Apparatus for determining the position and orientation of an invasive portion of a probe inside a three-dimensional body |
US5848967A (en) * | 1991-01-28 | 1998-12-15 | Cosman; Eric R. | Optically coupled frameless stereotactic system and method |
US6006126A (en) * | 1991-01-28 | 1999-12-21 | Cosman; Eric R. | System and method for stereotactic registration of image scan data |
US6675040B1 (en) * | 1991-01-28 | 2004-01-06 | Sherwood Services Ag | Optical object tracking system |
US5446548A (en) * | 1993-10-08 | 1995-08-29 | Siemens Medical Systems, Inc. | Patient positioning and monitoring system |
Cited By (158)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030187360A1 (en) * | 2001-02-12 | 2003-10-02 | Milton Waner | Infrared assisted monitoring of a catheter |
US20030133602A1 (en) * | 2002-01-15 | 2003-07-17 | Ali Bani-Hashemi | Patient positioning by video imaging |
US7016522B2 (en) * | 2002-01-15 | 2006-03-21 | Siemens Medical Solutions Usa, Inc. | Patient positioning by video imaging |
US20050020909A1 (en) * | 2003-07-10 | 2005-01-27 | Moctezuma De La Barrera Jose Luis | Display device for surgery and method for using the same |
US7266175B1 (en) | 2003-07-11 | 2007-09-04 | Nomos Corporation | Planning method for radiation therapy |
US20050033108A1 (en) * | 2003-08-05 | 2005-02-10 | Sawyer Timothy E. | Tumor treatment identification system |
US8055323B2 (en) | 2003-08-05 | 2011-11-08 | Imquant, Inc. | Stereotactic system and method for defining a tumor treatment region |
US7831289B2 (en) | 2003-10-07 | 2010-11-09 | Best Medical International, Inc. | Planning system, method and apparatus for conformal radiation therapy |
WO2006008300A1 (en) * | 2004-07-20 | 2006-01-26 | Politecnico Di Milano | Apparatus for navigation and for fusion of ecographic and volumetric images of a patient which uses a combination of active and passive optical markers |
US20080033283A1 (en) * | 2004-07-20 | 2008-02-07 | Raffaele Dellaca | Apparatus for Navigation and for Fusion of Ecographic and Volumetric Images of a Patient Which Uses a Combination of Active and Passive Optical Markers |
WO2006017489A2 (en) | 2004-08-03 | 2006-02-16 | Imquant, Inc. | Tumor treatment identification system |
EP1786517A2 (en) * | 2004-08-03 | 2007-05-23 | Imquant, Inc. | Tumor treatment identification system |
EP1786517A4 (en) * | 2004-08-03 | 2008-04-09 | Imquant Inc | Tumor treatment identification system |
US20090216111A1 (en) * | 2004-08-09 | 2009-08-27 | Koninklijke Philips Electronics, N.V. | Processing of images of interventional instruments with markers |
WO2006016290A1 (en) | 2004-08-09 | 2006-02-16 | Koninklijke Philips Electronics N.V. | Processing of images of interventional instruments with markers |
US9393079B2 (en) | 2004-08-09 | 2016-07-19 | Koninklijke Philips N.V. | Processing of images of interventional instruments with markers |
JP2008511343A (en) * | 2004-08-09 | 2008-04-17 | Koninklijke Philips Electronics N.V. | Method and apparatus for processing an image of an interventional instrument having a marker |
US20060072124A1 (en) * | 2004-10-01 | 2006-04-06 | Smetak Edward C | System and tracker for tracking an object, and related methods |
US7289227B2 (en) | 2004-10-01 | 2007-10-30 | Nomos Corporation | System and tracker for tracking an object, and related methods |
US20060122502A1 (en) * | 2004-12-06 | 2006-06-08 | Scherch John D | System for analyzing the geometry of a radiation treatment apparatus, software and related methods |
US7729472B2 (en) | 2004-12-06 | 2010-06-01 | Best Medical International, Inc. | System for analyzing the geometry of a radiation treatment apparatus, software and related methods |
WO2006062872A1 (en) * | 2004-12-06 | 2006-06-15 | Nomos Corporation | System for analyzing the geometry of a radiation treatment apparatus, software and related methods |
US20060215813A1 (en) * | 2005-03-23 | 2006-09-28 | Scherch John D | System for monitoring the geometry of a radiation treatment apparatus, trackable assembly, program product, and related methods |
US7590218B2 (en) | 2005-03-23 | 2009-09-15 | Best Medical International, Inc. | System for monitoring the geometry of a radiation treatment apparatus, trackable assembly, program product, and related methods |
US7613501B2 (en) | 2005-06-16 | 2009-11-03 | Best Medical International, Inc. | System, tracker, and program product to facilitate and verify proper target alignment for radiation delivery, and related methods |
US20060285641A1 (en) * | 2005-06-16 | 2006-12-21 | Nomos Corporation | System, tracker, and program product to facilitate and verify proper target alignment for radiation delivery, and related methods |
US20070066880A1 (en) * | 2005-09-09 | 2007-03-22 | Warren Lee | Image-based probe guidance system |
US20070066899A1 (en) * | 2005-09-22 | 2007-03-22 | Siemens Aktiengesellschaft | Medical treatment device and associated method of operation |
US7892232B2 (en) * | 2005-09-22 | 2011-02-22 | Siemens Aktiengesellschaft | Medical treatment device and associated method of operation |
EP1982650A1 (en) * | 2006-02-09 | 2008-10-22 | Nat. University Corp. Hamamatsu University School of Medicine | Surgery support device, method, and program |
US8463360B2 (en) | 2006-02-09 | 2013-06-11 | National University Corporation Hamamatsu University School Of Medicine | Surgery support device, surgery support method, and computer readable recording medium storing surgery support program |
EP1982650A4 (en) * | 2006-02-09 | 2010-04-14 | Nat Univ Corp Hamamatsu | Surgery support device, method, and program |
JP2007209531A (en) * | 2006-02-09 | 2007-08-23 | Hamamatsu Univ School Of Medicine | Surgery supporting system, method and program |
US20110054300A1 (en) * | 2006-02-09 | 2011-03-03 | National University Corporation Hamamatsu University School Of Medicine | Surgery support device, surgery support method, and computer readable recording medium storing surgery support program |
US20100149213A1 (en) * | 2006-04-12 | 2010-06-17 | Nassir Navab | Virtual Penetrating Mirror Device for Visualizing of Virtual Objects within an Augmented Reality Environment |
US20070270690A1 (en) * | 2006-05-18 | 2007-11-22 | Swen Woerlein | Non-contact medical registration with distance measuring |
US11116574B2 (en) | 2006-06-16 | 2021-09-14 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
US11857265B2 (en) | 2006-06-16 | 2024-01-02 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
US9220573B2 (en) * | 2007-01-02 | 2015-12-29 | Medtronic Navigation, Inc. | System and method for tracking positions of uniform marker geometries |
US20080161682A1 (en) * | 2007-01-02 | 2008-07-03 | Medtronic Navigation, Inc. | System and method for tracking positions of uniform marker geometries |
US20100094085A1 (en) * | 2007-01-31 | 2010-04-15 | National University Corporation Hamamatsu University School Of Medicine | Device for Displaying Assistance Information for Surgical Operation, Method for Displaying Assistance Information for Surgical Operation, and Program for Displaying Assistance Information for Surgical Operation |
US8251893B2 (en) * | 2007-01-31 | 2012-08-28 | National University Corporation Hamamatsu University School Of Medicine | Device for displaying assistance information for surgical operation, method for displaying assistance information for surgical operation, and program for displaying assistance information for surgical operation |
US8885177B2 (en) * | 2007-09-26 | 2014-11-11 | Elbit Systems Ltd. | Medical wide field of view optical tracking system |
US20140078517A1 (en) * | 2007-09-26 | 2014-03-20 | Elbit Systems Ltd. | Medical wide field of view optical tracking system |
US20090109240A1 (en) * | 2007-10-24 | 2009-04-30 | Roman Englert | Method and System for Providing and Reconstructing a Photorealistic Three-Dimensional Environment |
US20100268071A1 (en) * | 2007-12-17 | 2010-10-21 | Imagnosis Inc. | Medical imaging marker and program for utilizing same |
US9008755B2 (en) * | 2007-12-17 | 2015-04-14 | Imagnosis Inc. | Medical imaging marker and program for utilizing same |
US20110190637A1 (en) * | 2008-08-18 | 2011-08-04 | Naviswiss Ag | Medical measuring system, method for surgical intervention as well as use of a medical measuring system |
US20100061608A1 (en) * | 2008-09-10 | 2010-03-11 | Galant Adam K | Medical Image Data Processing and Interventional Instrument Identification System |
US8244013B2 (en) | 2008-09-10 | 2012-08-14 | Siemens Medical Solutions Usa, Inc. | Medical image data processing and interventional instrument identification system |
US10675098B2 (en) | 2008-12-31 | 2020-06-09 | Intuitive Surgical Operations, Inc. | Configuration marker design and detection for instrument tracking |
US11471221B2 (en) | 2008-12-31 | 2022-10-18 | Intuitive Surgical Operations, Inc. | Configuration marker design and detection for instrument tracking |
US9526587B2 (en) | 2008-12-31 | 2016-12-27 | Intuitive Surgical Operations, Inc. | Fiducial marker design and detection for locating surgical instrument in images |
US20100168562A1 (en) * | 2008-12-31 | 2010-07-01 | Intuitive Surgical, Inc. | Fiducial marker design and detection for locating surgical instrument in images |
US9867669B2 (en) * | 2008-12-31 | 2018-01-16 | Intuitive Surgical Operations, Inc. | Configuration marker design and detection for instrument tracking |
US20100168763A1 (en) * | 2008-12-31 | 2010-07-01 | Intuitive Surgical, Inc. | Configuration marker design and detection for instrument tracking |
US20100185100A1 (en) * | 2009-01-19 | 2010-07-22 | Alexander Urban | Identifying and localizing tissue using light analysis |
US8666476B2 (en) | 2009-03-01 | 2014-03-04 | National University Corporation Hamamatsu University School Of Medicine | Surgery assistance system |
US8792614B2 (en) | 2009-03-31 | 2014-07-29 | Matthew R. Witten | System and method for radiation therapy treatment planning using a memetic optimization algorithm |
WO2010124672A1 (en) * | 2009-04-27 | 2010-11-04 | Phacon Gmbh | Video-based mono-camera navigation system |
US20120280910A1 (en) * | 2009-11-18 | 2012-11-08 | Elmiro Business Development B.V. | Control system and method for controlling a plurality of computer devices |
WO2011113441A3 (en) * | 2010-03-18 | 2012-02-02 | Rigshospitalet | Optical motion tracking of an object |
WO2012001550A1 (en) * | 2010-06-30 | 2012-01-05 | Koninklijke Philips Electronics N.V. | Method and system for creating physician-centric coordinate system |
US8435033B2 (en) | 2010-07-19 | 2013-05-07 | Rainbow Medical Ltd. | Dental navigation techniques |
US10165981B2 (en) | 2010-09-29 | 2019-01-01 | Stryker European Holdings I, Llc | Surgical navigation method |
US8657809B2 (en) * | 2010-09-29 | 2014-02-25 | Stryker Leibinger Gmbh & Co., Kg | Surgical navigation system |
US20120078236A1 (en) * | 2010-09-29 | 2012-03-29 | Hans Schoepp | Surgical Navigation System |
US8682062B2 (en) * | 2010-10-04 | 2014-03-25 | Korea Institute Of Science And Technology | 3 dimension tracking system for surgery simulation and localization sensing method using the same |
US20120082342A1 (en) * | 2010-10-04 | 2012-04-05 | Korea Institute Of Science And Technology | 3 dimension tracking system for surgery simulation and localization sensing method using the same |
US10716460B2 (en) | 2010-12-02 | 2020-07-21 | Ultradent Products, Inc. | Stereoscopic video imaging and tracking system |
US10154775B2 (en) | 2010-12-02 | 2018-12-18 | Ultradent Products, Inc. | Stereoscopic video imaging and tracking system |
US9545188B2 (en) | 2010-12-02 | 2017-01-17 | Ultradent Products, Inc. | System and method of viewing and tracking stereoscopic video images |
US10219811B2 (en) | 2011-06-27 | 2019-03-05 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US11911117B2 (en) | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US10080617B2 (en) | 2011-06-27 | 2018-09-25 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US20130108979A1 (en) * | 2011-10-28 | 2013-05-02 | Navident Technologies, Inc. | Surgical location monitoring system and method |
US9566123B2 (en) | 2011-10-28 | 2017-02-14 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method |
US9554763B2 (en) | 2011-10-28 | 2017-01-31 | Navigate Surgical Technologies, Inc. | Soft body automatic registration and surgical monitoring system |
US11304777B2 (en) | 2011-10-28 | 2022-04-19 | Navigate Surgical Technologies, Inc | System and method for determining the three-dimensional location and orientation of identification markers |
US9585721B2 (en) | 2011-10-28 | 2017-03-07 | Navigate Surgical Technologies, Inc. | System and method for real time tracking and modeling of surgical site |
US20140228675A1 (en) * | 2011-10-28 | 2014-08-14 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method |
US8938282B2 (en) * | 2011-10-28 | 2015-01-20 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method with automatic registration |
US9452024B2 (en) | 2011-10-28 | 2016-09-27 | Navigate Surgical Technologies, Inc. | Surgical location monitoring system and method |
EP2641561A1 (en) * | 2012-03-21 | 2013-09-25 | Covidien LP | System and method for determining camera angles by using virtual planes derived from actual images |
US20130261433A1 (en) * | 2012-03-28 | 2013-10-03 | Navident Technologies, Inc. | Haptic simulation and surgical location monitoring system and method |
US9511243B2 (en) | 2012-04-12 | 2016-12-06 | University Of Florida Research Foundation, Inc. | Prevention of setup errors in radiotherapy |
US9561387B2 (en) | 2012-04-12 | 2017-02-07 | University of Florida Research Foundation, Inc. | Ambiguity-free optical tracking system |
US10021351B2 (en) | 2012-06-01 | 2018-07-10 | Ultradent Products, Inc. | Stereoscopic video imaging |
US11856178B2 (en) | 2012-06-01 | 2023-12-26 | Ultradent Products, Inc. | Stereoscopic video imaging |
US10034713B2 (en) | 2012-07-03 | 2018-07-31 | 7D Surgical Inc. | Attachments for tracking handheld implements |
US10117712B2 (en) | 2012-09-25 | 2018-11-06 | Brainlab Ag | Modular navigation reference |
WO2014048448A1 (en) * | 2012-09-25 | 2014-04-03 | Brainlab Ag | Modular navigation reference |
US11529198B2 (en) | 2012-09-26 | 2022-12-20 | Stryker Corporation | Optical and non-optical sensor tracking of objects for a robotic cutting system |
US9271804B2 (en) | 2012-09-26 | 2016-03-01 | Stryker Corporation | Method for tracking objects using optical and non-optical sensors |
US10575906B2 (en) | 2012-09-26 | 2020-03-03 | Stryker Corporation | Navigation system and method for tracking objects using optical and non-optical sensors |
US9008757B2 (en) | 2012-09-26 | 2015-04-14 | Stryker Corporation | Navigation system including optical and non-optical sensors |
US9687307B2 (en) | 2012-09-26 | 2017-06-27 | Stryker Corporation | Navigation system and method for tracking objects using optical and non-optical sensors |
US9918657B2 (en) | 2012-11-08 | 2018-03-20 | Navigate Surgical Technologies, Inc. | Method for determining the location and orientation of a fiducial reference |
US9198737B2 (en) | 2012-11-08 | 2015-12-01 | Navigate Surgical Technologies, Inc. | System and method for determining the three-dimensional location and orientation of identification markers |
US20140128727A1 (en) * | 2012-11-08 | 2014-05-08 | Navident Technologies, Inc. | Surgical location monitoring system and method using natural markers |
US8908918B2 (en) | 2012-11-08 | 2014-12-09 | Navigate Surgical Technologies, Inc. | System and method for determining the three-dimensional location and orientation of identification markers |
WO2014120909A1 (en) * | 2013-02-01 | 2014-08-07 | Sarment David | Apparatus, system and method for surgical navigation |
US11918305B2 (en) | 2013-03-13 | 2024-03-05 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US11464579B2 (en) | 2013-03-13 | 2022-10-11 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US20140276943A1 (en) * | 2013-03-13 | 2014-09-18 | Stryker Corporation | Systems and Methods for Establishing Virtual Constraint Boundaries |
US10512509B2 (en) | 2013-03-13 | 2019-12-24 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US9603665B2 (en) * | 2013-03-13 | 2017-03-28 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
AU2014231344B2 (en) * | 2013-03-15 | 2018-10-04 | Synaptive Medical Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
US10105149B2 (en) | 2013-03-15 | 2018-10-23 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
CN105208958B (en) * | 2013-03-15 | 2018-02-02 | Synaptive Medical (Barbados) Inc. | System and method for navigation and simulation of minimally invasive therapy |
US10433763B2 (en) | 2013-03-15 | 2019-10-08 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
WO2014139022A1 (en) | 2013-03-15 | 2014-09-18 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
JP2016517287A (en) * | 2013-03-15 | 2016-06-16 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
EP2967292A4 (en) * | 2013-03-15 | 2017-03-01 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
CN105208958A (en) * | 2013-03-15 | 2015-12-30 | Synaptive Medical (Barbados) Inc. | Systems and methods for navigation and simulation of minimally invasive therapy |
US9489738B2 (en) * | 2013-04-26 | 2016-11-08 | Navigate Surgical Technologies, Inc. | System and method for tracking non-visible structure of a body with multi-element fiducial |
US20140320600A1 (en) * | 2013-04-26 | 2014-10-30 | Navigate Surgical Technologies, Inc. | System and method for tracking non-visible structure of a body |
US9844413B2 (en) | 2013-04-26 | 2017-12-19 | Navigate Surgical Technologies, Inc. | System and method for tracking non-visible structure of a body with multi-element fiducial |
US9456122B2 (en) | 2013-08-13 | 2016-09-27 | Navigate Surgical Technologies, Inc. | System and method for focusing imaging devices |
JP2016536076A (en) * | 2013-08-13 | 2016-11-24 | Navigate Surgical Technologies, Inc. | Method for determining the position and orientation of a fiducial reference |
WO2015075720A1 (en) * | 2013-11-21 | 2015-05-28 | Elbit Systems Ltd. | A medical optical tracking system |
CN105916462A (en) * | 2013-11-21 | 2016-08-31 | Elbit Systems Ltd. | A medical optical tracking system |
US10806549B2 (en) * | 2014-01-27 | 2020-10-20 | Align Technology, Inc. | Image registration of intraoral images using adhesive objects |
US11793610B2 (en) * | 2014-01-27 | 2023-10-24 | Align Technology, Inc. | Image registration of intraoral images using non-rigid indicia |
US20190008617A1 (en) * | 2014-01-27 | 2019-01-10 | Align Technology, Inc. | Image registration of intraoral images using adhesive objects |
US20150209118A1 (en) * | 2014-01-27 | 2015-07-30 | Align Technology, Inc. | Adhesive objects for improving image registration of intraoral images |
US20200405457A1 (en) * | 2014-01-27 | 2020-12-31 | Align Technology, Inc. | Image registration of intraoral images using non-rigid indicia |
US10111714B2 (en) * | 2014-01-27 | 2018-10-30 | Align Technology, Inc. | Adhesive objects for improving image registration of intraoral images |
US20220241013A1 (en) * | 2014-03-28 | 2022-08-04 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional visualization of instruments in a field of view |
US10847263B2 (en) | 2014-04-04 | 2020-11-24 | International Business Machines Corporation | Head mounted video and touch detection for healthcare facility hygiene |
US10734108B2 (en) | 2014-04-04 | 2020-08-04 | International Business Machines Corporation | Head mounted video and touch detection for healthcare facility hygiene |
US9659367B2 (en) | 2014-04-04 | 2017-05-23 | International Business Machines Corporation | Head mounted video and touch detection for healthcare facility hygiene |
US10705513B2 (en) * | 2014-06-18 | 2020-07-07 | Airbus Operations (S.A.S.) | Computer-assisted methods of quality control and corresponding quality control systems |
US20150367961A1 (en) * | 2014-06-18 | 2015-12-24 | Airbus Operations (S.A.S.) | Computer-assisted methods of quality control and corresponding quality control systems |
US10507063B2 (en) * | 2014-11-21 | 2019-12-17 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
US20170245945A1 (en) * | 2014-11-21 | 2017-08-31 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
US11750794B2 (en) | 2015-03-24 | 2023-09-05 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
US20170119474A1 (en) * | 2015-10-28 | 2017-05-04 | Endochoice, Inc. | Device and Method for Tracking the Position of an Endoscope within a Patient's Body |
WO2017075085A1 (en) * | 2015-10-28 | 2017-05-04 | Endochoice, Inc. | Device and method for tracking the position of an endoscope within a patient's body |
US11529197B2 (en) * | 2015-10-28 | 2022-12-20 | Endochoice, Inc. | Device and method for tracking the position of an endoscope within a patient's body |
US11103315B2 (en) | 2015-12-31 | 2021-08-31 | Stryker Corporation | Systems and methods of merging localization and vision data for object avoidance |
US11806089B2 (en) | 2015-12-31 | 2023-11-07 | Stryker Corporation | Merging localization and vision data for robotic control |
WO2018055950A1 (en) * | 2016-09-23 | 2018-03-29 | Sony Corporation | Control device, control method, and medical system |
US11419696B2 (en) * | 2016-09-23 | 2022-08-23 | Sony Corporation | Control device, control method, and medical system |
CN109765992A (en) * | 2017-11-09 | 2019-05-17 | The Boeing Company | Systems, methods, and tools for spatially registering virtual content with a physical environment |
US11103314B2 (en) * | 2017-11-24 | 2021-08-31 | Synaptive Medical Inc. | Methods and devices for tracking objects by surgical navigation systems |
CN107970060A (en) * | 2018-01-11 | 2018-05-01 | Shanghai United Imaging Healthcare Co., Ltd. | Surgical robot system and its control method |
JP2021519186A (en) * | 2018-03-30 | 2021-08-10 | Koninklijke Philips N.V. | Surveillance of moving objects in the operating room |
EP3610771A1 (en) * | 2018-08-16 | 2020-02-19 | Fujifilm Corporation | Control device for an endoscope with means for detecting a pattern and for identifying whether the endoscope is in a non-use state or in a use state and an associated method and a program |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
WO2020109903A1 (en) * | 2018-11-26 | 2020-06-04 | Augmedics Ltd. | Tracking system for image-guided surgery |
CN110464462A (en) * | 2019-08-29 | 2019-11-19 | University of Science and Technology of China | Image-guided registration system and related apparatus for interventional abdominal surgery |
US11801115B2 (en) | 2019-12-22 | 2023-10-31 | Augmedics Ltd. | Mirroring in image guided surgery |
EP3922203A1 (en) * | 2020-06-09 | 2021-12-15 | Globus Medical, Inc. | Surgical object tracking in visible light via fiducial seeding and synthetic image registration |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
Also Published As
Publication number | Publication date |
---|---|
US6675040B1 (en) | 2004-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6675040B1 (en) | Optical object tracking system | |
EP1415609A1 (en) | Optical object tracking system | |
US6405072B1 (en) | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus | |
CA2973606C (en) | Optical targeting and visualization of trajectories | |
EP1219259B1 (en) | System for locating relative positions of objects | |
US8988505B2 (en) | Imaging system using markers | |
US6146390A (en) | Apparatus and method for photogrammetric surgical localization | |
US6006126A (en) | System and method for stereotactic registration of image scan data | |
US6529758B2 (en) | Method and apparatus for volumetric image navigation | |
US6187018B1 (en) | Auto positioner | |
US20030210812A1 (en) | Apparatus and method for surgical navigation | |
EP2438880A1 (en) | Image projection system for projecting image on the surface of an object | |
CN111936074A (en) | Monitoring moving objects in an operating room | |
Nathoo et al. | SURGICAL NAVIGATION SYSTEM TECHNOLOGIES |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: INTEGRA RADIONICS, INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TYCO HEALTHCARE GROUP LP;SHERWOOD SERVICES AG;REEL/FRAME:018515/0942 Effective date: 20060217 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |