CA2161126C - System for locating relative positions of objects - Google Patents
System for locating relative positions of objects
- Publication number
- CA2161126C
- Authority
- CA
- Canada
- Prior art keywords
- radiation
- orientation
- coordinate system
- present time
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0073—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by tomography, i.e. reconstruction of 3D images from 2D projections
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
- A61B5/0064—Body surface scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1077—Measuring of profiles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/12—Devices for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/50—Clinical applications
- A61B6/501—Clinical applications involving diagnosis of head, e.g. neuroimaging, craniography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/56—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
- A61B17/58—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like
- A61B17/68—Internal fixation devices, including fasteners and spinal fixators, even if a part thereof projects from the skin
- A61B17/70—Spinal positioners or stabilisers ; Bone stabilisers comprising fluid filler in an implant
- A61B17/7074—Tools specially adapted for spinal fixation operations other than for bone removal or filler handling
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3925—Markers, e.g. radio-opaque or breast lesions markers ultrasonic
- A61B2090/3929—Active markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3945—Active visible markers, e.g. light emitting diodes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3954—Markers, e.g. radio-opaque or breast lesions markers magnetic, e.g. NMR or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/397—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
- A61B2090/3975—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
- A61B2090/3979—Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active infrared
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/14—Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
Abstract
This system locates and displays the relative positions of one moveable object (12) with respect to a second moveable object (11) in a three-dimensional space. The locations of at least two reference points (14, 16) on the first object (12) and at least three reference points (73-75) on the second object (11) are measurable by position sensors (20, 22, 24) in a predetermined fixed coordinate system (80). From the locations of the reference points (14, 16, 73-75), a computer (36) determines the positions and orientations of the two objects (11, 12) in the fixed coordinate system (80). From the positions and orientations of the two objects (11, 12), the computer (36) further determines the relative position and orientation of the first object (12) with respect to a predetermined local coordinate system in a fixed relationship with the second object (11). The computer (36) causes the display of a graphical representation of the position and orientation of the first object (12) on a graphical model of previously taken image data of the second object (11).
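The coordinate handling the abstract describes — measuring reference points in a fixed coordinate system, then re-expressing the first object's position in a local frame attached to the second object — can be sketched as follows. This is a minimal illustration under assumed conventions (local x-axis along the first two reference points, z-axis normal to the reference-point plane); the function names and the frame construction are illustrative assumptions, not the patent's exact method.

```python
# Sketch: build a local coordinate frame from three reference points
# measured on the second (reference) object, then express a measured
# point on the first object in that local frame.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = dot(v, v) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

def local_frame(p1, p2, p3):
    """Orthonormal frame from three non-collinear reference points (assumed convention)."""
    x = normalize(sub(p2, p1))            # local x-axis along p1 -> p2
    z = normalize(cross(x, sub(p3, p1)))  # local z-axis normal to the reference plane
    y = cross(z, x)                       # completes a right-handed frame
    return p1, x, y, z

def to_local(point, frame):
    """Express a world-coordinate point in the local frame (origin, x, y, z)."""
    origin, x, y, z = frame
    d = sub(point, origin)
    return (dot(d, x), dot(d, y), dot(d, z))
```

For example, with reference points at (5, 5, 0), (5, 6, 0), (4, 5, 0) and a probe tip measured at (5, 7, 0) in world coordinates, `to_local` reports the tip at (2, 0, 0): two units along the local x-axis, regardless of how the reference object is positioned in the room.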
Description
SYSTEM FOR LOCATING RELATIVE POSITION OF OBJECTS
FIELD OF THE INVENTION
This invention relates to locating the position of one object relative to another object in a three-dimensional space. It more particularly refers to a system for locating and displaying the position and the orientation of a first moving object relative to a second moving object in three dimensional space.
Description of Prior Art
Computed tomography (CT), magnetic resonance imaging (MRI), and other methods provide important detailed images of the internals of human medical patients which are useful for diagnostic purposes. Often, these diagnostic tools, which are used to determine and display images of the inside of patients' bodies, were used, and the images they create were taken, at times other than during actual surgical work on the patients, that is before and/or sometimes after surgery. Specifically, it is usual for CT or MRI scans to be taken before the surgeon starts his work. They are diagnostic tools, not tools for following an operation in progress. These prior scans are then commonly used to plan the surgery, or at least to assist the surgeon in deciding what surgical course of action should be initiated and then followed.
Sometimes, they are also used after surgery to determine and evaluate the results of the surgical procedure.
If surgery is in areas of the body which are not readily visible to the surgeon, such as inside the cranium within the recesses of the brain, it is difficult, if not impossible, to follow the course of the surgery while it is going on. It is presently impossible to conduct surgery while simultaneously taking an MRI. Therefore, the surgeon is always working with previously taken images of the internal structures of the patient.
While working with these previously taken images during surgery, there often is no obvious, clear-cut relationship between points of interest in the diagnostic images and the corresponding points on the actual patient.
While anomalous tissue may be obviously distinct from normal healthy tissue in the images, the difference may not be so visible in the patient on the operating table.
Furthermore, in intracranial surgery, the region of interest may not always be accessible to direct view.
Thus there exists a need for apparatus and systems to help a surgeon relate locations in the diagnostic images to the corresponding locations in the actual anatomy of the patient, and vice versa.
The related prior art includes references which disclose subject matter describing means to accomplish objects similar to those of the present invention as a whole. These references include publications describing the correlation of previously taken internal medical images of a patient, which are usually three-dimensional, with the corresponding actual, present time physical locations on and in the patient in the operating room during surgery. U.S. Patent 4,791,934 describes a semi-automated system which makes this correlation, but it should be noted that the system described in this patent also requires additional radiographic imaging to be accomplished in the operating room at the time of surgery, and it then also requires that these present time images be correlated into the coordinate systems of the previously taken diagnostic images so that they can be related to the live patient.
Furthermore, the system of this '934 patent uses a computer-driven robot arm to position a surgical tool in relation to these images. It does not measure and define the present time location and orientation of an input probe (a surgical tool) positioned interactively by the surgeon, and superimpose such present time data on a previously taken image.
There have been other attempts to solve the three-dimensional localization problem specifically for stereotactic surgery. One class of solutions has been the use of a variety of mechanical frames, holders, or protractors for surgery (usually intracranial surgery).
For examples see U.S. Patents 4,931,056, 4,875,478, 4,841,967, 4,809,694, 4,805,615, 4,723,544, 4,706,665, 4,651,732, and 4,638,798. Generally these patents disclose systems which are intended to reproduce angles derived from the analysis of internal images, and most require rigidly screwing a frame to the skull. In any case, these methods are all inconvenient, time-consuming, and prone to human error.
A more interactive method is disclosed in United States patent 4,750,487. This patent discloses the use of present time fluoroscopy in the operating room to help guide surgical tools. It will be abundantly clear that subjecting a patient to fluoroscopy is undesirable at any time. It may impose even greater risks during surgery.
More relevant prior art are the publications which describe systems which have been built specifically for stereotactic surgery. The following reference is illustrative and pertinent:
David W. Roberts, M.D., et al.; "A Frameless Stereotaxic Integration of Computerized Tomographic Imaging and the Operating Microscope", J. Neurosurgery 65, Oct. 1986.
This reference reports on the use of a three-dimensional digitizer to track the position and orientation of the field of view of a surgical microscope and means to superimpose, on the field of view in the microscope, the corresponding internal planar slice of a previously obtained computed tomographic (CT) image. Such a system was a great step forward in this art. It was, however, not without attendant disadvantages.
In the first place, the system described by Roberts et al. used the transmission of sound impulses to communicate the location and orientation of the field of the surgical microscope. The use of sound is necessarily based on the determination of transmission distances as a function of the speed of sound, and the differentiation of small distance differences. It is well known that the speed of sound varies to a substantial extent as a function of the temperature in the medium through which the sound travels. With modern air conditioning and heating, there are many different thermoclines and air currents present in an operating room, which are of little concern to a patient or to the doctors, but which can materially affect the accurate measurement of precise distances as a function of the speed of sound transmission. Therefore very accurate compensation factors must be applied in order to get accurate information on the exact location and orientation of the operating microscope. It will be appreciated that small errors in this field can have very serious consequences to the patient.
It must be remembered that the purpose of measuring distances from the sound emitters to the receivers is to determine precisely where the point of a surgical probe, or other device, is. The point may move only a very short distance, millimeters or even less, and so the position of the sound emitter on the probe may change by only a very small amount. This mensuration system must be able to detect such small changes in position and orientation of the sound emitters, and to translate these small changes into meaningful changes in the defined position and orientation of the point of the probe. Thus the accuracy of measuring and knowing the speed of sound in the medium of the operating room is critical to locating the point of the probe with any accuracy and cross correlating this location with a previously taken image which corresponds to the location of the probe tip. It should therefore be clear that the major disadvantage of this reported sonic system is the inherent inaccuracy and instability of the sonic mensuration apparatus.
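The scale of the temperature sensitivity discussed above can be illustrated with the common linear approximation for the speed of sound in air, c(T) ≈ 331.3 + 0.606·T m/s (T in degrees Celsius). The scenario below — a 1 m path and a 3 °C thermocline — is an assumed example chosen to show the magnitude of the error, not a figure from the patent.

```python
# Hedged illustration: how a temperature mismatch distorts a sonic
# distance measurement.

def speed_of_sound(temp_c):
    """Approximate speed of sound in air (m/s) at temperature temp_c (deg C)."""
    return 331.3 + 0.606 * temp_c

def inferred_distance(true_distance_m, actual_temp_c, assumed_temp_c):
    """Distance a sonic ranger reports when calibrated for the wrong temperature."""
    time_of_flight = true_distance_m / speed_of_sound(actual_temp_c)
    return time_of_flight * speed_of_sound(assumed_temp_c)

# A 3 deg C difference over a 1 m path already shifts the reading by
# roughly 5 mm -- well above the millimeter precision surgery demands.
error_m = inferred_distance(1.0, 23.0, 20.0) - 1.0
```

This is why the compensation factors mentioned above must be so precise: an uncorrected thermocline of only a few degrees produces an error larger than the probe-tip motions the system is supposed to resolve.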
In addition to the difficulty of making accurate and precise distance measurements using sound transmission, there was another difficulty and uncertainty which was inherent in the Roberts et al. system regardless of its use of sound sensing means. The Roberts et al. system relied on the position of the patient, at least so much of the patient as corresponded to the area being operated on, that is, for example, the head, being absolutely fixed. That system was not capable of determining changes in the location of the patient relative to the operating microscope, but tracked only the location of the microscope independent of the patient. The location and orientation of the patient (the head of the patient) was determined at the start of the operation, and the previously taken CT scan was related to that specific position and orientation. If this position and/or orientation of the patient was changed, intentionally or even inadvertently, during the operation, the system had to be re-correlated. That is, the operation had to be stopped and the previously taken CT scan had to be correlated with the new position and/or orientation of the patient's head. Clearly this was a major disadvantage of the Roberts et al. system. Not only did it require that the system be re-correlated whenever the surgeon intentionally moved the patient's head, but it was also not capable of taking into consideration inadvertent movements of the patient's head during the operation.
The present invention does not include the taking of suitable images of the internals of a patient before the operation. It starts from the point at which these images have already been taken and are found to be acceptable by the surgeon. This invention therefore does not comprise the imaging apparatus used to generate the internal three-dimensional image or model of the internals of the patient or other object. However, this invention does use these previous imaging data and inputs this information into the instant system. The system of this invention does not include the means of taking these images, but it does include the images themselves, preferably in electronic form.
It is contemplated that such known imaging devices might be ultrasound, computed tomography (CT) or magnetic resonance imaging (MRI). It is also contemplated that such imaging device might be one which has as yet not been developed. The important limiting factor with respect to the imager is that the data generated by that imager must be available in an electronic digital format, or be readily convertible into such format. The digital data may be derived directly from such an imager and transmitted to the instant system over a conventional communication network or through magnetic tape or disk media.
There is another area of prior art references which may be applicable to the patentability of the instant invention. This prior art is related specifically to localizing devices, which measure the relative positions of a manually maneuvered probe and another object, but not necessarily applied to "seeing" the present time position of the probe internal to a patient.
Previous methods and devices have been utilized to sense the position of an object in three-dimensional space.
These known techniques employ various methods of mensuration.
Numerous three-dimensional mensuration methods project a thin beam or a plane of light onto an object and optically sense where the light intersects the object.
Examples of several United States patents which disclose such simple distance range-finding devices using this general approach are: U.S. patents 4,660,970, 4,701,049, 4,705,395, 4,709,156, 4,733,969, 4,743,770, 4,753,528, 4,761,072, 4,764,016, 4,782,239, and 4,825,091. Examples of United States patents which disclose using the plane of light to sense an object's shape include: U.S. patents 4,821,200, 4,701,047, 4,705,401, 4,737,032, 4,745,290, 4,794,262, 4,821,200, 4,743,771, and 4,822,163. In the latter, the accuracy of determining the location and orientation of the surface sample points is limited by the typically low resolution of two-dimensional sensors which have usually been employed (currently the accuracy of these devices is about 1 part in 512 for a solid state video camera). Furthermore, these devices are not presently capable of detecting the location and orientation of the tip of a probe whereby the tip of the probe identified is coincident with specific points, and identifying the location of such probe tip is synonymous with identifying such specific point. Additionally, optical systems are traditionally limited by line-of-sight considerations. If you cannot see the point in question, light cannot be directly impinged on that point or projected from it. Because of these inherent limitations, these known devices have been generally useless for locating a point within a recess, which is necessary for intracranial surgery.
The internal imaging devices themselves (such as computed tomography, magnetic resonance imaging, or ultrasonic imaging) are unsuited for tracking the spatial location and orientation of a manually held probe during an operation, even though they are unencumbered by line-of-sight restrictions. Thus, these systems are not capable of being used to previously record an image, or a set of images, of the internals of a patient, and also to image these same internals in present time during an operation.
Other prior art methods and apparatus are known which track the position of one or more specific moveable points in three-dimensional space. In these known techniques, the moveable points are generally represented by small radiating emitters which move relative to fixed
Patent 3,821,469), and magnetic fields (U.S. Patent 3,983,474). Other methods include mechanical arms or cables (U.S. Patent 4,779,212). Some electro-optical approaches use a pair of video cameras plus a computer to calculate the position of homologous points in a pair of stereographic video images (for example, U.S. Patents 4,836,778 or 4,829,373). The points of interest may be passive reflectors or flashing light emitters. The use of light emitters tend to simplify finding, distinguishing, and calculating the location and orientation of the points.
Probes with a pointing tip and sonic localizing emitters on them have been publicly marketed for several years. The instant invention is also concerned with determining the location and orientation of a stylus, but it is an improvement over the known devices in that it employs tiny light emitters, in place of the known sound emitters. Further, as will become apparent, the method used to sense the positions of these light emitters is different from what has been used in connection with sound.
Additional prior art related to the instant invention is found in these references:
Fuchs, H.; Duran, J.; Johnson, B.; "Acquisition and Modeling of Human Body Form Data", Proc. SPIE, v 166, (1978), p 94-102.
Mesqui, F.; Kaeser, F.; Fischer, P.; "Real-time, Non-invasive Recording and 3-d Display of the Functional Movements of an Arbitrary Mandible Point", SPIE Biostereometrics 602, 1985, p 77-84.
Yamashita, Y.; Suzuki, N.; Oshima, M.; "Three-Dimensional Stereometric Measurement System Using Optical Scanners, Cylindrical Lenses, and Line Sensors", Proc. SPIE, v. 361, 1983, p. 67-73.
FIELD OF THE INVENTION
This invention relates to locating the position of one object relative to another object in a three-dimensional space. It more particularly refers to a system for locating and displaying the position and the orientation of a first moving object relative to a second moving object in three dimensional space.
Description of Prior Art Computed tomography (CT), magnetic resonance imaging (MRI), and other methods provide important detailed images of the internals of human medical patients which are useful for diagnostic purposes. Often, these diagnostic tools, which are used to determine and display images of the inside of patients' bodies, were used, and the images they create were taken at times other than during actual surgical work on the patients, that is before andjor sometimes after surgery. Specifically, it is usual for CT or MRI scans to be taken before the surgeon starts his work. They are diagnostic tools, not tools for following an operation in progress. These.prior scans are then commonly used to plan the surgery or at least to assist the surgeon in deciding what surgical course of action should be initiated and then followed.
Sometimes, they are also used after surgery to determine and evaluate the results of the surgical procedure.
If surgery is in areas of the body which are not readily visible to the surgeon, such as inside the cranium within the recesses of the brain, it is difficult, if not impossible, to follow the course of the surgery while it is going on. It is presently impossible to conduct surgery while simultaneously taking an MRI. Therefore, the surgeon is always working with previously taken images of the internal structures of the patient.
While working with these previously taken images during surgery, there often is no obvious, clear-cut relationship between points of interest in the diagnostic images and the corresponding points on the actual patient.
While anomalous tissue may be obviously distinct from normal healthy tissue in the images, the difference may not be so visible in the patient on the operating table.
Furthermore, in intracranial surgery, the region of interest may not always be accessible to direct view.
Thus there exists a need for apparatus and systems to help a surgeon relate locations in the diagnostic images to the corresponding locations in the actual anatomy of the patient, and vice versa.
The related prior art includes references which disclose means to accomplish objects similar to those of the present invention as a whole. These references include publications describing the correlation of previously taken internal medical images of a patient, which are usually three-dimensional, with the corresponding actual, present time physical locations on and in the patient in the operating room during surgery. U.S. Patent 4,791,934 describes a semi-automated system which makes this correlation, but it should be noted that the system described in this patent also requires additional radiographic imaging to be performed in the operating room at the time of surgery, and it then also requires that these present time images be correlated into the coordinate systems of the previously taken diagnostic images so that they can be related to the live patient.
Furthermore, the system of this '934 patent uses a computer-driven robot arm to position a surgical tool in relation to these images. It does not measure and define the present time location and orientation of an input probe (a surgical tool) positioned interactively by the surgeon, and superimpose such present time data on a previously taken image.
There have been other attempts to solve the three-dimensional localization problem specifically for stereotactic surgery. One class of solutions has been the use of a variety of mechanical frames, holders, or protractors for surgery (usually intracranial surgery).
For examples, see U.S. Patents 4,931,056, 4,875,478, 4,841,967, 4,809,694, 4,805,615, 4,723,544, 4,706,665, 4,651,732, and 4,638,798. Generally these patents disclose systems which are intended to reproduce angles derived from the analysis of internal images, and most require rigidly screwing a frame to the skull. In any case, these methods are all inconvenient, time-consuming, and prone to human error.
A more interactive method is disclosed in United States patent 4,750,487. This patent discloses the use of present time fluoroscopy in the operating room to help guide surgical tools. It will be abundantly clear that subjecting a patient to fluoroscopy is undesirable at any time. It may impose even greater risks during surgery.
More relevant prior art are the publications which describe systems which have been built specifically for stereotactic surgery. The following reference is illustrative and pertinent:
David W. Roberts, M.D., et al.; "A Frameless Stereotaxic Integration of Computerized Tomographic Imaging and the Operating Microscope", J. Neurosurgery 65, Oct. 1986.
This reference reports on the use of a three-dimensional digitizer to track the position and orientation of the field of view of a surgical microscope and means to superimpose, on the field of view in the microscope, the corresponding internal planar slice of a previously obtained computed tomographic (CT) image. Such a system was a great step forward in this art. It was, however, not without attendant disadvantages.
In the first place, the system described by Roberts et al. used the transmission of sound impulses to communicate the location and orientation of the field of the surgical microscope. The use of sound is necessarily based on the determination of transmission distances as a function of the speed of sound, and the differentiation of small distance differences. It is well known that the speed of sound varies to a substantial extent as a function of the temperature of the medium through which the sound travels. With modern air conditioning and heating, there are many different thermoclines and air currents present in an operating room, which are of little concern to a patient or to the doctors, but which can materially affect the accurate measurement of precise distances as a function of the speed of sound transmission. Therefore, very accurate compensation factors must be applied in order to get accurate information on the exact location and orientation of the operating microscope. It will be appreciated that small errors in this field can have very serious consequences to the patient.
It must be remembered that the purpose of measuring distances from the sound emitters to the receivers is to determine precisely where the point of a surgical probe, or other device, is. The point may move only a very short distance, millimeters or even less, and so the position of the sound emitter on the probe may change by only a very small amount. This mensuration system must be able to detect such small changes in position and orientation of the sound emitters, and to translate these small changes into meaningful changes in the defined position and orientation of the point of the probe. Thus the accuracy of measuring and knowing the speed of sound in the medium of the operating room is critical to locating the point of the probe with any accuracy and cross-correlating this location with a previously taken image which corresponds to the location of the probe tip. It should therefore be clear that the major disadvantage of this reported sonic system is the inherent inaccuracy and instability of the sonic mensuration apparatus.
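The scale of this sonic error can be illustrated with a short sketch. This is based on the standard approximation for the speed of sound in air, not on figures from the patent or from Roberts et al.; the function names are invented for illustration:

```python
# A sketch (standard physics, not figures from the patent): how a temperature
# mismatch propagates into a sonic range error. The speed of sound in dry air
# is approximately c(T) = 331.3 + 0.606*T m/s, with T in degrees Celsius.

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air at temp_c degrees Celsius (m/s)."""
    return 331.3 + 0.606 * temp_c

def range_error_mm(true_dist_m, assumed_temp_c, actual_temp_c):
    """Error (mm) when a sonic rangefinder assumes the wrong air temperature."""
    time_of_flight = true_dist_m / speed_of_sound(actual_temp_c)
    inferred_dist = time_of_flight * speed_of_sound(assumed_temp_c)
    return (inferred_dist - true_dist_m) * 1000.0

# A 2 m sound path through air 3 degrees C warmer than the system assumes
# yields roughly a centimetre of error, far too large for stereotactic work:
err = range_error_mm(2.0, assumed_temp_c=20.0, actual_temp_c=23.0)
```

Since the speed of sound shifts by about 0.18 percent per degree Celsius, even a small thermocline along the transmission path produces millimetre-level errors over operating-room distances, which motivates the compensation factors discussed above.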
In addition to the difficulty of making accurate and precise distance measurements using sound transmission, there was another difficulty and uncertainty inherent in the Roberts et al. system, regardless of its use of sound sensing means. The Roberts et al. system relied on the position of the patient, or at least so much of the patient as corresponded to the area being operated on (for example, the head), being absolutely fixed. That system was not capable of determining changes in the location of the patient; it tracked only the location of the microscope, independent of the patient. The location and orientation of the patient (the head of the patient) was determined at the start of the operation, and the previously taken CT scan was related to that specific position and orientation. If this position and/or orientation of the patient was changed, intentionally or even inadvertently, at any time during the operation, the system had to be re-correlated. That is, the operation had to be stopped and the previously taken CT scan had to be correlated with the new position and/or orientation of the patient's head. Clearly this was a major disadvantage of the Roberts et al. system. Not only did it require that the system be re-correlated whenever the surgeon intentionally moved the patient's head, but it was also not capable of taking into consideration inadvertent movements of the patient's head during the operation.
The present invention does not include the taking of suitable images of the internals of a patient before the operation. It starts from the point at which these images have already been taken and are found to be acceptable by the surgeon. This invention therefore does not comprise the imaging apparatus used to generate the internal three-dimensional image or model of the internals of the patient or other object. However, this invention does use these previous imaging data and inputs this information into the instant system. The system of this invention does not include the means of taking these images, but it does include the images themselves, preferably in electronic form.
It is contemplated that such known imaging devices might be ultrasound, computed tomography (CT), or magnetic resonance imaging (MRI). It is also contemplated that such imaging device might be one which has as yet not been developed. The important limiting factor as to the imager is that the data generated by that imager must be available in an electronic digital format, or be readily convertible into such a format. The digital data may be derived directly from such an imager and transmitted to the instant system over a conventional communication network or through magnetic tape or disk media.
There is another area of prior art references which may be applicable to the patentability of the instant invention. This prior art is related specifically to localizing devices, which measure the relative positions of a manually maneuvered probe and another object, but not necessarily applied to "seeing" the present time position of the probe internal to a patient.
Previous methods and devices have been utilized to sense the position of an object in three-dimensional space.
These known techniques employ various methods of mensuration.
Numerous three-dimensional mensuration methods project a thin beam or a plane of light onto an object and optically sense where the light intersects the object.
Examples of several United States patents which disclose such simple distance range-finding devices using this general approach are: U.S. Patents 4,660,970, 4,701,049, 4,705,395, 4,709,156, 4,733,969, 4,743,770, 4,753,528, 4,761,072, 4,764,016, 4,782,239, and 4,825,091. Examples of United States patents which disclose using a plane of light to sense an object's shape include: U.S. Patents 4,821,200, 4,701,047, 4,705,401, 4,737,032, 4,745,290, 4,794,262, 4,743,771, and 4,822,163. In the latter group, the accuracy of determining the location and orientation of the surface sample points is limited by the typically low resolution of the two-dimensional sensors which have usually been employed (currently the accuracy of these devices is about 1 part in 512 for a solid state video camera). Furthermore, these devices are not presently capable of detecting the location and orientation of the tip of a probe placed coincident with a specific point, whereby identifying the location of the probe tip is synonymous with identifying that specific point. Additionally, optical systems are traditionally limited by line-of-sight considerations: if a point cannot be seen, light cannot be directly impinged on it or projected from it. Because of these inherent limitations, these known devices have been generally useless for locating a point within a recess, which is necessary for intracranial surgery.
The internal imaging devices themselves (such as computed tomography, magnetic resonance imaging, or ultrasonic imaging) are unsuited for tracking the spatial location and orientation of a manually held probe during an operation, even though they are unencumbered by line-of-sight restrictions. Thus, these systems are not capable of being used to previously record an image, or a set of images, of the internals of a patient, and also to image these same internals in present time during an operation.
Other prior art methods and apparatus are known which track the position of one or more specific moveable points in three-dimensional space. In these known techniques, the moveable points are generally represented by small radiating emitters which move relative to fixed position sensors. Some methods interchange the roles of the emitters and sensors. The typical forms of radiation are light (U.S. Patent 4,836,778 for example), sound (U.S. Patent 3,821,469), and magnetic fields (U.S. Patent 3,983,474). Other methods include mechanical arms or cables (U.S. Patent 4,779,212). Some electro-optical approaches use a pair of video cameras plus a computer to calculate the position of homologous points in a pair of stereographic video images (for example, U.S. Patents 4,836,778 or 4,829,373). The points of interest may be passive reflectors or flashing light emitters. The use of light emitters tends to simplify finding, distinguishing, and calculating the location and orientation of the points.
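The stereographic approach mentioned above can be sketched in a few lines. This is a hypothetical minimal example, with the camera geometry and function names invented for illustration and not drawn from any of the cited patents: each calibrated camera supplies a ray toward the flashing emitter, and the emitter's position is estimated as the midpoint of the shortest segment between the two rays:

```python
# Hypothetical sketch of stereographic localization: each camera contributes a
# ray (origin o_i, direction d_i toward the emitter); the emitter position is
# taken as the midpoint of the shortest segment between the two rays.

import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint triangulation of two (nearly) intersecting rays."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimising |(o1 + t1*d1) - (o2 + t2*d2)|.
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Two cameras 1 m apart on the x-axis, both observing an emitter at (0.5, 0, 2):
emitter = np.array([0.5, 0.0, 2.0])
o1 = np.array([0.0, 0.0, 0.0])
o2 = np.array([1.0, 0.0, 0.0])
estimate = triangulate(o1, emitter - o1, o2, emitter - o2)
```

With noisy camera observations the two rays no longer intersect exactly, which is why the midpoint of the closest-approach segment, rather than a true intersection, is used as the estimate.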
Probes with a pointing tip and sonic localizing emitters on them have been publicly marketed for several years. The instant invention is also concerned with determining the location and orientation of a stylus, but it is an improvement over the known devices in that it employs tiny light emitters, in place of the known sound emitters. Further, as will become apparent, the method used to sense the positions of these light emitters is different from what has been used in connection with sound.
Additional prior art related to the instant invention is found in these references:
Fuchs, H.; Duran, J.; Johnson, B.; "Acquisition and Modeling of Human Body Form Data", Proc. SPIE, v. 166, 1978, p. 94-102.
Mesqui, F.; Kaeser, F.; Fischer, P.; "Real-time, Non-invasive Recording and 3-d Display of the Functional Movements of an Arbitrary Mandible Point", Proc. SPIE Biostereometrics 602, 1985, p. 77-84.
Yamashita, Y.; Suzuki, N.; Oshima, M.; "Three-Dimensional Stereometric Measurement System Using Optical Scanners, Cylindrical Lenses, and Line Sensors", Proc. SPIE, v. 361, 1983, p. 67-73.
The paper by Fuchs, et al, (1978) best describes a method used to track a surgical probe in three-dimensional space. This method could possibly be used in conjunction with and to supplement the practice of the instant invention. It is based on using three or more one-dimensional sensors, each consisting of a cylindrical lens and a linear array of radiation detectors, such as a charge-coupled semiconductor device (CCD) or a differential-voltage position sensitive detector (PSD).
The sensors determine intersecting planes which all correspond to a single point radiating light emitter.
Calculation of the point of intersection of the planes gives the location of the emitter. The calculation is based on the locations, orientations, and other details concerning the one-dimensional sensors and is a straightforward application of analytic geometry. This photo-optical method, however, has not been previously used for the purpose of the present invention. In that sense, it is possible that the instant system could be considered to be a new and unobvious use of an existing system.
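The analytic-geometry calculation described above can be sketched as follows. This is an illustrative reconstruction, with interfaces assumed rather than taken from Fuchs et al.: each one-dimensional sensor constrains the emitter to a plane, and three non-degenerate planes intersect in the single point where the emitter lies:

```python
# Illustrative reconstruction (interfaces assumed, not from Fuchs et al.):
# each one-dimensional sensor reports a plane n_i . x = c_i containing the
# emitter; three such planes, if non-degenerate, meet in exactly one point.

import numpy as np

def emitter_position(normals, offsets):
    """Intersect three planes n_i . x = c_i by solving the 3x3 system N x = c."""
    n = np.asarray(normals, dtype=float)   # 3x3 matrix, one plane normal per row
    c = np.asarray(offsets, dtype=float)   # the three plane offsets
    return np.linalg.solve(n, c)           # raises if the planes are degenerate

# Three mutually perpendicular sensor planes placing an emitter at (1, 2, 3):
pos = emitter_position([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [1.0, 2.0, 3.0])
```

In practice the sensor planes are not mutually perpendicular; any three planes whose normals are linearly independent determine the emitter uniquely, and additional sensors can be folded in by least squares for redundancy.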
Thus, there still remains a need for a complete system (apparatus and method) which provides fast, accurate, safe, and convenient mensuration of the three-dimensional position and orientation of a manually operated probe relative to a moveable object of interest.
This system must also visually relate, in the same coordinate system, the relative position and orientation of the probe, even a portion of the probe, which is out of line of sight, to an image of a previously-generated three-dimensional model of the object.
Objects and Summary of the Invention

One objective of the present invention is to provide means for accurate three-dimensional mensuration of the relative position and orientation of a moveable member with respect to a moveable object.
Another object of this invention is to provide an accurate visual relationship between two objects which are each moveable with respect to each other as well as with respect to the coordinate system in which these movable objects reside.
A further object of this invention is to provide accurate spatial relationships between a moving probe and a moving surgical patient during an operation in an operating room, wherein the probe and the patient are moving relative to each other as well as relative to a fixed location and orientation of the mensuration apparatus.
A still further object of this invention is to provide an electro-optical mensuration system which is inexpensive, easy to use, reliable, and portable, and which employs a manually positioned probe, or other instrument, at least part of which is not within a line of sight of the surgeon, and which further employs a means of measuring the otherwise "invisible" position and orientation of the probe tip.
Another object of this invention is to provide a simple, non-invasive system for establishing a correspondence between a presently existing coordinate system containing a movable object and a previously obtained coordinate system containing a three-dimensional computer model of that same object.
Another object of this invention is to relate a measured location on the outside, or inside, of an object to its corresponding location in a previously generated computer model of that object by establishing correspondence between the coordinate systems of the object and the model.
Another object of this invention is to display a cut-away view or a planar cross-sectional slice of a previously generated computer model of an object, where the slice approximately intersects the location in the model corresponding to a location measured in present time, and to superimpose a marker on the displayed slice to indicate the location on the slice corresponding to the measured location.
Another object of this invention is to assist an operating surgeon in locating subcutaneous diseased tissue while avoiding healthy critical structures, especially in cranial neurosurgery.
Additional objects, advantages, and novel features of the invention shall be set forth, at least in part in the following description. These will also become apparent to those skilled in the art upon examination of the following discussion or may be learned by the practice of the invention. The objects and the advantages of the invention may be realized and attained by means of the instrumentalities and in the combinations particularly pointed out in the appended claims.
To achieve the foregoing and other objects, and in accordance with the invention, as embodied and broadly described herein, the relative position and orientation of two or more objects, moving independently of each other in space, is determined using the apparatus and the method of this invention. As used herein, a two moving-object set, which is a simple illustration of the system used in this invention, is defined as:
at least one first movable object, which may be a hand held probe having an invasive tip, for touching or for inserting into a second object;
at least one second movable object with respect to which the first object is movable, or is moving;
a present time coordinate system, that is a coordinate system which exists at the time of determining the spatial interrelationship of the several moving objects, which includes the second object, and in which said second object is moving, or can be moved;
a previously taken predetermined three-dimensional geometrical model, suitably a computer generated model, of the second object suitably provided in an electronically-accessible data base form;
a previous time coordinate system, which includes the previously taken computer model, that is, a coordinate system which was determined at the time that the data from which the computer generated model were taken;
means to relate the previous time coordinate system of the previously taken model of the second object to the present time coordinate system of the second object itself;
at least two first radiation emitters, mounted in spaced relation to each other on a portion of the first object which, during movement of said first object relative to said second object, are not obscured from receiver(s) adapted to receive the radiation emitted thereby;
at least three second, additional radiation emitters, differentiatable from each other and from the first radiation emitters, which are mounted in fixed relationship to, and are movable with, the second object and which are not obscured from receiver(s) adapted to receive the radiation emitted therefrom;
at least three radiation sensors, the positions of which are known with respect to a present time fixed coordinate system, which are so positioned that they can detect the positions of at least two of the first radiation emitters and at least three of the second radiation emitters;
means to differentiate between the first and second radiation emitters sufficient to be able to distinguish between the several emitters;
means to control the number of contacts between said first radiation emitters and said sensors per unit of time sufficient to track movement of said first object relative to said present time coordinate system in which said second object resides;
means to control the number of contacts between said second radiation emitters and said sensors per unit of time sufficient to track movement of said second object relative to the present time coordinate system in which it resides without interference with the radiation emitted from said first emitters;
computer means coupled to the first and second radiation emitters and to the radiation sensors adapted to receive data from the sensors and from the emitters;
means to calculate, based on the data received from said emitters and said sensors, the position and orientation of each radiation emitter with respect to said present time fixed coordinate system;
computer means adapted to independently determine the positions and orientations of the first object and the second object relative to the present time fixed coordinate system, thereby determining the position and orientation of said first object relative to said second object from the computed positions of the radiation emitters;
computer means adapted to, at a frequency which is sufficient to substantially continuously accurately relate the present time position and orientation of the first object to the position and orientation of the model of the second object, relate the present time position and orientation of the second object with the previously obtained model of the second object and to electronically cross correlate the model of said second object and the present time position of said second object so as to form one composite image of said model which is consistent with the present time position and orientation of said second object, and to superimpose the present time position of the first object on said consistent image in the present time coordinate system; and, preferably, display means coupled to the computer means adapted to display the location of the first object correctly superimposed on the model of the second object by correctly displaying a representation of the previously taken image of the second object which corresponds to the location and orientation of the second object in present time relative to the display of the first object in present time.
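The cross-correlation step enumerated above, in which the present time pose of the second object is related to its previously taken model so that the first object (the probe) can be superimposed on that model, can be sketched in code. This is a hypothetical illustration using a standard least-squares rigid registration (the Kabsch algorithm); the patent does not specify this particular method, and all function names here are invented:

```python
# Hypothetical sketch (not the patent's own method): the >= 3 emitters fixed
# to the second object define its present-time pose; a rigid transform
# (rotation r, translation t) estimated from those emitter positions maps any
# present-time point, e.g. the computed probe tip, into model coordinates.

import numpy as np

def rigid_transform(model_pts, present_pts):
    """Least-squares r, t with present = r @ model + t (Kabsch algorithm)."""
    a = np.asarray(model_pts, dtype=float)
    b = np.asarray(present_pts, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    h = (a - ca).T @ (b - cb)                       # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, cb - r @ ca

def to_model_coords(r, t, present_point):
    """Invert the transform: map a present-time point into model coordinates."""
    return r.T @ (np.asarray(present_point, dtype=float) - t)

# Emitter positions known in the previously taken model, and the same emitters
# observed in present time after the object rotates 90 degrees and translates:
model_fid = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
r_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
present_fid = model_fid @ r_true.T + np.array([5.0, 2.0, 1.0])
r, t = rigid_transform(model_fid, present_fid)
tip_in_model = to_model_coords(r, t, present_point=[5.0, 2.0, 1.0])
```

Re-estimating this transform at a sufficient frequency is what allows the composite image to remain consistent with the present time position and orientation of the second object even as it moves, without any re-correlation pause.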
Therefore, according to a first aspect of the present invention, there is provided a system for determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear radiation emitter means in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish between radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordination system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordination system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordination system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first and second objects, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
In accordance with another aspect of the invention, there is also provided a system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from said objects;
said first and second objects located within said fixed coordinate system;
at least three non-collinear fiducial markers in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers;
at least two spaced apart electromagnetic radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordination system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the positions and orientations of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said positions and orientations of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system.
In accordance with yet another aspect of the invention, there is further provided in a system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means, in known spacial relationship to said present time three-dimensional fixed coordinate system, which are spaced from each other and from said objects;
said first and second objects located within said fixed coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
at least three non-collinear fiducial markers in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers in said fixed coordinate system;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said fiducial markers on said second object;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said fiducial markers on said second object into said fixed coordinate system;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system;
the improvement which comprises said radiation being electromagnetic radiation.
In accordance with yet another aspect of the invention, there is further provided in a system for determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear fiducial marker means in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordination system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordination system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object; and means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
the improvement which comprises:
radiation emitter means associated in known spacial relationship to said fiducial markers;
means to determine the location of said fiducial markers on said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordination system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data.
The method of this invention relates to the operation of the above described apparatus. This method will be described in relation to "seeing" the location of a point of a surgical probe inside the cranium of a patient, where the inside of the patient's cranium has been previously "seen" on an MRI or a CT scan. In carrying out this method, both the head of the patient, the second object according to this invention, and the surgical probe, the first object according to this invention, will be moved in illustration of the novel operation of this invention.
The method of this invention, described relative to the above described apparatus, includes the steps of:
at a previous time, taking an MRI or a CT or the like, which may hereinbelow be referred to sometimes as the previous scan, through the patient's head with a sufficient number and location of slices as to reveal the internal structures of the patient's head. Where there is an abnormality, the number and location of the slices should be sufficient to depict that abnormality in at least one slice;
establishing and storing an electronic file of the scan in the form of a model of the internals of the patient's head, including the abnormality, if there is one;
relating that electronic file to a present time coordinate system;
at the present time, as opposed to the previous time when the scan was originally taken, at sufficiently frequent intervals to accurately follow present time movement, detecting the positions of at least three of the radiation emitters operatively associated with the patient's head (the patient's head is the second object in the generic description of this invention);
at the present time, at sufficiently frequent intervals to follow present time movement, computing, from the detected positions of these emitters, the moving locations and orientation of the second object relative to the predetermined fixed coordinate system;
electronically adjusting the stored model to display a view of that model which corresponds to the computed present time location and orientation of the moving second object in the same present time coordination system;
at the present time, at sufficiently frequent intervals to follow present time movement, detecting the locations of at least two of the radiation emitters operatively associated with the probe (the probe is the first object in the generic description of this invention);
at the present time, computing, from the detected positions of these emitters, at sufficiently frequent intervals to follow present time movement, the position and orientation of the first object in present time relative to the predetermined fixed coordinate system;
at sufficiently frequent intervals to track the movement of the first object relative to the second object, determining the positions and orientations of the first object relative to the second object by correlating the positions and orientations of the first object relative to the predetermined fixed coordinate system with the positions and orientations of the second object relative to the same predetermined fixed coordinate system;
determining the correlation between the relative position and orientation of the probe with respect to the model of the second object; and indicating the location of the probe with respect to the second object by displaying a representation of the position and orientation of the probe in present time on the presently displayed model of the previously taken scan of the internal structures of the cranium.
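The correlation step just described amounts to a coordinate transformation: the probe tip is measured in the fixed coordinate system, while the scan model lives in the head's local coordinate system. A minimal sketch of that step follows, assuming the head's pose has already been determined as a rotation matrix and translation vector; the function name is hypothetical, not from the patent.

```python
import numpy as np

def tip_in_model_coords(tip_fixed, R_head, t_head):
    """Express the probe tip, measured in the fixed coordinate system,
    in the head's local (model) coordinate system, assuming the head's
    pose satisfies fixed = R_head @ local + t_head."""
    return R_head.T @ (np.asarray(tip_fixed, float) - np.asarray(t_head, float))
```

With the tip expressed in model coordinates, the display step reduces to selecting and rendering the previously taken slice nearest the tip's model-space position.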
Therefore, according to one aspect of the invention there is provided a method of determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
defining a present time three-dimensional global fixed coordinate system;
providing at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
disposing said moveable first and second objects within said fixed coordinate system;
defining a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
providing at least three non-collinear radiation emitter means in known spacial relationship to said second object and at known coordinates in said local coordinate system;
taking, at a previous time, three-dimensional image data which geometrically describe said second object;
providing at least two spaced apart radiation emitter means disposed on said first object;
providing first movement means to move said first object relative to said second object and to said fixed coordinate system;
providing second movement means to move said second object relative to said first object and to said fixed coordinate system;
transmitting radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordination system;
distinguishing radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
independently determining the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordination system;
determining the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
independently determining the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordination system;
determining the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
determining the position and orientation of said second object in said local coordinate system;
orienting said previously taken three dimensional image data of said second object in said local coordination system such as to match the present time position and orientation of said second object with the previously taken said image data;
integrating the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
in present time, repeatedly determining the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and in present time, repeatedly determining the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeatedly determining and integrating the positions and orientations of said first and second objects, and correlating said position and orientation of said second object with said previously taken three dimensional image data, positioning an image representative of said first object correctly in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
The present invention also provides for the use of a system wherein said first object is a surgical probe and said second object is a surgical patient, said surgical probe being partially insertable within said patient such that said inserted portion is not visible to an inserter thereof.
The present invention further contemplates the use as above wherein at least two of said emitter means are disposable on so much of said first object as is not insertable into said patient, said at least two of said emitter means being disposed on said first object at a known distance from a tip of said probe which is adapted for insertion into said patient.
In the description of this invention, the radiation emitters have been located on the first and second objects, and the sensors for these radiations have been located in fixed positions within the present time coordination system. It is considered that this arrangement of these emitters and sensors could readily be reversed. That is, the emitters could occupy the fixed positions in the present time coordination system and the sensors could be located on the first and second objects.
The invention would operate in the same manner with either arrangement of sensors and emitters. For convenience, the description of this invention has placed the emitters on the moving objects and the sensors in fixed positions.
This arrangement should not be considered to be a limitation on either the apparatus or method of this invention.
Brief Description of the Drawings The accompanying drawing figures illustrate a preferred embodiment of the present invention and, together with the description, serve to explain the principles of the invention.
Figure 1A is a block flow diagram of the optical mensuration and correlation system of the present invention showing the major components of this system, except for the means to automatically measure the position and orientation of the movable and second object.
Figure 1B is similar to Fig. 1A but includes the additional radiation emitters which will permit automatically measuring the position and orientation of the second object even during the movement of this second object.
Figure 2 is a perspective view illustrating the invention in use by a surgeon performing intracranial surgery on a patient, and showing a cursor on a display screen that marks the corresponding position of the tip of the probe (the first object) within the image of previously obtained model data corresponding to the cranium (the second object).
Figure 3 is a view of a sample display showing a position of the tip of the probe superimposed on previously obtained model data of an inside slice of a cranium and the showing reference points of the second object, as depicted in figure 1A, as triangles on the patient's skull.
Figure 4 is a schematic perspective view of a sample of one of the one-dimensional photodetectors which are useful in the practice of the present invention.
Figure 5 is a graph of the image intensity (manifested as a voltage or current) versus locations on the photodetector surface for a typical light detector which could be used by the radiation supported mensuration and correlation apparatus of the present invention.
Figures 6 and 7 are diagrams of the major steps performed by the computer to calculate the position of the probe (first object) with respect to the model of the inside of the cranium (the second object) and to display a cross-sectional image slice of the model of the inside of the cranium on a suitable display screen, such as a computer screen (CRT).
Figure 8 is a schematic view of radiation beams depicting an embodiment of this invention.
Detailed Description of The Preferred Embodiments of This Invention The radiation mensuration and correlation apparatus 10 of the present invention, as applied to a medical application, which is illustrative of one use of this invention, is shown schematically in Figure 1A. It comprises a hand-held invasive probe 12 (first object) housing at least two radiation emitters 14 and 16 mounted collinear with one another and with the tip 18 of the probe 12. At least three remotely located, one-dimensional radiation sensors 20, 22, and 24 are mounted in fixed, spaced relationship to each other and are located at known positions with respect to a predetermined fixed coordinate system 80. The radiation sensors 20, 22, and 24 sense the radiation widely projected by the individual emitters 14 and 16 and generate electrical output signals from which are derived the locations of the probe emitters 14 and 16 and, consequently, the position and orientation of the probe tip 18 (which may not be visible to the surgeon because it is within the cranial cavity), with respect to the fixed coordinate system 80.
In addition, where it is desired to determine the moving position of the cranium during the operation, the three sensors 20, 22, and 24 can be programmed to sense and derive the locations of other reference emitters 70, 72, and 74 on the second object 11 (Figure 1B) in the same manner as for the probe emitters 14 and 16. The role of these reference emitters on the second object is to automate the calculation of the relationships between the present time coordinate system of the model's image 13 (Figure 2) of the second object, the coordinate system of the sensors, and the local coordinate system of the second object itself 11.
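One way that calculation of the relationship between coordinate systems can be automated is by fitting the rigid transform that relates the reference emitters' known local coordinates to their measured positions in the fixed coordinate system. The sketch below assumes three (or more) non-collinear corresponding points and uses a Kabsch-style SVD fit; the function name and this particular fitting method are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rigid_transform(local_pts, fixed_pts):
    """Kabsch-style fit: find rotation R and translation t such that
    fixed ≈ R @ local + t, for corresponding points given as rows."""
    local_pts = np.asarray(local_pts, float)
    fixed_pts = np.asarray(fixed_pts, float)
    cl, cf = local_pts.mean(axis=0), fixed_pts.mean(axis=0)
    H = (local_pts - cl).T @ (fixed_pts - cf)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection fit
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cf - R @ cl
    return R, t
```

Using more than three reference points makes the fit a least-squares estimate, which damps individual measurement noise.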
A control unit 30 connected to the moveable probe 12 via a data line 26 and coupled to the remotely located sensors 20, 22, and 24 via data lines 28, 32, and 34, respectively, synchronizes the sensing of the five (exemplary) emitters and the differentiation between them.
In that embodiment of this invention where the various emitters emit radiation in pulses, as by strobing them, the control unit is adapted to control the time multiplexing of the two emitters 14 and 16 on the probe and the three emitters 70, 72 and 74 on the cranium, to control the operation of the sensors 20, 22, and 24, and to receive differentiatable data from these sensors as will be more completely described below. A coordinate computer 36, coupled to the control unit 30 by a data line 38, calculates the three-dimensional spatial location of the probe emitters 14 and 16 and consequently the position and orientation of the probe tip 18, and correlates those positions with data from correlation information 42 and from a model 13 of the second object 11 which has been previously stored electronically in an electronically accessible database 40. Finally, the computer 36 causes an associated cathode ray tube monitor (CRT) to display the representation of the position and the orientation of the probe tip 18 with respect to the computer image 13 of the cranium 11 on display screen 44 (Figure 2) as will be more fully described below.
The probe 12 could be used without the cable 26, in that it could be coupled to the control unit 30 by employing distinctive modulation of the light emitters 14 and 16 instead of sequentially energizing (strobing) them, or by varying the wavelength or type of the radiation emitted therefrom. For example, the wave forms, color, or frequencies of each could be different. In the case of using sound radiation, the frequencies of the sound emitted by the different emitters could be varied so as to differentiate between them. The controller 30, by detecting the differences between different emitters, that is the wave form, color, frequency or other dissimilarity, of the emitted radiation, can determine to which emitter the sensors 20, 22, and 24 are reacting.
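As one hedged illustration of distinguishing emitters by distinctive modulation rather than strobing: each emitter could be driven at its own modulation frequency and identified from the spectrum of the sensed signal. The function below is a sketch only; its name and the FFT-based approach are assumptions for illustration, not the patent's specification.

```python
import numpy as np

def identify_emitter(signal, fs, emitter_freqs):
    """Return the index of the emitter whose assigned modulation frequency
    dominates the sensed signal (frequencies in Hz, sample rate fs)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    # magnitude at the spectral bin nearest each emitter's frequency
    bins = [np.argmin(np.abs(freqs - f)) for f in emitter_freqs]
    return int(np.argmax([spectrum[b] for b in bins]))
```

A sensor reading dominated by, say, the second emitter's modulation frequency would thus be attributed to that emitter without any synchronizing cable.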
The fundamental mensuration and correlation apparatus 10 of the present invention has been illustrated in connection with aiding surgeons performing delicate intracranial surgery. This general use of the apparatus does not constitute this invention, nor is it a limitation thereon; it serves only to illustrate the invention. The remaining description continues to use such a surgical embodiment as illustrative, although many other surgical or other applications besides intracranial surgery are possible (for example, back or sinus surgery and breast biopsy). Moreover, the radiation mensuration and correlation apparatus 10 of this invention may be used for other purposes in many various medical or non-medical fields. In the described embodiment, the physical object 11 of interest, that is, the second object in the generic application of this invention, is the head or cranium of a patient, and the model of the cranium is replicated using a series of parallel internal image slices (of known mutual spatial relationship) such as those obtained by means of computed tomography (CT) or nuclear magnetic resonance imaging (MRI). These image slices are then digitized, forming a three-dimensional computer model of the patient's cranium, which is then stored in the electronically accessible database 40.
WO 94/23647 PCT/US94/04298
As shown in Figures 1A, 1B, 2, and 3, a surgeon places the tip 18 of the probe 12, that is, the first object, at any point on or inside the cranium 11 of the patient. The position sensors 20, 22, and 24 detect the locations of the emitters 14 and 16 attached to the portion of the probe 12 that remains outside the patient's body. In order to accomplish this, the radiation produced by the emitters 14 and 16 must be "visible" to the sensors 20, 22, and 24. For that reason, more than two emitters may be placed on the first object so that the radiation from at least two of them will always be visible to the sensors. These emitters 14 and 16 are effectively point sources and radiate energy through a wide angle so that this radiation is visible at the sensors over a wide range of probe orientations and positions.
The sensors 20, 22, and 24, the control unit 30, and the computer 36 cooperate to determine the three-dimensional location of each emitter 14 and 16, computing the coordinates of each emitter in the predetermined fixed coordinate system 80 in present time. The computer 36 can then calculate the position and orientation of the tip 18 of the probe 12 with respect to the predetermined fixed coordinate system 80, according to the locations of the emitters within the fixed coordinate system 80 and the dimensions of the probe, which dimensions had been placed into the memory (not shown) of the computer 36 beforehand. It should be noted that the computer 36 can also easily compute position and orientation information about other specific locations on the probe (such as the vector from emitter 14 to the tip 18). Once the computer 36 has calculated the location of the probe tip 18 with respect to the fixed coordinate system 80, the computer 36 then uses the relationship between the model of the cranium, which had previously been obtained and stored in the database 40, and the fixed coordinate system 80 to calculate the position and orientation of the probe tip 18 in relation to the model of the second object 11. Finally, the computer 36 displays a representation of the model-relative position and the orientation of the tip 18 on a display screen 44. In a simple form of the preferred embodiment of this invention, the computer 36 accomplishes this display by accessing a previously taken CT or MRI
image slice 13 stored in the database 40 that is closest to the present time position of the probe tip 18, and then superimposes a suitable representation 76 of the tip 18 on the image 13, as shown in Figures 2 and 3. Thus, the surgeon knows the precise position and orientation of the tip 18 in the patient's cranium relative to the image data by merely observing the display screen 44. A most preferred form of the present invention can derive and display an arbitrary oblique cross-section through the multiple image slices of the MRI, etc., where the cross-section can be, for example, perpendicular to the probe orientation.
The details of the optical mensuration and correlation apparatus 10 of the present invention are best understood by reference to Figures 1 and 4 collectively.
Essentially, the probe 12 supports the two radiation emitters 14 and 16, which are rigidly attached to the probe 12 at fixed, known distances from each other as well as from the probe tip. Since only two emitters are used here as being representative of the practice of this invention, the emitters 14 and 16 should preferably be collinear with the tip 18 of the probe 12 so that the computer 36 can determine uniquely the position and orientation of the tip 18 in three dimensions. Moreover, for reasonable measurement accuracy, the emitters 14 and 16 should preferably be at least as far from each other as the nearer one is from the tip 18. In any case, the geometrical relationship of the emitters 14 and 16 to each other and to the probe tip 18 must be specified to the computer 36 beforehand so that the computer 36 can compute the exact location of the tip 18 based on the locations of the individual radiation emitters 14 and 16. The use of three or more non-collinear emitters would not require that any two of them be collinear with the probe tip.
Three or more non-collinear emitters would permit the computer to compute full position and orientation information (yaw, pitch, and roll) for the probe.
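As a concrete sketch of the two-emitter, collinear arrangement described above: once the three-dimensional emitter locations are known, the tip lies on the line through them at a known distance from the nearer emitter. The function name and arguments below are illustrative, not taken from the patent:

```python
import numpy as np

def tip_position(p14, p16, dist_16_to_tip):
    """Extrapolate the probe tip along the line through two collinear emitters.

    p14 is the emitter farther from the tip, p16 the nearer one, and
    dist_16_to_tip is the known emitter-to-tip distance; the patent only
    requires that this geometry be specified to the computer beforehand.
    """
    p14, p16 = np.asarray(p14, float), np.asarray(p16, float)
    u = (p16 - p14) / np.linalg.norm(p16 - p14)  # unit vector toward the tip
    return p16 + dist_16_to_tip * u
```

For example, with emitters at (0, 0, 0) and (0, 0, 10) and a tip 5 units beyond the nearer emitter, the tip lies at (0, 0, 15).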
Although the invention is described as showing only a cursor locating the relative position of the probe tip 18, the invention can be modified to display a line or a shaped graphic or other icon to indicate the position and orientation of the probe 12. This would entail only the determination of additional points on the probe in the same way that the tip of the probe is located.
The two radiation emitters 14 and 16, as well as the additional radiation emitters 70, 72, and 74, can be, and preferably are, high intensity light emitting diodes (LEDs), which are preferably coordinated such that the emission for any one source is distinguishable from the emissions from the other sources. One such differentiation is to have the emitters time-multiplexed or strobed by the control unit 30 in a predetermined sequence such that only one light emitter is "on" or emitting light at any one time. The light emitted from any one of these emitters is detected by each of the three light sensors 20, 22, and 24, which then determines the location of each particular emitter in relation to the known positions of the sensors 20, 22, and 24 at the time it is strobed.
Each of the one-dimensional sensors 20, 22, and 24 used in the preferred embodiment 10 of the present invention can be identical to the others in every respect.
Therefore, for the purpose of giving a detailed description of this embodiment, only the sensor 20 is shown and described in detail in Figure 4, since the remaining sensors 22 and 24 are identical.
In Figure 4, the representative one-dimensional sensor 20 comprises a cylindrical lens 46 having a longitudinal axis 48 which is orthogonal to the optical axis 50 of the sensor 20. A linear radiation detector 52, such as a charge coupled device (CCD) with several thousand elements (or a similar device capable of linear positional radiation detection of a suitable "image"), is positioned in such a manner that the "optical" axis 50 passes through the center of the aperture 54 of the radiation detector 52 and such that the longitudinal axis of the aperture 54 is orthogonal to the longitudinal axis 48 of the lens 46. Radiation 56, such as light beams, from the emitter 14 (and in the same manner from emitters 16, 70, 72, and/or 74) is focused by the cylindrical lens 46 into a real image line 58 on the surface 60 of linear detector 52.
The detector, illustrated by a photodetector 52, then generates an output 68 (Figure 5) that is related to the position of the real image line 58 on the surface 60 of photodetector 52, thus characterizing the location of the image itself. That is, those elements or points of the photodetector 52 illuminated by the real image line 58 will generate a strong signal, while those not illuminated will generate no signal or only very weak signals. Thus, a graph of image intensity (or signal strength) versus location on the surface of the photodetector will resemble a signal peak curve 68 (see, for example, Figure 5). The "all-emitters-off" (or background) signal level 66 is never quite zero due to the effects of environmental radiation, such as light in the operating room, electronic noise, and imperfections in the photodetector. In any event, since the image of the illuminated emitter is focused into line 58, only the angular displacement of emitter 14 from the optical axis 50 in the plane of the longitudinal sensor axis 54 is measured by the sensor 52, hence the designation "one-dimensional sensor".
Thus, a single one-dimensional sensor 20 can only locate the plane in which a radiating emitter 14 lies. The detector 20 cannot, by itself, determine the unique point in space on that plane at which radiating emitter 14 is located. To precisely determine the location in space of the radiating emitter 14 requires at least three such sensors positioned in spaced relationship to each other, since the intersection of the three planes defined by the three sensors, respectively, is required to define a single point in space.
To locate the position of one particular radiating emitter, such as 14, the sensors 20, 22, and 24 are mounted so that the longitudinal axes 48 of their lenses are not all parallel and no two of such axes are collinear. In a preferred embodiment of this invention, two light sensors, such as sensors 20 and 24 in Figure 2, are situated so that their respective axes 48 (Figure 4) are in parallel, spaced relationship, and the third detector 22 is situated between and equidistant from the other two detectors, but with its axis 48 perpendicular to the axes of the other two. That is, the sensors 20, 22, and 24 should be arranged along a line or arc (Figure 2), such that each sensor 20, 22, and 24 is generally equidistant from the center of the volume in which the measurements are made, equally spaced from each other, and all aimed at the center of the measurement volume.
Suppose, for example, that the sensors 20, 22, and 24 are arranged along a horizontal arc and the optical axes of all sensors are oriented horizontally. Then the middle sensor should be oriented so as to measure the angular elevation of the radiation emitters as described above.
The two outer sensors measure the horizontal angle (azimuth) relative to the fixed coordinate system 80.
Data from the outer sensors are used to stereographically calculate both the horizontal position and distance from the sensors as will be more fully described below.
The accuracy of three-dimensional measurement depends on the angle formed between the optical axes of the outer two sensors 20 and 24, where the emitter to be measured is at the vertex of the angle. Accuracy will improve as that angle approaches a right angle. At least three of the several possible sensors 20, 22, and 24 must be spaced so that the desired measurement volume is completely within their field of view, which can be accomplished by making the focal length of the lens 46 short enough to provide coverage of the entire desired field of view. In another embodiment of this invention, additional sensors, which may be substantially identical to sensors 20, 22, and 24, could be used to provide more viewpoints, to broaden coverage of the field of view, or to enhance measurement accuracy.
While this process of detecting a given radiating emitter, such as 14, can determine the exact location of the radiating emitter, it cannot by itself determine the particular orientation and position of the probe or its tip 18 in three-dimensional space. To do so with only two emitters requires that both emitters 14 and 16 be collinear with the probe tip 18, as described above.
Also, the distances between each emitter, 14 and 16, and the probe tip (as well as the distance between the emitters 14 and 16 themselves) must be known and loaded into the memory of the computer 36 before the computer 36 can determine the position and orientation of the probe tip 18 from the locations of the emitters 14 and 16 in the fixed coordinate system 80. Consequently, when each of the radiation emitters 14 and 16 is rapidly turned on in sequence, or strobed, the sensors 20, 22, and 24 can detect the exact location of each emitter in turn. Thus the computer 36 can determine the exact position and orientation of the probe, and therefore its tip 18. Since only one of the radiation emitters 14 or 16 is on at any one time, the detectors 20, 22, and 24 determine the location of that particular illuminated emitter individually. If the strobe rate, that is, the frequency at which the emitters 14 and 16 are turned on and off in sequence, is fast enough, the detectors 20, 22, and 24 can, for all practical purposes, determine the position and orientation of the probe 12 and its tip 18 at any instant in time, and therefore can follow the movement of the probe tip in present time, that is, during the time that the probe tip is actually moving. In other words, this system can simulate the movement of the probe tip on the previously taken image in present time during the surgical procedure.
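The time-multiplexing just described can be sketched as a simple strobe loop. Here `set_emitter` and `read_sensors` stand in for the control unit's hardware interfaces and are hypothetical names, not part of the patent:

```python
def strobe_cycle(emitters, read_sensors, set_emitter):
    """One strobe cycle: energize one emitter at a time and tag each
    sensor reading with the emitter that produced it.

    emitters: list of emitter identifiers (e.g. 14, 16, 70, 72, 74).
    set_emitter(i, on): drives a single emitter on or off.
    read_sensors(): returns the current angular readings from all sensors.
    """
    frame = {}
    for e in emitters:
        set_emitter(e, True)
        frame[e] = read_sensors()  # readings unambiguously belong to e
        set_emitter(e, False)
    return frame
```

Because exactly one emitter is on when each reading is taken, the readings need no further labeling; a fast enough cycle approximates present-time tracking.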
The sensors 20, 22, and 24 need only distinguish which of the radiation emitters 14, 16, 70, 72, or 74 is on at any one time. In the preferred embodiment 10 of the present invention, this function is accomplished by strobing each of the emitters in sequence, as described above. However, other methods can be used to allow the sensors 20, 22, and 24 to distinguish the respective radiation emitters 14, 16, 70, 72, and 74 from one another. For example, different wavelengths (colors) of light, or different frequencies of sound, could be used in conjunction with detectors capable of distinguishing those particular different radiations.
Alternatively, it is one aspect of this invention to modulate each of the respective radiation emitters 14, 16, 70, 72, and 74 with a unique wave form or pulse train. This means of differentiating between the different emitters is believed to be novel and unique to the instant invention. If such different wave forms or pulse trains are used to differentiate the different emitters, it is also within the spirit and scope of this invention to transmit additional information on these wave forms, such as for example the temperature of the particular structure being contacted by the probe tip 18.
It is within the scope of this invention to provide means readily accessible to the surgeon or other operator to engage or disengage the taking of such additional information and the transmission thereof by the unique wave forms or pulse trains radiated by the respective emitters. Under these circumstances, the control unit 30 or computer 36 will be designed to demodulate the wave form to determine to which particular emitter the sensed signal belongs, and to decode the additional information being transmitted.
Numerous other methods for distinguishing the radiation emitters are possible. Therefore, the present invention should not be regarded as limited to the particular strobing method shown and described herein, but is generic to the use of any means to differentiate between the different emitters.
Conventional or unique auto-focusing or multiple-lens radiation detection may be integrated into the sensors 20, 22, and 24 to improve the performance of the system. However, the simple, fixed-focus optics shown and described herein, and shown in Figure 4 for one sensor, provide a good level of performance if the working range of the probe is restricted. Even if the real image of an emitter, such as 14, is somewhat out of focus on the detector 52, the angular measurement of the image is still usable. A usable measurement for each of the sensors 20, 22, and 24 can be any of the following: (1) the position of the detector element with peak intensity, (2) the intensity-weighted average (centroid) of all over-threshold elements, or simply (3) the average of the minimum and maximum elements where the intensity is over some threshold. The detector 52 should be placed at the focal distance for the farthest typical operating distance of the radiation emitters. Closer emitters will form slightly defocused images 58, but they require less precise angular measurement for a given distance accuracy.
Furthermore, their de-focused real images are brighter, which increases the brightness gradient at the edges of the image.
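The three alternative measurements listed above might be sketched as follows, with element indices standing in for detector positions; the function name and keyword values are illustrative:

```python
import numpy as np

def peak_position(intensity, threshold, method="centroid"):
    """Locate the image line on a 1-D detector (element index, possibly
    fractional), using one of the three alternatives described above."""
    x = np.asarray(intensity, float)
    if method == "peak":          # (1) element with peak intensity
        return float(np.argmax(x))
    over = np.nonzero(x > threshold)[0]   # indices of over-threshold elements
    if method == "centroid":      # (2) intensity-weighted average
        w = x[over]
        return float((over * w).sum() / w.sum())
    if method == "edges":         # (3) midpoint of first/last over-threshold
        return (over[0] + over[-1]) / 2.0
    raise ValueError(method)
```

For a symmetric blip such as [0, 0, 1, 3, 1, 0] all three methods agree on element 3; for an asymmetric blip the centroid and edge-midpoint estimates differ slightly, which is why the best choice depends on the sensor and optics, as the text notes.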
As described so far, the real image 58 of the currently activated emitter must be significantly different from (for example brighter than) the rest of the radiation falling on the sensor 52. Otherwise, other lights or reflective surfaces in the field of view of the sensors will hinder the detection of the emitter's real image. Therefore, it is desirable to include in the apparatus, circuitry to subtract the background radiation received by the sensors from other, ambient, sources.
This per se known circuitry enhances use of the invention where the sensors are required to detect the radiation emitters against relatively bright backgrounds.
While the radiation emitters are all momentarily extinguished, the one-dimensional data from each sensor are saved in a memory. This can be done in an analog delay line or by digitally sampling the output signal and storing it in a digital memory. Then, as each emitter is "viewed" sequentially, the saved data are subtracted from the current data generated by the currently radiating emitter. If the background data are stored digitally, the current data are also digitized, and the stored background data are digitally subtracted from the current data.
A graphical representation of the radiation intensity of the image or, equivalently, the generated output voltage amplitude for each element in a row of detecting elements, is shown in Figure 5. The graph depicts typical background image intensities 66 with all emitters off, the intensities 68 with one radiation emitter on, and the element-by-element difference 64 between the intensities with the emitter off and those with it on. The measurements will likely contain some random noise, electronic or otherwise, and two consecutive measurements for a given sensor element may differ slightly even where the background is unchanged.
Therefore, the differential intensities 64 between two consecutive measurements also contain some random electronic noise. However, the two measurements differ substantially only at the location of the radiation emitter image, and this difference exceeds the threshold level 62.
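The background-subtraction scheme above amounts to a per-element difference against a stored all-emitters-off frame, thresholded so that residual random noise does not masquerade as an emitter image. A minimal sketch, with illustrative names:

```python
import numpy as np

def subtract_background(live, background, threshold):
    """Subtract a stored all-emitters-off frame from a live frame and
    suppress differences below the threshold level (62 in Figure 5),
    leaving only the emitter's image blip."""
    diff = np.asarray(live, float) - np.asarray(background, float)
    return np.where(diff > threshold, diff, 0.0)
```

Only the element where the emitter image falls survives the threshold; small fluctuations in the ambient background cancel out.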
The details of the structure and operation of the control unit 30 are best seen in Figure 6.
Specifically, control unit 30 (see Figure lA and 1B) supplies power to the radiation emitters 14, 16, 70, 72, and 74 and the radiation sensors 20, 22, and 24. A
control and synchronization unit 84 and radiation source sequencer 88 (where a strobed radiation sequencing is used) time-multiplexes or strobes the radiation emitters individually, as described above, so that the position and orientation of the probe tip 18 (Figure 1) can be determined from the signals received from the sensors 20, 22, and 24. The angular data signals received from the sensors 20, 22, and 24 are converted by an analog-to-digital converter 92. Actually, three analog-to-digital converters are used, as shown in Figure 6, but only one is labeled and described herein for brevity, since the other two analog-to-digital converters are substantially identical and are used to convert the signals from the other sensors 22 and 24.
The control and synchronization unit 84 also controls three switches, of which switch 93 is typical, which store all digital data received from the sensors 20, 22, and 24 when the radiation emitters 14 and 16 are off into a background memory 94. Then, when the radiation emitters 14, 16, 70, 72, and 74 are illuminated in sequence by the radiation source sequencer 88, the synchronization and control unit 84 changes the state of switch 93, which then redirects the data from the three sensors 20, 22, and 24 to a subtraction unit 91. The subtraction unit 91 subtracts the background data from the emitter radiation data, thus resulting in a signal which has been relatively freed from the background signal 66 (Figure 5), since the fixed pattern noise has been subtracted from the signal.
As shown in Figure 6, which should be considered in conjunction with Figure 5, a 1-D (one-dimensional) position calculation unit 95 determines the location of the real image line 58 on the CCD sensor 52 (Figure 4) by measuring the locations of the edges 67 and 69 of the signal blip 68 (Figure 5) generated by the CCD sensor, based on a predetermined threshold signal level 62. The 1-D position calculation unit 95 then averages the distance between the two edges to find the center of the signal peak 68 as shown in Figure 5. This method of determining the center of the signal peak is per se well known in the art and need not be described in further detail. Moreover, numerous other methods of determining the location of the signal peak or its centroid are known in the art and will be obvious to those of ordinary skill in the art. The method used depends on the signal characteristics of the radiation sensor used as well as the characteristics of the lens system used to focus the radiation onto the surface of the detector, in addition to other parameters. Those practicing this invention with the various alternatives described herein would have no trouble selecting a signal detection algorithm best suited to the particular characteristics of the sensors and the particular radiation being used.
Finally, the control unit 30 (Figure 1) transmits the radiation data to the computer 36. That is, when the computer 36 is ready to compute the current location of the currently radiating emitter, such as 14, the latest angular data from all sensors 20, 22, and 24 are provided for analysis. If the sensors generate data faster than the control unit 30 can process them, the surplus angular data are simply discarded.
The operation of the computer 36 is most advantageously set forth in Figure 7. The computer 36 calculates one-dimensional positions for each radiation emitter such as 14 or 16, based on the location of the signal peak from each respective sensor 20, 22, and 24.
These one-dimensional angular position measurements are then used to determine the three-dimensional spatial coordinates of the emitters 14 and 16, and thus the position and orientation of the probe 12 relative to the predetermined fixed coordinate system 80, by coordinate transformation methods which are per se well known in the art. The output signals from the computer 36 can be in any form desired by the operator or required by the application system, such as XYZ coordinate triples based upon the predetermined fixed coordinate system 80.
Figure 8 and the following paragraphs describe in detail how the location of a single radiation emitter, such as 14, is computed from the data derived from the sensors 20, 22, and 24. The following description applies to these three sensors 20, 22, and 24 only. If there are more than three such sensors, the calculation can be performed using any three or more of the sensors.
Furthermore, if more than three sensors are used, the average of the points calculated from all combinations of three sensors could be used to increase accuracy. Another option is to use the point calculated from the three sensors closest to the radiation emitter 14 or 16. The following parameters are considered to be known XYZ
constants:
D0[i], one endpoint of each linear photodetector i;
D1[i], the other endpoint of linear photodetector i;
L0[i], one endpoint of the axis of each lens i; and L1[i], the other endpoint of the axis of lens i.
Each sensor generates T[i], a parametric value between 0 and 1 indicating where the peak or center of the line image of the emitter intersects the line segment between D0[i] and D1[i]. The XYZ coordinates of point S are to be calculated, where S is the location of the radiation emitter. For a CCD radiation detector array, T[i] is the index of the element on which the center or peak of the image falls divided by the number of elements on the detector array.
The three-dimensional coordinates of the above points are all referenced to a predetermined fixed coordinate system 80. The cylindrical lens and linear photodetector do not directly measure the angle A of the radiation emitter about its lens axis; rather, they measure a value T[i] linearly related to the tangent of that angle:
tan(A) = C * (2 * T[i] - 1), where C is a constant of proportionality that is related to, and determined empirically by, the dimensions of a particular system.
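The relation above transcribes directly; the constant C must be calibrated empirically for the particular system, and the function name below is illustrative:

```python
import math

def emitter_angle(t, c):
    """Angle A of the emitter about the lens axis, from the normalized
    detector reading t in [0, 1] and the empirically determined constant c,
    per tan(A) = c * (2*t - 1)."""
    return math.atan(c * (2.0 * t - 1.0))
```

A reading of t = 0.5 (image centered on the detector) gives A = 0, i.e. the emitter lies on the optical axis.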
The three-dimensional location of the image line on the linear photodetector is:
D[i] = (1 - T[i]) * D0[i] + (T[i]) * D1[i]
If the lens is ideal, then S also lies in plane P[i]. In reality, the point D[i] might have to be computed by a non-linear function F(t) that corrects for non-linear aberrations of the lens or the photodetector:
D[i] = (1 - F(T[i])) * D0[i] + (F(T[i])) * D1[i]
Function F(t) could be a polynomial in variable T, or it could be a value interpolated from an empirically determined table.
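One possible realization of F(t) is linear interpolation in an empirically determined calibration table, as the text suggests. The table values here are invented purely for illustration:

```python
import numpy as np

# Hypothetical calibration table: raw detector readings vs. the corrected
# parametric positions measured for a known target during calibration.
T_MEASURED  = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
T_CORRECTED = np.array([0.0, 0.26, 0.5, 0.74, 1.0])

def F(t):
    """Correction function F(t): interpolate the calibrated position for a
    raw reading t, compensating for lens/detector non-linearities."""
    return np.interp(t, T_MEASURED, T_CORRECTED)
```

The corrected value F(T[i]) then replaces T[i] in the expression for D[i] above.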
P[i] is the unique plane determined by the three points D[i], L0[i], and L1[i], which are never collinear.
S is the point of intersection of the planes P[1], P[2], and P[3] determined respectively by sensors 1, 2, and 3.
S is a unique point if at least two of the sensor lenses' longitudinal axes 48 are not parallel and if no two lens axes 48 are collinear. The intersection point is found by finding the common solution S of the three equations defining the planes P[i]. Once the location S of each of the probe's radiation emitters is computed, the location of the probe's tip 18 can be calculated. The method of making such a determination is well known using the teachings of analytic geometry and matrix manipulations.
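The plane construction and three-plane intersection just described can be sketched as follows; each sensor i contributes the plane through its points D[i], L0[i], and L1[i], and solving the resulting 3x3 linear system yields S (function names are illustrative):

```python
import numpy as np

def plane_from_points(a, b, c):
    """Plane through three non-collinear points, returned as (n, d) with
    the plane equation n . x = d."""
    a, b, c = (np.asarray(p, float) for p in (a, b, c))
    n = np.cross(b - a, c - a)
    return n, float(n.dot(a))

def intersect_three_planes(planes):
    """Common point S of three planes n[i] . x = d[i]; unique when the
    normals are linearly independent, so the 3x3 system is solvable."""
    N = np.array([n for n, _ in planes])
    d = np.array([d for _, d in planes])
    return np.linalg.solve(N, d)
```

For instance, the planes x = 1, y = 2, and z = 3 intersect at S = (1, 2, 3).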
If M is a linear transformation describing the relationship between a point R in the image coordinate system and a point S in the fixed coordinate system, then:
R * M = S.
If M^-1 is the inverse of M and if S is a point in the fixed coordinate system, then the point R in the image coordinate system corresponding to S is:
S * M^-1 = R.
Now, suppose that the second object is moved in the mensuration coordinate system. This can be described by a linear transformation U where the coordinates S of a point are mapped to the coordinates S':
S * U = S'.
Then the old value of M above must be multiplied by U in order to correct the relationship between the point R in the image coordinate system and the corresponding point in the mensuration coordinate system, because of the relative movement of the first object with respect to the second object:
R = S' * U^-1 * M^-1.
The preliminary steps required before practicing the method of the invention are now described. Then, after fully describing these preliminary steps, the detailed steps of the method of the optical mensuration and correlation apparatus are described.
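The row-vector transform algebra of the preceding paragraphs (R * M = S, and R = S' * U^-1 * M^-1 after the second object moves) can be sketched with 4x4 homogeneous matrices. The pure-translation example and all function names are illustrative only; in practice M and U would be general rigid transforms:

```python
import numpy as np

def translation(tx, ty, tz):
    """Homogeneous translation in the row-vector convention used above,
    so that [x, y, z, 1] @ M = [x+tx, y+ty, z+tz, 1]."""
    M = np.eye(4)
    M[3, :3] = (tx, ty, tz)
    return M

def to_image(S, M):
    """Map a fixed-coordinate point S to image coordinates: R = S * M^-1."""
    return np.append(S, 1.0) @ np.linalg.inv(M)

def to_image_after_move(S_prime, U, M):
    """After the second object moves by U (S * U = S'), the corrected
    mapping is R = S' * U^-1 * M^-1."""
    return np.append(S_prime, 1.0) @ np.linalg.inv(U) @ np.linalg.inv(M)
```

With M a translation by (1, 0, 0), the fixed point (1, 0, 0) maps back to the image origin; translating the object further by U = (0, 2, 0) and feeding in the moved point recovers the same image point.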
Use of the invention takes place in three phases: the imaging phase, the correlation phase, and the normal operation phase. The imaging phase precedes the normal operation of the present invention. During the imaging phase, a scan of the body of the second object of interest is used to build a three-dimensional geometrical model. In the preceding description, the second object was the head of a human intracranial surgical patient because the invention is advantageously used in stereotactic neurosurgery. Accordingly, the three-dimensional model comprises digital data from a series of internal cross-sectional images obtained from computed tomography (CT), magnetic resonance (MRI), ultrasound, or some other diagnostic medical scanner. In any case, the image data are stored in a suitable, electronic memory 40 which can be accessed later by the computer 36. The data are considered to be stored as a series of parallel two-dimensional rectangular arrays of picture elements (pixels), each pixel being an integer representing relative density. If the object is relatively rigid, like a human head, this three-dimensional model may be created at some time before the correlation and operational phases of the invention and possibly at another location.
Also, during the imaging phase, at least three non-collinear reference points 71, 73, and 75 (Figures 2 and 3) must be identified relative to the object 11.
These may be represented by ink spots, tattoos, radiopaque beads, well-defined rigid anatomical landmarks, locations on a stereotactic frame, sterile pins temporarily inserted into rigid tissue or bone of a surgical patient, or some other reference means. The coordinates of these reference points are measured and recorded relative to the coordinate system of the imaging device. One way to accomplish this is to capture the reference points as part of the previously made three dimensional model itself.
For example, radiopaque pins could be placed within the image planes of diagnostic CT slices; the pin locations, if not automatically detectable from their high density, can be identified interactively by the surgeon using a cursor on the computer display of the CT slices. See Figure 3.
The initializing of the position and the orientation of the second object, the patient's cranium, is well known in this art. The instant invention departs from this well known operation by adding radiation emitters which have a known and exact spatial relation to these fiducial markings. These additional radiation emitters must then be programmed or otherwise activated for use in a particular manner in order to practice the instant invention. They must be programmed to radiate at some frequency during the surgical procedure in progress so that the position and orientation of the second object will be available to the surgeon at all relevant times, and so that this position and orientation can be repeatedly and automatically updated. This allows the system to revise the specific selected scan slice on which the position and the orientation of the first object are superimposed, yielding a correct depiction of the actual relative positions and orientations of both the first and second objects in present time during the surgical procedure.
The initial correlation mode immediately precedes the normal operational phase of the present invention and must take place in the operating room.
During this initial correlation phase, the instant system accesses the data of the three-dimensional geometrical model of the patient (or other object), including the reference point (fiducial marker) coordinates which were recorded earlier. Next, the surgeon may place the tip of the probe 18 at each of the reference points 71, 73, and 75 on the patient, in turn. This sequence of operations may be directed by the computer program. In the alternative, the system of this invention provides these data automatically, the radiation from the emitters 70, 72, and 74 being received by the sensors directly and automatically without special intervention by the surgeon. Either of these procedures establishes an initial relationship between the locations of these reference points in the model coordinate system and their current physical locations in the fixed coordinate system 80. However, the preferred determination of this initial position and orientation also carries on during the whole of the surgical procedure and therefore is capable of substantially continuously updating the position and orientation of the second object and relating it to the current position and orientation of the probe. In turn, this establishes a linear mathematical relationship between all points in the model and points in the coordinate system 80. Thereafter, when the patient is moved relative to the sensors, the prior art must establish a new relationship by again digitizing the reference points 71, 73, and 75 within the coordinate system 80. That is, the correlation phase must be repeated. Again, the system of this invention uses the emitters 70, 72, and 74 to accomplish this automatically.
For this reason, the automatic tracking of the position of the head (or whatever the second object may be), which is described below and which overcomes this problem, is an essential, significant feature of the present invention.
Since the position and orientation of the head is initially and substantially continually thereafter correlated with the model, the surgeon can relate any locations of interest on the diagnostic images with the corresponding physical locations on this patient during the operation, and vice versa. These include locations accessible to the probe tip 18 but not necessarily directly visible to the surgeon.
Having described the function and purpose of the preliminary steps, the detailed method of the present invention is more easily understood. As shown in Figure 7, the position data 21 of the probe emitters generated by the sensors and control unit are converted into three-dimensional coordinates relative to the predetermined fixed coordinate system 80 of the sensors. Using dimensional parameters describing the relationship among the probe emitters and the probe tip, the computer determines the coordinates of the probe tip in a step 39.
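By way of a non-limiting illustrative sketch of the step 39 described above (the function and variable names here are hypothetical, not part of the disclosure), the tip coordinates may be extrapolated from the two emitter locations and a known dimensional parameter of the probe:

```python
import numpy as np

def probe_tip_position(e_rear, e_front, tip_offset):
    """Extrapolate the probe tip from the two emitter locations.

    e_rear, e_front: 3-D coordinates of the two probe emitters in the
    fixed (sensor) coordinate system.  tip_offset: known distance from
    the front emitter to the tip along the probe axis (a dimensional
    parameter of the probe).
    """
    axis = e_front - e_rear
    axis = axis / np.linalg.norm(axis)   # unit vector along the probe shaft
    return e_front + tip_offset * axis

# Emitters 100 mm apart on the shaft, tip 50 mm beyond the front emitter.
rear = np.array([0.0, 0.0, 0.0])
front = np.array([0.0, 0.0, 100.0])
tip = probe_tip_position(rear, front, 50.0)   # -> [0. 0. 150.]
```

Because the tip lies on the line through the two emitters, any motion of the probe body updates the computed tip location without the tip itself ever being visible to the sensors.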
During the initial correlation phase, the probe tip may be placed at each of the reference points 71, 73, and 75 in turn. Alternatively, in accord with a preferred aspect of this invention, the emitters 70, 72 and 74 are located by the sensors and the correct position and orientation of the second object is thereby determined. The coordinates of the second object in the fixed coordinate system, along with their coordinates 46 in the image coordinate system, determine a unique linear transformation relating the two coordinate systems in a step 45. This is a per se known calculation in analytic geometry and matrix mathematics.
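One per se known form of this step 45 calculation is the least-squares rigid registration (the SVD-based Kabsch method) from three or more fiducial correspondences. The sketch below is illustrative only; the names and the example rotation are hypothetical:

```python
import numpy as np

def rigid_transform(model_pts, fixed_pts):
    """Least-squares rigid transform (R, t) such that
    fixed ~= R @ model + t, computed from paired fiducial points."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(fixed_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det R = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Fiducials 71, 73, 75 in model coordinates, and the same points after the
# patient has rotated 90 degrees about z and shifted.
model = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0], [0.0, 100.0, 0.0]])
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
fixed = model @ Rz.T + np.array([10.0, 20.0, 30.0])
R, t = rigid_transform(model, fixed)
```

Once (R, t) is known, every point of the previously taken image model can be mapped into the fixed coordinate system 80, and vice versa by the inverse transform.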
As noted above, a more automated and direct method of determining the location of the second object is to directly read the locations of the fiducial points 71, 73 and 75 by the fixed sensors 20, 22 and 24. This can be accomplished by placing radiation emitters 70, 72, and 74 (Figure 1B) at those reference points (or in a known fixed spacial relationship to those reference points 71, 73, and 75). The emissions of these emitters can then be read directly by the sensors 20, 22, and 24, and the computer can then automatically determine their locations relative to the predetermined fixed coordinate system 80 of the sensors. Thus the position and orientation of the second object, the cranium in the preferred embodiment of this invention, can be automatically and substantially continuously determined. With the position and orientation of the second object being at least frequently, if not substantially continuously, updated, the position of the first object, the probe, which is also determined at least frequently, if not substantially continuously, can then be updated in relation to the second object at the same frequency. Since the position and orientation of both the first and the second objects are each at least frequently determined and updated in relation to the fixed coordinate system 80, the position and orientation of each of these first and second objects can then be determined relative to each other, by indirect, but well known, calculations which are easily carried out in short order by a computer.
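The "indirect, but well known, calculations" referred to amount to composing poses through the common fixed coordinate system: a point known in the fixed system is re-expressed in the second object's local system by inverting the second object's pose. A minimal hedged sketch (hypothetical names; the pose convention is an assumption):

```python
import numpy as np

def to_local(p_fixed, R_obj, t_obj):
    """Express a point, known in the fixed coordinate system 80, in the
    local coordinate system of an object whose pose in the fixed system
    is (R_obj, t_obj), under the convention p_fixed = R_obj @ p_local + t_obj."""
    return R_obj.T @ (np.asarray(p_fixed, float) - t_obj)

R_c = np.eye(3)                      # cranium currently unrotated ...
t_c = np.array([5.0, 0.0, 0.0])      # ... but shifted 5 mm along x
tip_fixed = np.array([15.0, 2.0, 3.0])
tip_local = to_local(tip_fixed, R_c, t_c)   # -> [10. 2. 3.]
```

Because both poses are referred to the same fixed system, the probe-relative-to-cranium result remains correct even when both objects move between updates.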
It has been stated herein that the position and orientation of the second object, the cranium, can, according to this invention, be determined continuously or at least frequently. The frequency at which the position and orientation of the second object is determined is a function of the desires of the operator of this system and the frequency at which the radiation emitters and the sensors can be operated. In the case where the emitters all emit the same wavelength of radiation and the sensors all sense this same wavelength of radiation, the emissions of the several emitters are differentiated by firing them in a sequential pattern. Thus, in this embodiment of this invention, the emitters will emit radiation in sequence, for example 14, then 16, then 70, then 72 and then 74. The sensors will have been programmed to identify a signal with an emitter as a function of when the signal is received.
In this embodiment of this invention, the position and orientation of the first object, the probe, is determined with the same frequency as the location and the orientation of the second object, the cranium, because all of the emitters radiate in sequence. However, the system can be programmed so that the emitters 14 and 16 fire more or less frequently than the emitters 70, 72 and 74. Under these conditions, the position and orientation of the first object and of the second object will be determined at different individual frequencies, that is, at the same frequency as the frequency of the radiation from their respective emitters. It will therefore be clear that the frequency of determination of the location of any given emitter, and therefore of the position and orientation of these first and second objects, is, by means of the instant system, for the first time entirely controllable by the programmer or the operator, within the capabilities of the system operating the emitters.
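The time-multiplexed firing order described above can be sketched as follows (an illustrative, hypothetical scheduler, not part of the disclosure): one emitter fires per time slot, so a sensor reading is attributed to an emitter purely by when it arrives, and a ratio parameter lets the probe group fire several full passes per pass of the patient-mounted group.

```python
def emitter_schedule(probe_group, patient_group, ratio, cycles):
    """Return the firing order for a time-multiplexed emitter system.

    probe_group:   emitter ids on the first object (e.g. 14, 16)
    patient_group: emitter ids on the second object (e.g. 70, 72, 74)
    ratio:         full probe passes per single patient pass
    cycles:        number of complete schedule repetitions
    """
    slots = []
    for _ in range(cycles):
        for _ in range(ratio):
            slots.extend(probe_group)    # probe emitters fire first
        slots.extend(patient_group)      # then the fiducial emitters
    return slots

# Equal rates reproduce the sequence given in the text: 14, 16, 70, 72, 74.
seq = emitter_schedule([14, 16], [70, 72, 74], ratio=1, cycles=1)
# With ratio=2 the probe pose is updated twice per cranium update.
```

The update frequency of each object is thus set entirely by the programmed schedule, within the hardware's slot rate.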
However, it should be understood that the position and orientation of the first and/or second objects can be determined in a substantially continuous manner. In this embodiment of this invention, each emitter will radiate a different wavelength, waveform, or frequency pulse of radiation. Therefore, the radiation emitted from each emitter is distinct from the radiation simultaneously emitted from the other emitters. Under these conditions, the location of each emitter can be determined continuously by a set of sensors which is tuned to the specific, different radiation of each emitter.
Therefore, the location of each emitter can be determined continuously, whereby the position and orientation of either or both of the objects can be calculated by the computer from these continuous locations of the different emitters.
While it is a significant distinction of this invention from the prior art that:
in the prior art:
the first object is intended to be moved and the second object is intended to be stationary; and the position and orientation of the first object is frequently determined, but the position and orientation of the second object is only determined at the start of the operation and at any time that the second object, the cranium, which is intended not to be moved at all during the operation, is known by the surgeon to have been moved;
whereas according to this invention:
the first object is intended to be moved, and the second object is not intended to be rigidly immobilized in place, or, put another way, the second object is permitted to move and is even expected to move;
and the position and the orientation of the first object is frequently determined, and the position and orientation of the second object is also frequently determined. The position and orientation of these two objects may be determined at the same frequency or at different frequencies (or even continuously) as desired by the operator.
In preferred embodiments of this invention, the position and orientation of the second object will be determined from one-hundredth to ten times as often, most preferably from one-quarter to four times as often, as the position and orientation of the first object is determined. As a general proposition, there is no limit on the relationship between these frequencies of measurement. The preferred relationships set forth herein are illustrative and not limiting. The frequency of each measurement is dependent on the amount of movement which is allowed and is intended to be shown on the CRT. The upper limit on this frequency is determined by the ability of the emitters to be distinguished. There is no lower limit.
In the instant specification, the emitters have been described as being on the first and second objects and being movable therewith, and the sensors have been described as being in a fixed relation to the coordinate system. While this is the preferred system, it is by no means the only configuration of the system of this invention. It is also within the scope of this invention to provide the emitters in fixed relationship to the coordinate system, and the sensors on the first and second objects, respectively. The wiring may be somewhat more cumbersome in this configuration, but that should not detract from the viability of such a reversal.
This invention has been described with reference to a first and a second moving object, and the determination of each of their absolute and relative positions and orientations in a fixed coordinate system.
It will be clear that this same system applies to more than two objects. In fact, the position and orientation of any number of objects can be determined, both absolutely with respect to the coordinate system, and relatively with respect to each other, by the practice of this invention. Thus, when this specification and the claims appended hereto speak of a first and a second object, these can be two out of any number of total objects. This number is merely illustrative of the practice of this invention and is in no way limiting thereon.
Thus, the preferred system of this invention performs the three primary tasks of this invention, preferably, but not necessarily, simultaneously:
the absolute position and orientation of the second object, the cranium, in the fixed coordinate system is determined at least very frequently;
the relationship between the absolute position and orientation of the second object with respect to the previously taken images of that object, particularly the inside structures of that object, is determined at least very frequently; and the absolute position and orientation of the first object, the probe, is determined at least very frequently.
The accomplishment of these three tasks then permits the computer to accomplish the three essential secondary tasks of this invention:
to calculate the position and orientation of the first object, the probe, in relation to the second object, the cranium, even though the first object, or a portion of it, is out of the line of sight of either the surgeon or the sensors;
to select the appropriate slice of the previously taken model of the interior of the second object which corresponds to the present time position and orientation of the first object in relation to the present time position and orientation of the second object; and to display the appropriate slice of the previously taken image of the second object with the present time position and orientation of the first object correctly depicted thereon.
Both the initial and the continual correlation determinations can be automatically initiated and updated by the computer 36 in some predetermined timed sequence or continuously. In fact, according to the most preferred aspect of this embodiment of this invention, the correlation phase is frequently, briefly from time to time, or even continuously, repeated, interspersed in between measurements in the operational phase or conducted simultaneously with the operational phase of the practice of this invention for the purpose of recalculating the linear transformations M and M' when the second object (such as a surgical patient) moves relative to the sensors.
During normal operation, the tip coordinates are transformed in a step 44 using the transformation computed in step 45. The new transformed coordinates, relative to the image coordinate system, are used to determine the plane of some two-dimensional cross-section through the three-dimensional image model 41 accessible in the memory 43. The simplest method is to choose the existing diagnostic image plane located closest to the probe tip's coordinates relative to the model coordinate system.
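The "closest existing slice" rule just mentioned can be sketched as below (names hypothetical; an axial scan with slices stacked along one model axis is assumed):

```python
import numpy as np

def nearest_slice(tip_model, slice_positions, axis=2):
    """Index of the stored diagnostic slice closest to the probe tip.

    tip_model:       probe tip in model (image) coordinates
    slice_positions: coordinate of each stored slice plane along `axis`
    """
    slice_positions = np.asarray(slice_positions, float)
    return int(np.argmin(np.abs(slice_positions - tip_model[axis])))

# CT slices every 10 mm; a tip at z = 47.2 mm selects the 50 mm slice.
idx = nearest_slice(np.array([12.0, 30.0, 47.2]), [0, 10, 20, 30, 40, 50])
```

The selected index then identifies which previously taken slice is transformed to the screen and marked with the tip cursor.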
In any case, a step 47 transforms the two-dimensional cross-sectional slice to a screen image and places a cursor on it to mark the location of the probe tip superimposed in the image. Scaling and viewing parameters determine how the image is displayed. Because the surgeon may not be able to simultaneously view the patient (object) and the computer display screen, the step 47 should be controlled by the surgeon, such as for example by placing an activating button on the probe.
Pressing the button can be the signal for freezing the image and the depicted position and orientation of the probe tip marker at that instant on the display screen.
In a more complex embodiment of this invention, the computer system could generate and display on the screen a cut-away view at an arbitrary angle, for example, perpendicular to the direction the probe is pointing, using the data from multiple image slices. In simpler cases, the computer simply displays any one or more convenient image slices through the location of the probe tip. For example, the displayed slice might simply be the original CT slice which includes the location of the probe tip, or is closest to that location. In any case, the computer then causes the image of a cursor at the current position of the probe tip to be displayed on this previously taken image of a slice through the second object.
An alternative means, to record the location of the reference points in the coordinate space of the imaging apparatus during the imaging phase, employs an additional, separate instance of the three-dimensional position mensuration probe, sensors, control unit, and computer of the present invention. In order to implement this embodiment of this invention, the additional sensors are permanently attached directly on the imaging apparatus. The additional probe measures the location of the reference points at the time of imaging, and the additional control unit and computer determines and records their locations relative to the coordinate system of the imaging apparatus. The advantage of this approach is that the fiducial markers, that is the landmarks or reference pins, need not be within the limited cross-sectional slices visible to the imaging device.
As an alternative to true three-dimensional images, standard x-ray radiographs from several distinct directions can be used to construct a crude model in lieu of the imaging phase described above. Radiographs from two or more directions are digitally scanned, and four non-coplanar reference points on them are identified with a cursor or light pen. In a correlation phase similar to that described above, these four points on the patient are digitized just prior to surgery. Then, during surgery, the location of the probe tip is projected onto the digitized computer images of the two-dimensional radiographs where the projection is uniquely defined by mapping and transferring the reference point coordinates from the model coordinate system to the fixed sensor coordinate system.
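The projection of the tip onto a digitized radiograph can be sketched as follows. This is an illustrative simplification with hypothetical names: a simple orthographic model is assumed here, whereas a real system would use the projective geometry of the actual x-ray setup.

```python
import numpy as np

def project_to_radiograph(p_fixed, plane_origin, u_axis, v_axis):
    """Orthographically project a 3-D point onto a radiograph plane.

    The film plane is described by an origin and two orthonormal
    in-plane unit vectors; the return value is the 2-D film coordinate.
    """
    d = np.asarray(p_fixed, float) - plane_origin
    return np.array([d @ u_axis, d @ v_axis])

# Film in the z = 0 plane: a tip at (3, 4, 7) projects to (3, 4).
uv = project_to_radiograph(np.array([3.0, 4.0, 7.0]),
                           np.zeros(3),
                           np.array([1.0, 0.0, 0.0]),
                           np.array([0.0, 1.0, 0.0]))
```

With two such projections from distinct directions, the tip's mark on each radiograph together conveys its three-dimensional location.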
In a further embodiment of this invention, a videotape recording of the computer screen (as well as the direct view of the surgeon and patient) is used to help document the performance of the instant procedure.
Radiation emitters may be present on more than one standard surgical tool such as the microscope, scalpel, forceps, and cauterizer, each of which thereby becomes, in effect, a probe. These emitters should be differentiated from each other in the same manner as aforesaid.
The method and apparatus of the optical mensuration and correlation apparatus 10 of the present invention has been completely described. While some of the numerous modifications and equivalents of the system of this invention have been described herein, still other modifications and changes will readily occur to those of ordinary skill in the art. For instance, the preferred embodiment described herein uses visible light, since human operators can readily observe if the light sources are operative or whether they are causing troublesome reflections. Clearly, other wavelengths of electromagnetic radiation could be used without departing from the spirit and scope of the invention. Non-visible light, such as infrared or ultra-violet light, would have the advantage of not distracting the surgeon with flashing lights. Ultra-sound could be used conveniently. Other modifications to the detector "optics" and lenses are possible which would change, and possibly improve, the image characteristics on the detectors. For example, toroidal lenses could be used which are longitudinally curved along an arc with a radius equal to the focal length of the lens. Similarly, the surfaces of the photodetectors could also be curved, thus allowing the images of distant light sources to remain in sharp focus, regardless of their positions. Numerous enhancements of the digital data are possible by suitably programming the computer.
The most preferred aspects of this invention use electromagnetic radiation, and especially visible light, as the radiation from the emitters. This use of light for this function is a major improvement over the use in the prior art of audible sound emitters and detectors.
However, prior art systems which are based on the use of sound emitters can be reprogrammed to carry out the operations to substantially continuously recorrelate the position and orientation of the second object during the surgical procedure, as they have been described herein.
Thus, the movement of the second object can be at least frequently, if not continuously, tracked using sound emitters and detectors and suitable temperature compensation techniques. In this last regard, the aforementioned ability of the instant system to determine and transmit the temperature of the probe tip can be used to good advantage when using sound as the radiation of choice. The fact that the use of electromagnetic radiation, particularly light, emitters and sensors is an improvement over the use of audible sound emitters and sensors is not intended to be a limitation of the practice of continuously or frequently following the movement of the first or the second objects. That is an invention in and of itself using any emitter-sensor pair.
It should be understood, however, that the transmission between emitters and sensors operates differently, and measures different things, when electromagnetic radiation is used as compared to the use of sound, audible or ultrasonic. In the case of electromagnetic radiation, what is being measured is the angle that the radiation path makes between the emitter and the sensor relative to some arbitrary fixed line. By measuring all of these angles of the rays between the emitters and the sensors, conventional analytic geometry solutions will locate the points in space where the various emitters are. On the other hand, when sound radiation is used, what is being measured is the distance between each of the emitters and the sensors. Again, conventional analytic geometry solutions will precisely locate the point in space which is occupied by each emitter. While the casual observer, or the operator of the systems of this invention, will not observe any difference in result, there is a marked difference in the way that result is achieved, and therefore this will necessitate a difference in the manner in which this system is programmed.
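For the sound-based (distance-measuring) case, one conventional analytic-geometry solution is trilateration: subtracting the first range-sphere equation from the others linearizes the problem, so the emitter position follows from a least-squares solve. A hedged sketch (hypothetical names; four non-coplanar sensors are assumed to avoid the mirror ambiguity of a three-sensor solution):

```python
import numpy as np

def trilaterate(sensors, distances):
    """Locate an emitter from its measured distances to >= 4 sensors.

    Each sensor i constrains |x - s_i|^2 = d_i^2; subtracting the
    first equation from the rest gives the linear system
    2 (s_i - s_0) . x = d_0^2 - d_i^2 + |s_i|^2 - |s_0|^2.
    """
    S = np.asarray(sensors, float)
    d = np.asarray(distances, float)
    A = 2.0 * (S[1:] - S[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(S[1:] ** 2, axis=1) - np.sum(S[0] ** 2)
    return np.linalg.lstsq(A, b, rcond=None)[0]

sensors = np.array([[0.0, 0, 0], [1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
emitter = np.array([0.3, 0.2, 0.5])
ranges = np.linalg.norm(sensors - emitter, axis=1)
found = trilaterate(sensors, ranges)
```

The angle-measuring (electromagnetic) case would instead intersect rays from the sensors, which is why the two radiation choices require different programming even though they yield the same located point.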
The foregoing is illustrative of the principles of the invention. Since numerous modifications and changes will readily occur to those of ordinary skill in the art, given the teachings of this specification, this invention is not limited to the exact construction and operation shown and described herein. Accordingly, all suitable modifications and equivalents that may be resorted to in light of the disclosure of this specification are considered to fall within the scope of the invention as defined by the following claims.
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear radiation emitter means in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish between radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first and second objects, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
In accordance with another aspect of the invention, there is also provided a system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from said objects;
said first and second objects located within said fixed coordinate system;
at least three non-collinear fiducial markers in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers;
at least two spaced apart electromagnetic radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the positions and orientations of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said positions and orientations of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system.
In accordance with yet another aspect of the invention, there is further provided in a system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means, in known spacial relationship to said present time three-dimensional fixed coordinate system, which are spaced from each other and from said objects;
said first and second objects located within said fixed coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
at least three non-collinear fiducial markers in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers in said fixed coordinate system;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said fiducial markers on said second object;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said fiducial markers on said second object into said fixed coordinate system;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system;
the improvement which comprises said radiation being electromagnetic radiation.
In accordance with yet another aspect of the invention, there is further provided in a system for determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which is spaced from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear fiducial marker means in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object; and in present time, repeatedly determining the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
the improvement which comprises:
radiation emitter means associated in known spacial relationship to said fiducial markers;
means to determine the location of said fiducial markers on said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data.
The method of this invention relates to the operation of the above described apparatus. This method will be described in relation to "seeing" the location of a point of a surgical probe inside the cranium of a patient, where the inside of the patient's cranium has been previously "seen" on an MRI or a CT scan. In carrying out this method, both the head of the patient, the second object according to this invention, and the surgical probe, the first object according to this invention, will be moved in illustration of the novel operation of this invention.
The method of this invention, described relative to the above described apparatus, includes the steps of:
at a previous time, taking an MRI or a CT or the like, which may hereinbelow be referred to sometimes as the previous scan, through the patient's head with a sufficient number and location of slices as to reveal the internal structures of the patient's head. Where there is an abnormality, the number and location of the slices should be sufficient to depict that abnormality in at least one slice;
establishing and storing an electronic file of the scan in the form of a model of the internals of the patient's head, including the abnormality, if there is one;
relating that electronic file to a present time coordinate system;
at the present time, as opposed to the previous time when the scan was originally taken, at sufficiently frequent intervals to accurately follow present time movement, detecting the positions of at least three of the radiation emitters operatively associated with the patient's head (the patient's head is the second object in the generic description of this invention);
at the present time, at sufficiently frequent intervals to follow present time movement, computing, from the detected positions of these emitters, the moving locations and orientation of the second object relative to the predetermined fixed coordinate system;
electronically adjusting the stored model to display a view of that model which corresponds to the computed present time location and orientation of the moving second object in the same present time coordinate system;
at the present time, at sufficiently frequent intervals to follow present time movement, detecting the locations of at least two of the radiation emitters operatively associated with the probe (the probe is the first object in the generic description of this invention);
at the present time, computing, from the detected positions of these emitters, at sufficiently frequent intervals to follow present time movement, the position and orientation of the first object in present time relative to the predetermined fixed coordinate system;
at sufficiently frequent intervals to track the movement of the first object relative to the second object, determining the positions and orientations of the first object relative to the second object by correlating the positions and orientations of the first object relative to the predetermined fixed coordinate system with the positions and orientations of the second object relative to the same predetermined fixed coordinate system;
determining the correlation between the relative position and orientation of the probe with respect to the model of the object; and indicating the location of the probe with respect to the object by displaying a representation of the position and orientation of the probe in present time on the presently displayed model of the previously taken scan of the internal structures of the cranium.
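The correlation step just described reduces, at each sampling instant, to mapping the probe tip's coordinates in the fixed coordinate system through the head's present time pose into the frame of the stored scan model. The following Python sketch is purely illustrative (the function name and the use of NumPy are the editor's assumptions; the patent discloses no code):

```python
import numpy as np

def tip_in_model(tip_fixed, R_head, t_head):
    """Map a probe-tip location measured in the fixed coordinate
    system into the head-relative (local) frame in which the
    previously taken scan model is stored.  The head's present-time
    pose is given by x_fixed = R_head @ x_local + t_head, so the
    inverse mapping is x_local = R_head^T @ (x_fixed - t_head)."""
    return R_head.T @ (np.asarray(tip_fixed, float) - np.asarray(t_head, float))
```

Repeating this mapping at the strobe rate of the emitters yields the present time cursor position drawn on the displayed slice.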
Therefore, according to one aspect of the invention there is provided a method of determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
defining a present time three-dimensional global fixed coordinate system;
providing at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
disposing said moveable first and second objects located within said fixed coordinate system;
defining a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
providing at least three non-collinear radiation emitter means in known spacial relationship to said second object and at known coordinates in said local coordinate system;
taking, at a previous time, three-dimensional image data which geometrically describe said second object;
providing at least two spaced apart radiation emitter means disposed on said first object;
providing first movement means to move said first object relative to said second object and to said fixed coordinate system;
providing second movement means to move said second object relative to said first object and to said fixed coordinate system;
transmitting radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordinate system;
distinguishing radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
independently determining the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
determining the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
independently determining the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
determining the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
determining the position and orientation of said second object in said local coordinate system;
orienting said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
integrating the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
in present time, repeatedly determining the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and in present time, repeatedly determining the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeatedly determining and integrating the positions and orientations of said first and second objects, and correlating said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is positioned correctly in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
15n The present invention also provides for the use of a system wherein said first object is a surgical probe and said second object is a surgical patient, said surgical probe being partially insertable within said patient such that said inserted portion is not visible to an inserter thereof.
The present invention further contemplates the use as above wherein at least two of said emitter means are disposable on so much of said first object as is not insertable into said patient, said at least two of said emitter means being disposed on said first object at a known distance from a tip of said probe which is adapted for insertion into said patient.
In the description of this invention, the radiation emitters have been located on the first and second objects, and the sensors for these radiations have been located in fixed positions within the present time coordinate system. It is considered that this arrangement of these emitters and sensors could readily be reversed. That is, the emitters could occupy the fixed positions in the present time coordinate system and the sensors could be located on the first and second objects.
The invention would operate in the same manner with either arrangement of sensors and emitters. For convenience, the description of this invention has placed the emitters on the moving objects and the sensors in fixed positions.
This arrangement should not be considered to be a limitation on either the apparatus or method of this invention.
Brief Description of the Drawings
The accompanying drawing figures illustrate a preferred embodiment of the present invention and, together with the description, serve to explain the principles of the invention.
Figure 1A is a block flow diagram of the optical mensuration and correlation system of the present invention showing the major components of this system, except for the means to automatically measure the position and orientation of the movable second object.
Figure 1B is similar to Fig. 1A but includes the additional radiation emitters which will permit automatically measuring the position and orientation of the second object even during the movement of this second object.
Figure 2 is a perspective view illustrating the invention in use by a surgeon performing intracranial surgery on a patient, and showing a cursor on a display screen that marks the corresponding position of the tip of the probe (the first object) within the image of previously obtained model data corresponding to the cranium (the second object).
Figure 3 is a view of a sample display showing a position of the tip of the probe superimposed on previously obtained model data of an inside slice of a cranium and showing the reference points of the second object, as depicted in Figure 1A, as triangles on the patient's skull.
Figure 4 is a schematic perspective view of a sample of one of the one-dimensional photodetectors which are useful in the practice of the present invention.
Figure 5 is a graph of the image intensity (manifested as a voltage or current) versus locations on the photodetector surface for a typical light detector which could be used by the radiation supported mensuration and correlation apparatus of the present invention.
Figures 6 and 7 are diagrams of the major steps performed by the computer to calculate the position of the probe (first object) with respect to the model of the inside of the cranium (the second object) and to display a cross-sectional image slice of the model of the inside of the cranium on a suitable display screen, such as a computer screen (CRT).
Figure 8 is a schematic view of radiation beams depicting an embodiment of this invention.
Detailed Description of The Preferred Embodiments of This Invention
The radiation mensuration and correlation apparatus 10 of the present invention, as applied to a medical application, which is illustrative of one use of this invention, is shown schematically in Figure 1. It comprises a hand-held invasive probe 12 (first object) housing at least two radiation emitters 14 and 16 mounted collinear with one another and with the tip 18 of the probe 12. At least three remotely located, one-dimensional radiation sensors 20, 22, and 24 are mounted in fixed, spaced relationship to each other and are located at known positions with respect to a predetermined fixed coordinate system 80. The radiation sensors 20, 22, and 24 sense the radiation widely projected by the individual emitters 14 and 16 and generate electrical output signals from which are derived the locations of the probe emitters 14 and 16 and, consequently, the position and orientation of the probe tip 18 (which may not be visible to the surgeon because it is within the cranial cavity), with respect to the fixed coordinate system 80.
In addition, where it is desired to determine the moving position of the cranium during the operation, the three sensors 20, 22, and 24 can be programmed to sense and derive the locations of other reference emitters 70, 72, and 74 on the second object 11 (Figure 1B) in the same manner as for the probe emitters 14 and 16. The role of these reference emitters on the second object is to automate the calculation of the relationships between the present time coordinate system of the model's image 13 (Figure 2) of the second object, the coordinate system of the sensors, and the local coordinate system of the second object itself 11.
A control unit 30 connected to the moveable probe 12 via a data line 26 and coupled to the remotely located sensors 20, 22, and 24 via data lines 28, 32, and 34, respectively, synchronizes the sensing of the five (exemplary) emitters and the differentiation between them.
In that embodiment of this invention where the various emitters emit radiation in pulses, as by strobing them, the control unit is adapted to control the time multiplexing of the two emitters 14 and 16 on the probe and the three emitters 70, 72 and 74 on the cranium, controls the operation of the sensors 20, 22, and 24, and receives differentiatable data from these sensors as will be more completely described below. A coordinate computer 36, coupled to the control unit 30 by a data line 38, calculates the three-dimensional spatial location of the probe emitters 14 and 16 and consequently the position and orientation of the probe tip 18, and correlates those positions with data from correlation information 42 and from a model 13 of the second object 11 which has been previously stored electronically in an electronically accessible database 40. Finally, the computer 36 causes an associated cathode ray tube-monitor (CRT) to display the representation of the position and the orientation of the probe tip 18 with respect to the computer image 13 of the cranium 11 on display screen 44 (Figure 2) as will be more fully described below.
The probe 12 could be used without the cable 26, in that it could be coupled to the control unit 30 by employing distinctive modulation of the light emitters 14 and 16 instead of sequentially energizing (strobing) them, or by varying the wavelength or type of the radiation emitted therefrom. For example, the wave forms, color, or frequencies of each could be different. In the case of using sound radiation, the frequencies of the sound emitted by the different emitters could be varied so as to differentiate between them. The controller 30, by detecting the differences between different emitters, that is the wave form, color, frequency or other dissimilarity, of the emitted radiation, can determine to which emitter the sensors 20, 22, and 24 are reacting.
The fundamental mensuration and correlation apparatus 10 of the present invention has been illustrated in connection with aiding surgeons performing delicate intracranial surgery. This general use of this apparatus does not constitute this invention, nor is it a limitation thereon. This use will only serve to illustrate this invention. The remaining description continues to use such a surgical embodiment as illustrative, although many other surgical or other applications besides intra-cranial surgery are possible (for example, back or sinus surgery and breast biopsy). Moreover, the radiation mensuration and correlation apparatus 10 of this invention may be used for other purposes in many various medical or non-medical fields. In the described embodiment, the physical object 11 of interest, that is the second object in the generic application of this invention, is the head or cranium of a patient, and the model of the cranium is replicated using a series of parallel internal image slices (of known mutual spatial relationship) such as those obtained by means of computed tomography (CT) or nuclear magnetic resonance imaging (MRI). These image slices are then digitized, forming a three-dimensional computer model of the patient's cranium which is then stored in the electronically accessible database 40.
As shown in Figures 1A, 1B, 2, and 3, a surgeon places the tip 18 of the probe 12, that is the first object, at any point on or inside the cranium 11 of the patient. The position sensors 20, 22, and 24 detect the locations of the emitters 14 and 16 attached to the portion of the probe 12 that remains outside the patient's body. In order to accomplish this, the radiation produced by the emitters 14 and 16 must be "visible" to the sensors 20, 22, and 24. For that reason, more than two emitters may be placed on the first object so that the radiation from at least two of them will always be visible to the sensors. These emitters 14 and 16 are effectively point sources and radiate energy through a wide angle so that this radiation is visible at the sensors over a wide range of probe orientations and positions.
The sensors 20, 22, and 24, the control unit 30, and the computer 36 cooperate to determine the three-dimensional location of each emitter 14 and 16 within a coordinate system, and compute the coordinates of each emitter in the predetermined fixed coordinate system 80, in present time. The computer 36 can then calculate the position and orientation of the tip 18 of the probe 12 with respect to the predetermined fixed coordinate system 80, according to the locations of the emitters within the fixed coordinate system 80 and the dimensions of the probe, which dimensions had been placed into the memory (not shown) of the computer 36 beforehand. It should be noticed that the computer 36 can also easily compute position and orientation information about other specific locations on the probe (such as the vector from emitter 14 to the tip 18). Once the computer 36 has calculated the location of the probe tip 18 with respect to the fixed coordinate system 80, the computer 36 then uses the relationship between the model of the cranium, which had previously been obtained and stored in the database 40, and the fixed coordinate system 80 to calculate the position and orientation of the probe tip 18 in relation to the model of the second object 11. Finally, the computer 36 displays a representation of the model-relative position and the orientation of the tip 18 on a display screen 44. In a simple form of the preferred embodiment of this invention, the computer 36 accomplishes this display by accessing a previously taken CT or MRI
image slice 13 stored in the database 40 that is closest to the present time position of the probe tip 18, and then superimposes a suitable representation 76 of the tip 18 on the image 13 as shown in Figures 2 and 3. Thus, the surgeon knows the precise position and orientation of the tip 18 in the patient's cranium relative to the image data by merely observing the display screen 44. A most preferred form of the present invention can derive and display an arbitrary oblique cross-section through the multiple image slices of the MRI, etc., where the cross-section can be, for example, perpendicular to the probe orientation.
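Because the two emitters are collinear with the tip at known distances, the tip location follows from the two measured emitter locations by simple extrapolation along the probe axis. The sketch below is an illustration only (function and parameter names are assumed, not taken from the patent):

```python
import numpy as np

def probe_tip(e_rear, e_front, front_to_tip):
    """Extrapolate the tip position from two emitters collinear with
    it: e_rear (e.g. emitter 14) and e_front (e.g. emitter 16) are
    the measured 3-D emitter locations, and front_to_tip is the known
    distance from the front emitter to the tip (e.g. tip 18)."""
    axis = np.asarray(e_front, float) - np.asarray(e_rear, float)
    axis /= np.linalg.norm(axis)  # unit vector pointing toward the tip
    return np.asarray(e_front, float) + front_to_tip * axis
```

The known distances referred to in the text are exactly the `front_to_tip` value (and the inter-emitter spacing implicit in the two measured points) that must be loaded into the computer beforehand.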
The details of the optical mensuration and correlation apparatus 10 of the present invention are best understood by reference to Figures 1 and 4 collectively.
Essentially, the probe 12 supports the two radiation emitters 14 and 16, which are rigidly attached to the probe 12 at fixed, known distances from each other as well as from the probe tip. Since only two emitters are used here as being representative of the practice of this invention, the emitters 14 and 16 should preferably be collinear with the tip 18 of the probe 12 so that the computer 36 can determine uniquely the position and orientation of the tip 18 in three dimensions. Moreover, for reasonable measurement accuracy, the emitters 14 and 16 should preferably be at least as far from each other as the nearest one is from the tip 18. In any case, the geometrical relationship of the emitters 14 and 16 to each other and to the probe tip 18 must be specified to the computer 36 beforehand so that the computer 36 can compute the exact location of the tip 18 based on the locations of the individual radiation emitters 14 and 16. The use of three or more non-collinear emitters would not require that any two of them be collinear with the probe tip.
Three or more non-collinear emitters would permit the computer to compute full position and orientation information (yaw, pitch, and roll) for the probe.
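With three or more non-collinear emitters whose coordinates on the object are known in its local frame, the full pose (rotation plus translation) can be recovered by least-squares fitting. One standard technique is the Kabsch/SVD method, shown here as an illustrative sketch; the patent itself does not prescribe any particular algorithm:

```python
import numpy as np

def rigid_pose(local_pts, measured_pts):
    """Recover the rotation R and translation t mapping emitter
    coordinates known in the object's local frame onto their measured
    fixed-frame locations, i.e. measured = R @ local + t (Kabsch/SVD
    method).  At least three non-collinear points are needed for a
    unique full pose (yaw, pitch, and roll)."""
    P = np.asarray(local_pts, float)
    Q = np.asarray(measured_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)             # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

This is the computation that, in the text above, lets the computer report yaw, pitch, and roll for the probe rather than only the tip location.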
Although the invention is described as showing only a cursor locating the relative position of the probe tip 18, the invention can be modified to display a line or a shaped graphic or other icon to indicate the position and orientation of the probe 12. This would entail only the determination of additional points on the probe in the same way that the tip of the probe is located.
The two radiation emitters 14 and 16, as well as the additional radiation emitters 70, 72, and 74, can be, and preferably are, high intensity light emitting diodes (LEDs), which are preferably coordinated such that the emission for any one source is distinguishable from the emissions from the other sources. One such differentiation is to have the emitters time-multiplexed or strobed by the control unit 30 in a predetermined sequence such that only one light emitter is "on" or emitting light at any one time. The light emitted from any one of these emitters is detected by each of the three light sensors 20, 22, and 24, which then determines the location of each particular emitter in relation to the known positions of the sensors 20, 22, and 24 at the time it is strobed.
Each of the one-dimensional sensors 20, 22, and 24 used in the preferred embodiment 10 of the present invention can be identical to the others in every respect.
Therefore, for the purpose of giving a detailed description of this embodiment, only the sensor 20 is shown and described in detail in figure 4 since the remaining sensors 22 and 24 are identical.
In Figure 4, the representative one-dimensional sensor 20 comprises a cylindrical lens 46 having a longitudinal axis 48 which is orthogonal to the optical axis 50 of the sensor 20. A linear radiation detector 52, such as a charge coupled device (CCD) with several thousand elements (or a similar device capable of linear positional radiation detection of a suitable "image"), is positioned in such a manner that the "optical" axis 50 passes through the center of the aperture 54 of the radiation detector 52 and such that the longitudinal axis of the aperture 54 is orthogonal to the longitudinal axis 48 of the lens 46. Radiation, such as light beams, 56 from the emitters 14 (and in the same manner emitters 16, 70, 72, and/or 74) are focused by the cylindrical lens 46 into a real image line 58 on the surface 60 of linear detector 52.
The detector, illustrated by a photodetector 52, then generates an output 68 (Figure 5) that is related to the position of a real image line 58 on the surface 60 of photodetector 52, thus characterizing the location of the image itself. That is, those elements or points of the photodetector 52 illuminated by the real image line 58 will generate a strong signal, while those not illuminated will generate none or very weak signals. Thus, a graph of image intensity (or signal strength) versus locations on the surface of the photodetector will resemble a signal peak curve 68 (see for example Figure 5). The "all-emitters-off" (or background) signal level 66 is never quite zero due to the effects of environmental radiation, such as light in the operating room, electronic noise, and imperfections in the photodetector. In any event, since the image of the illuminated emitter is focused into line 58, only the angular displacement of emitter 14 from the optical axis 50 in the plane of the longitudinal sensor axis 54 is measured by the sensor 52, hence the designation "one-dimensional sensor".
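The sensor reading just described amounts to finding where the signal peak 68 lies along the detector after the background level 66 is removed. A common way to obtain a sub-element estimate is the intensity-weighted centroid; the following is an illustrative sketch only, not code from the patent:

```python
import numpy as np

def image_line_position(intensity, background):
    """Estimate the location (in detector-element units) of the real
    image line on a linear detector: subtract the all-emitters-off
    background level, clip residual noise at zero, and return the
    intensity-weighted centroid of the remaining signal."""
    signal = np.clip(np.asarray(intensity, float) - background, 0.0, None)
    elements = np.arange(signal.size)
    return float((elements * signal).sum() / signal.sum())
```

The returned element position corresponds directly to the measured angular displacement of the emitter from the sensor's optical axis.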
Thus, a single one-dimensional sensor 20 can only locate the plane on which a radiating emitter 14 lies. The detector 20 cannot, by itself, determine the unique point in space on that plane at which radiating emitter 14 is located. To precisely determine the location in space of the radiating emitter 14 requires at least three such sensors positioned in spaced relationship to each other, since the intersection of the three planes defined by the three sensors, respectively, is required to define a single point in space.
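Each one-dimensional sensor thus constrains the emitter to a plane, and three such planes intersect in the single point sought. Writing each plane as n·x = d, the intersection is a 3x3 linear solve; the sketch below is illustrative (names assumed, not from the patent):

```python
import numpy as np

def emitter_location(normals, offsets):
    """Intersect the three planes n_i . x = d_i (one per sensor):
    stack the plane normals as rows of a 3x3 matrix and solve for the
    unique point x, provided the planes are non-degenerate (the
    matrix is invertible)."""
    return np.linalg.solve(np.asarray(normals, float), np.asarray(offsets, float))
```

If the sensor axes were all parallel the matrix would be singular, which is the algebraic counterpart of the placement requirements described below.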
To locate the position of one particular radiating emitter, such as 14, the sensors 20, 22, and 24 are mounted so that the optical axes of their lenses 48 are not all parallel and no two of such axes are collinear. In a preferred embodiment of this invention, two light sensors, such as sensors 20 and 24 in Figure 2, are situated so that their respective axes 48 (Figure 4) are in parallel, spaced relationship, and the third detector 22 is situated between and equidistant from the other two detectors, but with its axis 48 perpendicular to the axes of the other two. That is, the sensors 20, 22, and 24 should be arranged along a line or arc (Figure 2), such that each sensor 20, 22, and 24 is generally equidistant from the center of the volume in which the measurements are made, equally spaced from each other, and all aimed at the center of the measurement volume.
Suppose for example that the sensors 20, 22, and 24 are arranged along a horizontal arc and the optical axes of all sensors are oriented horizontally. Then the middle sensor should be oriented so as to measure the angular elevation of the radiation emitters as described above.
The two outer sensors measure the horizontal angle (azimuth) relative to the fixed coordinate system 80.
Data from the outer sensors are used to stereographically calculate both the horizontal position and distance from the sensors as will be more fully described below.
The accuracy of three-dimensional measurement depends on the angle formed between the optical axes of the outer two sensors 20 and 24, where the emitter to be measured is at the vertex of the angle. Accuracy will improve as that angle approaches a right angle. At least three of the several possible sensors 20, 22, and 24 must be spaced so that the desired measurement volume is completely within their field of view, which can be accomplished by making the focal length of the lens 46 short enough to provide coverage of the entire desired field of view. In another embodiment of this invention, additional sensors, which may be substantially identical to sensors 20, 22, and 24, could be used to provide more viewpoints, to broaden coverage of the field of view, or to enhance measurement accuracy.
While this process of detecting a given radiating emitter, such as 14, can determine the exact location of the radiating emitter, it cannot by itself determine the particular orientation and position of the probe or its tip 18 in three-dimensional space. To do so with only two emitters requires that both emitters 14 and 16 be collinear with the probe tip 18, as described above.
Also, the distances between each emitter, 14 and 16, and the probe tip (as well as the distances between the emitters 14 and 16 themselves) must be known and loaded into the memory of the computer 36 before the computer 36 can determine the position and orientation of the probe tip 18 from the locations of the emitters 14 and 16 in the fixed coordinate system 80. Consequently, when each of the radiation emitters 14 and 16 is rapidly turned on in sequence, or strobed, the sensors 20, 22, and 24 can detect the exact location of each emitter in turn. Thus computer 36 can determine the exact position and orientation of the probe, and therefore its tip 18. Since only one of the radiation emitters 14 or 16 is on at any one time, the detectors 20, 22, and 24 determine the location of that particular illuminated emitter individually. If the strobe rate, that is, the frequency at which the emitters 14 and 16 are turned on and off in sequence, is fast enough, the detectors 20, 22, and 24 can, for all practical purposes, determine the position and orientation of the probe 12 and its tip 18 at any instant in time, and therefore can follow the movement of the probe tip in present time, that is during the time that the probe tip is actually moving. In other words, this system can simulate the movement of the probe tip on the previously taken image in present time during the surgical procedure.
The sensors 20, 22, and 24 need only distinguish which of the radiation emitters 14, 16, 70, 72, or 74 is on at any one time. In the preferred embodiment of the present invention, this function is accomplished by strobing each of the emitters in sequence, as described above. However, other methods can be used to allow the sensors 20, 22, and 24 to distinguish the respective radiation emitters 14, 16, 70, 72, and 74 from one another. For example, different wavelengths (colors) of light, or different frequencies of sound, could be used in conjunction with detectors capable of distinguishing those particular different radiations.
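The strobing scheme described above may be sketched as follows; the specification prescribes no software implementation, so the emitter identifiers (the reference numerals are reused as stand-in IDs) and the sequencing interface are purely illustrative:

```python
import itertools

# Illustrative strobe sequencer: each emitter is energized in a fixed repeating
# order, so a sensor reading is attributed to an emitter purely by the time at
# which it arrives. IDs below are hypothetical stand-ins.
EMITTERS = [14, 16, 70, 72, 74]

def strobe_order(cycles):
    """Yield emitter IDs in the repeating strobe sequence for `cycles` passes."""
    yield from itertools.islice(itertools.cycle(EMITTERS),
                                cycles * len(EMITTERS))
```

Because only the ordering carries the identification, the receiving side needs no per-emitter tuning in this scheme.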
Alternatively, it is one aspect of this invention to modulate each of the respective radiation emitters 14, 16, 70, 72, and 74 with a unique wave form or pulse train. This means of differentiating between the different emitters is believed to be novel and unique to the instant invention. If such different wave forms or pulse trains are used to differentiate the different emitters, it is also within the spirit and scope of this invention to transmit additional information on these wave forms, such as for example the temperature of the particular structure being contacted by the probe tip 18.
It is within the scope of this invention to provide means readily accessible to the surgeon or other operator to engage or disengage the taking of such additional information and the transmission thereof by the unique wave forms or pulse trains radiated by the respective emitters. Under these circumstances, the control unit 30 or computer 36 will be designed to demodulate the wave form to determine to which particular emitter the sensed signal belongs, and to decode the additional information being transmitted.
Numerous other methods for distinguishing the radiation emitters are possible. Therefore, the present invention should not be regarded as limited to the particular strobing method shown and described herein, but is generic to the use of any means to differentiate between the different emitters.
Conventional or unique auto-focusing or multiple-lens radiation detection may be integrated into the sensors 20, 22, and 24 to improve the performance of the system. However, the simple, fixed-focus optics shown and described herein and shown in Figure 4 for one sensor provide a good level of performance if the working range of the probe is restricted. Even if the real image of an emitter, such as 14, is somewhat out of focus on the detector 52, the angular measurement of the image is still usable. A usable measurement for each of the sensors 20, 22, or 24 to generate will be any of the following: (1) the position of the detector element with peak intensity, (2) the intensity-weighted average (centroid) of all over-threshold elements, or simply (3) the average of the minimum and maximum elements where the intensity is over some threshold. The detector 52 should be placed at the focal distance for the farthest typical operating distance of the radiation emitters. Closer emitters will form slightly defocused images 58, but they require less precise angular measurement for a given distance accuracy.
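The three usable measurements enumerated above may be sketched as follows; the intensity data and threshold are hypothetical, since the specification does not fix any particular implementation:

```python
def peak_element(intensities):
    """Method (1): index of the detector element with peak intensity."""
    return max(range(len(intensities)), key=lambda i: intensities[i])

def centroid(intensities, threshold):
    """Method (2): intensity-weighted average (centroid) of all
    over-threshold elements."""
    over = [(i, v) for i, v in enumerate(intensities) if v > threshold]
    total = sum(v for _, v in over)
    return sum(i * v for i, v in over) / total

def edge_midpoint(intensities, threshold):
    """Method (3): average of the minimum and maximum element indices
    where the intensity is over the threshold."""
    over = [i for i, v in enumerate(intensities) if v > threshold]
    return (min(over) + max(over)) / 2.0
```

On a symmetric, slightly defocused blip all three methods agree, which is why a defocused real image still yields a usable angular measurement.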
Furthermore, their de-focused real images are brighter, which increases the brightness gradient at the edges of the image.
As described so far, the real image 58 of the currently activated emitter must be significantly different from (for example brighter than) the rest of the radiation falling on the sensor 52. Otherwise, other lights or reflective surfaces in the field of view of the sensors will hinder the detection of the emitter's real image. Therefore, it is desirable to include in the apparatus, circuitry to subtract the background radiation received by the sensors from other, ambient, sources.
This per se known circuitry enhances use of the invention where the sensors are required to detect the radiation emitters against relatively bright backgrounds.
While the radiation emitters are all momentarily extinguished, the one-dimensional data from each sensor are saved in a memory. This can be done in an analog delay line or by digitally sampling the output signal and storing it in a digital memory. Then, as each emitter is "viewed" sequentially, the saved data are subtracted from the current data generated by the currently radiating emitter. If the background data are stored digitally, the current data are also digitized, and the stored background data are digitally subtracted from the current data.
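The digital form of this background subtraction may be sketched as follows; the frame data are illustrative, and a real system would operate on the sampled one-dimensional sensor output:

```python
def subtract_background(current, background):
    """Element-by-element difference between the frame captured while one
    emitter radiates and the stored frame captured with all emitters off.
    Negative differences (noise) are clamped to zero."""
    return [max(c - b, 0) for c, b in zip(current, background)]
```

Only the elements illuminated by the emitter's real image survive the subtraction, which is what allows detection against a bright background.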
A graphical representation of the radiation intensity of the image or, equivalently, the generated output voltage amplitude for each element in a row of detecting elements, is shown in Figure 5. The graph depicts typical background image intensities 66 with all emitters off, the intensities 68 with one radiation emitter on, and the element-by-element difference 64 between the intensities with the emitter off and those with it on. The measurements will likely contain some random noise, electronic or otherwise, and two consecutive measurements for a given sensor element may differ slightly even where the background is unchanged.
Therefore, the differential intensities 64 between two consecutive measurements also contain some random electronic noise. However, the two measurements differ substantially only at the location of the radiation emitter image, and this difference exceeds the threshold level 62.
The details of the structure and operation of the control unit 30 are best seen in Figure 6.
Specifically, control unit 30 (see Figure lA and 1B) supplies power to the radiation emitters 14, 16, 70, 72, and 74 and the radiation sensors 20, 22, and 24. A
control and synchronization unit 84 and radiation source sequencer 88 (where a strobed radiation sequencing is used) time-multiplexes or strobes the radiation emitters individually, as described above, so that the position and orientation of the probe tip 18 (Figure 1) can be determined from the signals received from the sensors 20, 22, and 24. The angular data signals received from the sensors 20, 22, and 24 are converted by an analog-to-digital converter 92. Actually, three analog-to-digital converters are used, as shown in Figure 6, but only one is labeled and described herein for brevity, since the other two analog-to-digital converters are substantially identical and are used to convert the signals from the other sensors 22 and 24.
The control and synchronization unit 84 also controls three switches, of which switch 93 is typical, which store all digital data received from the sensors 20, 22, and 24 when the radiation emitters 14 and 16 are off, directing these data into a background memory 94. Then, when the radiation emitters 14, 16, 70, 72, and 74 are illuminated in sequence by radiation source sequencer 88, the synchronization and control unit 84 changes the state of switch 93, which then redirects the data from the three sensors 20, 22, and 24 to a subtraction unit 91. The subtraction unit 91 subtracts the background data from the emitter radiation data, thus resulting in a signal which has been relatively freed from the background signal 66 (Figure 5), since the fixed pattern noise has been subtracted from the signal.
As shown in Figure 6, which should be considered in conjunction with Figure 5, a 1-D (one-dimensional) position calculation unit 95 determines the location of the real image line 58 on the CCD sensor 52 (Figure 4) by measuring the locations of the edges 67 and 69 of the signal blip 68 (Figure 5) generated by the CCD sensor based on a predetermined threshold signal level 62. The 1-D position calculation unit 95 then averages the distance between the two edges to find the center of the signal peak 68 as shown in Figure 5. This method of determining the center of the signal peak is per se well known in the art and need not be described in further detail. Moreover, numerous other methods of determining the location of the signal peak or its centroid are known in the art and will be obvious to those of ordinary skill in the art. The method used depends on the signal characteristics of the radiation sensor used as well as the characteristics of the lens system used to focus the radiation onto the surface of the detector, in addition to other parameters. Those practicing this invention with the various alternatives described herein would have no trouble selecting a signal detection algorithm best suited to the particular characteristics of the sensors and the particular radiation being used.
Finally, the control unit 30 (Figure 1) transmits the radiation data to the computer 36. That is, when the computer 36 is ready to compute the current location of the currently radiating emitter, such as 14, the latest angular data from all sensors 20, 22, and 24 are provided for analysis. If the sensors generate data faster than the control unit 30 can process them, the surplus angular data are simply discarded.
The operation of the computer 36 is most advantageously set forth in Figure 7. The computer 36 calculates one-dimensional positions for each radiation emitter such as 14 or 16, based on the location of the signal peak from each respective sensor 20, 22, and 24.
These one-dimensional angular position measurements are then used to determine the three-dimensional spatial coordinates of the emitters 14 and 16, and thus the position and orientation of the probe 12 relative to the predetermined fixed coordinate system 80, by coordinate transformation methods which are per se well-known in the art. The output signals from the computer 36 can be in any form desired by the operator or required by the application system, such as XYZ coordinate triples based upon the predetermined fixed coordinate system 80.
Figure 8 and the following paragraphs describe in detail how the location of a single radiation emitter, such as 14, is computed from the data derived from the sensors 20, 22, and 24. The following description applies to these three sensors 20, 22, and 24 only. If there are more than three such sensors, the calculation can be performed using any three or more of the sensors.
Furthermore, if more than three sensors are used, the average of the points calculated from all combinations of three sensors could be used to increase accuracy. Another option is to use the point calculated from the three sensors closest to the radiation emitter 14 or 16. The following parameters are considered to be known XYZ
constants:
D0[i], one endpoint of each linear photodetector i;
D1[i], the other endpoint of linear photodetector i;
L0[i], one endpoint of the axis of each lens i; and L1[i], the other endpoint of the axis of lens i.
Each sensor generates T[i], a parametric value between 0 and 1 indicating where the peak or center of the line image of the emitter intersects the line segment between D0[i] and D1[i]. The XYZ coordinates of point S are to be calculated, where S is the location of the radiation emitter. For a CCD radiation detector array, T[i] is the index of the element on which the center or peak of the image falls divided by the number of elements on the detector array.
The three-dimensional coordinates of the above points are all referenced to a predetermined fixed coordinate system 80. The cylindrical lens and linear photodetector do not directly measure the angle A of the radiation emitter about its lens axis; rather, they measure a value T[i] linearly related to the tangent of that angle:
tan(A) = C * (2 * T[i] - 1), where C is a constant of proportionality that is related to, and determined empirically by, the dimensions of a particular system.
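The relation above may be illustrated in code; the calibration constant C below is a placeholder, since the specification states that C is determined empirically for a particular system:

```python
import math

C = 1.0  # hypothetical calibration constant; determined empirically in practice

def angle_from_T(t):
    """Recover the angle A about the lens axis from the parametric detector
    value T (0..1), per tan(A) = C * (2*T - 1)."""
    return math.atan(C * (2.0 * t - 1.0))
```

Note that T = 0.5, the center of the detector, corresponds to an emitter lying on the lens axis (A = 0).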
The three-dimensional location of the image line on the linear photodetector is:
D[i] = (1 - T[i]) * D0[i] + (T[i]) * D1[i]
If the lens is ideal, then S also lies in plane P[i]. In reality, the point D[i] might have to be computed by a non-linear function F(t) that corrects for non-linear aberrations of the lens or the photodetector:
D[i] = (1 - F(T[i])) * D0[i] + (F(T[i])) * D1[i]
Function F(t) could be a polynomial in variable T, or it could be a value interpolated from an empirically determined table.
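The interpolation with correction may be sketched as follows; the default polynomial coefficients reduce F(t) to the identity (recovering the ideal-lens formula), and any real coefficients would come from the empirical calibration described above:

```python
def F(t, coeffs=(0.0, 1.0)):
    """Correction polynomial evaluated at t. The default coefficients give
    F(t) = t (an ideal lens); real coefficients are hypothetical here and
    would be fitted to the measured lens/photodetector aberrations."""
    return sum(c * t**k for k, c in enumerate(coeffs))

def image_point(d0, d1, t):
    """3-D location D[i] of the image line: interpolate between the detector
    endpoints D0[i] and D1[i] using the corrected parameter F(T[i])."""
    f = F(t)
    return tuple((1 - f) * a + f * b for a, b in zip(d0, d1))
```

With a table-based calibration, F(t) would instead interpolate between empirically determined entries.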
P[i] is the unique plane determined by the three points D[i], L0[i], and L1[i], which are never collinear.
S is the point of intersection of the planes P[1], P[2], and P[3] determined respectively by sensors 1, 2, and 3.
S is a unique point if at least two of the sensor lenses' longitudinal axes 48 are not parallel and if no two lens axes 48 are collinear. The intersection point is found by finding the common solution S of the three equations defining the planes P[i]. Once the location S of each of the probe's radiation emitters is computed, the location of the probe's tip 18 can be calculated. The method of making such a determination is well known using the teaching of analytic geometry and matrix manipulations.
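One way to sketch the computation of S is to represent each plane P[i] by a normal vector and an offset (normal · S = d) and solve the resulting three-by-three linear system; the use of Cramer's rule here is illustrative only, and a practical implementation would also guard against near-singular sensor geometry:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as three rows."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def intersect_planes(planes):
    """Common point S of three planes, each given as (normal, d) with
    normal . S = d, solved by Cramer's rule."""
    normals = [list(n) for n, _ in planes]
    ds = [d for _, d in planes]
    D = det3(normals)  # zero when the plane normals are degenerate
    coords = []
    for k in range(3):
        m = [row[:] for row in normals]
        for r in range(3):
            m[r][k] = ds[r]
        coords.append(det3(m) / D)
    return tuple(coords)
```

For example, the planes x = 1, y = 2, and z = 3 intersect at (1, 2, 3), matching the geometric condition stated above that no two lens axes be collinear.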
If M is a linear transformation describing the relationship between a point R in the image coordinate system and a point S in the fixed coordinate system, then:
R * M = S.
If M⁻¹ is the inverse of M and if S is a point in the fixed coordinate system, then the point R in the image coordinate system corresponding to S is:
S * M⁻¹ = R.
Now, suppose that the second object is moved in the mensuration coordinate system. This can be described by a linear transformation U where the coordinates S of a point are mapped to the coordinates S':
S * U = S'
Then the old value of M above must be multiplied by U in order to correct the relationship between the point R in the image coordinate system and the corresponding point in the mensuration coordinate system because of the relative movement of the first object with respect to the second object:
R = S' * U⁻¹ * M⁻¹
The preliminary steps required before practicing the method of the invention are now described. Then, after fully describing these preliminary steps, the detailed steps of the method of the optical mensuration and correlation apparatus are described.
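Assuming, for illustration only, that M and U are homogeneous transformation matrices acting on row vectors (the specification fixes no particular matrix representation), the mapping update derived above may be sketched as:

```python
import numpy as np

# Hypothetical transforms: M maps image coordinates to the fixed frame
# (translate +10 in x); U describes a patient movement (translate +5 in y).
M = np.eye(4); M[3, :3] = [10.0, 0.0, 0.0]
U = np.eye(4); U[3, :3] = [0.0, 5.0, 0.0]

R = np.array([1.0, 2.0, 3.0, 1.0])  # a point in the image coordinate system
S = R @ M                           # the same point in the fixed frame (R * M = S)
S_moved = S @ U                     # its coordinates after the patient moves (S * U = S')
R_back = S_moved @ np.linalg.inv(U) @ np.linalg.inv(M)  # R = S' * U^-1 * M^-1
```

The round trip recovers the original image-space point, which is how the system keeps the stored model registered to a patient who moves during the procedure.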
Use of the invention takes place in three phases: the imaging phase, the correlation phase, and the normal operation phase. The imaging phase precedes the normal operation of the present invention. During the imaging phase, a scan of the body of the second object of interest is used to build a three-dimensional geometrical model. In the preceding description, the second object was the head of a human intracranial surgical patient because the invention is advantageously used in stereotactic neurosurgery. Accordingly, the three-dimensional model comprises digital data from a series of internal cross-sectional images obtained from computed tomography (CT), magnetic resonance (MRI), ultrasound, or some other diagnostic medical scanner. In any case, the image data are stored in a suitable, electronic memory 40 which can be accessed later by the computer 36. The data are considered to be stored as a series of parallel two-dimensional rectangular arrays of picture elements (pixels), each pixel being an integer representing relative density. If the object is relatively rigid, like a human head, this three-dimensional model may be created at some time before the correlation and operational phases of the invention and possibly at another location.
Also, during the imaging phase, at least three non-collinear reference points 71, 73, and 75 (Figures 2 and 3) must be identified relative to the object 11.
These may be represented by ink spots, tattoos, radiopaque beads, well-defined rigid anatomical landmarks, locations on a stereotactic frame, sterile pins temporarily inserted into rigid tissue or bone of a surgical patient, or some other reference means. The coordinates of these reference points are measured and recorded relative to the coordinate system of the imaging device. One way to accomplish this is to capture the reference points as part of the previously made three dimensional model itself.
For example, radiopaque pins could be placed within the image planes of diagnostic CT slices; the pin locations, if not automatically detectable from their high density, can be identified interactively by the surgeon using a cursor on the computer display of the CT slices. See Figure 3.
The initializing of the position and the orientation of the second object, the patient's cranium, is well known in this art. The instant invention departs from this well known operation to add radiation emitters which have a known and exact spatial relation to these fiducial markings. These additional radiation emitters must then be programmed or otherwise activated for use in a particular manner in order to practice the instant invention. They must be programmed to radiate at some frequency during the surgical procedure which is in progress during present time, so that the position and orientation of the second object will be available to the surgeon at all relevant times, and so that this position and orientation can be repeatedly and automatically updated. This permits the system to revise the specific selected scan slice upon which the position and the orientation of the first object are superimposed, in a correct depiction of the actual relative positions and orientations of both the first and second objects in present time during the surgical procedure.
The initial correlation mode immediately precedes the normal operational phase of the present invention and must take place in the operating room.
During this initial correlation phase, the instant system accesses the data of the three-dimensional geometrical model of the patient (or other object), including the reference point (fiducial marker) coordinates which were recorded earlier. Next, the surgeon may place the tip of the probe 18 at each of the reference points 71, 73, and 75 on the patient, in turn. This sequence of operations may be directed by the computer program. In the alternative, the system of this invention provides these data automatically, by the radiation from the emitters 70, 72, and 74 being received by the sensors directly and automatically without special intervention by the surgeon. Either of these procedures establishes an initial relationship between the locations of these reference points in the model coordinate system and their current physical locations in the fixed coordinate system 80. However, the preferred determination of this initial position and orientation also carries on during the whole of the surgical procedure and therefore is capable of substantially continuously updating the position and orientation of the second object and relating it to the current position and orientation of the probe. In turn, this establishes a linear mathematical relationship between all points in the model and points in the coordinate system 80. Thereafter, when the patient is moved relative to the sensors, the prior art must establish a new relationship by again digitizing the reference points 71, 73, and 75 within the coordinate system 80. That is, the correlation phase must be repeated. Again, the system of this invention uses the emitters 70, 72 and 74 to accomplish this automatically.
For this reason, the automatic tracking of the position of the head, or the second object whatever that is, which is described below and which overcomes this problem, is an essential, significant feature of the present invention.
Since the position and orientation of the head is initially and substantially continually thereafter correlated with the model, the surgeon can relate any locations of interest on the diagnostic images with the corresponding physical locations on this patient during the operation, and vice versa. These include locations accessible to the probe tip 18 but not necessarily directly visible to the surgeon.
Having described the function and purpose of the preliminary steps, the detailed method of the present invention is more easily understood. As shown in Figure 7, the position data 21 of the probe emitters generated by the sensors and control unit are converted into three-dimensional coordinates relative to the predetermined fixed coordinate system 80 of the sensors. Using dimensional parameters describing the relationship among the probe emitters and the probe tip, the computer determines the coordinates of the probe tip in a step 39.
During the initial correlation phase, the probe tip may be placed at each of the reference points 71, 73, and 75 in turn. Alternatively, in accord with a preferred aspect of this invention, the emitters 71, 73 and 75 are located by the sensors and the correct position and orientation of the second object is thereby determined. The coordinates of the second object in the fixed coordinate system along with their coordinates 46 in the image coordinate system determine a unique linear transformation relating the two coordinate systems in a step 45. This is a per se known calculation in analytic geometry and matrix mathematics.
As noted above, a more automated and direct method of determining the location of the second object is to directly read the locations of the fiducial points 71, 73 and 75 by the fixed sensors 20, 22 and 24. This can be accomplished by placing radiation emitters 70, 72, and 74 (Figure 1B) at those reference points (or in a known fixed spatial relationship to those reference points 71, 73, and 75). The emissions of these emitters can then be read directly by the sensors 20, 22, and 24, and thus the computer can then automatically determine their locations relative to the predetermined fixed coordinate system 80 of the sensors. Thus the position and orientation of the second object, the cranium in the preferred embodiment of this invention, can be automatically and substantially continuously determined. With the position and orientation of the second object being at least frequently, if not substantially continuously, updated, the position of the first object, the probe, which is also determined at least frequently, if not substantially continuously, can then be updated in relation to the second object at the same frequency. Since the position and orientation of both the first and the second objects are each at least frequently determined and updated in relation to the fixed coordinate system 80, the position and orientation of each of these first and second objects can then be determined relative to each other, by indirect, but well known, calculations which are easily carried out in short order by a computer.
It has been stated herein that the position and orientation of the second object, the cranium, can, according to this invention, be determined continuously or at least frequently. The frequency at which the position and orientation of the second object is determined is a function of the desires of the operator of this system and the frequency at which the radiation emitters and the sensors can be operated. In the case where the emitters all emit the same wavelength of radiation and the sensors all sense this same wavelength of radiation, the differentiation of the emissions of the several emitters is followed in a sequential pattern. Thus, in this embodiment of this invention, the emitters will emit radiation in sequence, for example 14, then 16, then 70, then 72 and then 74. The sensors will have been programmed to identify a signal with an emitter as a function of when the signal is received.
In this embodiment of this invention, the position and orientation of the first object, the probe, is determined with the same frequency as is the location and the orientation of the second object, the cranium, because all of the emitters radiate in sequence. However, the system can be programmed so that the emitters 14 and 16 fire more or less frequently than the emitters 70, 72 and 74. Under these conditions, the position and orientation of the first object and of the second object will be determined at different individual frequencies, that is at the same frequency as the frequency of the radiation from their respective emitters. It will therefore be clear that the frequency of determination of the location of any given emitter, and therefore the determination of the position and orientation of these first and second objects, is, because of the instant invented system, for the first time entirely controllable by the programmer or the operator, within the capabilities of the system operating the emitters.
However, it should be understood that the position and orientation of the first and/or second objects can be determined in a substantially continuous manner. In this embodiment of this invention, each emitter will radiate a different wave length or wave form or frequency pulse of radiation. Therefore, the radiation emitted from each emitter is simultaneously distinct from the radiation emitted from the other emitters. Under these conditions, the location of each emitter can be determined continuously by a set of sensors which is tuned to the specific, different radiation of each emitter.
Therefore, the location of each emitter can be determined continuously, whereby the position and orientation of either or both of the objects can be calculated by the computer from these continuous locations of the different emitters.
While it is a significant distinction of this invention from the prior art that:
in the prior art:
the first object is intended to be moved and the second object is intended to be stationary; and the position and orientation of the first object is frequently determined, but the position and orientation of the second object is only determined at the start of the operation and at any time that the second object, the cranium, which is intended not to be moved at all during the operation, is known by the surgeon to have been moved;
whereas according to this invention:
the first object is intended to be moved, and the second object is not intended to be rigidly immobilized in place, or, put another way, the second object is permitted to move and is even expected to move;
and the position and the orientation of the first object is frequently determined, and the position and orientation of the second object is also frequently determined. The position and orientation of these two objects may be determined at the same frequency or at different frequencies (or even continuously) as desired by the operator.
In preferred embodiments of this invention, the position and orientation of the second object will be determined from one hundredth to ten times, most preferably from a quarter as often to four times, as often as the frequency at which the position and orientation of the first object is determined. As a general proposition, there is no limit on the relationship between these frequencies of measurement. The preferred relationships set forth herein are illustrative and not limiting. The frequency of each measurement is dependent on the amount of movement which is allowed and is intended to be shown on the CRT. The upper limit on this frequency is determined by the ability of the emitters to be distinguished. There is no lower limit.
In the instant specification, the emitters have been described as being on the first and second objects and being movable therewith, and the sensors have been described as being in a fixed relation to the coordinate system. While this is the preferred system, it is by no means the only configuration of the system of this invention. It is also within the scope of this invention to provide the emitters in fixed relationship to the coordinate system, and the sensors on the first and second objects, respectively. The wiring may be somewhat more cumbersome in this configuration, but that should not detract from the viability of such a reversal.
This invention has been described with reference to a first and a second moving object, and the determination of each of their absolute and relative positions and orientations in a fixed coordinate system.
It will be clear that this same system applies to more than two objects. In fact, the position and orientation of any number of objects can be determined, both absolutely with respect to the coordinate system, and relatively with respect to each other, by the practice of this invention. Thus, when this specification and the claims appended hereto speak of a first and a second object, these can be two out of any number of total objects. This number is merely illustrative of the practice of this invention and is in no way limiting thereon.
Thus, the preferred system of this invention performs the three primary tasks of this invention, preferably, but not necessarily, simultaneously:
the absolute position and orientation of the second object, the cranium, in the fixed coordinate system is determined at least very frequently;
the relationship between the absolute position and orientation of the second object with respect to the previously taken images of that object, particularly the inside structures of that object, is determined at least very frequently; and the absolute position and orientation of the first object, the probe, is determined at least very frequently.
The accomplishment of these three tasks then permits the computer to accomplish the three essential secondary tasks of this invention:
to calculate the position and orientation of the first object, the probe, in relation to the second object, the cranium, even though the first object, or a portion of it, is out of the line of sight of either the surgeon or the sensors;
to select the appropriate slice of the previously taken model of the interior of the second object which corresponds to the present time position and orientation of the first object in relation to the present time position and orientation of the second object; and to display the appropriate slice of the previously taken image of the second object with the present time position and orientation of the first object correctly depicted thereon.
Both the initial and the continual correlation determinations can be automatically initiated and updated by the computer 36 in some predetermined timed sequence or continuously. In fact, according to the most preferred aspect of this embodiment of this invention, the correlation phase is repeated frequently, briefly from time to time, or even continuously, interspersed between measurements in the operational phase or conducted simultaneously with the operational phase of the practice of this invention, for the purpose of recalculating the linear transformations M and M' when the second object (such as a surgical patient) moves relative to the sensors.
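The recalculation of a transformation such as M or M' from measured emitter positions can be sketched as a least-squares rigid-body fit. The patent does not specify a numerical method, so the helper below is only illustrative; it uses the well-known SVD-based (Kabsch) solution for the rotation and translation that best map the known local coordinates of the emitters onto their currently sensed positions:

```python
import numpy as np

def rigid_transform(model_pts, sensed_pts):
    """Least-squares rigid transform mapping model_pts onto sensed_pts.

    model_pts, sensed_pts: (N, 3) arrays of corresponding 3-D points,
    e.g. the known local coordinates of the emitters on the second
    object and their currently measured positions in the fixed
    coordinate system.  Returns (R, t) with sensed ~= R @ model + t.
    """
    cm = model_pts.mean(axis=0)                   # centroid of model points
    cs = sensed_pts.mean(axis=0)                  # centroid of sensed points
    H = (model_pts - cm).T @ (sensed_pts - cs)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R, t
```

Repeating this fit each time a fresh set of emitter positions is sensed is what allows the correlation to track a patient who moves relative to the sensors.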
During normal operation, the tip coordinates are transformed in a step 44 using the transformation computed in step 45. The new transformed coordinates, relative to the image coordinate system, are used to determine the plane of some two-dimensional cross-section through the three-dimensional image model 41 accessible in the memory 43. The simplest method is to choose the existing diagnostic image plane located closest to the probe tip's coordinates relative to the model coordinate system.
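Choosing the existing diagnostic image plane closest to the transformed tip coordinates might look like the following sketch. It assumes, purely for illustration, that the stored slices are axial and indexed by their known positions along the z axis of the model coordinate system:

```python
import numpy as np

def nearest_slice(tip_model_xyz, slice_positions):
    """Index of the stored diagnostic slice closest to the probe tip.

    tip_model_xyz: probe-tip coordinates already transformed into the
    image (model) coordinate system.  slice_positions: 1-D sequence of
    the known positions of each slice along the scan (z) axis.
    """
    # Distance from the tip's z coordinate to each slice plane
    return int(np.argmin(np.abs(np.asarray(slice_positions, float)
                                - tip_model_xyz[2])))
```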
In any case, a step 47 transforms the two-dimensional cross-sectional slice to a screen image and places a cursor on it to mark the location of the probe tip superimposed in the image. Scaling and viewing parameters determine how the image is displayed. Because the surgeon may not be able to simultaneously view the patient (object) and the computer display screen, the step 47 should be controlled by the surgeon, for example by placing an activating button on the probe.
Pressing the button can be the signal for freezing the image and the depicted position and orientation of the probe tip marker at that instant on the display screen.
In a more complex embodiment of this invention, the computer system could generate and display on the screen a cut-away view at an arbitrary angle, for example, perpendicular to the direction the probe is pointing, using the data from multiple image slices. In simpler cases, the computer simply displays any one or more convenient image slices through the location of the probe tip. For example, the displayed slice might simply be the original CT slice which includes the location of the probe tip, or is closest to that location. In any case, the computer then causes a cursor marking the current position of the probe tip to be displayed on this previously taken image of a slice through the second object.
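For the more complex cut-away view perpendicular to the probe direction, the display step first needs a pair of in-plane axes spanning that cutting plane. A minimal sketch follows; the helper name and the choice of axes are assumptions (any in-plane rotation of u and v is equally valid):

```python
import numpy as np

def cutting_plane_basis(direction):
    """Orthonormal in-plane axes (u, v) for a cut-away view
    perpendicular to the probe's pointing direction."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    helper = np.array([1.0, 0.0, 0.0])
    if abs(d[0]) > 0.9:                 # avoid a helper nearly parallel to d
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)                  # already unit length: d and u orthonormal
    return u, v
```

Voxels of the three-dimensional model would then be resampled at points tip + a*u + b*v to build the oblique image.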
An alternative means to record the location of the reference points in the coordinate space of the imaging apparatus during the imaging phase employs an additional, separate instance of the three-dimensional position mensuration probe, sensors, control unit, and computer of the present invention. In order to implement this embodiment of this invention, the additional sensors are permanently attached directly on the imaging apparatus. The additional probe measures the location of the reference points at the time of imaging, and the additional control unit and computer determine and record their locations relative to the coordinate system of the imaging apparatus. The advantage of this approach is that the fiducial markers, that is, the landmarks or reference pins, need not be within the limited cross-sectional slices visible to the imaging device.
As an alternative to true three-dimensional images, standard x-ray radiographs from several distinct directions can be used to construct a crude model in lieu of the imaging phase described above. Radiographs from two or more directions are digitally scanned, and four non-coplanar reference points on them are identified with a cursor or light pen. In a correlation phase similar to that described above, these four points on the patient are digitized just prior to surgery. Then, during surgery, the location of the probe tip is projected onto the digitized computer images of the two-dimensional radiographs where the projection is uniquely defined by mapping and transferring the reference point coordinates from the model coordinate system to the fixed sensor coordinate system.
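A projection uniquely defined by four non-coplanar reference points is consistent with an affine (parallel-projection) camera model, which has exactly eight parameters: four point correspondences supply the eight equations needed. The patent does not name a camera model, so the following is a sketch under that assumption, with illustrative function names:

```python
import numpy as np

def fit_affine_projection(pts3d, pts2d):
    """Fit an affine camera from four (or more) non-coplanar reference
    points: solves for A (2x3) and b (2,) so that pixel ~= A @ X + b."""
    X = np.hstack([np.asarray(pts3d, float),
                   np.ones((len(pts3d), 1))])       # homogeneous (N, 4)
    sol, *_ = np.linalg.lstsq(X, np.asarray(pts2d, float), rcond=None)
    A, b = sol[:3].T, sol[3]
    return A, b

def project_tip(A, b, tip_xyz):
    """Map the probe-tip coordinates onto the digitized radiograph."""
    return A @ np.asarray(tip_xyz, float) + b
```

One such projection would be fitted per radiograph direction, and the probe-tip marker drawn at the projected pixel on each digitized image.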
In a further embodiment of this invention, a videotape recording of the computer screen (as well as the direct view of the surgeon and patient) is used to help document the performance of the instant procedure.
Radiation emitters may be present on more than one standard surgical tool such as the microscope, scalpel, forceps, and cauterizer, each of which thereby becomes, in effect, a probe. These emitters should be differentiated from each other in the same manner as aforesaid.
The method and apparatus of the optical mensuration and correlation apparatus 10 of the present invention have been completely described. While some of the numerous modifications and equivalents of the system of this invention have been described herein, still other modifications and changes will readily occur to those of ordinary skill in the art. For instance, the preferred embodiment described herein uses visible light, since human operators can readily observe whether the light sources are operative or whether they are causing troublesome reflections. Clearly, other wavelengths of electromagnetic radiation could be used without departing from the spirit and scope of the invention. Non-visible light, such as infrared or ultraviolet light, would have the advantage of not distracting the surgeon with flashing lights. Ultrasound could also be used conveniently. Other modifications to the detector "optics" and lenses are possible which would change, and possibly improve, the image characteristics on the detectors. For example, toroidal lenses could be used which are longitudinally curved along an arc with a radius equal to the focal length of the lens. Similarly, the surfaces of the photodetectors could also be curved, thus allowing the images of distant light sources to remain in sharp focus regardless of their positions. Numerous enhancements of the digital data are possible by suitably programming the computer.
The most preferred aspects of this invention use electromagnetic radiation, and especially visible light, as the radiation from the emitters. This use of light for this function is a major improvement over the use in the prior art of audible sound emitters and detectors.
However, prior art systems which are based on the use of sound emitters can be reprogrammed to carry out the operations described herein, so as to substantially continuously recorrelate the position and orientation of the second object during the surgical procedure.
Thus, the movement of the second object can be at least frequently, if not continuously, tracked using sound emitters and detectors and suitable temperature compensation techniques. In this last regard, the aforementioned ability of the instant system to determine and transmit the temperature of the probe tip can be used to good advantage when using sound as the radiation of choice. The fact that the use of electromagnetic radiation, particularly light, emitters and sensors is an improvement over the use of audible sound emitters and sensors is not intended to be a limitation of the practice of continuously or frequently following the movement of the first or the second objects. That is an invention in and of itself using any emitter-sensor pair.
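First-order temperature compensation for a sound-based range measurement can be sketched as follows, using the standard approximation for the speed of sound in dry air, c = 331.3 + 0.606 T m/s with T in degrees Celsius; the function name is illustrative and not part of the described system:

```python
def sound_distance(time_of_flight_s, temp_c):
    """Time-of-flight range with first-order temperature compensation.

    time_of_flight_s: measured emitter-to-sensor travel time in seconds.
    temp_c: air temperature in degrees Celsius, e.g. as reported by a
    temperature sensor near the probe tip.
    Returns the distance in metres.
    """
    c = 331.3 + 0.606 * temp_c   # speed of sound in dry air, m/s
    return c * time_of_flight_s
```

At 20 degrees C the speed is about 343 m/s, so a 1 ms flight time corresponds to roughly 0.343 m; an uncompensated 10 degree error would bias each range by nearly 2 percent.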
It should be understood, however, that the transmission between emitters and sensors operates differently, and measures different things, when electromagnetic radiation is used as compared to sound, audible or ultrasonic. In the case of electromagnetic radiation, what is being measured is the angle that the radiation path between the emitter and the sensor makes relative to some arbitrary fixed line. By measuring all of these angles of the rays between the emitters and the sensors, conventional analytic geometry solutions will locate the points in space where the various emitters are. On the other hand, when sound radiation is used, what is being measured is the distance between each of the emitters and the sensors. Again, conventional analytic geometry solutions will precisely locate the point in space which is occupied by each emitter. While the casual observer, or the operator of the systems of this invention, will not observe any difference in result, there is a marked difference in the way that result is achieved, and therefore a difference in the manner in which this system is programmed.
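The sound-based case, where distances rather than angles are measured, reduces to the classical trilateration problem. A minimal sketch of one conventional analytic solution follows, using the standard linearization in which the first range equation is subtracted from the others; it assumes at least four non-coplanar sensors so that the least-squares solution is unique:

```python
import numpy as np

def trilaterate(sensors, distances):
    """Locate an emitter from its measured distances to known sensors.

    sensors: (N, 3) array of sensor positions in the fixed coordinate
    system; distances: length-N array of measured ranges.  Subtracting
    the first range equation |x - s_i|^2 = d_i^2 from the rest leaves a
    linear system in the unknown position x.
    """
    S = np.asarray(sensors, float)
    d = np.asarray(distances, float)
    A = 2.0 * (S[1:] - S[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(S[1:] ** 2, axis=1) - np.sum(S[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

The light-based case would instead intersect rays defined by the measured angles, but both reduce to solving a small linear system per emitter.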
The foregoing is illustrative of the principles of the invention. Since numerous modifications and changes will readily occur to those of ordinary skill in the art, given the teachings of this specification, this invention is not limited to the exact construction and operation shown and described herein. Accordingly, all suitable modifications and equivalents that may be resorted to in light of the disclosure of this specification are considered to fall within the scope of the invention as defined by the following claims.
Claims (78)
1. A system for determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spatial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear radiation emitter means in fixed spatial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system, thereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first and second objects, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
2. A system as claimed in claim 1 wherein said first and second objects intersect, which system further comprises means to display so much of said previously taken three dimensional image data as corresponds to the intersecting location of said first and second objects, and means to display said representation of said first object on said image display in its correct present time position.
3. A system as claimed in claim 1 wherein said frequency of determining the position and orientation of said second object is 0.01 to 100 times the frequency of determining the position and orientation of said first object.
4. A system as claimed in claim 1 wherein the position and orientation of said second object, the integration of the position and orientation of said second object and said image data, and the position and orientation of said first object are all determined substantially continuously.
5. The system as claimed in claim 1 wherein said previously taken three dimensional image data comprise a succession of substantially two dimensional slices at previously determined, known locations and orientations through said second object.
6. The system as claimed in claim 1 wherein said radiation emitter means are disposed on said first and second objects and said sensor means are spaced from said first and second objects and are disposed in fixed relationship to said fixed coordinate system.
7. The system as claimed in claim 1 wherein said sensor means are disposed on said first and second objects and said emitter means are spaced from said first and second objects and are disposed in fixed relationship to said fixed coordinate system.
8. The system as claimed in claim 1 wherein said radiation is electromagnetic radiation.
9. The system as claimed in claim 8 wherein said radiation is light.
10. The system as claimed in claim 9 wherein light of different wavelengths is emitted from each different emitter means.
11. The system as claimed in claim 8 wherein said radiation is pulse radiated from different of said emitter means sequentially in a repeating order.
12. The system as claimed in claim 1 wherein said radiation is sound.
13. The system as claimed in claim 1 wherein said radiation emitter means and said sensor means comprise means for detecting linear and rotational inertial motion.
14. The system as claimed in claim 1 wherein said previously taken image data are taken by a means which is not suitable for use during a surgical procedure.
15. The system as claimed in claim 1 wherein said first object is a surgical probe and said second object is a surgical patient, wherein said surgical probe is adapted to be partially inserted into said patient such that said inserted portion is not visible to an inserter thereof.
16. The system as claimed in claim 15 wherein at least two emitter means are disposed on so much of said first object as is not inserted into said patient, and wherein said so disposed emitter means on said first object are a known distance from a tip of said object which is adapted for insertion into said patient.
17. The system as claimed in claim 1 wherein said image data are selected from the group consisting of magnetic resonance image data, computed tomography data, ultrasound data, and X-radiation data.
18. The system as claimed in claim 1 wherein said second object is a patient, and said first object is an operating microscope, and wherein said image data are projected onto the field of said operating microscope.
19. The system as claimed in claim 1 wherein said second object is a part of the anatomy of a human.
20. The system as claimed in claim 19 wherein the part of the anatomy is a head.
21. The system as claimed in claim 19 wherein there are reference points on said second object, which are distinguishing anatomical features, which are in fixed spatial relationship to said emitter means associated with said second object.
22. The system as claimed in claim 1 operated intermittently, but sufficiently frequently to depict the relative movement of said first object relative to said second object, and the movement of said second object relative to said fixed coordinate system.
23. The system as claimed in claim 2 including means for monitoring in present time the movement of said first object in true relation to said previously taken three dimensional image data.
24. The system as claimed in claim 1 including means for selecting the slice of said previously taken image data which corresponds to the present time location of said first object such that a representation of said first object is correctly disposed on said image slice in present time.
25. The system as claimed in claim 1 including means to freeze a display of a previously taken image and a representation of said first object for a finite period of time.
26. The system as claimed in claim 1 including means to continuously display a series of slices of said previously taken three dimensional data which correspond to a changing position of said first object relative to said second object, and which correctly reflect movement of said second object relative to said fixed coordinate system in present time.
27. A system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spatial relationship to said present time three-dimensional fixed coordinate system which are spaced from said objects;
said first and second objects located within said fixed coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
at least three non-collinear fiducial markers in fixed spatial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers;
at least two spaced apart electromagnetic radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system, thereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the positions and orientations of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said positions and orientations of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system.
28. A system as claimed in claim 27 wherein said first and second objects intersect, which system further comprises means to display so much of said previously taken three dimensional image data as corresponds to the intersecting location of said first and second objects, and means to display said representation of said first object on said image display in its correct present time position.
29. A system as claimed in claim 27 wherein the integration of the position and orientation of said second object and said image data, the position and orientation of said first object, and the correlation of the position and orientation of said first object with said image data are all determined substantially continuously.
30. The system as claimed in claim 27 wherein said previously taken three dimensional image data comprise a succession of substantially two dimensional slices through said second object.
31. The system as claimed in claim 27 wherein said radiation emitter means are disposed on said first object and said sensor means are spaced from each other and from said first and second objects and are disposed in known relationship to said fixed coordinate system.
32. The system as claimed in claim 27 wherein said sensor means are disposed on said first object and said radiation emitter means are spaced from said first object and are disposed in known relationship to said fixed coordinate system.
33. The system as claimed in claim 27 wherein said radiation is light.
34. The system as claimed in claim 33 wherein said radiation is visible light.
35. The system as claimed in claim 33 wherein light of different wavelengths is emitted from each different emitter means.
36. The system as claimed in claim 33 wherein said light is pulse radiated from different of said radiation emitter means sequentially in a repeating order.
37. The system as claimed in claim 27 wherein said radiation emitter means and said sensor means comprise means for detecting linear and rotational inertial motion.
38. The system as claimed in claim 27 wherein said previously taken image data are taken by a means which is not suitable for use during a surgical procedure.
39. The system as claimed in claim 27 wherein said first object is a surgical probe and said second object is a surgical patient, wherein said surgical probe is adapted to be partially inserted into said patient such that said inserted portion is not visible to an inserter thereof.
40. The system as claimed in claim 39 wherein at least two emitter means are disposed on so much of said first object as is not inserted into said patient, and wherein said so disposed emitter means on said first object are a known distance from a tip of said probe which is adapted for insertion into said patient.
41. The system as claimed in claim 27 wherein said image data are selected from the group consisting of magnetic resonance image data, computed tomography data, ultrasound data, and X-radiation data.
42. The system as claimed in claim 27 wherein said second object is a patient, and said first object is an operating microscope, and wherein said image data are projected onto the field of said operating microscope.
43. The system as claimed in claim 27 wherein said second object is a part of the anatomy of a human.
44. The system as claimed in claim 43 wherein the part of the anatomy is a head.
45. The system as claimed in claim 43 wherein there are reference points on said second object, which are distinguishing anatomical features, which are in fixed spacial relationship to said second object.
46. The system as claimed in claim 27 operated intermittently, but sufficiently frequently to depict the relative movement of said first object relative to said second object.
47. The system as claimed in claim 28 including means for monitoring in present time the movement of said first object in true relation to said previously taken three dimensional image data.
48. The system as claimed in claim 27 including means for selecting the slice of said previously taken image data which corresponds to the present time location of said first object such that a representation of said first object is correctly disposed on said image slice in present time.
49. The system as claimed in claim 27 including means to freeze a display of a previously taken image and a representation of said first object for a finite period of time.
50. The system as claimed in claim 27 including means to continuously display a series of slices of said previously taken three dimensional data which correspond to a changing position of said first object relative to said second object, and which correctly reflect the position and orientation of said second object relative to said fixed coordinate system in present time.
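Several of the claims above (e.g. claims 27 and 45) turn on determining the position and orientation of the second object by integrating the determined locations of at least three non-collinear fiducial markers at known local coordinates. The patent does not name an algorithm for this step; the following is a minimal editorial sketch of one standard way to do it, rigid point-set registration (the Kabsch/Horn method). The function name and the NumPy dependency are the editor's assumptions, not part of the claims.

```python
# Hedged sketch: recover the rigid transform (rotation R, translation t)
# that maps fiducial coordinates known in the local coordinate system
# onto their measured locations in the fixed coordinate system, so that
# fixed ~= R @ local + t. This is the Kabsch/Horn SVD method; the
# patent claims do not specify any particular algorithm.
import numpy as np

def rigid_registration(local_pts, fixed_pts):
    """Matched N x 3 point sets -> (R, t) with fixed ~= R @ local + t."""
    local_pts = np.asarray(local_pts, dtype=float)
    fixed_pts = np.asarray(fixed_pts, dtype=float)
    c_local = local_pts.mean(axis=0)
    c_fixed = fixed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (local_pts - c_local).T @ (fixed_pts - c_fixed)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_fixed - R @ c_local
    return R, t
```

With three non-collinear markers the transform is fully determined; additional markers over-determine it and the SVD yields the least-squares fit, which is why the claims require "at least three non-collinear" fiducials.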
51. In a system for determining the present time location and orientation of a moveable first object with respect to a second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means, in known spacial relationship to said present time three-dimensional fixed coordinate system, which are spaced from each other and from said objects;
said first and second objects located within said fixed coordinate system;
a three dimensional local coordinate system which is fixed in relation to said second object;
at least three non-collinear fiducial markers in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object including said fiducial markers in said fixed coordinate system;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
means to independently determine the location of each of said fiducial markers on said second object;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said fiducial markers on said second object into said fixed coordinate system;
means to orient said previously taken three dimensional image data of said second object such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
wherein, as a consequence of said repeated determinations and integrations of the positions and orientations of said first object, and correlation of said position and orientation of said second object with said previously taken three dimensional image data, an image representative of said first object is correctly positioned in present time on said previously taken image data regardless of the movement of said first object in relation to said fixed coordinate system;
the improvement which comprises said radiation being electromagnetic radiation.
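Claim 51's "means to independently determine the location of each of said radiation emitter means" from at least three spaced sensors is, in one common realization, a multilateration problem. The sketch below assumes range (distance) measurements between an emitter and four or more non-coplanar sensors at known fixed-frame coordinates; the claims do not restrict the measurement type, so this is illustrative only, and all names are the editor's.

```python
# Hedged sketch: linearized multilateration. Given sensor positions
# s_i in the fixed coordinate system and measured distances d_i to an
# emitter x, each |x - s_i|^2 = d_i^2. Subtracting the first equation
# cancels the |x|^2 term, leaving a linear least-squares system.
import numpy as np

def multilaterate(sensors, dists):
    """Sensors (N x 3, N >= 4, non-coplanar) + distances -> emitter xyz."""
    s = np.asarray(sensors, dtype=float)
    d = np.asarray(dists, dtype=float)
    A = 2.0 * (s[1:] - s[0])
    b = (np.sum(s[1:] ** 2, axis=1) - d[1:] ** 2) \
        - (np.sum(s[0] ** 2) - d[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

An optical implementation, as contemplated by the visible-light claims, would more likely triangulate from sensed directions rather than ranges, but the least-squares structure is analogous.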
52. In a system for determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
a present time three-dimensional fixed coordinate system;
at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from said movable objects;
said moveable first and second objects located within said fixed coordinate system;
a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
at least three non-collinear fiducial marker means in fixed spacial relationship to said second object and at known coordinates in said local coordinate system;
previously taken three-dimensional image data which geometrically describe said second object;
at least two spaced apart radiation emitter means disposed on said first object;
first movement means to move said first object relative to said second object and to said fixed coordinate system;
second movement means to move said second object relative to said first object and to said fixed coordinate system;
means to transmit radiation between said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to distinguish radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
means to independently determine the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object; and in present time, repeatedly determining the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object;
the improvement which comprises:
radiation emitter means associated in known spacial relationship to said fiducial markers;
means to determine the location of said fiducial markers on said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
means to determine the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
means to determine the position and orientation of said second object in said local coordinate system;
means to orient said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
means to integrate the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data; and means to, in present time, repeatedly determine the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data.
53. A method of determining the present time location and orientation of a moveable first object with respect to a moveable second object and for graphically indicating the corresponding position and orientation of said first object on a previously taken image of said second object, which comprises:
defining a present time three-dimensional global fixed coordinate system;
providing at least three radiation sensor means in known spacial relationship to said present time three-dimensional fixed coordinate system which are spaced from each other and from said movable objects;
disposing said moveable first and second objects within said fixed coordinate system;
defining a three-dimensional local coordinate system, which remains fixed with respect to said second object, but is movable, along with said second object, within said fixed coordinate system;
providing at least three non-collinear radiation emitter means in known spacial relationship to said second object and at known coordinates in said local coordinate system;
taking, at a previous time, three-dimensional image data which geometrically describe said second object;
providing at least two spaced apart radiation emitter means disposed on said first object;
providing first movement means to move said first object relative to said second object and to said fixed coordinate system;
providing second movement means to move said second object relative to said first object and to said fixed coordinate system;
transmitting radiation between said radiation emitter means on said first and second objects and said radiation sensor means in known relation to said fixed coordinate system;
distinguishing radiation emitted from any one radiation emitter means from radiation emitted from all of the other radiation emitter means;
independently determining the location of each of said radiation emitter means on said first object as a function of said radiation being transferred between at least two of said radiation emitter means on said first object and said radiation sensor means in known relation to said fixed coordinate system;
determining the position and orientation of any point on said first object in relation to said fixed coordinate system by integrating the determined location of at least two of said radiation emitter means on said first object;
independently determining the location of each of said radiation emitter means associated with said second object as a function of said radiation being transferred between said radiation emitter means associated with said second object and said radiation sensor means in known relation to said fixed coordinate system;
determining the position and orientation of said second object in said fixed coordinate system by integrating the determined location of said radiation emitter means associated with said second object;
determining the position and orientation of said second object in said local coordinate system;
orienting said previously taken three dimensional image data of said second object in said local coordinate system such as to match the present time position and orientation of said second object with the previously taken said image data;
integrating the present time determined position and orientation of said second object with the present time determined position and orientation of said first object in the same fixed coordinate system whereby integrating the present time determined position and orientation of said first object in relation to said previously taken three dimensional image data;
in present time, repeatedly determining the position and orientation of said first object with sufficient frequency to enable displaying the movement of said first object in relation to said second object; and in present time, repeatedly determining the position and orientation of said second object with sufficient frequency to enable correlating a moved position and orientation of said second object with a view of said previously taken three dimensional image data which is consistent with the moved position and orientation of said second object and allows for the correct present time positioning of an image corresponding to the present time position and orientation of said first object in relation to said previously taken image data;
wherein, as a consequence of said repeatedly determining and integrating the positions and orientations of said first and second objects, and correlating said position and orientation of said second object with said previously taken three dimensional image data, positioning an image representative of said first object correctly in present time on said previously taken image data regardless of the movement of at least one of said first and second objects in relation to said fixed coordinate system.
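Claims 36 and 63 distinguish emitters by pulsing them "sequentially in a repeating order." The editorial sketch below shows the resulting time-division demultiplexing: because exactly one emitter fires per time slot, the slot index alone identifies which emitter produced a detection. All names are illustrative, not taken from the patent.

```python
# Hedged sketch of time-division emitter identification per claims
# 36/63: emitters fire one per slot in a fixed repeating order, so a
# detection's chronological slot number identifies its emitter.
def demultiplex(detections, n_emitters):
    """Group a chronological list of detections (one per time slot)
    into per-emitter lists using the repeating firing order."""
    per_emitter = [[] for _ in range(n_emitters)]
    for slot, measurement in enumerate(detections):
        per_emitter[slot % n_emitters].append(measurement)
    return per_emitter
```

The wavelength-coding alternative of claims 35/62 achieves the same disambiguation in parallel rather than in time, at the cost of wavelength-selective sensing.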
54. A method as claimed in claim 53 wherein said first and second objects intersect, which method further comprises displaying so much of said previously taken three dimensional image data as corresponds to the intersecting location of said first and second objects, and displaying said representation of said first object on said image display in its correct present time position.
55. A method as claimed in claim 53 including determining the position and orientation of said second object at a frequency which is 0.01 to 100 times the frequency of determining the position and orientation of said first object.
56. A method as claimed in claim 53 including substantially continuously determining the position and orientation of said second object, the integration of the position and orientation of said second object and said image data, and the position and orientation of said first object.
57. The method as claimed in claim 53 including taking three dimensional image data which comprise a succession of substantially two dimensional slices at a previous time at known locations and orientations through said second object.
58. The method as claimed in claim 53 including disposing said radiation emitter means on said first and second objects, spacing said sensor means a distance from said first and second objects, and disposing said sensor means in fixed relationship to said fixed coordinate system.
59. The method as claimed in claim 53 including disposing said sensor means on said first and second objects, spacing said emitter means a distance from said first and second objects, and disposing said radiation emitters in fixed relationship to said fixed coordinate system.
60. The method as claimed in claim 53 wherein said radiation is electromagnetic radiation.
61. The method as claimed in claim 60 wherein said radiation is light.
62. The method as claimed in claim 61 including emitting light of different wave lengths from each different emitter means.
63. The method as claimed in claim 60 including pulse radiating said radiation from different of said emitter means sequentially in a repeating order.
64. The method as claimed in claim 53 wherein said radiation is sound.
65. The method as claimed in claim 53 including detecting linear and rotational inertial motion through said radiation emitter means and said sensor means.
66. The method as claimed in claim 53 including taking said previously taken image data by a means which is not suitable for use during a surgical procedure.
67. Use of a system according to claim 1 wherein said first object is a surgical probe and said second object is a surgical patient, said surgical probe being partially insertable within said patient such that an inserted portion is not visible to an inserter thereof.
68. Use as claimed in claim 67 wherein at least two of said emitter means are disposable on so much of said first object as is not insertable into said patient, said at least two of said emitter means being disposed on said first object at a known distance from a tip of said probe which is adapted for insertion into said patient.
69. The method as claimed in claim 53 including selecting said image data from the group consisting of magnetic resonance image data, computed tomography data, ultrasound data, and X-radiation data.
70. The method as claimed in claim 53 wherein said second object is a patient, and said first object is an operating microscope, and including projecting said image data onto the field of an operating microscope.
71. The method as claimed in claim 53 wherein said second object is a part of the anatomy of a human.
72. The method as claimed in claim 71 wherein the part of the anatomy is a head.
73. The method as claimed in claim 71 wherein there are fiducial reference points on said second object, which are distinguishing anatomical features, which are in fixed spacial relationship to said emitter means associated with said second object.
74. The method as claimed in claim 53 which is operated intermittently, but sufficiently frequently to depict the relative movement of said first object relative to said second object, and the movement of said second object relative to said fixed coordinate system.
75. The method as claimed in claim 54 including monitoring, in present time, the movement of said first object in true relation to said previously taken three dimensional image data.
76. The method as claimed in claim 53 including selecting the slice of said previously taken image data which corresponds to the present time location of said first object, and correctly positioning a representation of said first object on said selected image slice in present time.
77. The method as claimed in claim 53 including freezing a display of a previously taken image and a representation of said first object thereon for a finite period of time.
78. The method as claimed in claim 53 including continuously displaying a series of slices of said previously taken three dimensional data which correspond to a changing position of said first object relative to said second object, and correctly reflecting thereon the movement of said second object relative to said fixed coordinate system in present time.
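Two of the techniques recited above lend themselves to a short numerical illustration: claim 68 locates the hidden probe tip by extrapolating along the line through two emitters mounted at a known distance from the tip, and claim 76 selects the stored image slice corresponding to the probe's present-time position. The sketch below is an illustrative reading of those claims, not the patented implementation; the function names, the use of NumPy, and the assumption that slices are stacked along the z axis are all the author's own for this example.

```python
import numpy as np

def probe_tip_from_emitters(e1, e2, tip_distance):
    """Estimate the hidden probe tip by extrapolating along the probe axis.

    e1, e2 -- 3-D positions (fixed coordinate system) of two emitters on the
    visible portion of the probe; e2 is the emitter nearer the tip.
    tip_distance -- known distance from e2 to the tip along the probe axis.
    """
    e1, e2 = np.asarray(e1, float), np.asarray(e2, float)
    axis = (e2 - e1) / np.linalg.norm(e2 - e1)  # unit vector toward the tip
    return e2 + tip_distance * axis

def nearest_slice(tip_patient, slice_positions_mm):
    """Index of the previously taken image slice closest to the tip's
    coordinate along the slicing axis (assumed here to be z)."""
    z = tip_patient[2]
    return int(np.argmin(np.abs(np.asarray(slice_positions_mm) - z)))

# Example: emitters 80 mm apart along z, tip 120 mm beyond the forward emitter.
tip = probe_tip_from_emitters([0, 0, 0], [0, 0, 80], 120.0)
idx = nearest_slice(tip, slice_positions_mm=[0, 50, 100, 150, 200, 250])
```

In this toy configuration the tip lands at z = 200 mm, so the fifth stored slice (index 4) would be displayed with the probe representation positioned on it, per claim 76.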
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US5204293A | 1993-04-22 | 1993-04-22 | |
US5204593A | 1993-04-22 | 1993-04-22 | |
US08/052,042 | 1993-04-22 | ||
US08/052,045 | 1993-04-22 | ||
PCT/US1994/004298 WO1994023647A1 (en) | 1993-04-22 | 1994-04-22 | System for locating relative positions of objects |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2161126A1 CA2161126A1 (en) | 1994-10-27 |
CA2161126C true CA2161126C (en) | 2007-07-31 |
Family
ID=26730097
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002161126A Expired - Fee Related CA2161126C (en) | 1993-04-22 | 1994-04-22 | System for locating relative positions of objects |
Country Status (9)
Country | Link |
---|---|
US (4) | US5622170A (en) |
EP (2) | EP0700269B1 (en) |
JP (1) | JPH08509144A (en) |
AU (1) | AU6666894A (en) |
CA (1) | CA2161126C (en) |
DE (2) | DE69432961T2 (en) |
IL (1) | IL109385A (en) |
WO (1) | WO1994023647A1 (en) |
ZA (1) | ZA942812B (en) |
Families Citing this family (554)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6331180B1 (en) | 1988-05-03 | 2001-12-18 | Sherwood Services Ag | Target-centered stereotaxtic surgical arc system with reorientatable arc axis |
FR2652928B1 (en) | 1989-10-05 | 1994-07-29 | Diadix Sa | INTERACTIVE LOCAL INTERVENTION SYSTEM WITHIN A AREA OF A NON-HOMOGENEOUS STRUCTURE. |
US5415169A (en) * | 1989-11-21 | 1995-05-16 | Fischer Imaging Corporation | Motorized mammographic biopsy apparatus |
DE69132412T2 (en) | 1990-10-19 | 2001-03-01 | Univ St Louis | LOCALIZATION SYSTEM FOR A SURGICAL PROBE FOR USE ON THE HEAD |
US6347240B1 (en) | 1990-10-19 | 2002-02-12 | St. Louis University | System and method for use in displaying images of a body part |
US6675040B1 (en) | 1991-01-28 | 2004-01-06 | Sherwood Services Ag | Optical object tracking system |
US6167295A (en) | 1991-01-28 | 2000-12-26 | Radionics, Inc. | Optical and computer graphic stereotactic localizer |
US5662111A (en) | 1991-01-28 | 1997-09-02 | Cosman; Eric R. | Process of stereotactic optical navigation |
US6006126A (en) | 1991-01-28 | 1999-12-21 | Cosman; Eric R. | System and method for stereotactic registration of image scan data |
US6405072B1 (en) | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
US5603318A (en) | 1992-04-21 | 1997-02-18 | University Of Utah Research Foundation | Apparatus and method for photogrammetric surgical localization |
US5762458A (en) * | 1996-02-20 | 1998-06-09 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive cardiac procedures |
US6757557B1 (en) | 1992-08-14 | 2004-06-29 | British Telecommunications | Position location system |
WO1994004938A1 (en) | 1992-08-14 | 1994-03-03 | British Telecommunications Public Limited Company | Position location system |
US5517990A (en) | 1992-11-30 | 1996-05-21 | The Cleveland Clinic Foundation | Stereotaxy wand and tool guide |
US5732703A (en) * | 1992-11-30 | 1998-03-31 | The Cleveland Clinic Foundation | Stereotaxy wand and tool guide |
IL109385A (en) * | 1993-04-22 | 1998-03-10 | Pixsys | System for locating the relative positions of objects in three dimensional space |
CA2161430C (en) | 1993-04-26 | 2001-07-03 | Richard D. Bucholz | System and method for indicating the position of a surgical probe |
NO302055B1 (en) * | 1993-05-24 | 1998-01-12 | Metronor As | Geometry measurement method and system |
DE69531994T2 (en) | 1994-09-15 | 2004-07-22 | OEC Medical Systems, Inc., Boston | SYSTEM FOR POSITION DETECTION BY MEANS OF A REFERENCE UNIT ATTACHED TO A PATIENT'S HEAD FOR USE IN THE MEDICAL AREA |
US5829444A (en) * | 1994-09-15 | 1998-11-03 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications |
US5695501A (en) | 1994-09-30 | 1997-12-09 | Ohio Medical Instrument Company, Inc. | Apparatus for neurosurgical stereotactic procedures |
US6978166B2 (en) * | 1994-10-07 | 2005-12-20 | Saint Louis University | System for use in displaying images of a body part |
CA2201877C (en) | 1994-10-07 | 2004-06-08 | Richard D. Bucholz | Surgical navigation systems including reference and localization frames |
US5762064A (en) * | 1995-01-23 | 1998-06-09 | Northrop Grumman Corporation | Medical magnetic positioning system and method for determining the position of a magnetic probe |
US6690963B2 (en) * | 1995-01-24 | 2004-02-10 | Biosense, Inc. | System for determining the location and orientation of an invasive medical instrument |
US6259943B1 (en) * | 1995-02-16 | 2001-07-10 | Sherwood Services Ag | Frameless to frame-based registration system |
US5868673A (en) * | 1995-03-28 | 1999-02-09 | Sonometrics Corporation | System for carrying out surgery, biopsy and ablation of a tumor or other physical anomaly |
US6246898B1 (en) * | 1995-03-28 | 2001-06-12 | Sonometrics Corporation | Method for carrying out a medical procedure using a three-dimensional tracking and imaging system |
US6122541A (en) * | 1995-05-04 | 2000-09-19 | Radionics, Inc. | Head band for frameless stereotactic registration |
US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe |
US6216029B1 (en) * | 1995-07-16 | 2001-04-10 | Ultraguide Ltd. | Free-hand aiming of a needle guide |
US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
US6714841B1 (en) * | 1995-09-15 | 2004-03-30 | Computer Motion, Inc. | Head cursor control interface for an automated endoscope system for optimal positioning |
US6351659B1 (en) | 1995-09-28 | 2002-02-26 | Brainlab Med. Computersysteme Gmbh | Neuro-navigation system |
DE19639615C5 (en) * | 1996-09-26 | 2008-11-06 | Brainlab Ag | Reflector referencing system for surgical and medical instruments |
US5772594A (en) * | 1995-10-17 | 1998-06-30 | Barrick; Earl F. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US20020045812A1 (en) | 1996-02-01 | 2002-04-18 | Shlomo Ben-Haim | Implantable sensor for determining position coordinates |
ES2241037T3 (en) | 1996-02-15 | 2005-10-16 | Biosense Webster, Inc. | PRECISE DETERMINATION OF THE POSITION OF ENDOSCOPES. |
AU709081B2 (en) | 1996-02-15 | 1999-08-19 | Biosense, Inc. | Medical procedures and apparatus using intrabody probes |
WO1997029701A1 (en) * | 1996-02-15 | 1997-08-21 | Biosense Inc. | Catheter based surgery |
IL125755A (en) | 1996-02-15 | 2003-05-29 | Biosense Inc | Catheter calibration and usage monitoring system |
ES2251018T3 (en) | 1996-02-15 | 2006-04-16 | Biosense Webster, Inc. | CATHETER WITH LUMEN. |
IL125760A (en) | 1996-02-15 | 2003-07-31 | Biosense Inc | Movable transmit or receive coils for location system |
IL125761A (en) | 1996-02-15 | 2005-05-17 | Biosense Inc | Independently positionable transducers for location system |
ES2195118T3 (en) | 1996-02-15 | 2003-12-01 | Biosense Inc | PROCEDURE TO CONFIGURE AND OPERATE A PROBE. |
JP4141500B2 (en) | 1996-02-27 | 2008-08-27 | バイオセンス・ウェブスター・インコーポレイテッド | Positioning device and operation method thereof |
DE19609564C1 (en) * | 1996-03-12 | 1997-06-26 | Fraunhofer Ges Forschung | Ultrasonic communication system for location of diagnostic capsule |
US6177792B1 (en) | 1996-03-26 | 2001-01-23 | Bisense, Inc. | Mutual induction correction for radiator coils of an objects tracking system |
US6167145A (en) | 1996-03-29 | 2000-12-26 | Surgical Navigation Technologies, Inc. | Bone navigation system |
AU722748B2 (en) | 1996-05-06 | 2000-08-10 | Biosense, Inc. | Radiator calibration |
USRE40176E1 (en) * | 1996-05-15 | 2008-03-25 | Northwestern University | Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy |
US5795295A (en) * | 1996-06-25 | 1998-08-18 | Carl Zeiss, Inc. | OCT-assisted surgical microscope with multi-coordinate manipulator |
US6167296A (en) * | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
US6009212A (en) | 1996-07-10 | 1999-12-28 | Washington University | Method and apparatus for image registration |
US6226418B1 (en) | 1997-11-07 | 2001-05-01 | Washington University | Rapid convolution based large deformation image matching via landmark and volume imagery |
US6611630B1 (en) | 1996-07-10 | 2003-08-26 | Washington University | Method and apparatus for automatic shape characterization |
US6408107B1 (en) | 1996-07-10 | 2002-06-18 | Michael I. Miller | Rapid convolution based large deformation image matching via landmark and volume imagery |
US6296613B1 (en) | 1997-08-22 | 2001-10-02 | Synthes (U.S.A.) | 3D ultrasound recording device |
AU718579B2 (en) * | 1996-08-22 | 2000-04-13 | Ao Technology Ag | Ultrasonographic 3-D imaging system |
JP3198938B2 (en) * | 1996-09-03 | 2001-08-13 | 株式会社エフ・エフ・シー | Image processing device for mobile camera |
US6364888B1 (en) * | 1996-09-09 | 2002-04-02 | Intuitive Surgical, Inc. | Alignment of master and slave in a minimally invasive surgical apparatus |
EP0941450B1 (en) * | 1996-09-16 | 2006-02-15 | Snap-on Incorporated | Measuring device for use with vehicles |
US5865744A (en) * | 1996-09-16 | 1999-02-02 | Lemelson; Jerome H. | Method and system for delivering therapeutic agents |
DE19637822C1 (en) * | 1996-09-17 | 1998-03-26 | Deutsch Zentr Luft & Raumfahrt | Micromechanical tool |
JP3344900B2 (en) * | 1996-09-19 | 2002-11-18 | 松下電器産業株式会社 | Cartesian robot |
US5845646A (en) * | 1996-11-05 | 1998-12-08 | Lemelson; Jerome | System and method for treating select tissue in a living being |
GB9623911D0 (en) * | 1996-11-18 | 1997-01-08 | Armstrong Healthcare Ltd | Improvements in or relating to an orientation detector arrangement |
US6132441A (en) | 1996-11-22 | 2000-10-17 | Computer Motion, Inc. | Rigidly-linked articulating wrist with decoupled motion transmission |
US7302288B1 (en) * | 1996-11-25 | 2007-11-27 | Z-Kat, Inc. | Tool position indicator |
JP3814904B2 (en) * | 1996-12-25 | 2006-08-30 | ソニー株式会社 | Position detection device and remote control device |
US6119033A (en) * | 1997-03-04 | 2000-09-12 | Biotrack, Inc. | Method of monitoring a location of an area of interest within a patient during a medical procedure |
US6731966B1 (en) | 1997-03-04 | 2004-05-04 | Zachary S. Spigelman | Systems and methods for targeting a lesion |
US6026315A (en) * | 1997-03-27 | 2000-02-15 | Siemens Aktiengesellschaft | Method and apparatus for calibrating a navigation system in relation to image data of a magnetic resonance apparatus |
GB9706797D0 (en) * | 1997-04-03 | 1997-05-21 | Sun Electric Uk Ltd | Wireless data transmission |
US5970499A (en) | 1997-04-11 | 1999-10-19 | Smith; Kurt R. | Method and apparatus for producing and accessing composite data |
US6708184B2 (en) | 1997-04-11 | 2004-03-16 | Medtronic/Surgical Navigation Technologies | Method and apparatus for producing and accessing composite data using a device having a distributed communication controller interface |
US6669653B2 (en) * | 1997-05-05 | 2003-12-30 | Trig Medical Ltd. | Method and apparatus for monitoring the progress of labor |
US6752812B1 (en) | 1997-05-15 | 2004-06-22 | Regent Of The University Of Minnesota | Remote actuation of trajectory guide |
US5993463A (en) | 1997-05-15 | 1999-11-30 | Regents Of The University Of Minnesota | Remote actuation of trajectory guide |
US5907395A (en) * | 1997-06-06 | 1999-05-25 | Image Guided Technologies, Inc. | Optical fiber probe for position measurement |
EP0926998B8 (en) * | 1997-06-23 | 2004-04-14 | Koninklijke Philips Electronics N.V. | Image guided surgery system |
EP0999785A4 (en) * | 1997-06-27 | 2007-04-25 | Univ Leland Stanford Junior | Method and apparatus for volumetric image navigation |
US6055449A (en) * | 1997-09-22 | 2000-04-25 | Siemens Corporate Research, Inc. | Method for localization of a biopsy needle or similar surgical tool in a radiographic image |
US6226548B1 (en) | 1997-09-24 | 2001-05-01 | Surgical Navigation Technologies, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
US6081336A (en) * | 1997-09-26 | 2000-06-27 | Picker International, Inc. | Microscope calibrator |
US6147480A (en) * | 1997-10-23 | 2000-11-14 | Biosense, Inc. | Detection of metal disturbance |
US6157853A (en) * | 1997-11-12 | 2000-12-05 | Stereotaxis, Inc. | Method and apparatus using shaped field of repositionable magnet to guide implant |
US6021343A (en) | 1997-11-20 | 2000-02-01 | Surgical Navigation Technologies | Image guided awl/tap/screwdriver |
AU4318499A (en) * | 1997-11-24 | 1999-12-13 | Burdette Medical Systems, Inc. | Real time brachytherapy spatial registration and visualization system |
US6149592A (en) * | 1997-11-26 | 2000-11-21 | Picker International, Inc. | Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data |
US6035228A (en) * | 1997-11-28 | 2000-03-07 | Picker International, Inc. | Frameless stereotactic arm apparatus and method of using same |
US6064904A (en) | 1997-11-28 | 2000-05-16 | Picker International, Inc. | Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures |
US6052611A (en) | 1997-11-28 | 2000-04-18 | Picker International, Inc. | Frameless stereotactic tomographic scanner for image guided interventional procedures |
US5967982A (en) * | 1997-12-09 | 1999-10-19 | The Cleveland Clinic Foundation | Non-invasive spine and bone registration for frameless stereotaxy |
US6348058B1 (en) | 1997-12-12 | 2002-02-19 | Surgical Navigation Technologies, Inc. | Image guided spinal surgery guide, system, and method for use thereof |
DE19882935B4 (en) * | 1997-12-31 | 2005-02-10 | Surgical Navigation Technologies Inc., Broomfield | Systems for displaying a position of an object during an operation |
US6122539A (en) * | 1997-12-31 | 2000-09-19 | General Electric Company | Method for verifying accuracy during intra-operative MR imaging |
US6223066B1 (en) | 1998-01-21 | 2001-04-24 | Biosense, Inc. | Optical position sensors |
DE69835422T2 (en) | 1998-01-22 | 2006-12-21 | Biosense Webster, Inc., Diamond Bar | MEASUREMENT IN THE BODY'S INSIDE |
AU2475799A (en) * | 1998-01-28 | 1999-08-16 | Eric R. Cosman | Optical object tracking system |
US6289235B1 (en) | 1998-03-05 | 2001-09-11 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
CA2326642C (en) * | 1998-04-03 | 2008-06-17 | Image Guided Technologies, Inc. | Wireless optical instrument for position measurement and method of use therefor |
US5947900A (en) * | 1998-04-13 | 1999-09-07 | General Electric Company | Dynamic scan plane tracking using MR position monitoring |
DE19817039A1 (en) * | 1998-04-17 | 1999-10-21 | Philips Patentverwaltung | Arrangement for image guided surgery |
US6298262B1 (en) | 1998-04-21 | 2001-10-02 | Neutar, Llc | Instrument guidance for stereotactic surgery |
US6546277B1 (en) | 1998-04-21 | 2003-04-08 | Neutar L.L.C. | Instrument guidance system for spinal and other surgery |
US6273896B1 (en) | 1998-04-21 | 2001-08-14 | Neutar, Llc | Removable frames for stereotactic localization |
US6529765B1 (en) | 1998-04-21 | 2003-03-04 | Neutar L.L.C. | Instrumented and actuated guidance fixture for sterotactic surgery |
US6363940B1 (en) * | 1998-05-14 | 2002-04-02 | Calypso Medical Technologies, Inc. | System and method for bracketing and removing tissue |
FR2779339B1 (en) * | 1998-06-09 | 2000-10-13 | Integrated Surgical Systems Sa | MATCHING METHOD AND APPARATUS FOR ROBOTIC SURGERY, AND MATCHING DEVICE COMPRISING APPLICATION |
US6122967A (en) * | 1998-06-18 | 2000-09-26 | The United States Of America As Represented By The United States Department Of Energy | Free motion scanning system |
CA2335867C (en) | 1998-06-22 | 2008-12-30 | Synthes (U.S.A.) | Fiducial matching by means of fiducial screws |
US6118845A (en) | 1998-06-29 | 2000-09-12 | Surgical Navigation Technologies, Inc. | System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers |
US6459927B1 (en) * | 1999-07-06 | 2002-10-01 | Neutar, Llc | Customizable fixture for patient positioning |
US6327491B1 (en) * | 1998-07-06 | 2001-12-04 | Neutar, Llc | Customized surgical fixture |
US6081577A (en) | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
US6145509A (en) * | 1998-07-24 | 2000-11-14 | Eva Corporation | Depth sensor device for use in a surgical procedure |
US6439576B1 (en) * | 1998-07-30 | 2002-08-27 | Merlin Technologies, Inc. | Electronic missile location |
US20050105772A1 (en) * | 1998-08-10 | 2005-05-19 | Nestor Voronka | Optical body tracker |
US6801637B2 (en) | 1999-08-10 | 2004-10-05 | Cybernet Systems Corporation | Optical body tracker |
US6282437B1 (en) | 1998-08-12 | 2001-08-28 | Neutar, Llc | Body-mounted sensing system for stereotactic surgery |
US6351662B1 (en) | 1998-08-12 | 2002-02-26 | Neutar L.L.C. | Movable arm locator for stereotactic surgery |
US6477400B1 (en) | 1998-08-20 | 2002-11-05 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US6482182B1 (en) | 1998-09-03 | 2002-11-19 | Surgical Navigation Technologies, Inc. | Anchoring system for a brain lead |
US6266142B1 (en) * | 1998-09-21 | 2001-07-24 | The Texas A&M University System | Noncontact position and orientation measurement system and method |
US6195577B1 (en) * | 1998-10-08 | 2001-02-27 | Regents Of The University Of Minnesota | Method and apparatus for positioning a device in a body |
US6340363B1 (en) | 1998-10-09 | 2002-01-22 | Surgical Navigation Technologies, Inc. | Image guided vertebral distractor and method for tracking the position of vertebrae |
US6373240B1 (en) | 1998-10-15 | 2002-04-16 | Biosense, Inc. | Metal immune system for tracking spatial coordinates of an object in the presence of a perturbed energy field |
US6178358B1 (en) * | 1998-10-27 | 2001-01-23 | Hunter Engineering Company | Three-dimensional virtual view wheel alignment display system |
US6633686B1 (en) | 1998-11-05 | 2003-10-14 | Washington University | Method and apparatus for image registration using large deformation diffeomorphisms on a sphere |
JP4101951B2 (en) * | 1998-11-10 | 2008-06-18 | オリンパス株式会社 | Surgical microscope |
US6201887B1 (en) * | 1998-11-17 | 2001-03-13 | General Electric Company | System for determination of faulty circuit boards in ultrasound imaging machines |
US6659939B2 (en) | 1998-11-20 | 2003-12-09 | Intuitive Surgical, Inc. | Cooperative minimally invasive telesurgical system |
US8527094B2 (en) | 1998-11-20 | 2013-09-03 | Intuitive Surgical Operations, Inc. | Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures |
US6398726B1 (en) | 1998-11-20 | 2002-06-04 | Intuitive Surgical, Inc. | Stabilizer for robotic beating-heart surgery |
US6852107B2 (en) | 2002-01-16 | 2005-02-08 | Computer Motion, Inc. | Minimally invasive surgical training using robotics and tele-collaboration |
US6951535B2 (en) | 2002-01-16 | 2005-10-04 | Intuitive Surgical, Inc. | Tele-medicine system that transmits an entire state of a subsystem |
US6246896B1 (en) | 1998-11-24 | 2001-06-12 | General Electric Company | MRI guided ablation system |
US6289233B1 (en) | 1998-11-25 | 2001-09-11 | General Electric Company | High speed tracking of interventional devices using an MRI system |
US6430434B1 (en) * | 1998-12-14 | 2002-08-06 | Integrated Surgical Systems, Inc. | Method for determining the location and orientation of a bone for computer-assisted orthopedic procedures using intraoperatively attached markers |
JP4612194B2 (en) * | 1998-12-23 | 2011-01-12 | イメージ・ガイディッド・テクノロジーズ・インコーポレイテッド | Hybrid 3D probe tracked by multiple sensors |
IL143909A0 (en) | 1998-12-23 | 2002-04-21 | Jakab Peter D | Magnetic resonance scanner with electromagnetic position and orientation tracking device |
EP1650576A1 (en) | 1998-12-23 | 2006-04-26 | Peter D. Jakab | Magnetic resonance scanner with electromagnetic position and orientation tracking device |
US6285902B1 (en) * | 1999-02-10 | 2001-09-04 | Surgical Insights, Inc. | Computer assisted targeting device for use in orthopaedic surgery |
DE69931074T2 (en) | 1999-03-17 | 2006-11-16 | Synthes Ag Chur | DEVICE FOR PRESENTING AND PLANNING CRANE COATING OPERATIONS |
US6498477B1 (en) | 1999-03-19 | 2002-12-24 | Biosense, Inc. | Mutual crosstalk elimination in medical systems using radiator coils and magnetic fields |
US6470207B1 (en) * | 1999-03-23 | 2002-10-22 | Surgical Navigation Technologies, Inc. | Navigational guidance via computer-assisted fluoroscopic imaging |
AU767060B2 (en) * | 1999-04-07 | 2003-10-30 | Loma Linda University Medical Center | Patient motion monitoring system for proton therapy |
AU766981B2 (en) | 1999-04-20 | 2003-10-30 | Ao Technology Ag | Device for the percutaneous obtainment of 3D-coordinates on the surface of a human or animal organ |
US6491699B1 (en) | 1999-04-20 | 2002-12-10 | Surgical Navigation Technologies, Inc. | Instrument guidance method and system for image guided surgery |
ATE280541T1 (en) * | 1999-04-22 | 2004-11-15 | Medtronic Surgical Navigation | APPARATUS AND METHOD FOR IMAGE-GUIDED SURGERY |
ATE242865T1 (en) * | 1999-05-03 | 2003-06-15 | Synthes Ag | POSITION DETECTION DEVICE WITH AIDS FOR DETERMINING THE DIRECTION OF THE GRAVITY VECTOR |
US6393314B1 (en) | 1999-05-06 | 2002-05-21 | General Electric Company | RF driven resistive ablation system for use in MRI guided therapy |
GB2352289B (en) * | 1999-07-14 | 2003-09-17 | Dennis Majoe | Position and orientation detection system |
DE19936904A1 (en) * | 1999-07-30 | 2001-02-01 | Biotronik Mess & Therapieg | catheter |
JP3608448B2 (en) | 1999-08-31 | 2005-01-12 | 株式会社日立製作所 | Treatment device |
DE10040498A1 (en) | 1999-09-07 | 2001-03-15 | Zeiss Carl Fa | Device for image-supported processing of working object has display unit for simultaneous visual acquisition of current working object and working object data, freedom of head movement |
US6206891B1 (en) | 1999-09-14 | 2001-03-27 | Medeye Medical Technology Ltd. | Device and method for calibration of a stereotactic localization system |
US20040097996A1 (en) | 1999-10-05 | 2004-05-20 | Omnisonics Medical Technologies, Inc. | Apparatus and method of removing occlusions using an ultrasonic medical device operating in a transverse mode |
US8239001B2 (en) | 2003-10-17 | 2012-08-07 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US6288785B1 (en) * | 1999-10-28 | 2001-09-11 | Northern Digital, Inc. | System for determining spatial position and/or orientation of one or more objects |
WO2001031466A1 (en) | 1999-10-28 | 2001-05-03 | Winchester Development Associates | Coil structures and methods for generating magnetic fields |
US11331150B2 (en) | 1999-10-28 | 2022-05-17 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US6499488B1 (en) | 1999-10-28 | 2002-12-31 | Winchester Development Associates | Surgical sensor |
US7366562B2 (en) | 2003-10-17 | 2008-04-29 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US6493573B1 (en) | 1999-10-28 | 2002-12-10 | Winchester Development Associates | Method and system for navigating a catheter probe in the presence of field-influencing objects |
US6235038B1 (en) | 1999-10-28 | 2001-05-22 | Medtronic Surgical Navigation Technologies | System for translation of electromagnetic and optical localization systems |
US6381485B1 (en) | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies, Inc. | Registration of human anatomy integrated for electromagnetic localization |
US6474341B1 (en) * | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system |
US6379302B1 (en) | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies Inc. | Navigation information overlay onto ultrasound imagery |
US8644907B2 (en) | 1999-10-28 | 2014-02-04 | Medtronic Navigaton, Inc. | Method and apparatus for surgical navigation |
US6747539B1 (en) | 1999-10-28 | 2004-06-08 | Michael A. Martinelli | Patient-shielding and coil system |
US6671538B1 (en) * | 1999-11-26 | 2003-12-30 | Koninklijke Philips Electronics, N.V. | Interface system for use with imaging devices to facilitate visualization of image-guided interventional procedure planning |
US6290649B1 (en) * | 1999-12-21 | 2001-09-18 | General Electric Company | Ultrasound position sensing probe |
WO2001054579A1 (en) * | 2000-01-10 | 2001-08-02 | Super Dimension Ltd. | Methods and systems for performing medical procedures with reference to projective images and with respect to pre-stored images |
US20010034530A1 (en) | 2000-01-27 | 2001-10-25 | Malackowski Donald W. | Surgery system |
US6725080B2 (en) | 2000-03-01 | 2004-04-20 | Surgical Navigation Technologies, Inc. | Multiple cannula image guided tool for image guided procedures |
US6497134B1 (en) * | 2000-03-15 | 2002-12-24 | Image Guided Technologies, Inc. | Calibration of an instrument |
FR2807549B1 (en) * | 2000-04-06 | 2004-10-01 | Ge Med Sys Global Tech Co Llc | IMAGE PROCESSING METHOD AND ASSOCIATED DEVICE |
US7366561B2 (en) * | 2000-04-07 | 2008-04-29 | Medtronic, Inc. | Robotic trajectory guide |
US7660621B2 (en) * | 2000-04-07 | 2010-02-09 | Medtronic, Inc. | Medical device introducer |
US6535756B1 (en) | 2000-04-07 | 2003-03-18 | Surgical Navigation Technologies, Inc. | Trajectory storage apparatus and method for surgical navigation system |
JP3780146B2 (en) * | 2000-04-12 | 2006-05-31 | オリンパス株式会社 | Surgical navigation device |
US20030135102A1 (en) * | 2000-05-18 | 2003-07-17 | Burdette Everette C. | Method and system for registration and guidance of intravascular treatment |
JP4634570B2 (en) * | 2000-06-05 | 2011-02-16 | 株式会社東芝 | MRI equipment |
US6478802B2 (en) | 2000-06-09 | 2002-11-12 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for display of an image guided drill bit |
JP2001356011A (en) * | 2000-06-13 | 2001-12-26 | National Institute Of Advanced Industrial & Technology | Straightness measuring apparatus of direct-acting unit |
US7085400B1 (en) | 2000-06-14 | 2006-08-01 | Surgical Navigation Technologies, Inc. | System and method for image based sensor calibration |
US6484118B1 (en) | 2000-07-20 | 2002-11-19 | Biosense, Inc. | Electromagnetic position single axis system |
US6902569B2 (en) | 2000-08-17 | 2005-06-07 | Image-Guided Neurologics, Inc. | Trajectory guide with instrument immobilizer |
US6823207B1 (en) | 2000-08-26 | 2004-11-23 | Ge Medical Systems Global Technology Company, Llc | Integrated fluoroscopic surgical navigation and imaging workstation with command protocol |
EP1315465A2 (en) | 2000-09-07 | 2003-06-04 | Cbyon, Inc. | Virtual fluoroscopic system and method |
WO2002024051A2 (en) | 2000-09-23 | 2002-03-28 | The Board Of Trustees Of The Leland Stanford Junior University | Endoscopic targeting method and system |
US7194296B2 (en) | 2000-10-31 | 2007-03-20 | Northern Digital Inc. | Flexible instrument with optical sensors |
EP2130511A1 (en) * | 2000-11-17 | 2009-12-09 | Calypso Medical, Inc | System for locating and defining a target location within a human body |
US6718194B2 (en) * | 2000-11-17 | 2004-04-06 | Ge Medical Systems Global Technology Company, Llc | Computer assisted intramedullary rod surgery system with enhanced features |
NO315143B1 (en) * | 2000-11-24 | 2003-07-21 | Neorad As | Apparatus for light beam-guided biopsy |
WO2002043569A2 (en) | 2000-11-28 | 2002-06-06 | Intuitive Surgical, Inc. | Endoscopic beating-heart stabilizer and vessel occlusion fastener |
US6820614B2 (en) * | 2000-12-02 | 2004-11-23 | The Bonutti 2003 Trust -A | Tracheal intubination |
US6591160B2 (en) * | 2000-12-04 | 2003-07-08 | Asyst Technologies, Inc. | Self teaching robot |
US6757416B2 (en) | 2000-12-04 | 2004-06-29 | Ge Medical Systems Global Technology Company, Llc | Display of patient image data |
AU2002230718B2 (en) * | 2000-12-08 | 2005-08-11 | Loma Linda University Medical Center | Proton beam therapy control system |
EP1216651A1 (en) * | 2000-12-21 | 2002-06-26 | BrainLAB AG | Wireless medical acquisition and treatment system |
US20020149628A1 (en) * | 2000-12-22 | 2002-10-17 | Smith Jeffrey C. | Positioning an item in three dimensions via a graphical representation |
DE10103870B4 (en) * | 2001-01-30 | 2004-02-05 | Daimlerchrysler Ag | Method for image recognition in motor vehicles |
DE10108139A1 (en) * | 2001-02-20 | 2002-08-29 | Boegl Max Bauunternehmung Gmbh | Method for measuring and / or machining a workpiece |
DE10108547B4 (en) * | 2001-02-22 | 2006-04-20 | Siemens Ag | Operating system for controlling surgical instruments based on intra-operative X-ray images |
US20020131643A1 (en) * | 2001-03-13 | 2002-09-19 | Fels Sol Sidney | Local positioning system |
US6695786B2 (en) | 2001-03-16 | 2004-02-24 | U-Systems, Inc. | Guide and position monitor for invasive medical instrument |
EP1379173A2 (en) * | 2001-04-10 | 2004-01-14 | Koninklijke Philips Electronics N.V. | A fluoroscopy intervention method with a cone-beam |
US20020165524A1 (en) | 2001-05-01 | 2002-11-07 | Dan Sanchez | Pivot point arm for a robotic system used to perform a surgical procedure |
EP1401323A4 (en) * | 2001-05-31 | 2009-06-03 | Image Navigation Ltd | Image guided implantology methods |
US6636757B1 (en) | 2001-06-04 | 2003-10-21 | Surgical Navigation Technologies, Inc. | Method and apparatus for electromagnetic navigation of a surgical probe near a metal object |
US20020193685A1 (en) * | 2001-06-08 | 2002-12-19 | Calypso Medical, Inc. | Guided Radiation Therapy System |
US6887245B2 (en) * | 2001-06-11 | 2005-05-03 | Ge Medical Systems Global Technology Company, Llc | Surgical drill for use with a computer assisted surgery system |
ITMI20011635A1 (en) * | 2001-07-27 | 2003-01-27 | G D S Giorgi Dynamic Stereotax | DEVICE AND PROCESS OF MICROSURGERY ASSISTED BY THE PROCESSOR |
DE10136709B4 (en) | 2001-07-27 | 2004-09-02 | Siemens Ag | Device for performing surgical interventions and method for displaying image information during such an intervention on a patient |
US6730926B2 (en) | 2001-09-05 | 2004-05-04 | Servo-Robot Inc. | Sensing head and apparatus for determining the position and orientation of a target object |
DE10143561B4 (en) * | 2001-09-05 | 2011-12-15 | Eads Deutschland Gmbh | Method and system for locating emitters |
US6728599B2 (en) | 2001-09-07 | 2004-04-27 | Computer Motion, Inc. | Modularity system for computer assisted surgery |
US7135978B2 (en) * | 2001-09-14 | 2006-11-14 | Calypso Medical Technologies, Inc. | Miniature resonating marker assembly |
WO2003026505A1 (en) * | 2001-09-19 | 2003-04-03 | Hitachi Medical Corporation | Treatment tool and magnetic resonance imager |
US6587750B2 (en) | 2001-09-25 | 2003-07-01 | Intuitive Surgical, Inc. | Removable infinite roll master grip handle and touch sensor for robotic surgery |
WO2003029921A2 (en) * | 2001-09-28 | 2003-04-10 | University Of North Carolina At Chapel Hill | Methods and systems for three-dimensional motion control and tracking of a mechanically unattached magnetic probe |
CN1612713A (en) | 2001-11-05 | 2005-05-04 | 计算机化医学体系股份有限公司 | Apparatus and method for registration, guidance, and targeting of external beam radiation therapy |
JP3982247B2 (en) * | 2001-12-06 | 2007-09-26 | 株式会社デンソー | Control device for vehicle generator |
US6793653B2 (en) | 2001-12-08 | 2004-09-21 | Computer Motion, Inc. | Multifunctional handle for a medical robotic system |
US6822570B2 (en) * | 2001-12-20 | 2004-11-23 | Calypso Medical Technologies, Inc. | System for spatially adjustable excitation of leadless miniature marker |
US6838990B2 (en) | 2001-12-20 | 2005-01-04 | Calypso Medical Technologies, Inc. | System for excitation of a leadless miniature marker |
US6812842B2 (en) | 2001-12-20 | 2004-11-02 | Calypso Medical Technologies, Inc. | System for excitation of a leadless miniature marker |
US7715602B2 (en) * | 2002-01-18 | 2010-05-11 | Orthosoft Inc. | Method and apparatus for reconstructing bone surfaces during surgery |
EP1330992A1 (en) * | 2002-01-23 | 2003-07-30 | Stiftung für Plastische und Aesthetische Wundheilung im Sondervermögen der DT Deutschen Stiftungstreuhend AG | Device and method for establishing the spatial position of an instrument relative to an object |
US6947786B2 (en) | 2002-02-28 | 2005-09-20 | Surgical Navigation Technologies, Inc. | Method and apparatus for perspective inversion |
DE10210645B4 (en) * | 2002-03-11 | 2006-04-13 | Siemens Ag | A method of detecting and displaying a medical catheter inserted into an examination area of a patient |
US6990368B2 (en) | 2002-04-04 | 2006-01-24 | Surgical Navigation Technologies, Inc. | Method and apparatus for virtual digital subtraction angiography |
US7010759B2 (en) * | 2002-04-05 | 2006-03-07 | U-Tech Enviromental Manufacturing Supply, Inc. | Method for real time display of maintenance device location in an internal space |
US7998062B2 (en) | 2004-03-29 | 2011-08-16 | Superdimension, Ltd. | Endoscope structures and techniques for navigating to a target in branched structure |
AR039475A1 (en) * | 2002-05-01 | 2005-02-23 | Wyeth Corp | Tricyclic 6-alkylidene-penems as beta-lactamase inhibitors |
WO2004012599A1 (en) * | 2002-07-29 | 2004-02-12 | Omnisonics Medical Technologies, Inc. | Radiopaque coating for an ultrasonic medical device |
US7187800B2 (en) * | 2002-08-02 | 2007-03-06 | Computerized Medical Systems, Inc. | Method and apparatus for image segmentation using Jensen-Shannon divergence and Jensen-Renyi divergence |
AU2003258059A1 (en) * | 2002-08-06 | 2004-02-23 | Stereotaxis, Inc. | Virtual device interface control for medical devices |
US6741364B2 (en) * | 2002-08-13 | 2004-05-25 | Harris Corporation | Apparatus for determining relative positioning of objects and related methods |
US7428061B2 (en) * | 2002-08-14 | 2008-09-23 | Metris Ipr N.V. | Optical probe for scanning the features of an object and methods thereof |
US7009717B2 (en) * | 2002-08-14 | 2006-03-07 | Metris N.V. | Optical probe for scanning the features of an object and methods therefor |
US6892090B2 (en) | 2002-08-19 | 2005-05-10 | Surgical Navigation Technologies, Inc. | Method and apparatus for virtual endoscopy |
US7317819B2 (en) * | 2002-08-28 | 2008-01-08 | Imaging3, Inc. | Apparatus and method for three-dimensional imaging |
WO2004019799A2 (en) * | 2002-08-29 | 2004-03-11 | Computerized Medical Systems, Inc. | Methods and systems for localizing of a medical imaging probe and of a biopsy needle |
US7794230B2 (en) * | 2002-09-10 | 2010-09-14 | University Of Vermont And State Agricultural College | Mathematical circulatory system model |
US7704260B2 (en) | 2002-09-17 | 2010-04-27 | Medtronic, Inc. | Low profile instrument immobilizer |
US7166114B2 (en) * | 2002-09-18 | 2007-01-23 | Stryker Leibinger Gmbh & Co Kg | Method and system for calibrating a surgical tool and adapter thereof |
WO2004046754A2 (en) * | 2002-11-14 | 2004-06-03 | General Electric Medical Systems Global Technology Company, Llc | Interchangeable localizing devices for use with tracking systems |
US7599730B2 (en) | 2002-11-19 | 2009-10-06 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
US7697972B2 (en) | 2002-11-19 | 2010-04-13 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
US7945309B2 (en) | 2002-11-22 | 2011-05-17 | Biosense, Inc. | Dynamic metal immunity |
US7636596B2 (en) * | 2002-12-20 | 2009-12-22 | Medtronic, Inc. | Organ access device and method |
US20040176686A1 (en) * | 2002-12-23 | 2004-09-09 | Omnisonics Medical Technologies, Inc. | Apparatus and method for ultrasonic medical device with improved visibility in imaging procedures |
WO2004058074A1 (en) | 2002-12-23 | 2004-07-15 | Omnisonics Medical Technologies, Inc. | Apparatus and method for ultrasonic medical device with improved visibility in imaging procedures |
EP1579304A1 (en) * | 2002-12-23 | 2005-09-28 | Universita' Degli Studi di Firenze | Hand pointing apparatus |
JP2004208858A (en) * | 2002-12-27 | 2004-07-29 | Toshiba Corp | Ultrasonograph and ultrasonic image processing apparatus |
US7289839B2 (en) * | 2002-12-30 | 2007-10-30 | Calypso Medical Technologies, Inc. | Implantable marker with a leadless signal transmitter compatible for use in magnetic resonance devices |
US6889833B2 (en) | 2002-12-30 | 2005-05-10 | Calypso Medical Technologies, Inc. | Packaged systems for implanting markers in a patient and methods for manufacturing and using such systems |
US6877634B2 (en) * | 2002-12-31 | 2005-04-12 | Kimberly-Clark Worldwide, Inc. | High capacity dispensing carton |
ES2303915T3 (en) * | 2003-01-02 | 2008-09-01 | Loma Linda University Medical Center | Configuration management and retrieval system for a proton beam therapy system |
US7660623B2 (en) * | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
US7542791B2 (en) | 2003-01-30 | 2009-06-02 | Medtronic Navigation, Inc. | Method and apparatus for preplanning a surgical procedure |
US7111401B2 (en) * | 2003-02-04 | 2006-09-26 | Eveready Battery Company, Inc. | Razor head having skin controlling means |
US20040152955A1 (en) * | 2003-02-04 | 2004-08-05 | Mcginley Shawn E. | Guidance system for rotary surgical instrument |
US20040171930A1 (en) * | 2003-02-04 | 2004-09-02 | Zimmer Technology, Inc. | Guidance system for rotary surgical instrument |
US7458977B2 (en) | 2003-02-04 | 2008-12-02 | Zimmer Technology, Inc. | Surgical navigation instrument useful in marking anatomical structures |
US7559935B2 (en) * | 2003-02-20 | 2009-07-14 | Medtronic, Inc. | Target depth locators for trajectory guide for introducing an instrument |
US7896889B2 (en) * | 2003-02-20 | 2011-03-01 | Medtronic, Inc. | Trajectory guide with angled or patterned lumens or height adjustment |
US7119645B2 (en) | 2003-02-25 | 2006-10-10 | The University Of North Carolina | Methods and systems for controlling motion of and tracking a mechanically unattached probe |
WO2004078039A1 (en) * | 2003-03-07 | 2004-09-16 | Philips Intellectual Property & Standards Gmbh | Device and method for locating an instrument within a body |
US7974680B2 (en) | 2003-05-29 | 2011-07-05 | Biosense, Inc. | Hysteresis assessment for metal immunity |
US7433728B2 (en) | 2003-05-29 | 2008-10-07 | Biosense, Inc. | Dynamic metal immunity by hysteresis |
US6932823B2 (en) * | 2003-06-24 | 2005-08-23 | Zimmer Technology, Inc. | Detachable support arm for surgical navigation system reference array |
AU2004203173A1 (en) * | 2003-07-14 | 2005-02-03 | Sunnybrook And Women's College And Health Sciences Centre | Optical image-based position tracking for magnetic resonance imaging |
US8403828B2 (en) * | 2003-07-21 | 2013-03-26 | Vanderbilt University | Ophthalmic orbital surgery apparatus and method and image-guide navigation system |
US7313430B2 (en) | 2003-08-28 | 2007-12-25 | Medtronic Navigation, Inc. | Method and apparatus for performing stereotactic surgery |
US7633633B2 (en) * | 2003-08-29 | 2009-12-15 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Position determination that is responsive to a retro-reflective object |
US20050054895A1 (en) * | 2003-09-09 | 2005-03-10 | Hoeg Hans David | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems |
EP2316328B1 (en) | 2003-09-15 | 2012-05-09 | Super Dimension Ltd. | Wrap-around holding device for use with bronchoscopes |
ATE438335T1 (en) | 2003-09-15 | 2009-08-15 | Super Dimension Ltd | SYSTEM OF ACCESSORIES FOR USE WITH BRONCHOSCOPES |
US7862570B2 (en) | 2003-10-03 | 2011-01-04 | Smith & Nephew, Inc. | Surgical positioners |
GB0324179D0 (en) * | 2003-10-15 | 2003-11-19 | Isis Innovation | Device for scanning three-dimensional objects |
US7835778B2 (en) | 2003-10-16 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
US7840253B2 (en) | 2003-10-17 | 2010-11-23 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US7764985B2 (en) | 2003-10-20 | 2010-07-27 | Smith & Nephew, Inc. | Surgical navigation system component fault interfaces and related processes |
DE602004031147D1 (en) | 2003-11-14 | 2011-03-03 | Smith & Nephew Inc | |
US7000948B2 (en) * | 2003-11-20 | 2006-02-21 | Delphi Technologies, Inc. | Internally tethered seat bladder for occupant weight estimation |
US7232409B2 (en) * | 2003-11-20 | 2007-06-19 | Karl Storz Development Corp. | Method and apparatus for displaying endoscopic images |
US6950775B2 (en) * | 2003-12-01 | 2005-09-27 | Snap-On Incorporated | Coordinate measuring system and field-of-view indicators therefor |
US7120524B2 (en) * | 2003-12-04 | 2006-10-10 | Matrix Electronic Measuring, L.P. | System for measuring points on a vehicle during damage repair |
US7376492B2 (en) * | 2003-12-04 | 2008-05-20 | Matrix Electronic Measuring, L.P. | System for measuring points on a vehicle during damage repair |
US7771436B2 (en) * | 2003-12-10 | 2010-08-10 | Stryker Leibinger Gmbh & Co. Kg. | Surgical navigation tracker, system and method |
US7873400B2 (en) | 2003-12-10 | 2011-01-18 | Stryker Leibinger Gmbh & Co. Kg. | Adapter for surgical navigation trackers |
US8196589B2 (en) * | 2003-12-24 | 2012-06-12 | Calypso Medical Technologies, Inc. | Implantable marker with wireless signal transmitter |
EP1711119A1 (en) | 2004-01-23 | 2006-10-18 | Traxyz Medical, Inc. | Methods and apparatus for performing procedures on target locations in the body |
US7015376B2 (en) * | 2004-01-30 | 2006-03-21 | Pioneer Hi-Bred International, Inc. | Soybean variety 95M80 |
US20060036162A1 (en) * | 2004-02-02 | 2006-02-16 | Ramin Shahidi | Method and apparatus for guiding a medical instrument to a subsurface target site in a patient |
US8764725B2 (en) | 2004-02-09 | 2014-07-01 | Covidien Lp | Directional anchoring mechanism, method and applications thereof |
US7794414B2 (en) | 2004-02-09 | 2010-09-14 | Emigrant Bank, N.A. | Apparatus and method for an ultrasonic medical device operating in torsional and transverse modes |
US20050182421A1 (en) | 2004-02-13 | 2005-08-18 | Schulte Gregory T. | Methods and apparatus for securing a therapy delivery device within a burr hole |
US20060052691A1 (en) * | 2004-03-05 | 2006-03-09 | Hall Maleata Y | Adjustable navigated tracking element mount |
US20050215888A1 (en) * | 2004-03-05 | 2005-09-29 | Grimm James E | Universal support arm and tracking array |
US9033871B2 (en) | 2004-04-07 | 2015-05-19 | Karl Storz Imaging, Inc. | Gravity referenced endoscopic image orientation |
US8109942B2 (en) | 2004-04-21 | 2012-02-07 | Smith & Nephew, Inc. | Computer-aided methods, systems, and apparatuses for shoulder arthroplasty |
US7567834B2 (en) | 2004-05-03 | 2009-07-28 | Medtronic Navigation, Inc. | Method and apparatus for implantation between two vertebral bodies |
US20050256689A1 (en) * | 2004-05-13 | 2005-11-17 | Conceptual Assets, Inc. | Method and system for measuring attributes on a three-dimensional object |
US20050288574A1 (en) * | 2004-06-23 | 2005-12-29 | Thornton Thomas M | Wireless (disposable) fiducial based registration and EM distortion based surface registration |
WO2006020187A2 (en) * | 2004-07-16 | 2006-02-23 | The University Of North Carolina At Chapel Hill | Methods, systems and computer program products for full spectrum projection |
US7776055B2 (en) * | 2004-07-19 | 2010-08-17 | General Electric Company | System and method for tracking progress of insertion of a rod in a bone |
JP2008507367A (en) | 2004-07-23 | 2008-03-13 | カリプソー メディカル テクノロジーズ インコーポレイテッド | Integrated radiation therapy system and method for treating a target in a patient |
US8290570B2 (en) * | 2004-09-10 | 2012-10-16 | Stryker Leibinger Gmbh & Co., Kg | System for ad hoc tracking of an object |
US8007448B2 (en) * | 2004-10-08 | 2011-08-30 | Stryker Leibinger Gmbh & Co. Kg. | System and method for performing arthroplasty of a joint and tracking a plumb line plane |
EP1647236A1 (en) | 2004-10-15 | 2006-04-19 | BrainLAB AG | Apparatus and method for the position checking of markers |
FR2878615B1 (en) | 2004-11-30 | 2009-09-25 | Raquin Cyrille | Simulation system for firing or launching a projectile using a specific object or launcher |
US7744606B2 (en) * | 2004-12-04 | 2010-06-29 | Medtronic, Inc. | Multi-lumen instrument guide |
US7497863B2 (en) | 2004-12-04 | 2009-03-03 | Medtronic, Inc. | Instrument guiding stage apparatus and method for using same |
US7621874B2 (en) * | 2004-12-14 | 2009-11-24 | Scimed Life Systems, Inc. | Systems and methods for improved three-dimensional imaging of a body lumen |
WO2006067719A2 (en) * | 2004-12-20 | 2006-06-29 | Koninklijke Philips Electronics N.V. | A method, a system and a computer program for integration of medical diagnostic information and a geometric model of a movable body |
US8812096B2 (en) | 2005-01-10 | 2014-08-19 | Braingate Co., Llc | Biological interface system with patient training apparatus |
US20060161059A1 (en) * | 2005-01-20 | 2006-07-20 | Zimmer Technology, Inc. | Variable geometry reference array |
US7623250B2 (en) * | 2005-02-04 | 2009-11-24 | Stryker Leibinger Gmbh & Co. Kg. | Enhanced shape characterization device and method |
US7967742B2 (en) * | 2005-02-14 | 2011-06-28 | Karl Storz Imaging, Inc. | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems |
CA2601976A1 (en) | 2005-02-22 | 2006-08-31 | Smith & Nephew, Inc. | In-line milling system |
JP4417877B2 (en) * | 2005-04-20 | 2010-02-17 | 株式会社セブンスディメンジョンデザイン | Optical transceiver control system |
US8208988B2 (en) * | 2005-05-13 | 2012-06-26 | General Electric Company | System and method for controlling a medical imaging device |
US9492240B2 (en) | 2009-06-16 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
US8971597B2 (en) | 2005-05-16 | 2015-03-03 | Intuitive Surgical Operations, Inc. | Efficient vision and kinematic data fusion for robotic surgical instruments and other applications |
US9289267B2 (en) * | 2005-06-14 | 2016-03-22 | Siemens Medical Solutions Usa, Inc. | Method and apparatus for minimally invasive surgery using endoscopes |
US8784336B2 (en) | 2005-08-24 | 2014-07-22 | C. R. Bard, Inc. | Stylet apparatuses and methods of manufacture |
US7835784B2 (en) | 2005-09-21 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for positioning a reference frame |
US20070078678A1 (en) * | 2005-09-30 | 2007-04-05 | Disilvestro Mark R | System and method for performing a computer assisted orthopaedic surgical procedure |
US7713471B2 (en) * | 2005-10-31 | 2010-05-11 | Codman Neuro Sciences Sarl | System for protecting circuitry in high-temperature environments |
US20070179626A1 (en) * | 2005-11-30 | 2007-08-02 | De La Barrera Jose L M | Functional joint arthroplasty method |
US20070156126A1 (en) * | 2005-12-29 | 2007-07-05 | Flaherty J C | Medical device insertion system and related methods |
US20100023021A1 (en) * | 2005-12-27 | 2010-01-28 | Flaherty J Christopher | Biological Interface and Insertion |
US8862200B2 (en) | 2005-12-30 | 2014-10-14 | DePuy Synthes Products, LLC | Method for determining a position of a magnetic source |
US7525309B2 (en) | 2005-12-30 | 2009-04-28 | Depuy Products, Inc. | Magnetic sensor array |
US9168102B2 (en) | 2006-01-18 | 2015-10-27 | Medtronic Navigation, Inc. | Method and apparatus for providing a container to a sterile environment |
US20070239153A1 (en) * | 2006-02-22 | 2007-10-11 | Hodorek Robert A | Computer assisted surgery system using alternative energy technology |
US7353134B2 (en) * | 2006-03-09 | 2008-04-01 | Dean A. Cirielli | Three-dimensional position and motion telemetry input |
US8112292B2 (en) | 2006-04-21 | 2012-02-07 | Medtronic Navigation, Inc. | Method and apparatus for optimizing a therapy |
EP1854425A1 (en) | 2006-05-11 | 2007-11-14 | BrainLAB AG | Position determination for medical devices with redundant position measurement and weighting to prioritise measurements |
ES2569411T3 (en) | 2006-05-19 | 2016-05-10 | The Queen's Medical Center | Motion tracking system for adaptive real-time imaging and spectroscopy |
US8635082B2 (en) | 2006-05-25 | 2014-01-21 | DePuy Synthes Products, LLC | Method and system for managing inventories of orthopaedic implants |
US8560047B2 (en) | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
US7728868B2 (en) | 2006-08-02 | 2010-06-01 | Inneroptic Technology, Inc. | System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities |
US8565853B2 (en) | 2006-08-11 | 2013-10-22 | DePuy Synthes Products, LLC | Simulated bone or tissue manipulation |
US20080125630A1 (en) * | 2006-09-11 | 2008-05-29 | Caylor Edward J | System and method for determining a location of an orthopaedic medical device |
US8660635B2 (en) | 2006-09-29 | 2014-02-25 | Medtronic, Inc. | Method and apparatus for optimizing a computer assisted surgical procedure |
EP2068716B1 (en) * | 2006-10-02 | 2011-02-09 | Hansen Medical, Inc. | Systems for three-dimensional ultrasound mapping |
US7256899B1 (en) | 2006-10-04 | 2007-08-14 | Ivan Faul | Wireless methods and systems for three-dimensional non-contact shape sensing |
US7794407B2 (en) | 2006-10-23 | 2010-09-14 | Bard Access Systems, Inc. | Method of locating the tip of a central venous catheter |
US8388546B2 (en) | 2006-10-23 | 2013-03-05 | Bard Access Systems, Inc. | Method of locating the tip of a central venous catheter |
WO2008052348A1 (en) * | 2006-11-02 | 2008-05-08 | Northern Digital Inc. | Integrated mapping system |
US8068648B2 (en) * | 2006-12-21 | 2011-11-29 | Depuy Products, Inc. | Method and system for registering a bone of a patient with a computer assisted orthopaedic surgery system |
WO2008103430A2 (en) * | 2007-02-22 | 2008-08-28 | The University Of North Carolina At Chapel Hill | Methods and systems for multiforce high throughput screening |
DE102007009764A1 (en) * | 2007-02-27 | 2008-08-28 | Siemens Ag | Catheter application supporting method for treating cardiac arrhythmia, involves determining position of patient during recording of image and/or during recording of electro-anatomical mapping |
WO2008109801A1 (en) * | 2007-03-07 | 2008-09-12 | Kmt Robotic Solutions, Inc. | System and method of locating relative positions of objects |
US8816959B2 (en) | 2007-04-03 | 2014-08-26 | General Electric Company | Method and apparatus for obtaining and/or analyzing anatomical images |
US20080260095A1 (en) * | 2007-04-16 | 2008-10-23 | Predrag Sukovic | Method and apparatus to repeatably align a ct scanner |
US20090003528A1 (en) | 2007-06-19 | 2009-01-01 | Sankaralingam Ramraj | Target location by tracking of imaging device |
US9883818B2 (en) | 2007-06-19 | 2018-02-06 | Accuray Incorporated | Fiducial localization |
TW200907764A (en) * | 2007-08-01 | 2009-02-16 | Unique Instr Co Ltd | Three-dimensional virtual input and simulation apparatus |
JP2009056299A (en) | 2007-08-07 | 2009-03-19 | Stryker Leibinger Gmbh & Co Kg | Method of and system for planning surgery |
US20090060372A1 (en) * | 2007-08-27 | 2009-03-05 | Riverain Medical Group, Llc | Object removal from images |
RU2491637C2 (en) * | 2007-09-17 | 2013-08-27 | Конинклейке Филипс Электроникс Н.В. | Thickness gauge for measuring image objects |
US8905920B2 (en) | 2007-09-27 | 2014-12-09 | Covidien Lp | Bronchoscope adapter and method |
US8265949B2 (en) | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
US8398645B2 (en) | 2007-09-30 | 2013-03-19 | DePuy Synthes Products, LLC | Femoral tibial customized patient-specific orthopaedic surgical instrumentation |
ES2651898T3 (en) | 2007-11-26 | 2018-01-30 | C.R. Bard Inc. | Integrated system for intravascular catheter placement |
US8781555B2 (en) | 2007-11-26 | 2014-07-15 | C. R. Bard, Inc. | System for placement of a catheter including a signal-generating stylet |
US8849382B2 (en) | 2007-11-26 | 2014-09-30 | C. R. Bard, Inc. | Apparatus and display methods relating to intravascular placement of a catheter |
US9456766B2 (en) | 2007-11-26 | 2016-10-04 | C. R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
US9521961B2 (en) | 2007-11-26 | 2016-12-20 | C. R. Bard, Inc. | Systems and methods for guiding a medical instrument |
US9649048B2 (en) | 2007-11-26 | 2017-05-16 | C. R. Bard, Inc. | Systems and methods for breaching a sterile field for intravascular placement of a catheter |
US9636031B2 (en) | 2007-11-26 | 2017-05-02 | C.R. Bard, Inc. | Stylets for use with apparatus for intravascular placement of a catheter |
US10449330B2 (en) | 2007-11-26 | 2019-10-22 | C. R. Bard, Inc. | Magnetic element-equipped needle assemblies |
US10751509B2 (en) | 2007-11-26 | 2020-08-25 | C. R. Bard, Inc. | Iconic representations for guidance of an indwelling medical device |
US10524691B2 (en) | 2007-11-26 | 2020-01-07 | C. R. Bard, Inc. | Needle assembly including an aligned magnetic element |
US10105168B2 (en) | 2008-01-09 | 2018-10-23 | Stryker European Holdings I, Llc | Stereotactic computer assisted surgery based on three-dimensional visualization |
WO2009094646A2 (en) | 2008-01-24 | 2009-07-30 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for image guided ablation |
US8478382B2 (en) | 2008-02-11 | 2013-07-02 | C. R. Bard, Inc. | Systems and methods for positioning a catheter |
US8340379B2 (en) * | 2008-03-07 | 2012-12-25 | Inneroptic Technology, Inc. | Systems and methods for displaying guidance data based on updated deformable imaging data |
US9575140B2 (en) | 2008-04-03 | 2017-02-21 | Covidien Lp | Magnetic interference detection system and method |
ES2715633T3 (en) | 2008-05-20 | 2019-06-05 | Univ Health Network | Device and method for imaging and fluorescence monitoring |
US8249332B2 (en) * | 2008-05-22 | 2012-08-21 | Matrix Electronic Measuring Properties Llc | Stereoscopic measurement system and method |
US9449378B2 (en) | 2008-05-22 | 2016-09-20 | Matrix Electronic Measuring Properties, Llc | System and method for processing stereoscopic vehicle information |
US8326022B2 (en) | 2008-05-22 | 2012-12-04 | Matrix Electronic Measuring Properties, Llc | Stereoscopic measurement system and method |
US8345953B2 (en) * | 2008-05-22 | 2013-01-01 | Matrix Electronic Measuring Properties, Llc | Stereoscopic measurement system and method |
US8473032B2 (en) | 2008-06-03 | 2013-06-25 | Superdimension, Ltd. | Feature-based registration method |
EP2293720B1 (en) | 2008-06-05 | 2021-02-24 | Varian Medical Systems, Inc. | Motion compensation for medical imaging and associated systems and methods |
US8218847B2 (en) | 2008-06-06 | 2012-07-10 | Superdimension, Ltd. | Hybrid registration method |
US8086026B2 (en) * | 2008-06-27 | 2011-12-27 | Waldean Schulz | Method and system for the determination of object positions in a volume |
US8932207B2 (en) | 2008-07-10 | 2015-01-13 | Covidien Lp | Integrated multi-functional endoscopic tool |
EP2313143B1 (en) | 2008-08-22 | 2014-09-24 | C.R. Bard, Inc. | Catheter assembly including ecg sensor and magnetic assemblies |
US8551074B2 (en) | 2008-09-08 | 2013-10-08 | Bayer Pharma AG | Connector system having a compressible sealing element and a flared fluid path element |
US20100076721A1 (en) * | 2008-09-23 | 2010-03-25 | Crucial Innovation, Inc. | Dynamic Sizing Apparatus, System, and Method of Using the Same |
GB2464092A (en) * | 2008-09-25 | 2010-04-07 | Prosurgics Ltd | Surgical mechanism control system |
US8165658B2 (en) | 2008-09-26 | 2012-04-24 | Medtronic, Inc. | Method and apparatus for positioning a guide relative to a base |
US8437833B2 (en) | 2008-10-07 | 2013-05-07 | Bard Access Systems, Inc. | Percutaneous magnetic gastrostomy |
WO2010048475A1 (en) * | 2008-10-23 | 2010-04-29 | Immersion Corporation | Systems and methods for ultrasound simulation using depth peeling |
US20100103432A1 (en) * | 2008-10-27 | 2010-04-29 | Mcginnis William J | Positioning system and method of using same |
US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
DE102008064105A1 (en) * | 2008-12-19 | 2010-07-08 | Siemens Aktiengesellschaft | Device for determining the position of at least one local coil arranged or to be arranged on a patient couch of a magnetic resonance device, magnetic resonance system with such a device and associated method |
US8830224B2 (en) | 2008-12-31 | 2014-09-09 | Intuitive Surgical Operations, Inc. | Efficient 3-D telestration for local robotic proctoring |
US8632448B1 (en) | 2009-02-05 | 2014-01-21 | Loma Linda University Medical Center | Proton scattering analysis system |
US8834394B2 (en) * | 2009-02-06 | 2014-09-16 | Jamshid Ghajar | Apparatus and methods for reducing brain and cervical spine injury |
US10575979B2 (en) | 2009-02-06 | 2020-03-03 | Jamshid Ghajar | Subject-mounted device to measure relative motion of human joints |
US8641621B2 (en) | 2009-02-17 | 2014-02-04 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US8554307B2 (en) | 2010-04-12 | 2013-10-08 | Inneroptic Technology, Inc. | Image annotation in image-guided medical procedures |
US8690776B2 (en) | 2009-02-17 | 2014-04-08 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image guided surgery |
US11464578B2 (en) | 2009-02-17 | 2022-10-11 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
US8611984B2 (en) | 2009-04-08 | 2013-12-17 | Covidien Lp | Locatable catheter |
KR101019189B1 (en) | 2009-04-28 | 2011-03-04 | 삼성중공업 주식회사 | Position measuring method and position measuring apparatus |
EP2440129A4 (en) | 2009-06-08 | 2015-06-03 | Mri Interventions Inc | Mri-guided surgical systems with preset scan planes |
US9532724B2 (en) | 2009-06-12 | 2017-01-03 | Bard Access Systems, Inc. | Apparatus and method for catheter navigation using endovascular energy mapping |
ES2745861T3 (en) | 2009-06-12 | 2020-03-03 | Bard Access Systems Inc | Apparatus, computer-aided data-processing algorithm, and computer storage medium for positioning an endovascular device in or near the heart |
WO2010148083A2 (en) | 2009-06-16 | 2010-12-23 | Surgivision, Inc. | Mri-guided devices and mri-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time |
US9155592B2 (en) * | 2009-06-16 | 2015-10-13 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
ES2823456T3 (en) | 2009-06-25 | 2021-05-07 | Univ North Carolina Chapel Hill | Method and System for Using Powered Surface Bonded Posts to Assess the Rheology of Biological Fluids |
EP2464407A4 (en) | 2009-08-10 | 2014-04-02 | Bard Access Systems Inc | Devices and methods for endovascular electrography |
US8494614B2 (en) | 2009-08-31 | 2013-07-23 | Regents Of The University Of Minnesota | Combination localization system |
US8494613B2 (en) | 2009-08-31 | 2013-07-23 | Medtronic, Inc. | Combination localization system |
US20110190582A1 (en) * | 2009-09-28 | 2011-08-04 | Bennett James D | Intravaginal optics targeting system |
JP6034695B2 (en) | 2009-10-01 | 2016-11-30 | ローマ リンダ ユニヴァーシティ メディカル センター | Ion-induced impact ionization detector and its use |
US11103213B2 (en) | 2009-10-08 | 2021-08-31 | C. R. Bard, Inc. | Spacers for use with an ultrasound probe |
US8319687B2 (en) * | 2009-12-09 | 2012-11-27 | Trimble Navigation Limited | System for determining position in a work space |
US8237786B2 (en) * | 2009-12-23 | 2012-08-07 | Applied Precision, Inc. | System and method for dense-stochastic-sampling imaging |
JP2013518676A (en) | 2010-02-02 | 2013-05-23 | シー・アール・バード・インコーポレーテッド | Apparatus and method for locating catheter navigation and tip |
WO2011100628A2 (en) | 2010-02-12 | 2011-08-18 | Loma Linda University Medical Center | Systems and methodologies for proton computed tomography |
US10588647B2 (en) | 2010-03-01 | 2020-03-17 | Stryker European Holdings I, Llc | Computer assisted surgery system |
US8643850B1 (en) | 2010-03-02 | 2014-02-04 | Richard L. Hartman | Automated system for load acquisition and engagement |
US8749797B1 (en) | 2010-03-02 | 2014-06-10 | Advanced Optical Systems Inc. | System and method for remotely determining position and orientation of an object |
CN103037762B (en) | 2010-05-28 | 2016-07-13 | C·R·巴德股份有限公司 | For inserting, with pin, the device that guiding system is used together |
US10582834B2 (en) | 2010-06-15 | 2020-03-10 | Covidien Lp | Locatable expandable working channel and method |
AU2010357460B2 (en) | 2010-07-16 | 2013-10-31 | Stryker European Operations Holdings Llc | Surgical targeting system and method |
CA2806353A1 (en) | 2010-08-09 | 2012-02-16 | C.R. Bard Inc. | Support and cover structures for an ultrasound probe head |
MX338127B (en) | 2010-08-20 | 2016-04-04 | Bard Inc C R | Reconfirmation of ECG-assisted catheter tip placement |
US8425425B2 (en) | 2010-09-20 | 2013-04-23 | M. Dexter Hagy | Virtual image formation method for an ultrasound device |
US8657809B2 (en) | 2010-09-29 | 2014-02-25 | Stryker Leibinger Gmbh & Co., Kg | Surgical navigation system |
CN103189009B (en) | 2010-10-29 | 2016-09-07 | C·R·巴德股份有限公司 | The bio-impedance auxiliary of Medical Devices is placed |
US20120127012A1 (en) * | 2010-11-24 | 2012-05-24 | Samsung Electronics Co., Ltd. | Determining user intent from position and orientation information |
US10813553B2 (en) * | 2011-03-02 | 2020-10-27 | Diagnostic Photonics, Inc. | Handheld optical probe in combination with a fixed-focus fairing |
EP2684034A4 (en) | 2011-03-07 | 2014-09-03 | Univ Loma Linda Med | Systems, devices and methods related to calibration of a proton computed tomography scanner |
US8407111B2 (en) * | 2011-03-31 | 2013-03-26 | General Electric Company | Method, system and computer program product for correlating information and location |
US8687172B2 (en) | 2011-04-13 | 2014-04-01 | Ivan Faul | Optical digitizer with improved distance measurement capability |
US11911117B2 (en) | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
WO2013052187A2 (en) | 2011-06-27 | 2013-04-11 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
AU2012278809B2 (en) | 2011-07-06 | 2016-09-29 | C.R. Bard, Inc. | Needle length determination and calibration for insertion guidance system |
USD699359S1 (en) | 2011-08-09 | 2014-02-11 | C. R. Bard, Inc. | Ultrasound probe head |
USD724745S1 (en) | 2011-08-09 | 2015-03-17 | C. R. Bard, Inc. | Cap for an ultrasound probe |
US9606209B2 (en) | 2011-08-26 | 2017-03-28 | Kineticor, Inc. | Methods, systems, and devices for intra-scan motion correction |
US9167989B2 (en) * | 2011-09-16 | 2015-10-27 | Mako Surgical Corp. | Systems and methods for measuring parameters in joint replacement surgery |
WO2013070775A1 (en) | 2011-11-07 | 2013-05-16 | C.R. Bard, Inc | Ruggedized ultrasound hydrogel insert |
US9451810B2 (en) | 2011-11-18 | 2016-09-27 | Nike, Inc. | Automated identification of shoe parts |
US8755925B2 (en) | 2011-11-18 | 2014-06-17 | Nike, Inc. | Automated identification and assembly of shoe parts |
US8958901B2 (en) | 2011-11-18 | 2015-02-17 | Nike, Inc. | Automated manufacturing of shoe parts |
US10552551B2 (en) | 2011-11-18 | 2020-02-04 | Nike, Inc. | Generation of tool paths for shoe assembly |
US8849620B2 (en) | 2011-11-18 | 2014-09-30 | Nike, Inc. | Automated 3-D modeling of shoe parts |
EP2803044B1 (en) | 2012-01-10 | 2020-09-30 | Koninklijke Philips N.V. | Image processing apparatus |
US8670816B2 (en) | 2012-01-30 | 2014-03-11 | Inneroptic Technology, Inc. | Multiple medical device guidance |
US20150025548A1 (en) | 2012-03-08 | 2015-01-22 | Neutar, Llc | Patient and Procedure Customized Fixation and Targeting Devices for Stereotactic Frames |
US9186053B2 (en) | 2012-05-03 | 2015-11-17 | Covidien Lp | Methods of using light to repair hernia defects |
GB2540075B (en) * | 2012-05-18 | 2017-04-19 | Acergy France SAS | Improvements relating to pipe measurement |
EP2854685A1 (en) * | 2012-06-05 | 2015-04-08 | Brainlab AG | Improving the accuracy of navigating a medical device |
WO2013188833A2 (en) | 2012-06-15 | 2013-12-19 | C.R. Bard, Inc. | Apparatus and methods for detection of a removable cap on an ultrasound probe |
US9008757B2 (en) | 2012-09-26 | 2015-04-14 | Stryker Corporation | Navigation system including optical and non-optical sensors |
WO2014048447A1 (en) * | 2012-09-27 | 2014-04-03 | Stryker Trauma Gmbh | Rotational position determination |
US9792836B2 (en) * | 2012-10-30 | 2017-10-17 | Truinject Corp. | Injection training apparatus using 3D position sensor |
WO2014070799A1 (en) | 2012-10-30 | 2014-05-08 | Truinject Medical Corp. | System for injection training |
WO2014085804A1 (en) | 2012-11-30 | 2014-06-05 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for determining physical properties of a specimen in a portable point of care diagnostic device |
US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
CN105392423B (en) | 2013-02-01 | 2018-08-17 | 凯内蒂科尔股份有限公司 | Motion tracking system for real-time adaptive motion compensation in biomedical imaging |
US10314559B2 (en) | 2013-03-14 | 2019-06-11 | Inneroptic Technology, Inc. | Medical device guidance |
US10105149B2 (en) | 2013-03-15 | 2018-10-23 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9381417B2 (en) | 2013-08-16 | 2016-07-05 | Shimano Inc. | Bicycle fitting system |
US9922578B2 (en) | 2014-01-17 | 2018-03-20 | Truinject Corp. | Injection site training system |
ES2811323T3 (en) | 2014-02-06 | 2021-03-11 | Bard Inc C R | Systems for the guidance and placement of an intravascular device |
DE102014102398A1 (en) * | 2014-02-25 | 2015-08-27 | Aesculap Ag | Medical instruments and procedures |
US10290231B2 (en) | 2014-03-13 | 2019-05-14 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
CN106572810A (en) | 2014-03-24 | 2017-04-19 | 凯内蒂科尔股份有限公司 | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
JP6321441B2 (en) * | 2014-05-07 | 2018-05-09 | 株式会社ミツトヨ | Three-dimensional measurement system, three-dimensional measurement method, and object to be measured |
EP3811891A3 (en) | 2014-05-14 | 2021-05-05 | Stryker European Holdings I, LLC | Navigation system and processor arrangement for tracking the position of a work target |
KR102258800B1 (en) * | 2014-05-15 | 2021-05-31 | 삼성메디슨 주식회사 | Ultrasound diagnosis apparatus and method thereof |
US10952593B2 (en) | 2014-06-10 | 2021-03-23 | Covidien Lp | Bronchoscope adapter |
CN106714681A (en) | 2014-07-23 | 2017-05-24 | 凯内蒂科尔股份有限公司 | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
EP3957232A1 (en) | 2014-07-24 | 2022-02-23 | University Health Network | Collection and analysis of data for diagnostic purposes |
US9901406B2 (en) | 2014-10-02 | 2018-02-27 | Inneroptic Technology, Inc. | Affected region display associated with a medical device |
US20180296183A1 (en) * | 2014-11-04 | 2018-10-18 | Vib Vzw | Method and apparatus for ultrasound imaging of brain activity |
KR102477470B1 (en) * | 2014-11-21 | 2022-12-13 | 씽크 써지컬, 인크. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
CN107111963B (en) | 2014-12-01 | 2020-11-17 | 特鲁因杰克特公司 | Injection training tool emitting omnidirectional light |
US10188467B2 (en) | 2014-12-12 | 2019-01-29 | Inneroptic Technology, Inc. | Surgical guidance intersection display |
CN107106253B (en) * | 2014-12-16 | 2020-04-03 | 皇家飞利浦有限公司 | Pulsating light-emitting marking device |
US10973584B2 (en) | 2015-01-19 | 2021-04-13 | Bard Access Systems, Inc. | Device and method for vascular access |
US10426555B2 (en) | 2015-06-03 | 2019-10-01 | Covidien Lp | Medical instrument with sensor for use in a system and method for electromagnetic navigation |
US10349890B2 (en) | 2015-06-26 | 2019-07-16 | C. R. Bard, Inc. | Connector interface for ECG-based catheter positioning system |
US9949700B2 (en) | 2015-07-22 | 2018-04-24 | Inneroptic Technology, Inc. | Medical device approaches |
US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
US10973587B2 (en) * | 2015-08-19 | 2021-04-13 | Brainlab Ag | Reference array holder |
KR102532287B1 (en) * | 2015-10-08 | 2023-05-15 | 삼성메디슨 주식회사 | Ultrasonic apparatus and control method for the same |
US10500340B2 (en) | 2015-10-20 | 2019-12-10 | Truinject Corp. | Injection system |
US9962134B2 (en) | 2015-10-28 | 2018-05-08 | Medtronic Navigation, Inc. | Apparatus and method for maintaining image quality while minimizing X-ray dosage of a patient |
US10716515B2 (en) | 2015-11-23 | 2020-07-21 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US10134188B2 (en) * | 2015-12-21 | 2018-11-20 | Intel Corporation | Body-centric mobile point-of-view augmented and virtual reality |
WO2017127571A1 (en) | 2016-01-19 | 2017-07-27 | Magic Leap, Inc. | Augmented reality systems and methods utilizing reflections |
US11000207B2 (en) | 2016-01-29 | 2021-05-11 | C. R. Bard, Inc. | Multiple coil system for tracking a medical device |
US9675319B1 (en) | 2016-02-17 | 2017-06-13 | Inneroptic Technology, Inc. | Loupe display |
WO2017151441A2 (en) | 2016-02-29 | 2017-09-08 | Truinject Medical Corp. | Cosmetic and therapeutic injection safety systems, methods, and devices |
US10648790B2 (en) | 2016-03-02 | 2020-05-12 | Truinject Corp. | System for determining a three-dimensional position of a testing tool |
US10849688B2 (en) | 2016-03-02 | 2020-12-01 | Truinject Corp. | Sensory enhanced environments for injection aid and social training |
WO2017189450A1 (en) | 2016-04-26 | 2017-11-02 | Magic Leap, Inc. | Electromagnetic tracking with augmented reality systems |
KR101790772B1 (en) * | 2016-05-10 | 2017-10-26 | 주식회사 힐세리온 | Portable ultrasound diagnosis system providing ultrasound image for guide |
US10478254B2 (en) | 2016-05-16 | 2019-11-19 | Covidien Lp | System and method to access lung tissue |
WO2018032084A1 (en) * | 2016-08-17 | 2018-02-22 | Synaptive Medical (Barbados) Inc. | Wireless active tracking fiducials |
JP6748299B2 (en) * | 2016-08-30 | 2020-08-26 | マコー サージカル コーポレイション | System and method for intraoperative pelvic registration |
US10278778B2 (en) | 2016-10-27 | 2019-05-07 | Inneroptic Technology, Inc. | Medical device navigation using a virtual 3D space |
US10751126B2 (en) | 2016-10-28 | 2020-08-25 | Covidien Lp | System and method for generating a map for electromagnetic navigation |
US10418705B2 (en) | 2016-10-28 | 2019-09-17 | Covidien Lp | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
US10792106B2 (en) | 2016-10-28 | 2020-10-06 | Covidien Lp | System for calibrating an electromagnetic navigation system |
US10615500B2 (en) | 2016-10-28 | 2020-04-07 | Covidien Lp | System and method for designing electromagnetic navigation antenna assemblies |
US10638952B2 (en) | 2016-10-28 | 2020-05-05 | Covidien Lp | Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system |
US10517505B2 (en) | 2016-10-28 | 2019-12-31 | Covidien Lp | Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system |
US10722311B2 (en) | 2016-10-28 | 2020-07-28 | Covidien Lp | System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map |
US10446931B2 (en) | 2016-10-28 | 2019-10-15 | Covidien Lp | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
US10510171B2 (en) * | 2016-11-29 | 2019-12-17 | Biosense Webster (Israel) Ltd. | Visualization of anatomical cavities |
US10650703B2 (en) | 2017-01-10 | 2020-05-12 | Truinject Corp. | Suture technique training system |
US10269266B2 (en) | 2017-01-23 | 2019-04-23 | Truinject Corp. | Syringe dose and position measuring apparatus |
US10154885B1 (en) | 2017-05-26 | 2018-12-18 | Medline Industries, Inc. | Systems, apparatus and methods for continuously tracking medical items throughout a procedure |
US10699448B2 (en) | 2017-06-29 | 2020-06-30 | Covidien Lp | System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data |
US11259879B2 (en) | 2017-08-01 | 2022-03-01 | Inneroptic Technology, Inc. | Selective transparency to assist medical device navigation |
US11219489B2 (en) | 2017-10-31 | 2022-01-11 | Covidien Lp | Devices and systems for providing sensors in parallel with medical tools |
US11484365B2 (en) | 2018-01-23 | 2022-11-01 | Inneroptic Technology, Inc. | Medical image guidance |
US11007018B2 (en) | 2018-06-15 | 2021-05-18 | Mako Surgical Corp. | Systems and methods for tracking objects |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
EP3852622A1 (en) | 2018-10-16 | 2021-07-28 | Bard Access Systems, Inc. | Safety-equipped connection systems and methods thereof for establishing electrical connections |
US11617625B2 (en) | 2019-03-12 | 2023-04-04 | Medline Industries, Lp | Systems, apparatus and methods for properly locating items |
CN216090756U (en) | 2019-08-12 | 2022-03-22 | 巴德阿克塞斯系统股份有限公司 | Medical device and shape sensing system for medical device |
CN112826480A (en) | 2019-11-25 | 2021-05-25 | 巴德阿克塞斯系统股份有限公司 | Shape sensing system with filter and method thereof |
CN214804697U (en) | 2019-11-25 | 2021-11-23 | 巴德阿克塞斯系统股份有限公司 | Optical tip tracking system |
US11269407B2 (en) * | 2020-01-30 | 2022-03-08 | Dell Products L.P. | System and method of determining attributes of a workspace configuration based on eye gaze or head pose |
EP4110175A1 (en) | 2020-02-28 | 2023-01-04 | Bard Access Systems, Inc. | Optical connection systems and methods thereof |
CN113456054A (en) | 2020-03-30 | 2021-10-01 | 巴德阿克塞斯系统股份有限公司 | Optical and electrical diagnostic system and method thereof |
EP4171423A1 (en) | 2020-06-26 | 2023-05-03 | Bard Access Systems, Inc. | Malposition detection system |
CN113926050A (en) | 2020-06-29 | 2022-01-14 | 巴德阿克塞斯系统股份有限公司 | Automatic dimensional reference system for optical fibers |
CN113907705A (en) | 2020-07-10 | 2022-01-11 | 巴德阿克塞斯系统股份有限公司 | Continuous optical fiber function monitoring and self-diagnosis reporting system |
EP4188212A1 (en) | 2020-08-03 | 2023-06-07 | Bard Access Systems, Inc. | Bragg grated fiber optic fluctuation sensing and monitoring system |
WO2022081586A1 (en) | 2020-10-13 | 2022-04-21 | Bard Access Systems, Inc. | Disinfecting covers for functional connectors of medical devices and methods thereof |
Family Cites Families (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3821469A (en) * | 1972-05-15 | 1974-06-28 | Amperex Electronic Corp | Graphical data device |
US3983474A (en) * | 1975-02-21 | 1976-09-28 | Polhemus Navigation Sciences, Inc. | Tracking and determining orientation of object using coordinate transformation means, system and process |
US4182312A (en) * | 1977-05-20 | 1980-01-08 | Mushabac David R | Dental probe |
FR2416480A1 (en) * | 1978-02-03 | 1979-08-31 | Thomson Csf | RADIANT SOURCE LOCATION DEVICE AND STEERING TRACKING SYSTEM INCLUDING SUCH A DEVICE |
US4341220A (en) * | 1979-04-13 | 1982-07-27 | Pfizer Inc. | Stereotactic surgery apparatus and method |
US4608977A (en) * | 1979-08-29 | 1986-09-02 | Brown Russell A | System using computed tomography as for selective body treatment |
US4419012A (en) * | 1979-09-11 | 1983-12-06 | Elliott Brothers (London) Limited | Position measuring system |
US4638798A (en) * | 1980-09-10 | 1987-01-27 | Shelden C Hunter | Stereotactic method and apparatus for locating and treating or removing lesions |
US4805616A (en) | 1980-12-08 | 1989-02-21 | Pao David S C | Bipolar probes for ophthalmic surgery and methods of performing anterior capsulotomy |
US4396945A (en) * | 1981-08-19 | 1983-08-02 | Solid Photography Inc. | Method of sensing the position and orientation of elements in space |
US4585350A (en) * | 1983-01-28 | 1986-04-29 | Pryor Timothy R | Pulsed robotic inspection |
US4651732A (en) * | 1983-03-17 | 1987-03-24 | Frederick Philip R | Three-dimensional light guidance system for invasive procedures |
NL8302228A (en) * | 1983-06-22 | 1985-01-16 | Optische Ind De Oude Delft Nv | MEASURING SYSTEM, USING A TRIANGULATION PRINCIPLE, FOR THE CONTACT-FREE MEASUREMENT OF THE DISTANCE FROM A SURFACE CONTOUR TO A REFERENCE PLANE. |
NL8304023A (en) * | 1983-11-23 | 1985-06-17 | Kinetics Technology | METHOD FOR PURIFYING FINISHED LUBRICATING OIL. |
DE3342675A1 (en) * | 1983-11-25 | 1985-06-05 | Fa. Carl Zeiss, 7920 Heidenheim | METHOD AND DEVICE FOR CONTACTLESS MEASUREMENT OF OBJECTS |
US4753528A (en) * | 1983-12-13 | 1988-06-28 | Quantime, Inc. | Laser archery distance device |
US4841967A (en) | 1984-01-30 | 1989-06-27 | Chang Ming Z | Positioning device for percutaneous needle insertion |
US4705395A (en) * | 1984-10-03 | 1987-11-10 | Diffracto Ltd. | Triangulation data integrity |
US4706665A (en) | 1984-12-17 | 1987-11-17 | Gouda Kasim I | Frame for stereotactic surgery |
US4782239A (en) * | 1985-04-05 | 1988-11-01 | Nippon Kogaku K. K. | Optical position measuring apparatus |
SE447848B (en) * | 1985-06-14 | 1986-12-15 | Anders Bengtsson | INSTRUMENTS FOR SEATING SURFACE TOPOGRAPHY |
US4743771A (en) * | 1985-06-17 | 1988-05-10 | View Engineering, Inc. | Z-axis height measurement system |
US4805615A (en) | 1985-07-02 | 1989-02-21 | Carol Mark P | Method and apparatus for performing stereotactic surgery |
US4705401A (en) * | 1985-08-12 | 1987-11-10 | Cyberware Laboratory Inc. | Rapid three-dimensional surface digitizer |
US4737032A (en) * | 1985-08-26 | 1988-04-12 | Cyberware Laboratory, Inc. | Surface mensuration sensor |
IL76517A (en) * | 1985-09-27 | 1989-02-28 | Nessim Igal Levy | Distance measuring device |
US4709156A (en) * | 1985-11-27 | 1987-11-24 | Ex-Cell-O Corporation | Method and apparatus for inspecting a surface |
US4722056A (en) * | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomographic image onto the focal plane of an operating microscope |
SE469321B (en) * | 1986-04-14 | 1993-06-21 | Joenkoepings Laens Landsting | SET AND DEVICE TO MAKE A MODIFIED THREE-DIMENSIONAL IMAGE OF AN ELASTIC DEFORMABLE PURPOSE |
US4822163A (en) * | 1986-06-26 | 1989-04-18 | Robotic Vision Systems, Inc. | Tracking vision sensor |
US4723544A (en) | 1986-07-09 | 1988-02-09 | Moore Robert R | Hemispherical vectoring needle guide for discolysis |
US4791934A (en) | 1986-08-07 | 1988-12-20 | Picker International, Inc. | Computer tomography assisted stereotactic surgery system and method |
US4733969A (en) * | 1986-09-08 | 1988-03-29 | Cyberoptics Corporation | Laser probe for determining distance |
US4743770A (en) * | 1986-09-22 | 1988-05-10 | Mitutoyo Mfg. Co., Ltd. | Profile-measuring light probe using a change in reflection factor in the proximity of a critical angle of light |
US4761072A (en) * | 1986-09-30 | 1988-08-02 | Diffracto Ltd. | Electro-optical sensors for manual control |
US4750487A (en) * | 1986-11-24 | 1988-06-14 | Zanetti Paul H | Stereotactic frame |
DE3703422A1 (en) * | 1987-02-05 | 1988-08-18 | Zeiss Carl Fa | OPTOELECTRONIC DISTANCE SENSOR |
US4745290A (en) * | 1987-03-19 | 1988-05-17 | David Frankel | Method and apparatus for use in making custom shoes |
US4875478A (en) | 1987-04-10 | 1989-10-24 | Chen Harry H | Portable compression grid & needle holder |
US4793355A (en) * | 1987-04-17 | 1988-12-27 | Biomagnetic Technologies, Inc. | Apparatus and process for making biomagnetic measurements |
US4809694A (en) | 1987-05-19 | 1989-03-07 | Ferrara Vincent L | Biopsy guide |
US4836778A (en) * | 1987-05-26 | 1989-06-06 | Vexcel Corporation | Mandibular motion monitoring system |
US4829373A (en) * | 1987-08-03 | 1989-05-09 | Vexcel Corporation | Stereo mensuration apparatus |
US4931056A (en) | 1987-09-04 | 1990-06-05 | Neurodynamics, Inc. | Catheter guide apparatus for perpendicular insertion into a cranium orifice |
US4991579A (en) * | 1987-11-10 | 1991-02-12 | Allen George S | Method and apparatus for providing related images over time of a portion of the anatomy using fiducial implants |
US5027818A (en) * | 1987-12-03 | 1991-07-02 | University Of Florida | Dosimetric technique for stereotactic radiosurgery |
US4896673A (en) * | 1988-07-15 | 1990-01-30 | Medstone International, Inc. | Method and apparatus for stone localization using ultrasound imaging |
US5099846A (en) * | 1988-12-23 | 1992-03-31 | Hardy Tyrone L | Method and apparatus for video presentation from a variety of scanner imaging sources |
US5197476A (en) * | 1989-03-16 | 1993-03-30 | Christopher Nowacki | Locating target in human body |
JP3021561B2 (en) * | 1989-10-16 | 2000-03-15 | オリンパス光学工業株式会社 | Surgical microscope device with observation point coordinate display function |
US5273039A (en) * | 1989-10-16 | 1993-12-28 | Olympus Optical Co., Ltd. | Surgical microscope apparatus having a function to display coordinates of observation point |
ES2085885T3 (en) * | 1989-11-08 | 1996-06-16 | George S Allen | MECHANICAL ARM FOR INTERACTIVE SURGERY SYSTEM DIRECTED BY IMAGES. |
US5107139A (en) * | 1990-03-30 | 1992-04-21 | Texas Instruments Incorporated | On-chip transient event detector |
US5224049A (en) * | 1990-04-10 | 1993-06-29 | Mushabac David R | Method, system and mold assembly for use in preparing a dental prosthesis |
US5107839A (en) * | 1990-05-04 | 1992-04-28 | Pavel V. Houdek | Computer controlled stereotaxic radiotherapy system and method |
US5086401A (en) * | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
US5017139A (en) | 1990-07-05 | 1991-05-21 | Mushabac David R | Mechanical support for hand-held dental/medical instrument |
US5198877A (en) * | 1990-10-15 | 1993-03-30 | Pixsys, Inc. | Method and apparatus for three-dimensional non-contact shape sensing |
DE69132412T2 (en) * | 1990-10-19 | 2001-03-01 | Univ St Louis | LOCALIZATION SYSTEM FOR A SURGICAL PROBE FOR USE ON THE HEAD |
US5059789A (en) * | 1990-10-22 | 1991-10-22 | International Business Machines Corp. | Optical position and orientation sensor |
US5309913A (en) * | 1992-11-30 | 1994-05-10 | The Cleveland Clinic Foundation | Frameless stereotaxy system |
US5305091A (en) * | 1992-12-07 | 1994-04-19 | Oreo Products Inc. | Optical coordinate measuring system for large objects |
IL109385A (en) * | 1993-04-22 | 1998-03-10 | Pixsys | System for locating the relative positions of objects in three dimensional space |
1994
- 1994-04-22 IL IL109385A patent/IL109385A/en not_active IP Right Cessation
- 1994-04-22 AU AU66668/94A patent/AU6666894A/en not_active Abandoned
- 1994-04-22 JP JP6523546A patent/JPH08509144A/en active Pending
- 1994-04-22 DE DE69432961T patent/DE69432961T2/en not_active Expired - Lifetime
- 1994-04-22 ZA ZA942812A patent/ZA942812B/en unknown
- 1994-04-22 EP EP94915394A patent/EP0700269B1/en not_active Expired - Lifetime
- 1994-04-22 EP EP02004032A patent/EP1219259B1/en not_active Expired - Lifetime
- 1994-04-22 CA CA002161126A patent/CA2161126C/en not_active Expired - Fee Related
- 1994-04-22 WO PCT/US1994/004298 patent/WO1994023647A1/en active IP Right Grant
- 1994-04-22 DE DE69431875T patent/DE69431875T2/en not_active Expired - Lifetime
- 1994-10-04 US US08/317,805 patent/US5622170A/en not_active Expired - Lifetime

1997
- 1997-04-18 US US08/844,365 patent/US5987349A/en not_active Expired - Lifetime
- 1997-11-12 US US08/967,890 patent/US5920395A/en not_active Expired - Lifetime

1998
- 1998-12-28 US US09/220,888 patent/US6442416B1/en not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
EP0700269A4 (en) | 1998-07-22 |
DE69431875D1 (en) | 2003-01-23 |
US5622170A (en) | 1997-04-22 |
IL109385A (en) | 1998-03-10 |
EP0700269B1 (en) | 2002-12-11 |
US6442416B1 (en) | 2002-08-27 |
US5920395A (en) | 1999-07-06 |
DE69432961T2 (en) | 2004-02-12 |
DE69431875T2 (en) | 2003-05-28 |
US5987349A (en) | 1999-11-16 |
ZA942812B (en) | 1995-11-22 |
EP1219259A1 (en) | 2002-07-03 |
AU6666894A (en) | 1994-11-08 |
CA2161126A1 (en) | 1994-10-27 |
JPH08509144A (en) | 1996-10-01 |
IL109385A0 (en) | 1994-07-31 |
EP0700269A1 (en) | 1996-03-13 |
EP1219259B1 (en) | 2003-07-16 |
DE69432961D1 (en) | 2003-08-21 |
WO1994023647A1 (en) | 1994-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2161126C (en) | System for locating relative positions of objects | |
US6850794B2 (en) | Endoscopic targeting method and system | |
JP4204109B2 (en) | Real-time positioning system | |
US7831096B2 (en) | Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use | |
EP0600610B1 (en) | A position determining system and method | |
US9320569B2 (en) | Systems and methods for implant distance measurement | |
EP0931516B1 (en) | Surgical probe locating system for head use | |
US7885441B2 (en) | Systems and methods for implant virtual review | |
US8131031B2 (en) | Systems and methods for inferred patient annotation | |
US6675040B1 (en) | Optical object tracking system | |
US6146390A (en) | Apparatus and method for photogrammetric surgical localization | |
US5394875A (en) | Automatic ultrasonic localization of targets implanted in a portion of the anatomy | |
US5389101A (en) | Apparatus and method for photogrammetric surgical localization | |
US6490473B1 (en) | System and method of interactive positioning | |
RU2434600C2 (en) | Surgical system controlled by images | |
Lathrop et al. | Minimally invasive holographic surface scanning for soft-tissue image registration | |
US20080154120A1 (en) | Systems and methods for intraoperative measurements on navigated placements of implants | |
US20020172328A1 (en) | 3-D Navigation for X-ray imaging system | |
US20080119724A1 (en) | Systems and methods for intraoperative implant placement analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
MKLA | Lapsed |
Effective date: 20130422 |