WO2007049323A1 - Apparatus for moving surgical instruments - Google Patents

Apparatus for moving surgical instruments

Info

Publication number
WO2007049323A1
WO2007049323A1 (application PCT/IT2006/000758)
Authority
WO
WIPO (PCT)
Prior art keywords
equipment according
image
images
detection device
detection
Prior art date
Application number
PCT/IT2006/000758
Other languages
French (fr)
Inventor
Manolo Omiciuolo
Massimo Pagani
Simone Pio Negri
Vito Basile
Original Assignee
Sintesi S.C.P.A.
Priority date
Filing date
Publication date
Application filed by Sintesi S.C.P.A. filed Critical Sintesi S.C.P.A.
Priority to EP06821748A priority Critical patent/EP1951141A1/en
Publication of WO2007049323A1 publication Critical patent/WO2007049323A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B90/90: Identification means for patients or instruments, e.g. tags
    • A61B90/08: Accessories or related features not otherwise provided for
    • A61B2090/0801: Prevention of accidental cutting or pricking
    • A61B2090/08021: Prevention of accidental cutting or pricking of the patient or his organs
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681: Aspects not otherwise provided for
    • A61B2017/00694: Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00699: Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement caused by respiration, e.g. by triggering
    • A61B2017/00703: Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement of heart, e.g. ECG-triggered
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations

Definitions

  • The object of the present invention is equipment for moving medical elements, such as, by way of non-limiting example, surgical elements.
  • Technological background of the invention
  • With reference to diagnostic activities, various types of diagnostic imaging are currently available in order to obtain digital images of anatomical details, organs and tissues, which allow identifying morphological and/or functional characteristics that may be traced to particular physiological or pathological conditions.
  • Ultrasound, radiographic, nuclear magnetic resonance and scintigraphic detections can be considered as non-exhaustive examples. These detections generally have the purpose of generating digital images of organs and/or tissues, which point out morphological and/or functional properties, possibly with the aid of contrast media and/or injectable radioactive tracers.
  • Accuracy in the detection of neoplasias is a crucial point, in order to suitably ensure that the tissue intended to be removed and/or sampled has actually been excised.
  • This localization occurs by analyzing the results of the examinations carried out during the diagnosis and, consequently, by identifying on the surface of the animal's or patient's body the region to be operated, which corresponds to the inner portion of interest (organ, tissue or the like).
  • A mark is made on the body's surface, which indicates the region to be operated. In any case, the identification of this region on the body's surface is carried out by means of a visual analysis of the body by an operator in charge, such as a surgeon.
  • The object of the present invention is to provide equipment for moving medical elements that is an alternative to known equipment and that, for example, overcomes the limitations of the above-mentioned known techniques in the localization of a patient's body region to be operated.
  • the object of the present invention is achieved by an equipment for moving medical elements such as described in Claim 1, and preferred embodiments thereof as described in Claims 2 to 37.
  • the object of the present invention is also a method for using an equipment for moving medical elements such as defined in Claim 38 and preferred embodiments thereof as defined in Claims 39 and 40.
  • The equipment 1 is capable of carrying out in-vivo diagnoses by multi-modal, real-time imaging; the equipment 1 can further carry out the in-vivo localization of morphological and/or functional alterations and control the movement of surgical elements.
  • in-vivo is meant herein to refer to diagnoses and detections that are carried out on living beings that are preferably anaesthetized.
  • multi-modal is meant herein to refer to the possibility of using several detection methodologies.
  • the equipment 1 comprises a medical element 50, which is moved as a function of preset detections carried out and preset controls imposed by the operator, as will be better detailed below.
  • This medical element 50 can comprise, for example, a surgical element and/or a pointer element.
  • The surgical element can be any element capable of surgically operating on a patient's body region, and can comprise, for example, a tool suitable for carrying out incisions, holes, injections, or the taking of liquid or solid samples, suitable to be operated according to a suitable movement (e.g. a biopsy needle).
  • the pointer element is capable of operating on a patient's body region by indicating and pointing out the latter.
  • The pointer element is optical and comprises a device that sends a visible radiation beam impinging on a limited area of a surface in this region (for example, also by projecting an image, such as a point, a circle, an X, or the like) so as to make it identifiable and seen by the naked eye with sufficient precision.
  • the pointer element can preferably comprise a laser-emitting device.
  • The medical element 50 includes both the surgical element and the pointer element, which can be suitably moved in an automated manner. In accordance with a second embodiment, the medical element 50 includes only the pointer element, whereas the tools of the surgical element can be of a manual type and not moved in an automated manner.
  • the medical element 50 is of a surgical type and includes the surgical tools thereof, but is not provided with a pointer element. If not stated otherwise, exemplary reference will be made herein below to this third embodiment, though, however, the description below can be also applied to the other embodiments of the medical element 50.
  • Suitable moving elements 51 can be associated with the surgical element 50, such as an electric and/or pneumatic motor, which operatively drives the element 50 to move the latter in one or more directions.
  • Both the surgical element 50 and the moving means 51 thereof can be mounted to a suitable support structure (not illustrated herein).
  • the surgical element 50 is preferably mounted to the same structure to which at least one of the first, second and third detection devices 10, 20, 30 is mounted, which are described herein below.
  • the equipment 1 comprises a first detection device 10 for identifying a patient's region of interest to be operated.
  • The first detection device 10 can be suitable, according to an example, to carry out computer-aided Single Photon Emission Computed Tomography (SPECT), and can be provided with a suitable gamma camera.
  • the device 10 is a light-weight, small-sized scintigraphic system with small scan area and high spatial resolution.
  • The first detection device 10 acquires, with high sensitivity, functional data, i.e. data concerning the behaviour of a specific gamma-emitting radiopharmaceutical that is intravenously injected into the patient.
  • The first detection device 10 provides a plurality of first identification images 11 that are time-ordered, i.e. detected at successive time instants.
  • the first detection device 10 can be associated with a respective first moving element 51c that is arranged for moving the first detection device 10, as well as with the moving system 51 in general.
  • The acquisition process performed by the first device 10 can be carried out by a suitable relative movement between the detection device 10 itself and the portion to be examined.
  • The first moving element 51c can be driven by a control device by means of which a user can define the desired type of scan.
  • the equipment 1 further comprises a central control unit (or central processing unit) 40 that is operatively associated at least with the first detection device 10 in order to process the images detected by the latter, and to operate said surgical member 50 as a function of these images by means of a suitable command signal 100.
  • control unit 40 can comprise a pre-processing block 41 being provided with a filtering block 41a that is arranged for filtering the noise and disturbances of the signal received from the first detection device 10.
  • the pre-processing block 41 can further comprise a scaling block 41b, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a) such that the corresponding display occurs in a determined scale.
  • the pre-processing block 41 can further comprise a spatial alignment block 41c, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a or scaling block 41b) such that the corresponding display occurs in a determined reference system.
  • As regards scaling and spatial alignment, the following technique is to be considered a preferred, though not exclusive, example: to guide the alignment, the absolute coordinates of several pixels (2D case) or voxels (3D case) can be used.
  • The transformation between the two coordinate systems is given by the following expression: x = A·y + T, where:
  • x are the coordinates of the "aligned" image (i.e. the image suitably scaled and referred to the desired reference system);
  • y are the coordinates of the initial image (for example, the first identification image 11);
  • A is the scaling and rotation matrix;
  • T is the translation vector.
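As a sketch of how such an alignment transform might be recovered in practice, the following assumes pairs of corresponding landmark coordinates picked in both images and fits A and T by least squares; the `estimate_affine` helper is illustrative and not part of the patent:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares fit of x = A @ y + T from landmark correspondences.

    src: (N, 2) coordinates y in the initial image.
    dst: (N, 2) coordinates x in the aligned reference system.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Homogeneous design matrix [y | 1]; solving M @ P = x gives A and T jointly.
    M = np.hstack([src, np.ones((len(src), 1))])
    P, *_ = np.linalg.lstsq(M, dst, rcond=None)
    A = P[:2].T          # 2x2 scaling/rotation part
    T = P[2]             # translation
    return A, T

# Example: recover a known scale + rotation + shift from four landmarks.
theta = np.deg2rad(30.0)
A_true = 2.0 * np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
T_true = np.array([5.0, -3.0])
y = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [7.0, 4.0]])
x = y @ A_true.T + T_true
A_est, T_est = estimate_affine(y, x)
```

With four non-collinear landmarks the system is overdetermined and the fit is exact for noise-free data.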
  • the equipment 1 further comprises one or more sensors 60 that are operatively associated with the patient's body.
  • the sensors 60 have the task of detecting the patient's heart rate, the movements associated with breathing and/or the movements associated with tremor events.
  • the sensors 60 are arranged to detect physiological signals, such as mechanical and/or electrical, and/or electromagnetic and/or electrochemical and/or the like, preferably periodical, which are generated by the patient's body.
  • The sensors 60 are connected to the processing unit 40, such that "blurring" events (i.e. interference between the vibrations generated by the patient's body and the detections carried out by the first device 10), which are due to the movements of the subject being observed and/or to other events that disturb the detections carried out by said device, can be avoided.
  • The processing unit 40, in fact, is provided with a synchronization block 42, which is operatively associated with at least the first detection device 10 in order to synchronize the signal incorporating the first images 11 with the signals 60a generated by the sensors 60.
  • The synchronization block 42 is connected downstream of the pre-processing block 41, such that the synchronization block 42 can operate on images that have been processed and filtered beforehand. This synchronization is carried out particularly with reference to the frequencies and amplitudes of the movements generated by the patient's body.
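A minimal sketch of such synchronization is physiological gating: frames are kept only when they fall in a quiet phase after the last sensor trigger (e.g. an ECG R-peak). The function name and window values below are illustrative assumptions, not the patent's scheme:

```python
import numpy as np

def gate_frames(frame_times, trigger_times, window=(0.3, 0.6)):
    """Keep only frames acquired within `window` seconds after the most
    recent physiological trigger, discarding frames taken during motion."""
    triggers = np.asarray(trigger_times, dtype=float)
    kept = []
    for t in frame_times:
        prior = triggers[triggers <= t]          # triggers before this frame
        if prior.size and window[0] <= t - prior[-1] <= window[1]:
            kept.append(t)
    return kept

# Triggers once per second; only frames in the quiet 0.3-0.6 s phase survive.
kept = gate_frames([0.1, 0.5, 1.4, 1.9], [0.0, 1.0, 2.0])
```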
  • the equipment 1 can further comprise a graphic communication interface 70, by means of which an operator can interact with the equipment 1.
  • the communication interface 70 is provided with data presentation means, such as a monitor or similar display device, by means of which it provides the operator with the information that is detected by the first detection device 10.
  • the communication interface 70 is further provided with data input means, which can be employed by the operator to input information and/or commands. Particularly, when the result of the first detection has been displayed, the operator can decide whether to proceed with a further inspection, by means of a subsequent detection which will be described below, or to input a command to the equipment 1 in order to activate the generation of said command signal 100, such that the surgical element 50 can be moved according to the data available so far.
  • the subsequent detection can be carried out using a second detection device 20 and/or a third detection device 30, as will be discussed herein below.
  • the equipment 1 can comprise a second detection device 20, in order to acquire a plurality of three-dimensional images of at least one portion of the patient's body; preferably, this portion comprises the region identified by the first detection device 10.
  • the second detection device 20 has the task of acquiring geometrical (or morphological) information at least on said region.
  • This acquisition can be generally carried out throughout the patient's body, however, in order to limit the duration and complexity of the operation, as well as the discomfort caused to the patient, the acquisition of the three-dimensional images is provided to be carried out only on a portion of the patient's body.
  • the second detection device 20 can be a 2D and/or 3D scanning system suitable for acquiring the external morphology of a limited region of the patient, by means of a plurality of two-dimensional and three-dimensional images; for example, the second detection device 20 can be a structured light acquisition system.
  • The second detection device 20 acquires a plurality of second acquisition images 21 that are time-ordered, i.e. detected at successive time instants.
  • The second detection device 20 can be associated with a respective second moving element 51d that is arranged for moving the second device 20, and belonging to the general moving system 51.
  • The acquisition process performed by the second device 20 can be carried out by a suitable relative movement between the detection device 20 and the portion to be examined.
  • the second moving element 51d can be driven by a command device, by means of which a user can define the type of scanning as desired; preferably this command device is the same command device that operatively drives the first moving element 51c.
  • The second detection device 20 is operatively associated with the filtering block 41a, such that the second images 21 can be filtered from any disturbances or noise that may be present.
  • the second detection device 20 can be operatively associated with the synchronization block 42, such that the signals incorporating the second images 21 can be synchronized with the signals 60a generated by the sensors 60.
  • the control unit 40 is operatively associated both with the first and second detection devices 10, 20 in order to combine the identification of the first device 10 with the acquisition carried out by the second device 20; following this combination, the control unit 40 generates a corresponding display signal for the interface 70, which helps the operator to generate a command 100 for moving the surgical element 50 as a function of this combination.
  • the command signal 100 can be sent to the moving means 51, which act on the architecture and thus on the surgical element 50, such that the latter can be moved as desired.
  • The control unit 40 can comprise, within the pre-processing block 41, an adaptation unit 41b, which is arranged for referring the images detected by the first and second detection devices 10 and 20 to a same (either reduction or magnification) scale, such that the images are comparable to each other; this operation is generally indicated as "scaling".
  • the selected reference scale can be that of the first identification images 11, that of the second acquisition images 21, or a scale other than the preceding ones.
  • the control unit 40 can further comprise, within the pre-processing block 41, a spatial alignment block 41c in order to refer the images provided by the first and second detection devices 10, 20 to a same spatial reference system.
  • the detections carried out by the first and second devices 10, 20 are combined in a same spatial reference system, such that the various images from the two devices 10, 20 can be superimposed and the information provided in said first and second images 11, 21 is simultaneously available.
  • the spatial alignment block 41c is connected downstream of the adaptation unit 41b such that the detected images are inserted in the same spatial reference after they have been all transformed to the same scale.
  • the synchronization block 42 can be connected to the filtering block 41a and/or the spatial alignment block 41c; thereby, the images provided by the second detection device 20 can be also synchronized with the signals 60a from the sensors 60.
  • Through the communication interface 70, the operator is provided both with the information detected by means of the first detection device 10 and with the information detected by means of the second detection device 20; the operator can, at this stage, decide whether to activate the generation of said command signal 100, or to input a command into the equipment 1 in order to carry out further detections.
  • the control unit 40 can also comprise time alignment means 43, in order to refer the identification of the first device 10 and the acquisition of the second device 20 to a same time scale: this results in a command imposing the simultaneous acquisition by the two devices 10 and 20.
  • the time alignment means 43 have thus the task of setting the first identification images 11 (detected by the first device 10) and the second acquisition images 21 (detected by the second device 20) in a same time reference system.
  • The first and second detection devices 10, 20 have a same image detection frequency (or one is a multiple of the other), and are particularly in phase with each other, so as to obtain said simultaneous detection.
  • The time alignment means 43 ensure the simultaneity of the identifications carried out by the first device 10 and/or the acquisitions carried out by the second device 20, obtaining said time alignment.
  • the purpose of the time alignment means 43 is to associate each of said first identification images 11 with at least one of said second acquisition images 21, these first and second images 11, 21 relating to an identification and an acquisition in a same instant, respectively. This is necessary for carrying out a dynamic multimodal display.
  • a second identification image 11 which is obtained like said first image 11 and temporally subsequent thereto, is associated with a second acquisition image 21 that is also obtained like said first image 21, and temporally subsequent thereto.
  • the same work is done for the subsequent images.
  • Two sequences of images 11 and 21 are thus generated, synchronized with each other, suitable to represent the time evolution of the events detected by the devices 10 and 20.
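The association step above can be sketched as a nearest-timestamp pairing between the two image streams. The timestamps and tolerance below are illustrative; the patent does not specify the matching rule:

```python
import bisect

def align_streams(times_a, times_b, tol):
    """Associate each frame of stream A with the nearest-in-time frame of
    stream B, rejecting pairs further apart than `tol` seconds.
    Both timestamp lists must be sorted in ascending order."""
    pairs = []
    for i, t in enumerate(times_a):
        j = bisect.bisect_left(times_b, t)
        # The nearest B frame is either just before or just after t.
        candidates = [k for k in (j - 1, j) if 0 <= k < len(times_b)]
        best = min(candidates, key=lambda k: abs(times_b[k] - t))
        if abs(times_b[best] - t) <= tol:
            pairs.append((i, best))
    return pairs

# Identification images at 10 Hz, acquisition images slightly offset in time.
pairs = align_streams([0.0, 0.1, 0.2], [0.01, 0.12, 0.35], tol=0.05)
```

Frames with no sufficiently close partner (here the one at 0.2 s) are simply dropped from the synchronized sequences.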
  • the control unit 40 can further comprise a reconstruction block 44, which is operatively associated at least with the first and second detection devices 10, 20; preferably, the reconstruction block is connected downstream of the time alignment means 43.
  • the reconstruction block 44 has the task of obtaining a suitable reconstruction, relative to the scanning carried out by the devices, of the signals detected by the detection devices 10, 20.
  • suitable mathematical algorithms process the signals and/or information from the block 43 such as to generate a planar image and/or a volume reconstruction, and/or a surface reconstruction in the space.
  • Mathematical algorithms for volume reconstruction can be "Filtered Back Projection" (FBP) and/or "Ordered Subset Expectation Maximisation" (OSEM), and/or the like.
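As an illustration of Filtered Back Projection (the OSEM variant is iterative and omitted here), the following toy sketch reconstructs a single hot voxel from its simulated parallel-beam sinogram; it relies on `scipy.ndimage.rotate` and is an assumption-laden teaching example, not the patent's implementation:

```python
import numpy as np
from scipy import ndimage

def radon(image, angles_deg):
    """Simulate parallel-beam projections by rotating and summing rows."""
    return np.stack([ndimage.rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def fbp(sinogram, angles_deg):
    """Filtered Back Projection: ramp-filter each projection in the
    Fourier domain, then smear it back across the image plane."""
    n = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    recon = np.zeros((n, n))
    for proj, a in zip(filtered, angles_deg):
        # Tile the 1D projection into a 2D smear, then undo the rotation.
        recon += ndimage.rotate(np.tile(proj, (n, 1)), -a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
phantom = np.zeros((64, 64))
phantom[40, 24] = 1.0                      # single "hot" voxel
recon = fbp(radon(phantom, angles), angles)
peak = np.unravel_index(np.argmax(recon), recon.shape)
```

The reconstruction's brightest point lands at (or within interpolation error of) the original voxel location.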
  • the control unit 40 can further comprise the composition block 45, which is operatively associated with at least the first and second detection devices 10, 20.
  • the composition block 45 is arranged to combine with each other (for example, by means of a pyramidal technique, "wavelet", and the like) the information incorporated in the images supplied by the detection devices 10, 20 such as to obtain a corresponding fusion image 123.
  • composition block 45 can be operatively associated with the synchronization 42, time alignment 43, and reconstruction 44 blocks, such that the signal incorporating the fusion image 123 can be synchronized with the signals 60a generated by the sensors 60.
  • a pyramid is defined as a sequence of auxiliary images where each level in the pyramid is a filtered and subsampled copy of the preceding level.
  • the lowermost level in the pyramid has the same scale as the original image (for example, the first identification image 11) and contains the information of higher resolution than the remaining levels in the pyramid.
  • the highest levels in the pyramid have a lower resolution, but they have a higher scale than the original image.
  • the basic concept is making a pyramid for the fused image (for example, the fusion image 123) from the pyramids of each starting image (for example, the first and second images 11, 21).
  • the fusion image 123 is then obtained by operating an inverse transformation on the pyramid.
  • the first step is making the pyramid for each source image; the fusion is thus obtained for each level in the pyramid using a selection principle that is based on the absolute maximum luminosity or on component average or other selection principles.
  • the fused image (for example, the image 123) of the fused pyramid is reconstructed.
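A two-level pyramid fusion along these lines might look as follows; the 2x2 averaging pyramid and the max-absolute-detail selection rule are simplifying assumptions, not the patent's exact scheme:

```python
import numpy as np

def down(img):
    """One pyramid level: 2x2 block averaging (filter + subsample)."""
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):
    """Upsample by pixel replication back to the finer grid."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def fuse(a, b):
    """Fuse two equally-sized, spatially aligned images: keep the stronger
    detail coefficient per pixel, average the coarse base level."""
    base_a, base_b = down(a), down(b)
    lap_a, lap_b = a - up(base_a), b - up(base_b)       # detail (Laplacian) levels
    lap_f = np.where(np.abs(lap_a) >= np.abs(lap_b), lap_a, lap_b)
    base_f = (base_a + base_b) / 2.0
    return up(base_f) + lap_f                            # inverse transformation

rng = np.random.default_rng(0)
img = rng.random((8, 8))
# Sanity check: fusing an image with itself must reproduce it exactly.
self_fused = fuse(img, img)
```

Real multi-level pyramids simply repeat `down`/`up` recursively and apply the same selection principle at every level.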
  • The composition block 45 can comprise a first processing block 45a, which is operatively associated with the first detection device 10 in order to receive the first identification image 11 and to generate a corresponding first auxiliary identification image 12.
  • the first auxiliary image 12 has lower resolution and higher scale than the first identification image 11 from which it has been generated; in other words, the first auxiliary image 12 is smaller and less defined than the first starting image 11.
  • The composition block 45 comprises a second processing block 45b, which is operatively associated with the second detection device 20 in order to receive the second acquisition image 21 that is associated with the first identification image 11 and to generate a corresponding second auxiliary acquisition image 22.
  • the second auxiliary acquisition image 22 has a lower resolution and a higher scale than the first image 21 from which it has been generated; in other words, the second auxiliary image 22 is smaller and less defined than the second starting image 21.
  • the auxiliary images 12, 22 have the same resolution; furthermore, the first and second auxiliary images 12, 22 have the same reduction scale relative to the real dimensions of the imaged area of human body.
  • composition block 45 can further comprise combination means 45d that are operatively associated with the processing blocks 45a and 45b for generating a fusion image 123 as a function of the combination of the first and second auxiliary images 12, 22.
  • The combination means 45d have the task of combining with each other the information incorporated in the first auxiliary image 12 and in the second auxiliary image 22, so as to obtain said fusion image 123, in which the relevant data from the first and second images 11, 21 (from which the images 12, 22 have been generated) are suitably combined.
  • The result is morphological-functional imaging, i.e. an image simultaneously containing morphological and functional information on a same portion of tissue.
  • Once the fusion image 123 has been obtained, the control unit 40 sends it to the graphic interface 70.
  • The operator then activates the generation of said command signal 100, as a function of said fusion image 123, by means of the suitable moving control system 46; i.e. according to what is shown in the fusion image 123, the control unit 40 provides to move the surgical element 50 such that the latter can properly operate, preferably following a confirmation command entered by the operator after checking the contents of the fusion image 123.
  • the procedure of tissue sampling via a needle will be described below: the steps of acquisition, processing, synchronization and fusion as described above generate a video image with a spatial and functional content that is subjected to the operator's interpretation.
  • the system is capable of providing the spatial coordinates, relative to a known reference system, of the location in which the operator decides to take some tissue.
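Mapping a location selected in the fused image to spatial coordinates in the known reference system can be sketched as an affine image-to-world transform; the origin, pixel spacing and orientation values below are purely illustrative:

```python
import numpy as np

def pixel_to_world(index, origin, spacing, direction):
    """Convert a (row, col) pixel index into spatial coordinates:
    world = origin + direction @ (spacing * index)."""
    index = np.asarray(index, dtype=float)
    return np.asarray(origin) + np.asarray(direction) @ (np.asarray(spacing) * index)

# 0.5 mm pixels, image axes aligned with the table reference system.
target = pixel_to_world(index=(4, 6),
                        origin=(10.0, 20.0),
                        spacing=(0.5, 0.5),
                        direction=np.eye(2))
```

The resulting coordinates are what would be handed to the kinematics routines that position the needle or pointer.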
  • The central processing unit, by means of suitable routines of direct and inverse kinematics (by way of example, refer to the Denavit-Hartenberg matrices), guides the movement and orientation in order to reach an operative position of the pointer and of any surgical element, such as the biopsy needle. After the operative position has been reached, the movement and actuation of the medical element can be commanded, i.e., according to the example, the biopsy sampling can be carried out, or the laser emitter indicating the region to be operated can be activated.
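By way of illustration, direct (forward) kinematics with Denavit-Hartenberg parameters for a planar two-link positioner might be sketched as follows; the link lengths and joint angles are hypothetical, not taken from the patent:

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(joint_angles, link_lengths):
    """Chain the per-joint DH transforms; returns the tool-tip transform."""
    T = np.eye(4)
    for theta, a in zip(joint_angles, link_lengths):
        T = T @ dh_matrix(theta, d=0.0, a=a, alpha=0.0)
    return T

# Two links of 0.3 m and 0.2 m, both revolute joints at 0 rad:
# the tool tip sits fully extended at x = 0.5 m.
tip = forward_kinematics([0.0, 0.0], [0.3, 0.2])
```

Inverse kinematics then solves the reverse problem: given the target coordinates of the sampling site, find the joint angles that place the needle tip there.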
  • the equipment 1 further comprises a third detection device 30, in order to make the operation of the equipment 1 further accurate and reliable.
  • The third detection device 30 can be, for example, an ultrasound and/or radiographic and/or nuclear magnetic resonance scanning system; this device plays a major role in the acquisition of morphological images of hard and/or soft tissues.
  • The third detection device 30 comprises an ultrasound and/or radiographic and/or nuclear magnetic resonance probe.
  • The control unit 40 is operatively associated with the third detection device 30 in order to combine, within said coordinate system, the detection by the third detection device 30 with the identification of the first device 10 and/or with the acquisition of the second device 20.
  • The images detected by the third detection device 30 are processed so as to be referred to the same spatial coordinate system as used for the identification carried out by the first device 10 and/or for the acquisition carried out by the second device 20.
  • The detections of the third device 30 can thus be superimposed on what has been detected by the first and/or second devices 10, 20, such that the various available information can be simultaneously used for moving the surgical element 50.
  • the third detection device 30 is operatively associated with the filtering block 41a, such that the images provided by the third device can be filtered from disturbances or noise that may be present.
  • the third detection device 30 is also operatively associated with the adaptation unit 41b, such that the images detected by the third device 30 can be referred to the same scale as the images from the first and/or second detection devices 10, 20.
  • the selected reference scale can be that of the first image 11, that of the second image 21, that of the images detected by the third device 30, or a scale other than the preceding ones.
  • the third detection device 30 is also operatively associated with the spatial alignment block 41c, such that the images detected by the third device 30 can be referred to the same spatial reference as the images from the first and/or second detection devices 10, 20.
  • the third detection device 30 is also operatively associated with the synchronization means 42 such that the signal incorporating the third detection images 31 is synchronized with the signals 60a that are generated by the sensors 60.
  • the third detection device 30 is also operatively associated with the time alignment means 43, such that it can carry out detection simultaneously with the detection that is carried out by the devices 10 and 20. The detection by this third device 30 is thus referred to the same time scale as used for the acquisition of the first device 10 and/or the identification of the second device 20.
  • the third detection device 30 has the same image detection frequency (but it may be also a multiple or sub-multiple of the frequency) as the first and/or second detection devices 10, 20, and particularly, the detection of the third detection device 30 is in phase with the detection of the first and/or second detection devices 10, 20; thereby, a substantially simultaneous detection of the devices used can be obtained.
  • the third detection device 30 provides a plurality of third detection images 31, which are time-ordered relative to each other.
  • the reconstruction and composition blocks 44 and 45 can also be associated with the third detection device 30 to combine the images acquired by the latter with those of the first and/or second detection devices 10, 20 to obtain a corresponding fusion image 123 and a command signal 100 that is preferably intended for the moving element 51.
  • the composition block 45 can comprise a third processing block 45c that is operatively associated with the third detection device 30 to receive at least one third image 31 and to generate a corresponding third auxiliary image 32.
  • the third auxiliary image 32 has lower resolution and higher scale than the third image 31 from which it has been generated.
  • the resolution and scale provided by the third auxiliary image 32 are substantially the same as provided by the first auxiliary image 12.
  • said combination means 45d can also be operatively associated with the third detection device 30 for generating the fusion image 123 also as a function of the third auxiliary image 32.
  • the command signal 100 can be generated also as a function of what has been detected by the third device 30.
  • the operator is provided with the possibility of activating the generation of the command signal 100 or, alternatively, where the detections carried out prove insufficient, of activating one or more of the detection devices 10, 20, 30.
  • the first processing block 45a can be arranged for generating a plurality of first auxiliary images 12 from an individual first image 11. In this case, these first auxiliary images 12 have, progressively, a lower resolution and a higher scale than the first starting image 11.
  • a virtual pyramid is generated, which is defined by the sequence of first auxiliary images 12, in which each level, moving upward toward the vertex, is a filtered and subsampled copy of the image of the level below.
  • the lowermost level in the pyramid thus consists of the first source image 11; the uppermost levels have a lower resolution and a higher scale than the original image 11.
  • the latter can, in fact, generate a plurality of third auxiliary images 32 that have, in a progressive sequence, a lower resolution and higher scale than the third source image 31.
  • first and second auxiliary images 12, 22 that occupy corresponding levels have the same resolution and the same scale as the corresponding first and second source images 11, 21. Furthermore, first and third auxiliary images 12, 32 that occupy corresponding levels also have the same resolution and the same scale as the corresponding first and third source images 11, 31.
  • the control unit 40 can be arranged for carrying out a volume reconstruction and a consequent tomographic imaging of the region of interest, starting from one or more scintigraphic detections that are obtained via the first detection device 10; in practice, a series of acquisitions is carried out from different points of view, which can be combined in order to obtain the functional information according to the tomographic technique.
  • the various functional blocks comprised within the control unit 40 have been separately and individually presented only to explain the different functionalities of the control unit 40; actually, however, the control unit 40 can be made as an individual electronic device, which is suitably programmed to carry out the operations described above. The invention achieves considerable advantages.
  • the equipment according to the invention allows transferring the precision and reliability of the detections that have been carried out during the diagnostic step to the surgical step. Thereby, the total quality and therapeutic effectiveness of the operation are significantly improved, while reducing the duration of the latter and the discomfort caused to the patient. Furthermore, the equipment according to the invention allows tracing, localizing and pointing out in a precise manner, directly on the patient and on the same site where the operation will be carried out, the exact location in which the operation has to be carried out, thereby significantly reducing the positioning and alignment errors that are generated when the diagnostic scanning and the therapeutic operation are carried out at a separate place and time.
  • a further significant advantage offered by the device is the possibility of repeating the diagnostic scanning several times, during (i.e. on-line) and/or at the end of the operation, so as to be capable of checking the result simultaneously with the operation, either by following the time dynamics of the pathologic and/or physiologic event being inspected, or as a final check, which verifies that a sampling has actually been carried out according to the preceding diagnostic indications. It should be observed that a further advantage derives from the possibility of using the device, once suitably sized, for small animals: i.e. all the diagnosis functions are transferred "in-vivo" for real-time image guiding of any medical element also on those subjects (by way of non-limiting example, mice and rats) that are used for medical and pharmacologic research.
  • the particular solution using several detection devices advantageously offers the possibility of integrating various types of information as they come from distinct dedicated diagnostic instruments.
  • the possibility of carrying out a fusion of the information incorporated within the images provided by the various detection devices is notably advantageous when the apparatus is used in-vivo on patients or animals.
  • the introduction of integrated techniques gains importance not only in guided surgery but generally also in those examinations on pathologies that require a great precision, such as neoplasias at an initial stage.
  • the possibility offered by particular embodiments of the invention of using different morphological techniques confers a high degree of flexibility within the possible applications, in that the morphological and functional techniques can be selected and adapted to particular requirements, such as machines for operating rooms, diagnosis machines and biopsy sampling systems, and machines for in- vivo pharmacologic and diagnostic research on animals.
  • based on clinical inspections, the equipment will be provided with ultrasound or X-ray techniques integrated with scintigraphic techniques, by dimensioning the detection field to the pathology size or to the most suitable regions to be explored in the scintigraphic mode, by means of linear and/or tomographic scanning.
  • the possibility of displaying the information revealed (acquired) by the detection device in accordance with an example of the invention is an advantage in pharmacologic kinetics, in that it offers an added value in the qualitative and quantitative study of the drug behaviour and of the patient's reaction thereto.
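The virtual pyramid of first auxiliary images 12 described in the points above (each level a filtered and subsampled copy of the level below, with progressively lower resolution and higher scale than the source image 11) can be sketched as follows. The 2x2 box filter and the number of levels are illustrative assumptions, not choices stated in this description:

```python
import numpy as np

def downsample(img):
    """One pyramid step: 2x2 box filter (an illustrative smoothing choice),
    then subsampling. Each call yields a copy with half the resolution,
    i.e. a higher (coarser) scale, mirroring the auxiliary images."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2]
            + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def build_pyramid(source, levels=3):
    """Return [source, aux_1, aux_2, ...]: the lowest level is the source
    image (e.g. a first image 11); each upper level is a filtered and
    subsampled copy of the level below."""
    pyramid = [source]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    return pyramid

pyramid = build_pyramid(np.ones((16, 16)), levels=3)
print([p.shape for p in pyramid])  # [(16, 16), (8, 8), (4, 4), (2, 2)]
```

Corresponding pyramids built from the first, second and third images 11, 21, 31 then have level-by-level matching resolutions and scales, which is what allows them to be combined.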

Abstract

Equipment for moving medical elements comprising one or more detection devices (10, 20, 30) to identify a region to be operated on in a patient's body, and a medical element (50) to operate on this region; the equipment (1) further comprises a control unit (40) that is operatively associated with said devices (10, 20, 30) in order to receive a signal representing at least the identification carried out by one of the latter and to generate a corresponding command signal (100) in order to move said medical element (50) as a function of said identification.

Description

APPARATUS FOR MOVING SURGICAL INSTRUMENTS
DESCRIPTION
Field of the invention
The object of the present invention is equipment for moving medical elements, such as, by way of non-limiting example, surgical elements.
Technological background of the invention
With reference to diagnostic activities, various types of diagnostic imaging are known to be currently available in order to obtain digital imaging of anatomic details, organs and tissues, which allows identifying morphological and/or functional characteristics that may be traced to particular physiological or pathological conditions. Ultrasound, radiographic, nuclear magnetic resonance and scintigraphic detections can be considered as non-exhaustive examples. These detections generally have the purpose of generating digital images of organs and/or tissues, which point out morphological and/or functional properties, even with the aid of contrast means and/or injectable radioactive tracers.
With reference to surgical activities (also when carried out for diagnostic purposes), there is a need to accurately identify a typically limited region which must be subjected to operation by means of a surgical element; the operation can entail incision and/or excision of a precise part of tissue that is located in a given anatomic region. For example, for the surgeon carrying out the removal of a body inner tissue to be subjected to biopsy, the accuracy in the detection of neoplasias appears to be a crucial point in order to be suitably ensured that the tissue intended to be removed and/or taken has actually been excised. At present, this localization occurs by analyzing the results of the examinations that have been carried out during the diagnosis, and consequently, by identifying the region to be operated on the surface of the animal's or patient's body, which corresponds to the inner portion of interest (organ, tissue or the like). In several cases, a mark is made on the body's surface, which indicates the region to be operated. In any case, the identification of this region on the body's surface is carried out by means of a visual analysis of the body by an operator in charge, such as a surgeon.
This methodology is unsatisfactory, and can lead to mistakes. The Applicant has observed that the exact position of the body's inner portion to be operated strongly depends on the conditions (for example, the position of the body, the pathology development stage, possible deformations of the tissue induced by a surgical tool) occurring in the animal or patient just before the surgical operation, or during this operation; these conditions are different from those that occurred during the diagnostic examination.
Summary of the invention
The object of the present invention is to provide equipment for moving medical elements that is alternative to known equipment, and, for example, that overcomes the limitations of the above-mentioned known techniques on the localization of a patient's body region to be operated. The object of the present invention is achieved by an equipment for moving medical elements as described in Claim 1, and preferred embodiments thereof as described in Claims 2 to 37. The object of the present invention is also a method for using an equipment for moving medical elements as defined in Claim 38, and preferred embodiments thereof as defined in Claims 39 and 40. Further characteristics and advantages will be better understood from the detailed description of a preferred, though non-exclusive, embodiment of the equipment according to the invention; this description is given with reference to the attached Fig. 1, which also has a merely exemplary, and thus non-limiting, purpose, and which shows a block diagram of the equipment according to the invention.
With reference to the attached figure, reference numeral 1 generally designates equipment for moving medical elements according to the invention, which is employed, for example, on laboratory animals and human beings. The equipment 1 is capable of carrying out in-vivo diagnoses by multi-modal and real-time imaging; the equipment 1 can further carry out the in-vivo localization of morphological and/or functional alterations and control the movement of surgical elements. The term "in-vivo" is meant herein to refer to diagnoses and detections that are carried out on living beings that are preferably anaesthetized.
The term "multi-modal" is meant herein to refer to the possibility of using several detection methodologies.
The equipment 1 comprises a medical element 50, which is moved as a function of preset detections carried out and preset controls imposed by the operator, as will be better detailed below.
This medical element 50 can comprise, for example, a surgical element and/or a pointer element. The surgical element can be any element that is capable of surgically operating on a patient's body region, and can comprise, for example, a tool suitable to carry out incisions, holes, injections, or the taking of liquid or solid samples, suitable to be operated according to a suitable movement (e.g. a biopsy needle).
The pointer element is capable of operating on a patient's body region by indicating and pointing out the latter. Advantageously, the pointer element is optical and comprises a device suitable to send a visible radiation beam impinging on a limited area of a surface in this region (for example, also by projecting an image, such as a point, a circle, an X, or the like) such as to make it identifiable and seen by the naked eye with sufficient precision. The pointer element can preferably comprise a laser-emitting device. According to a first embodiment, the medical element 50 includes both the surgical element and the pointer element, which can be suitably moved in an automated manner. In accordance with a second embodiment, the medical element 50 includes only the pointer element, whereas the tools of the surgical element can be of a manual type and not moved in an automated manner. In accordance with a third embodiment, the medical element 50 is of a surgical type and includes the surgical tools thereof, but is not provided with a pointer element. If not stated otherwise, exemplary reference will be made herein below to this third embodiment, though the description below can also be applied to the other embodiments of the medical element 50. Suitable moving elements 51 can be associated to the surgical element 50, such as an electric and/or pneumatic motor, which operatively drives the element 50 to move the latter in one or more directions.
Both the surgical element 50 and the moving means 51 thereof can be mounted to a suitable support structure (not illustrated herein). The surgical element 50 is preferably mounted to the same structure to which at least one of the first, second and third detection devices 10, 20, 30 is mounted, which are described herein below.
The equipment 1 comprises a first detection device 10 for identifying a patient's region of interest to be operated.
The first detection device 10 can be suitable, according to an example, to carry out a computer-aided Single Photon Emission Computed Tomography (SPECT), and can be provided with a suitable gamma camera. In accordance with a particular example, the device 10 is a light-weight, small-sized scintigraphic system with a small scan area and high spatial resolution. The first detection device 10 acquires, with high sensitivity, functional types of data, i.e. concerning the behaviour of a specific gamma-emitting radiopharmaceutical that is intravenously injected into the patient.
Generally, the first detection device 10 provides a plurality of first identification images 11 that are time-ordered, i.e. detected in subsequent time instants from each other. The first detection device 10 can be associated with a respective first moving element 51c that is arranged for moving the first detection device 10, as well as with the moving system 51 in general.
Thereby, the acquisition process performed by the first device 10 can be carried out by a suitable mutual moving between the detection device 10 itself and the portion to be examined.
The first moving element 51c can be driven by a control device by means of which a user can define the scanning typology as desired.
The equipment 1 further comprises a central control unit (or central processing unit) 40 that is operatively associated at least with the first detection device 10 in order to process the images detected by the latter, and to operate said surgical element 50 as a function of these images by means of a suitable command signal 100.
Particularly, the control unit 40 can comprise a pre-processing block 41 being provided with a filtering block 41a that is arranged for filtering the noise and disturbances of the signal received from the first detection device 10.
The pre-processing block 41 can further comprise a scaling block 41b, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a) such that the corresponding display occurs in a determined scale. The pre-processing block 41 can further comprise a spatial alignment block 41c, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a or scaling block 41b) such that the corresponding display occurs in a determined reference system. As regards scaling and spatial alignment, the following technique is to be considered as a preferred, though not exclusive, example: to guide the alignment, the absolute coordinates of several pixels (2D case) and voxels (3D case) can be used. The transformation between two coordinate systems is given by the following expression:
x = A·y + T
where x denotes the coordinates of the "aligned" image (i.e. the image suitably scaled and referred to the desired reference system), y the coordinates of the initial image (for example, the first identification image 11), A the scaling and rotation matrix and T the translation matrix. Accordingly, by knowing the absolute coordinates of 4 non-coplanar points, by means of specific mathematical operations, the transformation matrices A and T are obtained, which allow superimposing the aligned image onto the starting image.
Generally, as the points after transformation do not coincide with the point grid of the starting image, an interpolation is required in order to obtain the intensity values of the points just calculated.
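As a minimal sketch of the alignment step just described, the matrices A and T of x = A·y + T can be recovered from known point correspondences by least squares; here the 2D (pixel) case is shown, and numpy together with the synthetic four-point example are illustrative assumptions rather than the method prescribed above:

```python
import numpy as np

def estimate_transform(y_pts, x_pts):
    """Estimate A (scaling/rotation) and T (translation) in x = A @ y + T
    from corresponding points of the initial and aligned images, by
    ordinary least squares on the augmented system [y | 1] @ [A^T; T^T] = x."""
    y_pts = np.asarray(y_pts, float)
    x_pts = np.asarray(x_pts, float)
    ones = np.ones((len(y_pts), 1))
    M, _, _, _ = np.linalg.lstsq(np.hstack([y_pts, ones]), x_pts, rcond=None)
    A = M[:-1].T  # scaling and rotation matrix
    T = M[-1]     # translation vector
    return A, T

# Four 2D correspondences under a known transform: scale by 2, shift by (1, -1)
y = [(0, 0), (1, 0), (0, 1), (1, 1)]
x = [(2 * a + 1, 2 * b - 1) for a, b in y]
A, T = estimate_transform(y, x)
print(A)  # ~[[2, 0], [0, 2]]
print(T)  # ~[1, -1]
```

As the description notes, after transforming, the points generally do not fall on the grid of the starting image, so an interpolation of intensity values would follow; that step is omitted here.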
Preferably, the equipment 1 further comprises one or more sensors 60 that are operatively associated with the patient's body.
The sensors 60 have the task of detecting the patient's heart rate, the movements associated with breathing and/or the movements associated with tremor events. Generally, the sensors 60 are arranged to detect physiological signals, such as mechanical and/or electrical and/or electromagnetic and/or electrochemical signals and/or the like, preferably periodical, which are generated by the patient's body. The sensors 60 are connected to the processing unit 40 such that "blurring" events (i.e. interference between the vibrations generated by the patient's body and the detections carried out by the first device 10), which are due to the movements of the subject being observed and/or to other events that disturb the detections carried out by said device, can be avoided. The processing unit 40, in fact, is provided with a synchronization block 42, which is operatively associated with at least the first detection device 10 in order to synchronize the signal incorporating the first images 11 with the signals 60a generated by the sensors 60.
Conveniently, the synchronization block 42 is connected downstream of the preprocessing block 41, such that the synchronization block 42 can operate on images that have been processed and filtered beforehand. This synchronization is carried out particularly with reference to the frequencies and amplitudes of the movements that are generated by the patient's body.
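One simple way such synchronization with a periodic physiological signal can be realized is gating: only image frames acquired within a chosen phase window of the signal 60a (for example, the cardiac cycle) are retained, so that frames likely to be blurred by body motion are discarded. This is an illustrative sketch under assumed definitions (peak-to-peak phase, a fixed window), not the specific scheme of the synchronization block 42:

```python
import numpy as np

def gate_frames(frame_times, signal_times, signal, window=(0.0, 0.2)):
    """Return the indices of the frames whose acquisition time falls in a
    given phase window of a periodic physiological signal.

    Phase is measured from the most recent signal peak, as a fraction of
    the local peak-to-peak period (an illustrative definition)."""
    peaks = [signal_times[i] for i in range(1, len(signal) - 1)
             if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]]
    kept = []
    for k, t in enumerate(frame_times):
        prev = [p for p in peaks if p <= t]
        nxt = [p for p in peaks if p > t]
        if prev and nxt:
            phase = (t - prev[-1]) / (nxt[0] - prev[-1])
            if window[0] <= phase <= window[1]:
                kept.append(k)
    return kept

# Synthetic 1 Hz "heartbeat" signal sampled at 100 Hz, frames at 10 Hz over 3 s
ts = np.arange(0.0, 3.0, 0.01)
ecg = np.sin(2 * np.pi * ts)
frames = np.arange(0.0, 3.0, 0.1)
kept = gate_frames(frames, ts, ecg)
```

Only the frames acquired shortly after each signal peak survive, which is the sense in which the detections are synchronized with the signals 60a.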
The equipment 1 can further comprise a graphic communication interface 70, by means of which an operator can interact with the equipment 1.
The communication interface 70 is provided with data presentation means, such as a monitor or similar display device, by means of which it provides the operator with the information that is detected by the first detection device 10. The communication interface 70 is further provided with data input means, which can be employed by the operator to input information and/or commands. Particularly, when the result of the first detection has been displayed, the operator can decide whether to proceed with a further inspection, by means of a subsequent detection which will be described below, or to input a command to the equipment 1 in order to activate the generation of said command signal 100, such that the surgical element 50 can be moved according to the data available so far. The subsequent detection can be carried out using a second detection device 20 and/or a third detection device 30, as will be discussed herein below. The equipment 1 can comprise a second detection device 20, in order to acquire a plurality of three-dimensional images of at least one portion of the patient's body; preferably, this portion comprises the region identified by the first detection device 10.
In other words, the second detection device 20 has the task of acquiring geometrical (or morphological) information at least on said region. This acquisition can be generally carried out throughout the patient's body, however, in order to limit the duration and complexity of the operation, as well as the discomfort caused to the patient, the acquisition of the three-dimensional images is provided to be carried out only on a portion of the patient's body. Practically, the second detection device 20 can be a 2D and/or 3D scanning system suitable for acquiring the external morphology of a limited region of the patient, by means of a plurality of two-dimensional and three-dimensional images; for example, the second detection device 20 can be a structured light acquisition system.
Generally, the second detection device 20 acquires a plurality of second acquisition images 21 that are time-ordered, i.e. detected in subsequent time instants from each other. The second detection device 20 can be associated with a respective second moving element 51d that is arranged for moving the second device 20, and belonging to the general moving system 51.
Thereby, the acquisition process performed by the second device 20 can be carried out by a suitable mutual moving between the detection device 20 and the portion to be examined.
The second moving element 51d can be driven by a command device, by means of which a user can define the type of scanning as desired; preferably this command device is the same command device that operatively drives the first moving element 51c. The second detection device 20 is operatively associated with the filtering block 41a, such that the second images 21 can be filtered to remove any disturbances or noise that may be present.
The second detection device 20 can be operatively associated with the synchronization block 42, such that the signals incorporating the second images 21 can be synchronized with the signals 60a generated by the sensors 60. Advantageously, the control unit 40 is operatively associated both with the first and second detection devices 10, 20 in order to combine the identification of the first device 10 with the acquisition carried out by the second device 20; following this combination, the control unit 40 generates a corresponding display signal for the interface 70, which helps the operator to generate a command 100 for moving the surgical element 50 as a function of this combination.
The command signal 100 can be sent to the moving means 51, which act on the architecture and thus on the surgical element 50, such that the latter can be moved as desired.
As stated above, more particularly, the control unit 40 can comprise, within the pre-processing block 41, an adaptation unit 41b, which is arranged for referring the images that are detected by the first and second detection devices 10 and 20 to a same (either reduction or magnification) scale, such that the images result to be comparable to each other; this operation is generally indicated as the "scaling".
The selected reference scale can be that of the first identification images 11, that of the second acquisition images 21, or a scale other than the preceding ones. The control unit 40 can further comprise, within the pre-processing block 41, a spatial alignment block 41c in order to refer the images provided by the first and second detection devices 10, 20 to a same spatial reference system. In other words, the detections carried out by the first and second devices 10, 20 are combined in a same spatial reference system, such that the various images from the two devices 10, 20 can be superimposed and the information provided in said first and second images 11, 21 is simultaneously available. Preferably, the spatial alignment block 41c is connected downstream of the adaptation unit 41b such that the detected images are inserted in the same spatial reference after they have been all transformed to the same scale. The synchronization block 42 can be connected to the filtering block 41a and/or the spatial alignment block 41c; thereby, the images provided by the second detection device 20 can be also synchronized with the signals 60a from the sensors 60.
Furthermore, by means of the connection interface 70, the operator is provided both with the information detected by means of the first detection device 10 and with the information detected by means of the second detection device 20; the operator can, at this stage, decide whether to activate the generation of said command signal 100, or to input a command in the equipment 1 in order to carry out further detections. The control unit 40 can also comprise time alignment means 43, in order to refer the identification of the first device 10 and the acquisition of the second device 20 to a same time scale: this results in a command imposing the simultaneous acquisition by the two devices 10 and 20.
The time alignment means 43 have thus the task of setting the first identification images 11 (detected by the first device 10) and the second acquisition images 21 (detected by the second device 20) in a same time reference system.
Preferably, the first and second detection devices 10, 20 have a same image detection frequency (or one is a multiple of the other), and are particularly in phase with each other, such as to obtain said simultaneous detection. In greater detail, the time alignment means 43 ensure the simultaneity of the identifications by the first device 10 and/or the acquisitions carried out by the second device 20, obtaining said time alignment.
Practically, the purpose of the time alignment means 43 is to associate each of said first identification images 11 with at least one of said second acquisition images 21, these first and second images 11, 21 relating to an identification and an acquisition in a same instant, respectively. This is necessary for carrying out a dynamic multimodal display.
Accordingly, each first identification image 11, which is directly obtained by a detection of the first device 10 that is managed by the time alignment means 43, is associated with a first acquisition image 21, which is also directly obtained by the second detection device 20 that is managed by means of the same alignment means 43.
Similarly, a second identification image 11, which is obtained like said first image 11 and temporally subsequent thereto, is associated with a second acquisition image 21 that is also obtained like said first image 21, and temporally subsequent thereto. The same work is done for the subsequent images. Thereby, two sequences are generated, which are synchronized to each other, of images 11 and 21 suitable to represent the time evolution of the events detected by the devices 10 and 20. The control unit 40 can further comprise a reconstruction block 44, which is operatively associated at least with the first and second detection devices 10, 20; preferably, the reconstruction block is connected downstream of the time alignment means 43.
The reconstruction block 44 has the task of obtaining a suitable reconstruction, relative to the scanning carried out by the devices, of the signals detected by the detection devices 10, 20. By way of example, suitable mathematical algorithms process the signals and/or information from the block 43 such as to generate a planar image and/or a volume reconstruction and/or a surface reconstruction in space. By way of non-limiting example, mathematical algorithms for volume reconstruction can be "Filtered Back Projection" (FBP), and/or "Ordered Subset Expectation Maximisation" (OSEM), and/or the like.
The control unit 40 can further comprise a composition block 45, which is operatively associated with at least the first and second detection devices 10, 20. Practically, the composition block 45 is arranged to combine with each other (for example, by means of a pyramidal technique, "wavelet", and the like) the information incorporated in the images supplied by the detection devices 10, 20 such as to obtain a corresponding fusion image 123.
The composition block 45 can be operatively associated with the synchronization 42, time alignment 43, and reconstruction 44 blocks, such that the signal incorporating the fusion image 123 can be synchronized with the signals 60a generated by the sensors 60.
The following fusion technique using a Laplace pyramid transform has to be considered as a preferred but not exclusive example. According to this technique, a pyramid is defined as a sequence of auxiliary images where each level in the pyramid is a filtered and subsampled copy of the preceding level. The lowermost level in the pyramid has the same scale as the original image (for example, the first identification image 11) and contains the information of higher resolution than the remaining levels in the pyramid. The highest levels in the pyramid have a lower resolution, but they have a higher scale than the original image. The basic concept is making a pyramid for the fused image (for example, the fusion image 123) from the pyramids of each starting image (for example, the first and second images 11, 21).
The fusion image 123 is then obtained by operating an inverse transformation on the pyramid. The first step is making the pyramid for each source image; the fusion is thus obtained for each level in the pyramid using a selection principle that is based on the absolute maximum luminosity or on component average or other selection principles.
Finally, the fused image (for example, the image 123) of the fused pyramid is reconstructed.
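The Laplace-pyramid fusion just described can be sketched as follows, using the maximum-absolute-value selection principle (one of the options mentioned above). The 2x2 box filter, the nearest-neighbour expansion and the even image dimensions are simplified, illustrative assumptions:

```python
import numpy as np

def down(img):
    # 2x2 box filter + subsample (illustrative; assumes even dimensions)
    return (img[0::2, 0::2] + img[1::2, 0::2]
            + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def up(img):
    # nearest-neighbour expansion back to double size (illustrative)
    return np.kron(img, np.ones((2, 2)))

def laplacian_pyramid(img, levels):
    """Each level holds the detail lost by one filter-and-subsample step;
    the top holds the coarsest, highest-scale residual."""
    pyr = []
    for _ in range(levels):
        coarse = down(img)
        pyr.append(img - up(coarse))
        img = coarse
    pyr.append(img)
    return pyr

def fuse(img_a, img_b, levels=2):
    """Fuse two source images (e.g. a first image 11 and a second image 21):
    build a pyramid per source, select per pixel the component of larger
    absolute value, then invert the transform to reconstruct the fusion."""
    pa = laplacian_pyramid(img_a, levels)
    pb = laplacian_pyramid(img_b, levels)
    fused = [np.where(np.abs(a) >= np.abs(b), a, b) for a, b in zip(pa, pb)]
    img = fused[-1]
    for detail in reversed(fused[:-1]):
        img = up(img) + detail
    return img

a = np.zeros((8, 8)); a[2, 2] = 1.0  # bright feature in source A
b = np.zeros((8, 8)); b[5, 6] = 1.0  # bright feature in source B
f = fuse(a, b)
```

The reconstructed fusion image retains the salient feature of each source, which is the behaviour the fusion image 123 is meant to provide.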
To this purpose, the composition block 45 can comprise a first processing block 45a that is operatively associated with the first detection device 10 to receive at least one of said first identification images 11 and to generate a corresponding first auxiliary identification image 12. Particularly, the first auxiliary image 12 has a lower resolution and a higher scale than the first identification image 11 from which it has been generated; in other words, the first auxiliary image 12 is smaller and less defined than the first starting image 11.
The composition block 45 comprises a processing block 45b which is operatively associated with the second detection device 20 in order to receive the second acquisition image 21 that is associated with the first identification image 11 and to generate a corresponding second auxiliary acquisition image
22.
Preferably, the second auxiliary acquisition image 22 has a lower resolution and a higher scale than the first image 21 from which it has been generated; in other words, the second auxiliary image 22 is smaller and less defined than the second starting image 21.
In the preferred embodiment, the auxiliary images 12, 22 have the same resolution; furthermore, the first and second auxiliary images 12, 22 have the same reduction scale relative to the real dimensions of the imaged area of human body.
The composition block 45 can further comprise combination means 45d that are operatively associated with the processing blocks 45a and 45b for generating a fusion image 123 as a function of the combination of the first and second auxiliary images 12, 22.
Practically, the composition means 45d have the task of combining with each other the information incorporated in the first auxiliary image 12 and in the second auxiliary image 22, such as to obtain said fusion image 123, in which the relevant data from the first and second images 11, 21 (from which the images 12, 22 have been generated) are suitably combined.
This allows obtaining, for example, a morphological-functional imaging, i.e. an image simultaneously containing morphological and functional information of a same portion of tissue. After the fusion image has been obtained 123, the control unit 40 sends it to the graphic interface 70. The operator then activates the generation of said command signal 100, as a function of said fusion image 123, by means of the suitable moving control system 46; i.e. according to what is shown in the fusion image 123, the control unit 40 provides to move the surgical element 50 such that the latter can properly operate, preferably following a confirm command being entered by the operator, after the latter has checked the contents of the fusion image 123.
By way of non-limiting example, the procedure of tissue sampling via a needle (typically, a biopsy) will be described below: the steps of acquisition, processing, synchronization and fusion as described above generate a video image with a spatial and functional content that is subjected to the operator's interpretation. The system is capable of providing the spatial coordinates, relative to a known reference system, of the location in which the operator decides to take some tissue. The central processing unit, by means of suitable routines of direct and inverse kinematics (by way of example, refer to the Denavit-Hartenberg matrixes), guides the moving and orientation, in order to adopt an operative position of the pointer and any surgical element, such as the biopsy needle. After the operative position has been reached, the moving and actuation of the medical element can be commanded, i.e. according to the example, the biopsy sampling, or the activation of the laser emitter indicating the region to be operated can be carried out.
In a preferred embodiment, the equipment 1 further comprises a third detection device 30, in order to make the operation of the equipment 1 further accurate and reliable. The third detection device 30 can be, for example, a ultra-sound and/or radiographic and/or nuclear magnetic resonance scanning system; this device plays a major role in the acquisition of morphological images of hard and/or soft tissues.
Preferably, the third detection device 30 comprises a ultrasound and/or radiographic and/or nuclear magnetic resonance probe. The control unit 40 is operatively associated with the third detection device 30 in order to combine, within said coordinate system, the detection by the third detection device 30 with the acquisition of the first device 10 and/or with the identification of the second device 20.
In other words, the images detected by the third detection device 30 are processed such as to be referred to the same spatial coordinate system as used for the acquisition carried out by the first device 10 and/or for the identification carried out by the second device 20.
Thereby, the detections of the third device 30 can be superimposed to what has been detected by the first and/or second devices 10, 20, such that the various available information can be simultaneously used for moving the surgical element 50.
Particularly, the third detection device 30 is operatively associated with the filtering block 41a, such that the images provided by the third device can be filtered from disturbances or noise that may be present. Preferably, the third detection device 30 is also operatively associated with the adaptation unit 41b, such that the images detected by the third device 30 can be referred to the same scale as the images from the first and/or second detection devices 10, 20.
The selected reference scale can be that of the first image 11, that of the second image 21, that of the images detected by the third device 30, or a scale other than the preceding ones.
Preferably, the third detection device 30 is also operatively associated with the spatial alignment block 41c, such that the images detected by the third device 30 can be referred to the same spatial reference as the images from the first and/or second detection devices 10, 20. Preferably, the third detection device 30 is also operatively associated with the synchronization means 42 such that the signal incorporating the third detection images 31 is synchronized with the signals 60a that are generated by the sensors 60. Preferably, the third detection device 30 is also operatively associated with the time alignment means 43, such that it can carry out detection simultaneously with the detection that is carried out by the devices 10 and 20. The detection by this third device 30 is thus referred to the same time scale as used for the acquisition of the first device 10 and/or the identification of the second device 20.
In the preferred embodiment, the third detection device 30 has the same image detection frequency (but it may be also a multiple or sub-multiple of the frequency) as the first and/or second detection devices 10, 20, and particularly, the detection of the third detection device 30 is in phase with the detection of the first and/or second detection devices 10, 20; thereby, a substantially simultaneous detection of the devices used can be obtained. In greater detail, the third detection device 30 provides a plurality of third detection images 31, which are time-ordered relative to each other. Similarly to what has been described above for the first and second detection devices 10, 20, the reconstruction and composition blocks 44 and 45 can be also associated with the third detection device 30 to combine the images acquired by the latter with those of the first and/or second detection devices 10, 20 to obtain a corresponding fuse image 123 and a command signal 100 that is preferably intended for the moving element 51. In greater detail, the composition block 45 can comprise a third processing block 45c that is operatively associated with the third detection device 30 to receive at least one third image 31 and generating a corresponding third auxiliary image
32.
Particularly, the third auxiliary image 32 has lower resolution and higher scale than the third image 31 from which it has been generated.
Preferably, the resolution and scale provided by the third auxiliary image 32 are substantially the same as provided by the first auxiliary image 12.
Said combination means 45d can be also operatively associated to the third detection device 30 for generating the fusion image 123 also as a function of the third auxiliary image 32.
Consequently, the command signal 100 can be generated also as a function of what has been detected by the third device 30.
Furthermore, following the detection carried out by the third detection device 30 by means of the communication interface 70, the operator is provided with the possibility of activating the generation of the command signal 100 or, alternatively, in the case where the detections that have been carried out will prove to be insufficient, the possibility of activating one or more of the detection devices 10, 20, 30.
It should be noted how, for clarity of description, reference has been made so far to an individual first auxiliary image 12, an individual second auxiliary image
22 and an individual third auxiliary image 32.
According to another exemplary embodiment, the first processing block 45a can be arranged for generating a plurality of first auxiliary images 12 from an individual first image 11. hi this case, these first auxiliary images 12 have, progressively, a lower resolution and a higher scale than the first starting image 11.
Practically, for each first image 11 (which can be considered as a "source image") a virtual pyramid is generated, which is defined by the sequence of first auxiliary images 12, in which each level - downward up to the vertex - is a filtered and subsampled copy of the auxiliary image of the lower level.
The lowermost level in the pyramid thus consists of the first source image 11; the uppermost levels have a lower resolution and a higher scale than the original image 11.
An entirely analogous consideration can be also done for the second processing block 45b: the latter can, in fact, generate a plurality of second auxiliary images
22 that have, in a progressive sequence, a lower resolution and higher scale than the second source image 21.
An entirely analogous consideration can be also done for the third processing block 45c: the latter can, in fact, generate a plurality of third auxiliary images 32 that have, in a progressive sequence, a lower resolution and higher scale than the third source image 31.
Advantageously, first and second auxiliary images 12, 22 that occupy corresponding levels have the same resolution and the same scale as the corresponding first and second source images 11, 21. Furthermore, first and third auxiliary images 12, 32 that occupy corresponding levels also have the same resolution and the same scale as the corresponding first and third source images 11, 31.
In addition to the above, the command unit 40 can be arranged for carrying out a volume reconstruction and a consequent tomographic imaging of the region of interest, starting from one or more scintigraphic detections that is obtained via the first detection device 10; practically, a series of acquisitions is carried out from different points of view, which can be combined in order to have the functional information according to the tomographic technique. It should be noted how the various functional blocks comprised within the control unit 40 have been separately and individually presented only to explain the different functionalities of the control unit 40; actually, however, the control unit 40 can be made as an individual electronic device, which is suitably programmed to carry out the operations described above. The invention achieves considerable advantages. First, the equipment according to the invention allows transferring the precision and reliability of the detections that have been carried out during the diagnostic step to the surgical step. Thereby, the total quality and therapeutic effectiveness of the operation are significantly improved, while reducing the duration of the latter and the discomfort caused to the patient. Furthermore, the equipment according to the invention allows tracing, localizing and pointing out in a precise manner, directly on the patient and on the same site where the operation will be carried out, the exact location in which the operation has to be carried out, thereby significantly reducing the positioning and alignment errors that are generated when the diagnostic scanning and therapeutic operation are carried out a separated place and time.
A further significant advantage offered by the device is the possibility of repeating the diagnostic scanning for several times either during (i.e. on-line) and/or at the end of the operation, such as to be capable of checking the result simultaneously with the operation, by following the time dynamic of the pathologic and/or physiologic event being inspected, or as a final check, which ensures how a sampling has been actually carried out according to the preceding diagnostic indications. It should be observed how a further advantage derives from the possibility of using the device s once suitably sized, for small animals: i.e. all the diagnosis functions are transferred "in- vivo" for real-time images guiding any medical element also on those subjects (by way of non-limiting examples, mice and rats) that are used for medical and pharmacologic research. The particular solution using several detection devices (such as the devices 10, 20, and 30) advantageously offers the possibility of integrating various types of information as they come from distinct dedicated diagnostic instruments. The possibility of carrying out a fusion of the information incorporated within the images provided by the various detection devices results notably advantageous when the apparatus is used in-vivo on patients or animals. The introduction of integrated techniques gains importance not only in guided surgery but generally also in those examinations on pathologies that require a great precision, such as neoplasias at an initial stage.
Particularly, the possibility offered by particular embodiments of the invention of using different morphological techniques (by way of a non-exhaustive example, the X-ray technique) confers a high degree of flexibility within the possible applications, in that the morphological and functional techniques can be selected and adapted to particular requirements, such as machines for operating rooms, diagnosis machines and biopsy sampling systems, and machines for in- vivo pharmacologic and diagnostic research on animals. Based on clinical inspections, the equipment will be provided with ultrasound or X-ray techniques integrated with scintigraphic techniques, by dimensioning the detection field to the pathology size or to the most suitable regions to be explored in the scintigraphic mode, by means of linear and/or tomographic scanning. Finally, the possibility of displaying the information revealed (acquired) by the detection device in accordance with an example of the invention is an advantage within the pharmacologic kinetics in that it offers an added value in the quality and quantity study of the drug behaviour and patient's reaction thereto.

Claims

1. An equipment for moving medical elements characterized in that it comprises:
- a first detection device (10) for identifying a region on a patient's body to be operated;
- a movable medical element (50) for operating said region;
- a control unit (40) that is operatively associated with said first device (10) to receive a signal representing at least the identification carried out by said first device (10) and to generate a corresponding command signal (100) for moving said medical element (50) as a function of said identification.
2. The equipment according to claim 1, characterized in that it further comprises a communication and visualization graphic interface (70) that is operatively associated with said control unit (40) in order to selectively allow an operator to activate the generation of said command signal (100) or commanding at least one further detection by simultaneously displaying the identification signals.
3. The equipment according to the preceding claims, characterized in that said first device (10) comprises a gamma camera that is arranged for scintigraphic identifications.
4. The equipment according to any preceding claim, characterized in that it further comprises one or more sensors (60) that are suitable to detect mechanical stress, preferably periodical, which are generated by the patient's body, and to generate corresponding signals (60a).
5. The equipment according to claim 4, characterized in that said command unit (40) comprises a synchronization block (42), that is operatively associated at least with the first detection device (10) and to said one or more sensors (60) in order to synchronize the signal generated by said detection device (10) with signals (60a) that are generated by said sensors (60).
6. The equipment according to any preceding claim, characterized in that it further comprises a second detection device (20) for detecting two-dimensional and/or three-dimensional images representing at least the patient's region as identified by the first detection device (10).
7. The equipment according to claim 6, characterized in that said control unit (40) is operatively associated with said first and second devices (10, 20) in order to generate said command signal as a function of the identification that has been carried out by the first device (10) and of the detection that has been carried out by the second device (20).
8. The equipment according to any preceding claim, characterized in that it further comprises a third detection device (30), that is preferably provided with an ultrasound probe.
9. The equipment according to claim 8, characterized in that said control unit (40) is operatively associated with the third detection device (30) in order to combine the detection of the third device (30) with the identification of the first device (10) and/or the acquisition of the second device (20).
10. The equipment according to any preceding claim, characterized in that said command unit (40) comprises a composition block (45) that is operatively associated with said first detection device (10) and at least one between said second and third detection devices (20, 30) in order to generate a fusion image (123) as a function of the images that have been detected by said detection devices (10, 20, 30).
11. The equipment according to claim 10, characterized in that said control unit (40) further comprises a moving control system (46) in order to generated said command signal (100) as a function of said fusion image (123), said moving control system (46) being preferably driven by said interface (70).
12. The equipment according to any preceding claim, characterized in that said control unit (40) comprises a filtering block (41a) for filtering the images from said first, second and/or third detection devices (10, 20, 30).
13. The equipment according to any preceding claim, characterized in that said control unit (40) further comprises an adaptation unit (41b) that is operatively associated with said first detection device (10) and at least one between said second and third detection devices (20, 30) in order to refer the images that have been detected by said devices (10, 20, 30) to a same scale.
14. The equipment according to any preceding claim, characterized in that said control unit (40) comprises a spatial alignment block (41c) that is operatively associated with said first detection device (10) and with at least one of said second and third detection devices (20, 30) in order to refer the images that have been detected by said devices (10, 20, 30) to a same spatial reference system.
15. The equipment according to any preceding claim, characterized in that said control unit (40) comprises time alignment means (43) that are operatively associated with said first detection device (10) and with at least one of said second and third detection devices (20, 30) in order to refer the images that have been detected by said devices (10, 20, 30) to a same time reference system.
16. The equipment according to any preceding claim, characterized in that the identification by said first detection device (10) provides a plurality of first identification images (11) that are time-ordered.
17. The equipment according to any claim 6 to 16, characterized in that the acquisition by said second detection device (20) provides a plurality of second acquisition images (21) that are time-ordered.
18. The equipment according to any claim 8 to 17, characterized in that the detection by said third detection device (30) provides a plurality of third detection images (31) that are time-ordered.
19. The equipment according to claim 17 or 18, characterized in that each of said first images (11) is associated with a respective second image (21), said first image (11) being representative of an identification in a time instant that is substantially coincident with the acquisition of which said respective second image (21 ) is representative.
20. The equipment according to claim 18 or 19, characterized in that each of said first images (11) is associated with a respective third image (31), said first image (11) being representative of an identification in a time instant that is substantially coincident with the detection of which said respective third image (31) is representative.
21. The equipment according to any claims 19 to 20, characterized in that each of said first images (11) is associated with said respective second image (21) and/or said respective third image (31) by means of said time alignment means (44).
22. The equipment according to any claims 16 to 21, characterized in that said composition block (45) further comprises a first processing block (45a) that is operatively associated with said first detection device (10) in order to receive at least one of said first images (11) and to generate at least one corresponding first auxiliary image (12).
23. The equipment according to claim 22, characterized in that said first auxiliary image (12) has a lower resolution and a higher scale than said first image (11).
24. The equipment according to any claim 17 to 23, characterized in that said composition block (45) further comprises a second processing block (45b) that s is operatively associated with said second detection device (20) in order to receive at least said respective second image (21) and to generate at least one corresponding second auxiliary image (22).
25. The equipment according to claim 24, characterized in that said second auxiliary image (22) has a lower resolution and higher scale than said respective o second image (21).
26. The equipment according to any claim 18 to 25, characterized in that said composition block (45) further comprises a third processing block (45c) that is operatively associated with said third detection device (30) in order to receive at least said respective third image (31) and to generate at least one corresponding s third auxiliary image (32).
27. The equipment according to claim 26, characterized in that said third auxiliary image (32) has a lower resolution and higher scale than said respective third image (31).
28. The equipment according to claim 26 or 27, characterized in that said second 0 auxiliary image (22) and preferably said third auxiliary image (32) substantially have the same resolution and same scale as said first auxiliary image (12).
29. The equipment according to any claim 22 to 28, characterized in that said composition block (45) further comprises combination means (45d) that are operatively associated with said first, second and preferably third processing 5 blocks (41, 42, 43) in order to generate said fusion image (123) as a function of the combination of said first, second, and preferably third auxiliary images (12, 22, 32).
30. The equipment according to claim 1, characterized in that said medical element (50) is of a surgical type.
31. The equipment according to claim 30, characterized in that said surgical element comprises at least one of the following tools: tool for incisions, tool for holes, tool for injections, tool for liquid or solid sampling, a biopsy needle.
32. The equipment according to claim 1, characterized in that said medical element comprises a pointer element that is capable of indicating at least one surface of this region.
33. The equipment according to claim 32, characterized in that said pointer element is such as to emit a visible radiation beam.
34. The equipment according to claim 33, characterized in that said pointer element is such as to project a figure on said at least one surface.
35. The equipment according to claim 34, characterized in that said pointer element comprises a laser emitting device.
36. The equipment according to claims 30 and 32, characterized in that said medical element of surgical type further comprises said pointer element.
37. The equipment according to any preceding claim, characterized in that it further comprises moving means (51b) that are driven by said control unit (40) in order to receive said command signal (100) and correspondingly moving said medical element (50).
38. An operating method of an equipment for moving medical elements, characterized in that it comprises: - providing an information identifying a patient's body region to be operated; having a medical element (50) for operating said region; providing a signal representing at least said identifying information to a control unit (40); - generating a command signal (100) by the control unit (40) as a function of said identification information; moving said medical element (50) based on said command signal in order to adopt at least one preparatory position for an operation on said region by means of the medical element.
39. The method according to claim 38, characterized hi that said medical element (50) is of a surgical type.
40. The method according to claim 38, characterized in that said medical element comprises a pointer element that is capable of indicating at least one surface of said region.
PCT/IT2006/000758 2005-10-28 2006-10-27 Apparatus for moving surgical instruments WO2007049323A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP06821748A EP1951141A1 (en) 2005-10-28 2006-10-27 Apparatus for moving surgical instruments

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT002060A ITMI20052060A1 (en) 2005-10-28 2005-10-28 EQUIPMENT FOR THE MOVEMENT OF SURGICAL ORGANS
ITMI2005A002060 2005-10-28

Publications (1)

Publication Number Publication Date
WO2007049323A1 true WO2007049323A1 (en) 2007-05-03

Family

ID=37745588

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IT2006/000758 WO2007049323A1 (en) 2005-10-28 2006-10-27 Apparatus for moving surgical instruments

Country Status (3)

Country Link
EP (1) EP1951141A1 (en)
IT (1) ITMI20052060A1 (en)
WO (1) WO2007049323A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5488674A (en) * 1992-05-15 1996-01-30 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
US5676673A (en) * 1994-09-15 1997-10-14 Visualization Technology, Inc. Position tracking and imaging system with error detection for use in medical applications
DE10045779A1 (en) * 2000-07-22 2002-02-21 Robert Boesecke Surgical robot arm provides and guides surgical and radiological instruments
US20030114743A1 (en) * 2001-12-19 2003-06-19 Kai Eck Method of improving the resolution of a medical nuclear image
US20030128801A1 (en) * 2002-01-07 2003-07-10 Multi-Dimensional Imaging, Inc. Multi-modality apparatus for dynamic anatomical, physiological and molecular imaging
US20040243147A1 (en) * 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5488674A (en) * 1992-05-15 1996-01-30 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
US5676673A (en) * 1994-09-15 1997-10-14 Visualization Technology, Inc. Position tracking and imaging system with error detection for use in medical applications
DE10045779A1 (en) * 2000-07-22 2002-02-21 Robert Boesecke Surgical robot arm provides and guides surgical and radiological instruments
US20040243147A1 (en) * 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
US20030114743A1 (en) * 2001-12-19 2003-06-19 Kai Eck Method of improving the resolution of a medical nuclear image
US20030128801A1 (en) * 2002-01-07 2003-07-10 Multi-Dimensional Imaging, Inc. Multi-modality apparatus for dynamic anatomical, physiological and molecular imaging

Also Published As

Publication number Publication date
ITMI20052060A1 (en) 2007-04-29
EP1951141A1 (en) 2008-08-06

Similar Documents

Publication Publication Date Title
JP5417609B2 (en) Medical diagnostic imaging equipment
EP2372660A2 (en) Projection image generation apparatus and method, and computer readable recording medium on which is recorded program for the same
CN102665560B (en) X-ray computed tomography device and image display method based thereon
US20080198966A1 (en) Method and Arrangement Relating to X-Ray Imaging
JP6950801B2 (en) Diagnostic imaging system
US20040161137A1 (en) Method of determining physical parameters of bodily structures
EP1787594A2 (en) System and method for improved ablation of tumors
EP2934326A2 (en) Three dimensional mapping display system for diagnostic ultrasound machines
EP0406352A1 (en) Process and apparatus particularly for guiding neurosurgical operations
CN101410060A (en) Determining tissue surrounding an object being inserted into a patient
CN102512209A (en) Ultrasonography device
CN101422378B (en) Ultrasound diagnostic device
US20110286653A1 (en) Method for processing radiological images to determine a 3d position of a needle
US10537293B2 (en) X-ray CT system, image display device, and image display method
US20130223703A1 (en) Medical image processing apparatus
US10922812B2 (en) Image processing apparatus, x-ray diagnostic apparatus, and image processing method
CN115279272A (en) Apparatus and method for automated ultrasound segmentation for visualization and measurement
US9339249B2 (en) Medical image processing apparatus
US20230320700A1 (en) Apparatus and method for automatic ultrasound segmentation for visualization and measurement
JP4634179B2 (en) Diagnostic imaging equipment
JP6959612B2 (en) Diagnostic imaging system
JP2000051207A (en) Medical image processor
US8625873B2 (en) Medical image processing apparatus
EP1951141A1 (en) Apparatus for moving surgical instruments
JP6953974B2 (en) Diagnostic imaging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006821748

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2006821748

Country of ref document: EP