WO2012101286A1 - Insertion procedures in augmented reality - Google Patents

Insertion procedures in augmented reality Download PDF

Info

Publication number
WO2012101286A1
Authority
WO
WIPO (PCT)
Prior art keywords
entry, path, computer programme, virtual, computer
Prior art date
Application number
PCT/EP2012/051469
Other languages
French (fr)
Inventor
Jacqueline Francisca Gerarda Maria SCHOOLEMAN
Original Assignee
Virtual Proteins B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Virtual Proteins B.V. filed Critical Virtual Proteins B.V.
Publication of WO2012101286A1 publication Critical patent/WO2012101286A1/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints

Definitions

  • the invention relates to insertion procedures, particularly high-precision insertion procedures, for contacting a target area of an object, and more particularly to methods for planning, training or performing such insertion procedures, as well as to systems, computer programs and computer program products configured for use in such insertion procedures.
  • insertion procedures include the insertion of a biopsy needle to extract a diagnostic sample from the target area within the body of a patient, or the insertion of a delivery needle for local delivery of a medicament, e.g. , placement of a radiation source such as radioactive seeds in the target area during brachytherapy for treatment of tumours or cancer.
  • the insertion tool such as a needle needs to avoid certain obstacles and/or select structures within the object, such as for example bones, vital organs, blood vessels, nerves and the like.
  • An optimal entry strategy for needle insertion needs to be defined during a pre-procedure planning phase.
  • precautions need to be taken to ensure that the pre-defined entry strategy is subsequently followed through during the procedure.
  • to start a medical insertion procedure, a three-dimensional (3D) scan is made of the body of a patient or of an area of interest of said body, using a suitable medical imaging technique such as for example computed tomography (CT), providing spatial data on the anatomy of the body or area of interest.
  • the data is converted into an image of the underlying anatomy and the information gathered from the image allows a medical practitioner to define an optimal path of entry.
  • the optimal path of entry can usually constitute a straight line connecting a target area within the body with an entry point on the surface of the body, while avoiding any obstacles and/or vital structures within.
  • a needle is then inserted into the patient's body at the entry point and along the optimal path of entry, albeit only partially.
  • a new scan is made with the needle partially inserted (and visible in the resulting image), to ensure the conformity of the actual position and depth of the needle with the pre-defined optimal path of entry, and to re-adjust where required.
  • This sequence of partially inserting the needle, imaging and re-adjusting is repeated several times until the needle reaches the desired position and depth and contacts the target area.
  • EP 1 095 628 describes a method for planning needle surgery wherein a virtual entry trajectory is defined and displayed in a volumetric image of a patient's anatomy obtained by a scanning technique.
  • a medical practitioner aligns a virtual surgical instrument visualised in the same image, and representing a physical surgical instrument controlled by a mechanical arm assembly, along the virtual entry trajectory. Once the desired alignment is achieved, the mechanical arm assembly is locked in its present position and orientation, and the practitioner then advances the surgical instrument into the patient's body.
  • the planning of the entry trajectory and the alignment of the surgical instrument therewith are performed in virtual space, whereas the actual insertion of the surgical instrument occurs wholly in physical space.
  • the visual information about the entry trajectory and optionally the patient's anatomy are only available during the pre-operation phase but not during the very crucial operation phase. Consequently, once the mechanical arm assembly locks the surgical instrument along the chosen optimal entry path, the advancement of the surgical instrument cannot be modified or corrected by the practitioner.
  • WO 2007/136771 discloses a method and apparatus to impose haptic constraints on movements of a medical practitioner to ensure a desired position, orientation, velocity and/or acceleration of a surgical tool operated by the latter, such as to maintain the surgical tool within a predefined virtual boundary registered to the anatomy of the patient.
  • when experiencing haptic (e.g., tactile and/or force) feedback, the practitioner may be in doubt whether this is due to the distinct properties (e.g., hardness) of the tissues or organs contacted by the surgical instrument or due to the haptic guidance indicating a potential deviation from the intended procedure.
  • the system does not help to minimise the initial occurrence of such deviations.
  • the present invention aims to provide methods and systems for planning, training and/or performing insertion procedures which address the shortcomings existing in the art, and particularly such methods and systems that ensure more pronounced and intuitive compliance with a pre-determined optimal path of entry during insertion procedures, while providing for flexibility in slightly modifying the path of entry in a well-informed manner.
  • An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • a further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps: (a) registering the path of entry or part thereof to the object;
  • Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (d) inserting the insertion tool into the object along the path of entry, whereby the insertion procedure is performed.
  • Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (d) inserting the insertion tool into the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
  • Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
  • any one of the methods may comprise the step:
  • - defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • any one of the methods may comprise the steps:
  • any one or more such preparatory steps may be carried out by one or more persons who are the same as or distinct from (e.g., persons at diverging geographical locations from) the user planning, performing and/or training the insertion procedure.
  • a further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
  • a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof;
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
  • Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
  • a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof;
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of:
  • - means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
  • - means for (e.g., a computing system programmed for) generating an image of the interior spatial structure of the object or part thereof from said data set;
  • - means for (e.g., a computing system programmed for) defining or allowing a user to define the target area and the path of entry to said target area in said data set or image, e.g., displayed to a user (e.g., by an image generating system).
  • the haptic device in communication with the insertion tool may be controlled (i.e., operated or instructed) by a computing system programmed to configure said haptic device to control deviation of the insertion tool from the path of entry or part thereof.
  • the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof.
  • the haptic device and the image generating system may be controlled by same or separate computing system(s) programmed accordingly, wherein such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • This or additional computing system(s) may be provided programmed for any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • - configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof; and - configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • a computing system e.g., a digital computer programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
  • the methods, systems and related aspects as disclosed herein can immerse the user in an augmented reality environment comprising displayed therein the object of the insertion procedure or its physical or virtual surrogate and superimposed thereon a virtual rendition of a predetermined path of entry or part thereof guiding to a target area within the object.
  • the user can observe the physical insertion tool (or where appropriate (also) a virtual cursor representing and superimposed on the tool) and its spatial relation with the virtual rendition of the predefined optimal path of entry, relying on this intuitive and highly informative visual guidance to ensure compliance with the path of entry.
  • the present methods and systems also encompass haptic guidance which further controls any unwanted deviations from the path of entry.
  • the haptic guidance will contain the movement and will assist to ensure the compliance with the path of entry.
  • hence, when haptic feedback is perceived, the user can instantly determine whether this may be due to encountering obstacles and/or structures within the object, since the present methods and systems may be configured such that, insofar as the user remains within the visually indicated path of entry, no haptic input is expected. This can provide the user with additional freedom during the procedure and/or prevent potentially detrimental disruptions of so-encountered obstacles and/or structures.
  • the insertion procedure may be surgical, particularly a surgical insertion procedure performed on an object which is a living human or animal body, more particularly for the purposes of treatment and/or diagnosis.
  • surgical procedures may include without limitation minimally invasive surgery (MIS) procedures, percutaneous interventions, laparoscopic or endoscopic interventions, arthroscopy, biopsy, and brachytherapy particularly for treatment of tumours or cancer.
  • Preferred may be surgical procedures involving the insertion of a needle, e.g. a delivery (precision placement) needle or biopsy needle.
  • Alternative insertion tools include instruments configured for placement of screws such as pedicle screws, borescopes, etc.
  • the insertion procedure may be non-surgical.
  • nonsurgical procedures may include without limitation visual inspection, reparation and/or dismantling of non-living objects, such as for example mechanical, electromechanical or electronic apparatuses, devices or appliances or part thereof, such as for example engines (e.g., aircraft engine, diesel engine, electrical engine, etc.), turbines (e.g., gas turbine, steam turbine, etc.), nuclear reactors, firearms or explosive devices.
  • the object may be non-transparent (including at least partly or largely non-transparent objects).
  • the term "non-transparent" as used herein carries its common meaning; it particularly refers to objects whose non-transparent nature prevents the target area and/or the path of entry, or at least a portion thereof, from being viewed or seen by the naked eye.
  • the present methods, systems and related aspects may be particularly informative and necessary for insertion procedures on non-transparent objects, although the scope also encompasses situations in which the object may be transparent.
  • the object may be a living human or animal body or a part thereof, such as a body portion, organ, tissue or cell grouping that may be integral to or separated from the human or animal body.
  • the present methods, systems and related aspects provide for visual and haptic guidance aimed to ensure compliance with (i.e., observance of) the desired path of entry.
  • the path of entry connects an entry point or entry area on the surface of an object with the target area of the object.
  • the path of entry may be extrapolated to protrude to a certain extent (e.g., between >0 cm and about 20 cm, or about 10 cm) beyond the entry point or entry area and away from the surface of the object, such as to allow for visual and haptic guidance even before the insertion tool contacts the object or the physical or virtual surrogate thereof.
  • the path of entry may be without limitation defined as or represented by any one or more of:
  • a one-dimensional object such as preferably a line, more preferably a straight line;
  • a two-dimensional object such as a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.;
  • a three-dimensional object such as preferably a generally cylindrical object (encompassing not only circular cylinders, which may be particularly preferred, but also shapes that can be generally approximated by a geometrical cylinder or a derivative thereof, e.g., frustoconical shape, a barrel shape, oblate or partially flattened cylinder shape, curved cylinder shape, cylindrical shapes with varying cross-sectional areas such as hourglass shape, bullet shape, etc.), a conical object, a funnel-shaped object, etc.
  • Such objects may be particularly tube-shaped, i.e., devoid of one or more caps.
  • Paths of entry defined as or represented by such two- or three-dimensional objects may also be deemed as collections or sets of a plurality of possible straight-line insertion approaches encompassed within such objects.
  • the width or diameter of such two- or three-dimensional objects can be adjusted to provide the user with more flexibility to choose the particular insertion trajectory (typically a straight-line trajectory), while still avoiding any obstacles and/or vital structures during the insertion procedure.
  • the width or diameter of such two- or three-dimensional objects may be constant or variable along the path of entry, indicative of the user's freedom to modify the movement, e.g., depending on the proximity of obstacles and/or vital structures.
  • the target area may also be defined as or represented by an object, such as a null-dimensional (e.g., a point), one-dimensional (e.g., a line), two-dimensional (e.g., a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.), or three-dimensional object (e.g., a cube, sphere, etc.) or a combination thereof (e.g., two spheres with different radii to indicate not only the target area, but also an enveloping area so haptic and/or visual feedback can be provided when the physician almost reaches the target area; see the sketch below).
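  • By way of illustration only, such containment and proximity tests reduce to simple geometry. The following minimal sketch (in Python; all names, values and array conventions are illustrative assumptions, not prescribed herein) checks whether a tool tip lies within a cylindrical path of entry and classifies its proximity to a target area represented by two concentric spheres:

      import numpy as np

      def deviation_from_path(tip, entry, target):
          # Perpendicular distance of the tool tip from the straight-line
          # path of entry (segment from entry point to target area centre).
          axis = target - entry
          t = np.clip(np.dot(tip - entry, axis) / np.dot(axis, axis), 0.0, 1.0)
          closest = entry + t * axis            # closest point on the path
          return np.linalg.norm(tip - closest)

      def within_cylinder(tip, entry, target, radius):
          # True while the tip stays inside the cylindrical path of entry,
          # i.e., while no haptic counter-force is expected.
          return deviation_from_path(tip, entry, target) <= radius

      def target_zone(tip, target, r_target, r_envelope):
          # Two concentric spheres: the enveloping sphere can trigger
          # haptic and/or visual feedback shortly before the target proper.
          d = np.linalg.norm(np.asarray(tip) - np.asarray(target))
          if d <= r_target:
              return "target"
          if d <= r_envelope:
              return "envelope"
          return "outside"

  • In such a sketch, a path of variable width along its length would simply make radius a function of the parameter t rather than a constant.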
  • the present methods, systems and related aspects may employ suitable object markers perceivable both in the physical work space (and thus in the augmented reality environment) and in the data set or image of the interior spatial structure of the object or part thereof.
  • one, two or preferably three or more object markers are secured to the object or part thereof, and the object or part thereof is then subjected to a technique capable of generating the data set and optionally an image of the interior structure of the object or part thereof.
  • the object markers are chosen to be detectable by said technique, whereby the technique generates representations of the object markers in the data set or image of the interior structure of the object or part thereof.
  • the markers may be preferably attached to areas of the object substantially not susceptible to relative movement, e.g., to areas not joined by bendable elements such as joints.
  • the position and orientation of the path of entry, i.e., a set of coordinates defining the path of entry, is then determined relative to the sets of coordinates defining the representations of the object markers in the data set or image of the interior structure of the object or part thereof.
  • the data set or image of the interior structure of the object or part thereof (or at least relevant elements thereof including the representations of the object markers and the path of entry) is matched back onto the object such that the representations of the object markers align or overlay with the object markers secured to the object, whereby the set of coordinates defining the path of entry is now determined relative to the actual object markers secured to the object, i.e., whereby the path of entry becomes registered to the object.
  • the object markers may be deemed as a tool for transforming the set of coordinates defining the path of entry in the coordinate system of the data set or image of the interior structure of the object or part thereof to a coordinate system employed in the augmented reality environment.
  • gold markers may be used in conjunction with imaging techniques such as MRI and CT scans.
  • One, two or preferably three or more object markers may be secured to the object.
  • the object markers may be complex, whereby the object marker may comprise a recognition pattern, for example a black and white recognition pattern, or the object markers may be simple, whereby the object marker might not comprise a recognition pattern.
  • one object marker may already be sufficient to determine the position and orientation of the object marker, which subsequently may be sufficient to derive the position and orientation of the object.
  • a higher accuracy may be obtained, especially for estimating the orientation of the object, by using relatively large object markers.
  • when using simple (e.g., not-patterned) object markers, multiple markers may be required to reach the desired accuracy.
  • at least three markers are spaced apart, which markers define an area.
  • the entry point for the insertion procedure lies within the area defined by at least three markers.
  • three markers are arranged to form an area shaped as a substantially scalene triangle. Said scalene triangle may uniquely define the position and orientation of the rigid body to which the markers are secured.
  • the object markers may be identified in this data set or image directly.
  • the object markers are chosen such that they are easily distinguishable from the surroundings, so a filter may be applied to identify the markers in the data set or image comprising information on the interior spatial structure of the object or part thereof.
  • the position and orientation of the object markers may be defined in a coordinate system related to the routine imaging or scanning apparatus.
  • the markers may be uniquely identified.
  • two cameras form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Given certain parameters of the camera system (e.g. focal length), triangulation of each stereoscopic pair may provide 3D coordinates for the object markers in a coordinate system related to the camera system.
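  • For instance, for a rectified stereoscopic pair with image coordinates taken relative to the principal point, a known focal length (in pixels) and a known baseline, such triangulation may be sketched as follows (illustrative, not a prescribed implementation):

      import numpy as np

      def triangulate(u_left, u_right, v, focal_px, baseline):
          # Rectified stereo triangulation: the disparity between the two
          # horizontal image coordinates encodes depth. Coordinates are
          # relative to the principal point; baseline is in the same unit
          # as the returned 3D coordinates.
          disparity = u_left - u_right          # positive for valid points
          z = focal_px * baseline / disparity   # depth along the optical axis
          x = u_left * z / focal_px             # lateral position
          y = v * z / focal_px                  # vertical position
          return np.array([x, y, z])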
  • Any coordinate system may function as the common coordinate system for registering the path of entry or part thereof to the object.
  • the coordinate system related to the camera system is used as the common coordinate system used for registering the path of entry or part thereof to the object.
  • the position and orientation of the object markers may then be defined in a coordinate system related to the camera system.
  • a transformation may subsequently be obtained between the coordinate system related to the camera system and the coordinate system related to the routine imaging or scanning apparatus.
  • This transformation will usually consist of a translation and a rotation, whereby the transformation may be rigid, and whereby no scaling may be required since the data sets or images are of the same object.
  • An example of an algorithm providing such a transformation is the Iterative Closest Point algorithm (and extensions thereof, such as Milella, A. & Siegwart, R., "Stereo-Based Ego-Motion Estimation Using Pixel Tracking and Iterative Closest Point", in Proceedings of the IEEE International Conference on Computer Vision Systems 2006, p. 21, January 2006, ISBN: 0-7695-2506-7).
  • An explicit method to obtain such transformation may also be used, e.g. when three markers are used.
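  • One common explicit solution, sketched below for illustration, is the least-squares (Kabsch-type) estimation of a rigid rotation and translation from at least three marker correspondences via singular value decomposition; this is an assumed, not a prescribed, implementation:

      import numpy as np

      def rigid_transform(src, dst):
          # Least-squares rigid transformation (rotation R, translation t)
          # mapping marker coordinates src (e.g., scanner frame) onto their
          # counterparts dst (e.g., camera frame). No scaling, as both
          # frames observe the same physical markers. src, dst: (N, 3), N >= 3.
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)       # 3x3 cross-covariance
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # proper rotation, det = +1
          t = dst_c - R @ src_c
          return R, t                               # dst is approx. R @ src + t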
  • the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the routine imaging or scanning apparatus into the coordinate system related to the camera system.
  • the data sets or images in the virtual space and the physical work space may be defined in the same coordinate space and may be registered to one another by co-locating or superimposing the detected object markers, thereby generating an image in augmented reality space. If only one camera is used to capture the view of the physical object of interest, and consequently no set of stereoscopic pairs can be constructed, the depth of the object cannot always be uniquely determined. To solve this issue when there is only one camera present to capture the view of the physical object of interest, either complex object markers or more than three object markers may be required.
  • the position of the haptic device may be determined and registered into the coordinate system related to the camera system.
  • Object markers may be attached to the haptic device.
  • a limited set of complex object markers may be used, or a larger set of simple object markers may be used.
  • the object markers may be used in a similar fashion to determine the transformation from the coordinate system related to the haptic device to the coordinate system related to the camera system, as previously described for registering the path of entry or part thereof to the object.
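  • For illustration, once such transformations are available as homogeneous matrices, the pose of the insertion tool reported by the haptic device may be expressed in the common camera coordinate system by simple composition (a sketch with assumed, illustrative names):

      import numpy as np

      def to_homogeneous(R, t):
          # Pack a rotation (3x3) and translation (3,) into a 4x4 matrix.
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      def tool_pose_in_camera(T_cam_from_haptic, T_haptic_from_tool):
          # Pose at which the virtual cursor may be rendered, expressed in
          # the common camera coordinate system used for registration.
          return T_cam_from_haptic @ T_haptic_from_tool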
  • the registration of the path of entry to the object may be dynamic or real-time, e.g., may be updated continuously, upon the user's or system's command or at preset intervals, such that the accuracy of the alignment between the object markers and their representations is examined and if disturbed (e.g., beyond a preset, acceptable threshold) the registration process is at least partly repeated to adjust the position and orientation of the path of entry to the new situation.
  • Such dynamic or real-time registration process may suitably take into account changes in the path of entry due to deformations and/or movements of the object, such as due to breathing or other involuntary movements of patients.
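  • Such a registration check may, for illustration, amount to monitoring the residual distance between the detected object markers and their registered representations against a preset threshold (the threshold value below is an assumed example, not a prescribed one):

      import numpy as np

      def registration_error(markers_detected, markers_registered):
          # RMS distance between currently detected object markers and
          # their registered representations, both in the camera frame.
          diff = markers_detected - markers_registered
          return np.sqrt(np.mean(np.sum(diff ** 2, axis=1)))

      def needs_reregistration(markers_detected, markers_registered, threshold=2.0):
          # If the alignment is disturbed beyond the acceptable threshold
          # (e.g., in mm), the registration process is at least partly repeated.
          return registration_error(markers_detected, markers_registered) > threshold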
  • the present methods, systems and related aspects provide for visual guidance particularly in form of a virtual rendition (i.e., rendition as a visible virtual element) of the path of entry or part thereof superimposed on the image of the object or the physical or virtual surrogate thereof in an augmented reality environment.
  • the path of entry may be suitably rendered in the augmented reality environment as the corresponding object, e.g., one-, two- or three-dimensional object, particularly preferably as a straight line or a cylindrical or funnel-shaped tube.
  • the appearance (e.g., colour, texture, brightness, contrast, etc.) and the relative transparency of the virtual rendition of the path of entry may be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof.
  • the appearance of the virtual rendition of the path of entry may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of compliance with or an imminent or occurred deviation from the path of entry, as a function of the activation of haptic guidance and optionally the magnitude of so-applied haptic guidance, etc.
  • where only a part of the path of entry is virtually rendered, this may advantageously be a portion adjacent to the surface of the object, and optionally and preferably a portion protruding away from the surface of the object, such as to correctly prime the insertion procedure.
  • the above principles may be analogously applied to virtual rendition of the target area in the augmented reality environment.
  • Visibly rendering the target area may provide the user with more intuitive and thorough understanding of inter alia the required depth and level of precision of the insertion procedure.
  • the appearance of the virtual rendition of the target area may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of the distance of the insertion tool from the target area.
  • the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) also a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool).
  • the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object).
  • the relative transparency of the virtual rendition of the cursor can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, and the virtual cursor.
  • the appearance of the virtual rendition of the cursor may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of compliance with or an imminent or occurred deviation from the path of entry, as a function of the activation of haptic guidance and optionally the magnitude of so-applied haptic guidance, or as a function of the distance of the insertion tool from the target area, etc.
  • Visibly rendering the virtual cursor can provide the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry, thereby allowing for much better informed insertion procedures, and allowing for the synergistic cross-talk between visual and haptic cues throughout substantially the entire procedure.
  • the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) also a virtual rendition of the interior spatial structure of the object or part thereof generated from the data set comprising information on said interior spatial structure of the object or part thereof and superimposed on the image of the object or the physical or virtual surrogate thereof.
  • the relative transparency of the virtual rendition of the interior spatial structure of the object can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual cursor, and the virtual rendition of the interior spatial structure of the object or part thereof.
  • haptic guidance particularly aims to ensure compliance with (i.e., observance of) a preset, desired spatial and/or kinetic relation of the insertion tool relative to the body or the physical or virtual surrogate thereof, said preset, desired spatial and/or kinetic relation being compatible with or conducive to maintaining the path of entry.
  • the haptic device is configured to control deviation (i.e., departure, divergence or discrepancy) of the insertion tool from the path of entry.
  • the haptic device may be configured to control deviation of the actual position, orientation, velocity and/or acceleration (preferably at least position and/or orientation) of the insertion tool from a preset, desired position, orientation, velocity and/or acceleration of the insertion tool which is compatible with or conducive to maintaining the path of entry.
  • Controlling deviation of the insertion tool from the path of entry by the haptic device may entail one or more measures generally aimed at one or more or all of: reducing or preventing the occurrence of a deviation; minimising the extent or size of an occurred deviation; correcting an occurred deviation.
  • measures are realised through the haptic device being configured to impose haptic feedback on the insertion tool, to be perceived by a user manipulating the insertion tool.
  • the haptic device may be configured to impose on the insertion tool haptic feedback comprising any one or more or all of tactile (e.g., vibration), force and/or torque feedback.
  • haptic feedback may be perceived by the user as comprising any one or more or all of:
  • - pull, i.e., a drag or tow driving the insertion tool towards, to alignment with and/or along the path of entry;
  • resistance to a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or rotation of said movement, which nevertheless allows the movement to occur and/or proceed.
  • the magnitude of such counter-force and/or counter-torque may be without limitation constant, or may be proportional (e.g., linearly or non-linearly, such as exponentially) to the magnitude of said movement.
  • prohibition of a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or torque of said movement, which is such that it blocks or prevents said movement from occurring and/or proceeding further.
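  • By way of illustration, a simple haptic feedback law consistent with the above may apply no force while the insertion tool remains within an (e.g., cylindrical) path of entry, a counter-force proportional to the deviation beyond it, and a saturated maximum force approximating prohibition of further deviation; the gains below are assumed, illustrative values, not prescribed herein:

      import numpy as np

      def haptic_force(tip, entry, target, radius, k=200.0, f_max=15.0):
          # Spring-like counter-force (N) opposing radial deviation from a
          # cylindrical path of entry; k (N/m) and f_max (N) are assumed
          # illustrative gains.
          axis = target - entry
          t = np.clip(np.dot(tip - entry, axis) / np.dot(axis, axis), 0.0, 1.0)
          closest = entry + t * axis                 # nearest point on the path
          offset = tip - closest                     # radial deviation vector
          d = np.linalg.norm(offset)
          if d <= radius:
              return np.zeros(3)                     # compliant: no haptic input
          direction = offset / d                     # outward radial direction
          magnitude = min(k * (d - radius), f_max)   # proportional, then capped
          return -magnitude * direction              # pull back towards the path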
  • Haptic devices and haptic rendering in virtual reality solutions are known per se and can be suitably integrated with the present system (see, inter alia, McLaughlin et al., "Touch in Virtual Environments: Haptics and the Design of Interactive Systems", 1st ed., Pearson Education 2001, ISBN 0130650978; M. Grunwald, ed., "Human Haptic Perception: Basics and Applications", 1st ed., Birkhauser Basel 2008, ISBN 3764376112; Lin & Otaduy, eds., "Haptic Rendering: Foundations, Algorithms and Applications", A K Peters 2008, ISBN 1568813325; WO 2007/136771).
  • the haptic device is a 6-degrees-of-freedom haptic device.
  • the present methods, systems and related aspects may preferably allow for a stereoscopic (i.e., three-dimensional, 3D) view of the augmented reality environment.
  • This may particularly entail the stereoscopic view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.).
  • Such stereoscopic view allows a user to perceive the depth of the viewed scene, ensures a more realistic experience and thus helps the user to more intuitively and accurately plan, perform or train any insertion procedures.
  • Means and processes for capturing stereoscopic images of physical work space, rendering stereoscopic images of virtual space, combining said images to produce composite stereoscopic images of the mixed, i.e., augmented reality space, and stereoscopically displaying the resulting images thereby providing the user with an augmented reality environment are known per se and may be applied herein (see inter alia Judge, "Stereoscopic Photography", Ghose Press 2008, ISBN: 1443731366; Girling, "Stereoscopic Drawing: A Theory of 3-D Vision and its Application to Stereoscopic Drawing", 1st ed., Reel Three-D Enterprises 1990, ISBN: 0951602802).
  • the present methods, systems and related aspects may also preferably allow for realtime view of the augmented reality environment.
  • This may particularly include real-time view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.).
  • the image of the physical space may be preferably captured at a rate of at least about 30 frames per second, more preferably at a rate of at least about 60 frames per second.
  • display means providing the view of the augmented reality environment may have a refresh rate of at least about 30 frames per second, more preferably at a rate of at least about 60 frames per second.
  • Stereoscopic real-time view of the augmented reality environment may be particularly preferred.
  • any image generating system capable of generating an augmented reality environment may be employed.
  • image generating system may comprise image pickup means for capturing an image of a physical work space, virtual space image generating means for generating an image of a virtual space comprising desired virtual elements (such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, optionally the virtual cursor representing the insertion tool, etc.), composite image generating means for generating a composite image by synthesising the image of the virtual space generated by the virtual space image generating means and the image of the physical work space outputted by the image pickup means, and display means for displaying the composite image generated by the composite image generating means.
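  • For illustration, such composite image generating means may be approximated by per-pixel alpha blending of the rendered virtual-space image over the captured image of the physical work space, with the alpha mask encoding the chosen relative transparency of each virtual element (a minimal sketch, with assumed array conventions):

      import numpy as np

      def composite(camera_frame, virtual_layer, alpha):
          # Per-pixel blend of a rendered virtual-space image over the
          # captured physical work space. camera_frame, virtual_layer:
          # (H, W, 3) float arrays; alpha: (H, W, 1) values in [0, 1]
          # encoding the relative transparency of each virtual element.
          return alpha * virtual_layer + (1.0 - alpha) * camera_frame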
  • a particularly advantageous feature taught herein entails displaying in the augmented reality environment a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool), which provides the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry.
  • the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object).
  • An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • a further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (c) inserting the insertion tool into the object along the path of entry, whereby the insertion procedure is performed.
  • Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (c) inserting the insertion tool into the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
  • Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
  • any one of the methods may comprise the step:
  • - defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • any one of the methods may comprise the steps:
  • a further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of: - means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
  • - means for (e.g., a computing system programmed for) generating an image of the interior spatial structure of the object or part thereof from said data set;
  • - means for (e.g., a computing system programmed for) defining or allowing a user to define the target area and the path of entry to said target area in said data set or image, e.g., displayed to a user (e.g., by an image generating system).
  • the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • This or additional computing system(s) may be provided programmed for any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • - configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • a computing system e.g., a digital computer programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
  • Figure 1 illustrates a perspective view of an embodiment of the invention comprising an image generating system.
  • Figure 2 illustrates a perspective view of an augmented reality environment in an embodiment of the invention.
  • Figure 3 illustrates a preferred embodiment for registering the path of entry or part thereof to the object.
  • "a path of entry" may mean one path of entry or more than one path of entry.
• As used herein, the terms "insertion procedure", "insertion task" or "insertion" may be used interchangeably to generally denote an action involving at least partly inserting (i.e., placing or introducing) a suitable tool, instrument or utensil (i.e., an "insertion tool") to an object. Particularly useful insertion procedures may be aimed at contacting a target area of an object by the insertion tool, typically by the distal end or distal portion of the insertion tool.
  • the insertion tool is configured to be manipulated by a user, preferably to be directly manipulated by the user.
• the user may hold the insertion tool or may hold a suitable element in a mechanical connection (e.g., directly or via one or more interposed mechanically connected elements) with the insertion tool.
  • a suitable insertion tool may comprise distally an insertion portion and proximally a handle or grip for grasping by a user.
  • the distal end of the insertion tool may comprise an end effector for performing a desired action in the object, more particularly at the target area.
• Such end effectors may for example be configured for any one or more of: injecting, infusing, delivering or placing of substances, compositions or items (e.g., a delivery needle, e.g., connected to a reservoir such as a syringe); extracting or removing material from the object or target area (e.g., biopsy needle); cutting or incising (e.g., knife, scissors); grasping or holding (e.g., tongs, pincers, forceps); screwing (e.g., screwdriver); dilating or probing; cannulating (e.g., cannula); draining or aspirating (e.g., a draining needle); suturing or ligating (e.g., a stitching needle); inspecting (e.g., camera); illuminating (e.g., a lighting element).
  • An end effector may be suitably actuated by the user (e.g., by manipulating a corresponding actuator provided on the handle or on another relatively more proximal part of the insertion tool).
  • Needles such as delivery needles or biopsy needles, endoscopic and laparoscopic instruments, or borescopes may be particularly preferred insertion tools, particularly for surgical insertion procedures.
• The term "object" is used broadly herein and encompasses any tangible thing or item on which an insertion can be performed.
  • Typical objects may be comparatively stable in their overall form and interior structure.
  • such objects may appear or behave on the whole as solid or semi-solid, although they may comprise liquid and/or gaseous substances, components and/or compartments.
  • Exemplary objects may include objects comprised mainly of organic matter or of inorganic matter.
  • an object may be a living or non-living (deceased) human, animal, plant or fungal body or part thereof, such as a body portion, organ, tissue or cell grouping, that may be integral to or separated from said human, animal, plant or fungal body.
  • an object may be a mechanical, electromechanical or electronic apparatus, device or appliance or part thereof.
• The term "interior spatial structure" denotes in particular the spatial organisation or arrangement of the object or part thereof beneath its surface, i.e., in its interior (inside).
• Such interior spatial information may particularly comprise information on the position, orientation, shape, dimensions, and/or other properties (such as, for example, composition, hardness, flexibility, etc.) of, and connections between, relevant elements or components of the object, such as, e.g., organs, tissues, etc.
• The term "target area" may generally denote any point or area (i.e., region, place, element) of an object, and particularly a point or area inside (i.e., within, in the interior of) an object, which is intended to be contacted by the insertion tool during an insertion procedure.
  • target areas may comprise or reside in or in the vicinity of organs, tissues or cell groupings, which may be healthy, pathological or suspected of being pathological, such as for example neoplastic, cancerous or tumour tissues or cell groupings.
• The term "path of entry" generally denotes an imaginary trajectory or route connecting an entry point or entry area on the surface of an object with the target area of the object.
• a path of entry as intended herein is preferably chosen optimally, i.e., to avoid all, most, or comparatively as many as possible of the obstacles and/or vital or sensitive structures (for surgical procedures, e.g., blood vessels, lymphatic paths, nervous tissue, vital organs, bones, joints and/or tendons) that may be encountered when contacting the target area.
• a path of entry may be a line (e.g., a straight line), or it may be defined as or represented by a suitable two- or three-dimensional object, such as preferably but without limitation a cylindrical or a funnel-shaped tube.
• A haptic device is "in communication with" an insertion tool where, for example, the haptic device, particularly the effector end thereof, is suitably connected (e.g., fixedly or releasably) to the insertion tool either directly or via one or more interposed mechanically connected elements. For instance, the haptic device, particularly the effector end thereof, may comprise a holder, grip or clip adapted to receive the insertion tool, typically a comparatively proximal portion of the latter.
• As used herein, the terms "haptic feedback", "haptic guidance" or "haptic stimulus" may denote any one or more or all of: tactile feedback, i.e., feedback perceptible by the sense of touch, such as, e.g., vibration, thermal sensation, piercing or pressing sensation, etc.; and kinesthetic feedback, i.e., forces provided in degree(s) of freedom of motion of the insertion tool, or in other words force and/or torque feedback.
• a "haptic device" as intended herein generally denotes a device or apparatus configured to provide haptic feedback in response to appropriate commands.
  • a haptic device may be suitably controlled by a computing system integral or external thereto, programmed to instruct the haptic device with said appropriate commands.
• The term "visual" generally refers to anything perceptible by the sense of sight. Without limitation, the term "visual" may particularly encompass anything that a user can see in physical reality such as in a physical work space, any images displayed on a human readable display, as well as any images of physical objects and renditions of virtual elements comprised in virtual or augmented reality environments.
• The terms "augmented reality" and "mixed reality" are used interchangeably herein and generally denote any view of the physical world, such as of a physical work space, modified, supplemented and/or enhanced by virtual computer-generated imagery.
  • virtual elements are rendered and superimposed onto a backdrop image of the physical world to generate composite mixed reality images.
• The augmented reality environment may preferably be real-time and stereoscopic.
  • Virtual elements may be rendered using conventional real-time 3D graphics software, such as, e.g., OpenGL, Direct3D, etc.
• A user may be immersed in the augmented reality environment by means of an image generating system comprising a human readable display, such as a stereoscopic display, computer screen, head-mounted display, etc.
• While "virtual" is generally understood to denote anything that exists or results in essence or effect though not in actual tangible form, a virtual element as used herein may commonly denote a computer-implemented simulation or representation of some imaginary (e.g., computed) or physical thing.
• The terms "stereoscopy" or "stereoscopic" may be interchangeably used to denote any techniques and systems capable of recording three-dimensional visual information and/or creating the illusion of depth in a displayed image.
• The term "surrogate" carries its usual meaning, and may particularly denote a physical or virtual replacement or representation of an actual object or part thereof.
• a physical surrogate may be a model, replica or dummy representing the object or part thereof, the interior structure and optionally also exterior appearance of which are closely modelled on those of the object.
  • a virtual surrogate of an object may be a virtual rendition of the object or part thereof (e.g., a rendition of the exterior and/or interior of the object or part thereof) in the augmented reality environment.
  • the term "computing system” may preferably refer to a computer, particularly a digital computer. Substantially any computer may be configured to a functional arrangement suitable for performing in the systems, methods and related aspects disclosed herein.
• the hardware architecture of a computer may, depending on the required operations, typically comprise hardware components including one or more processors (CPU), a random-access memory (RAM), a read-only memory (ROM), an internal or external data storage medium (e.g., hard disk drive), one or more video capture boards, one or more graphic boards, such components suitably interconnected via a bus inside the computer.
• the computer may further comprise suitable interfaces for communicating with general-purpose external components such as a monitor, keyboard, mouse, network, etc. and with external components such as video cameras, displays, manipulators, etc.
  • suitable machine-executable instructions may be stored on an internal or external data storage medium and loaded into the memory of the computer on operation.
  • a data set comprising information on interior spatial structure of the object or part thereof may be suitably obtained by imaging the object using routine imaging or scanning techniques and apparatuses, such as inter alia magnetic resonance imaging (MRI), computed tomography (CT), X-ray imaging, ultrasonography, positron emission tomography (PET), etc.
  • object markers are employed as coordinate system reference points as explained elsewhere in this specification.
  • the object markers are also 'visible' by (i.e., detected or imaged by) the selected imaging or scanning technique.
  • the object markers as imaged by the imaging or scanning technique may be virtually rendered in the augmented reality environment, and matched onto the image of the physical object markers.
• Such imaging or scanning techniques and apparatuses and markers are commonly used in medical practice to obtain data sets comprising information on the anatomy of a patient or part thereof, as well documented in among others "Fundamentals of Medical Imaging" (Suetens P, 2nd ed., Cambridge University Press 2009, ISBN: 0521519152), "Medical Imaging Signals and Systems" (Prince JL & Links J, 1st ed., Prentice Hall 2005, ISBN: 0130653535), and "Digital Image Processing for Medical Applications" (Dougherty G, 1st ed., Cambridge University Press 2009, ISBN: 0521860857).
  • For mechanical objects, technical drawings, blueprints or atlases may also be considered for providing information on interior spatial structure of such objects or part thereof.
  • a data set comprising information on interior spatial structure of the object or part thereof obtained using one of the aforementioned techniques and apparatuses may be suitably reconstructed by any reconstruction algorithm known per se run on a suitable computing system, to generate an image of the interior spatial structure of the object or part thereof and to display said image on a human readable display device.
• The image may be provided as one or more 2-dimensional slice views through the object or its part, or as a volumetric, 3-dimensional image representation.
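• By way of a non-limiting sketch only, extracting such 2-dimensional slice views from a reconstructed volumetric data set may proceed along the following lines; the array layout, names and values below are illustrative assumptions rather than features of the disclosed methods:

```python
import numpy as np

# Hypothetical reconstructed volume: a 3-D array of scalar intensities,
# indexed here as volume[z, y, x] (the layout is an assumption of this sketch).
volume = np.random.rand(128, 256, 256)  # stand-in for CT/MRI voxel data

def axial_slice(vol: np.ndarray, z: int) -> np.ndarray:
    """Return the 2-D axial slice view at depth index z."""
    return vol[z, :, :]

def coronal_slice(vol: np.ndarray, y: int) -> np.ndarray:
    """Return the 2-D coronal slice view at row index y."""
    return vol[:, y, :]

# A volumetric (3-D) representation can instead be produced by passing the
# whole array to a volume renderer; here we only take a mid-depth slice view.
mid = axial_slice(volume, volume.shape[0] // 2)
print(mid.shape)  # (256, 256)
```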
• As illustrated in Figure 1, a user 1 is immersed in an augmented reality environment generated by the image generating system 2 comprising and operated by the computer 3.
• the image generating system 2 is essentially as described in detail in WO 2009/127701 and comprises stereoscopically arranged right-eye camera 4 and left-eye camera 5 configured to capture a 3-dimensional image of the physical work space in front of the cameras, a computer 3 configured to render a 3-dimensional image of the virtual space using conventional real-time 3D graphics software such as OpenGL or Direct3D, further to superimpose the 3-dimensional image of the virtual space onto the captured 3-dimensional image of the physical work space outputted by the cameras 4, 5, and to output the resulting composite 3-dimensional image to stereoscopically arranged right-eye and left-eye displays facing the respective eyes of the user.
  • the capture, processing and display are configured to proceed at a rate of at least 30 frames per second.
  • the image generating system 2 presents the user 1 with an augmented reality environment comprising displayed therein a 3-dimensional virtual rendition 6 of the interior structure of a portion of the abdominal cavity of a patient.
  • the user 1 operates a physical manipulator 7.
  • the position and orientation of the manipulator 7 in the physical work space and thus in the augmented reality environment can be determined from the image of the physical work space outputted by the cameras 4, 5 as described in detail in WO 2009/127701.
  • the manipulator comprises a recognition member 8 having an appearance which is recognisable in the image captured by the cameras 4, 5 by an image recognition algorithm.
• the recognition member 8 is configured such that its appearance in the image captured by the cameras 4, 5 is a function of its position and orientation relative to the said cameras 4, 5 (e.g., in a coordinate system originating at the cameras 4 or 5), which function, e.g., can be theoretically predicted or has been empirically determined.
  • the position and orientation of the recognition member 8 and of the manipulator 7 comprising the same relative to the cameras 4, 5 can be derived from the appearance of said recognition member 8 in an image captured by the cameras 4, 5.
  • the position and orientation of the recognition member 8 and of the manipulator 7 relative to the cameras 4, 5 can then be readily transformed to their position and orientation in the physical work space and augmented reality space, using coordinate system transformation methods known per se.
• the recognition member 8 may comprise one or more suitable graphical elements, such as one or more distinctive graphical markers or patterns. Any image recognition algorithm or software having the requisite functions is suitable for use herein; exemplary algorithms are discussed inter alia in PJ Besl and ND McKay, "A method for registration of 3-D shapes", IEEE Trans. Pattern Anal. Mach. Intell. 14(2): 239-256, 1992.
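• By way of a hedged illustration only, one conventional way of deriving position and orientation from the appearance of a planar recognition pattern with corners of known physical geometry is perspective-n-point estimation, e.g., via OpenCV's solvePnP; the marker size, detected pixel coordinates and camera matrix below are illustrative assumptions, and this is merely one possible realisation of such a recognition and pose estimation step:

```python
import numpy as np
import cv2  # OpenCV

# Known 3-D geometry of a planar recognition pattern (metres): an assumed
# 4 cm square marker centred at its own origin.
object_points = np.array([[-0.02, -0.02, 0.0],
                          [ 0.02, -0.02, 0.0],
                          [ 0.02,  0.02, 0.0],
                          [-0.02,  0.02, 0.0]], dtype=np.float64)

# Pixel coordinates of the same corners as detected in the captured image
# (hard-coded stand-ins; in practice they come from image recognition).
image_points = np.array([[310.0, 240.0],
                         [402.0, 238.0],
                         [405.0, 330.0],
                         [308.0, 333.0]], dtype=np.float64)

# Intrinsic camera matrix (focal length, principal point), assumed known
# from a prior calibration of camera 4 or 5.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix

# R and tvec give the pose of the recognition member in the camera's
# coordinate system, from which the manipulator pose can be derived.
print(ok, tvec.ravel())
```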
  • the position and orientation of the manipulator 7 may be determined by other means such as by being connected to an effector end of a 6-degree of freedom mechanical or electromechanical arm assembly capable of sensing and communicating its position and orientation; or by means of electromagnetic or ultrasonic transmitter-receiver devices communicating with the manipulator (e.g., as taught in US 2002/0075286 and US 2006/0256036).
• a virtual cursor 9 in the form of a pointer is superposed onto the image of the manipulator 7 in the augmented reality environment.
• Using the manipulator 7 and the virtual cursor 9 superposed thereon, the user 1 pinpoints the desired target area in the 3-dimensional virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient. Giving a command, the user 1 stores to the computer 3 the set of coordinates defining the target area in an appropriate coordinate system. Using the manipulator 7 or giving one or more commands, the user may control various attributes of the target area, such as its shape, dimensions, appearance, etc. Preferably, the computer 3 generates a virtual rendition of said target area which is displayed in the augmented reality environment using the image generating system 2.
  • the user 1 pinpoints a potential entry point or entry area on the surface of the patient as represented in the 3-dimensional virtual rendition 6, and giving a command he stores to the computer 3 the set of coordinates defining said entry point or entry area in the appropriate coordinate system.
  • the computer 3 generates a virtual rendition of said potential entry point or entry area which is displayed in the augmented reality environment using the image generating system 2.
  • the computer 3 then computes the set of coordinates in the appropriate coordinate system corresponding to a potential path of entry connecting said target area with said potential entry point or entry area, and generates a virtual rendition of said potential path of entry which is displayed in the augmented reality environment using the image generating system 2.
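• A minimal sketch of such a computation, assuming a frustoconical path of entry sampled as centre points and radii along the straight axis from the entry area to the target area (all names and dimensions below being illustrative assumptions, not the prescribed implementation):

```python
import numpy as np

def path_of_entry(entry: np.ndarray, target: np.ndarray,
                  r_entry: float, r_target: float, n: int = 50):
    """Sample a frustoconical path of entry as (centre point, radius) pairs
    along the straight axis from the entry area to the target area.

    entry, target : 3-D coordinates in the common coordinate system.
    r_entry, r_target : path radius at the entry and target ends; the path
    narrows down towards the target when r_target < r_entry.
    """
    ts = np.linspace(0.0, 1.0, n)
    centres = entry[None, :] + ts[:, None] * (target - entry)[None, :]
    radii = r_entry + ts * (r_target - r_entry)
    return centres, radii

entry = np.array([0.0, 0.0, 0.0])      # entry area on the object surface
target = np.array([0.02, 0.01, 0.08])  # target area inside the object
centres, radii = path_of_entry(entry, target, r_entry=0.015, r_target=0.003)
# Each (centre, radius) pair can be rendered as a disc; together they
# approximate the frustoconical tube displayed to the user.
```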
  • the user 1 can alter various attributes of the potential path of entry, such as its shape, dimensions (e.g., width or diameter), appearance, etc.
• If the user 1 approves of this potential path of entry (e.g., if it avoids obstacles and critical structures), he can give a command to store to the computer 3 the set of coordinates defining said path of entry and its attributes, to be used subsequently in planning, performing or training the insertion procedure.
• Otherwise, the above process can be repeated by deleting the current potential path of entry and pinpointing a new, alternative potential entry point or entry area.
  • the computer 3 may be programmed to simultaneously or sequentially propose and render one or more entry points or entry areas and the corresponding potential paths of entry, from which the user may choose by giving suitable commands.
  • Commands can be given by means of appliances such as a keyboard, mouse, joystick, voice recognition, etc.
• the image generating system 2 may be adapted to allow the user to translate, rotate and/or change the dimensions of (e.g., zoom in or out on) the 3-dimensional virtual rendition 6 by giving appropriate commands.
• the image generating system 2 may be adapted to allow changing the attributes of the potential path of entry, such that the interior structures enclosed thereby become better visually perceptible or otherwise 'stand out'. For example, by giving suitable commands the user 1 may change the brightness, contrast, colour, etc. of the interior structures enclosed by the potential path of entry, or may even crop the 3-dimensional virtual rendition 6, such that only the path of entry and structures enclosed thereby remain visible. This allows better inspection of the potential path of entry, for example to ensure that it does not collide with unwanted obstacles and/or structures.
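• Such cropping may, for instance, be sketched as a cylindrical mask over the volumetric data; the uniform voxel grid and the function below are illustrative assumptions only:

```python
import numpy as np

def crop_to_path(volume: np.ndarray, spacing: float,
                 entry: np.ndarray, target: np.ndarray,
                 radius: float) -> np.ndarray:
    """Return a copy of `volume` in which only voxels within `radius` of the
    straight axis from `entry` to `target` are kept; all others are zeroed.
    Voxel positions are taken as index * spacing (an assumed uniform grid,
    with entry/target given as (x, y, z) in the same units)."""
    zz, yy, xx = np.indices(volume.shape)
    pts = np.stack([xx, yy, zz], axis=-1) * spacing  # voxel centres (x, y, z)
    axis = target - entry
    # Parameter of the nearest point on the axis segment, clamped to [0, 1].
    t = np.clip(((pts - entry) @ axis) / (axis @ axis), 0.0, 1.0)
    nearest = entry + t[..., None] * axis
    dist = np.linalg.norm(pts - nearest, axis=-1)
    out = volume.copy()
    out[dist > radius] = 0.0  # hide everything outside the path of entry
    return out
```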
  • the set of coordinates and attributes defining the selected path of entry is registered to the object by matching the respective coordinate systems using conventional object markers as explained elsewhere in this specification.
  • An augmented reality environment comprising displayed therein the object and a virtual rendition of the registered path of entry is generated using an image generating system substantially identical to that employed in Figure 1 and explained above.
• A schematic view of such a stereoscopic (3-dimensional) augmented reality environment is shown in Figure 2.
• This comprises the image of the physical patient 18 on whom the insertion procedure is to be performed and superimposed thereon the virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient; the virtual rendition of the path of entry 10 having a substantially frustoconical shape, extending and narrowing down from the entry area 11 (where the path of entry 10 intersects with the surface of the patient) towards the target area 12 as well as protruding away from the entry area 11, i.e., out of the patient's body, and avoiding obstacles such as the blood vessels 16; the image of the insertion tool 13, herein a biopsy needle; and the image of a portion of an articulated arm 14 of a 6-degrees of freedom haptic device 17 (e.g., Phantom® Desktop™ from SensAble Technologies, Inc.) in communication with the insertion tool 13.
• The insertion tool 13 is secured to the tool holder portion 15 of said arm 14.
  • the set of coordinates and attributes defining the selected path of entry 10 are provided to the 6-degrees of freedom haptic device 17 together with commands to restrain (i.e., restrict, confine) the movements of the insertion tool 13 along the defined path of entry 10.
  • the position and orientation of the tool holder portion 15 of the haptic device 17 and thus of the insertion tool 13 relative to the coordinate system of the haptic device is readily available by querying the sensory information from the articulated arm 14 of the haptic device 17.
  • the position and orientation of the haptic device relative to the coordinate system of the image generating system 2 can be readily determined through the use of a calibration marker placed at a non-moving part (e.g., a base) of the haptic device 17.
• This allows the position and orientation of the insertion tool 13 to be transformed from the coordinate system of the haptic device 17 to the coordinate system of the augmented reality environment and vice versa. Transformations between various coordinate systems are a ubiquitous feature of virtual and augmented environment renderings, and are generally understood by the skilled person without requiring detailed account herein.
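• A minimal sketch of such a transformation chain, using homogeneous 4x4 matrices; the poses below are illustrative stand-ins, not calibration results:

```python
import numpy as np

def homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the tool holder in the haptic device's coordinate system, as
# queried from the articulated arm's sensors (illustrative values).
T_haptic_tool = homogeneous(np.eye(3), np.array([0.10, 0.00, 0.25]))

# Pose of the haptic device's base in the camera (augmented reality)
# coordinate system, obtained once via the calibration marker.
T_camera_haptic = homogeneous(np.eye(3), np.array([0.50, -0.20, 1.00]))

# Chaining the transforms expresses the insertion tool directly in the
# coordinate system of the augmented reality environment...
T_camera_tool = T_camera_haptic @ T_haptic_tool
# ...and inverting maps the other way (camera -> haptic device).
T_haptic_camera = np.linalg.inv(T_camera_haptic)

tool_tip_local = np.array([0.0, 0.0, 0.15, 1.0])  # tip in tool coordinates
tool_tip_ar = T_camera_tool @ tool_tip_local
print(tool_tip_ar[:3])
```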
• The movement restraints may be expressed as a data set comprising a collection of allowed versus disallowed positions and orientations; a standard collision detection algorithm is employed to check for imminent and/or occurring collisions between the actual position and orientation of the insertion tool 13 and the boundaries of the haptic path of entry, and the imminent and/or occurring collisions are signalled to the user through the haptic device 17.
• the haptic restraints are set to prevent the movement of the insertion tool 13 beyond the boundaries of the path of entry 10, i.e., to form a rigid virtual border by means of collision detection, and optionally to apply an increasing opposing force and/or torque when the insertion tool 13 starts approaching said boundaries (e.g., when it comes within a certain preset distance from the boundary).
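• By way of a non-limiting sketch, the increasing opposing force near the boundary of a cylindrical path of entry may be expressed as follows; the units (metres, newtons), stiffness gain and margin are illustrative assumptions:

```python
import numpy as np

def restraining_force(tip, entry, target, radius,
                      margin=0.005, k=200.0):
    """Virtual-border force for a cylindrical path of entry (a sketch).

    Well inside the path (farther than `margin` from its wall): zero force,
    so no haptic input is expected while the user stays on the path.
    Within `margin` of the wall: a spring force pushing back towards the
    axis, increasing as the wall is approached; at the wall itself the rigid
    virtual border of the collision detection takes over.
    """
    axis = target - entry
    t = np.clip(np.dot(tip - entry, axis) / np.dot(axis, axis), 0.0, 1.0)
    nearest = entry + t * axis          # closest point on the path axis
    radial = tip - nearest
    d = np.linalg.norm(radial)
    if d < radius - margin:             # well inside: no haptic input
        return np.zeros(3)
    penetration = d - (radius - margin)
    direction = -radial / d             # push back towards the path axis
    return k * penetration * direction

# Example: tool tip 1 mm from the wall of a 15 mm-radius path of entry.
f = restraining_force(np.array([0.014, 0.0, 0.02]),
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 0.0, 0.08]),
                      radius=0.015)
print(f)
```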
  • Figure 3 further illustrates a preferred embodiment for registering the path of entry or part thereof to the object.
  • a data set comprising information on the interior spatial structure of the object is obtained from a scanning apparatus.
  • Three markers are arranged to form an area shaped as a substantially scalene triangle.
  • the object markers are chosen such that they are easily distinguishable from the surroundings.
  • a filter is applied to identify the markers in the data set comprising information on the interior spatial structure of the object.
  • the position and orientation of the object markers are defined in a coordinate system related to the scanning apparatus.
  • Two cameras (camera 4 and camera 5) form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Each stereoscopic pair provides 3D coordinates for the object markers in a coordinate system related to the camera system.
  • a transformation is subsequently obtained between the coordinate system related to the camera system and the coordinate system related to the scanning apparatus.
  • the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the scanning apparatus into the coordinate system related to the camera system.
  • the common coordinate system is then used for registering the path of entry or part thereof to the object.
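• A transformation between two such coordinate systems can, for example, be recovered from three or more corresponding marker coordinates by the SVD-based Kabsch method, a least-squares rigid fit (rotation plus translation, no scaling); the sketch and marker coordinates below are illustrative assumptions, not the prescribed algorithm:

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t mapping the marker
    coordinates `src` (e.g., scanner coordinate system) onto `dst` (camera
    coordinate system), with no scaling. Both arrays are (N, 3) with N >= 3
    non-collinear markers, e.g., the substantially scalene triangle."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Illustrative marker coordinates forming a scalene triangle (metres).
scanner = np.array([[0.00, 0.00, 0.00],
                    [0.10, 0.00, 0.00],
                    [0.03, 0.07, 0.00]])
camera = scanner + np.array([0.5, 0.2, 1.0])   # translated stand-in
R, t = rigid_transform(scanner, camera)
# Any point p in scanner coordinates maps to R @ p + t in camera
# coordinates, registering the path of entry to the object.
```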
  • the stereoscopic pairs of the physical work space are combined with the data set of the virtual space, thereby generating an image in augmented reality space.
  • the present systems may commonly comprise managing means for managing information about the position, orientation and status of objects in the physical work space and managing information about the position, orientation and status of virtual visual and haptic objects in the virtual space.
  • the managing means may receive, calculate, store and update said information, and may communicate said information to other components of the system such as to allow for generating the images of the physical work space, virtual space, composite images combining such to provide the augmented reality environments, and to allow for setting and tracking the requisite haptic constraints.
  • the managing means may be configured to receive, to process and to output data and information in a streaming fashion.
  • the processes involved in the operation of the present systems may be advantageously executed by a data processing (computing) apparatus, such as one or more computers.
  • Said computers may perform the functions of managing means of the systems.
  • the object of the present invention may also be achieved by supplying a system or an apparatus with a storage medium which stores program code of software that realises the functions of the above-described embodiments, and causing a computer (or CPU or MPU) of the system or apparatus to read out and execute the program code stored in the storage medium.
• the program code itself read out from the storage medium realises the functions of the embodiments described above, so that the storage medium storing the program code and the program code per se also constitute the present invention.
• the storage medium for supplying the program code may be selected, for example, from a floppy disk, hard disk, optical disk, magneto-optical disk, CD-ROM, CD-R, magnetic tape, non-volatile memory card, ROM, DVD-ROM, Blu-ray disc, solid state disk, and network attached storage (NAS).
  • the program code read out from the storage medium may be written into a memory provided in an expanded board inserted in the computer, or an expanded unit connected to the computer, and a CPU or the like provided in the expanded board or expanded unit may actually perform a part or all of the operations according to the instructions of the program code, so as to accomplish the functions of the embodiment described above.

Abstract

The invention relates to insertion procedures, particularly high-precision insertion procedures, for contacting a target area of an object by an insertion tool along a path of entry, and more particularly to methods for planning, training or performing such insertion procedures, as well as to systems, computer programs and computer program products configured for use in such insertion procedures. A user is immersed in an augmented reality environment comprising displayed therein the object of the insertion procedure or its physical or virtual surrogate and superimposed thereon a virtual rendition of the path of entry or part thereof. Together with this visual guidance the invention also encompasses haptic guidance which controls any unwanted deviations from the path of entry.

Description

INSERTION PROCEDURES IN AUGMENTED REALITY
FIELD OF THE INVENTION
The invention relates to insertion procedures, particularly high-precision insertion procedures, for contacting a target area of an object, and more particularly to methods for planning, training or performing such insertion procedures, as well as to systems, computer programs and computer program products configured for use in such insertion procedures.
BACKGROUND OF THE INVENTION
During high-precision insertion procedures, contact is made between an insertion tool and a target area of an object, typically a small region within the interior of the object, while at the same time avoiding contact between the insertion tool and obstacles and/or select structures within the object. Illustrative examples of insertion procedures include the insertion of a biopsy needle to extract a diagnostic sample from the target area within the body of a patient, or the insertion of a delivery needle for local delivery of a medicament, e.g., placement of a radiation source such as radioactive seeds in the target area during brachytherapy for treatment of tumours or cancer.
During insertion the insertion tool such as a needle needs to avoid certain obstacles and/or select structures within the object, such as for example bones, vital organs, blood vessels, nerves and the like. An optimal entry strategy for needle insertion needs to be defined during a pre-procedure planning phase. In addition, precautions need to be taken to ensure that the pre-defined entry strategy is subsequently followed through during the procedure.
Currently, tedious, inefficient and costly techniques are frequently used for performing insertion procedures, whereby numerous images of the object are required before as well as during the procedure. Using an example of a medical insertion procedure, to start, a three-dimensional (3D) scan is made of the body of a patient or of an area of interest of said body, using a suitable medical imaging technique such as for example computed tomography (CT), providing spatial data on the anatomy of the body or area of interest. The data is converted into an image of the underlying anatomy and the information gathered from the image allows a medical practitioner to define an optimal path of entry. The optimal path of entry can usually constitute a straight line connecting a target area within the body with an entry point on the surface of the body, but avoiding any obstacles and/or vital structures within. Next, a needle is inserted into the patient's body at the entry point and along the optimal path of entry, albeit only partially. A new scan is made with the needle partially inserted (and visible in the resulting image), to ensure the conformity of the actual position and depth of the needle with the pre-defined optimal path of entry, and to re-adjust where required. This sequence of partially inserting the needle, imaging and re-adjusting is repeated several times until the needle reaches the desired position and depth and contacts the target area.
There thus exists a continuing need for methods and systems which improve the efficiency, user-friendliness and precision of insertion procedures.
EP 1 095 628 describes a method for planning needle surgery wherein a virtual entry trajectory is defined and displayed in a volumetric image of a patient's anatomy obtained by a scanning technique. A medical practitioner aligns a virtual surgical instrument visualised in the same image, and representing a physical surgical instrument controlled by a mechanical arm assembly, along the virtual entry trajectory. Once the desired alignment is achieved, the mechanical arm assembly is locked in its present position and orientation, and the practitioner then advances the surgical instrument into the patient's body. However, in EP 1 095 628 the planning of the entry trajectory and the alignment of the surgical instrument therewith are performed in virtual space, whereas the actual insertion of the surgical instrument occurs wholly in physical space. Hence, the visual information about the entry trajectory and optionally the patient's anatomy is only available during the pre-operation phase but not during the very crucial operation phase. Consequently, once the mechanical arm assembly locks the surgical instrument along the chosen optimal entry path, the advancement of the surgical instrument cannot be modified or corrected by the practitioner.
WO 2007/136771 discloses a method and apparatus to impose haptic constraints on movements of a medical practitioner to ensure a desired position, orientation, velocity and/or acceleration of a surgical tool operated by the latter, such as to maintain the surgical tool within a predefined virtual boundary registered to the anatomy of the patient. In operation, when the practitioner moves the surgical tool in a manner that violates the desired relationship relative to the patient's anatomy, haptic (e.g., tactile and/or force) feedback is provided to the practitioner. However, when experiencing a haptic stimulus in this system, the practitioner may be in doubt whether this is due to the distinct properties (e.g., hardness) of the tissues or organs contacted by the surgical instrument or due to the haptic guidance indicating a potential deviation from the intended procedure. Moreover, while signalling deviations from the intended procedure and allowing them to be corrected once they happen, the system does not help to minimise the initial occurrence of such deviations.
The present invention aims to provide methods and systems for planning, training and/or performing insertion procedures which address the shortcomings existing in the art, and particularly such methods and systems that ensure more pronounced and intuitive compliance with a pre-determined optimal path of entry during insertion procedures, while providing for flexibility in slightly modifying the path of entry in a well-informed manner.
SUMMARY OF THE INVENTION
An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to the object;
(b) configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof as registered in (a);
(c) providing an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof as registered in (a).
A further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps: (a) registering the path of entry or part thereof to the object;
(b) configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof as registered in (a);
(c) providing an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof as registered in (a);
(d) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (d) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed. Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to a physical or virtual surrogate of the object;
(b) configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof as registered in (a);
(c) providing an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof as registered in (a);
(d) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (d) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
Hence, any one of the methods may comprise the step:
- defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
Or any one of the methods may comprise the steps:
- obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- optionally, generating from said data set an image of the interior spatial structure of the object or part thereof and displaying said image; and
- defining in said data set or image the target area and the path of entry to said target area. Any one or more such preparatory steps may be carried out by one or more persons who may be the same as or distinct from the user planning, performing and/or training the insertion procedure (e.g., persons at diverging geographical locations).
A further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
(a) the insertion tool configured to be manipulated by a user;
(b) a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof; and
(c) an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
(a) the insertion tool configured to be manipulated by a user;
(b) a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof; and
(c) an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object. Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of:
- means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- means for (e.g., a computing system programmed for) generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof and preferably means for (e.g., human readable display for) displaying said image;
- means for (e.g., a computing system programmed for) or allowing a user to (e.g., an image generating system) define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
In the present systems and methods, the haptic device in communication with the insertion tool may be controlled (i.e., operated or instructed) by a computing system programmed to configure said haptic device to control deviation of the insertion tool from the path of entry or part thereof.
Further in the present systems and methods, the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof.
Hence, the haptic device and the image generating system may be controlled by same or separate computing system(s) programmed accordingly, wherein such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
This or additional computing system(s) may be provided programmed for any one, more or all of:
- registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object; - generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof;
- defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
Accordingly, also disclosed is a computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
- configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof; and
- configuring an image generating system to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
Accordingly, also disclosed is a computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
- configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof; and - configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof.
Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
- generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof;
- defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
As well disclosed is a computing system (e.g., a digital computer) programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
The methods, systems and related aspects as disclosed herein can immerse the user in an augmented reality environment comprising displayed therein the object of the insertion procedure or its physical or virtual surrogate and superimposed thereon a virtual rendition of a predetermined path of entry or part thereof guiding to a target area within the object. In the augmented reality environment the user can observe the physical insertion tool (or where appropriate (also) a virtual cursor representing and superimposed on the tool) and its spatial relation with the virtual rendition of the predefined optimal path of entry, relying on this intuitive and highly informative visual guidance to ensure compliance with the path of entry. Synergistically with the visual guidance the present methods and systems also encompass haptic guidance which further controls any unwanted deviations from the path of entry. For example, whereas visual guidance may be constant and immediate, if a deviation does occur the corrective reactions of the user may tend to overcompensate and cause an undesirably large movement in the opposite direction. Inter alia in such situations the haptic guidance will contain the movement and will assist to ensure compliance with the path of entry. Moreover, given the concurrent receipt of visual and haptic guidance, when experiencing force or tactile input during the procedure, the user can instantly determine whether this may be due to encountering obstacles and/or structures within the object, since the present methods and systems may be configured such that insofar as the user remains within the visually indicated path of entry no haptic input is expected. This can provide the user with additional freedom during the procedure and/or prevent potentially detrimental disruptions of so-encountered obstacles and/or structures.
The insertion procedure may be surgical, particularly a surgical insertion procedure performed on an object which is a living human or animal body, more particularly for the purposes of treatment and/or diagnosis. Examples of such surgical procedures may include without limitation minimally invasive surgery (MIS) procedures, percutaneous interventions, laparoscopic or endoscopic interventions, arthroscopy, biopsy, and brachytherapy particularly for treatment of tumours or cancer. Preferred may be surgical procedures involving the insertion of a needle, e.g. a delivery (precision placement) needle or biopsy needle. Alternative insertion tools include instruments configured for placement of screws such as pedicle screws, borescopes, etc.
In an alternative, the insertion procedure may be non-surgical. Examples of such nonsurgical procedures may include without limitation visual inspection, reparation and/or dismantling of non-living objects, such as for example mechanical, electromechanical or electronic apparatuses, devices or appliances or part thereof, such as for example engines (e.g., aircraft engine, diesel engine, electrical engine, etc.), turbines (e.g., gas turbine, steam turbine, etc.), nuclear reactors, firearms or explosive devices. Where in the remainder of the specification the present methods and systems are described or illustrated in connection to surgical procedures, it shall be appreciated that same or analogous principles also apply to and are hereby expressly disclosed for non-surgical procedures.
Preferably, the object may be non-transparent (including at least partly or largely non-transparent objects). Whereas the term "non-transparent" as used herein carries its common meaning, it particularly refers to objects whose non-transparent nature means that the target area and/or the path of entry, or at least a portion thereof, cannot be viewed or seen by the naked eye. The present methods, systems and related aspects may be particularly informative and necessary for insertion procedures on non-transparent objects, although the scope also encompasses situations in which the object may be transparent.
In preferred insertion procedures the object may be a living human or animal body or a part thereof, such as a body portion, organ, tissue or cell grouping that may be integral to or separated from the human or animal body.
The present methods, systems and related aspects provide for visual and haptic guidance aimed to ensure compliance with (i.e., observance of) the desired path of entry.
The path of entry connects an entry point or entry area on the surface of an object with the target area of the object. The path of entry may be extrapolated to protrude to a certain extent (e.g., between >0 cm and about 20 or about 10 cm) beyond the entry point or entry area and away from the surface of the object, such as to allow for visual and haptic guidance even before the insertion tool contacts the object or the physical or virtual surrogate thereof. The path of entry may be without limitation defined as or represented by any one or more of:
- a one-dimensional object, such as preferably a line, more preferably straight line;
- a two-dimensional object, such as a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.;
- a three-dimensional object, such as preferably a generally cylindrical object (encompassing not only circular cylinders, which may be particularly preferred, but also shapes that can be generally approximated by a geometrical cylinder or a derivative thereof, e.g., frustoconical shape, a barrel shape, oblate or partially flattened cylinder shape, curved cylinder shape, cylindrical shapes with varying cross-sectional areas such as hourglass shape, bullet shape, etc.), a conical object, a funnel-shaped object, etc. Such objects may be particularly tube-shaped, i.e., devoid of one or more caps. Paths of entry defined as or represented by such two- or three-dimensional objects may also be deemed as collections or sets of a plurality of possible straight-line insertion approaches encompassed within such objects. The width or diameter of such two- or three-dimensional objects can be adjusted to provide the user with more flexibility to choose the particular insertion trajectory (typically a straight-line trajectory), while still avoiding any obstacles and/or vital structures during the insertion procedure. Where applicable, the width or diameter of such two- or three-dimensional objects may be constant or variable along the path of entry, indicative of the user's freedom to modify the movement, e.g., depending on the proximity of obstacles and/or vital structures.
Advantageously, the target area may also be defined as or represented by an object, such as a null-dimensional (e.g., a point), one-dimensional (e.g., a line), two-dimensional (e.g., a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.), or three-dimensional object (e.g., a cube, sphere, etc.) or a combination thereof (e.g., two spheres with different radii to indicate not only the target area, but also an enveloping area so that haptic and/or visual feedback can be provided when the physician almost reaches the target area).
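By way of a non-limiting sketch, the two-sphere representation of the target area may be evaluated against the tool tip position as follows; the radii and names below are illustrative assumptions:

```python
import numpy as np

def target_zone(tip, centre, r_target=0.003, r_envelope=0.010):
    """Classify the tool tip against a target represented as two concentric
    spheres: an inner target sphere and an enveloping sphere used to trigger
    haptic and/or visual feedback when the target is almost reached.
    Radii in metres are illustrative assumptions of this sketch."""
    d = np.linalg.norm(tip - centre)
    if d <= r_target:
        return "target contacted"
    if d <= r_envelope:
        return "approaching target"  # e.g., soften force, highlight the zone
    return "outside envelope"

print(target_zone(np.array([0.0, 0.0, 0.075]), np.array([0.0, 0.0, 0.08])))
```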
To register a path of entry as defined in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set, to the object or to the physical or virtual surrogate thereof, the present methods, systems and related aspects may employ suitable object markers perceivable in both the physical work space and thus in the augmented reality environment and in the data set or image of the interior spatial structure of the object or part thereof.
Typically but without limitation, one, two or preferably three or more object markers are secured to the object or part thereof, and the object or part thereof is then subjected to a technique capable of generating the data set and optionally image of the interior structure of the object or part thereof. The object markers are chosen to be detectable by said technique, whereby the technique generates representations of the object markers in the data set or image of the interior structure of the object or part thereof. The markers may be preferably attached to areas of the object substantially not susceptible to relative movement, e.g., to areas not joined by bendable elements such as joints. The position and orientation of the path of entry, i.e., a set of coordinates defining the path of entry, is then determined relative to the sets of coordinates defining the representations of the object markers in the data set or image of the interior structure of the object or part thereof. Subsequently, the data set or image of the interior structure of the object or part thereof (or at least relevant elements thereof including the representations of the object markers and the path of entry) is matched back onto the object such that the representations of the object markers align or overlay with the object markers secured to the object, whereby the set of coordinates defining the path of entry is now determined relative to the actual object markers secured to the object, i.e., whereby the path of entry becomes registered to the object. Hence, the object markers may be deemed as a tool for transforming the set of coordinates defining the path of entry in the coordinate system of the data set or image of the interior structure of the object or part thereof to a coordinate system employed in the augmented reality environment.
Without limitation, gold markers may be used in conjunction with imaging techniques such as MRI and CT scans.
One, two or preferably three or more object markers may be secured to the object. The object markers may be complex, whereby the object marker may comprise a recognition pattern, for example a black and white recognition pattern, or the object markers may be simple, whereby the object marker might not comprise a recognition pattern. For complex object markers of known size, one object marker may already be sufficient to determine the position and orientation of the object marker, which subsequently may be sufficient to derive the position and orientation of the object. A higher accuracy may be obtained, especially for estimating the orientation of the object, by using relatively large object markers. For simple object markers (e.g. not-patterned), multiple markers may be required to reach the desired accuracy. In an embodiment, at least three markers are spaced apart, which markers define an area. Preferably, the entry point for the insertion procedure lies within the area defined by at least three markers. Preferably, three markers are arranged to form an area shaped as a substantially scalene triangle. Said scalene triangle may uniquely define the position and orientation of the rigid body to which the markers are secured.

In a data set or image comprising information on the interior spatial structure of the object or part thereof obtained using routine imaging or scanning techniques and apparatuses, the object markers may be identified in this data set or image directly. Preferably, the object markers are chosen such that they are easily distinguishable from the surroundings, so a filter may be applied to identify the markers in the data set or image comprising information on the interior spatial structure of the object or part thereof. The position and orientation of the object markers may be defined in a coordinate system related to the routine imaging or scanning apparatus. When three markers are arranged to form a substantially scalene triangle, the markers may be uniquely identified.

In an embodiment, two cameras form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Given certain parameters of the camera system (e.g. focal length), triangulation of each stereoscopic pair may provide 3D coordinates for the object markers in a coordinate system related to the camera system. Any coordinate system may function as the common coordinate system for registering the path of entry or part thereof to the object. Preferably, the coordinate system related to the camera system is used as the common coordinate system for registering the path of entry or part thereof to the object. The position and orientation of the object markers may then be defined in a coordinate system related to the camera system. A transformation may subsequently be obtained between the coordinate system related to the camera system and the coordinate system related to the routine imaging or scanning apparatus. This transformation will usually consist of a translation and a rotation, whereby the translation may be rigid, and whereby no scaling may be required since the data sets or images are of the same object. An example of an algorithm providing such a transformation is the Iterative Closest Point algorithm (and extensions thereof, such as Milella, A.; Siegwart, "Stereo-Based Ego-Motion Estimation Using Pixel Tracking and Iterative Closest Point", in proceedings of the "IEEE International Conference on Computer Vision Systems 2006", pp. 21, January 2006, ISBN: 0-7695-2506-7).
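By way of a hedged illustration of the triangulation step, and under the simplifying assumption of a rectified camera system with known focal length and baseline (all values below being illustrative), the 3D coordinates of an object marker may be recovered from one stereoscopic pair as follows:

```python
import numpy as np

def triangulate_rectified(xl, yl, xr, f, baseline):
    """Triangulate a marker from one stereoscopic pair of a rectified camera
    system (a sketch; assumes pixel coordinates already centred on the
    principal point, focal length f in pixels, baseline in metres).

    For rectified cameras the disparity d = xl - xr gives depth Z = f*B/d,
    from which X and Y follow by similar triangles."""
    d = xl - xr
    Z = f * baseline / d
    X = xl * Z / f
    Y = yl * Z / f
    return np.array([X, Y, Z])

# One object marker seen at (assumed) pixel positions in the left and right
# images of the two cameras of the camera system:
p = triangulate_rectified(xl=42.0, yl=-10.0, xr=30.0, f=800.0, baseline=0.06)
print(p)  # 3D coordinates in the coordinate system related to the cameras
```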
An explicit method to obtain such a transformation may also be used, e.g., when three markers are used. Based on the transformation between the coordinate system related to the camera system and the coordinate system related to the routine imaging or scanning apparatus, the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the routine imaging or scanning apparatus into the coordinate system related to the camera system. In this way, the data sets or images in the virtual space and the physical work space may be defined in the same coordinate space and may be registered to one another by co-locating or superimposing the detected object markers, thereby generating an image in augmented reality space. If only one camera is used to capture the view of the physical object of interest, and consequently no set of stereoscopic pairs can be constructed, the depth of the object cannot always be uniquely determined. To solve this issue when only one camera is present to capture the view of the physical object of interest, either complex object markers or more than three object markers may be required.
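For the explicit method mentioned above, one conventional option is the least-squares (Kabsch/SVD) estimation of a rigid rotation and translation from the three or more marker correspondences. The following sketch, in Python with NumPy, is illustrative only and is not put forward as the specific implementation of the invention.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transformation with Q ~= R @ P + t.

    P, Q: (3, N) arrays of corresponding marker coordinates, e.g. the three
    scalene-triangle markers in the scanner frame (P) and in the camera
    frame (Q).  No scaling is estimated, since both data sets image the
    same physical object.
    """
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    H = (P - p_mean) @ (Q - q_mean).T          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

The resulting R and t may then be applied to every voxel or vertex of the scanner data set to express it in the camera coordinate system.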
Using a similar approach, the position of the haptic device may be determined and registered into the coordinate system related to the camera system. Object markers may be attached to the haptic device. A limited set of complex object markers may be used, or a larger set of simple object markers may be used. The object markers may be used in a similar fashion to determine the transformation from the coordinate system related to the haptic device to the coordinate system related to the camera system, as previously described for registering the path of entry or part thereof to the object.
The present methods, systems and related aspects also envisage that the registration of the path of entry to the object may be dynamic or real-time, e.g., may be updated continuously, upon the user's or system's command or at preset intervals, such that the accuracy of the alignment between the object markers and their representations is examined and if disturbed (e.g., beyond a preset, acceptable threshold) the registration process is at least partly repeated to adjust the position and orientation of the path of entry to the new situation. Such dynamic or real-time registration process may suitably take into account changes in the path of entry due to deformations and/or movements of the object, such as due to breathing or other involuntary movements of patients.
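A minimal sketch of such a threshold-based trigger, assuming the marker positions predicted by the last registration are available in the camera frame, might read as follows; the 2 mm threshold is an arbitrary illustrative value.

```python
import numpy as np

def registration_disturbed(markers_detected, markers_predicted, threshold=2.0):
    """Return True if the mean distance (here in mm) between the currently
    detected object markers and the positions predicted by the current
    registration exceeds a preset, acceptable threshold, in which case the
    registration process is at least partly repeated.
    """
    errors = np.linalg.norm(markers_detected - markers_predicted, axis=1)
    return errors.mean() > threshold
```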
It shall be appreciated that the above principles may be analogously applied to register, preferably dynamically register, the target area to the object or to the physical or virtual surrogate thereof.
The present methods, systems and related aspects provide for visual guidance, particularly in the form of a virtual rendition (i.e., rendition as a visible virtual element) of the path of entry or part thereof superimposed on the image of the object or the physical or virtual surrogate thereof in an augmented reality environment.
The path of entry may be suitably rendered in the augmented reality environment as the corresponding object, e.g., one-, two- or three-dimensional object, particularly preferably as a straight line or a cylindrical or funnel-shaped tube. The appearance of the virtual rendition of the path of entry (e.g., colour, texture, brightness, contrast, etc.) may be suitably chosen to prompt the user to appreciate relevant properties of the path of entry, such as its position, orientation, shape, size and particularly importantly the bounds or limits of the path of entry. Also, the relative transparency of the virtual rendition of the path of entry may be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof.
Furthermore, the appearance of the virtual rendition of the path of entry may be configured to vary in the course of the insertion procedure, such as, e.g., in function of compliance with or an imminent or occurred deviation from the path of entry, in function of the activation of haptic guidance and optionally the magnitude of the so-applied haptic guidance, etc.
Where only a part of the path of entry is rendered, this may advantageously be a portion adjacent to the surface of the object, and optionally and preferably a portion protruding away from the surface of the object, such as to correctly prime the insertion procedure.
It shall be appreciated that the above principles may be analogously applied to virtual rendition of the target area in the augmented reality environment. Visibly rendering the target area may provide the user with more intuitive and thorough understanding of inter alia the required depth and level of precision of the insertion procedure. Advantageously, the appearance of the virtual rendition of the target area may be configured to vary in the course of the insertion procedure, such as, e.g., in function of the distance of the insertion tool from the target area.
In the present methods, systems and related aspects the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool). Preferably, the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object). The relative transparency of the virtual rendition of the cursor can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, and the virtual cursor. Advantageously, the appearance of the virtual rendition of the cursor may be configured to vary in the course of the insertion procedure, such as, e.g., in function of compliance with or an imminent or occurred deviation from the path of entry, in function of the activation of haptic guidance and optionally the magnitude of the so-applied haptic guidance, or in function of the distance of the insertion tool from the target area, etc. Visibly rendering the virtual cursor can provide the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry, thereby allowing for much better informed insertion procedures, and allowing for the synergistic cross-talk between visual and haptic cues throughout substantially the entire procedure.
In the present methods, systems and related aspects the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) a virtual rendition of the interior spatial structure of the object or part thereof generated from the data set comprising information on said interior spatial structure of the object or part thereof and superimposed on the image of the object or the physical or virtual surrogate thereof. This can aid the user in verifying the adequacy of the selected entry path, and may also allow the user to correlate any force or tactile input felt during the insertion procedure with obstacles and/or structures encountered by the insertion tool and visible in the virtual rendition of the interior spatial structure of the object, thereby allowing for even more informed and precise planning, performing or training of insertion procedures. The relative transparency of the virtual rendition of the interior spatial structure of the object can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual cursor, and the virtual rendition of the interior spatial structure of the object or part thereof.
In the present methods, systems and related aspects, haptic guidance particularly aims to ensure compliance with (i.e., observance of) a preset, desired spatial and/or kinetic relation of the insertion tool relative to the body or the physical or virtual surrogate thereof, said preset, desired spatial and/or kinetic relation being compatible with or conducive to maintaining the path of entry. Particularly, the haptic device is configured to control deviation (i.e., departure, divergence or discrepancy) of the insertion tool from the path of entry; more particularly, the haptic device may be configured to control deviation of the actual position, orientation, velocity and/or acceleration (preferably at least position and/or orientation) of the insertion tool from a preset, desired position, orientation, velocity and/or acceleration of the insertion tool which is compatible with or conducive to maintaining the path of entry.
Controlling deviation of the insertion tool from the path of entry by the haptic device may entail one or more measures generally aimed at one or more or all of: reducing or preventing the occurrence of a deviation; minimising the extent or size of an occurred deviation; correcting an occurred deviation. Such measures are realised through the haptic device being configured to impose haptic feedback on the insertion tool, to be perceived by a user manipulating the insertion tool. Preferably, the haptic device may be configured to impose on the insertion tool haptic feedback comprising any one or more or all of tactile (e.g., vibration), force and/or torque feedback.
Without limitation, haptic feedback may be perceived by the user as comprising any one or more or all of:
- pull (i.e., a drag or tow) driving the insertion tool towards, to alignment with and/or along the path of entry;
- tactile sensation (e.g., vibration) indicative of compliance with or deviation from the path of entry;
- resistance to (i.e., opposition to) movement of the insertion tool towards the bounds or limits of the path of entry;
- resistance to (i.e., opposition to) or prohibition of (i.e., ban on or prevention of) movement of the insertion tool beyond the bounds or limits of the path of entry;
- resistance to (i.e., opposition to) or prohibition of (i.e., ban on or prevention of) movement of the insertion tool having velocity and/or acceleration exceeding a preset value.
Suitably, resistance to a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or rotation of said movement, which nevertheless allows the movement to occur and/or proceed. The magnitude of such counter-force and/or counter-torque may, without limitation, be constant or may be proportional to (e.g., linearly or non-linearly, such as exponentially, proportional to) the magnitude of said movement. Suitably, prohibition of a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or rotation of said movement, which is such that it blocks or prevents said movement from occurring and/or proceeding further.
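By way of a non-limiting sketch, a counter-force proportional to the lateral deviation of the tool tip from a straight-line path of entry, stiffened beyond the path boundary, could be computed as follows; the linear spring model, the spring constant and the stiffening factor are all illustrative assumptions rather than the invention's own control law.

```python
import numpy as np

def guidance_force(tool_tip, entry, target, radius, k=50.0):
    """Counter-force (N) pulling the tool tip back toward the path axis.

    tool_tip, entry, target: 3D points (metres) as NumPy arrays;
    radius: path radius; k: illustrative spring constant (N/m).
    Non-linear (e.g. exponential) scaling, or a rigid prohibition at the
    boundary, are equally possible variants.
    """
    axis = target - entry
    axis = axis / np.linalg.norm(axis)
    v = tool_tip - entry
    lateral = v - np.dot(v, axis) * axis     # component off the path axis
    dist = np.linalg.norm(lateral)
    if dist < 1e-9:
        return np.zeros(3)                   # on the axis: no correction
    force = -k * lateral                     # spring pull toward the axis
    if dist > radius:
        force *= 10.0                        # much stiffer 'wall' beyond it
    return force
```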
Haptic devices and haptic rendering in virtual reality solutions are known per se and can be suitably integrated with the present system (see, inter alia, McLaughlin et al., "Touch in Virtual Environments: Haptics and the Design of Interactive Systems", 1st ed., Pearson Education 2001, ISBN 0130650978; M. Grunwald, ed., "Human Haptic Perception: Basics and Applications", 1st ed., Birkhäuser Basel 2008, ISBN 3764376112; Lin & Otaduy, eds., "Haptic Rendering: Foundations, Algorithms and Applications", A K Peters 2008, ISBN 1568813325; WO 2007/136771). Preferably, the haptic device is a 6-degrees of freedom haptic device.
The present methods, systems and related aspects may preferably allow for a stereoscopic (i.e., three-dimensional, 3D) view of the augmented reality environment. This may particularly entail the stereoscopic view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.). Such stereoscopic view allows a user to perceive the depth of the viewed scene, ensures a more realistic experience and thus helps the user to more intuitively and accurately plan, perform or train any insertion procedures.
Means and processes for capturing stereoscopic images of physical work space, rendering stereoscopic images of virtual space, combining said images to produce composite stereoscopic images of the mixed, i.e., augmented reality space, and stereoscopically displaying the resulting images thereby providing the user with an augmented reality environment, are known per se and may be applied herein (see inter alia Judge, "Stereoscopic Photography", Ghose Press 2008, ISBN: 1443731366; Girling, "Stereoscopic Drawing: A Theory of 3-D Vision and its application to Stereoscopic Drawing", 1st ed., Reel Three-D Enterprises 1990, ISBN: 0951602802).
The present methods, systems and related aspects may also preferably allow for real-time view of the augmented reality environment. This may particularly include real-time view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.). Such real-time view allows a user to perceive without substantial delay any alterations in the position, orientation, shape or other properties of physical and/or virtual elements comprised in the viewed scene, which ensures a more realistic experience and helps the user to more intuitively and accurately plan, perform or train any insertion procedures. To achieve real-time view of the augmented reality environment, the image of the physical space may be preferably captured at a rate of at least about 30 frames per second, more preferably at a rate of at least about 60 frames per second. Moreover, display means providing the view of the augmented reality environment may have a refresh rate of at least about 30 frames per second, more preferably of at least about 60 frames per second.
Stereoscopic real-time view of the augmented reality environment may be particularly preferred.
In the present methods, systems and related aspects any image generating system capable of generating an augmented reality environment may be employed. Without limitation, such image generating system may comprise image pickup means for capturing an image of a physical work space, virtual space image generating means for generating an image of a virtual space comprising desired virtual elements (such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, optionally the virtual cursor representing the insertion tool, etc.), composite image generating means for generating a composite image by synthesising the image of the virtual space generated by the virtual space image generating means and the image of the physical work space outputted by the image pickup means, and display means for displaying the composite image generated by the composite image generating means. Such image generating systems are known per se, and are described inter alia in WO 2009/127701, US 2002/0075286 and US 2006/0256036, incorporated by reference herein in their entirety.
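The composite image generating step may, for instance, amount to per-pixel alpha blending of the rendered virtual space over the captured image of the physical work space; the sketch below assumes 8-bit camera frames and a virtual rendering with an alpha channel in [0, 1], which is an illustrative convention rather than a requirement.

```python
import numpy as np

def composite(camera_rgb, virtual_rgba):
    """Blend an H x W x 4 virtual rendering over an H x W x 3 camera frame.

    The alpha channel realises the relative transparency of the virtual
    elements, so that the user simultaneously perceives the physical work
    space and the superimposed virtual renditions.
    """
    cam = camera_rgb.astype(np.float32)
    vir = virtual_rgba[..., :3].astype(np.float32)
    alpha = virtual_rgba[..., 3:4].astype(np.float32)
    return (alpha * vir + (1.0 - alpha) * cam).astype(np.uint8)
```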
As explained above, a particularly advantageous feature taught herein entails displaying in the augmented reality environment a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool), which provides the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry. Preferably, the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object). This teaching is realised more generally through the following aspects of the invention, to which the aforementioned embodiments and specific features apply analogously and are hereby expressly disclosed.
An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to the object;
(b) providing an augmented reality environment comprising displayed therein the object, a virtual rendition of the path of entry or part thereof as registered in (a) and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
A further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to the object;
(b) providing an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof as registered in (a) and a virtual rendition of a cursor representing and superimposed on the physical insertion tool;
(c) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (c) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to a physical or virtual surrogate of the object;
(b) providing an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof as registered in (a) and a virtual rendition of a cursor representing and superimposed on the physical insertion tool;
(c) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (c) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
Hence, any one of the methods may comprise the step:
- defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
Or any one of the methods may comprise the steps:
- obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- optionally, generating from said data set an image of the interior spatial structure of the object or part thereof and displaying said image; and
- defining in said data set or image the target area and the path of entry to said target area.
A further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
(a) the insertion tool configured to be manipulated by a user;
(b) an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
(a) the insertion tool configured to be manipulated by a user;
(b) an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of:
- means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- means for (e.g., a computing system programmed for) generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof and preferably means for (e.g., human readable display for) displaying said image;
- means for (e.g., a computing system programmed for) or allowing a user to (e.g., an image generating system) define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
In the present systems and methods, the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
This or additional computing system(s) may be provided programmed for any one, more or all of:
- registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object;
- generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof;
- defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
Accordingly, also disclosed is a computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
- configuring an image generating system to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Accordingly, also disclosed is a computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
- configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
- generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof;
- defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
Also disclosed is a computing system (e.g., a digital computer) programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
The above and additional aspects, preferred embodiments and features of the invention are described in the following sections and in the appended claims. Each aspect or embodiment described herein may be combined with any other aspect(s) or embodiment(s) unless clearly indicated to the contrary. In particular, any feature indicated as being preferred or advantageous may be combined with any other feature or features indicated as being preferred or advantageous. The subject matter of appended claims is hereby specifically incorporated in this specification.
BRIEF DESCRIPTION OF FIGURES
The invention will be described in the following in greater detail by way of example only and with reference to the attached drawings of non-limiting embodiments of the invention, in which:
Figure 1 illustrates a perspective view of an embodiment of the invention comprising an image generating system.
Figure 2 illustrates a perspective view of an augmented reality environment in an embodiment of the invention.
Figure 3 illustrates a preferred embodiment for registering the path of entry or part thereof to the object.
DETAILED DESCRIPTION OF THE INVENTION
As used herein, the singular forms "a", "an", and "the" include both singular and plural referents unless the context clearly dictates otherwise. By way of example, "a path of entry" may mean one path of entry or more than one paths of entry.
The terms "comprising", "comprises" and "comprised of as used herein are synonymous with "including", "includes" or "containing", "contains", and are inclusive or open-ended and do not exclude additional, non-recited members, elements or method steps. The term also encompasses "consisting of" and "consisting essentially of". The recitation of numerical ranges by endpoints includes all numbers and fractions subsumed within the respective ranges, as well as the recited endpoints.
The term "about" as used herein when referring to a measurable value such as a parameter, an amount, a temporal duration, and the like, is meant to encompass variations of and from the specified value, in particular variations of +1-20% or less, preferably +/-10% or less, more preferably +1-5% or less, even more preferably +/-1 % or less, and still more preferably +/-0.1 % or less of and from the specified value, insofar such variations are appropriate to perform in the disclosed invention. It is to be understood that the value to which the modifier "about" refers is itself also specifically, and preferably, disclosed.
All documents cited in the present disclosure are hereby incorporated by reference in their entirety.
Unless otherwise specified, all terms used in disclosing the invention, including technical and scientific terms, have the meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. By means of further guidance, definitions of terms may be included to better appreciate the teaching of the present invention, and the terms used may be preferably construed in accordance with the respective definitions, unless a context dictates otherwise.
As used herein the terms "insertion procedure", "insertion task" or "insertion" may be used interchangeably to generally denote an action involving at least partly inserting (i.e., placing or introducing) a suitable tool, instrument or utensil (i.e., an "insertion tool") to an object. Particularly useful insertion procedures may be aimed at contacting a target area of an object by the insertion tool, typically by the distal end or distal portion of the insertion tool.
As intended herein, the insertion tool is configured to be manipulated by a user, preferably to be directly manipulated by the user. For example, the user may hold the insertion tool or may hold a suitable element in a mechanical connection (e.g. , directly or via one or more interposed mechanically connected elements) with the insertion tool.
Generally, a suitable insertion tool may comprise distally an insertion portion and proximally a handle or grip for grasping by a user. Optionally, the distal end of the insertion tool may comprise an end effector for performing a desired action in the object, more particularly at the target area. Such end effectors may for example be configured for any one or more of: injecting, infusing, delivering or placing of substances, compositions or items (e.g., a delivery needle, e.g., connected to a reservoir such as a syringe), extracting or removing material from the object or target area (e.g., biopsy needle), cutting or incising (e.g., knife, scissors), grasping or holding (e.g., tongs, pincers, forceps), screwing (e.g., screwdriver), dilating or probing, cannulating (e.g. cannula), draining or aspirating (e.g., a draining needle), suturing or ligating (e.g., a stitching needle), inspecting (e.g., camera), illuminating (e.g., a lighting element). An end effector may be suitably actuated by the user (e.g., by manipulating a corresponding actuator provided on the handle or on another relatively more proximal part of the insertion tool). Needles such as delivery needles or biopsy needles, endoscopic and laparoscopic instruments, or borescopes may be particularly preferred insertion tools, particularly for surgical insertion procedures.
The term "object" is used broadly herein and encompasses any tangible thing or item on which an insertion can be performed. Typical objects may be comparatively stable in their overall form and interior structure. For example but without limitation, such objects may appear or behave on the whole as solid or semi-solid, although they may comprise liquid and/or gaseous substances, components and/or compartments. Exemplary objects may include objects comprised mainly of organic matter or of inorganic matter. Without limitation, an object may be a living or non-living (deceased) human, animal, plant or fungal body or part thereof, such as a body portion, organ, tissue or cell grouping, that may be integral to or separated from said human, animal, plant or fungal body. Also without limitation, an object may be a mechanical, electromechanical or electronic apparatus, device or appliance or part thereof.
As used herein, "interior spatial structure" denotes in particular the spatial organisation or arrangement of the object or part thereof beneath its surface, i.e., in its interior (inside). Such interior spatial information may particularly comprise information on the position, orientation, shape, dimensions, and/or other properties such as for example composition, hardness, flexibility, etc., of and connections between relevant elements or components of the object, such as, e.g., organs, tissues, etc.
The term "target area" may generally denote any point or area (i.e., region, place, element) of an object, and particularly a point or area inside (i.e., within, in the interior of) an object, which is intended to be contacted by the insertion tool during an insertion procedure. By means of example and without limitation, in medical and surgical applications such target areas may comprise or reside in or in the vicinity of organs, tissues or cell groupings, which may be healthy, pathological or suspected of being pathological, such as for example neoplastic, cancerous or tumour tissues or cell groupings.
The term "path of entry" as used herein generally denotes an imaginary trajectory or route connecting an entry point or entry area on the surface of an object with the target area of the object. A path of entry as intended herein is preferably chosen optimally, i.e., to avoid all, most or comparatively as many as possible obstacles and/or vital or sensitive structures (for surgical procedures, e.g., blood vessels, lymphatic paths, nervous tissue, vital organs, bones, joints and/or tendons) that may be encountered when contacting the target area. As noted elsewhere in this specification, a path of entry may be a line (e.g., a straight line), or it may be defined as or represented by a suitable two- or three- dimensional object, such as preferably but without limitation a cylindrical or a funnel- shaped tube. Use of such more-dimensional definitions or representations of the path of entry allows a user of the present methods, systems and related aspects additional freedom in selecting an actual insertion approach to the target area, said actual insertion approach being encompassed within the bounds or limits of the path of entry, e.g., within the diameter of a path of entry defined as or represented by a three-dimensional cylindrical or funnel-shaped tube. By means of example, where the actual insertion approach is performed along a straight line, such more-dimensional definitions or representations of the path of entry may be deemed as comprised of and allowing for a plurality of potential straight-line insertion approaches encompassed within said path of entry.
Where the present specification states that a haptic device is "in communication with" an insertion tool, this generally denotes a relationship or arrangement of said elements, whereby the haptic device is able to impose haptic feedback on the insertion tool, to be perceived by a user manipulating the insertion tool. For example, the haptic device, particularly the effector end thereof, may be suitably connected (e.g., fixedly or releasably) to the insertion tool either directly or via one or more interposed mechanically connected elements. For example, the haptic device, particularly the effector end thereof, may comprise a holder, grip or clip adapted to receive the insertion tool, typically a comparatively proximal portion of the latter.
Herein, the terms "haptic feedback", "haptic guidance" or "haptic stimulus" are conceived broadly and encompass any one or more or all of:
- tactile feedback, i.e., one perceptible by the sense of touch, such as, e.g., vibration, thermal sensation, piercing or pressing sensation, etc.;
- kinesthetic feedback, i.e., forces provided in degree(s) of freedom of motion of the insertion tool, or in other words force and/or torque feedback.
A "haptic device" as intended herein generally denotes a device or apparatus configured to provide haptic feedback in function of appropriate commands. A haptic device may be suitably controlled by a computing system integral or external thereto, programmed to instruct the haptic device with said appropriate commands.
The modifier "visual" generally refers to anything perceptible by the sense of sight. Without limitation, the term "visual" may particularly encompass anything that a user can see in physical reality such as in a physical work space, any images displayed on a human readable display, as well as any images of physical objects and renditions of virtual elements comprised in virtual or augmented reality environments.
The terms "augmented reality" and "mixed reality" are used interchangeably herein and generally denote any view of the physical world, such as of a physical work space, modified, supplemented and/or enhanced by virtual computer-generated imagery. In augmented reality environment, virtual elements are rendered and superimposed onto a backdrop image of the physical world to generate composite mixed reality images. Preferably, augmented reality environment may be real-time and stereoscopic. Virtual elements may be rendered using conventional real-time 3D graphics software, such as, e.g., OpenGL, Direct3D, etc. A user may be immersed in augmented reality environment by means of an image generating system comprising a human readable display, such as a stereoscopic display, computer screen, head mounted display, etc.
Whereas the term "virtual" is generally understood to denote anything that exists or results in essence or effect though not in actual tangible form, a virtual element as used herein may commonly denote a computer-implemented simulation or representation of some imaginary (e.g., computed) or physical thing.
The terms "stereoscopy", "stereoscopic" and "3D imaging" may be interchangeably used to denote any techniques and systems capable of recording three-dimensional visual information and/or creating the illusion of depth in a displayed image.
The term "surrogate" carries its usual meaning, and may particularly denote a physical or virtual replacement or representation of an actual object or part thereof. For example, a physical surrogate may be a model, replica or dummy representing the object or part thereof, the interior structure and optionally also exterior appearance of which is closely modelled on that of the object. For example, a virtual surrogate of an object may be a virtual rendition of the object or part thereof (e.g., a rendition of the exterior and/or interior of the object or part thereof) in the augmented reality environment.
The term "computing system" may preferably refer to a computer, particularly a digital computer. Substantially any computer may be configured to a functional arrangement suitable for performing in the systems, methods and related aspects disclosed herein. The hardware architecture of a computer may, depending on the required operations, typically comprise hardware components including one or more processors (CPU), a random- access memory (RAM), a read-only memory (ROM), an internal or external data storage medium (e.g., hard disk drive), one or more video capture boards, one or more graphic boards, such components suitably interconnected via a bus inside the computer. The computer may further comprise suitable interfaces for communicating with general- purpose external components such as a monitor, keyboard, mouse, network, etc. and with external components such as video cameras, displays, manipulators, etc. For execution of processes needed for performing in the systems, methods and related aspects disclosed herein, suitable machine-executable instructions (program) may be stored on an internal or external data storage medium and loaded into the memory of the computer on operation.
In the following, the present methods, systems and related aspects are further explained with reference to illustrative albeit non-limiting embodiments thereof.
A data set comprising information on interior spatial structure of the object or part thereof may be suitably obtained by imaging the object using routine imaging or scanning techniques and apparatuses, such as inter alia magnetic resonance imaging (MRI), computed tomography (CT), X-ray imaging, ultrasonography, positron emission tomography (PET), etc. To allow for later registration through alignment of the information from such techniques and apparatuses with the actual object, object markers are employed as coordinate system reference points, as explained elsewhere in this specification. The object markers are also 'visible' by (i.e., detected or imaged by) the selected imaging or scanning technique. During the subsequent augmented reality session (see below), the object markers as imaged by the imaging or scanning technique may be virtually rendered in the augmented reality environment, and matched onto the image of the physical object markers. Such imaging or scanning techniques and apparatuses and markers are commonly used in medical practice to obtain data sets comprising information on the anatomy of a patient or part thereof, as well-documented in among others "Fundamentals of Medical Imaging" (Suetens P, 2nd ed., Cambridge University Press 2009, ISBN: 0521519152), "Medical Imaging Signals and Systems" (Prince JL & Links J, 1st ed., Prentice Hall 2005, ISBN: 0130653535), and "Digital Image Processing for Medical Applications" (Dougherty G, 1st ed., Cambridge University Press 2009, ISBN: 0521860857). For mechanical objects, technical drawings, blueprints or atlases may also be considered for providing information on interior spatial structure of such objects or part thereof.
A data set comprising information on interior spatial structure of the object or part thereof obtained using one of the aforementioned techniques and apparatuses may be suitably reconstructed by any reconstruction algorithm known per se, run on a suitable computing system, to generate an image of the interior spatial structure of the object or part thereof and to display said image on a human readable display device. Such an image may be provided as one or more 2-dimensional slice views through the object or its part, or as a volumetric, 3-dimensional image representation.
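As a trivial illustration of the slice-view mode of display, assuming the reconstructed data set is held as a NumPy array indexed (slice, row, column), three orthogonal 2-dimensional slice views may be extracted as follows; the axis order is an assumption for illustration.

```python
import numpy as np

def orthogonal_slices(volume, k, j, i):
    """Return three orthogonal 2D slice views (e.g. axial, coronal,
    sagittal for a patient scan) through a reconstructed volume stored
    as a (slice, row, column) array."""
    return volume[k, :, :], volume[:, j, :], volume[:, :, i]
```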
In an embodiment shown in Figure 1, a user 1 is immersed in an augmented reality environment generated by the image generating system 2 comprising and operated by the computer 3. The image generating system 2 is essentially as described in detail in WO 2009/127701 and comprises stereoscopically arranged right-eye camera 4 and left-eye camera 5 configured to capture a 3-dimensional image of the physical work space in front of the cameras, a computer 3 configured to render a 3-dimensional image of the virtual space using conventional real-time 3D graphics software such as OpenGL or Direct3D, further to superimpose the 3-dimensional image of the virtual space onto the captured 3-dimensional image of the physical work space outputted by the cameras 4, 5, and to output the resulting composite 3-dimensional image to stereoscopically arranged right-eye and left-eye displays facing the respective eyes of the user. The capture, processing and display are configured to proceed at a rate of at least 30 frames per second.
The image generating system 2 presents the user 1 with an augmented reality environment comprising displayed therein a 3-dimensional virtual rendition 6 of the interior structure of a portion of the abdominal cavity of a patient. The user 1 operates a physical manipulator 7. The position and orientation of the manipulator 7 in the physical work space, and thus in the augmented reality environment, can be determined from the image of the physical work space outputted by the cameras 4, 5 as described in detail in WO 2009/127701. In particular, the manipulator comprises a recognition member 8 having an appearance which is recognisable in the image captured by the cameras 4, 5 by an image recognition algorithm. Moreover, the recognition member 8 is configured such that its appearance in the image captured by the cameras 4, 5 is a function of its position and orientation relative to said cameras 4, 5 (e.g., in a coordinate system originating at the cameras 4 or 5). Hence, when said function is known (e.g., can be theoretically predicted or has been empirically determined), the position and orientation of the recognition member 8, and of the manipulator 7 comprising the same, relative to the cameras 4, 5 can be derived from the appearance of said recognition member 8 in an image captured by the cameras 4, 5. The position and orientation of the recognition member 8 and of the manipulator 7 relative to the cameras 4, 5 can then be readily transformed to their position and orientation in the physical work space and augmented reality space, using coordinate system transformation methods known per se. The recognition member 8 may comprise one or more suitable graphical elements, such as one or more distinctive graphical markers or patterns. Any image recognition algorithm or software having the requisite functions is suitable for use herein; exemplary algorithms are discussed inter alia in P.J. Besl and N.D. McKay, "A method for registration of 3-D shapes", IEEE Trans. Pattern Anal. Mach. Intell. 14(2):239-256, 1992. Alternatively, the position and orientation of the manipulator 7 may be determined by other means, such as by being connected to an effector end of a 6-degrees of freedom mechanical or electromechanical arm assembly capable of sensing and communicating its position and orientation, or by means of electromagnetic or ultrasonic transmitter-receiver devices communicating with the manipulator (e.g., as taught in US 2002/0075286 and US 2006/0256036). In Figure 1, a virtual cursor 9 in the form of a pointer is superposed onto the image of the manipulator 7 in the augmented reality environment.
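For a recognition member carrying a planar pattern of known geometry, one conventional way to recover the position and orientation from its appearance in a captured image is a standard perspective-n-point solver, such as OpenCV's solvePnP; all numeric values below are hypothetical placeholders, and this sketch is not the specific method of WO 2009/127701.

```python
import cv2
import numpy as np

# Known 3D corner layout of the recognition member in its own frame (metres)
# and the corresponding detected pixel positions in the captured image.
object_pts = np.array([[0.00, 0.00, 0], [0.04, 0.00, 0],
                       [0.04, 0.04, 0], [0.00, 0.04, 0]], dtype=np.float64)
image_pts = np.array([[310, 220], [380, 224],
                      [376, 292], [306, 288]], dtype=np.float64)
K = np.array([[800, 0, 320],                 # illustrative camera intrinsics
              [0, 800, 240],
              [0,   0,   1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)   # pose of the member in camera coordinates
```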
Using the manipulator 7 and the virtual cursor 9 superposed thereon, the user 1 pinpoints the desired target area in the 3-dimensional virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient. Giving a command, the user 1 stores to the computer 3 the set of coordinates defining the target area in an appropriate coordinate system. Using the manipulator 7 or giving one or more commands, the user may control various attributes of the target area, such as its shape, dimensions, appearance, etc. Preferably, the computer 3 generates a virtual rendition of said target area, which is displayed in the augmented reality environment using the image generating system 2. Further, using the manipulator 7 and the virtual cursor 9 superposed thereon, the user 1 pinpoints a potential entry point or entry area on the surface of the patient as represented in the 3-dimensional virtual rendition 6, and giving a command he stores to the computer 3 the set of coordinates defining said entry point or entry area in the appropriate coordinate system. Optionally, the computer 3 generates a virtual rendition of said potential entry point or entry area, which is displayed in the augmented reality environment using the image generating system 2. The computer 3 then computes the set of coordinates in the appropriate coordinate system corresponding to a potential path of entry connecting said target area with said potential entry point or entry area, and generates a virtual rendition of said potential path of entry, which is displayed in the augmented reality environment using the image generating system 2. Giving one or more commands, the user 1 can alter various attributes of the potential path of entry, such as its shape, dimensions (e.g., width or diameter), appearance, etc. If the user 1 approves of this potential path of entry (e.g., if it avoids obstacles and critical structures), he can give a command to store to the computer 3 the set of coordinates defining said path of entry and its attributes, to be used subsequently in planning, performing or training the insertion procedure. If the user 1 does not approve of the potential path of entry, the above process can be repeated by deleting the current potential path of entry and pinpointing a new, alternative potential entry point or entry area. Otherwise, the computer 3 may be programmed to simultaneously or sequentially propose and render one or more entry points or entry areas and the corresponding potential paths of entry, from which the user may choose by giving suitable commands. Commands can be given by means of appliances such as a keyboard, mouse, joystick, voice recognition, etc.
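Once the entry point or entry area and the target area are stored, the set of coordinates of a funnel-shaped potential path of entry may, for example, be sampled along the connecting axis as sketched below; the parameterisation and function name are illustrative choices rather than the invention's own representation.

```python
import numpy as np

def funnel_path(entry, target, r_entry, r_target, n=50):
    """Sample a funnel-shaped path of entry as centre points and radii
    along the entry -> target axis, usable both for rendering the virtual
    path and for later haptic boundary checks.

    entry, target: 3D points as NumPy arrays; r_entry > r_target gives
    the narrowing, frustoconical shape.
    """
    t = np.linspace(0.0, 1.0, n)[:, None]
    centres = (1 - t) * entry + t * target    # points on the central axis
    radii = (1 - t) * r_entry + t * r_target  # radius narrows toward target
    return centres, radii.ravel()
```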
To facilitate the inspection of a potential path of entry, the image generating system 2 may be adapted to allow the user to translate, rotate and/or change the dimensions of (e.g., zoom in on or out of) the 3-dimensional virtual rendition 6, by giving appropriate commands. Moreover, the image generating system 2 may be adapted to allow changing the attributes of the potential path of entry, such that the interior structures enclosed thereby become better visually perceptible or otherwise 'stand out'. For example, giving suitable commands, the user 1 may change the brightness, contrast, colour, etc. of the interior structures enclosed by the potential path of entry, or may even crop the 3-dimensional virtual rendition 6 such that only the path of entry and the structures enclosed thereby remain visible. This allows better inspection of the potential path of entry, for example to ensure that it does not collide with unwanted obstacles and/or structures.
The set of coordinates and attributes defining the selected path of entry is registered to the object by matching the respective coordinate systems using conventional object markers as explained elsewhere in this specification. An augmented reality environment comprising displayed therein the object and a virtual rendition of the registered path of entry is generated using an image generating system substantially identical to that employed in Figure 1 and explained above.
A schematic view of such a stereoscopic (3-dimensional) augmented reality environment is shown in Figure 2. This comprises the image of the physical patient 18 on whom the insertion procedure is to be performed and, superimposed thereon, the virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient; the virtual rendition of the path of entry 10 having a substantially frustoconical shape, extending and narrowing down from the entry area 11 (where the path of entry 10 intersects with the surface of the patient) towards the target area 12, as well as protruding away from the entry area 11, i.e., out of the patient's body, and avoiding obstacles such as the blood vessels 16; the image of the insertion tool 13, herein a biopsy needle; and the image of a portion of an articulated arm 14 of a 6-degrees of freedom haptic device 17 (e.g., Phantom® Desktop™ from SensAble Technologies, Inc.) in communication with the insertion tool 13. The insertion tool 13 is secured to the tool holder portion 15 of said arm 14 of the haptic device. The user suitably grasps said holder portion 15 to manipulate the insertion tool.
The set of coordinates and attributes defining the selected path of entry 10 are provided to the 6-degrees of freedom haptic device 17 together with commands to restrain (i.e., restrict, confine) the movements of the insertion tool 13 along the defined path of entry 10. The position and orientation of the tool holder portion 15 of the haptic device 17, and thus of the insertion tool 13, relative to the coordinate system of the haptic device is readily available by querying the sensory information from the articulated arm 14 of the haptic device 17. The position and orientation of the haptic device relative to the coordinate system of the image generating system 2 (said coordinate system generally having its origin at one of the cameras 4, 5 and being used as a basis coordinate system for the augmented reality environment) can be readily determined through the use of a calibration marker placed at a non-moving part (e.g., a base) of the haptic device 17. This allows the position and orientation of the insertion tool 13 to be transformed from the coordinate system of the haptic device 17 to the coordinate system of the augmented reality environment and vice versa. Transformations between various coordinate systems are a ubiquitous feature of virtual and augmented environment renderings, and are generally understood by the skilled person without requiring detailed account herein. The movement restraints may be expressed as a data set comprising a collection of allowed vs. forbidden positions and orientations of the insertion tool 13, effectively defining a haptic path of entry having boundaries that match the boundaries of the path of entry 10. A standard collision detection algorithm is employed to check for imminent and/or occurring collisions between the actual position and orientation of the insertion tool 13 and the boundaries of the haptic path of entry, and the imminent and/or occurring collisions are signalled to the user through the haptic device 17. The haptic restraints are set to prevent the movement of the insertion tool 13 beyond the boundaries of the path of entry 10, i.e., to form a rigid virtual border by means of collision detection, and optionally to apply an increasing opposing force and/or torque when the insertion tool 13 starts approaching said boundaries (e.g., when it comes within a certain preset distance from the boundary).
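A minimal sketch of such a collision check against a frustoconical haptic path of entry might classify the tool-tip position as inside, imminently colliding (within a preset distance of the boundary, where an increasing opposing force may be applied) or colliding; a real system would additionally test the tool shaft and its orientation, and the warning distance below is an arbitrary assumption.

```python
import numpy as np

def boundary_state(tool_tip, entry, target, r_entry, r_target, warn=0.005):
    """Classify the tool tip against a frustoconical path boundary.

    Returns 'inside', 'imminent' (within 'warn' metres of the boundary)
    or 'collision'.  All geometry is expressed in the common coordinate
    system of the augmented reality environment.
    """
    axis = target - entry
    length = np.linalg.norm(axis)
    axis = axis / length
    v = tool_tip - entry
    s = np.clip(np.dot(v, axis) / length, 0.0, 1.0)  # fractional depth
    allowed = (1 - s) * r_entry + s * r_target       # radius at that depth
    lateral = np.linalg.norm(v - np.dot(v, axis) * axis)
    if lateral >= allowed:
        return "collision"
    return "imminent" if allowed - lateral < warn else "inside"
```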
Figure 3 further illustrates a preferred embodiment for registering the path of entry or part thereof to the object. A data set comprising information on the interior spatial structure of the object is obtained from a scanning apparatus. Three markers are arranged to form an area shaped as a substantially scalene triangle. The object markers are chosen such that they are easily distinguishable from the surroundings. A filter is applied to identify the markers in the data set comprising information on the interior spatial structure of the object. The position and orientation of the object markers are defined in a coordinate system related to the scanning apparatus. Two cameras (camera 4 and camera 5) form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Each stereoscopic pair provides 3D coordinates for the object markers in a coordinate system related to the camera system. A transformation is subsequently obtained between the coordinate system related to the camera system and the coordinate system related to the scanning apparatus. Based on this transformation, the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the scanning apparatus into the coordinate system related to the camera system. The camera-related coordinate system thus serves as the common coordinate system for registering the path of entry or part thereof to the object. In this common coordinate system, the stereoscopic pairs of the physical work space are combined with the data set of the virtual space, thereby generating an image in augmented reality space.
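By way of illustration, the transformation between the two coordinate systems can be estimated from the corresponding marker positions by a closed-form least-squares rigid registration (cf. the Besl and McKay reference among the non-patent citations below). The following Python sketch assumes this particular (Kabsch/SVD) technique and hypothetical names; it is one suitable approach, not a statement of the only possible implementation. The substantially scalene arrangement of the three markers ensures that the point correspondences are unambiguous.

```python
import numpy as np

def rigid_transform(markers_scan, markers_cam):
    """Closed-form least-squares rigid transform (Kabsch/SVD) mapping the
    marker positions defined in the scanner coordinate system onto the
    same markers as triangulated by the stereo camera pair. Both inputs
    are (N, 3) arrays of corresponding points, N >= 3 and non-collinear,
    e.g. the three markers arranged as a scalene triangle."""
    c_scan = markers_scan.mean(axis=0)
    c_cam = markers_cam.mean(axis=0)
    H = (markers_scan - c_scan).T @ (markers_cam - c_cam)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_cam - R @ c_scan
    T = np.eye(4)                                   # pack as 4x4 homogeneous form
    T[:3, :3], T[:3, 3] = R, t
    return T

# sanity check: three scan-space markers and their camera-space images
scan = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 30.0, 10.0]])
cam = scan @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]]).T + [5.0, 2.0, 1.0]
T = rigid_transform(scan, cam)
assert np.allclose((T @ np.append(scan[0], 1.0))[:3], cam[0])
```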
The present systems may commonly comprise managing means for managing information about the position, orientation and status of objects in the physical work space and about the position, orientation and status of virtual visual and haptic objects in the virtual space. The managing means may receive, calculate, store and update said information, and may communicate said information to other components of the system so as to allow for generating the images of the physical work space and the virtual space, as well as the composite images combining these to provide the augmented reality environments, and to allow for setting and tracking the requisite haptic constraints. To allow real-time operation of the system, the managing means may be configured to receive, process and output data and information in a streaming fashion.
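Purely as an illustration of such managing means, the following is a minimal Python sketch of a thread-safe, streaming object store; all class and field names are hypothetical and not part of the disclosure.

```python
import threading
import time
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple = (0.0, 0.0, 0.0)
    orientation: tuple = (0.0, 0.0, 0.0, 1.0)    # unit quaternion (x, y, z, w)

class SceneManager:
    """Hypothetical managing means: a thread-safe store of the pose and
    status of each physical and virtual object, updated in a streaming
    fashion by the tracking and haptic threads and queried by the image
    generating system."""
    def __init__(self):
        self._lock = threading.Lock()
        self._objects = {}                        # name -> (Pose, status, time)

    def update(self, name, pose, status="ok"):
        with self._lock:
            self._objects[name] = (pose, status, time.monotonic())

    def query(self, name):
        with self._lock:
            return self._objects.get(name)

# streaming usage: tracking and haptic threads push updates continuously
manager = SceneManager()
manager.update("insertion_tool", Pose(position=(10.0, 2.0, 35.0)))
manager.update("path_of_entry", Pose(), status="registered")
print(manager.query("insertion_tool"))
```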
The processes involved in the operation of the present systems may be advantageously executed by a data processing (computing) apparatus, such as one or more computers. Said computers may perform the functions of managing means of the systems.
The object of the present invention may also be achieved by supplying a system or an apparatus with a storage medium which stores program code of software that realises the functions of the above-described embodiments, and causing a computer (or CPU or MPU) of the system or apparatus to read out and execute the program code stored in the storage medium.
In this case, the program code itself read out from the storage medium realises the functions of the embodiments described above, so that both the storage medium storing the program code and the program code per se constitute the present invention.
The storage medium for supplying the program code may be selected, for example, from a floppy disk, hard disk, optical disk, magneto-optical disk, CD-ROM, CD-R, magnetic tape, non-volatile memory card, ROM, DVD-ROM, Blu-ray disc, solid state disk, and network attached storage (NAS).
It is to be understood that the functions of the embodiments described above can be realised not only by executing a program code read out by a computer, but also by causing an operating system (OS) that operates on the computer to perform a part or the whole of the actual operations according to instructions of the program code.
Furthermore, the program code read out from the storage medium may be written into a memory provided in an expanded board inserted in the computer, or an expanded unit connected to the computer, and a CPU or the like provided in the expanded board or expanded unit may actually perform a part or all of the operations according to the instructions of the program code, so as to accomplish the functions of the embodiments described above.
It is apparent that there have been provided in accordance with the invention, methods for planning, training or performing insertion procedures, as well as systems, computer programmes and computer programme products configured for use in such insertion procedures, that provide for substantial advantages as set forth above. While the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art in light of the foregoing description. Accordingly, it is intended to embrace all such alternatives, modifications, and variations as fall within the spirit and broad scope of the appended claims.

Claims

1. A method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
(a) registering the path of entry or part thereof to the object;
(b) configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof as registered in (a);
(c) providing an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof as registered in (a).
2. The method according to claim 1 further comprising the step (d) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
3. The method according to claim 1, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (d) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
4. The method according to any one of claims 1 to 3 further comprising the step of defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
5. The method according to any one of claims 1 to 3 further comprising the steps:
- obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- optionally, generating from said data set an image of the interior spatial structure of the object or part thereof and displaying said image; and
- defining in said data set or image the target area and the path of entry to said target area.
6. A system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
(a) the insertion tool configured to be manipulated by a user;
(b) a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof; and
(c) an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
7. A system according to claim 6, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof, whereby the system is for training the insertion procedure.
8. The system according to any one of claims 6 or 7 further comprising means for registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
9. The system according to any one of claims 6 to 8 further comprising any one, more or all of:
- means for obtaining a data set comprising information on interior spatial structure of the object or part thereof;
- means for generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof and preferably means for displaying said image;
- means for defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
10. A computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
- configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof; and
- configuring an image generating system to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
11. The computer programme, or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination of such computer programmes or computer programme products, according to claim 10, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof, whereby the computer programme or computer programme product or combination thereof is for training the insertion procedure.
12. The computer programme or computer programme product or combination thereof according to any one of claims 10 or 11 further capable of or comprising a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
13. The computer programme or computer programme product or combination thereof according to any one of claims 10 to 12 further capable of or comprising a computer programme or computer programme product capable of any one, more or all of:
- generating from a data set comprising information on interior spatial structure of the object or part thereof an image of the interior spatial structure of the object or part thereof;
- defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
14. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the insertion procedure is surgical or non-surgical.
15. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the object is non-transparent.
16. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the path of entry is a straight line or a cylindrical or funnel-shaped tube.
17. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the registration of the path of entry to the object is dynamic or real-time.
18. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the appearance of the virtual rendition of the path of entry is configured to vary in the course of the insertion procedure, particularly as a function of compliance with, or an imminent or occurred deviation from, the path of entry, or as a function of the activation of haptic guidance and optionally the magnitude of the so-applied haptic guidance.
19. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the augmented reality environment comprises displayed therein also a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
20. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the augmented reality environment comprises displayed therein also a virtual rendition of the interior spatial structure of the object or part thereof generated from the data set comprising information on said interior spatial structure of the object or part thereof and superimposed on the image of the object or the physical or virtual surrogate thereof.
21. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, wherein the haptic device is a 6-degrees of freedom haptic device.
22. The method according to any one of claims 1 to 5 or the system according to any one of claims 6 to 9 or the computer programme or computer programme product or combination thereof according to any one of claims 10 to 13, which allow for a stereoscopic view of the augmented reality environment and/or for real-time view of the augmented reality environment.
PCT/EP2012/051469 2011-01-28 2012-01-30 Insertion procedures in augmented reality WO2012101286A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11152597.8 2011-01-28
EP11152597 2011-01-28

Publications (1)

Publication Number Publication Date
WO2012101286A1 true WO2012101286A1 (en) 2012-08-02

Family

ID=45688444

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/051469 WO2012101286A1 (en) 2011-01-28 2012-01-30 Insertion procedures in augmented reality

Country Status (1)

Country Link
WO (1) WO2012101286A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1095628A2 (en) 1999-10-29 2001-05-02 Marconi Medical Systems, Inc. Planning minimally invasive procedures for in-vivo placement of objects
US20020075286A1 (en) 2000-11-17 2002-06-20 Hiroki Yonezawa Image generating system and method and storage medium
US20090000626A1 (en) * 2002-03-06 2009-01-01 Mako Surgical Corp. Haptic guidance system and method
US20060256036A1 (en) 2005-05-11 2006-11-16 Yasuo Katano Image processing method and image processing apparatus
WO2007136771A2 (en) 2006-05-19 2007-11-29 Mako Surgical Corp. Method and apparatus for controlling a haptic device
US20100137880A1 (en) * 2007-06-19 2010-06-03 Medtech S.A. Multi-application robotized platform for neurosurgery and resetting method
WO2009127701A1 (en) 2008-04-16 2009-10-22 Virtual Proteins B.V. Interactive virtual reality image generating system
EP2277441A1 (en) * 2009-07-22 2011-01-26 Surgica Robotica S.p.A. Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
"Digital Image Processing for Medical Applications", 2009, CAMBRIDGE UNIVERSITY PRESS
"Fundamentals of Medical Imaging", 2009, CAMBRIDGE UNIVERSITY PRESS
"Haptic Rendering: Foundations, Algorithms and Applications", 2008, A K PETERS
"Human Haptic Perception: Basics and Applications", 2008, BIRKHAUSER BASEL
"Medical Imaging Signals and Systems", 2005, PRENTICE HALL
GIRLING: "Stereoscopic Drawing: A Theory of 3-D Vision and its application to Stereoscopic Drawing", 1990, REEL THREE-D ENTERPRISES
JUDGE: "Stereoscopic Photography", 2008, GHOSE PRESS
MCLAUGHLIN ET AL.: "Touch in Virtual Environments: Haptics and the Design of Interactive Systems", 2001, PEARSON EDUCATION
MILELLA, A.; SIEGWART, R.: "Stereo-Based Ego-Motion Estimation Using Pixel Tracking and Iterative Closest Point", IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION SYSTEMS 2006, January 2006 (2006-01-01), pages 21, XP010899374, DOI: doi:10.1109/ICVS.2006.56
PJ BESL; ND MCKAY: "A method for registration of 3-d shapes", IEEE TRANS. PATTERN ANAL. MACH. INTELL., vol. 14, no. 2, 1992, pages 239 - 256, XP001013705, DOI: doi:10.1109/34.121791

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11403964B2 (en) 2012-10-30 2022-08-02 Truinject Corp. System for cosmetic and therapeutic training
US8764449B2 (en) 2012-10-30 2014-07-01 Truinject Medical Corp. System for cosmetic and therapeutic training
US8961189B2 (en) 2012-10-30 2015-02-24 Truinject Medical Corp. System for cosmetic and therapeutic training
US11854426B2 (en) 2012-10-30 2023-12-26 Truinject Corp. System for cosmetic and therapeutic training
US9443446B2 (en) 2012-10-30 2016-09-13 Truinject Medical Corp. System for cosmetic and therapeutic training
US10902746B2 (en) 2012-10-30 2021-01-26 Truinject Corp. System for cosmetic and therapeutic training
US9792836B2 (en) 2012-10-30 2017-10-17 Truinject Corp. Injection training apparatus using 3D position sensor
US10643497B2 (en) 2012-10-30 2020-05-05 Truinject Corp. System for cosmetic and therapeutic training
WO2014100697A1 (en) * 2012-12-21 2014-06-26 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US11278296B2 (en) 2012-12-21 2022-03-22 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
CN104918573A (en) * 2012-12-21 2015-09-16 玛口外科股份有限公司 Systems and methods for haptic control of a surgical tool
US11857200B2 (en) 2012-12-21 2024-01-02 Mako Surgical Corp. Automated alignment of a surgical tool
US11259816B2 (en) 2012-12-21 2022-03-01 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US10398449B2 (en) 2012-12-21 2019-09-03 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US11857201B2 (en) 2012-12-21 2024-01-02 Mako Surgical Corp. Surgical system with automated alignment
US10595880B2 (en) 2012-12-21 2020-03-24 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
US10896627B2 (en) 2014-01-17 2021-01-19 Truinject Corp. Injection site training system
US10290232B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US11750794B2 (en) 2015-03-24 2023-09-05 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
WO2017098506A1 (en) * 2015-12-07 2017-06-15 M.S.T. Medical Surgery Technologies Ltd. Autonomic goals-based training and assessment system for laparoscopic surgery
US10743942B2 (en) 2016-02-29 2020-08-18 Truinject Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
US11730543B2 (en) 2016-03-02 2023-08-22 Truinject Corp. Sensory enhanced environments for injection aid and social training
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
US11710424B2 (en) 2017-01-23 2023-07-25 Truinject Corp. Syringe dose and position measuring apparatus
CN109330668A (en) * 2018-09-08 2019-02-15 潍坊学院 A kind of Internal Medicine-Oncology drug interventional therapy device
CN109330668B (en) * 2018-09-08 2020-06-12 山东第一医科大学附属肿瘤医院(山东省肿瘤防治研究院、山东省肿瘤医院) Medical oncology medicine intervenes treatment device
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
WO2020153411A1 (en) * 2019-01-23 2020-07-30 Sony Corporation Medical arm system, control device, control method, and program
JP7400494B2 (en) 2019-01-23 2023-12-19 ソニーグループ株式会社 Medical arm system, control device, control method, and program
CN113301866A (en) * 2019-01-23 2021-08-24 索尼集团公司 Medical arm system, control device, control method, and program
CN111975765A (en) * 2019-05-24 2020-11-24 京瓷办公信息系统株式会社 Electronic device, robot system, and virtual area setting method
CN111975765B (en) * 2019-05-24 2023-05-23 京瓷办公信息系统株式会社 Electronic device, robot system, and virtual area setting method
US11801115B2 (en) 2019-12-22 2023-10-31 Augmedics Ltd. Mirroring in image guided surgery
US11776687B2 (en) 2020-11-10 2023-10-03 Sony Group Corporation Medical examination of human body using haptics
WO2022101734A1 (en) * 2020-11-10 2022-05-19 Sony Corporation Of America Medical examination of human body using haptics
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter

Similar Documents

Publication Publication Date Title
WO2012101286A1 (en) Insertion procedures in augmented reality
US11931117B2 (en) Surgical guidance intersection display
US11484365B2 (en) Medical image guidance
US11259879B2 (en) Selective transparency to assist medical device navigation
US20230384734A1 (en) Method and system for displaying holographic images within a real object
US20220296208A1 (en) Loupe display
US20230233264A1 (en) Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
JP5417609B2 (en) Medical diagnostic imaging equipment
CN103971574A (en) Ultrasonic guidance tumor puncture training simulation system
EP3009096A1 (en) Method and system for displaying the position and orientation of a linear instrument navigated with respect to a 3D medical image
CN109273091A (en) A kind of percutaneous nephrolithy based on data in art takes stone system of virtual operation
Traub et al. Advanced display and visualization concepts for image guided surgery
US11771508B2 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
EP2954846B1 (en) Swipe to see through ultrasound imaging for intraoperative applications
CA3149196C (en) Method and system for generating a simulated medical image
US11109930B2 (en) Enhanced haptic feedback system
JP2011131020A (en) Trocar port positioning simulation method and device therefor
CN114730628A (en) Image capture vision for augmented reality
WO2021146313A1 (en) Systems and methods for providing surgical assistance based on operational context
Vogt et al. Augmented reality system for MR-guided interventions: Phantom studies and first animal test
JP7355514B2 (en) Medical image processing device, medical image processing method, and medical image processing program
US20220414914A1 (en) Systems and methods for determining a volume of resected tissue during a surgical procedure
Martins et al. Input system interface for image-guided surgery based on augmented reality
Makhlouf et al. Biomechanical Modeling and Pre-Operative Projection of A Human Organ using an Augmented Reality Technique During Open Hepatic Surgery
Hoßbach et al. Simplified stereo-optical ultrasound plane calibration

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12704727

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12704727

Country of ref document: EP

Kind code of ref document: A1