Endoscopic Procedure Simulation
This invention relates generally to endoscopic procedure simulation and, more particularly, to an endoscopic procedure simulation system which enables an operative to realistically perform an endoscopic procedure in a virtual reality environment.
Airway management is an anaesthetist's crucial specialist skill. It is vital because, when a patient is anaesthetised for surgery or is otherwise unable to breathe for themselves, failure to ensure that the patient's airway is patent for ventilation means that oxygen cannot be delivered, which can result in brain damage or even death.
In order to assist an anaesthetist in achieving proper airway management, a large number of airway devices have been developed over the years, and some of these devices are aimed at a specific group of patients, whose airways may be especially difficult to manage using standard techniques and devices. The reasons for this can range from anatomical variations to a pathological obstruction, such as a tumour.
One such device, the fibreoptic bronchoscope, is now generally accepted as the device of choice by anaesthetists. It has a handset that an anaesthetist can manipulate and a flexible stem that contains fibreoptic cables. There are two fibreoptic bundles, one for transmitting illumination from the light source to the tip of the fibreoptic bronchoscope and the other for transmitting the image as seen from the tip of the fibreoptic bronchoscope to the anaesthetist. Cables bend the tip in response to operator movements at the handset, and this is known as tip angulation or tip flexion. Thus, there are three potential movements of the bronchoscope which can be effected as required by an operative in order to negotiate a patient's airway: translational movement, rotational movement and flexion or angulation, and because the operative can visualise the patient's airway and see any obstructions, etc., it is possible for the bronchoscope to be manipulated around them.
The fibreoptic bronchoscope is designed to work with a fairly standard airway management device, known as the endotracheal tube. Such a tube is designed to
enter the airway and rest in the trachea, which is the final common pathway to both lungs. An inflatable cuff, located before the end of the tube, provides a seal against the contents of the pharynx, and upon connection to an external source, oxygen can be delivered from the source directly to the lungs. Thus, the fibreoptic bronchoscope is first inserted into the airway and negotiated through the airway to the trachea. Then an endotracheal tube is passed over the stem into the trachea, in a process commonly known as "railroading". Finally, the fibreoptic bronchoscope is removed and the endotracheal tube is connected to an external oxygen source.
As with all clinical skills, effective use of the fibreoptic bronchoscope requires experience. This can be partly achieved by practice on mannequins and live patients (where appropriate), but the difficult airway patient group, for which the fibreoptic bronchoscope was designed, comprises only 0.1 - 0.3% of the population and patients falling into this category are therefore relatively infrequently encountered. The necessary experience is therefore difficult to obtain through normal channels, and training reinforcement by repetition cannot occur.
The rapid progress of computer technology has raised the possibility of bridging the gap between standard models, such as mannequins and non-difficult airway patients, and the difficult airway patient. For example, the Human Patient Simulator is known which is a sophisticated, computer-driven mannequin, in which servo motors allow scenario-dependent changes to the airway anatomy. While effective, this type of device can be prohibitively costly, and may also require dedicated equipment.
US Patent No. 4,907,973 describes an expert system simulator for modelling realistic internal environments, and the simulator may be used to simulate an endoscopic procedure, whereby a mock endoscope is inserted and manipulated within a model. The model includes a mock bodily region of interest and a plurality of sensors to detect the position of the endoscope. A computer receives signals from the sensors, and retrieves data from memory in accordance with the signals received, which data is representative of the view observed from the measured endoscope position during a real operation. The data is subsequently shown on a video display, whereby the displayed image is adjusted based on real movement of the endoscope within the model.
However, because this type of system employs a physical model representative of a specified bodily region, use of such a system is limited to training in procedures relating to that particular bodily region or the paths defined by the model. Further, repeated use of the physical models degrades the realism of the simulation and reduces the benefits of simulation training because any model is unlikely to take into account the complex anatomy, and variations thereof, in real subjects, and only permits the physician to become used to that particular model.
US Patent Application Publication No. US-2004/076940-A describes a virtual reality simulation system in which the model simply comprises a "box" having a base and opposing supports, with a carriage assembly disposed between the supports, possibly on guide rods or the like which extend between the supports. A pulley is provided at each support and a belt is provided over each pulley such that it extends between the supports. The carriage assembly is arranged to receive an end of the endoscopic device extended into the model through a simulated bodily orifice, such that further insertion of the device into the box will cause the carriage assembly to move toward the end support. Conversely, any force applied to remove the device from the box, while the device is engaged with the carriage assembly, will result in the carriage assembly moving in the direction of such force, toward the opposite support. Marks are provided longitudinally along the length of the box and an encoder is provided on the carriage assembly, which encoder senses the above-mentioned marks, thereby measuring translational movement of the carriage assembly and hence the endoscopic device held therein. Similarly, rotational movement of the endoscopic device can be measured by means of an encoder sensor which senses movement of an encoder disc (having marks thereon) which is coupled to the distal end of the endoscopic device via a collet. Translational and rotational movement measured by the above-mentioned sensors is transmitted to a computer system to enable such motion to be reflected during simulation in respect of a virtual reality model of a bodily region of interest displayed on a screen.
However, one of the major drawbacks of the latter system is that the use of electro-mechanical encoders and sensors, and the additional hardware required to support such components, results in a relatively costly system, the cost often making it
prohibitive to many medical institutions, particularly if it is only being considered for use in training physicians to deal with management of difficult airways.
We have now devised an improved arrangement, and in accordance with the present invention, there is provided a system for simulating performance of an endoscopic procedure within an anatomical region represented by a model comprising a path through which an endoscope can move under control of an operative and one or more image capture devices for capturing images of said endoscope, the system comprising means for receiving images captured in respect of said endoscope as it is moved within said model, means for tracking said movement, and means for displaying a simulated image representative of said movement.
In one exemplary embodiment of the invention, the system may be arranged to track movement of the endoscope by determining its three-dimensional position relative to a region of interest within one or more image frames captured by said one or more image capture devices.
In a preferred embodiment, the system is arranged to display said simulated image representative of said movement relative to an image of said anatomical region.
The system is beneficially arranged to measure inserted endoscopic depth, endoscope tip rotation and/or endoscope tip angulation.
The system is beneficially arranged to assess the performance of an endoscopic procedure in terms of time taken and/or simulated contact of the tip of said endoscope with the walls of said anatomical region.
Also in accordance with the present invention, there is provided a model for use with a system as defined above, the model being composed of abstract obstacles or representative of an anatomical region in respect of which an endoscopic procedure is to be simulated, the model comprising a carriage assembly movably mounted relative to a support for translational movement relative thereto, means for affixing the tip of an endoscope to said carriage assembly for movement therewith, means for mounting an image capture device relative to said tip of said endoscope, and means for feeding
data representative of images captured by said image capture device to the simulation system defined above.
In a preferred embodiment, one or more image capture devices are mounted or otherwise provided on a support mounted to the carriage assembly for movement therewith. In a preferred embodiment, at least one image capture device is mounted in a cradle which is fixed to, and suspended below, the carriage assembly. The carriage assembly beneficially comprises a collar, rotatably mounted thereon, said collar and said tip of said endoscope being coupled together such that rotation of said tip of said endoscope causes corresponding rotation of said collar.
The collar may be provided with tags, whereby the angle between said tags in an image frame captured by said image capture device is indicative of the rotational position of said tip of said endoscope. A tag may be provided relative to the translational movement path of the carriage assembly and endoscope tip, which tag is of varying width along its length, whereby the width of the tag in an image frame captured by said image capture device is indicative of the endoscope insertion depth.
Means may be provided for transmitting a light beam down the endoscope for measurement of endoscope tip angulation, whereby the location and distance of the light beam relative to a reference point, and/or the size of the light beam relative to a reference size, is indicative of the tip angulation.
The present invention extends to a simulation system comprising a system and model as defined above.
It will be appreciated that the term "endoscope" used throughout this specification is intended to encompass both a real endoscope and a mock endoscope device.
These and other aspects of the present invention will be apparent from, and elucidated with reference to, the embodiment described herein.
An embodiment of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which:
Figure 1 is a schematic line drawing of a model for use in a simulation system according to an exemplary embodiment of the present invention.
Thus, it is an object of the invention to provide a simulation system for use in training an operative to use a flexible endoscope, such as a fibreoptic bronchoscope, which simulation system is effective, adaptable and relatively inexpensive.
As explained above, there are three basic movements involved in endoscopy, namely inserted scope depth, endoscope stem rotation and endoscope tip angulation. Thus, it is required of any simulation system to be able to measure at least one, and more preferably all, of these movements so as to simulate endoscope movements during a procedure. However, during a real endoscopic procedure, the endoscope is passed into an anatomical space and, once advanced, the tip and the inserted length of the distal endoscopic stem disappear from view.
Thus, a first element of the simulation system according to an exemplary embodiment of the present invention, comprises a "box" for mimicking the anatomical space into which an endoscope is to be passed and manipulated. Referring to Figure 1 of the drawings, the box 10 comprises a base 12 having opposing end plates 14 extending substantially perpendicular thereto. One of the end plates 14 may have support legs 16, or the like, for supporting the box 10 when it is in the illustrated upright position, although it will be appreciated that the box can be manipulated and used in a number of different orientations, as required by the application to be simulated. The opposite end plate 14 is provided with an orifice 18, through which the tip of the flexible tube 20 of the endoscope can be passed into the space defined by the box interior. A pair of parallel guide rails 22 extend between the two end plates 14, and a carriage assembly 24 is disposed between the two guide rails. Means 26 are provided on either side of the carriage assembly 24 to slidably mount the carriage assembly 24 on the guide rails.
Pulleys 28, 30 are provided at each end of the base 12, adjacent respective end plates 14, with a belt 32 therebetween. The carriage assembly 24 is coupled to the belt 32 (in this case via the sliding means 26) such that movement of the carriage assembly up and down along the length of the guide rails 22 causes the belt 32 to traverse and rotate the pulleys 28, 30. The carriage assembly 24 comprises a rotating collar 34 having a central opening 36 through which the tip 20 of the endoscope tube extends, and the tube of the endoscope is affixed to the collar 34 adjacent the tip 20, such that rotational movement of the tube causes corresponding rotational movement of the collar 34.
A cradle 38 is suspended below the carriage assembly 24, in which is supported an upwardly directed video camera or webcam 40. The cradle 38 is suspended in such a way that movement of the carriage assembly 24 causes corresponding movement of the cradle 38 (and camera 40). This has a significant advantage over the use of one or more fixed cameras, as a constant field of view of the camera relative to the tip 20 of the endoscope tube is maintained, thereby ensuring optimum images for processing.
Images captured by the camera 40 are fed to a computer system (not shown) for processing so as to measure the movement and display a simulated view of such movement on a screen. A graphic model of an anatomical body region may also be displayed on the screen, with movement of the endoscope through the displayed body region also being shown. Assessment of an operative's performance during a simulated procedure may be performed automatically, for example, according to time taken to complete the procedure and/or the number of times the endoscope tip is determined to "hit" the sides of the simulated anatomical body region being negotiated.
As stated above, movement of the endoscope tip 20 within the space defined by the box 10 can take three specific forms: inserted endoscope depth, endoscope stem rotation and endoscope tip angulation.
Taking each of these in turn, in accordance with this exemplary embodiment of the present invention, a triangular (preferably white) tag 42 is provided along the length of the inner wall of the base 12, the width of the tag 42 increasing from the top to the bottom or vice versa. The computer simulation system is programmed with calibration information such that, when images of the endoscope tip 20 are received, its relative position within the space defined by the box 10 can be determined by the
width of the portion of the tag 42 visible in the image. Equally, translational movement up and down within the space will be identified by the changing width of the tag 42 within the corresponding images.
Typical pseudo code used by the simulation system for measuring stem depth from received images may be as follows:
• go through all pixels of the wall region of interest within a video frame;
• if the blue, green and red components of a pixel are all above the defined threshold, consider it a white pixel and increment the pixel count by one (which pixel count is indicative of the width of the portion of the "tag" 42 visible within the video frame); and
• the resulting pixel count is directly related to the visible tag width, and hence to the inserted stem depth.
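The pseudo code above may be sketched in Python as follows. This is a minimal illustrative sketch, not taken verbatim from the specification: frames are assumed to be nested lists of (blue, green, red) tuples, and the threshold and calibration values are illustrative assumptions that would in practice be set during calibration of the box.

```python
# Illustrative sketch of the stem-depth measurement.
# Assumptions: frame is a list of rows of (B, G, R) tuples; the
# per-channel "white" threshold and calibration scale are placeholders.

WHITE_THRESHOLD = 200  # assumed per-channel cutoff for a "white" pixel

def measure_tag_width(frame, wall_roi):
    """Count white pixels of the triangular tag 42 within the wall
    region of interest; the count indicates the visible tag width."""
    x0, y0, x1, y1 = wall_roi
    white_count = 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            b, g, r = frame[y][x]
            # a pixel is "white" if all three components exceed the threshold
            if b > WHITE_THRESHOLD and g > WHITE_THRESHOLD and r > WHITE_THRESHOLD:
                white_count += 1
    return white_count

def width_to_depth(white_count, calibration_scale):
    """Map the visible tag width to inserted stem depth via a
    calibration factor determined empirically for the box."""
    return white_count * calibration_scale
```

In use, the simulation system would call `measure_tag_width` on each received frame and convert the result to a depth via the calibration factor; the names and the linear depth mapping are assumptions for the purpose of illustration.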
As stated above, in an exemplary embodiment of the present invention, the carriage assembly 24 comprises a collar 34 to which the endoscope tip 20 is affixed such that rotation of the tip causes corresponding rotational movement of the collar. In order to facilitate measurement of such rotational movement by the simulation system receiving images of the endoscope tip 20 captured by the camera 40, three different coloured tags (not shown), say blue, green and red, are provided on the lower plane surface of the collar 34, at substantially equidistant intervals around the central opening 36. As the collar 34 rotates (due to rotation of the endoscope tip 20), the angle between the coloured tags changes in the images captured by the camera 40. The simulation system receiving the images from the camera is calibrated such that rotational position and/or movement of the tip 20 can be measured by determining the angle(s) between the tags and/or determining changes therein.
Typical pseudo code used by the simulation system for measuring tip rotation from the received images may be as follows:
• go through all the pixels within the central region of interest of a video frame;
• if pixel is predominantly blue, green or red, add the x and y coordinate value to the respective running total and increment the respective (blue, green or red) pixel count;
• get the average x and y coordinates for blue, green and red by dividing the respective pixel running totals by the respective pixel count;
• the gradient between any two centres of gravity gives the tangent of the stem rotation.
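The tip-rotation pseudo code above may be sketched as follows, again under the assumption that frames are nested lists of (blue, green, red) tuples; the classification of a pixel as "predominantly" one colour is an illustrative simplification.

```python
# Illustrative sketch of the stem-rotation measurement: find the centre
# of gravity of each coloured tag on the collar 34, then take the angle
# of the line between any two centres.

import math

def colour_centroids(frame, roi):
    """Average x/y coordinates of predominantly blue, green and red
    pixels within the central region of interest."""
    sums = {"b": [0, 0, 0], "g": [0, 0, 0], "r": [0, 0, 0]}  # [sum_x, sum_y, count]
    x0, y0, x1, y1 = roi
    for y in range(y0, y1):
        for x in range(x0, x1):
            b, g, r = frame[y][x]
            if b > g and b > r:
                key = "b"
            elif g > b and g > r:
                key = "g"
            elif r > b and r > g:
                key = "r"
            else:
                continue  # no dominant colour; skip the pixel
            sums[key][0] += x
            sums[key][1] += y
            sums[key][2] += 1
    return {k: (s[0] / s[2], s[1] / s[2]) for k, s in sums.items() if s[2]}

def stem_rotation(c1, c2):
    """Angle of the line between two tag centres of gravity; its
    gradient gives the tangent of the stem rotation."""
    return math.atan2(c2[1] - c1[1], c2[0] - c1[0])
```

Calibrating against the initial centroid positions, and tracking changes in the returned angle over successive frames, would then yield rotational movement of the tip 20.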
Finally, in order to measure tip angulation, a separate light module (not shown) may be provided which transmits light down the fibreoptic tip to give a bright spot within the captured images, and the simulation system may be calibrated to determine direction and extent of tip angulation as a function of the distance and location of the bright spot relative to a reference or central point.
Typical pseudo code used by the simulation system for measuring tip angulation from the received images may be as follows:
• go through all pixels within the central region of interest of a video frame;
• if the blue, green and red components of a pixel are above a defined threshold, consider it a white pixel and add the x and y coordinate values to the running totals and increment the pixel count;
• get the average x and y coordinates for white (i.e. the bright spot) by dividing the running totals by the pixel count;
• the distance of the averaged white coordinates from the calibrated initial position, and/or the ratio of the number of white pixels to the calibrated initial number of white pixels, indicates the tip angulation.
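The tip-angulation pseudo code above may be sketched as follows. As before, the frame representation and the threshold are illustrative assumptions, and the reference point and reference spot size are taken to have been established during calibration.

```python
# Illustrative sketch of the tip-angulation measurement: locate the
# bright spot produced by the light module, then compare its position
# and size against calibrated reference values.

def bright_spot(frame, roi, threshold=200):
    """Centroid and pixel count of the bright spot within the central
    region of interest; returns None if no bright pixels are found."""
    sx = sy = n = 0
    x0, y0, x1, y1 = roi
    for y in range(y0, y1):
        for x in range(x0, x1):
            b, g, r = frame[y][x]
            if b > threshold and g > threshold and r > threshold:
                sx += x
                sy += y
                n += 1
    if n == 0:
        return None
    return (sx / n, sy / n, n)

def tip_angulation(spot, reference_point, reference_count):
    """Distance of the spot from the calibrated reference point, and
    the ratio of spot size to the calibrated size, indicate the
    direction and extent of tip angulation."""
    cx, cy, n = spot
    rx, ry = reference_point
    distance = ((cx - rx) ** 2 + (cy - ry) ** 2) ** 0.5
    size_ratio = n / reference_count
    return distance, size_ratio
```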
The anatomical region displayed on the screen of the computer simulation system may comprise real footage, simulated graphics, or a combination of the two. The technology required to provide a path and simulate movement therethrough is well- known in the field of computer gaming and the like, and will not be discussed in any great detail herein. The significance of the simulation system of the present invention is the ability to identify and determine movement of a remote endoscopic device using computer vision (i.e. processing and analysis of received images) and then translate such movement into the same movement within a simulated environment.
Thus, the present invention proposes a cost-effective, realistic fibreoptic endoscope (e.g. bronchoscope) simulator which combines computer vision technology and computer game technology to provide a unique system. As described above, in an exemplary embodiment, the endoscope is passed into a hollow box, mimicking the usual passage into an anatomical space. The box interior should ideally be optimised, possibly through empirical experimentation, for computer vision tracking of the three-dimensional position of the endoscope tip.
Video cameras within the box (or even attached to the eyepiece on the handset of the endoscope) pass their video feeds to the computer simulation system. Thresholding and edge detection image processing operations are then performed. The resulting significant pixels or edges are trigonometrically translated to the endoscope's three-dimensional position in the box. The endoscope thus effectively functions as a three-dimensional "joystick".
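The thresholding and edge detection operations mentioned above may be sketched, in very simplified form, as follows. The specification does not prescribe particular operators, so the binary threshold and the neighbour-difference edge test below are illustrative stand-ins for whatever image processing operations a practical implementation would employ.

```python
# Illustrative sketch of the two image processing stages: binary
# thresholding of a greyscale frame, followed by a simple edge test.
# Both operators are placeholder assumptions, not the specified method.

def threshold(gray, cutoff):
    """Binary thresholding: 1 where the intensity exceeds the cutoff."""
    return [[1 if px > cutoff else 0 for px in row] for row in gray]

def edges(binary):
    """Mark pixels whose right or lower neighbour differs, i.e. pixels
    lying on a boundary between thresholded regions."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            if binary[y][x] != binary[y][x + 1] or binary[y][x] != binary[y + 1][x]:
                out[y][x] = 1
    return out
```

The significant pixels or edges identified in this way would then be translated, via the calibrated geometry of the box, into the endoscope's three-dimensional position.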
Virtual reality obstacle courses can be constructed de novo, while virtual reality difficult airways (for example) may be reconstructed from existing medical video libraries. A simulation can be constructed using any commercial computer game authoring software, or the like, whereby the object is to negotiate the endoscope through the "obstacles". Using 3D game technology, the operator sees the simulated view from the endoscopic tip on the monitor. Virtual reality obstacle courses may be designed for dexterity training, whereas virtual reality difficult airway models may be designed for realistic scenario training. Responses such as fogging, coughing and haemorrhage may be simulated. Video sequences of the actual intubation may be made available for comparison. Qualitative user feedback and quantitative time-based measurements are used to assess and optimise the effectiveness of the scenes and the simulation engine.
It will be appreciated that, while an exemplary embodiment of the present invention is described above with specific reference to bronchoscopy, the system of the present invention is equally applicable to various other minimally invasive medical procedures, particularly endoscopic procedures including, but not limited to, laryngoscopy, gastroscopy, colonoscopy, sigmoidoscopy, arthroscopy, laparoscopy,
uteroscopy, etc., and the present invention is not necessarily intended to be limited in this regard.
It should be noted that the above-mentioned embodiment illustrates rather than limits the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words "comprising" and "comprises", and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. The singular reference of an element does not exclude the plural reference of such elements and vice-versa. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.