WO2015110929A1 - Robotic control of an endoscope orientation

Robotic control of an endoscope orientation

Info

Publication number
WO2015110929A1
Authority
WO
WIPO (PCT)
Prior art keywords
endoscope
robot
anatomical region
endoscopic image
controller
Prior art date
Application number
PCT/IB2015/050137
Other languages
French (fr)
Inventor
Aleksandra Popovic
David Paul Noonan
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015110929A1 publication Critical patent/WO2015110929A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 Holding or positioning arrangements
    • A61B1/0016 Holding or positioning arrangements using motor drive units
    • A61B1/00149 Holding or positioning arrangements using articulated arms
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras


Abstract

A robot guiding system employing a robot unit (10) including an endoscope (12) and a robot (11), and a control unit (20) including an endoscopic image controller (22) and a robot controller (21). In operation, the endoscope (12) generates an endoscopic image of an anatomical region as the robot (11) moves the endoscope (12) within the anatomical region in response to robotic actuation commands. The endoscopic image controller (22) controls a display of the endoscopic image (14) of the anatomical region, and the robot controller (21) communicates the robotic actuation commands to the robot (11). The robot controller (21) and/or the endoscopic image controller (22) is operable to maintain a displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.

Description

ROBOTIC CONTROL OF AN ENDOSCOPE ORIENTATION
The present invention generally relates to robotic control of an endoscope during a minimally invasive surgical procedure (e.g., a minimally invasive coronary bypass grafting surgery). The present invention specifically relates to maintaining a displayed orientation of an anatomical region within an endoscopic image as a robot controls a movement (i.e., translation and/or rotation) of the endoscope within the anatomical region.
Minimally invasive surgery is often performed under visual guidance from a rigid endoscope. Examples of such procedures include, but are not limited to, laparoscopic gastrointestinal surgery, thoracoscopy, and video-assisted cardiac surgery. More particularly, the rigid endoscope is inserted into a patient's body through a small port and is rotated and/or pivoted around the port during the surgery to allow visualization of all relevant structures inside the body. The manipulation of the endoscope is usually performed by a surgical assistant while the surgeon holds other instruments. This setup is difficult due to the many coordinate systems involved (e.g., the surgeon's, the assistant's, the scope's, etc.) and the so-called 'fulcrum effect', an inherent inversion of the instrument axes.
Moreover, the fulcrum effect is amplified if the endoscope axis is rotated with respect to the interventional instrument(s) (i.e., when the endoscope image is not aligned with the 'up' direction of the interventional instrument(s)). In addition, if the endoscope is being held by a robotic device, then this rotation may be an inherent part of the robot kinematics. For example, a spherical robot as known in the art (i.e., a robot having concentric arcs) is suitable for endoscopic surgery since it maintains a remote center of motion (i.e., a fulcrum point at which there is no motion). However, rotation of the endoscope axis poses an issue for the surgeon, as it continuously changes the orientation of the image.
One known possible solution is to place a servo motor and an accelerometer on a CCD camera of the endoscope. In operation, the accelerometer computes the relative orientation of the CCD camera with respect to the earth's gravitational field. The servo motor can rotate the CCD camera around the endoscope axis and is responsive to values from the accelerometer, whereby the servo motor orients the CCD camera so that a main axis of the CCD camera is always aligned with the earth's gravitational field.
Another known possible solution also uses an accelerometer on the CCD camera. However, this solution does not mechanically rotate the camera, but rotates the image presented on the screen based on the same operating principle of alignment with the earth's gravitational field.
While the aforementioned solutions maintain orientation with respect to gravity, they are not applicable to minimally invasive procedures, particularly when the operating room table is tilted, as is common for such procedures. The present invention maintains a desired displayed orientation of an anatomical region within an endoscopic image for robotic systems with spherical mechanisms without using additional sensors. The desired displayed orientation of the anatomical region within the endoscopic image is maintained either actively by a physical rotation of the endoscope camera (e.g., a rotation of the endoscope or a rotation of the endoscope camera counter to a rotation of the endoscope), or passively by a graphical rotation of the endoscopic image, to thereby maintain the correct endoscopic image orientation at all times.
One form of the present invention is a robot guiding system employing a robot unit including an endoscope and a robot, and a control unit including an endoscopic image controller and a robot controller. In operation, the endoscope generates an endoscopic image of an anatomical region as the robot moves the endoscope within the anatomical region in response to robotic actuation commands. The endoscopic image controller controls a display of the endoscopic image of the anatomical region, and the robot controller communicates the robotic actuation commands to the robot. The robot controller and/or the endoscopic image controller is operable to maintain a displayed orientation of the anatomical region within the endoscopic image as the robot moves the endoscope within the anatomical region.
A second form of the present invention is a robot guiding method involving a commanding of a robot to move an endoscope within an anatomical region, and a displaying of an endoscopic image of the anatomical region as the endoscope is moved by the robot within the anatomical region. The commanding of the robot and/or the displaying of the endoscopic image maintains a displayed orientation of the anatomical region within the endoscopic image as the robot moves the endoscope within the anatomical region.
The foregoing forms and other forms of the present invention as well as various features and advantages of the present invention will become further apparent from the following detailed description of various embodiments of the present invention read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present invention rather than limiting, the scope of the present invention being defined by the appended claims and equivalents thereof.
FIG. 1 illustrates an exemplary embodiment of a robotic guiding system in accordance with the present invention.
FIG. 2 illustrates an exemplary surgical implementation of the robotic guiding system shown in FIG. 1.
FIG. 3 illustrates an exemplary fulcrum point of a spherical endoscope shown in FIG. 2.
FIG. 4 illustrates a flowchart representative of an exemplary embodiment of an endoscopic image orientation method in accordance with the present invention.
As shown in FIG. 1, a robotic guiding system employs a robot unit 10 and a control unit 20 for any endoscopic procedure involving an endoscopic imaging of an anatomical region (e.g., cranial region, thoracic region, abdominal region, patellar region, etc.). Examples of such endoscopic procedures include, but are not limited to, minimally invasive cardiac surgery (e.g., coronary artery bypass grafting or mitral valve replacement), laparoscopic surgery (e.g., hysterectomy, prostatectomy and gall bladder surgery), natural orifice transluminal surgery (NOTES), single incision laparoscopic surgery (SILS), pulmonary/bronchoscopic surgery and minimally invasive diagnostic interventions (e.g., arthroscopy).
Robot unit 10 includes a spherical robot 11, an endoscope 12 rigidly attached to robot 11 and a video capture device 13 attached to endoscope 12.
Spherical robot 11 is broadly defined herein as any robotic device structurally configured with motorized control of one or more joints concentrically connecting arc segments for maneuvering an end-effector as desired for the particular endoscopic procedure. In practice, robot 11 should have all three (3) rotational degrees-of-freedom: pitch, yaw and roll.
Endoscope 12 is broadly defined herein as any device structurally configured with an ability to image from inside a body. Examples of endoscope 12 for purposes of the present invention include, but are not limited to, any type of scope, flexible or rigid (e.g., endoscope, arthroscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, gastroscope, hysteroscope, laparoscope, laryngoscope, neuroscope, otoscope, push enteroscope, rhinolaryngoscope, sigmoidoscope, sinuscope, thoracoscope, etc.) and any device similar to a scope that is equipped with an imaging system. The imaging is local, and surface images may be obtained optically with fiber optics, lenses, and miniaturized (e.g., CCD-based) imaging systems (e.g., laparoscopic ultrasound). For purposes of the present invention, camera 12a is defined herein as any type of imaging component of endoscope 12.
In practice, endoscope 12 is mounted to the end-effector of robot 11. A pose of the end-effector of robot 11 is a location and an orientation of the end-effector within a coordinate system of an actuator of robot 11. With endoscope 12 mounted to the end-effector of robot 11, any given pose of a field-of-view ("FOV") of endoscope 12 within an anatomical region corresponds to a distinct pose of the end-effector of robot 11 within the robotic coordinate system. Consequently, each distinct individual endoscopic image of the anatomical region generated by endoscope 12 may be linked to a corresponding pose of endoscope 12 within the robotic coordinate system.
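Since each endoscopic image frame can be linked to the end-effector pose at which it was acquired, this bookkeeping can be illustrated with a minimal Python sketch; poses are assumed here to be 4x4 homogeneous transforms, and the class and method names are hypothetical rather than part of the disclosed system.

```python
# Hypothetical sketch: pair each captured endoscopic frame with the robot
# end-effector pose at acquisition time, so any image can be mapped back to a
# pose in the robotic coordinate system. Not the patent's implementation.
import numpy as np

class PoseTaggedFrameLog:
    def __init__(self):
        self._entries = []  # list of (image array, 4x4 homogeneous transform)

    def record(self, frame: np.ndarray, end_effector_pose: np.ndarray) -> None:
        assert end_effector_pose.shape == (4, 4)
        self._entries.append((frame.copy(), end_effector_pose.copy()))

    def pose_of(self, index: int) -> np.ndarray:
        # Return the end-effector pose linked to the index-th frame.
        return self._entries[index][1]
```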
Video capture device 13 is broadly defined herein as any device structurally configured with a capability to convert an endoscopic video signal from endoscope 12 into a computer readable temporal sequence of endoscopic image frames ("EIF") 14. In practice, video capture device 13 may employ a frame grabber of any type for capturing individual digital still frames from the endoscopic video signal.
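By way of illustration only, such a frame grabber could be approximated in software with OpenCV; the device index and frame count below are arbitrary assumptions, and video capture device 13 may instead rely on dedicated capture hardware.

```python
# Illustrative frame-grabber loop using OpenCV (assumed interface); captures a
# temporal sequence of endoscopic image frames from a video signal.
import cv2

cap = cv2.VideoCapture(0)      # device index 0 is an assumption for the endoscope feed
frames = []                    # temporal sequence of endoscopic image frames ("EIF" 14)
while cap.isOpened() and len(frames) < 100:
    ok, frame = cap.read()     # grab one digital still frame from the video signal
    if not ok:
        break
    frames.append(frame)
cap.release()
```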
Still referring to FIG. 1, control unit 20 includes a robot controller 21 and an endoscopic image controller 22.
Robot controller 21 is broadly defined herein as any controller structurally configured to provide one or more robot actuator commands ("RAC") 26 to robot 11 as known in the art for controlling a pose of the end-effector of robot 11 as desired for the endoscopic procedure. More particularly, robot controller 21 converts operator pose commands ("OPC") 24 from an operator of control unit 20 into robot actuator commands 26. For example, operator pose command(s) 24 may indicate an endoscopic path leading to a desired 3D pose of the FOV of endoscope 12 within the anatomical region, whereby robot controller 21 converts operator pose command(s) 24 into robot actuator commands 26 including an actuation current for each motor of robot 11 as needed to move (i.e., translate and/or rotate) endoscope 12 to the desired 3D pose of the FOV of endoscope 12 within the anatomical region.
Endoscope controller 22 is broadly defined herein as any controller structurally configured for controlling an endoscopic image display 15 of endoscopic image frames 14 as known in the art. For purposes of the present invention, endoscopic image display 15 is broadly defined to include an operator viewing of a display of endoscopic image frames 14 via an eyepiece (not shown) of endoscope 12 and/or by a screen monitor (not shown) for videoscope embodiments of endoscope 12. The present invention enhances endoscopic image display 15 by further structurally configuring robot controller 21 to generate endoscope pose commands 25 for maintaining a desired displayed orientation of an anatomical region within endoscopic image display 15. More particularly, an operator orients camera 12a (or a component thereof (e.g., a lens)) relative to the anatomical region to define a desired display orientation of the anatomical region in terms of "up", "down", "left" and "right" for purposes of visually tracking interventional tool(s) as desired. For example, as shown in FIG. 2, an operator may orient camera 12a relative to a thoracic region 31 of a patient 30 to define a desired display orientation of the anatomical region in terms of up ("U"), down ("D"), left ("L") and right ("R") for purposes of visually tracking an interventional tool 40 as desired. The desired displayed orientation of the anatomical region within the endoscopic image display 15 as shown in FIG. 2 is maintained either actively by a physical rotation of endoscope camera 12a (e.g., a rotation of endoscope 12 or a rotation of endoscope camera 12a counter to a rotation of the endoscope 12), or passively by a graphical rotation of endoscopic image frames 14, to thereby maintain the desired endoscopic image orientation of display 15 at all times.
Also by example, an operator may generally orient camera 12a relative to thoracic region 31 of a patient 30. To specifically define a desired display orientation of the anatomical region for purposes of visually tracking interventional tool 40 as desired, the operator views the current endoscopic display 15 of the thoracic region 31 and clicks on the monitor to select specific anatomical feature(s) within display 15. Again, the desired displayed orientation of the anatomical region within the endoscopic image display 15 will be maintained either actively by a physical rotation of endoscope camera 12a, or passively by a graphical rotation of endoscopic image frames 14, to thereby maintain the desired endoscopic image orientation of display 15 at all times.
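The passive option, i.e. graphically rotating the displayed frames rather than the hardware, can be sketched as follows; the sketch assumes the roll angle to be compensated is already known from the robot kinematics, and uses OpenCV purely for illustration.

```python
# Sketch of the passive option: rotate each endoscopic frame by the negative
# of the endoscope roll so the anatomy keeps the operator-selected "up".
import cv2
import numpy as np

def rotate_frame(frame: np.ndarray, roll_deg: float) -> np.ndarray:
    h, w = frame.shape[:2]
    center = (w / 2.0, h / 2.0)
    M = cv2.getRotationMatrix2D(center, -roll_deg, 1.0)  # counter-rotation, no scaling
    return cv2.warpAffine(frame, M, (w, h))
```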
For maintaining a desired displayed orientation of an anatomical region within endoscopic image display 15, endoscope pose commands 25 are inclusive of robotic actuator commands 26 and optionally image pose commands 27 in accordance with an endoscopic image orientation method of the present invention. A description of FIGS. 3 and 4 will now be provided herein to facilitate a further understanding of endoscope pose commands 25.
Referring to FIG. 3, the present invention is premised on a discovery that a rolling velocity for an endoscope along its main axis is defined by a given pitch velocity and a yaw velocity for the endoscope relative to the main axis. More particularly, robot 11 includes actuators 11a and 11c, arc segments 11b and 11d, end-effector 11e, and camera interface 11f. In operation, (1) actuator 11a may be commanded to co-rotate arc segments 11b and 11d about a rotation axis RA1 for a desired φ1 degrees, (2) actuator 11c may be commanded to rotate arc segment 11d about a rotation axis RA2 for a desired φ2 degrees, (3) end-effector 11e has a capability of rotating endoscope 12 about its rotational axis EA, and (4) camera interface 11f has a capability of rotating endoscope camera 12a (not shown) about rotational axis EA. Furthermore, (5) rotational axes RA1 and RA2 maintain a θ1 degree separation, (6) rotational axes RA1 and EA maintain a θ2 degree separation, and (7) rotational axes RA1, RA2 and EA intersect at a fulcrum point FP.
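The two-arc geometry just described can be sketched generically as follows; the axis conventions, composition order and function names are assumptions made for illustration and do not reproduce the matrix T established by equations [1]-[21] below.

```python
# Generic forward-kinematics sketch for a two-arc spherical mechanism with a
# remote center of motion at fulcrum point FP (assumed conventions).
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def endoscope_pose(phi1, phi2, theta1, theta2, d, fp=np.zeros(3)):
    """Orientation of the endoscope and position of its tip for actuator angles
    phi1, phi2 and fixed angular separations theta1, theta2 between successive
    rotation axes; d is the distance from the endoscope tip to fulcrum point FP."""
    R = rot_z(phi1) @ rot_y(theta1) @ rot_z(phi2) @ rot_y(theta2)
    tip = fp + R @ np.array([0.0, 0.0, d])  # tip lies on axis EA at distance d from FP
    return R, tip
```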
The following equations [1]-[21] mathematically establish a matrix T defining a position and an orientation of endoscope 12, where the pitch and yaw of endoscope 12 are along axis EA:
(Equations [1]-[21] appear as images in the original publication and are not transcribed here.)
where J is the distance from the endoscope tip to fulcrum point FP.
Consequently, a Jacobian matrix J relating a pitch angle velocity dp/dt, a yaw angle velocity dy/dt and a roll angle velocity dr/dt may be utilized for visual servoing of robot 11 in accordance with the following equations [22]-[41]:
(Equations [22]-[41] likewise appear as images in the original publication and are not transcribed here.)
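As a purely numerical stand-in for the closed-form Jacobian of equations [22]-[41], the roll velocity induced by commanded joint velocities can be estimated by finite differences on any forward-orientation model (such as the sketch above) and negated to obtain a counter-roll command; the definition of "image roll" and all helper names below are assumptions for illustration.

```python
# Finite-difference estimate of the roll velocity dr/dt induced by commanded
# joint velocities, returned with opposite sign as a counter-roll command.
import numpy as np

def image_roll(R, world_up=np.array([0.0, 0.0, 1.0])):
    # Angle of the world "up" direction within the camera image plane,
    # used as a proxy for the displayed orientation of the anatomy.
    cam_right, cam_up = R[:, 0], R[:, 1]  # image axes expressed in the world frame
    return np.arctan2(world_up @ cam_right, world_up @ cam_up)

def counter_roll_rate(orientation_fn, phi1, phi2, dphi1_dt, dphi2_dt, eps=1e-6):
    # orientation_fn(phi1, phi2) -> 3x3 rotation matrix of the endoscope camera.
    def roll_at(p1, p2):
        return image_roll(orientation_fn(p1, p2))
    dr_dphi1 = (roll_at(phi1 + eps, phi2) - roll_at(phi1 - eps, phi2)) / (2 * eps)
    dr_dphi2 = (roll_at(phi1, phi2 + eps) - roll_at(phi1, phi2 - eps)) / (2 * eps)
    dr_dt = dr_dphi1 * dphi1_dt + dr_dphi2 * dphi2_dt
    return -dr_dt  # apply to the roll degree of freedom (end-effector, camera interface, or display)
```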
Referring to FIG. 4, an execution of flowchart 50 ensures a correct orientation of endoscope 12: the roll degree-of-freedom of the end-effector or camera interface of robot 11 (FIG. 1), or of display 15 (FIG. 1), has to rotate with velocity -dr/dt, which is counter to the roll angle velocity +dr/dt derived from equations [1]-[41].
Specifically, prior to an execution of flowchart 50 at the beginning of the procedure, a surgeon selects a desired endoscope orientation by mounting the camera on the endoscope at a specific angle or by interfacing with the camera interface of robot 11. As the procedure is performed, a stage S52 of flowchart 50 encompasses robot controller 21 (FIG. 1) computing the angle velocities for pitch, yaw, roll and counter-roll responsive to operator pose commands 24 (FIG. 1) for moving endoscope 12 to a desired 3D pose inclusive of a fixed display orientation, and a stage S54 of flowchart 50 encompasses robot controller 21 generating endoscope pose commands 25 (FIG. 1) to achieve the computed angle velocities.
In one embodiment of stage S54, endoscope pose commands 25 consist of robot actuator commands 26 (FIG. 1) for actuating robot 11 as needed to move endoscope 12 to the desired 3D pose inclusive of the fixed display orientation. Of importance, the commands for implementing the counter-rolling angle velocity -dr/dt are applied to the end-effector and/or camera interface of robot 11.
Concurrently or alternatively, in another embodiment of stage S56, endoscope pose commands 25 consist exclusively of image pose commands 27 (FIG. 1) for rotating endoscopic image frames 14 as needed to display a fixed orientation of the anatomical region.
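Tying stages S52 and S54/S56 together, one plausible control cycle is sketched below; the numeric coupling standing in for the Jacobian of equations [22]-[41], the control period and the command sequence are all placeholders, not values from the disclosure.

```python
# Toy illustration of the two options: integrate the roll induced by commanded
# pitch/yaw rates, then either counter-roll the hardware or rotate the display.
import numpy as np

dt = 0.02                             # assumed control period in seconds
commanded = [(0.10, -0.05)] * 50      # assumed (pitch rate, yaw rate) per cycle, rad/s

accumulated_roll = 0.0
for dp_dt, dy_dt in commanded:
    # Stand-in for the kinematic coupling; in the disclosure dr/dt comes from
    # the Jacobian of equations [22]-[41].
    dr_dt = 0.3 * dp_dt + 0.7 * dy_dt
    counter_roll_command = -dr_dt     # active option (stage S54): send to the roll axis
    accumulated_roll += dr_dt * dt    # passive option (stage S56): track roll for the display

display_rotation_deg = -np.degrees(accumulated_roll)  # rotation to apply to display 15
```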
At any point during flowchart 50, the surgeon may select a different orientation and the robot automatically corrects for the new orientation. From the description of FIGS. 1-4 herein, those having ordinary skill in the art will appreciate the numerous benefits of the present invention including, but not limited to, an application of the present invention to any type of endoscopic surgery performed on any type of blood vessel.
While various embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that the embodiments of the present invention as described herein are illustrative, and various changes and modifications may be made and equivalents may be substituted for elements thereof without departing from the true scope of the present invention. In addition, many modifications may be made to adapt the teachings of the present invention without departing from its central scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed as the best mode contemplated for carrying out the present invention, but that the present invention includes all embodiments falling within the scope of the appended claims.

Claims

1. A robot guiding system, comprising:
a robot unit (10) including
an endoscope (12) operable to generate an endoscopic image (14) of an anatomical region, and
a robot (11) operably connected to the endoscope (12) to move the endoscope (12) within the anatomical region responsive to robotic actuation commands; and
a control unit (20) including
an endoscopic image controller (22) operably connected to the endoscope (12) to control a display of the endoscopic image (14) of the anatomical region, and
a robot controller (21) operably connected to the robot (11) to communicate the robotic actuation commands to the robot (11),
wherein at least one of the robot controller (21) and the endoscopic image controller (22) is operable to maintain a displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
2. The robot guiding system of claim 1, wherein the robot controller (21) controls a rotation of the endoscope (12) by the robot (11) to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
3. The robot guiding system of claim 1, wherein the endoscopic image controller (22) controls a rotation of the display of the endoscopic image (14) of the anatomical region to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
4. The robot guiding system of claim 1, wherein the robot controller (21) calculates a roll velocity of the endoscope (12) along a main axis of endoscope (12).
5. The robot guiding system of claim 4, wherein the robot controller (21) calculates the roll velocity of the endoscope (12) as a function of a pitch velocity of the endoscope (12) relative to the main axis of endoscope (12).
6. The robot guiding system of claim 4, wherein the robot controller (21) calculates the roll velocity of the endoscope (12) as a function of a yaw velocity of the endoscope (12) relative to the main axis of endoscope (12).
7. The robot guiding system of claim 4, wherein the robot controller (21) controls a rotation of the endoscope (12) along the main axis of the endoscope (12) to counter the calculated roll velocity of the endoscope (12) along the main axis of endoscope (12).
8. The robot guiding system of claim 4, wherein the endoscopic image controller (22) controls a rotation of the display of the endoscopic image (14) of the anatomical region to counter the calculated roll velocity of the endoscope (12) along the main axis of endoscope (12).
9. The robot guiding system of claim 1, wherein the endoscopic image controller (22) is operated to define at least one of an up view, a down view, a left view and a right view of the displayed orientation of the anatomical region within the endoscopic image (14).
10. The robot guiding system of claim 1, wherein the robot controller (21) determines a pose of the robot (11) to define at least one of an up view, a down view, a left view and a right view of the displayed orientation of the anatomical region within the endoscopic image (14).
11. A control unit (20) for a robot (11) connected to an endoscope (12) generating an endoscopic image (14) of an anatomical region, the control unit (20) comprising:
an endoscopic image controller (22) operable to control a display of the endoscopic image (14) of the anatomical region; and
a robot controller (21) operable to communicate robotic actuation commands to the robot (11) to move the endoscope (12) within the anatomical region; and
wherein at least one of the robot controller (21) and the endoscopic image controller (22) is operable to maintain a displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
12. The control unit (20) of claim 11, wherein the robot controller (21) controls a rotation of the endoscope (12) by the robot (11) to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
13. The control unit (20) of claim 11, wherein the endoscopic image controller (22) controls a rotation of the display of the endoscopic image (14) of the anatomical region to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
14. The control unit (20) of claim 11, wherein the robot controller (21) calculates a roll velocity of the endoscope (12) along a main axis of endoscope (12) as a function of at least one of a pitch velocity and a yaw velocity of the endoscope (12) relative to the main axis of endoscope (12).
15. The control unit (20) of claim 14, wherein at least one of the robot controller (21) controls a rotation of the endoscope (12) along the main axis of the endoscope (12) to counter the calculated roll velocity of the endoscope (12) along the main axis of endoscope (12), and the endoscopic image controller (22) controls a rotation of the display of the endoscopic image (14) of the anatomical region to counter the calculated roll velocity of the endoscope (12) along the main axis of endoscope (12).
16. A robot guiding method, comprising:
commanding a robot (11) to move an endoscope (12) within an anatomical region; and
displaying an endoscopic image (14) of the anatomical region as the endoscope (12) is moved by the robot (11) within the anatomical region,
wherein at least one of the commanding of the robot (11) and the displaying of the endoscopic image (14) maintains a displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
17. The robot guiding method of claim 16, wherein a rotation of the endoscope (12) by the robot (11) is controlled to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
18. The robot guiding method of claim 16, wherein a rotation of the display of the endoscopic image (14) of the anatomical region is controlled to maintain the displayed orientation of the anatomical region within the endoscopic image (14) as the robot (11) moves the endoscope (12) within the anatomical region.
19. The robot guiding method of claim 16, wherein a roll velocity of the endoscope (12) along a main axis of endoscope (12) is calculated as a function of at least one of a pitch velocity and a yaw velocity of the endoscope (12) relative to the main axis of endoscope (12).
20. The robot guiding method of claim 16, wherein at least one of a rotation of the endoscope (12) along the main axis of the endoscope (12) and a rotation of the display of the endoscopic image (14) of the anatomical region is controlled to counter the calculated roll velocity of the endoscope (12) along the main axis of endoscope (12).
PCT/IB2015/050137 2014-01-24 2015-01-08 Robotic control of an endoscope orientation WO2015110929A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461931187P 2014-01-24 2014-01-24
US61/931,187 2014-01-24

Publications (1)

Publication Number Publication Date
WO2015110929A1 (en) 2015-07-30

Family

ID=52474042

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/050137 WO2015110929A1 (en) 2014-01-24 2015-01-08 Robotic control of an endoscope orientation

Country Status (1)

Country Link
WO (1) WO2015110929A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4183313A1 (en) * 2021-11-22 2023-05-24 ROEN Surgical, Inc. System and device for endoscope surgery robot


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020082612A1 (en) * 1998-11-20 2002-06-27 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US20050154260A1 (en) * 2004-01-09 2005-07-14 Schara Nathan J. Gravity referenced endoscopic image orientation
US20110196199A1 (en) * 2010-02-11 2011-08-11 Intuitive Surgical Operations, Inc. Method and system for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope


Similar Documents

Publication Publication Date Title
EP3104804B1 (en) Robotic control of surgical instrument visibility
US10675105B2 (en) Controller definition of a robotic remote center of motion
CN110325331B (en) Medical support arm system and control device
US10786319B2 (en) System, control unit and method for control of a surgical robot
EP2822445B1 (en) Overall endoscopic control system
EP3119325B1 (en) Systems and methods for control of imaging instrument orientation
US8668638B2 (en) Method and system for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope
JP5737796B2 (en) Endoscope operation system and endoscope operation program
WO2014073122A1 (en) Endoscope operating system
US20190069955A1 (en) Control unit, system and method for controlling hybrid robot having rigid proximal portion and flexible distal portion
JP6886968B2 (en) How to use an angled endoscope to visualize body cavities using a robotic surgical system
CN114051387A (en) Medical observation system, control device, and control method
WO2015110929A1 (en) Robotic control of an endoscope orientation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15704846

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15704846

Country of ref document: EP

Kind code of ref document: A1