DE10249786A1 - Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm - Google Patents
- Publication number
- DE10249786A1
- Authority
- DE
- Germany
- Prior art keywords
- robot
- workpiece
- reference point
- camera
- referencing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39008—Fixed camera detects reference pattern held by end effector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39039—Two cameras detect same reference on workpiece to define its position in space
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Robotics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Mechanical Engineering (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Manipulator (AREA)
Abstract
Description
The invention relates to a method by which a robot can be referenced to a workpiece. It further relates to a device suitable for carrying out this method.
A number of methods are currently known in the art for bringing a robot into relation to a workpiece. Optical systems are used in particular for medical applications, e.g. for surgical robots. For this purpose, at least two cameras are installed at different positions in the room, both directed at the "workpiece" (in medical applications the patient or, more precisely, the part of the patient to be treated, for example the entry opening of an endoscope), and each records images of the "workpiece" (the term is used here to include a patient) from its own viewing angle. The images are later overlaid and related to one another, from which the three-dimensional, spatial position of the workpiece can be determined. Using the robot's coordinate data, which specifies the robot's position, the location and orientation of the robot relative to the workpiece are then known; the two are related to one another, i.e. the robot is referenced. Such referencing need not be carried out only with respect to a workpiece; often the robot is also referenced by means of so-called reference marks located at fixed positions in the room, from which the robot's position can be determined. Once the robot is referenced relative to the workpiece, it can carry out its tasks and operations. As a rule, the positions of the robot and of the workpiece also continue to be monitored during a task or operation, and the robot is re-referenced if necessary.
However, this system has several disadvantages. At least two cameras are required, which must be positioned at fixed locations in the room and whose viewing angles and distances to the workpiece must be known in order to carry out the referencing. If the cameras were moved without this being detected, the resulting deviations would lead to enormous errors in the robot's work. For this reason, constant comparison of the position data and monitoring of the cameras' locations is unavoidable. Moreover, the presence of two or more cameras entails increased expenditure on technical equipment, and thus cost, since not only the cameras themselves but also the corresponding measuring, transmission and processing units must be provided. For this reason, it has long been sought to improve methods for referencing a robot to a workpiece, where in the medical context the workpiece is also to be understood as a patient.
It is known from photographic technology that stereoscopic images can be recorded with only one camera by placing the camera at a certain distance from the object to be photographed and moving it either in parallel or along a circular arc, so that images can be recorded from different positions and later combined in pairs into stereoscopic images. Such devices and methods are known, for example, from the
In addition to the referencing method described above, i.e. the use of two cameras, US Patent 5,784,282 discloses a method and a device with which referencing can be carried out using a single video camera. For this purpose, a number of reference surfaces are installed in the room, of which the camera takes images. After the optical axes have been determined, these axes are related to one another, in particular by triangulation, which determines the position and orientation of the moving object. This method is used to determine precisely the position and orientation of sensors or tools attached to a robot.
Owing to the presence of the reference surfaces and the sterility required in medicine, however, this method proves unsuitable for such use: like any object in an operating room, the reference surfaces can be subject to contamination and make disinfection of the operating room additionally difficult. Furthermore, the method achieves only an indirect referencing of workpiece and robot.
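The cited single-camera approach relates optical axes to one another by triangulation. As an illustration only (the patent text does not give the actual algorithm of US 5,784,282), two viewing rays with known origins and directions can be intersected by the standard closest-point construction:

```python
import numpy as np

def intersect_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two 3D rays.

    o1, o2 : ray origins (e.g. camera centers).
    d1, d2 : ray directions (e.g. optical axes); normalized internally.
    Returns the 3D point best explained by both rays.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = o2 - o1
    # Normal equations for minimizing |o1 + t1*d1 - (o2 + t2*d2)|^2.
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    det = a11 * a22 - a12 ** 2  # zero only for parallel rays
    t1 = (a22 * (b @ d1) - a12 * (b @ d2)) / det
    t2 = (a12 * (b @ d1) - a11 * (b @ d2)) / det
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```

If the two rays pass exactly through the same point, the midpoint is that point; otherwise it averages out the residual measurement error.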
There is consequently a need to provide a method for referencing a robot to a workpiece that avoids the disadvantages described and with which a robot can be referenced to a workpiece reliably and repeatably. The method should also be simple, require little equipment, and work very precisely. In addition, a device for carrying out the referencing method is to be provided.
This object is achieved by a method according to claim 1 and a device according to claim 9. Preferred embodiments of the invention are set out in the subclaims.
The referencing method according to the invention comprises several steps. First, at least one image of a workpiece (or patient) is recorded from each of at least two positions with at least one camera. This camera is located on the robot, in particular on the robot arm with which the robot also carries out the subsequent operations. "Workpiece" here denotes any object or device that can be processed by a robot; in the medical field, to which the application is by no means limited, the workpiece would be the patient or, more precisely, the operation site. At least two images of this workpiece are thus recorded from different positions, and these positions should lie as far apart as possible, since this increases the accuracy of the method. The accuracy of the method can be increased further by taking more than two images. According to the invention, it is further provided that a reference point on the workpiece is selected in at least one image. This reference point must be selected identically in both recorded images.
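The recommendation to place the two camera positions as far apart as possible can be made quantitative with the standard stereo depth-error approximation (this formula is not part of the patent; the numbers below are purely illustrative):

```python
def depth_uncertainty(depth_m, baseline_m, focal_px, match_err_px=0.5):
    """Approximate depth uncertainty of two-view triangulation.

    Standard stereo approximation: dZ ~ Z^2 * d_disparity / (f * B),
    where B is the distance between the two camera positions.
    """
    return depth_m ** 2 * match_err_px / (focal_px * baseline_m)

# Doubling the distance between the two camera positions halves the
# depth uncertainty for the same point-matching error.
narrow = depth_uncertainty(depth_m=0.5, baseline_m=0.05, focal_px=1000.0)
wide = depth_uncertainty(depth_m=0.5, baseline_m=0.10, focal_px=1000.0)
print(narrow, wide)  # 0.0025 m vs 0.00125 m
```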
The selection can be made automatically by a computer, for example by having it select certain markers or "landmarks" that were determined beforehand. In the medical field in particular, however, selection of the reference point by the surgeon, the robot operator or another person will prove more convenient and safer. For this purpose, a preferred embodiment provides that the images are shown on a display device, for example a screen, before the selection, and that the reference point is selected in this image at the display device. This can be done, for example, by a mouse click or by a so-called segmentation process. Segmentation, however, requires a known workpiece, available for example as CAD data, or as patient data in the system's software. Regardless of how this reference point of the workpiece is selected, the x, y positions of the reference point in the two-dimensional images are related to one another and are thus related in three-dimensional space. This can be done by known pair-point matching methods, for example by triangulation. Once the position of the reference point in three-dimensional space is known, the position of the robot can be correlated to the workpiece, which can be done by simple algorithms. The robot is thus referenced to the workpiece.
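The pair-point matching step, in which the reference point's x, y image positions from two camera positions are related to obtain its 3D location, can be sketched with standard linear (DLT) triangulation. The projection matrices P1 and P2 are assumed known from the robot's position data, as the method requires; this is a generic illustration, not the patent's specific algorithm:

```python
import numpy as np

def triangulate(P1, P2, xy1, xy2):
    """Linear (DLT) triangulation of one reference point from two views.

    P1, P2   : 3x4 camera projection matrices for the two recording
               positions (intrinsics times pose).
    xy1, xy2 : (x, y) image coordinates of the identically selected
               reference point in each image.
    Returns the 3D point in the common (robot base) frame.
    """
    x1, y1 = xy1
    x2, y2 = xy2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0: the null vector is the right singular vector for
    # the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With noisy image coordinates the SVD yields the algebraic least-squares solution rather than an exact intersection, which is why more than two views improve the result.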
In contrast to the method known from the prior art, the method according to the invention effectively imitates a multiscopic camera. The exact camera relation as a known quantity, indispensable in the prior art, is dispensed with; instead, the images are related to one another via the position data of the camera at the time of recording. This eliminates both the need for several cameras and their exact positioning in the room, since the position of the camera is known at every moment from the position data of the robot. As a result, the referencing procedure is considerably simplified and can be repeated at any point during the operation. Elaborate referencing devices, such as the known reference surfaces, are likewise no longer required.
The device according to the invention, which is suitable for carrying out the method, comprises a robot with at least one robot arm at whose end a camera is mounted. This camera can be brought into at least two different positions, and with it at least one image of a workpiece can be recorded from each, preferably from positions of the robot's range of motion lying as far apart as possible. In other words, by moving the robot arm to whose end it is attached, the camera is brought into different positions, ideally the extreme positions, from each of which at least one image is then taken. Since the camera is mounted at the front end of the robot arm, the position of the camera depends on the positioning of the robot. A controller is also provided which can read out the position data of the camera and of the robot; this is widespread in robotics. As a rule, robotics uses a coordinate system in which the robot arm or robot hand and its joints are specified. Every point and every position of the robot hand thus corresponds to a certain point within this coordinate system. Not only the robot arm is specified in this coordinate system, but also the camera, in that the coordinates of the camera are defined indirectly via those of the robot arm. Consequently, at every point at which the camera is located, its position is known. In addition, a computing unit is provided with which the reference point can be selected and which carries out the computing operations for the relating and referencing. A display device, in particular a screen, is preferably also provided for displaying the images; the reference point can then also be selected manually there, in particular by clicking on the graphical user interface.
The robot is preferably a surgical robot. However, the device and the method are also suitable for all other types of robot application, e.g. in automated manufacturing or in warehouse and packaging technology. The camera attached to the robot arm can be a simple, commercially available camera, for example a CCD camera.
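Because the camera's coordinates are defined indirectly via those of the robot arm, its projection matrix at any robot configuration follows from the kinematic chain. A minimal sketch, assuming a fixed flange-to-camera mounting transform obtained from a one-time hand-eye calibration (the patent does not name such a step):

```python
import numpy as np

def camera_projection(K, T_base_flange, T_flange_cam):
    """Projection matrix of the arm-mounted camera in the robot base frame.

    K             : 3x3 camera intrinsics.
    T_base_flange : 4x4 flange pose reported by the robot controller.
    T_flange_cam  : 4x4 fixed mounting transform (assumed calibrated).
    """
    # Kinematic chain: pose of the camera in the base frame.
    T_base_cam = T_base_flange @ T_flange_cam
    # World->camera extrinsics are the inverse of the camera pose.
    R = T_base_cam[:3, :3]
    t = T_base_cam[:3, 3]
    extrinsics = np.hstack([R.T, (-R.T @ t).reshape(3, 1)])
    return K @ extrinsics
```

Evaluating this at each recording position yields the per-view projection matrices that the triangulation of the reference point requires, without any externally fixed camera.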
The invention is explained and described in more detail below with reference to the figures.
The figures show:
Claims (12)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10249786A DE10249786A1 (en) | 2002-10-24 | 2002-10-24 | Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10249786A DE10249786A1 (en) | 2002-10-24 | 2002-10-24 | Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm |
Publications (1)
Publication Number | Publication Date |
---|---|
DE10249786A1 true DE10249786A1 (en) | 2004-05-13 |
Family
ID=32102998
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DE10249786A Withdrawn DE10249786A1 (en) | 2002-10-24 | 2002-10-24 | Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm |
Country Status (1)
Country | Link |
---|---|
DE (1) | DE10249786A1 (en) |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004044457A1 (en) * | 2004-09-15 | 2006-03-30 | Wiest Ag | Method for compensating kinetic changes at robot, comprising determination of transformation matrix |
DE102005040714A1 (en) * | 2005-08-27 | 2007-03-08 | Abb Research Ltd. | Method and system for creating a movement sequence for a robot |
DE102005049439A1 (en) * | 2005-10-15 | 2007-04-26 | Bayerische Motoren Werke Ag | Control method for processing positions on a work-piece while operating a processing program uses a robot with a robot arm fitted with a tool positioned on the work-piece |
WO2007041267A3 (en) * | 2005-09-30 | 2007-06-07 | Restoration Robotics Inc | Automated systems and methods for harvesting and implanting follicular units |
DE102006004703A1 (en) * | 2006-01-31 | 2007-08-09 | MedCom Gesellschaft für medizinische Bildverarbeitung mbH | Method for operating a positioning robot especially in medical apparatus involves evaluating three dimensional image data and determining registration of coordinates to produce target area |
WO2008062285A2 (en) * | 2006-11-22 | 2008-05-29 | Health Robotics S.R.L. | Method and machine for manipulating toxic substances |
DE102007009851B3 (en) * | 2007-02-28 | 2008-05-29 | Kuka Roboter Gmbh | Industrial robot's position determining method, involves determining position of robot relative to object based on assigned model points, and determining position of camera assigned to image and position of camera relative to robot |
EP1932488A1 (en) * | 2006-12-12 | 2008-06-18 | Prosurgics Limited | Frame of reference registration system and method |
WO2009065827A1 (en) * | 2007-11-19 | 2009-05-28 | Kuka Roboter Gmbh | Device comprising a robot, medical work station, and method for registering an object |
US7621933B2 (en) | 2005-09-30 | 2009-11-24 | Restoration Robotics, Inc. | Tool assembly for harvesting and implanting follicular units |
US7922688B2 (en) | 2007-01-08 | 2011-04-12 | Restoration Robotics, Inc. | Automated delivery of a therapeutic or cosmetic substance to cutaneous, subcutaneous and intramuscular tissue regions |
US7962192B2 (en) | 2005-09-30 | 2011-06-14 | Restoration Robotics, Inc. | Systems and methods for aligning a tool with a desired location or object |
CN101568307B (en) * | 2006-11-22 | 2011-06-15 | 健康机器人技术有限公司 | Method and machine for manipulating toxic substances |
DE102009058607A1 (en) * | 2009-12-17 | 2011-06-22 | KUKA Laboratories GmbH, 86165 | Method and device for controlling a manipulator |
US8036448B2 (en) | 2007-04-05 | 2011-10-11 | Restoration Robotics, Inc. | Methods and devices for tattoo application and removal |
CN101304842B (en) * | 2005-09-13 | 2011-11-16 | 古德曼·斯莱特芒 | Optical-mechanical spotter |
US8811660B2 (en) | 2008-09-29 | 2014-08-19 | Restoration Robotics, Inc. | Object-tracking systems and methods |
WO2018019550A1 (en) * | 2016-07-26 | 2018-02-01 | Siemens Aktiengesellschaft | Method for controlling an end element of a machine tool, and machine tool |
AT519176A1 (en) * | 2016-10-14 | 2018-04-15 | Engel Austria Gmbh | robot system |
US10299871B2 (en) | 2005-09-30 | 2019-05-28 | Restoration Robotics, Inc. | Automated system and method for hair removal |
DE102019130046A1 (en) * | 2019-01-25 | 2020-07-30 | Mujin, Inc. | A ROBOT SYSTEM WITH IMPROVED SCAN MECHANISM |
DE102019126903B3 (en) * | 2019-10-07 | 2020-09-24 | Fachhochschule Bielefeld | Method and robot system for entering a work area |
US10870204B2 (en) | 2019-01-25 | 2020-12-22 | Mujin, Inc. | Robotic system control method and controller |
EP3912610A1 (en) * | 2014-10-27 | 2021-11-24 | Intuitive Surgical Operations, Inc. | System for registering to a surgical table |
US11413103B2 (en) | 2014-10-27 | 2022-08-16 | Intuitive Surgical Operations, Inc. | System and method for monitoring control points during reactive motion |
US11419687B2 (en) | 2014-10-27 | 2022-08-23 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table motion |
US11576737B2 (en) | 2014-10-27 | 2023-02-14 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table |
US11684448B2 (en) | 2014-10-27 | 2023-06-27 | Intuitive Surgical Operations, Inc. | Device with active brake release control |
DE102022200461A1 (en) | 2022-01-17 | 2023-07-20 | Volkswagen Aktiengesellschaft | Method and robot system for machining a workpiece and coordinate system markers for a robot system |
US11806875B2 (en) | 2014-10-27 | 2023-11-07 | Intuitive Surgical Operations, Inc. | Disturbance compensation in computer-assisted devices |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE1807884A1 (en) * | 1967-11-10 | 1969-07-17 | L R Industries Ltd | Condom |
DE4115846A1 (en) * | 1991-05-15 | 1992-11-19 | Ameling Walter | Contactless spatial position measurement in robot processing chamber - acquiring images of robotic actuator with defined marking enabling calibration of imaging units in coordinate system |
EP0528054A1 (en) * | 1991-03-07 | 1993-02-24 | Fanuc Ltd. | Detected position correcting method |
DE19626459C2 (en) * | 1996-07-02 | 1999-09-02 | Kuka Schweissanlagen Gmbh | Method and device for teaching a program-controlled robot |
DE19807884A1 (en) * | 1998-02-25 | 1999-09-09 | Schweikard | Computer-aided intra-operative anatomical object visualization method used during complex brain surgery |
DE10016963C2 (en) * | 2000-04-06 | 2002-02-14 | Vmt Vision Machine Technic GmbH | Method for determining the position of a workpiece in 3D space
EP1215017A2 (en) * | 2000-12-07 | 2002-06-19 | Fanuc Ltd | Robot teaching apparatus |
2002
- 2002-10-24 DE DE10249786A patent/DE10249786A1/en not_active Withdrawn
Cited By (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004044457A1 (en) * | 2004-09-15 | 2006-03-30 | Wiest Ag | Method for compensating kinetic changes at robot, comprising determination of transformation matrix |
DE102005040714B4 (en) * | 2005-08-27 | 2015-06-18 | Abb Research Ltd. | Method and system for creating a movement sequence for a robot |
DE102005040714A1 (en) * | 2005-08-27 | 2007-03-08 | Abb Research Ltd. | Method and system for creating a movement sequence for a robot |
CN101304842B (en) * | 2005-09-13 | 2011-11-16 | 古德曼·斯莱特芒 | Optical-mechanical spotter |
KR101015670B1 (en) * | 2005-09-30 | 2011-02-22 | 레스토레이션 로보틱스, 인코포레이티드 | Automated apparatus for harvesting and implanting follicular units |
US7621934B2 (en) | 2005-09-30 | 2009-11-24 | Restoration Robotics, Inc | Methods of harvesting and implanting follicular units using a coaxial tool |
US8133247B2 (en) | 2005-09-30 | 2012-03-13 | Restoration Robotics, Inc. | Tool assembly for harvesting and implanting follicular units |
US9526581B2 (en) | 2005-09-30 | 2016-12-27 | Restoration Robotics, Inc. | Automated system and method for harvesting or implanting follicular units |
WO2007041267A3 (en) * | 2005-09-30 | 2007-06-07 | Restoration Robotics Inc | Automated systems and methods for harvesting and implanting follicular units |
EP2781200A1 (en) * | 2005-09-30 | 2014-09-24 | Restoration Robotics, Inc. | Automated systems and methods for harvesting and implanting follicular units |
US10299871B2 (en) | 2005-09-30 | 2019-05-28 | Restoration Robotics, Inc. | Automated system and method for hair removal |
US8690894B2 (en) | 2005-09-30 | 2014-04-08 | Restoration Robotics, Inc. | Automated system for harvesting or implanting follicular units |
US7621933B2 (en) | 2005-09-30 | 2009-11-24 | Restoration Robotics, Inc. | Tool assembly for harvesting and implanting follicular units |
US10327850B2 (en) | 2005-09-30 | 2019-06-25 | Restoration Robotics, Inc. | Automated system and method for harvesting or implanting follicular units |
CN101277657B (en) * | 2005-09-30 | 2010-12-01 | 修复型机器人公司 | Automated systems and methods for harvesting and implanting follicular units |
KR101155258B1 (en) * | 2005-09-30 | 2012-06-13 | 레스토레이션 로보틱스, 인코포레이티드 | Apparatus and methods for harvesting and implanting follicular units |
CN101926678B (en) * | 2005-09-30 | 2012-06-20 | 修复型机器人公司 | Automated systems and methods for harvesting and implanting follicular units |
US7962192B2 (en) | 2005-09-30 | 2011-06-14 | Restoration Robotics, Inc. | Systems and methods for aligning a tool with a desired location or object |
DE102005049439A1 (en) * | 2005-10-15 | 2007-04-26 | Bayerische Motoren Werke Ag | Control method for processing positions on a work-piece while operating a processing program uses a robot with a robot arm fitted with a tool positioned on the work-piece |
DE102006004703B4 (en) * | 2006-01-31 | 2016-08-04 | MedCom Gesellschaft für medizinische Bildverarbeitung mbH | Method and arrangement for operating a positioning robot |
DE102006004703A1 (en) * | 2006-01-31 | 2007-08-09 | MedCom Gesellschaft für medizinische Bildverarbeitung mbH | Method for operating a positioning robot especially in medical apparatus involves evaluating three dimensional image data and determining registration of coordinates to produce target area |
WO2008062285A3 (en) * | 2006-11-22 | 2008-08-14 | Health Robotics Srl | Method and machine for manipulating toxic substances |
CN101568307B (en) * | 2006-11-22 | 2011-06-15 | 健康机器人技术有限公司 | Method and machine for manipulating toxic substances |
US8404492B2 (en) | 2006-11-22 | 2013-03-26 | Health Robotics S.R.L. | Method and machine for manipulating toxic substances |
WO2008062285A2 (en) * | 2006-11-22 | 2008-05-29 | Health Robotics S.R.L. | Method and machine for manipulating toxic substances |
EP1932488A1 (en) * | 2006-12-12 | 2008-06-18 | Prosurgics Limited | Frame of reference registration system and method |
US7922688B2 (en) | 2007-01-08 | 2011-04-12 | Restoration Robotics, Inc. | Automated delivery of a therapeutic or cosmetic substance to cutaneous, subcutaneous and intramuscular tissue regions |
WO2008104426A3 (en) * | 2007-02-28 | 2008-11-13 | Kuka Roboter Gmbh | Industrial robot, and methods for determining the position of an industrial robot relative to an object |
WO2008104426A2 (en) | 2007-02-28 | 2008-09-04 | Kuka Roboter Gmbh | Industrial robot, and methods for determining the position of an industrial robot relative to an object |
DE102007009851B3 (en) * | 2007-02-28 | 2008-05-29 | Kuka Roboter Gmbh | Industrial robot's position determining method, involves determining position of robot relative to object based on assigned model points, and determining position of camera assigned to image and position of camera relative to robot |
US8036448B2 (en) | 2007-04-05 | 2011-10-11 | Restoration Robotics, Inc. | Methods and devices for tattoo application and removal |
US8392022B2 (en) | 2007-11-19 | 2013-03-05 | Kuka Laboratories Gmbh | Device comprising a robot, medical work station, and method for registering an object |
WO2009065827A1 (en) * | 2007-11-19 | 2009-05-28 | Kuka Roboter Gmbh | Device comprising a robot, medical work station, and method for registering an object |
US9405971B2 (en) | 2008-09-29 | 2016-08-02 | Restoration Robotics, Inc. | Object-Tracking systems and methods |
US8811660B2 (en) | 2008-09-29 | 2014-08-19 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US8848974B2 (en) | 2008-09-29 | 2014-09-30 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US9589368B2 (en) | 2008-09-29 | 2017-03-07 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US9227321B2 (en) | 2009-12-17 | 2016-01-05 | Kuka Roboter Gmbh | Method and device for controlling a manipulator |
DE102009058607A1 (en) * | 2009-12-17 | 2011-06-22 | KUKA Laboratories GmbH, 86165 | Method and device for controlling a manipulator |
US10076841B2 (en) | 2009-12-17 | 2018-09-18 | Kuka Deutschland Gmbh | Method and device for controlling a manipulator |
US11896326B2 (en) | 2014-10-27 | 2024-02-13 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table |
US11737842B2 (en) | 2014-10-27 | 2023-08-29 | Intuitive Surgical Operations, Inc. | System and method for monitoring control points during reactive motion |
US11576737B2 (en) | 2014-10-27 | 2023-02-14 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table |
US11419687B2 (en) | 2014-10-27 | 2022-08-23 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table motion |
US11672618B2 (en) | 2014-10-27 | 2023-06-13 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table motion |
US11806875B2 (en) | 2014-10-27 | 2023-11-07 | Intuitive Surgical Operations, Inc. | Disturbance compensation in computer-assisted devices |
US11759265B2 (en) | 2014-10-27 | 2023-09-19 | Intuitive Surgical Operations, Inc. | System and method for registering to a table |
US11413103B2 (en) | 2014-10-27 | 2022-08-16 | Intuitive Surgical Operations, Inc. | System and method for monitoring control points during reactive motion |
US11684448B2 (en) | 2014-10-27 | 2023-06-27 | Intuitive Surgical Operations, Inc. | Device with active brake release control |
EP3912610A1 (en) * | 2014-10-27 | 2021-11-24 | Intuitive Surgical Operations, Inc. | System for registering to a surgical table |
WO2018019550A1 (en) * | 2016-07-26 | 2018-02-01 | Siemens Aktiengesellschaft | Method for controlling an end element of a machine tool, and machine tool |
US11498219B2 (en) | 2016-07-26 | 2022-11-15 | Siemens Aktiengesellschaft | Method for controlling an end element of a machine tool, and a machine tool |
AT519176B1 (en) * | 2016-10-14 | 2019-02-15 | Engel Austria Gmbh | robot system |
DE102017123877B4 (en) | 2016-10-14 | 2019-09-19 | Engel Austria Gmbh | robot system |
AT519176A1 (en) * | 2016-10-14 | 2018-04-15 | Engel Austria Gmbh | robot system |
US10933527B2 (en) | 2019-01-25 | 2021-03-02 | Mujin, Inc. | Robotic system with enhanced scanning mechanism |
US11638993B2 (en) | 2019-01-25 | 2023-05-02 | Mujin, Inc. | Robotic system with enhanced scanning mechanism |
US11413753B2 (en) | 2019-01-25 | 2022-08-16 | Mujin, Inc. | Robotic system control method and controller |
DE102019130046B4 (en) * | 2019-01-25 | 2021-01-14 | Mujin, Inc. | Robot system with improved scanning mechanism |
US10870204B2 (en) | 2019-01-25 | 2020-12-22 | Mujin, Inc. | Robotic system control method and controller |
US11772267B2 (en) | 2019-01-25 | 2023-10-03 | Mujin, Inc. | Robotic system control method and controller |
DE102019130046A1 (en) * | 2019-01-25 | 2020-07-30 | Mujin, Inc. | A ROBOT SYSTEM WITH IMPROVED SCAN MECHANISM |
DE102019126903B3 (en) * | 2019-10-07 | 2020-09-24 | Fachhochschule Bielefeld | Method and robot system for entering a work area |
DE102022200461A1 (en) | 2022-01-17 | 2023-07-20 | Volkswagen Aktiengesellschaft | Method and robot system for machining a workpiece and coordinate system markers for a robot system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE10249786A1 (en) | Referencing method for relating robot to workpiece, in medical applications, by relating reference point using x and y position data obtained from at least two images picked up using camera mounted on robot arm | |
EP2449997B1 (en) | Medical workstation | |
EP2575662B1 (en) | Method for moving an instrument arm of a laparoscopy robot into a predeterminable relative position with respect to a trocar | |
EP2082687B1 (en) | Overlaid presentation of exposures | |
EP0799434B1 (en) | Microscope, in particular a stereomicroscope, and a method of superimposing two images | |
DE69535523T2 (en) | METHOD FOR TELE-MANIPULATION AND TELE-PRESENCE | |
EP2211751B1 (en) | Device comprising a robot, medical work station, and method for registering an object | |
DE69913106T2 (en) | METHOD AND DEVICE FOR ROBOT ALIGNMENT | |
DE3717871C3 (en) | Method and device for reproducible visual representation of a surgical intervention | |
DE4417944A1 (en) | Process for correlating different coordinate systems in computer-assisted, stereotactic surgery | |
DE19961971A1 (en) | Process for the safe automatic tracking of an endoscope and tracking of a surgical instrument with an endoscope guidance system (EFS) for minimally invasive surgery | |
EP1521211A2 (en) | Method and apparatus for determining the position and orientation of an image receiving device | |
DE102004004451A1 (en) | Method and device for medical imaging, wherein an object of an x-ray is reoriented | |
DE102007055204A1 (en) | Robot, medical workstation, and method of projecting an image onto the surface of an object | |
EP2012208A2 (en) | Programmable hand tool | |
DE102018125592A1 (en) | Control arrangement, method for controlling a movement of a robot arm and treatment device with control arrangement | |
DE10320862B4 (en) | Method for automatically adjusting a diaphragm, as well as X-ray system | |
DE102015104587B4 (en) | Method for calibrating a robot on a workspace and system for performing the method | |
DE102014110570A1 (en) | An imaging apparatus and method combining functional imaging and ultrasound imaging | |
DE102019214302B4 (en) | Method for registering an X-ray image data set with a navigation system, computer program product and system | |
DE102006055133B4 (en) | Method for positioning an X-ray recording device relative to an examination center | |
AT521076B1 (en) | Stereo microscope for use in microsurgical interventions on patients and methods for controlling the stereo microscope | |
EP1537830B1 (en) | Method and device for viewing an object with a microscope | |
DE102022200821B9 (en) | Method for calibrating a stereoscopic medical microscope and medical microscope assembly | |
DE102021134553A1 (en) | Robotic registration procedure and surgical navigation system |
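The patent's title describes relating a reference point to the robot using x and y position data measured in at least two images taken by an arm-mounted camera. As a minimal illustrative sketch (not the patent's actual algorithm), assuming an ideal pinhole camera and a known pure sideways translation of the camera between the two exposures, the reference point can be triangulated from the parallax; all names and numeric values below are illustrative:

```python
def triangulate(u1, v1, u2, f, b):
    """Recover a 3D reference point from two images via stereo parallax.

    u1, v1 : pixel offsets of the reference point in image 1,
             measured from the optical axis
    u2     : horizontal pixel offset of the same point in image 2
    f      : focal length in pixels (assumed known from calibration)
    b      : baseline, i.e. the camera translation between the two
             shots, known from the robot's joint encoders
    Returns (x, y, z) in the same length units as b.
    """
    disparity = u1 - u2
    if disparity == 0:
        raise ValueError("no parallax: point at infinity or identical images")
    z = f * b / disparity   # depth along the optical axis
    x = u1 * z / f          # lateral position of the reference point
    y = v1 * z / f          # vertical position of the reference point
    return x, y, z

# Example: f = 1000 px, baseline 0.5 m, disparity 50 px -> depth 10 m
print(triangulate(100.0, 50.0, 50.0, 1000.0, 0.5))  # → (1.0, 0.5, 10.0)
```

Because the camera pose at each exposure is known from the robot kinematics, the triangulated point lands directly in the robot's coordinate system, which is the essence of referencing the robot to the workpiece without an external measuring system.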
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| OM8 | Search report available as to paragraph 43 lit. 1 sentence 1 patent law | |
| 8130 | Withdrawal | |