
Publication number: US 20050204438 A1
Publication type: Application
Application number: US 10/962,829
Publication date: Sep. 15, 2005
Filing date: Oct. 11, 2004
Priority date: Feb. 26, 2004
Also published as: US20100115418, US20110301759
Inventors: Yulun Wang, Charles Jordan, Jonathan Southard, Marco Pinter
Original assignee: Yulun Wang, Charles S. Jordan, Jonathan Southard, Marco Pinter
External links: USPTO, USPTO Assignment, Espacenet
Graphical interface for a remote presence system
US 20050204438 A1
Abstract
A robot system that includes a robot and a remote station. The remote station may be a personal computer coupled to the robot through a broadband network. A user at the remote station may receive both video and audio from a camera and microphone of the robot, respectively. The remote station may include a display user interface that has a variety of viewable fields and selectable buttons.
Claims (100)
1. A robot system, comprising:
a mobile robot that has a camera that captures an image; and, a remote station that is coupled to said robot, said remote station includes a display user interface that displays the image and provides a first graphical input that can be selected to vary the image.
2. The system of claim 1, wherein said mobile robot can be controlled from said remote station.
3. The system of claim 1, wherein said first graphical input can be selected to view a still picture image.
4. The system of claim 1, wherein said first graphical input is a slide bar that can be selected to view a plurality of still picture images.
5. The system of claim 1, wherein a still picture image provided by said mobile robot camera can be stored at said remote station by selecting said first graphical input.
6. The system of claim 1, wherein selecting said first graphical input can initiate a storage of a video segment of said image.
7. The system of claim 1, wherein said mobile robot includes a battery and said display user interface depicts how much energy is left in said battery.
8. The system of claim 1, further comprising a base station that is coupled to said remote station and wirelessly coupled to said mobile robot, said display user interface depicts a signal strength of a signal transmitted between said base station and said mobile robot.
9. The system of claim 1, further comprising a second graphical input that can be selected to vary a characteristic of the image.
10. The system of claim 1, wherein said mobile robot includes a microphone and said remote station includes a speaker, said display user interface including a second graphical input that can be selected to vary a characteristic of sound provided by said microphone.
11. A method for operating a robot system, comprising:
moving a mobile robot that has a camera that captures an image;
transmitting the image to a remote station; and,
displaying a display user interface at the remote station, the display user interface displays an image and provides a first graphical input that can be selected to vary the image.
12. The method of claim 11, wherein movement of the mobile robot is controlled by the remote station.
13. The method of claim 11, further comprising selecting the first graphical input to display a still picture image.
14. The method of claim 11, wherein the first graphical input is a slide bar, and further comprising selecting said slide bar to view a plurality of still picture images.
15. The method of claim 11, further comprising selecting the first graphical input to store a still picture image provided by the mobile robot camera.
16. The method of claim 11, further comprising selecting said first graphical input to initiate a storage of a video segment of said image.
17. The method of claim 11, wherein the mobile robot includes a battery and the display user interface depicts how much energy is left in the battery.
18. The method of claim 11, further comprising transmitting a signal between the mobile robot and a base station that is coupled to the remote station and wirelessly coupled to said mobile robot, and displaying a signal strength of a signal on the display user interface.
19. The method of claim 11, further comprising selecting a second graphical input displayed by the display user interface to vary a characteristic of the image.
20. The method of claim 11, wherein the mobile robot includes a microphone and the remote station includes a speaker, and further comprising selecting a second graphical input to vary a characteristic of sound provided by said microphone.
21. A robot system, comprising:
a mobile robot;
a base station that transmits a wireless signal to said mobile robot; and,
a remote station that is coupled to said mobile robot through said base station, said remote station includes a display user interface that displays a signal strength of the wireless signal.
22. The system of claim 21, wherein said display user interface displays an image and a first graphical input that can be selected to vary the image.
23. The system of claim 22, wherein the image is provided by a camera of said mobile robot.
24. The system of claim 22, wherein said first graphical input can be selected to view a still picture image.
25. The system of claim 22, wherein said first graphical input is a slide bar that can be selected to view a plurality of still picture images.
26. The system of claim 22, wherein a still picture image provided by said mobile robot camera can be stored at said remote station by selecting said first graphical input.
27. The system of claim 21, wherein selecting said first graphical input can initiate a storage of a video segment of said image.
28. The system of claim 21, wherein said mobile robot includes a battery and said display user interface depicts how much energy is left in said battery.
29. The system of claim 22, wherein said display user interface includes a second graphical input that can be selected to vary a characteristic of the image.
30. The system of claim 21, wherein said mobile robot includes a microphone and said remote station includes a speaker, said display user interface includes a graphical input that can be selected to vary a characteristic of sound provided by said microphone.
31. A method for operating a robot system, comprising:
transmitting a wireless signal from a base station to a mobile robot;
moving the mobile robot; and,
displaying a display user interface at a remote station that is coupled to the base station, the display user interface displays a signal strength of the wireless signal.
32. The method of claim 31, wherein the display user interface displays an image and a first graphical input that can be selected to vary the image.
33. The method of claim 32, further comprising selecting the first graphical input to display a still picture image.
34. The method of claim 32, wherein the first graphical input is a slide bar and further comprising selecting the slide bar to view a plurality of still picture images.
35. The method of claim 32, further comprising selecting the first graphical input to store a still picture image provided by the mobile robot camera.
36. The method of claim 31, further comprising selecting said first graphical input to initiate a storage of a video segment of the image.
37. The method of claim 31, wherein movement of the mobile robot is controlled by the remote station.
38. The method of claim 31, wherein the mobile robot includes a battery and the display user interface depicts an amount of energy within the battery.
39. The method of claim 32, further comprising selecting a second graphical input displayed by the display user interface to vary a characteristic of the image.
40. The method of claim 31, wherein the mobile robot includes a microphone and the remote station includes a speaker, and further comprising selecting a graphical input to vary a characteristic of sound provided by said microphone.
41. A robot system, comprising:
a mobile robot that contains a battery; and,
a remote station that is coupled to said mobile robot, said remote station includes a display user interface that displays an amount of energy within said battery.
42. The system of claim 41, wherein said display user interface displays an image and a first graphical input that can be selected to vary the image.
43. The system of claim 42, wherein said image is provided by a camera of said mobile robot.
44. The system of claim 42, wherein said first graphical input can be selected to view a still picture image.
45. The system of claim 42, wherein said first graphical input is a slide bar that can be selected to view a plurality of still picture images.
46. The system of claim 42, wherein a still picture image provided by said mobile robot camera can be stored at said remote station by selecting said first graphical input.
47. The system of claim 41, wherein selecting said first graphical input can initiate a storage of a video segment of said image.
48. The system of claim 42, wherein said display user interface includes a second graphical input that can be selected to vary a characteristic of the image.
49. The system of claim 41, wherein said mobile robot includes a microphone and said remote station includes a speaker, said display user interface including a graphical input that can be selected to vary a characteristic of sound provided by said microphone.
50. A method for operating a robot system, comprising:
moving a mobile robot that has a battery; and,
presenting a display user interface at a remote station, the display user interface displays an amount of energy within the battery.
51. The method of claim 50, wherein the display user interface displays an image and a first graphical input that can be selected to vary the image.
52. The method of claim 51, further comprising selecting the first graphical input to display a still picture image.
53. The method of claim 51, wherein the first graphical input is a slide bar and further comprising selecting the slide bar to view a plurality of still picture images.
54. The method of claim 50, further comprising selecting the first graphical input to store a still picture image provided by the mobile robot camera.
55. The method of claim 51, further comprising selecting said first graphical input to initiate a storage of a video segment of said image.
56. The method of claim 51, wherein movement of the mobile robot is controlled by the remote station.
57. The method of claim 51, further comprising selecting a graphical input displayed by the display user interface to vary a characteristic of the image.
58. The method of claim 51, wherein the mobile robot includes a microphone and the remote station includes a speaker, and further comprising selecting a second graphical input to vary a characteristic of sound provided by said microphone.
59. A robot system, comprising:
a mobile robot with a sensor that provides an indication of when said robot is in proximity to an object; and,
a remote station that is coupled to said mobile robot, said remote station includes a display user interface that displays a graphical representation of when said sensor provides said indication.
60. The system of claim 59, wherein said display user interface displays an image provided by said mobile robot.
61. The system of claim 59, wherein said display user interface displays a graphical feature that indicates a direction of movement of said mobile robot.
62. The system of claim 59, wherein said display user interface displays a graphical input that can be selected to transmit an image to said mobile robot.
63. The system of claim 60, wherein a user can annotate the image.
64. The system of claim 59, wherein said mobile robot includes a screen that displays an image and a user can annotate the image.
65. The system of claim 59, wherein said display user interface includes a graphical input that can be selected to select a mobile robot from a plurality of mobile robots.
66. A method for operating a robot system, comprising:
moving a mobile robot with a sensor that provides an indication of when the mobile robot is in proximity to an object; and,
presenting a display user interface at a remote station, the display user interface displays a graphical representation of when the sensor provides the indication.
67. The method of claim 66, wherein the display user interface displays an image.
68. The method of claim 66, further comprising displaying a graphical feature that indicates a direction of movement of the mobile robot.
69. The method of claim 66, further comprising selecting a graphical input to transmit an image to the mobile robot.
70. The method of claim 67, further comprising annotating the image.
71. The method of claim 66, further comprising selecting a graphical input to select a mobile robot from a plurality of mobile robots.
72. A robot system, comprising:
a mobile robot; and,
a remote station that is coupled to said mobile robot, said remote station includes a display user interface that displays a graphical feature that indicates a direction of movement of said mobile robot.
73. The system of claim 72, wherein said display user interface displays an image provided by said mobile robot.
74. The system of claim 72, wherein said display user interface displays a graphical input that can be selected to transmit an image to said mobile robot.
75. The system of claim 73, wherein a user can annotate the image.
76. The system of claim 72, wherein said mobile robot includes a screen that displays an image and a user can annotate the image.
77. The system of claim 72, wherein said display user interface includes a graphical input that can be selected to select a mobile robot from a plurality of mobile robots.
78. A method for operating a robot system, comprising:
moving a mobile robot; and,
presenting a display user interface at a remote station, the display user interface displays a graphical feature that indicates a direction of movement of the mobile robot.
79. The method of claim 78, wherein the display user interface displays an image.
80. The method of claim 78, further comprising selecting a graphical input to transmit an image to the mobile robot.
81. The method of claim 79, further comprising annotating the image.
82. The method of claim 78, further comprising selecting a graphical input to select a mobile robot from a plurality of mobile robots.
83. A robot system, comprising:
a mobile robot that has a screen; and,
a remote station that is coupled to said mobile robot, said remote station includes a display user interface that displays a graphical input that can be selected to transmit an image to said mobile robot.
84. The system of claim 83, wherein said display user interface displays an image provided by said mobile robot.
85. The system of claim 84, wherein a user can annotate the image.
86. The system of claim 83, wherein said robot screen displays the image and a user can annotate the image.
87. The system of claim 82, wherein said display user interface includes a graphical input that can be selected to select a mobile robot from a plurality of mobile robots.
88. A method for operating a robot system, comprising:
moving a mobile robot that has a screen; and,
presenting a display user interface at a remote station, the display user interface displays a graphical input that can be selected to transmit an image to the mobile robot.
89. The method of claim 88, wherein the display user interface displays an image.
90. The method of claim 88, further comprising annotating the image.
91. The method of claim 88, wherein the image is displayed by a robot screen and a user can annotate the image.
92. The method of claim 88, further comprising selecting a graphical input to select a robot from a plurality of robots.
93. A robot system, comprising:
a plurality of mobile robots; and,
a remote station that is coupled to at least one of said mobile robots, said remote station includes a display user interface that displays a graphical input that can be selected to select one of said mobile robots.
94. The system of claim 93, wherein said display user interface displays an image provided by said mobile robot.
95. The system of claim 94, wherein a user can annotate the image.
96. The system of claim 93, wherein said mobile robot includes a screen that displays an image and a user can annotate the image.
97. A method for operating a robot system, comprising:
providing a plurality of mobile robots; and,
presenting a display user interface at a remote station, the display user interface displays a graphical input that can be selected to select one of the mobile robots.
98. The method of claim 97, wherein the display user interface displays an image.
99. The method of claim 97, further comprising annotating the image.
100. The method of claim 97, wherein the image is displayed by a robot screen and a user can annotate the image.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to Application No. 60/548,561 filed on Feb. 26, 2004.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The subject matter disclosed generally relates to the field of mobile two-way teleconferencing.

2. Background Information

There is a growing need to provide remote health care to patients that have a variety of ailments ranging from Alzheimer's to stress disorders. To minimize costs it is desirable to provide home care for such patients. Home care typically requires a periodic visit by a health care provider such as a nurse or some type of assistant. Due to financial and/or staffing issues the health care provider may not be there when the patient needs some type of assistance. Additionally, existing staff must be continuously trained, which can create a burden on training personnel. It would be desirable to provide a system that would allow a health care provider to remotely care for a patient without being physically present.

Robots have been used in a variety of applications ranging from remote control of hazardous material to assisting in the performance of surgery. For example, U.S. Pat. No. 5,762,458 issued to Wang et al. discloses a system that allows a surgeon to perform minimally invasive medical procedures through the use of robotically controlled instruments. One of the robotic arms in the Wang system moves an endoscope that has a camera. The camera allows a surgeon to view a surgical area of a patient.

Tele-robots such as hazardous waste handlers and bomb detectors may contain a camera that allows the operator to view the remote site. Canadian Pat. No. 2289697 issued to Treviranus, et al. discloses a teleconferencing platform that has both a camera and a monitor. The platform includes mechanisms to both pivot and raise the camera and monitor. The Treviranus patent also discloses embodiments with a mobile platform, and different mechanisms to move the camera and the monitor.

There has been marketed a mobile robot introduced by InTouch-Health, Inc., the assignee of this application, under the trademarks COMPANION and RP-6. The InTouch robot is controlled by a user at a remote station. The remote station may be a personal computer with a joystick that allows the user to remotely control the movement of the robot. Both the robot and remote station have cameras, monitors, speakers and microphones to allow for two-way video/audio communication.

U.S. patent application Pub. No. US 2001/0054071 filed in the name of Loeb, discloses a video-conferencing system that includes a number of graphical user interfaces (“GUIs”) that can be used to establish a video-conference. One of the GUIs has an icon that can be selected to make a call. The Loeb application discloses stationary video-conferencing equipment such as a television. There is no discussion in Loeb about the use of robotics.

BRIEF SUMMARY OF THE INVENTION

A robot system that includes a remote station and a robot. The remote station includes a display user interface that can be used to operate the system.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an illustration of a robotic system;

FIG. 2 is a schematic of an electrical system of a robot;

FIG. 3 is a further schematic of the electrical system of the robot;

FIG. 4 is a display user interface of a remote station;

FIG. 5 is a display user interface showing an electronic medical record;

FIG. 6 is a display user interface showing an image and an electronic medical record being simultaneously displayed.

The robot 12 may have a direct connection to the network through, for example, a satellite.

The remote control station 16 may include a computer 22 that has a monitor 24, a camera 26, a microphone 28 and a speaker 30. The computer 22 may also contain an input device 32 such as a joystick or a mouse. The control station 16 is typically located in a place that is remote from the robot 12. Although only one remote control station 16 is shown, the system 10 may include a plurality of remote stations. In general any number of robots 12 may be controlled by any number of remote stations 16 or other robots 12. For example, one remote station 16 may be coupled to a plurality of robots 12, or one robot 12 may be coupled to a plurality of remote stations 16, or a plurality of robots 12.
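
The many-to-many coupling described above (any number of robots 12 controlled by any number of remote stations 16) can be sketched as a simple session registry. This is an illustrative sketch only; the class and method names (`SessionRegistry`, `connect`, `robots_for`) are assumptions, not part of the disclosed system.

```python
class SessionRegistry:
    """Tracks which remote stations are coupled to which robots.

    Illustrative sketch: one station may be coupled to many robots,
    and one robot may be coupled to many stations, as the patent
    describes for system 10.
    """

    def __init__(self):
        self._links = set()  # (station_id, robot_id) pairs

    def connect(self, station_id, robot_id):
        self._links.add((station_id, robot_id))

    def disconnect(self, station_id, robot_id):
        self._links.discard((station_id, robot_id))

    def robots_for(self, station_id):
        return sorted(r for s, r in self._links if s == station_id)

    def stations_for(self, robot_id):
        return sorted(s for s, r in self._links if r == robot_id)
```

For example, coupling one station to two robots and a second station to the first robot leaves both mappings queryable independently.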

Each robot 12 includes a movement platform 34 that is attached to a robot housing 36. Also attached to the robot housing 36 are a camera 38, a monitor 40, a microphone(s) 42 and a speaker(s) 44. The microphone 42 and speaker 30 may create a stereophonic sound. The robot 12 may also have an antenna 46 that is wirelessly coupled to an antenna 48 of the base station 14. The system 10 allows a user at the remote control station 16 to move the robot 12 through operation of the input device 32. The robot camera 38 is coupled to the remote monitor 24 so that a user at the remote station 16 can view a patient. Likewise, the robot monitor 40 is coupled to the remote camera 26 so that the patient may view the user. The microphones 28 and 42, and speakers 30 and 44, allow for audible communication between the patient and the user.

The remote station computer 22 may operate Microsoft OS software and WINDOWS XP or other operating systems such as LINUX. The remote computer 22 may also operate a video driver, a camera driver, an audio driver and a joystick driver. The video images may be transmitted and received with compression software such as MPEG CODEC.

The robot 12 may be coupled to one or more medical monitoring devices 50. The medical monitoring device 50 can take medical data from a patient. By way of example, the medical monitoring device 50 may be a stethoscope, a pulse oximeter and/or an EKG monitor. The medical monitoring device 50 may contain a wireless transmitter 52 that transmits the patient data to the robot 12. The wirelessly transmitted data may be received by antenna 46, or a separate antenna (not shown). The robot 12 can then transmit the data to the remote station 16.
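
The relay path just described (device 50 transmits to the robot 12, which forwards to the remote station 16) can be sketched as follows. The classes, the outbox queue, and the message fields are hypothetical illustrations of the data flow, not part of the disclosure.

```python
class Robot:
    """Receives readings from a monitoring device and queues them
    for the remote station, per the relay the patent describes."""

    def __init__(self):
        self.outbox = []  # messages awaiting transmission

    def receive_device_data(self, device_name, reading):
        # Forward each reading to the remote station unchanged.
        self.outbox.append({"device": device_name, "reading": reading})


class RemoteStation:
    """Drains the robot's outbox into a local monitor-data log."""

    def __init__(self):
        self.monitor_data = []

    def poll(self, robot):
        while robot.outbox:
            self.monitor_data.append(robot.outbox.pop(0))
```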

The wireless transmission from the medical monitoring device 50 may be in accordance with various wireless standards such as those of the IEEE. The standard used to transmit data from the medical monitoring device 50 should not interfere with the wireless communication between the robot 12 and the base station 14. Although wireless transmission is shown and described, it is to be understood that the medical monitoring device 50 can be coupled to the robot 12 by wires (not shown).

The remote station 16 may be coupled to a server 54 through the network 18. The server 54 may contain electronic medical records of a patient. By way of example, the electronic medical records may include written records of treatment, patient history, medication information, x-rays, EKGs, laboratory results, physician notes, etc. The medical records can be retrieved from the server 54 and displayed by the monitor 24 of the remote station. In lieu of, or in addition to, storage on the server 54, the medical records can be stored in the mobile robot 12. The remote station 16 may allow the physician to modify the records and then store the modified records back in the server 54 and/or robot 12.
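
The record round trip described above (retrieve from the server, modify at the remote station, store back to the server and/or the robot) can be sketched as below. The dict-based "server", the record fields, and the function names are assumptions for illustration.

```python
# Hypothetical in-memory stand-ins for server 54 and robot 12 storage.
server_records = {"patient-1": {"history": "stable", "medication": "none"}}
robot_records = {}


def retrieve(patient_id):
    """Fetch a record from the server for local editing (returns a copy)."""
    return dict(server_records[patient_id])


def store(patient_id, record, to_server=True, to_robot=False):
    """Write the modified record back to the server and/or the robot."""
    if to_server:
        server_records[patient_id] = record
    if to_robot:
        robot_records[patient_id] = record


# A physician edits the record at the remote station, then stores it back.
rec = retrieve("patient-1")
rec["medication"] = "aspirin"
store("patient-1", rec, to_server=True, to_robot=True)
```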

FIGS. 2 and 3 show an embodiment of a robot 12. Each robot 12 may include a high level control system 60 and a low level control system 62. The high level control system 60 may include a processor 64 that is connected to a bus 66. The bus is coupled to the camera 38 by an input/output (I/O) port 68, and to the monitor 40 by a serial output port 70 and a VGA driver 72. The monitor 40 may include a touchscreen function that allows the patient to enter input by touching the monitor screen.

The speaker 44 is coupled to the bus 66 by a digital to analog converter 74. The microphone 42 is coupled to the bus 66 by an analog to digital converter 76. The high level controller 60 may also contain a random access memory (RAM) device 78, a non-volatile RAM device 80 and a mass storage device 82 that are all coupled to the bus 66. The mass storage device 82 may contain medical files of the patient that can be accessed by the user at the remote control station 16. For example, the mass storage device 82 may contain a picture of the patient. The user, particularly a health care provider, can recall the old picture and make a side by side comparison on the monitor 24 with a present video image of the patient provided by the camera 38. The robot antenna 46 may be coupled to a wireless transceiver 84. By way of example, the transceiver 84 may transmit and receive information in accordance with IEEE 802.11b. The transceiver 84 may also process signals from the medical monitoring device in accordance with the IEEE standard also known as Bluetooth. The robot may have a separate antenna to receive the wireless signals from the medical monitoring device.

The controller 64 may operate with a LINUX OS operating system. The controller 64 may also operate MS WINDOWS along with video, camera and audio drivers for communication with the remote control station 16. Video information may be transceived using MPEG CODEC compression techniques. The software may allow the user to send e-mail to the patient and vice versa, or allow the patient to access the Internet. In general the high level controller 60 operates to control communication between the robot 12 and the remote control station 16.

The high level controller 60 may be linked to the low level controller 62 by serial ports 86 and 88. The low level controller 62 includes a processor 90 that is coupled to a RAM device 92 and non-volatile RAM device 94 by a bus 96. Each robot 12 contains a plurality of motors 98 and motor encoders 100. The motors 98 can activate the movement platform and move other parts of the robot such as the monitor and camera. The encoders 100 provide feedback information regarding the output of the motors 98. The motors 98 can be coupled to the bus 96 by a digital to analog converter 102 and a driver amplifier 104. The encoders 100 can be coupled to the bus 96 by a decoder 106. Each robot 12 also has a number of proximity sensors 108 (see also FIG. 1). The proximity sensors 108 can be coupled to the bus 96 by a signal conditioning circuit 110 and an analog to digital converter 112.
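
The command/feedback pairing of motors 98 and encoders 100 can be sketched as a minimal closed loop: a commanded velocity goes out through the DAC and driver amplifier, and the encoder reading comes back for correction. The proportional control law, gain, and function names are invented for illustration; the patent does not specify a control scheme.

```python
def drive_motor(target_velocity, read_encoder, write_dac, gain=0.5, steps=50):
    """Minimal proportional loop: nudge the DAC command until the
    encoder-measured velocity matches the target. `read_encoder` and
    `write_dac` stand in for the decoder 106 and DAC 102 paths."""
    command = 0.0
    for _ in range(steps):
        measured = read_encoder(command)       # encoder feedback
        command += gain * (target_velocity - measured)
        write_dac(command)                     # send corrected command
    return command
```

With an idealized motor whose measured velocity equals the last command, the loop converges geometrically toward the target.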

The low level controller 62 runs software routines that mechanically actuate the robot 12. For example, the low level controller 62 provides instructions to actuate the movement platform to move the robot 12. The low level controller 62 may receive movement instructions from the high level controller 60. The movement instructions may be received as movement commands from the remote control station or another robot. Although two controllers are shown, it is to be understood that each robot 12 may have one controller, or more than two controllers, controlling the high and low level functions.

The various electrical devices of each robot 12 may be powered by a battery(ies) 114. The battery 114 may be recharged by a battery recharger station 116. The low level controller 62 may include a battery control circuit 118 that senses the power level of the battery 114. The low level controller 62 can sense when the power falls below a threshold and then send a message to the high level controller 60.
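
The threshold check performed by the battery control circuit 118 can be sketched as below. The message format and threshold value are assumptions; the patent only states that a message is sent to the high level controller when power falls below a threshold.

```python
def check_battery(level, threshold=0.2):
    """Return a low-battery message for the high level controller,
    or None when the sensed level is at or above the threshold."""
    if level < threshold:
        return {"type": "battery_low", "level": level}
    return None
```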

The system may be the same or similar to a robotic system provided by the assignee InTouch-Health, Inc. of Santa Barbara, Calif. under the name RP-6, which is hereby incorporated by reference. The system may also be the same or similar to the system disclosed in application No. 10/206,457 published on Jan. 29, 2004, which is hereby incorporated by reference.

FIG. 4 shows a display user interface (“DUI”) 120 that can be displayed at the remote station 16 and/or the robot 12. The DUI 120 may include a robot view field 122 that displays a video image captured by the camera of the robot. The DUI 120 may also include a station view field 124 that displays a video image provided by the camera of the remote station 16. The DUI 120 may be part of an application program stored and operated by the computer 22 of the remote station 16.

The DUI 120 may include a graphic button 126 that can be selected to display an electronic medical record as shown in FIG. 5. The button 126 can be toggled to sequentially view the video image and the electronic medical record. Alternatively, the view field 122 may be split to simultaneously display both the video image and the electronic medical record as shown in FIG. 6. The viewing field may allow the physician to modify the medical record by adding, changing or deleting all or part of the record. The remote clinician can also add to the medical record still images or video captured by the camera of the robot.
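
The view toggling driven by button 126 can be sketched as a small state machine cycling between the video image, the electronic medical record, and the split view of FIG. 6. The state names and cycle order are assumptions for illustration.

```python
# Hypothetical view states for field 122: live video, the electronic
# medical record (FIG. 5), or both at once (FIG. 6).
VIEW_STATES = ["video", "medical_record", "split"]


def toggle_view(current):
    """Advance to the next view state when button 126 is selected,
    wrapping back to video after the split view."""
    i = VIEW_STATES.index(current)
    return VIEW_STATES[(i + 1) % len(VIEW_STATES)]
```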

The DUI 120 may have a monitor data field 128 that can display the data generated by the medical monitoring device(s) and transmitted to the remote station. The data can be added to the electronic medical record, either automatically or through user input. For example, the data can be added to a record by “dragging” a monitor data field 128 into the viewing field 122.
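
The drag-and-drop interaction described above (dropping a monitor data field 128 into the viewing field 122 to add its readings to the record) can be sketched as below. The record structure and function name are hypothetical.

```python
def drop_monitor_data(record, field):
    """Append a monitor data field's readings to the electronic
    medical record, creating the list on first use."""
    record.setdefault("monitor_data", []).append(field)
    return record
```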

The DUI 120 may include alert input icons 130 and 132. Alert icon 130 can be selected by the user at the remote station to generate an alert indicator such as a sound from the speaker of the robot. Selection of the icon generates an alert input to the robot. The robot generates a sound through its speaker in response to the alert input. By way of example, the sound may simulate the noise of a horn. Consequently, the icon may have the appearance of a horn. The remote station user may select the horn shaped icon 130 while remotely moving the robot to alert persons to the presence of the moving robot.

Alert icon 132 can be selected to request access to the video images from the robot. The default state of the robot may be to not send video information to the remote station. Selecting the alert icon 132 sends an alert input such as an access request to the robot. The robot then generates an alert indicator. The alert indicator can be a sound generated by the robot speaker, and/or a visual prompt on the robot monitor. By way of example, the visual prompt may be a “flashing” graphical icon. The sound may simulate the knocking of a door. Consequently, the alert icon 132 may have the appearance of a door knocker.
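
The two alert inputs above can be sketched as a dispatch from the remote-station icon to the robot's alert indicator: the horn icon 130 produces a horn sound, while the door-knocker icon 132 produces a knocking sound and a flashing visual prompt. The dispatch table and indicator names are illustrative.

```python
def handle_alert(alert_input):
    """Map a remote-station alert input to the robot's alert
    indicator(s), per the behaviors of icons 130 and 132."""
    if alert_input == "horn":
        return {"sound": "horn"}
    if alert_input == "access_request":
        return {"sound": "door_knock", "visual": "flashing_icon"}
    raise ValueError("unknown alert input: %s" % alert_input)
```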

In response to the alert indicator the user may provide a user input such as the depression of a button on the robot, or the selection of a graphical image on the robot monitor, to allow access to the robot camera. The robot may also have a voice recognition system that allows the user to grant access with a voice command. The user input causes the robot to begin transmitting video images from the robot camera to the remote station that requested access to the robot. A voice communication may be established before the cycle of the alert input and response, to allow the user at the remote station to talk to the caller recipient at the robot.
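
The access-control cycle just described (video off by default, an access request raising an indicator, and a user input at the robot starting transmission) can be sketched as below. The class shape is a hypothetical illustration of the state transitions only.

```python
class RobotCamera:
    """Sketch of the default-deny video access cycle: transmission
    starts only after a user input at the robot grants a pending
    access request from a remote station."""

    def __init__(self):
        self.transmitting = False
        self.pending_request = None

    def request_access(self, station_id):
        # An alert indicator (sound/visual prompt) would fire here.
        self.pending_request = station_id

    def grant(self):
        """User input at the robot (button, touchscreen, or voice
        command) grants the pending request, if any."""
        if self.pending_request is not None:
            self.transmitting = True
            granted = self.pending_request
            self.pending_request = None
            return granted
        return None
```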

The DUI 120 may include a graphical “battery meter” 134 that indicates the amount of energy left in the robot battery. A graphical “signal strength meter” 136 may indicate the strength of the wireless signal transmitted between the robot and the base station (see FIG. 1).

The DUI 120 may include a location display 138 that provides the location of the robot. In a system that has multiple robots, the CHANGE button 140 can be selected to switch from the default robot to a different robot and control it in a new session. The user can initiate and terminate a session by selecting box 142, which changes from CONNECT to DISCONNECT when the user selects it to initiate a session. System settings and support can be selected through buttons 144 and 146.

Both the robot view field 122 and the station view field 124 may have associated graphics to vary the video and audio displays. Each field may have an associated graphical audio slide bar 148 to vary the audio level of the microphone and another slide bar 152 to vary the volume of the speakers.

The DUI 120 may have slide bars 150, 154 and 156 to vary the zoom, focus and brightness of the cameras, respectively. A still picture may be taken at either the robot or the remote station by selecting one of the graphical camera icons 158. The still picture may be the image presented in the corresponding field 122 or 124 at the time the camera icon 158 is selected. Video can be captured and played back through graphical icons 160. After taking a still picture, capturing video, or reviewing a slide show, the user can return to real-time video by selecting the graphical LIVE button 162.
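The still-capture behavior can be sketched as follows: selecting the camera icon freezes the frame currently shown in the field, and the LIVE button returns the field to the real-time feed. All names here are illustrative assumptions:

```python
class VideoViewField:
    """Sketch of a view field (122 or 124): the still picture is the
    image presented at the moment the camera icon is selected, and the
    LIVE button resumes real-time video."""

    def __init__(self):
        self.live = True
        self.current_frame = None   # most recent frame from the feed
        self.frozen_frame = None    # still picture, when one is taken

    def show_frame(self, frame):
        """Real-time feed delivers a new frame."""
        self.current_frame = frame

    def camera_icon_selected(self):
        """Take a still picture of the frame presented right now."""
        self.frozen_frame = self.current_frame
        self.live = False
        return self.frozen_frame

    def live_button_selected(self):
        """Return to real-time video."""
        self.frozen_frame = None
        self.live = True
```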

A still picture can be loaded from disk for viewing through selection of icon 164. Stored still images can be reviewed by selecting buttons 166. The number of the displayed image relative to the total number of images is shown by graphical boxes 168. The user can rapidly move through the still images in slide-show fashion, or through a captured video clip, by moving the slide bar 170. A captured video clip can be paused through the selection of circle 174; play can be resumed through the same button 174. Video or still images may be dismissed from the active list through button 172. Video or still images may be transferred to the robot by selecting icon 176. For example, a doctor at the remote station may transfer an x-ray to the screen of the robot.
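The browsing behavior above, stepping through stored images with buttons, jumping with the slide bar, and showing a "current of total" counter, can be sketched like this. Class and method names are illustrative assumptions:

```python
class StillImageBrowser:
    """Sketch of reviewing stored stills: buttons 166 step through the
    images, slide bar 170 jumps through them, and boxes 168 show the
    number of the displayed image relative to the total."""

    def __init__(self, images):
        self.images = list(images)
        self.index = 0

    def next(self):
        """Step forward, clamped at the last image."""
        self.index = min(self.index + 1, len(self.images) - 1)

    def previous(self):
        """Step backward, clamped at the first image."""
        self.index = max(self.index - 1, 0)

    def slide_to(self, fraction):
        """Slide bar: map a 0.0-1.0 position to an image index."""
        self.index = round(fraction * (len(self.images) - 1))

    def counter(self):
        """Text for the graphical boxes, e.g. '2 of 4'."""
        return f"{self.index + 1} of {len(self.images)}"
```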

A graphical depiction of the base of the robot can be shown in sensor field 178. The robot may have various sensors that sense contact with another object. The sensor field 178 can provide a visual display of the sensors that detect the object. By way of example, the field may have one or more graphical dots 180 that show where on the robot the sensors detected an object. This provides the user with a sense of the robot environment that is outside the view of the robot camera.
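The mapping from contact sensors to graphical dots can be sketched as below. The sensor names and coordinates are illustrative assumptions:

```python
# Sketch of the sensor field: contact sensors at known positions around
# the robot base are drawn as graphical dots only when they report
# contact with an object. Names and coordinates are assumptions.

BASE_SENSORS = {
    "front_left":  (-0.2,  0.3),
    "front_right": ( 0.2,  0.3),
    "rear":        ( 0.0, -0.3),
}

def dots_to_display(readings):
    """Return the (x, y) dot positions, in the base graphic's frame,
    for each sensor that detected contact."""
    return [BASE_SENSORS[name] for name, hit in readings.items() if hit]
```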

The graphical depiction of the robot base may contain a graphical vector overlay 182 that indicates the direction of robot movement. The direction of movement may be different than the direction the camera is facing. The vector can provide a visual aid when driving the robot.
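The vector overlay can be derived from the current drive command. The parameter conventions below (speed plus a heading measured from the base's forward axis) are assumptions for illustration:

```python
import math

def movement_vector(speed, heading_deg):
    """Sketch of the vector overlay 182: convert a drive command into
    the 2-D arrow drawn over the base graphic, where +y is the base's
    forward direction. Conventions are illustrative assumptions."""
    rad = math.radians(heading_deg)
    return (speed * math.sin(rad), speed * math.cos(rad))
```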

The system may provide the ability to annotate 184 the image displayed in field 122 and/or 124. For example, a doctor at the remote station may annotate some portion of the image captured by the robot camera. The annotated image may be stored by the system. The system may also allow for annotation of images sent to the robot through icon 176. For example, a doctor may send an x-ray to the robot which is displayed by the robot screen. The doctor can annotate the x-ray to point out a portion of the x-ray to personnel located at the robot site. This can assist in allowing the doctor to instruct personnel at the robot site.

The display user interface may include graphical inputs 186 that allow the operator to turn the views of the remote station and remote cameras on and off.

While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.

Classifications
U.S. Classification: 455/67.7, 901/1
International Classification: H04B17/00, B25J5/00
Cooperative Classification: B25J9/1689, B25J5/00, B25J11/009
European Classification: B25J11/00S2, B25J9/16T4, B25J5/00
Legal Events

Jun. 15, 2005 (Code: AS, Assignment)
Owner name: INTOUCH TECHNOLOGIES, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTOUCH HEALTH, INC.;REEL/FRAME:016686/0356
Effective date: 20050531

May 5, 2005 (Code: AS, Assignment)
Owner name: INTOUCH-HEALTH, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, YULAN;JORDAN, CHARLES S.;SOUTHARD, JONATHAN;AND OTHERS;REEL/FRAME:016529/0987;SIGNING DATES FROM 20050405 TO 20050414