US5276618A - Doorway transit navigational referencing system - Google Patents

Doorway transit navigational referencing system

Info

Publication number
US5276618A
US5276618A
Authority
United States
Prior art keywords
robot, imaginary vertical, doorway, vertical plane, sensor
Legal status
Expired - Fee Related
Application number
US07/846,486
Inventor
Hobart R. Everett, Jr.
Current Assignee
US Department of Navy
Original Assignee
US Department of Navy
Application filed by US Department of Navy
Priority to US07/846,486
Assigned to UNITED STATES OF AMERICA, AS REPRESENTED BY THE SECRETARY OF THE NAVY. Assignors: EVERETT, HOBART R., JR.
Application granted
Publication of US5276618A

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0255 - Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D 1/0268 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D 1/0272 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels


Abstract

A system of navigational referencing for a robot moving within a defined space is provided. A database contains positional information on a plurality of doorway openings in terms of an x-y coordinate system. A mobile robot is provided that has the capability to access the database, define its location in terms of an approximate position within the defined space, and sense the presence of one of the plurality of doorway openings. The robot is positioned and moved along a path that traverses the one doorway opening at a known speed. The doorway opening's top door frame is detected from first and second sensor positions on the robot. The first and second sensor positions are separated by a known distance along a line orthogonal to the path that traverses the one doorway opening. An angular orientation of the robot is based on a time difference between when the first and second sensor positions detect the top door frame, the known speed of the robot and the known separation distance between the first and second sensor positions. A y-coordinate position of the robot is based on the positional information from the database when the first or second sensing position detects the top door frame. A lateral position of the robot is determined during a portion of the time between when the first and second sensor positions detect the top door frame. The lateral position is compared with the positional information from the database in order to determine an x-coordinate of the robot.

Description

STATEMENT OF GOVERNMENT INTEREST
The invention described herein may be manufactured and used by or for the Government of the United States for governmental purposes without the payment of any royalties thereon or therefor.
FIELD OF THE INVENTION
The present invention relates to the field of navigational referencing systems, and more particularly to a navigational referencing system for a mobile robot that derives both x-y positional information and angular orientation of the robot as a natural consequence of transit through a standard, unmodified doorway.
CROSS-REFERENCES TO RELATED APPLICATIONS
The present invention is related to issued U.S. Pat. Nos. 4,851,661 (Jul. 25, 1989), 4,857,912 (Aug. 15, 1989), 4,902,887 (Feb. 20, 1990), 5,034,817 (Jul. 23, 1991), 5,045,769 (Sept. 3, 1991) and 5,058,385 (Oct. 22, 1991); and pending U.S. patent application Ser. Nos. 07/531,483 (filed May 29, 1990), 07/593,418 (filed Sept. 28, 1990), 07/697,128 (filed Apr. 18, 1991), 07/719,436 (filed Jun. 24, 1991), and 07/800,341 (filed Nov. 26, 1991), all of which are hereby incorporated by reference.
BACKGROUND OF THE INVENTION
The ultimate goal of a robotics system is of course to perform some useful function in place of its human counterpart. Benefits typically associated with the installation of fixed-location industrial robots are improved effectiveness, higher quality, reductions in manpower, greater efficiency, reliability and cost savings. Additional benefits include the ability to perform tasks for which humans are incapable and the removal of humans from dangerous or life-threatening scenarios. The concept of mobility has always suggested an additional range of applications beyond that of the typical factory floor, wherein free-roaming robots moved about with an added versatility that brought even greater returns. In practice, however, the realization of this dream has been slow in coming.
One of the most significant technological hurdles impeding the widespread introduction of mobile robotics systems arises from the need for a mobile platform to successfully interact with the physical objects and entities in its environment. The robot must be able to navigate from a known position to a desired new location and orientation, at the same time avoiding any contact with fixed or moving objects while enroute. A robust navigational scheme that preserves the validity of a world model for free-roaming platforms has remained an elusive research goal, and for this reason many proposed applications of autonomous mobile robots are yet to be implemented.
The simplest form of autonomous control is sometimes termed guidepath control and involves a navigational control loop which reacts (in a reflexive manner) to the sensed position of some external guiding reference. The intent is to free a human operator from the requirement of steering the moving platform. For example, encoded reflective stripes might be applied to the floor of the robot's environment. The robot would then be equipped with stripe detecting/decoding capability for determining the robot's position in its environment from information provided on the encoded stripes. Such automated guided vehicles (AGVs) have found extensive use in factories and warehouses for material transfer, in modern office scenarios for material and mail pickup and delivery, and in hospitals for delivery of meals and supplies to nursing stations, to name but a few.
Advantages of guidepath control are seen primarily in the improved efficiency and reduction of manpower since an operator is no longer required to guide the vehicle. Large numbers of AGVs can operate simultaneously in a plant or warehouse without getting lost or disoriented. The AGVs are typically scheduled and controlled by a central computer which monitors overall system operation and vehicle flow. Communication with individual vehicles can be via RF links or directional near-infrared modulated light beams, or other means. However, the fundamental disadvantage of guidepath control is the lack of flexibility in the system. A vehicle cannot be commanded to go to a new location unless the guidepath is first modified. This is a significant drawback in the event of changes to product flow lines in assembly plants, or in the case of a security robot which must investigate a potential break-in at a designated remote location.
Thus, truly autonomous control implies the ability of a mobile platform to travel anywhere so desired, subject to nominal considerations of terrain. Many potential applications await an indoor robot that could move in a purposeful fashion from room to room without following a set guidepath, with the intelligence to avoid objects and, if necessary, choose alternative routes of its own planning. To do this, specialized sensors must be coupled with some type of "world modeling" capability that represents the relative/absolute locations of objects detected by these sensors. In this way, a mobile platform can be provided with sufficient awareness of its surroundings to allow it to move about in a realistic fashion, i.e., a path not forever dictated by a guidepath stripe.
The accuracy of this model, which is constructed and refined in a continuous fashion as the robot moves about its workspace, is directly dependent throughout this process upon the validity of the robot's perceived location and orientation. Accumulated dead-reckoning errors can quickly render the information entered into the model invalid in that the associated geographical reference point for data acquired relative to the robot's position is incorrect. As the accuracy of the model degrades, the ability of the robot to successfully navigate and avoid collisions diminishes rapidly, until it fails altogether. One solution to this problem is to provide navigation landmarks within the robot's environment for use by the robot in periodic (absolute) positional updates. The concept of using existing interior doorways as navigation landmarks for a mobile robotics system has always been appealing, in that no modifications to the surrounding environment are required. The robot by necessity must travel through a doorway to enter an adjoining space. If in so doing the system could obtain an accurate positional update, then such would indeed represent an elegant solution to the problem of cumulative dead-reckoning errors.
Thus, the need exists for a navigational referencing system for a mobile robot that can derive its own updated positional information as it moves through its environment. Accordingly, an object of the present invention is to provide a navigational referencing method and system for a mobile robot that derives x-y position and angular orientation of the robot within a world model of the robot's environment as the robot traverses doorway openings. Another object of the present invention is to provide a navigational referencing method and system for a mobile robot that derives a relative x-y position and angular orientation of the robot without any modifications to the robot's environment. Yet another object of the present invention is to provide a method and system of navigational referencing for a mobile robot that provides for sufficient updates to the x-y position and angular orientation of the robot within a world model of the robot's environment thereby avoiding the accumulated effect of navigational dead-reckoning errors.
SUMMARY OF THE INVENTION
In accordance with the present invention, a system of navigational referencing for a robot moving within a defined space is provided. A database contains positional information on the defined space in terms of the location of known objects to include a plurality of doorway openings. Each doorway opening has left, right and top door frames that define first and second imaginary vertical planes at the beginning and ending, respectively, of each doorway opening. The first and second imaginary vertical planes further are orthogonal to the floor of the defined space. The first and second imaginary vertical planes for each of the plurality of doorway openings are also defined in terms of an x-y coordinate system. A mobile robot is provided that has the capability to access the database, define its location in terms of an approximate position within the x-y coordinate system, and sense the presence of one of the plurality of doorway openings. The robot is positioned and moved along a path that traverses the one doorway opening at a known speed. The top door frame is detected from a first and second sensor position on the robot when the first and second sensor positions break the first imaginary vertical plane. The first and second sensor positions are separated by a known distance along a line orthogonal to the path that traverses the one doorway opening. An angular orientation α of the robot within the x-y coordinate system is based on a time difference between when the first and second sensor positions break the first imaginary vertical plane, the known speed of the robot and the known separation distance between the first and second sensor positions. A y-coordinate position of the robot within the x-y coordinate system is based on the positional information available from the database on the first imaginary vertical plane when the first or second sensing position breaks therethrough. A lateral position of the robot with respect to the one doorway opening is determined during a portion of the time between the first sensor position breaking the first imaginary vertical plane and the second sensor position breaking the second imaginary vertical plane. The lateral position is compared with the positional information from the database on the one doorway opening in order to determine an x-coordinate of the robot within the x-y coordinate system.
DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of a mobile robot used in the navigational referencing system and method of the present invention;
FIG. 2 is a block diagram of the various central processing units that may be used by the present invention;
FIG. 3 is a two-dimensional top view of a simple floor plan having the mobile robot positioned in the vicinity of a doorway opening;
FIG. 4 is a top view of the mobile robot outputting three acoustic beams as it approaches an expanded view of a doorway opening;
FIG. 5 is a top view of the mobile robot of FIG. 4 as it gets closer to the doorway opening;
FIG. 6 is a top view of the mobile robot of FIG. 5 just before it enters the doorway opening;
FIG. 7 is a top view of the mobile robot as it breaks the first imaginary vertical plane formed by the doorway opening;
FIG. 8 is a side view of a sensor pair focused to detect the top door frame of the doorway opening.
DESCRIPTION OF THE PREFERRED EMBODIMENT
A navigational referencing system for a mobile robot that utilizes doorway openings as positional and angular orientation reference points would typically carry out the following tasks:
1) Find the doorway.
2) Verify the doorway.
3) Enter the doorway.
4) Determine longitudinal position or y-coordinate of the robot relative to the doorway.
5) Determine angular orientation α of the robot relative to the doorway.
6) Determine lateral position or x-coordinate of the robot relative to the doorway.
7) Convert relative x, y, α to absolute.
It is to be appreciated that tasks 4, 5 and 6 contain the novel aspects of the present invention. However, for a more complete understanding of the present invention, tasks 1, 2, 3 and 7 will also be described hereinbelow. Furthermore, it is to be understood at the outset that the present invention is not limited to the preferred embodiment system in order to carry out the above tasks.
Referring now to the drawings, and in particular to FIG. 1, a perspective view of a mobile robot 10 is shown. Robot 10 may house a variety of sensors, cameras, receivers, transmitters, etc. However, for sake of simplicity, only those elements pertaining to navigational referencing will be shown and described herein. Similarly, propulsion systems, drive trains and various other mechanical aspects of robot 10 may be designed to suit a particular need based on well known state-of-the-art robotics systems. Since these are not critical considerations with respect to tasks 4, 5 and 6, details of these aspects of robot 10 have been omitted.
Typically, robot 10 has an onboard scheduling CPU (central processing unit) 100 shown in the functional block diagram of FIG. 2. Scheduling CPU or scheduler 100 is provided with the capability of communicating (for example, via a radio link) with a host CPU or planner 200. For purposes of the present invention, planner 200 maintains a world model or floor plan (e.g. an x-y coordinate system) of the robot's operating environment. In particular, planner 200 knows the absolute position of each doorway opening in the robot's operating environment with respect to the absolute x-y coordinate system. The floor plan maintained by planner 200 is typically initialized by means of a user interface 201 such as a personal computer. The floor plan may be generated by any conventional computer aided design package.
Scheduler 100 also communicates with a plurality of onboard sensor processing (data acquisition) CPUs to assimilate data received and processed by the onboard sensors and associated CPUs. Examples of such CPUs include, but are not limited to, a video CPU 101 for processing video signals from an onboard video camera, a speech recognition/synthesis CPU 102 for voice command processing, a propulsion CPU 103 for controlling the robot's propulsion drive train system, an optical CPU 104 for receiving/processing optical sensor information, a sonar CPU 105 for receiving/processing ultrasonic sonar sensor information, and a pan axis controller CPU 106 for controlling rotational movement of the robot's head 16.
The first of the aforementioned tasks, or finding the doorway, may be addressed in a variety of ways. One approach is to use a plurality of ultrasonic ranging sensors 12 mounted around the periphery of robot 10 in conjunction with a plurality of optical proximity sensors 14. Ultrasonic ranging sensors 12 are typically capable of accurately measuring distance but have poor angular resolution. This shortfall is compensated for by the fact that optical proximity sensors 14 generally have superior angular resolution at the expense of little or no ranging capability. For purposes of the present invention, it will be assumed that planner 200 and scheduler 100 work together with inputs from sensors 12 and 14 to direct the robot to a specified absolute position from which a doorway opening could be found. It will also be assumed that the robot knows its approximate position within the x-y coordinate system. The robot's actual position within the x-y coordinate system is said to be known only approximately because, during the time since its last positional update, robot 10 may have accumulated minor enroute dead-reckoning errors and may have experienced real-world terrain traversability problems (e.g., wheel slippage).
Reference will also be made now to a simple floor plan for purposes of describing the robot's task of finding a doorway opening. In the two-dimensional top view of FIG. 3, the robot will be assumed to have traveled to a position that is aligned with the center 51 of doorway opening 50 as shown by the dotted line representation of the robot referenced by numeral 10a. However, due to the aforementioned dead-reckoning errors and traversability problems, the robot is actually positioned as shown by the solid line representation of the robot referenced by numeral 10.
To find doorway opening 50 once robot 10 is in the vicinity of same, planner 200 informs scheduler 100 of a path segment 26a extending from (the assumed position) robot 10a that is thought to penetrate doorway opening 50 at its center 51. Planner 200 provides an estimated bearing θEST and distance to doorway opening 50. For sake of simplicity, it will further be assumed that planner 200 has oriented (the assumed position) robot 10a onto the path segment 26a that actually penetrates doorway opening 50 orthogonal to the doorway opening's associated wall, i.e. θEST = 90°. Scheduler 100 then rotates the head 16 of (the actual position) robot 10 to the estimated bearing θEST. This points an optical proximity sensor 15 located on head 16 at doorway opening 50 in order to begin the task of verifying the doorway opening. Alternatively, an array of fixed proximity sensors, such as optical proximity sensors 14, could be operated sequentially to simulate rotation of head 16.
Unless robot 10 is significantly misaligned due to previously accumulated dead-reckoning errors, proximity sensor 15 will return a "no target" condition, as it looks through doorway opening 50. If this is not the case, head 16 begins scanning 15° to either side of the estimated bearing θEST in an attempt to find doorway opening 50. If this search fails to locate doorway opening 50, an error condition is returned informing planner 200 that robot 10 is either significantly lost or the door associated with the opening is closed.
Assuming doorway opening 50 is detected, scheduler 100 next attempts to locate the left and right edges of doorway opening 50. One way of accomplishing this is to pan head 16 while processing output from proximity sensor 15 for a "target" condition. A "target" condition is indicative of energy being reflected from the doorway opening frames and adjacent wall areas on either side thereof. Since robot 10 is typically not aligned with the center 51 of doorway opening 50, position angles of head 16 corresponding to the left and right boundaries of the doorway opening are averaged to yield a relative bearing θREL to the center 51 of doorway opening 50. (Note that the relationship between the estimated bearing θEST and the relative bearing θREL has been exaggerated for purposes of clarity. In reality, this difference is on the order of one or two degrees.)
Scheduler 100 then alters the heading of robot 10 to be coincident with the relative bearing θREL to the actual center 51 of doorway opening 50. At this point, the task of entering the doorway begins. One acceptable approach begins with processing (at sonar CPU 105) sonar data received from a plurality of sonar transducers 18 as robot 10 moves along the relative bearing θREL to the center 51 of doorway opening 50. The measured distance to doorway opening 50 should be within a specified tolerance of the estimated range provided by planner 200, less any distance traveled in the interim. Three (or more) of the transducers 18 balanced about the relative bearing θREL could be focused so that those transducers indicate ranges within the specified tolerance at a specified range. This is easily explained by way of example with reference to FIGS. 4-6, where robot 10 is shown on an orthogonal approach to a doorway opening.
FIG. 4 is an expanded top view of a doorway opening 50, bounded by left and right walls 52L and 52R which are shown in cross-section. Walls 52L and 52R in turn support left and right door frames 53L and 53R. The top portion of doorway opening 50 is omitted from FIG. 4 for purposes of clarity. As robot 10 approaches doorway opening 50, three sonar ranging beams (a left beam 54, a center beam 56 and a right beam 58) propagate towards doorway opening 50. For complete ranging coverage, beams 54, 56 and 58 overlap as shown. Thus, the transducers "ranging" on beams 54, 56 and 58 should all indicate a range commensurate with the estimated range.
As robot 10 closes on doorway opening 50, the center beam 56 will break through opening 50 as shown in FIG. 5. A corresponding increase in range measurement will result from the transducer producing center beam 56. Assuming robot 10 is perfectly aligned with opening 50, beam "break through" of center beam 56 occurs at a range where its effective beam width becomes less than the width of opening 50. However, since perfect alignment is atypical, a slight delay in beam break through results as center beam 56 narrows (with respect to opening 50) on approach. Beams 54 and 58 continue to strike frames 53L and 53R as robot 10 approaches opening 50. Thus, the range to opening 50 is continuously measured by beams 54 and 58 as robot 10 approaches the opening. Similar to center beam 56, left beam 54 and right beam 58 will eventually break through opening 50 as shown in FIG. 6. This process allows for verification of doorway detection as the beams are observed to break through at the predicted ranges.
As robot 10 enters doorway opening 50 and passes therethrough, the present invention executes its novel approach to accomplishing tasks 4, 5 and 6. Referring again to FIG. 1, two sensing positions 20 and 22 are located on top of robot 10 as shown. Sensing positions 20 and 22 are fixed in relationship to one another on a mounting bar 24 and are separated by a known distance d. As shown in the top view of FIG. 7, mounting bar 24 (and hence sensing positions 20 and 22) is maintained in an orthogonal relationship with respect to the robot's path 26. Dotted lines 30 and 32 represent two imaginary vertical planes formed at doorway opening 50 between the left and right doorway frames 53L and 53R and the top doorway frame (not shown in this view). Each vertical plane 30 and 32 is perpendicular to the floor and is defined two-dimensionally within the x-y coordinate system of planner 200. Thus, accurate detection of vertical plane 30 (or vertical plane 32 if robot 10 is approaching opening 50 from the opposite direction) allows the robot to start the process that updates its actual position within the x-y coordinate system maintained by planner 200. As shown, vertical plane 30 is defined by the leading edge of the door frame (with respect to the robot's position and movement) while vertical plane 32 is defined by the trailing edge of the door frame.
Accurate detection of plane 30 is achieved in the present invention by orienting sensors placed at sensing positions 20 and 22 so that their beams are focused vertically and orthogonal to the path 26 of robot 10, so as to detect the top of the door frame of doorway opening 50. In the example shown, sensing position 20 breaks vertical plane 30 prior to sensing position 22. (However, this situation could just as easily be reversed if robot 10 approached opening 50 from a different heading.) By focusing sensor beams from sensing positions 20 and 22 vertically, numerous problems generally associated with prior art approaches that detect the left and right door frames 53L and 53R, respectively, (e.g. varying widths of doorway openings and/or presence of objects near doorway opening 50 such as the door itself) are eliminated. This is because the top frame of most doorway openings, unlike its sides, is typically unobstructed and is located at a standard height of approximately 80 inches above the floor. Accordingly, sensors placed at positions 20 and 22 can be limited to a focus region that is essentially sensitive only to the height of the top door frame of a traversed doorway opening.
One implementation of such a sensing arrangement is shown in FIG. 8 and is applicable for use at both sensing positions 20 and 22. A sensor pair 28 includes an emitter 28E and detector 28D mounted on mounting bar 24. Emitter 28E and detector 28D could be near-infrared optical proximity sensors configured in the convergent mode. Such sensors produce beams that can be tightly focused.
Emitter 28E and detector 28D are centered about sensing position 20 (or sensing position 22). Sensor pair 28 is focused so that a zone of possible detection, referenced by the dotted line area 29, would be centered at a height approximately equal to the top door frame 53T (shown only in section). In this way, as sensing position 20 breaks the imaginary vertical plane 30, the leading edge of top door frame 53T will be detected. Since there are typically no obstructions at top door frame 53T, the accuracy of detecting vertical plane 30 is maximized. At the time of detection, scheduler 100 obtains a longitudinal position (or y-coordinate) fix based on positional information on vertical plane 30 stored in planner 200.
As robot 10 traverses doorway opening 50, another sensor pair (not shown but identical to sensor pair 28) positioned at sensing position 22 will break through vertical plane 30. It is at this point in time that the angular orientation α of robot 10 with respect to the doorway opening can be determined in accordance with the formula

α = tan⁻¹(V·T/d)                                       (1)

where V is the speed of robot 10,
T is the time difference between when sensing positions 20 and 22 break through vertical plane 30 as measured, for example, by a clock maintained onboard robot 10 or scheduler 100, and
d is the known separation distance between sensing positions 20 and 22.
The final task in the navigational referencing procedure is to determine the lateral position (or x-coordinate) of robot 10 with respect to doorway opening 50. One method of achieving this is to apply one of many well known acoustic time-of-flight ranging techniques. For example, as sensing position 20 breaks imaginary vertical plane 30, two of the ultrasonic ranging sensors 12 (FIG. 1) that are approximately orthogonal to doorway opening 50 could be configured to begin sonar ranging (i.e. "pinging"). The two sensors chosen should be capable of detecting the left and right door frames, respectively. Note that "approximately orthogonal" is sufficient since such sensors typically have a beam width of 30° while the robot is typically misaligned by less than 10°. Pinging could take the form of single pings or be continued until sensing position 20 breaks imaginary vertical plane 32. Alternatively, pinging could continue until sensing position 22 breaks vertical plane 32.

In either case, the resulting acoustic range profile could be processed so that the simultaneous range minimums created by the left and right door frames add together to yield the width of doorway opening 50. Ranges are reported relative to the vertical axis of the robot. Scheduler 100 can then use these left and right range values as the basis for determining the lateral position of robot 10 relative to the x-coordinates of doorway opening 50 maintained by planner 200. For example, if the left and right range values were both 18 inches, planner 200 would know that the robot was centered in the doorway opening. If, however, the left range value was 16 inches and the right range value was 20 inches, planner 200 would know that the robot was 2 inches too far to the left.

Finally, since planner 200 knows the doorway measurements and orientation in terms of the absolute x-y coordinate system, it is able to resolve ambiguities between the robot's assumed position and the robot's actual position with respect to the doorway opening. Thus, planner 200 converts the longitudinal position, lateral position and angular orientation of the robot relative to the doorway opening into the absolute coordinates of the x-y coordinate system.
The advantages of the present invention are numerous. First, the navigational referencing system and method of the present invention allows a mobile robot to accurately update its position within its environment every time the robot traverses a doorway opening. Therefore, any accumulation of dead-reckoning errors is minimized. In addition, no doorway opening need be modified to accommodate this robust navigational referencing system. Thus, the present invention is easily adaptable to any existing structure as long as a database of the floor plan is made available to the referencing system in terms of an x-y coordinate system (or other suitable coordinate system). Further, by focusing proximity sensors vertically, detection of the doorway opening (via the top door frame) is not affected by the width of the doorway opening or other adjacent objects in the robot's surrounding environment. Finally, since heights of doorway openings are for the most part standard, the present invention will be able to provide accurate navigational referencing in many existing structures. The present invention can also be readily adapted to non-standard door heights as long as they are consistent.
Although the invention has been described relative to a specific embodiment thereof, there are numerous variations and modifications that will be readily apparent to those skilled in the art in the light of the above teachings. For example, verification of the angular orientation α could easily be achieved by taking similar sensor measurements as sensing positions 20 and 22 break through imaginary vertical plane 32 defined by the trailing edge of the door frame. In addition, the measured distance between vertical planes 30 and 32 could easily be calculated from the equation
V·T_T·cos α                                       (2)
where T_T is the total time between leading edge detection of vertical plane 30 and trailing edge detection of vertical plane 32.
This would serve as a means of verifying that the robot was indeed passing through a doorway opening of known dimensions and not under some other overhead target (e.g. EXIT sign, duct work, etc.). Further, the use of doorway openings could be replaced by some other fixed and known location overhead structural detail such as beams. Finally, the proximity sensors could be replaced with video cameras as a means of detecting the top door frame. It is therefore to be understood that, within the scope of the appended claims, the invention may be practiced other than as specifically described.

Claims (11)

What is claimed as new and desired to be secured by letters patent of the United States is:
1. A method of navigational referencing for a robot which moves within a defined space comprising the steps of:
providing the robot with positional information on the defined space in terms of a plurality of doorway openings, each doorway opening having left, right and top door frames defining first and second imaginary vertical planes at the beginning and ending, respectively, of each doorway opening, the first and second imaginary vertical planes further being orthogonal to the floor of the defined space, wherein the first and second imaginary vertical planes for each of the plurality of doorway openings are defined in terms of an x-y coordinate system;
finding one of the plurality of doorway openings;
positioning the robot on a heading to traverse said one doorway opening;
moving the robot along a path defined by the heading in order to traverse said one doorway opening at a known speed;
detecting the top door frame from a first sensor position on the robot when the first sensor position enters the first imaginary vertical plane;
detecting the top door frame from a second sensor position on the robot when the second sensor position enters the first imaginary vertical plane, wherein the first and second sensor positions are separated by a known distance along a line orthogonal to the path defined by the robot's heading;
measuring a time difference between when the first and second sensor positions break the first imaginary vertical plane, wherein an angular orientation α of the robot within the x-y coordinate system is based on the measured time difference, the known speed of the robot and the known separation distance between the first and second sensor positions, and a y-coordinate position of the robot within the x-y coordinate system is based on the provided positional information on the first imaginary vertical plane when the first or second sensor position enters said first imaginary vertical plane; and
determining a lateral position of the robot with respect to said one doorway opening during a portion of the time between the first sensor position entering the first imaginary vertical plane and the second sensor position entering the second imaginary vertical plane, wherein the lateral position is compared with the provided positional information of said one doorway opening in order to determine an x-coordinate of the robot within the x-y coordinate system.
2. A method according to claim 1 wherein the angular orientation α of the robot within the x-y coordinate system is equal to
α = tan⁻¹ (VT/d)
where V is the known speed of the robot,
T is the measured time difference between when the first and second sensor positions break the first imaginary vertical plane, and
d is the known separation distance between the first and second sensor positions.
3. A method according to claim 1 wherein said step of determining the lateral position of the robot includes the step of measuring a distance to the left and right door frames during the portion of the time between the first sensor position entering the first imaginary vertical plane and the second sensor position entering the second imaginary vertical plane.
4. A method according to claim 3 wherein the distance to the left and right door frames is determined by acoustic time-of-flight techniques.
5. A system of navigational referencing within a defined space comprising:
a database containing positional information on the defined space in terms of a plurality of doorway openings, each doorway opening having left, right and top door frames defining first and second imaginary vertical planes at the beginning and ending, respectively, of each doorway opening, the first and second imaginary vertical planes further being orthogonal to the floor of the defined space, wherein the first and second imaginary vertical planes for each of the plurality of doorway openings are defined in terms of an x-y coordinate system;
a mobile robot having the capability to access said database, define its location in terms of an approximate position within said x-y coordinate system, and sense the presence of one of said plurality of doorway openings;
means for positioning and moving the robot along a path that traverses said one doorway opening at a known speed;
means for detecting the top door frame from a first and second sensor position on the robot when the first and second sensor positions break the first imaginary vertical plane, wherein the first and second sensor positions are separated by a known distance along a line orthogonal to the path that traverses the one doorway opening, wherein an angular orientation α of the robot within the x-y coordinate system is based on a time difference between when the first and second sensor positions break the first imaginary vertical plane, the known speed of the robot and the known separation distance between the first and second sensor positions, and a y-coordinate position of the robot within the x-y coordinate system is based on the positional information available from said database on the first imaginary vertical plane when the first or second sensor position enters said first imaginary vertical plane; and
means for determining a lateral position of the robot with respect to the one doorway opening during a portion of the time between the first sensor position entering the first imaginary vertical plane and the second sensor position entering the second imaginary vertical plane, wherein the lateral position is compared with the positional information from said database on said one doorway opening in order to determine an x-coordinate of the robot within the x-y coordinate system.
6. A system as in claim 5 wherein the angular orientation α of the robot within the x-y coordinate system is equal to
α = tan⁻¹ (VT/d)
where V is the known speed of the robot,
T is the time difference between when the first and second sensor positions break the first imaginary vertical plane, and
d is the known separation distance between the first and second sensor positions.
7. A system as in claim 5 wherein said means for detecting the top door frame comprises first and second proximity sensors at said first and second sensor positions, respectively.
8. A system as in claim 7 wherein each of said proximity sensors comprises a proximity sensor pair, each sensor pair including an emitter having an emitter beam and a detector having a detection beam, wherein said emitter beam and detection beam are focused to intersect in a zone, whereby the top door frame of the one doorway opening passes through the zone as the robot traverses the one doorway opening.
9. A system as in claim 8 wherein said emitter and said detector operate in a convergent mode in the near-infrared region.
10. A system as in claim 5 wherein said means for determining the lateral position of the robot comprises means for measuring a distance from the robot to the left and right door frames, respectively.
11. A system as in claim 10 wherein said means for measuring a distance comprises acoustic range finders activated when the first sensor position enters the first imaginary vertical plane and deactivated when the second sensor position enters the second imaginary vertical plane.
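Read together, method claim 1 and system claim 5 describe a single position update executed once per doorway transit. The following minimal Python sketch assumes a hypothetical database layout and a sign convention in which x increases toward the right door frame; it is offered as a reading aid, not as the patented implementation.

    import math

    def doorway_transit_update(v, d, t_first, t_second,
                               plane_y, left_frame_x, right_frame_x,
                               range_left, range_right):
        # Angular orientation alpha from the times at which the first and
        # second sensor positions break the first imaginary vertical plane
        # (claims 1 and 2); the sign of the time difference gives the
        # direction of the heading error.
        alpha = math.atan2(v * (t_second - t_first), d)
        # The y-coordinate is taken directly from the database position of
        # the first imaginary vertical plane at the moment of crossing.
        y = plane_y
        # The x-coordinate comes from acoustic ranges to the left and right
        # door frames, sampled while the robot is between the two imaginary
        # vertical planes (claims 3, 4, 10 and 11); the two independent
        # estimates are averaged.
        x = 0.5 * ((left_frame_x + range_left) +
                   (right_frame_x - range_right))
        return x, y, alpha

The resulting (x, y, α) fix replaces the robot's dead-reckoned estimate, so that odometry error accumulates only between successive doorways.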
US07/846,486 1992-02-26 1992-02-26 Doorway transit navigational referencing system Expired - Fee Related US5276618A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US07/846,486 US5276618A (en) 1992-02-26 1992-02-26 Doorway transit navigational referencing system

Publications (1)

Publication Number Publication Date
US5276618A 1994-01-04

Family

ID=25298087

Family Applications (1)

Application Number Title Priority Date Filing Date
US07/846,486 Expired - Fee Related US5276618A (en) 1992-02-26 1992-02-26 Doorway transit navigational referencing system

Country Status (1)

Country Link
US (1) US5276618A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4500970A (en) * 1982-01-15 1985-02-19 Richard A. Boulais Robot vehicle guidance system including checkpoint realignment system
US4638445A (en) * 1984-06-08 1987-01-20 Mattaboni Paul J Autonomous mobile robot
US4779203A (en) * 1985-08-30 1988-10-18 Texas Instruments Incorporated Visual navigation system for mobile robots
US4751658A (en) * 1986-05-16 1988-06-14 Denning Mobile Robotics, Inc. Obstacle avoidance system
US4846297A (en) * 1987-09-28 1989-07-11 Tennant Company Automated guided vehicle
US4905151A (en) * 1988-03-07 1990-02-27 Transitions Research Corporation One dimensional image visual system for a moving vehicle
US5040116A (en) * 1988-09-06 1991-08-13 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
US5155684A (en) * 1988-10-25 1992-10-13 Tennant Company Guiding an unmanned vehicle by reference to overhead features
US5073749A (en) * 1989-06-22 1991-12-17 Shinko Electric Co., Ltd. Mobile robot navigating method
US5109340A (en) * 1989-06-22 1992-04-28 Shinko Electric Co., Ltd. Path planning method for mobile robots
US5111401A (en) * 1990-05-19 1992-05-05 The United States Of America As Represented By The Secretary Of The Navy Navigational control system for an autonomous vehicle

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Isik et al., "Pilot Level of a Hierarchical Controller for an Unmanned Mobile Robot", 1988 IEEE, pp. 241-255. *
Röning et al., "A 3-D Scene Interpreter for Indoor Navigation", IEEE International Workshop on Intelligent Robots and Systems, IROS '90, pp. 695-701. *

Cited By (222)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5537017A (en) * 1992-05-22 1996-07-16 Siemens Aktiengesellschaft Self-propelled device and process for exploring an area with the device
US5525031A (en) * 1994-02-18 1996-06-11 Xerox Corporation Automated print jobs distribution system for shared user centralized printer
US5510984A (en) * 1994-11-28 1996-04-23 Board Of Regents-Univ. Of Nebraska Automated guided vehicle enunciator system
WO1997041451A1 (en) * 1996-04-30 1997-11-06 Aktiebolaget Electrolux System and device for a self orienting device
US5935179A (en) * 1996-04-30 1999-08-10 Aktiebolaget Electrolux System and device for a self orienting device
AU713488B2 (en) * 1996-04-30 1999-12-02 Aktiebolaget Electrolux System and device for a self orienting device
US5995884A (en) * 1997-03-07 1999-11-30 Allen; Timothy P. Computer peripheral floor cleaning system and navigation method
US6046565A (en) * 1998-06-19 2000-04-04 Thorne; Henry F. Robotic vehicle with deduced reckoning positioning system
US6834220B1 (en) * 1999-11-17 2004-12-21 Bail Gmbh Self-propelling vehicle
US6459955B1 (en) * 1999-11-18 2002-10-01 The Procter & Gamble Company Home cleaning robot
US20090045766A1 (en) * 2000-01-24 2009-02-19 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8565920B2 (en) 2000-01-24 2013-10-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8761935B2 (en) * 2000-01-24 2014-06-24 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9446521B2 (en) 2000-01-24 2016-09-20 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8478442B2 (en) 2000-01-24 2013-07-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9144361B2 (en) 2000-04-04 2015-09-29 Irobot Corporation Debris sensor for cleaning apparatus
FR2817053A1 (en) * 2000-11-22 2002-05-24 Samsung Kwangju Electronics Co Mobile surveillance robot has drive and obstacle detection camera for sensing presence of obstacles
US9582005B2 (en) 2001-01-24 2017-02-28 Irobot Corporation Robot confinement
US9167946B2 (en) 2001-01-24 2015-10-27 Irobot Corporation Autonomous floor cleaning robot
US9038233B2 (en) 2001-01-24 2015-05-26 Irobot Corporation Autonomous floor-cleaning robot
US20090319083A1 (en) * 2001-01-24 2009-12-24 Irobot Corporation Robot Confinement
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US8686679B2 (en) 2001-01-24 2014-04-01 Irobot Corporation Robot confinement
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US20070213892A1 (en) * 2001-06-12 2007-09-13 Irobot Corporation Method and System for Multi-Mode Coverage For An Autonomous Robot
US20160354931A1 (en) * 2001-06-12 2016-12-08 Irobot Corporation Method and System for Multi-Mode Coverage For An Autonomous Robot
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8463438B2 (en) 2001-06-12 2013-06-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9104204B2 (en) 2001-06-12 2015-08-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20100049365A1 (en) * 2001-06-12 2010-02-25 Irobot Corporation Method and System for Multi-Mode Coverage For An Autonomous Robot
US8838274B2 (en) 2001-06-12 2014-09-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20100263142A1 (en) * 2001-06-12 2010-10-21 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20070100500A1 (en) * 2001-09-26 2007-05-03 F Robotics Acquisitions, Ltd. Robotic vacuum cleaner
US20100332067A1 (en) * 2001-09-26 2010-12-30 Shai Abramson Robotic Vacuum Cleaner
US7444206B2 (en) 2001-09-26 2008-10-28 F Robotics Acquisitions Ltd. Robotic vacuum cleaner
EP1441632A2 (en) * 2001-09-26 2004-08-04 Friendly Robotics Ltd. Robotic vacuum cleaner
EP1441632A4 (en) * 2001-09-26 2007-08-15 F Robotics Acquisitions Ltd Robotic vacuum cleaner
US7769490B2 (en) 2001-09-26 2010-08-03 F Robotics Acquisitions Ltd. Robotic vacuum cleaner
US8311674B2 (en) 2001-09-26 2012-11-13 F Robotics Acquisitions Ltd. Robotic vacuum cleaner
US20080281481A1 (en) * 2001-09-26 2008-11-13 Shai Abramson Robotic Vacuum Cleaner
US20110131741A1 (en) * 2002-01-03 2011-06-09 Jones Joseph L Autonomous Floor-Cleaning Robot
US20100257690A1 (en) * 2002-01-03 2010-10-14 Irobot Corporation Autonomous floor-cleaning robot
US8516651B2 (en) 2002-01-03 2013-08-27 Irobot Corporation Autonomous floor-cleaning robot
US8474090B2 (en) 2002-01-03 2013-07-02 Irobot Corporation Autonomous floor-cleaning robot
US8671507B2 (en) 2002-01-03 2014-03-18 Irobot Corporation Autonomous floor-cleaning robot
US8656550B2 (en) 2002-01-03 2014-02-25 Irobot Corporation Autonomous floor-cleaning robot
US20070179670A1 (en) * 2002-01-24 2007-08-02 Irobot Corporation Navigational control system for a robotic device
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
US20050267629A1 (en) * 2002-06-07 2005-12-01 Ulf Petersson Electronic directing system
US7574282B2 (en) * 2002-06-07 2009-08-11 Husqvarna Ab Electronic directing system
US20030236590A1 (en) * 2002-06-12 2003-12-25 Samsung Electronics Co., Ltd. Apparatus and method of recognizing position and direction of mobile robot
US8041455B2 (en) 2002-08-30 2011-10-18 Aethon, Inc. Robotic cart pulling vehicle
US20050029029A1 (en) * 2002-08-30 2005-02-10 Aethon, Inc. Robotic cart pulling vehicle
US7431115B2 (en) 2002-08-30 2008-10-07 Aethon, Inc. Robotic cart pulling vehicle
US20070051546A1 (en) * 2002-08-30 2007-03-08 Thorne Henry F Robotic cart pulling vehicle
WO2004020267A2 (en) 2002-08-30 2004-03-11 Aethon, Inc. Robotic cart pulling vehicle
US20090030569A1 (en) * 2002-08-30 2009-01-29 Thorne Henry F Robotic cart pulling vehicle
US7100725B2 (en) 2002-08-30 2006-09-05 Aethon Robotic cart pulling vehicle
US8781626B2 (en) 2002-09-13 2014-07-15 Irobot Corporation Navigational control system for a robotic device
US8428778B2 (en) 2002-09-13 2013-04-23 Irobot Corporation Navigational control system for a robotic device
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US8793020B2 (en) 2002-09-13 2014-07-29 Irobot Corporation Navigational control system for a robotic device
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US20040244138A1 (en) * 2003-03-14 2004-12-09 Taylor Charles E. Robot vacuum
US20040220698A1 (en) * 2003-03-14 2004-11-04 Taylor Charles E Robotic vacuum cleaner with edge and object detection system
US20040200505A1 (en) * 2003-03-14 2004-10-14 Taylor Charles E. Robot vac with retractable power cord
US20040236468A1 (en) * 2003-03-14 2004-11-25 Taylor Charles E. Robot vacuum with remote control mode
US20050000543A1 (en) * 2003-03-14 2005-01-06 Taylor Charles E. Robot vacuum with internal mapping system
US7805220B2 (en) 2003-03-14 2010-09-28 Sharper Image Acquisition Llc Robot vacuum with internal mapping system
US20040211444A1 (en) * 2003-03-14 2004-10-28 Taylor Charles E. Robot vacuum with particulate detector
US20050010331A1 (en) * 2003-03-14 2005-01-13 Taylor Charles E. Robot vacuum with floor type modes
US7801645B2 (en) 2003-03-14 2010-09-21 Sharper Image Acquisition Llc Robotic vacuum cleaner with edge and object detection system
US8749196B2 (en) 2004-01-21 2014-06-10 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US9215957B2 (en) 2004-01-21 2015-12-22 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8854001B2 (en) 2004-01-21 2014-10-07 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8461803B2 (en) 2004-01-21 2013-06-11 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US20070114975A1 (en) * 2004-01-21 2007-05-24 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US20070267998A1 (en) * 2004-01-21 2007-11-22 Irobot Corporation Autonomous Robot Auto-Docking and Energy Management Systems and Methods
US20080007203A1 (en) * 2004-01-21 2008-01-10 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8456125B2 (en) 2004-01-28 2013-06-04 Irobot Corporation Debris sensor for cleaning apparatus
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
US8598829B2 (en) 2004-01-28 2013-12-03 Irobot Corporation Debris sensor for cleaning apparatus
US8378613B2 (en) 2004-01-28 2013-02-19 Irobot Corporation Debris sensor for cleaning apparatus
FR2867269A1 (en) * 2004-03-04 2005-09-09 Elie Assaad Rolling apparatus for rectangular structural opening of building, has two thumb wheels, where apparatus calculates co-ordinates of each point in three dimensions while rolling on opening, and provides dimensions for wood work
WO2005085755A1 (en) * 2004-03-04 2005-09-15 Elie Assaad Rolling device for measuring the dimensions of an opening in a building for the fitting of a piece of joinery comprising two coding wheels and an inclinometer
US20060020369A1 (en) * 2004-03-11 2006-01-26 Taylor Charles E Robot vacuum cleaner
US9360300B2 (en) 2004-03-29 2016-06-07 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US8780342B2 (en) 2004-03-29 2014-07-15 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US20050246078A1 (en) * 2004-04-30 2005-11-03 Jan Vercammen Automatically guided vehicle with improved navigation
US20080199298A1 (en) * 2004-05-03 2008-08-21 Jervis B. Webb Company Automatic transport loading system and method
US8210791B2 (en) 2004-05-03 2012-07-03 Jervis B. Webb Company Automatic transport loading system and method
US20050244259A1 (en) * 2004-05-03 2005-11-03 Chilson Gerald E Automatic transport loading system and method
US7648329B2 (en) 2004-05-03 2010-01-19 Jervis B. Webb Company Automatic transport loading system and method
US7980808B2 (en) 2004-05-03 2011-07-19 Jervis B. Webb Company Automatic transport loading system and method
US8075243B2 (en) 2004-05-03 2011-12-13 Jervis B. Webb Company Automatic transport loading system and method
US8192137B2 (en) 2004-05-03 2012-06-05 Jervis B. Webb Company Automatic transport loading system and method
US20100266381A1 (en) * 2004-05-03 2010-10-21 Jervis B. Webb Company Automatic transport loading system and method
US9486924B2 (en) 2004-06-24 2016-11-08 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9008835B2 (en) 2004-06-24 2015-04-14 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9229454B1 (en) 2004-07-07 2016-01-05 Irobot Corporation Autonomous mobile robot system
US8874264B1 (en) 2004-07-07 2014-10-28 Irobot Corporation Celestial navigation system for an autonomous robot
US8594840B1 (en) 2004-07-07 2013-11-26 Irobot Corporation Celestial navigation system for an autonomous robot
US8634956B1 (en) 2004-07-07 2014-01-21 Irobot Corporation Celestial navigation system for an autonomous robot
US8634958B1 (en) 2004-07-07 2014-01-21 Irobot Corporation Celestial navigation system for an autonomous robot
US9223749B2 (en) 2004-07-07 2015-12-29 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8670866B2 (en) 2005-02-18 2014-03-11 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US9445702B2 (en) 2005-02-18 2016-09-20 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8985127B2 (en) 2005-02-18 2015-03-24 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8774966B2 (en) 2005-02-18 2014-07-08 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8966707B2 (en) 2005-02-18 2015-03-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US20070016328A1 (en) * 2005-02-18 2007-01-18 Andrew Ziegler Autonomous surface cleaning robot for wet and dry cleaning
US20080140255A1 (en) * 2005-02-18 2008-06-12 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8782848B2 (en) 2005-02-18 2014-07-22 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US20060241812A1 (en) * 2005-04-25 2006-10-26 Lg Electronics Inc. Robot system capable of specifying moving area
US20060276958A1 (en) * 2005-06-02 2006-12-07 Jervis B. Webb Company Inertial navigational guidance system for a driverless vehicle utilizing laser obstacle sensors
US9796078B2 (en) 2005-09-30 2017-10-24 Irobot Corporation Companion robot for personal interaction
US8935006B2 (en) 2005-09-30 2015-01-13 Irobot Corporation Companion robot for personal interaction
US9878445B2 (en) 2005-09-30 2018-01-30 Irobot Corporation Displaying images from a robot
EP2281667A1 (en) * 2005-09-30 2011-02-09 iRobot Corporation Companion robot for personal interaction
US20070199108A1 (en) * 2005-09-30 2007-08-23 Colin Angle Companion robot for personal interaction
US8583282B2 (en) 2005-09-30 2013-11-12 Irobot Corporation Companion robot for personal interaction
US8195333B2 (en) 2005-09-30 2012-06-05 Irobot Corporation Companion robot for personal interaction
US20110172822A1 (en) * 2005-09-30 2011-07-14 Andrew Ziegler Companion Robot for Personal Interaction
US20070192910A1 (en) * 2005-09-30 2007-08-16 Clara Vu Companion robot for personal interaction
US10661433B2 (en) 2005-09-30 2020-05-26 Irobot Corporation Companion robot for personal interaction
US7957837B2 (en) 2005-09-30 2011-06-07 Irobot Corporation Companion robot for personal interaction
US9452525B2 (en) 2005-09-30 2016-09-27 Irobot Corporation Companion robot for personal interaction
US9446510B2 (en) 2005-09-30 2016-09-20 Irobot Corporation Companion robot for personal interaction
US20090177323A1 (en) * 2005-09-30 2009-07-09 Andrew Ziegler Companion robot for personal interaction
US20070129849A1 (en) * 2005-10-14 2007-06-07 Aldo Zini Robotic ordering and delivery apparatuses, systems and methods
US20100234990A1 (en) * 2005-10-14 2010-09-16 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US9020679B2 (en) 2005-10-14 2015-04-28 Aethon, Inc. Robotic ordering and delivery system software and methods
US8204624B2 (en) 2005-10-14 2012-06-19 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US9679270B2 (en) 2005-10-14 2017-06-13 Aethon, Inc. Robotic ordering and delivery system software and methods
US9026301B2 (en) 2005-10-14 2015-05-05 Aethon, Inc. Robotic ordering and delivery system software and methods
US9563206B2 (en) 2005-10-14 2017-02-07 Aethon, Inc. Robotic ordering and delivery system software and methods
US7894939B2 (en) 2005-10-14 2011-02-22 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US20070112461A1 (en) * 2005-10-14 2007-05-17 Aldo Zini Robotic ordering and delivery system software and methods
US8010230B2 (en) 2005-10-14 2011-08-30 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US7996109B2 (en) 2005-10-14 2011-08-09 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US20110137457A1 (en) * 2005-10-14 2011-06-09 Aethon, Inc. Robotic ordering and delivery apparatuses, systems and methods
US20080282494A1 (en) * 2005-12-02 2008-11-20 Irobot Corporation Modular robot
US20080091305A1 (en) * 2005-12-02 2008-04-17 Irobot Corporation Coverage robot mobility
US20070250212A1 (en) * 2005-12-02 2007-10-25 Halloran Michael J Robot system
US10524629B2 (en) 2005-12-02 2020-01-07 Irobot Corporation Modular Robot
US20110077802A1 (en) * 2005-12-02 2011-03-31 Halloran Michael J Robot System
US8954192B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Navigating autonomous coverage robots
US8950038B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Modular robot
US9392920B2 (en) 2005-12-02 2016-07-19 Irobot Corporation Robot system
US8761931B2 (en) 2005-12-02 2014-06-24 Irobot Corporation Robot system
US8978196B2 (en) 2005-12-02 2015-03-17 Irobot Corporation Coverage robot mobility
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US20080058987A1 (en) * 2005-12-02 2008-03-06 Irobot Corporation Navigating autonomous coverage robots
US8661605B2 (en) 2005-12-02 2014-03-04 Irobot Corporation Coverage robot mobility
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US8600553B2 (en) 2005-12-02 2013-12-03 Irobot Corporation Coverage robot mobility
US8584305B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US9599990B2 (en) 2005-12-02 2017-03-21 Irobot Corporation Robot system
US20110004339A1 (en) * 2005-12-02 2011-01-06 Irobot Corporation Autonomous coverage robot navigation system
US9144360B2 (en) 2005-12-02 2015-09-29 Irobot Corporation Autonomous coverage robot navigation system
US9149170B2 (en) 2005-12-02 2015-10-06 Irobot Corporation Navigating autonomous coverage robots
US8528157B2 (en) 2006-05-19 2013-09-10 Irobot Corporation Coverage robots and associated cleaning bins
US8572799B2 (en) 2006-05-19 2013-11-05 Irobot Corporation Removing debris from cleaning robots
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US9492048B2 (en) 2006-05-19 2016-11-15 Irobot Corporation Removing debris from cleaning robots
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US20080047092A1 (en) * 2006-05-19 2008-02-28 Irobot Corporation Coverage robots and associated cleaning bins
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US9317038B2 (en) 2006-05-31 2016-04-19 Irobot Corporation Detecting robot stasis
US20080065265A1 (en) * 2006-05-31 2008-03-13 Irobot Corporation Detecting robot stasis
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US8839477B2 (en) 2007-05-09 2014-09-23 Irobot Corporation Compact autonomous coverage robot
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US8726454B2 (en) 2007-05-09 2014-05-20 Irobot Corporation Autonomous coverage robot
US9480381B2 (en) 2007-05-09 2016-11-01 Irobot Corporation Compact autonomous coverage robot
US11072250B2 (en) 2007-05-09 2021-07-27 Irobot Corporation Autonomous coverage robot sensing
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US20080276408A1 (en) * 2007-05-09 2008-11-13 Irobot Corporation Autonomous coverage robot
US8438695B2 (en) 2007-05-09 2013-05-14 Irobot Corporation Autonomous coverage robot sensing
US20100312390A1 (en) * 2007-05-14 2010-12-09 Robosoft Domestic robot assistant having a rolling chassis
US20150224645A1 (en) * 2009-06-12 2015-08-13 Samsung Electronics Co., Ltd. Robot cleaner and control method thereof
US20100313364A1 (en) * 2009-06-12 2010-12-16 Samsung Electronics Co., Ltd. Robot cleaner and control method thereof
US9037294B2 (en) * 2009-06-12 2015-05-19 Samsung Electronics Co., Ltd. Robot cleaner and control method thereof
US9844876B2 (en) * 2009-06-12 2017-12-19 Samsung Electronics Co., Ltd. Robot cleaner and control method thereof
US20110125323A1 (en) * 2009-11-06 2011-05-26 Evolution Robotics, Inc. Localization by learning of wave-signal distributions
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US8800107B2 (en) 2010-02-16 2014-08-12 Irobot Corporation Vacuum brush
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
CN102525416A (en) * 2010-10-15 2012-07-04 真茂科技股份有限公司 Automatic remove health care device
US11157015B2 (en) 2010-12-30 2021-10-26 Irobot Corporation Coverage robot navigating
US10152062B2 (en) 2010-12-30 2018-12-11 Irobot Corporation Coverage robot navigating
US9436185B2 (en) 2010-12-30 2016-09-06 Irobot Corporation Coverage robot navigating
US9958873B2 (en) 2011-04-11 2018-05-01 Crown Equipment Corporation System for efficient scheduling for multiple automated non-holonomic vehicles using a coordinated path planner
WO2013007741A1 (en) 2011-07-11 2013-01-17 Alfred Kärcher Gmbh & Co. Kg Self-propelling floor cleaning device
DE102011051729A1 (en) * 2011-07-11 2013-01-17 Alfred Kärcher Gmbh & Co. Kg Self-propelled floor cleaning device
US9580285B2 (en) 2011-08-26 2017-02-28 Crown Equipment Corporation Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US10611613B2 (en) 2011-08-26 2020-04-07 Crown Equipment Corporation Systems and methods for pose development using retrieved position of a pallet or product load to be picked up
US10429851B2 (en) 2012-09-21 2019-10-01 Irobot Corporation Proximity sensing on mobile robots
US8862271B2 (en) 2012-09-21 2014-10-14 Irobot Corporation Proximity sensing on mobile robots
WO2014047557A1 (en) * 2012-09-21 2014-03-27 Irobot Corporation Proximity sensing on mobile robots
AU2013317738B2 (en) * 2012-09-21 2015-05-07 Irobot Corporation Proximity sensing on mobile robots
US9442488B2 (en) 2012-09-21 2016-09-13 Irobot Corporation Proximity sensing on mobile robots
CN104331078A (en) * 2014-10-31 2015-02-04 东北大学 Multi-robot cooperative localization method based on position mapping algorithm
US10353400B2 (en) * 2016-05-23 2019-07-16 Asustek Computer Inc. Navigation system and navigation method
US11525921B2 (en) 2018-04-03 2022-12-13 Sharkninja Operating Llc Time of flight sensor arrangement for robot navigation and methods of localization using same
CN112442958A (en) * 2020-12-08 2021-03-05 苏州优智达机器人有限公司 Method for enabling unmanned equipment to pass through channel blocking device, unmanned equipment and system

Similar Documents

Publication Publication Date Title
US5276618A (en) Doorway transit navigational referencing system
US5111401A (en) Navigational control system for an autonomous vehicle
US5758298A (en) Autonomous navigation system for a mobile robot or manipulator
US5525882A (en) Method and system for maneuvering a mobile robot
US4821192A (en) Node map system and method for vehicle
US5812267A (en) Optically based position location system for an autonomous guided vehicle
US4829442A (en) Beacon navigation system and method for guiding a vehicle
US20180129217A1 (en) Navigation Of Mobile Robots Based On Passenger Following
US7873438B2 (en) Mobile apparatus and control program therefor
JP4368317B2 (en) Mobile robot
IL100633A (en) Large area movement robot
JP2001515237A (en) Docking method of autonomous motion unit using guidance beam
JP6828579B2 (en) Environmental maintenance robot and its control program
JP4530996B2 (en) Mobile robot
JP2009237851A (en) Mobile object control system
JP4377347B2 (en) Mobile robot
Hamner et al. An efficient system for combined route traversal and collision avoidance
Shoval et al. Implementation of a Kalman filter in positioning for autonomous vehicles, and its sensitivity to the process parameters
Aman et al. A sensor fusion methodology for obstacle avoidance robot
JP4377346B2 (en) Mobile robot
Roth et al. Navigation and docking manoeuvres of mobile robots in industrial environments
Castro et al. Obstacle avoidance in local navigation
WO1995029380A1 (en) Navigation system for fast automated vehicles and mobile robots
JP4368318B2 (en) Mobile robot
JP6863049B2 (en) Autonomous mobile robot

Legal Events

Date Code Title Description
AS Assignment

Owner name: UNITED STATES OF AMERICA, THE, AS REPRESENTED BY THE SECRETARY OF THE NAVY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNOR:EVERETT, HOBART R. JR.;REEL/FRAME:006297/0319

Effective date: 19920226

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
FP Lapsed due to failure to pay maintenance fee

Effective date: 19980107

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362