WO2001038049A1 - Providing information of moving objects - Google Patents

Providing information of moving objects

Info

Publication number
WO2001038049A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
image
imaging apparatus
speed
images
Prior art date
Application number
PCT/GB2000/004445
Other languages
French (fr)
Inventor
Mika Eino Antero Laitinen
Jarno Santeri Haapasaari
Seppo Olli Antero LEPPÄJÄRVI
Original Assignee
Robotic Technology Systems Plc
Priority date
Filing date
Publication date
Application filed by Robotic Technology Systems Plc filed Critical Robotic Technology Systems Plc
Priority to AU15325/01A priority Critical patent/AU1532501A/en
Publication of WO2001038049A1 publication Critical patent/WO2001038049A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41815 Total factory control characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
    • G05B19/4182 Total factory control characterised by the cooperation between manipulators and conveyor only
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The present invention relates to a method and apparatus for providing information on moving objects (2) by an imaging apparatus (3). According to the embodiments, the speed of the objects is also determined based on information provided by the imaging apparatus. The imaging apparatus (3) is adapted to generate a plurality of images of the objects. When processing an image of said plurality of images, it is determined whether the image is the first image in which an object is present based on the determined speed of the objects.

Description

PROVIDING INFORMATION OF MOVING OBJECTS
Field of the Invention
The present invention relates to provision of information of moving objects, and in particular, but not exclusively, to provision of information by a machine vision system of objects moved by a conveyor.
Background of the Invention
Handling and/or processing of various objects is a commonplace in various fields of industry. The objects may be, for example, any workpieces, tools, goods, pallets or similar articles that are to be handled in industrial or commercial processes. Objects may also need to be moved, e.g. during manufacturing or transporting operations, by an appropriate conveyor or similar apparatus arranged to move the objects. In several applications there also exists a need to pick the moving objects from a conveyor or similar and/or to move the objects to another location and/or to process the objects further e.g. by subjecting the objects to predefined packaging, machining or finishing operations.
During the further processing of the object it may be necessary to know one or more of the characteristics of the object, such as the position, orientation, shape or size of the object, so that it is possible for example to grip and move the object from one location to another. Machine vision systems may be used for providing the required information on the objects to be handled and/or processed. More detailed examples of such (machine) vision systems are disclosed in international publications Nos. WO95/00299 and WO97/17173.
When employing a vision system an object carried by a conveyor may be detected and/or recognised by an imaging apparatus of the vision system. The object may then be subjected to further processing based on information on the object provided by the vision system. The vision system arrangement may even be such that an object and/or predefined characteristic information of an object is detected while the object is moving. After the detection the object may, for example, be picked from the conveyor by appropriate means, such as a gripping mechanism of a robot or a manipulator or similar actuator, and moved to a desired next location or stage of processing. In other words, the machine vision based systems typically operate such that an object is detected and imaged by means of an imaging apparatus, whereafter the object is recognised and predefined characteristics thereof are determined. After the determination of the information that is required for processing the object further, the object may be processed, for example gripped by appropriate gripping means. The further processing may include operations such as machining or packaging that utilise the information received from the imaging of the object.
The imaging apparatus is typically arranged to take a plurality of subsequent picture frames from its imaging area as the objects move, i.e. flow, past the imaging area. Thus the object flow is divided into several slices or frames, each presenting a still image of the object or objects within the imaging area at a given moment. The division of the object flow into the subsequent still image "slices" may be referred to as slicing. Two subsequent slices usually overlap so that none of the objects on the conveyor will be missed. The interval between the subsequent slices is typically optimised such that all objects will become imaged with a certain reliability while only a minimum number of images is needed.
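The overlap condition described above can be made concrete with simple arithmetic. The following minimal sketch is an illustration rather than anything specified in the patent: it assumes a known conveyor speed, the length of the imaging area along the direction of travel, and the length of the largest expected object, and computes the longest grab interval that still lets every object appear whole in at least one slice.

```python
def max_grab_interval(area_length_mm: float,
                      max_object_length_mm: float,
                      conveyor_speed_mm_s: float) -> float:
    """Longest interval between two grabs such that subsequent
    'slices' overlap enough for every object to appear whole in
    at least one image. Between grabs the belt advances
    speed * interval; if that advance exceeds the imaging-area
    length minus the largest object, an object can straddle two
    frames without being fully inside either."""
    usable_travel_mm = area_length_mm - max_object_length_mm
    if usable_travel_mm <= 0:
        raise ValueError("imaging area shorter than largest object")
    return usable_travel_mm / conveyor_speed_mm_s

# Example: a 400 mm imaging area, 120 mm objects and a belt moving
# at 250 mm/s allow grab intervals of up to 1.12 s.
print(max_grab_interval(400.0, 120.0, 250.0))
```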
A conveyor apparatus may operate at a relatively high speed and thus the interval between the subsequent images should be relatively short. With short imaging intervals it is, however, possible that an object becomes imaged twice, i.e. an object may be found in two subsequent images. In addition, the vision system may detect the same object twice because of the shape and/or size of the object. This may occur e.g. when some of the objects are relatively big and require a relatively wide imaging area and a short interval, while some of the other objects to be processed by the same system are of a relatively small size. If a double detection occurs, the vision system may then output information that, instead of one actual object, the conveyor carries two (or even more) objects. The subsequent processing apparatus will then operate as if the conveyor carried two (or even more) objects. For example, a picking device may then accomplish a so-called "double-pick" operation. The second pick would, however, be a "ghost" pick, as the real object has already been picked.
The reason why double-picking may occur is that a controller of the object handling and/or processing system may find the same object in two (or more) different images which were taken shortly one after another. For example, as the conveyor has moved the object between the images, the object may lie in the first image on a top or front part of the image and in the second image the same object may lie on a bottom or rear part of the image. The picking device will then get two search results, based on the first and the second images respectively, and will thus perform two picking operations. This unnecessarily consumes the resources and working time of the picking device. In addition, while the device makes a "ghost" pick, a following real object may simultaneously pass the picking area, so that the following object will not become picked at all. Therefore it would be advantageous to reduce the number of any useless "ghost" operations of the handling and/or processing apparatus.
Summary of the Invention
It is an aim of the embodiments of the present invention to address one or several of the above problems.
According to one aspect of the present invention, there is provided a method of providing information on moving objects by an imaging apparatus, comprising: determining the speed of the objects based on information provided by the imaging apparatus; generating a plurality of images of the objects; and determining whether an image is the first image of an object based on the determined speed of the objects.
According to another aspect of the present invention there is provided an apparatus for providing information of moving objects, comprising: an imaging apparatus arranged to generate a plurality of images of the objects moving past an imaging area of the imaging apparatus; and a controller arranged to determine the speed of the objects by processing information that is based on images generated by the imaging apparatus and to determine if an object exists in more than one image of the plurality of images based on the determined speed of the objects.
The speed of the objects may be determined based on information provided by two images taken by the imaging apparatus. The direction of movement of the objects may also be determined based on information provided by the imaging apparatus. The information on which the speed and/or direction determination is based and the plurality of images of the objects may be provided with a common imaging device. The imaging apparatus may comprise at least one camera. If it is determined that the image is not the first image of the object, the object in the image may be ignored and no further control instructions concerning the object may be generated. The determination of whether the image is the first image including the object may comprise computing a theoretic location of the object in an image and verifying whether the theoretic location of the object and the real location of the object in the image match with a predefined accuracy. The accuracy may be adjustable.
In order to make the recognition process faster an object in an image may be recognized based on determining predefined information of the boundary of the object from an image of the object. At least one characteristic of an object may also be determined based on determining predefined information of the boundary of the object.
The embodiments of the present invention provide several advantages. By providing a controller of a system processing an object with information concerning the manner the object moves it is possible to reduce the risk that a handling/processing apparatus subjects an operation twice to the same object, e.g. to reduce the possibility of an operation where a robot tries to pick an object twice. The same imaging apparatus that is employed for recognizing and/or for determining the characteristics of the object may be used for the determination of the parameters relating to the movement of the object.
Brief Description of Drawings
For better understanding of the present invention, reference will now be made by way of example to the accompanying drawings in which:
Figure 1 shows one embodiment of the present invention; Figures 2a and 2b disclose two subsequent images of an imaging area of an imaging apparatus; and
Figure 3 is a flowchart illustrating the operation of one embodiment of the present invention.
Description of Preferred Embodiments of the Invention
Reference is made to Figure 1 which shows a schematic presentation of an embodiment of the present invention. The system includes a conveyor 1 for supporting and moving objects 2. The objects are moved at speed v in a direction from left to right as is indicated by the arrow 10. Even though Figure 1 shows a belt conveyor, the skilled person is familiar with many other possible types of conveyors that could be used for conveying objects. These include chain conveyors and conveyors in which the objects are moved below the conveyor structure, e.g. are supported by appropriate hangers. Thus it is to be appreciated that the embodiments of the invention can be applied to any type of conveying arrangements adapted to move objects.
The system further includes an imaging apparatus comprising a camera 3. Various possibilities for the imaging apparatus are known, including, without being restricted to these, cameras such as CCD (Charge Coupled Device) matrix cameras, progressive scan cameras, CCIR cameras (CCIR is a European standard for machine vision cameras employing a resolution of 768x576 pixels) and RS170 cameras (a North American standard with a resolution of 640x480 pixels), as well as laser and infrared imaging applications. The camera 3 is arranged to image objects 2 on the belt 1 that are within an imaging area 40 between dashed lines 4a and 4b (see also Figs. 2a and 2b).
In Figure 1 a single camera 3 is shown disposed above the conveyor 1. However, the position and general arrangement of the imaging apparatus may differ from this, and the embodiments of the invention are applicable in various possible imaging apparatus variations in which the number and positioning of the imaging apparatus and its components can be freely chosen. The position of the imaging device 3 is typically chosen such that it is possible to detect desired points of the objects for reliable detection of the objects advancing on the belt. Thus the camera may also be positioned at the side of the conveyor or even below the conveyor. The vision system may also be provided with more than one camera, e.g. in applications where three-dimensional images of the objects are produced or great accuracy is required.
The exemplifying system of Figure 1 further includes a robot 5 for picking the objects from the conveyor 1. More particularly, the objects are picked by gripping means 6 of the robot 5. It is to be understood that the robot 5 is only a preferred example of a possible subsequent handling and/or processing device. Any suitable actuator device may be used for the further processing of the objects 2 after they have been imaged by the imaging apparatus 3.
A control unit 7 is also shown. The control unit is arranged to process information received from the imaging apparatus 3 via connection 8 and to control the operation of the robot 5 via connection 9. The controller unit includes the required data processing capability, and may be based on microprocessor technology. For example, the controller may be based on a Pentium™ processor, even though a less or more powerful processor may also be employed depending on the requirements of the system and the objects to be handled. Depending on the application, the controller 7 may be provided with appropriate memory devices, drives, display means, a keyboard, a mouse or other pointing device and any adapters and interfaces that may be required. In addition, appropriate imaging software is typically required. The controller may also be provided with a network card to enable installations requiring communication over a data network. The controller may be adapted for communication e.g. based on TCP/IP networking (Transmission Control Protocol/Internet Protocol) or over a local area network (LAN).
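As an illustration of such a networked installation, the sketch below sends one pick instruction from the controller to a robot over a TCP connection. The patent does not specify any wire protocol, so the message schema, host and port here are invented for the example.

```python
import json
import socket

def send_pick_command(host: str, port: int,
                      x_mm: float, y_mm: float, angle_deg: float) -> None:
    """Send a single pick instruction as one newline-terminated
    JSON message. The keys and units are purely illustrative."""
    message = json.dumps({"cmd": "pick", "x": x_mm, "y": y_mm,
                          "angle": angle_deg}) + "\n"
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(message.encode("utf-8"))

# Hypothetical robot controller listening on the local network:
# send_pick_command("192.168.0.50", 5000, 312.5, 88.0, 45.0)
```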
A double pick checking procedure in accordance with an embodiment of the present invention will be discussed now with reference to Figures 2a and 2b. Figures 2a and 2b show two subsequent images from an imaging area 40 and a flow of objects 21, 2, and 22 through the imaging area 40. As can be seen, the object 2 can be found from both of these images. However, the second image of Figure 2b should produce information concerning the next object 22 only.
The procedure employed herein is based on the idea that by providing the controller 7 of the system with information concerning the manner in which the conveyor 1 moves, and thus how an object may appear in the different images taken by the camera 3, it is possible to prevent the robot 5 from trying to pick the same object 2 twice. After the determination procedure by the controller, the robot will be provided with only one position determination result per object and will thus not attempt to make more than one pick. More particularly, if the system is provided with predefined information concerning the movement of the conveyor and the objects on the conveyor (direction and speed v), it is possible to calculate whether the object 2 in the second image (as shown in Figure 2b) is in reality the same object 2 that already appeared in the first image (as shown in Figure 2a). The speed of the objects may be calculated and given in pixels/second or in "real" units, e.g. mm/second. A possibility for the determination of the direction and speed of the objects will be discussed below.
A direction vector may be used to define the direction in which the objects on the conveyor belt move. For example, component I of the vector may define the movement in the horizontal direction, and a positive value of said vector component may indicate movement from left to right, or vice versa. Component J of the vector may define the movement in the vertical direction. A positive value of the component J may indicate movement from top to bottom, or vice versa.
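In code, such a vector can be stored as its I and J components and normalised to unit length, so that it carries direction only and the speed remains a separate scalar. A minimal sketch follows; the function name and coordinate convention are assumptions for illustration.

```python
import math

def direction_vector(i: float, j: float) -> tuple[float, float]:
    """Normalise the (I, J) components to a unit vector. In image
    coordinates a positive I is taken as left-to-right movement
    and a positive J as top-to-bottom movement."""
    norm = math.hypot(i, j)
    if norm == 0.0:
        raise ValueError("zero-length direction vector")
    return (i / norm, j / norm)

# A belt moving straight from left to right in the image:
print(direction_vector(50.0, 0.0))   # (1.0, 0.0)
```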
The following will give, with reference to the flow chart of Figure 3, a more precise example of the procedure for an automated conveyor direction vector and speed determination. During the automatic conveyor direction and speed determination (i.e. calibration) procedure, two images will be taken, i.e. grabbed, consecutively. To keep the procedure simple, both images should preferably contain only one object, even though this is not a requirement in all embodiments. Based on the imaging interval and the object positions in the images, the conveyor direction and speed can be determined by appropriate computations based on the length of the interval and the distance the object has travelled.
The determination of the speed and direction parameters may comprise the following steps. Stop the conveyor 1, put a test object in the imaging area 40 and grab an image of the test object. Define the characteristics of the object from the image so that the system will know what kind of objects to look for during an automatic conveyor direction and speed determination operation. The auto-determination procedure will then wait until a "matching" object comes to the imaging area along the conveyor, record the object coordinates and wait for a specified time. After the time has elapsed, a second image is grabbed and the new position of the object is recorded. The imaging interval between the two subsequent images is determined. For example, if the imaging apparatus is a camera that is positioned above a conveyor such that the top and bottom edges of the imaging area (4a and 4b, respectively) are perpendicular to the direction of movement of the objects carried by the conveyor, the first image may be grabbed when the object moves into the top of the imaging area. The second image will then be grabbed after a 'grab interval' (for example, some milliseconds) has elapsed. After this procedure the controller of the system will be provided with the required information so that it is possible to define the length of travel the object has moved during the 'grab interval'. The speed may be determined by dividing the length of travel by the 'grab interval' in appropriate units. The test or "calibrating" object may then be taken away from the conveyor belt, and the actual processing of the objects may start. According to one possibility the calibration of the speed information is accomplished during normal operation of the conveyor, and the object is not taken away but will be processed in a predefined manner after recognition.
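Numerically, the auto-determination reduces to recording the position of the same object in two grabs taken a known interval apart and dividing displacement by time. A minimal sketch of that computation, assuming the object positions have already been extracted from the two images (names and units are illustrative):

```python
import math

def calibrate_conveyor(pos1: tuple[float, float],
                       pos2: tuple[float, float],
                       grab_interval_s: float) -> tuple[float, tuple[float, float]]:
    """Derive the conveyor speed and a unit direction vector from
    the positions of the same test object in two images grabbed
    grab_interval_s apart. Positions may be in pixels or in real
    units (e.g. mm); the speed inherits those units per second."""
    dx = pos2[0] - pos1[0]
    dy = pos2[1] - pos1[1]
    distance = math.hypot(dx, dy)
    if distance == 0.0:
        raise ValueError("object did not move between the grabs")
    speed = distance / grab_interval_s
    direction = (dx / distance, dy / distance)   # unit (I, J) vector
    return speed, direction

# Test object found at (100, 40), and 0.2 s later at (150, 40):
speed, direction = calibrate_conveyor((100.0, 40.0), (150.0, 40.0), 0.2)
print(speed, direction)   # 250.0 units/s, moving left to right
```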
After the controller of the system is provided with the speed and direction information, it may verify whether an object in an image is in reality an object that was already processed based on information from the previous image. This may be done by computing the "theoretic" location of the object 2 in the second image (Fig. 2b) based on the determined speed and direction information of the objects. After the "theoretic" location of the object 2 in the second image is computed, an object appearing in that location in the second image may be ignored. In other words, based on the conveyor direction vector and the detected speed it is possible to calculate whether the object has moved to the current position from a position which has already been reported to a robot. If the double pick checking procedure finds that the calculated old position matches the actual old position with a predefined accuracy, the new object position will not be sent to the robot. Thus, when the system finds that the object in the second image is in fact the same object as in the first image, it will not include the object in the search results of the second image, and will not process the information of the "ghost" object any further. For example, no instructions are given to the robot 5 of Figure 1 based on the second results. By means of this the robot 5 will get only one set of characteristics (e.g. position, shape and so on) for each object, provided by the first image, and any "ghost" pickings may be avoided.
It is possible to accomplish some further checks, such as whether the shape and/or size of the object at the computed theoretic location in the second image corresponds to the particular object in the first image, before ignoring the object in the second image.
It is also possible to define the accuracy level of the double or "ghost" action preventing system. This may be done e.g. by means of a tolerance value that defines how much the calculated theoretical location of the object and the actual position of the object in the second image may differ from each other before the location of the object in the second image is considered to be different. The location may be different e.g. because the object is not the same object as in the first image or because the object has, for some reason, moved relative to the conveyor. If the location of the object differs by more than what the tolerance value allows, the object may be considered a "real" object, or some other procedure may follow. The tolerance may be given in pixels or in real units, e.g. in millimetres. It may also be preferred in some embodiments to be able to adjust the accuracy level, which can be accomplished by changing the tolerance value.
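The checking logic of the three preceding paragraphs can be sketched as follows: project every object already reported from the previous image forward by speed times interval along the direction vector, and drop any detection in the new image that falls within the tolerance of a projected position. This is an illustrative reconstruction, not the patent's implementation; the data layout and helper names are assumptions.

```python
import math

def filter_ghosts(previous_positions, current_detections,
                  speed, direction, interval_s, tolerance):
    """Return only the genuinely new detections.

    previous_positions: (x, y) of objects already reported to the robot.
    current_detections: (x, y) of objects found in the current image.
    speed, direction:   conveyor speed and unit (I, J) vector from
                        calibration, in the same units as the positions.
    tolerance:          adjustable maximum distance between the
                        theoretic and the actual location for a
                        detection to count as the same object.
    """
    travel = speed * interval_s
    theoretic = [(x + direction[0] * travel, y + direction[1] * travel)
                 for (x, y) in previous_positions]
    new_objects = []
    for (x, y) in current_detections:
        is_ghost = any(math.hypot(x - tx, y - ty) <= tolerance
                       for (tx, ty) in theoretic)
        if not is_ghost:
            new_objects.append((x, y))   # only these go to the robot
    return new_objects

# The object reported at (150, 40) should have moved 50 units between
# grabs, so the detection near (200, 40) is a "ghost" and is dropped,
# while (60, 42) is a new object and is kept.
print(filter_ghosts([(150.0, 40.0)], [(199.0, 41.0), (60.0, 42.0)],
                    speed=250.0, direction=(1.0, 0.0),
                    interval_s=0.2, tolerance=5.0))
```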
According to a preferred embodiment the imaging is based on a method where only the boundary of the object is imaged and analysed, and wherein the object is recognised based on the boundary characteristics. This enables faster processing, as the computing capacity and time required for the boundary detection and computations are less than if the entire object area is analysed on a pixel-by-pixel basis. According to one possibility the actual object related data is retrieved from an object database based on the recognition of predefined points on the boundary, and this retrieved data is then used in the actual handling and/or processing of the object.
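As one illustration of why boundary-based recognition is cheap, a boundary can be reduced to a short signature, for example the radial distances from the object's centroid to a fixed number of sampled boundary points, and two signatures can then be compared in time proportional to the number of samples. The sketch below is such an illustration only (it ignores rotation alignment for brevity) and is not taken from the patent.

```python
import math

def boundary_signature(boundary, samples: int = 32):
    """Radial-distance signature of a closed boundary: distances
    from the centroid to evenly sampled points of the outline.
    boundary: list of (x, y) points tracing the object's edge."""
    cx = sum(x for x, _ in boundary) / len(boundary)
    cy = sum(y for _, y in boundary) / len(boundary)
    step = max(1, len(boundary) // samples)
    return [math.hypot(x - cx, y - cy) for x, y in boundary[::step]][:samples]

def signatures_match(sig_a, sig_b, tolerance: float) -> bool:
    """Coarse comparison: the mean absolute difference of the two
    signatures stays below the tolerance."""
    n = min(len(sig_a), len(sig_b))
    diff = sum(abs(a - b) for a, b in zip(sig_a, sig_b)) / n
    return diff < tolerance
```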
The above example described the use of a test or "calibrating" object for the calibration procedure. However, it is also possible to provide the conveyor with a special marking, such as a cross, spot or line extending perpendicular to the conveyor belt, or to arrange a special sign or marking on some of the objects to be conveyed through the imaging area. According to one possibility the system automatically detects the speed of the conveyor, e.g. from a marking provided on the conveyor, at predefined intervals, and adaptively adjusts the speed information whenever the speed has changed. The arrangement may also be such that whenever the special marking or pattern is detected (either on the belt, chain or similar element of the conveyor or on an object), a speed calibration procedure will occur.
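A marking-based recalibration of this kind could run continuously: each time the marking is observed in two consecutive grabs, a fresh speed estimate is computed and blended into the stored value. The following sketch assumes such a scheme; the class, its interface and the smoothing factor are all illustrative choices.

```python
class AdaptiveSpeed:
    """Running conveyor-speed estimate, refreshed whenever the
    special marking is detected in two consecutive grabs. The
    smoothing factor alpha trades responsiveness for stability."""

    def __init__(self, initial_speed: float, alpha: float = 0.3):
        self.speed = initial_speed
        self.alpha = alpha

    def update(self, marking_pos1: float, marking_pos2: float,
               interval_s: float) -> float:
        """marking_pos1/2: positions of the marking along the
        direction of travel, in the same units as the speed."""
        fresh = abs(marking_pos2 - marking_pos1) / interval_s
        self.speed = (1 - self.alpha) * self.speed + self.alpha * fresh
        return self.speed

# estimator = AdaptiveSpeed(250.0)
# estimator.update(40.0, 92.0, 0.2)   # marking moved 52 units in 0.2 s
```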
It was already noted above that the embodiments are applicable also if several cameras or other imaging devices are used in the vision system. The two or more imaging devices may also be monitoring different conveyors. Each of the imaging devices may have its own double pick checking system, or the arrangement may be such that a common controller controls the operation of each of the separate imaging devices.
A message may be shown to the user to confirm a successful auto-determination procedure. If the process fails, an error message may be shown.
It should be appreciated that whilst embodiments of the present invention have been described in relation to the picking of objects, embodiments of the present invention are applicable to any other type of operations where there may be a need for providing information on moving objects so that it is possible to reduce the amount of any unnecessary double or "ghost" operations following a "double" detection of an object. It is also noted herein that while the above describes exemplifying embodiments of the invention, there are several variations and modifications which may be made to the disclosed solution without departing from the scope of the present invention as defined in the appended claims.

Claims

Claims
1. A method of providing information on moving objects by an imaging apparatus, comprising: determining the speed of the objects based on information provided by the imaging apparatus; generating a plurality of images of the objects; and determining whether an image is the first image of an object based on the determined speed of the objects.
2. A method according to claim 1, wherein the speed of the objects is determined based on information provided by two images taken by the imaging apparatus.
3. A method according to claim 2, wherein a predefined pattern is detected in said two images, and the speed of the objects is calculated based on the time difference between the images and the difference in the location of the pattern in the images.
4. A method according to claim 3, wherein the pattern comprises an object or a part of an object moved past the imaging area of the imaging apparatus.
5. A method according to claim 3, wherein the objects are moved past the imaging area of the imaging apparatus by a conveyor apparatus and the pattern comprises a marking or sign provided in the conveyor apparatus.
6. A method according to any of the preceding claims, comprising determination of the direction of movement of the objects based on information provided by the imaging apparatus.
7. A method according to claim 6, wherein the direction of movement is defined by a vector having two vector components normal to each other.
8. A method according to any of the preceding claims, wherein the information for the speed determination and the plurality of images of the objects are provided with a common imaging device.
9. A method according to any of the preceding claims, wherein the imaging apparatus comprises a camera.
10. A method according to any of the preceding claims, wherein, if it is determined that the image is not the first image of the object, the object in the image is ignored and no further control instructions concerning the object are generated.
11. A method according to any of the preceding claims, wherein an object is recognized based on determining predefined information of the boundary of the object from an image taken from the object by the imaging apparatus.
12. A method according to any of the preceding claims, wherein at least one characteristic of an object is determined based on determining predefined information of the boundary of the object from the image taken from the object by the imaging apparatus.
13. A method according to claim 12, wherein the characteristics comprise at least one of: the position of the object; the orientation of the object; the size of the object; and the shape of the object.
14. A method according to any of the preceding claims, wherein the determination of whether the image is the first image of the object comprises computing a theoretic location of the object in an image and verifying whether the theoretic location of the object and the real location of the object in the image match with a predefined accuracy.
15. A method according to claim 14, comprising adjusting the predefined accuracy.
16. A method according to any of the preceding claims, wherein the speed of the objects is determined while the objects are moved, and the speed information forming one of the bases for determining whether an image is the first image of an object is adjusted adaptively based on the determined speed.
17. An apparatus for providing information of moving objects, comprising: an imaging apparatus arranged to generate a plurality of images of the objects moving past an imaging area of the imaging apparatus; and a controller arranged to determine the speed of the objects by processing information that is based on images generated by the imaging apparatus and to determine if an object exists in more than one image of the plurality of images based on the determined speed of the objects.
18. An apparatus according to claim 17, wherein the objects are moved by a conveyor apparatus and processed by an actuator apparatus, and the controller is arranged to control the operation of said at least one actuator apparatus based on information from the imaging apparatus.
19. An apparatus according to claim 17 or 18, wherein the controller is arranged to determine a theoretic location of the object in an image based on the speed of the objects and to verify whether the theoretic location of the object and the real location of the object in the image match with a predefined accuracy.
PCT/GB2000/004445 1999-11-23 2000-11-22 Providing information of moving objects WO2001038049A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU15325/01A AU1532501A (en) 1999-11-23 2000-11-22 Providing information of moving objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9927679.2 1999-11-23
GB9927679A GB2356699A (en) 1999-11-23 1999-11-23 Providing information of moving objects

Publications (1)

Publication Number Publication Date
WO2001038049A1 (en) 2001-05-31

Family

ID=10864995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2000/004445 WO2001038049A1 (en) 1999-11-23 2000-11-22 Providing information of moving objects

Country Status (3)

Country Link
AU (1) AU1532501A (en)
GB (1) GB2356699A (en)
WO (1) WO2001038049A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITRM20090483A1 (en) * 2009-09-22 2011-03-23 Gentili Enrico METHOD AND CONTROL SYSTEM.
US10471478B2 (en) 2017-04-28 2019-11-12 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007049702A1 (en) * 2007-10-17 2009-04-23 Robert Bosch Gmbh picking line
DE102010014105A1 (en) 2010-04-07 2011-10-13 Siemens Aktiengesellschaft Method and device for measuring objects during transport
JP2012187651A (en) * 2011-03-09 2012-10-04 Omron Corp Image processing apparatus, image processing system, and guidance apparatus therefor
FI20115326A0 (en) * 2011-04-05 2011-04-05 Zenrobotics Oy Procedure for canceling sensor measurements after a picking function in a robotic system
EP3349951B1 (en) * 2015-09-17 2023-07-12 ABB Schweiz AG A component feeder and a system for picking components comprising the component feeder

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4669168A (en) * 1984-11-05 1987-06-02 Nissan Motor Company, Limited Method and system for automatically attaching works onto vehicle bodies carried on a conveyor
US4876728A (en) * 1985-06-04 1989-10-24 Adept Technology, Inc. Vision system for distinguishing touching parts
US5063524A (en) * 1988-11-10 1991-11-05 Thomson-Csf Method for estimating the motion of at least one target in a sequence of images and device to implement this method
US5467402A (en) * 1988-09-20 1995-11-14 Hitachi, Ltd. Distributed image recognizing system and traffic flow instrumentation system and crime/disaster preventing system using such image recognizing system
US5687249A (en) * 1993-09-06 1997-11-11 Nippon Telephone And Telegraph Method and apparatus for extracting features of moving objects
US5757287A (en) * 1992-04-24 1998-05-26 Hitachi, Ltd. Object recognition system and abnormality detection system using image processing

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5040056A (en) * 1990-01-29 1991-08-13 Technistar Corporation Automated system for locating and transferring objects on a conveyor belt
US5065237A (en) * 1990-08-01 1991-11-12 General Electric Company Edge detection using patterned background
DE19601005A1 (en) * 1996-01-15 1997-07-17 Bosch Gmbh Robert Process for the detection of moving objects in successive images

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4669168A (en) * 1984-11-05 1987-06-02 Nissan Motor Company, Limited Method and system for automatically attaching works onto vehicle bodies carried on a conveyor
US4876728A (en) * 1985-06-04 1989-10-24 Adept Technology, Inc. Vision system for distinguishing touching parts
US5467402A (en) * 1988-09-20 1995-11-14 Hitachi, Ltd. Distributed image recognizing system and traffic flow instrumentation system and crime/disaster preventing system using such image recognizing system
US5063524A (en) * 1988-11-10 1991-11-05 Thomson-Csf Method for estimating the motion of at least one target in a sequence of images and device to implement this method
US5757287A (en) * 1992-04-24 1998-05-26 Hitachi, Ltd. Object recognition system and abnormality detection system using image processing
US5687249A (en) * 1993-09-06 1997-11-11 Nippon Telephone And Telegraph Method and apparatus for extracting features of moving objects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OKADA S ET AL: "AUTOMATIC IDENTIFICATION OF CONVEYER-TRANSFERRED PARTS THROUGH IMAGE DATA PROCESSING", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INDUSTRIAL ELECTRONICS, CONTROL AND INSTRUMENTATION (IECON), US, NEW YORK, IEEE, vol. CONF. 20, 5 September 1994 (1994-09-05), pages 719-722, XP000525407, ISBN: 0-7803-1329-1 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITRM20090483A1 (en) * 2009-09-22 2011-03-23 Gentili Enrico METHOD AND CONTROL SYSTEM.
US10471478B2 (en) 2017-04-28 2019-11-12 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same
US11090689B2 (en) 2017-04-28 2021-08-17 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same
US11858010B2 (en) 2017-04-28 2024-01-02 United Parcel Service Of America, Inc. Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same

Also Published As

Publication number Publication date
AU1532501A (en) 2001-06-04
GB2356699A (en) 2001-05-30
GB9927679D0 (en) 2000-01-19

Similar Documents

Publication Publication Date Title
US9604365B2 (en) Device and method of transferring articles by using robot
CN109955222B (en) Article transfer device, robot system, and article transfer method
CN107597600B (en) Sorting system and method for sorting
US8442668B2 (en) Handling system, work system, and program
CN110431093B (en) Article picking and placing system of split robot
EP1748339A2 (en) Workpiece tracking and handling device comprising a conveying means and a plurality of robots
US20080082213A1 (en) Workpiece picking apparatus
WO2009148089A1 (en) Handling apparatus, control device, control method, and program
US10521871B2 (en) Robot system
JP3296643B2 (en) Component supply method and device
JP2002299889A (en) Device and method for mounting electronic component
JP2009291895A (en) Handling apparatus, control device, and control method
WO2001038049A1 (en) Providing information of moving objects
WO2020047575A1 (en) Vision system for a robotic machine
CN109454004B (en) Robot scanning and sorting system and method
EP4015097A1 (en) Picking device
KR100651361B1 (en) Apparatus for mounting components and method thereof
JP2019018339A (en) Robot system
JPH0735527A (en) Device for recognition of position and shape of object to be conveyed
JP7436170B2 (en) robot system
CN115003613A (en) Device and method for separating piece goods
EP1569776A1 (en) Method and arrangement to avoid collision between a robot and its surroundings while picking details including a sensorsystem
JP7332201B2 (en) Conveyor control system and vibrating conveyer
US11697210B2 (en) Robot system
JP7448328B2 (en) Mechanical system that detects missed workpieces

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase