CA2416073A1 - System and method for inspecting containers with openings - Google Patents

System and method for inspecting containers with openings Download PDF

Info

Publication number
CA2416073A1
CA2416073A1 CA002416073A CA2416073A CA2416073A1 CA 2416073 A1 CA2416073 A1 CA 2416073A1 CA 002416073 A CA002416073 A CA 002416073A CA 2416073 A CA2416073 A CA 2416073A CA 2416073 A1 CA2416073 A1 CA 2416073A1
Authority
CA
Canada
Prior art keywords
contrast
grayscale
image
processing
defects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002416073A
Other languages
French (fr)
Inventor
Jeff Hooker
Timothy Hebert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Intelligent Machine Concepts LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelligent Machine Concepts LLC filed Critical Intelligent Machine Concepts LLC
Publication of CA2416073A1 publication Critical patent/CA2416073A1/en
Abandoned legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/90Investigating the presence of flaws or contamination in a container or its contents
    • G01N21/909Investigating the presence of flaws or contamination in a container or its contents in opaque containers or opaque container parts, e.g. cans, tins, caps, labels

Abstract

A system and method of the present invention allows the inspection of an object having an annular opening, such as a container or can, and uses a plurality of cameras that acquire grayscale images of arcuate sectors that together comprise a circumferential area within the opening. A processor is connected to the plurality of cameras for receiving the grayscale images and determining defects within the container based on contrast differences and grayscale intensity. The invention allows smart analysis of defects using qualitative inspections and quantitative measurements at higher speeds.

Description

SYSTEM AND METHOD FOR INSPECTING CONTAINERS WITH OPENINGS

Field of the Invention

This invention relates to the field of inspection, and more particularly, this invention relates to the field of inspecting containers such as cans with open tops.
Background of the Invention

Container inspection is becoming increasingly important as customers demand increased quality and lower cost products. The use of computers in the inspection process has increased these requirements.
Some systems use computerized imaging systems. For example, one system uses a conversion of image data into binary values to search for defects. U.S. Patent No. 4,924,107 to Tucker discloses a system for inspecting cans where a plurality of horizontal regions on the inside surface of an object, such as an aluminum beverage can, are inspected. This system converts images to binary values and processes the values. Other systems have similar binary value algorithms.
Greater control, however, over the can inspection process in the internal areas of the can, such as near the top, is desired using similar metrology and grayscale analysis without complicated binary conversion algorithms.

WO 96/41299 has the drawback that it uses a template as a known image and compares an acquired image line-by-line and pixel-by-pixel with the known image as a template. Thus, some poor lighting conditions would create a defect, and errors would be amplified with this straight pattern matching system.
As to document D1 (US-A-5412203), it uses a comparison and subtraction and no advanced processing or artificial intelligence. In high speed testing or in defect testing of many variables, it is inadequate.

Summary of the Invention

The present invention is advantageous and provides a non-contact, on-conveyor, real-time, 100% gauging and defect detection system for a container, such as beverage containers and cans. The system of the present invention identifies random production anomalies that make cans rejectable and detects internal mechanical defects, surface defects, and contamination with foreign objects on the full 360° interior of the can at line speeds up to 3,000 CPM. Neck pucker, hole-in-sidewall, thumb dents, sidewall creases, ink stains and missing internal coatings can be determined and cans rejected if necessary.
In accordance with the present invention, the system inspects an object, such as a container or can, having an annular opening, and uses a plurality of cameras that acquire grayscale images of arcuate sectors that together comprise a circumferential area within the opening. A processor is connected to the plurality of cameras for receiving the grayscale images and determining defects within the container based on contrast differences in the grayscale intensity.
Another camera can be positioned substantially above the opening for acquiring a top grayscale image of the opening.
A system and method of the present invention allows the inspection of an object having an annular opening, such as a container or can, and uses a plurality of cameras that acquire grayscale images of arcuate sectors that together comprise a circumferential area within the opening. A processor is connected to the plurality of cameras and receives the grayscale image data. It determines defects within the container using a pipeline process of independent processing stages on image data with a rules based knowledge base.
It determines true and false hits in a first processing stage based on a low sensitivity that is set with the contrast of the grayscale images. Contrast regions are determined in a subsequent processing stage in computer memory space from the hits. Contrast pairs are grouped by proximity. Groups of contrast changes are recorded in a knowledge base, while processing image data to determine how much light is in a specific region and determine if adequate light is present. In a judgment stage, the image data is processed in a rules based knowledge base to determine defects as coherent structural defects.
In still another aspect of the present invention, a memory is associated with the processor for storing within a knowledge base of a computer memory the various rules for determining when a defect exists based on the contrast differences and grayscale intensity. The cameras can comprise charge coupled device (CCD) cameras. A strobe light is positioned at the inspection station for illuminating the container
opening as containers are fed by a conveyor that advances a plurality of cylindrical containers along a predetermined path of travel into the inspection station.
An eject mechanism ejects a container from the conveyor after determining that any defects are serious enough to make the container defective. This eject mechanism could comprise an air blow off mechanism.
In still another aspect of the present invention, the conveyor includes a vacuum assist mechanism to aid in retaining containers onto the conveyor by vacuum. A method is also disclosed for acquiring a plurality of grayscale images of arcuate surface sectors that together comprise a circumferential area inside the opening of the object and then processing the grayscale images to determine defects in the objects. The object can comprise the container, can or other object.
Brief Description of the Drawings

Other objects, features and advantages of the present invention will become apparent from the detailed description of the invention which follows, when considered in light of the accompanying drawings in which:
FIG. 1 is an isometric view of the system of the present invention showing an overall conveyor and inspection station, where an operator monitors and controls the inspection process.
FIGS. 2A and 2B show the conveyor with a vacuum draw for aiding in holding containers onto the conveyor, and a sensor mechanism.
FIG. 3 is an isometric view of a portion of the inspection station showing wide angle lenses and cameras that acquire images from the side and top.
FIG. 4 is another isometric view similar to FIG. 3 and taken from a view looking toward the bottom of the can.
FIG. 5 is an elevation view of the cameras and lenses shown in FIG. 3.
FIG. 6 is a top plan view of the inside of the inspection station showing the side cameras.
FIG. 7 is a type of user screen having images that could be displayed and showing the four side images and a top image.
FIGS. 8A and 8B through 14A and 14B are visual images and process images showing examples of various defects.
FIG. 15 is an overall block diagram showing the software algorithm as it works in a "pipeline" and parallel processing format.
FIG. 16 is a flow chart illustrating the acquisition stage of the software that acquires an image.
FIG. 17 is a flow chart illustrating the processing stage.
FIG. 18 is a flow chart illustrating the processing of the side image.
FIG. 19 is a flow chart illustrating the processing of a top image.
FIG. 20 is a flow chart illustrating the decision making process for accepting or rejecting the container after the previous acquisition and processing stages.
Detailed Description of the Preferred Embodiments

The present invention is advantageous and provides an inspection system for containers, such as the normal 12-ounce aluminum beverage cans that are formed as two pieces, with the formed body and a top later added. The system provides an inspection module that effectively and reliably identifies random production anomalies and rejects those cans that are considered to be defective. The system effectively detects internal mechanical damage and defects, surface-finish defects and contamination and foreign objects on the full, 360° interior of the can at line speeds up to 3,000 CPM.
Using a processor, such as a PC and memory for a historical database and knowledge base, current and historical data and other defect data can be displayed on a graphical user interface (GUI), or sent to a networked SPC system for off-line analysis. As described below and shown in the figures, the system can reliably differentiate between visual anomalies and defects that concern neck pucker, hole-in-sidewall, thumb dents, sidewall creases, ink stains and missing internal coatings, such as a lack of enamel coating.
The system of the present invention uses four side cameras 22, each having an attached wide angle lens 24 (FIG. 6), and a top camera 26 positioned for obtaining a top image 28 (FIG. 5). Each side camera 22 acquires a grayscale image of an arcuate sector, and together the sectors comprise a circumferential area within the opening.
Thus, the cameras (FIG. 6) acquire quadrant sectors, each a little over 90° with about 20° of overlap with the adjacent grayscale image acquired from adjacent cameras. If only three cameras were used, there would be less overlap and the angular image spread would be greater.
Thus, four cameras have been found acceptable to obtain the necessary grayscale images. However, more than five cameras can also be used.
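As a rough aid to the coverage geometry just described (this formula is an illustration, not one stated in the patent), the angular span each side camera must image follows from dividing the circumference among the cameras and adding the desired overlap; the camera counts and overlap value below are placeholder assumptions.

```cpp
#include <iostream>

// Sketch: angular coverage each side camera needs so that N cameras, each
// overlapping its neighbour by `overlapDeg`, together cover the full 360°
// circumference inside the can opening.  The overlap value is a placeholder.
double requiredSectorDegrees(int cameraCount, double overlapDeg) {
    return 360.0 / cameraCount + overlapDeg;
}

int main() {
    std::cout << "4 cameras: " << requiredSectorDegrees(4, 20.0) << " deg each\n";
    std::cout << "3 cameras: " << requiredSectorDegrees(3, 20.0) << " deg each\n";
    return 0;
}
```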
The top grayscale image is acquired from a top camera (FIGS. 5 and 6) and the wide angle lens. A
processor is connected to the cameras for receiving the grayscale images, and determining defects within the container based on contrast differences and grayscale intensity.
Referring now to FIG. 1, the overall large components used in the system and method of the present invention are illustrated. A conveyor 10 holds the cans in vertical orientation and adjacent to each other, and advances the plurality of cans C along a predetermined path of travel defined by the conveyor 10 into an inspection station generally indicated at 12.
Although the term "cans" is used throughout the description, other containers could be cylindrical, and other configurations with openings can be inspected by the system. The inspection station 12 can be a separate unit that mounts over the conveyor 10 and is bolted to a floor 14. An operator console, such as a keypad 16 and/or touch screen 20, can be mounted on the inspection station, although it is not necessary. The conveyor could be mounted on an appropriate frame 22 and suspension as known to those skilled in the art.
The processor 23, such as a personal computer, could be mounted exterior to the unit or within the unit, as illustrated.
In one aspect of the present invention, the conveyor 10 includes a number of segments having vacuum holes (FIGS. 2A and 2B) that connect to a vacuum system to allow vacuum to be drawn from the top surface of the conveyor to retain a can C against the top surface. The cans also typically include a bottom bevel, although it is not necessary for the present invention, such that when two cans are positioned in close proximity and touch each other, an open triangular area is formed that can be used with a sensor 25 to indicate the presence of cans (FIGS. 2A and 2B).
The conveyor 10 could be belt-driven to move the cans. Additionally, the vacuum could apply only minimal drawing force for can stability only, and cans could be advanced by pressure exerted from adjacent cans on a more stationary conveyor. It is also possible to use a conveyor that forces air upward against the cans, such that each can "floats" on a conveyor. Thus, if a straight line stationary conveyor is used, or a curved stationary conveyor with side rails, the cans "float" on the conveyor and are pushed along the air cushioned conveyor and into the inspection station 12. Other conveyors could be used as suggested by those skilled in the art for moving cans into the inspection station.
A number of different sensors 25 could be used, such as a through beam sensor (not shown) (FIGS. 2A and 2B), such that when the "open" or triangular area defined by adjacent bottom bevels of the two cans passes through the beam sensor, the light passes from the light source through an area defined by the triangular area and is received by a light receptor. This type of sensor could be connected to the processor 23 to perform the functions of the system and method of the present invention and indicate automatically the presence of a can.
At the inspection station, at least one light source 40 illuminates the interior of the can, and in one example, the source is a xenon strobe instead of a more conventional prior art LED strobe. As shown in FIG. 3, a mounting plate 42 is positioned at the inspection station and the xenon strobe is mounted in the center portion of the mounting plate 42. The xenon strobe is mounted on a ring support member 44, and the top camera 26 and its wide angle lens extend downward through the ring support member, as shown in FIG. 4.
The ring support member 44 is mounted by a support bracket 46 to the mounting plate.
Each of the side and top cameras is a charge coupled device (CCD) camera and works with pinhole lenses. These cameras operate to NTSC standards and typically have two frames that are interlaced at 1/60 second per frame, with two frames per image.
However, it is possible to operate the CCD cameras in the present invention in frame mode and acquire one of the frames such that the images are at half resolution down to 1/60 second, thus acquiring an image every 16 milliseconds. Naturally, other cameras could be used and the acquisition time can vary by standards known to those skilled in the art. Line scan cameras, frame cameras and other cameras can also be used. With a vertical sync pulse of 1/60 second and 256 lines, the present invention could allow up to 3,000 cans per minute. It is possible, for example, to move down to 64 lines and obtain 360 images a second, but there may be data problems in a regular type of processor that could be used in the present invention with a normal personal computer operating on a 133 megahertz bus.
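As a quick back-of-the-envelope check (an illustration, not a calculation taken from the patent), an acquisition time of roughly 16.7 ms per image fits within the 20 ms-per-can budget implied by 3,000 CPM:

```cpp
#include <iostream>

// Rough feasibility check under assumed numbers: does a given per-can image
// acquisition time keep up with a target line speed in cans per minute (CPM)?
bool acquisitionKeepsUp(double acquisitionMs, double targetCpm) {
    double msPerCan = 60000.0 / targetCpm;   // time budget per can on the line
    return acquisitionMs <= msPerCan;
}

int main() {
    // One field at 1/60 s is roughly 16.7 ms; 3,000 CPM allows 20 ms per can.
    double fieldMs = 1000.0 / 60.0;
    std::cout << std::boolalpha
              << acquisitionKeepsUp(fieldMs, 3000.0) << "\n";   // prints "true"
    return 0;
}
```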
Although any number of processors could be used in the present invention, as known to those skilled in the art, a regular personal computer that is mounted for aesthetic and functional reasons within the base of the inspection station frame (FIG. 1) is acceptable as long as it has adequate processing power and data transfer capability as described above.
As shown in FIGS. 3-6, the side cameras 22 are mounted on side mounting brackets 48 and include the wide angle lens that points at an acute angle into the interior of the can at the top flange area (f) defined by the can. The cameras are connected to the processor 23.
The side mounting brackets 48 also include a swivel mounting bracket 48a to allow the cameras to be oriented at different angles, and also swivelled at different angles, as shown by the arrows in FIG. 3.
Side mounting brackets 48 are slidable upon lateral supports 48b mounted on the underside of the mounting plate.
FIG. 7 illustrates the type of computer images that can be displayed on a computer screen using the four side cameras and top camera. Naturally the images can be displayed as part of a graphical user interface of a computer screen, and can include other information and data entry boxes (not shown) to allow further data processing on the captured images. It is also possible to display other image data as known by one skilled in the art on the graphical user interface.
FIGS. 8A-14B show dual side-by-side images where visual images are shown on the left side, and the processed images, as seen by the computer, are shown on the right side. FIGS. 8A and 8B show neck pucker.
FIGS. 9A and 9B show a hole-in-sidewall. FIGS. 10A and 10B show a sidewall crease. FIGS. 11A and 11B show a neck dent and ink stain. FIGS. 12A and 12B show an ink stain. FIGS. 13A and 13B show a thumb dent and ink stain. FIGS. 14A and 14B show a missing coating/partial spray, such as when an enamel spray has been missed.
The circles indicated at 50 are considered hits, but not considered defects, until they are processed in groups of contrast pairs, shown by the circles indicated at 52, as explained below. The sensitivity could be set very low and many false hits could be registered. The sensitivity is set with the contrast of the grayscale images. For a hit to be considered a defect, it must meet the criteria and have neighbors, a certain contrast and contrast-pairs.
Referring now to FIGS. 15-20, there are illustrated flow charts for the software algorithm that is used in the system of the present invention for inspecting containers, such as a can. Although the description of the software algorithm for the system proceeds with reference to the inspection of containers and cans, such as the common 12 ounce aluminum formed cans, the software algorithm can be used in the inspection of different articles and containers of different configurations and different types, as known to those skilled in the art.
The software algorithm can be written in different software languages, as known to those skilled in the art. In the present example, however, C++ has been found to be an advantageous language for the software. The software algorithm establishes a "pipeline" process of three processing stages as shown in FIG. 15, and uses a two pass filter as explained below. Each of the three stages functions independently, but simultaneously. While the third stage is working on a first group of images, the second stage is working on a second group of images and the first stage is working on a third group of images. A single group of images must be processed linearly through the pipeline.
In the illustrated figures, each of the three stages processes five images per stage, i.e., the four side images and the one top image, as shown in FIG. 7. It should be understood that the software algorithm operates in a pipeline process with the acquired groups of images. The number of images per group depends on the type of hardware used for acquiring the images, such as what type of CCD cameras or line scan cameras are used. As long as the processor and bus connections have the appropriate processing power and data capability, then a larger number of images can be acquired, if necessary.
For purposes of clarity, reference numerals used in describing the system software begin in the one hundred (100) series.
As shown in FIG. 15, and as noted before, the external hardware 100 includes the CCD cameras having associated vision processors 102 that process each acquired image. Each time there is a hardware trigger, the five cameras "grab" or acquire a respective image, totaling five images, which are passed into the software processing pipeline and processed in parallel.
In the first acquire stage 104, the image queue 106 is a memory buffer as part of a PC memory space 107. Enough memory space has been allocated to hold a set number of images, and in one example, 30 sets of images. The number of images that can be held in the image queue 106 can vary, depending on the processing speed and type of images, as known to those skilled in the art. In the second processing stage 108, the knowledge base 110 contains the rules for determining a flawed can.

The third judgment stage 112 interacts with the PC memory space 107, the knowledge base 110 and the eject mechanism hardware 114 to eject a can if the system has determined that a can or other container is flawed. Throughout these three stages, the first acquire stage 104 interacts with the vision processors 102 and PC memory space 107 containing the image queue 106. The process stage 108 interacts only with the PC
memory spaces of the image queue and the knowledge base. The judgment stage 112 interacts with the PC
memory space 107, the knowledge base 110, and the eject mechanism hardware 114.
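A minimal sketch of how such a three-stage pipeline can be decoupled by a bounded image queue is shown below. The class names, the queue capacity of 30 image sets, and the empty thread bodies are assumptions for illustration, not the patent's actual implementation.

```cpp
#include <condition_variable>
#include <deque>
#include <mutex>
#include <thread>
#include <vector>

// Minimal sketch (assumed structure) of the FIG. 15 pipeline: an acquire stage
// pushes image sets into a bounded queue in PC memory, a process stage turns
// them into contrast groups recorded in the knowledge base, and a judgment
// stage reads the knowledge base and fires the eject hardware.
struct GrayImage { std::vector<unsigned char> pixels; int width = 0, height = 0; };
struct ImageSet  { std::vector<GrayImage> images; };   // one group: 4 side + 1 top

class BoundedQueue {
public:
    explicit BoundedQueue(std::size_t capacity) : capacity_(capacity) {}
    void push(ImageSet s) {
        std::unique_lock<std::mutex> lk(m_);
        notFull_.wait(lk, [&] { return q_.size() < capacity_; });
        q_.push_back(std::move(s));
        notEmpty_.notify_one();
    }
    ImageSet pop() {
        std::unique_lock<std::mutex> lk(m_);
        notEmpty_.wait(lk, [&] { return !q_.empty(); });
        ImageSet s = std::move(q_.front());
        q_.pop_front();
        notFull_.notify_one();
        return s;
    }
private:
    std::deque<ImageSet> q_;
    std::size_t capacity_;
    std::mutex m_;
    std::condition_variable notEmpty_, notFull_;
};

int main() {
    BoundedQueue imageQueue(30);   // e.g. room for 30 image sets, as in the text

    // Each stage runs independently but simultaneously: while judgment handles
    // one can, processing works on the next and acquisition grabs the one after.
    std::thread acquire([&] { /* wait for trigger, grab 5 images, push a set */ });
    std::thread process([&] { /* pop a set, run contrast filter, record groups */ });
    std::thread judge  ([&] { /* read knowledge base, tally, signal blow-off   */ });

    acquire.join(); process.join(); judge.join();
    return 0;
}
```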
The algorithm steps that are operative for the first acquire stage 104 are shown at FIG. 16. The system determines if a hardware trigger has been received (block 120) corresponding to a decision that a can is available for image acquisition. If there is no external trigger corresponding to a decision that a can is available, this step repeats until there is a trigger. If there is a trigger, the five cameras acquire all images (block 122), and if all images are acquired, the vision processors are reset (block 124).
The software again waits for the external trigger.
If all images are not obtained, then any single images that have been obtained are acquired from the vision processors (block 126). This could be any number (n) of images, I1, I2, I3 ... In. In the present hardware configuration illustrated in FIGS. 1-4, there are five cameras, and thus, 1 ≤ n ≤ 5. The software determines if In is a valid image (block 128).
If In is a valid image, then those images In are placed into the image queue (block 130). If the images In are not valid images, then a blank image is placed into the image queue (block 132). A blank image is used if a number of cameras capture good images and they are analyzed and processed, but one or more other images are blank, for example, such as when camera no. 5 (for example) is inoperable. A fifth image is retained as a blank image and the processor and software will not concern itself with processing that blank image at a later stage.
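A simplified sketch of the acquire-stage behavior just described is given below, assuming a hypothetical frame-grabber interface (the VisionProcessor type and its methods are invented stand-ins, since the patent does not describe the camera API): valid images are queued as acquired and a blank placeholder is substituted for any camera that fails.

```cpp
#include <vector>

// Sketch of the FIG. 16 acquire stage under an assumed camera interface.
struct GrayImage { std::vector<unsigned char> pixels; bool blank = false; };

struct VisionProcessor {
    bool grab(GrayImage& out) {                 // stub: pretend the grab succeeded
        out.pixels.assign(256 * 256, 128);
        return true;
    }
    void reset() {}                             // stub: rearm for the next trigger
};

std::vector<GrayImage> acquireImageSet(std::vector<VisionProcessor>& cameras) {
    std::vector<GrayImage> set;
    for (auto& cam : cameras) {
        GrayImage img;
        if (cam.grab(img) && !img.pixels.empty()) {
            set.push_back(img);                 // valid image goes into the queue
        } else {
            GrayImage blank;                    // failed camera: blank placeholder,
            blank.blank = true;                 // skipped by the later stages
            set.push_back(blank);
        }
    }
    for (auto& cam : cameras) cam.reset();      // reset the vision processors
    return set;
}

int main() {
    std::vector<VisionProcessor> cameras(5);    // four side cameras plus one top
    std::vector<GrayImage> set = acquireImageSet(cameras);
    return set.size() == 5 ? 0 : 1;
}
```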
FIG. 17 illustrates the second process stage 108. In the first step, a determination is made if the image queue 106 is empty. If the image queue is empty (block 134), then the software loops back and continues to look at the image queue to determine if it is empty. If the image queue is not empty, then the image is removed from the image queue (block 136) and a determination is made whether the image is a side image (block 138) based upon information stored in the knowledge base concerning the side image processing.
If the image is a side image, then the image is processed as a side image (block 140), as shown in the algorithm.
The side image processing first determines if the image is blank (block 142). If it is a blank image, then the software records "no groups" to the knowledge base (block 144). For example, for every can entering into the system for inspection, there is a set of information such that the judgment thread in the software determines whether to keep the can or reject the can. This is done to determine if there is can data to record in the knowledge base.
If the image is not blank, then the contrast filter is run (block 146), checking contrast based on gray scale. This is a software tool used to analyze the image space-by-space in segments, i.e., blocks of image data. Thus, the software analyzes blocks of the image data and, one at a time, determines first whether there are high contrast regions (block 148). If there are high contrast regions in the gray scale analysis, the software counts that region as a hit. Thus, the software algorithm determines if there are any high contrast regions, and if yes, it determines the contrast pairs from the different hits (block 150).
After the filter is run and the system marks the high contrast regions in the memory space of the memory array, the system determines whether there are any high contrast regions. The system software groups the contrast pairs by proximity (block 152). These high contrast regions are analyzed using gray scale imaging. For example, if the image intensity of the gray scale of blocks of image data go from white to black, that is a positive contrast difference. If the gray scale image goes from black to white, there is a negative contrast difference.
The system analyzes group pairs of approximately equal contrast that are close to each other, and if there is a defect, for example, in the neck of the can, the contrast will change from a regular dark contrast to light wherever the defect is, and then be dark again at the regular or normal can surface. Thus, the algorithm determines the changes in contrast intensity. If there are groups of contrast changes (block 154), then the groups are recorded in the knowledge base (block 156). In the previous steps, if there have been no high contrast regions, then the processor records "no groups" to the knowledge base.
If, after grouping the contrast pairs by proximity, there are no groups, then the processor records "no groups" to the knowledge base. In any event, at this stage, side image processing stops (block 158).
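The side-image processing described above can be pictured as the following sketch: a block-wise contrast filter produces hits, hits of roughly equal but opposite contrast that lie close together become contrast pairs, and nearby pairs are grouped before being recorded. The block size, sensitivity, distance thresholds and helper names are assumptions for illustration; the patent does not specify them.

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

// Illustrative sketch of the FIG. 18 side-image processing chain.
struct Hit   { int block;  int contrast; };     // signed: white->black positive
struct Pair  { Hit a, b; };
struct Group { std::vector<Pair> pairs; };

// Mean grayscale per fixed-size block along one band of image data.
std::vector<int> blockMeans(const std::vector<unsigned char>& row, int blockSize) {
    std::vector<int> means;
    for (std::size_t i = 0; i + blockSize <= row.size(); i += blockSize) {
        int sum = 0;
        for (int j = 0; j < blockSize; ++j) sum += row[i + j];
        means.push_back(sum / blockSize);
    }
    return means;
}

// A hit is a block-to-block grayscale change exceeding the sensitivity;
// white -> black is counted positive, black -> white negative.
std::vector<Hit> contrastFilter(const std::vector<int>& means, int sensitivity) {
    std::vector<Hit> hits;
    for (std::size_t i = 1; i < means.size(); ++i) {
        int diff = means[i - 1] - means[i];
        if (std::abs(diff) >= sensitivity)
            hits.push_back({static_cast<int>(i), diff});
    }
    return hits;
}

// Pair hits of roughly equal but opposite contrast that are close together,
// e.g. dark -> light entering a defect and light -> dark leaving it.
std::vector<Pair> contrastPairs(const std::vector<Hit>& hits,
                                int maxDistance, int maxContrastDelta) {
    std::vector<Pair> pairs;
    for (std::size_t i = 0; i < hits.size(); ++i)
        for (std::size_t j = i + 1; j < hits.size(); ++j)
            if (hits[j].block - hits[i].block <= maxDistance &&
                hits[i].contrast * hits[j].contrast < 0 &&
                std::abs(std::abs(hits[i].contrast) - std::abs(hits[j].contrast))
                    <= maxContrastDelta)
                pairs.push_back({hits[i], hits[j]});
    return pairs;
}

// Group pairs that lie near one another; a group of contrast changes, rather
// than a lone hit, is what gets recorded to the knowledge base as a candidate.
std::vector<Group> groupByProximity(const std::vector<Pair>& pairs, int maxGap) {
    std::vector<Group> groups;
    for (const Pair& p : pairs) {
        if (!groups.empty() &&
            p.a.block - groups.back().pairs.back().a.block <= maxGap)
            groups.back().pairs.push_back(p);
        else
            groups.push_back({{p}});
    }
    return groups;
}

int main() {
    std::vector<unsigned char> row(256, 40);     // mostly dark can wall
    for (int i = 96; i < 128; ++i) row[i] = 200; // bright patch: a mock defect
    auto groups = groupByProximity(
        contrastPairs(contrastFilter(blockMeans(row, 16), 60), 4, 30), 8);
    return groups.empty() ? 1 : 0;               // one group expected here
}
```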
If the software determines that there are no side images (block 138, FIG. 17), then the image is processed as a top image (block 141), as shown in the top image processing algorithm flow chart of FIG. 19.
After processing the side image and/or top image, the judgment stage is signaled (block 141A).
A light meter is run (block 160), which is a software application to determine how much light is in a specific region. The light meter is operated on the image data itself. The system software can incorporate light meter software as known to those skilled in the art. If there is too much light (block 162), then the image is recorded to reflect the container or can as a "no spray" defect (block 164). In this instance, a portion or all of the inside of the can has no enamel or other protective coating, and at least one camera has been receiving an image of raw aluminum, which floods the image intensity out through massive reflection. It is termed a "no spray" defect corresponding to the lack of enamel spraying.
If there is not excessive light, then the system determines if there is inadequate light (block 166). If there is inadequate light, then the label image is tagged a "no image" (block 168). "No image"
can occur, for example, when a strobe light has failed.
The third judgment stage 112 analyzes this information and raises a flag indicating that there is a problem, especially if subsequent images are processed linearly, each having too little light. The same could occur if there is too much light, in which case the images are referenced as corresponding to surfaces with a "no spray" defect.
The system determines the hits (block 170), and if there are any hits (block 172), these hits are recorded as defects (block 174). If not, then the processor records "no defects" (block 176). At this point, the processing stage stops (block 178) and the judgment stage is then signaled and processing begins in the third judgment stage as shown in FIG. 20.
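A compact sketch of the top-image decision logic just described is given below; the two light-meter thresholds and the function name are illustrative assumptions, since the patent describes the behavior but not specific values.

```cpp
#include <numeric>
#include <string>
#include <vector>

// Sketch of the FIG. 19 top-image processing: a "light meter" averages the
// grayscale intensity; far too much light is read as raw, uncoated aluminum
// ("no spray"), far too little as a failed strobe ("no image"); otherwise any
// hits are recorded as defects.  Thresholds are placeholder assumptions.
std::string classifyTopImage(const std::vector<unsigned char>& pixels,
                             int hitCount,
                             int tooBright = 220, int tooDark = 20) {
    if (pixels.empty()) return "no image";
    long sum  = std::accumulate(pixels.begin(), pixels.end(), 0L);
    long mean = sum / static_cast<long>(pixels.size());   // the light-meter value

    if (mean >= tooBright) return "no spray";   // bare metal floods the image
    if (mean <= tooDark)   return "no image";   // e.g. the strobe did not fire
    return hitCount > 0 ? "defects" : "no defects";
}

int main() {
    std::vector<unsigned char> flooded(1000, 240);   // mock raw-aluminum image
    return classifyTopImage(flooded, 0) == "no spray" ? 0 : 1;
}
```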
The system software determines if a judgment stage signal is caught (block 180). If no signal is caught, then the system software continues in a loop to determine when a judgment stage signal is received. If the signal is received and caught, the recorded data is read from the knowledge base (block 182). If there is a "no image" reference (block 184), then the tally for "no images" is incremented (block 186), in order to keep track of the cans. If not, then the system software determines if there is a "no spray" data reference (block 188), and if there is, then the "no spray" tally is incremented (block 190). A signal blow off occurs (block 192) to eject the can at this point.
If the "no spray" determination (block 188) is negative, then the system determines if there are any top image defects (block 194), and if there are, the top defect tally is incremented (block 196) and the signal generated to blow off and eject the can (block 192). If there are no top image defects, then the system determines if there have been any side image defects (block 198), and if there have been, then the side defect tally is incremented (block 200) and the signal generated to blow off and eject the can (block 192). If there are no side image defects, then the signal blow off is not signaled and the can remains on line for further processing (block 202).
It is evident that the algorithm is advantageous with its use of filters and decision processes to determine the positive and negative contrasts, and is more advantageous than prior art units.
Many modifications and other embodiments of the invention will come to the mind of one skilled in the art having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed, and that the modifications and embodiments are intended to be included within the scope of the appended claims.

Claims (11)

THAT WHICH IS CLAIMED IS:
1. A system for inspecting an object having an annular opening and being characterized by:
a plurality of cameras (24) that acquire grayscale image data of arcuate sectors that together comprise a circumferential area within the opening; and a processor (27) connected to the plurality of cameras for receiving the grayscale image data and determining defects within the container using a pipeline process of independent processing stages on image data with a rules based knowledge base by determining true and false hits in a first processing stage based on a low sensitivity that is set with contrast of grayscale images, and in a subsequent processing stage, determining contrast regions in a computer memory space from the hits and grouping contrast pairs by proximity, analyzing contrast pairs of about equal contrast that are close to each other to determine changes in contrast intensity, recording groups of contrast changes in a knowledge base while also processing image data to determine how much light is in a specific region to determine if adequate light was present, and in a judgment stage, processing the image data in a rules based knowledge base to determine defects as coherent structural defects.
2. A system according to Claim 1, and further comprising a camera (26) positioned substantially above the opening for acquiring a top grayscale image of the opening.
3. A system according to Claim 2, wherein said processor determines defects in the container from the top grayscale image and the grayscale images of arcuate sectors simultaneously.
4. A system according to Claim 1, and further comprising a memory associated with said processor for storing within a knowledge base rules for determining when a defect exists based on the contrast differences in grayscale intensity.
5. A system according to Claim 1, wherein said cameras comprise charge coupled device cameras.
6. A method of inspecting an object having an opening and being characterized by the steps of:
acquiring grayscale image data of arcuate surface sectors that together comprise a circumferential area inside the opening of the object;
and processing the grayscale image data to determine defects of the object using a pipeline process of independent processing stages on image data with a rules based knowledge base by determining true and false hits in a first processing stage based on a low sensitivity that is set with contrast of grayscale images, and in a subsequent processing stage, determining contrast regions in a computer memory space from the hits and grouping contrast pairs by proximity, analyzing contrast pairs of about equal contrast that are close to each other to determine changes in contrast intensity, recording groups of contrast changes in a knowledge base while also processing image data to determine how much light is in a specific region to determine if adequate light was present, and in a judgment stage, processing the image data in a rules based knowledge base to determine defects as coherent structural defects.
7. A method according to Claim 6, and further comprising the step of acquiring the grayscale images simultaneously.
8. A method according to Claim 6, and further comprising the step of processing the grayscale images simultaneously in parallel.
9. A method according to Claim 6, and further comprising the step of determining contrast differences in the grayscale intensity of each grayscale image for determining defects.
10. A method according to Claim 6, and further comprising the step of acquiring each grayscale image with a camera.
11. A method according to Claim 6, and further comprising the step of acquiring a top grayscale image of the opening.
CA002416073A 2000-07-18 2001-06-28 System and method for inspecting containers with openings Abandoned CA2416073A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/618,344 2000-07-18
US09/618,344 US6525333B1 (en) 2000-07-18 2000-07-18 System and method for inspecting containers with openings with pipeline image processing
PCT/US2001/020568 WO2002006799A1 (en) 2000-07-18 2001-06-28 System and method for inspecting containers with openings

Publications (1)

Publication Number Publication Date
CA2416073A1 true CA2416073A1 (en) 2002-01-24

Family

ID=24477310

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002416073A Abandoned CA2416073A1 (en) 2000-07-18 2001-06-28 System and method for inspecting containers with openings

Country Status (6)

Country Link
US (1) US6525333B1 (en)
EP (1) EP1301775A1 (en)
JP (1) JP2004525340A (en)
AU (1) AU2001270227A1 (en)
CA (1) CA2416073A1 (en)
WO (1) WO2002006799A1 (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6864498B2 (en) * 2001-05-11 2005-03-08 Orbotech Ltd. Optical inspection system employing a staring array scanner
ATE441850T1 (en) * 2001-11-16 2009-09-15 Heineken Supply Chain Bv METHOD AND APPARATUS FOR GENERATING A ROBUST REFERENCE IMAGE OF A CONTAINER AND SELECTING A CONTAINER
US20030179920A1 (en) * 2002-03-13 2003-09-25 Intelligent Machine Concepts, L.L.C. Inspection system for determining object orientation and defects
JP2004151025A (en) * 2002-10-31 2004-05-27 Teruaki Ito Test tube type discriminating device
US8014586B2 (en) * 2007-05-24 2011-09-06 Applied Vision Corporation Apparatus and methods for container inspection
US7971497B2 (en) * 2007-11-26 2011-07-05 Air Products And Chemicals, Inc. Devices and methods for performing inspections, repairs, and/or other operations within vessels
US8601843B2 (en) 2008-04-24 2013-12-10 Crown Packaging Technology, Inc. High speed necking configuration
GB2482473A (en) * 2010-06-29 2012-02-08 Constar Internat Uk Ltd Inspection of articles
CN103492834B (en) * 2010-09-15 2016-03-30 吴乃恩 For detecting automatic utensil and the method for inspection of pivoting part quality
EP2629969B1 (en) * 2010-10-19 2024-03-20 Pressco Technology, Inc. Systems and methods for printing component identification and selected adjustment thereof
US10094785B2 (en) 2011-05-17 2018-10-09 Gii Acquisition, Llc Method and system for optically inspecting headed manufactured parts
US8570504B2 (en) 2011-05-17 2013-10-29 Gii Acquisition, Llc Method and system for optically inspecting parts
US10088431B2 (en) 2011-05-17 2018-10-02 Gii Acquisition, Llc Method and system for optically inspecting headed manufactured parts
US9697596B2 (en) 2011-05-17 2017-07-04 Gii Acquisition, Llc Method and system for optically inspecting parts
US9047657B2 (en) 2011-05-17 2015-06-02 Gii Acquisition, Lcc Method and system for optically inspecting outer peripheral surfaces of parts
US10373470B2 (en) 2013-04-29 2019-08-06 Intelliview Technologies, Inc. Object detection
CN103234981A (en) * 2013-05-14 2013-08-07 天津名唐环保科技有限公司 Detection method and detection system for inner wall of dual-visual metal can
US9555616B2 (en) 2013-06-11 2017-01-31 Ball Corporation Variable printing process using soft secondary plates and specialty inks
CN103512893B (en) * 2013-09-10 2016-08-10 上海东富龙科技股份有限公司 A kind of ampoule bottle head detection device for lamp inspection machine
CN103499586B (en) * 2013-09-10 2017-01-04 上海东富龙科技股份有限公司 A kind of cillin bottle head detection device for lamp inspection machine
JP6200741B2 (en) * 2013-09-24 2017-09-20 アイマー・プランニング株式会社 Can inspection equipment
BR122020021072B1 (en) * 2013-09-24 2024-02-15 Nippon National Seikan Company, Ltd. CAN INSPECTION DEVICE AND CAN INSPECTION SYSTEM CONTAINING THE SAME
CA2847707C (en) 2014-03-28 2021-03-30 Intelliview Technologies Inc. Leak detection
US10943357B2 (en) 2014-08-19 2021-03-09 Intelliview Technologies Inc. Video based indoor leak detection
ES2734983T3 (en) 2014-12-04 2019-12-13 Ball Beverage Packaging Europe Ltd Printing apparatus
CN105760424A (en) * 2016-01-16 2016-07-13 唐山师范学院 Database establishment method for storing key data of enterprise products
US10549921B2 (en) 2016-05-19 2020-02-04 Rexam Beverage Can Company Beverage container body decorator inspection apparatus
WO2018017712A1 (en) 2016-07-20 2018-01-25 Ball Corporation System and method for aligning an inker of a decorator
US11034145B2 (en) 2016-07-20 2021-06-15 Ball Corporation System and method for monitoring and adjusting a decorator for containers
US20180232875A1 (en) * 2017-02-13 2018-08-16 Pervacio Inc Cosmetic defect evaluation

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CH548599A (en) * 1972-01-19 1974-04-30 Emhart Zuerich Sa Crack testing station for the sorting line of a plant for the production of glass containers.
US3923158A (en) 1973-06-20 1975-12-02 Platmanufaktur Ab On-line multistation inspection device for machine moulded products
US4697245A (en) 1984-11-29 1987-09-29 Cbit Corporation Inspection and measuring apparatus and method
JPS61193009A (en) 1985-02-22 1986-08-27 Toyo Glass Kk Inspecting device for top surface of opening of container
AU597485B2 (en) * 1987-04-22 1990-05-31 John Lysaght (Australia) Limited Non-contact determination of the position of a rectilinear feature of an article
US4786801A (en) 1987-07-21 1988-11-22 Emhart Industries Inc. Finish Leak Detector having vertically movable light source
US4882498A (en) 1987-10-09 1989-11-21 Pressco, Inc. Pulsed-array video inspection lighting system
US5051825A (en) * 1989-04-07 1991-09-24 Pressco, Inc. Dual image video inspection apparatus
US4906099A (en) 1987-10-30 1990-03-06 Philip Morris Incorporated Methods and apparatus for optical product inspection
US4924107A (en) * 1988-10-07 1990-05-08 Ball Corporation System for inspecting the inside surfaces of a container for defects and method therefor
JPH041505A (en) 1990-04-18 1992-01-07 Matsushita Electric Ind Co Ltd Three-dimensional position measuring method and acquiring method for work
US5412203A (en) 1991-07-15 1995-05-02 Fuji Electric Co., Ltd. Cylindrical container inner surface tester
US5371690A (en) 1992-01-17 1994-12-06 Cognex Corporation Method and apparatus for inspection of surface mounted devices
US5369713A (en) * 1992-07-09 1994-11-29 Schwartz; Nira Inspection method using area of interest (AOI) analysis
US5440385A (en) 1993-02-05 1995-08-08 Pressco Technology, Inc. Integrated isotropic illumination source for translucent item inspection
US5451773A (en) * 1993-06-04 1995-09-19 Pressco Technology, Inc. Non-contact perforation/pinhole detection system for opaque vessels
US5581632A (en) 1994-05-02 1996-12-03 Cognex Corporation Method and apparatus for ball bond inspection system
US5880772A (en) 1994-10-11 1999-03-09 Daimlerchrysler Corporation Machine vision image data acquisition system
US5764874A (en) 1994-10-31 1998-06-09 Northeast Robotics, Inc. Imaging system utilizing both diffuse and specular reflection characteristics
US5591462A (en) 1994-11-21 1997-01-07 Pressco Technology, Inc. Bottle inspection along molder transport path
US5699152A (en) 1995-04-03 1997-12-16 Alltrista Corporation Electro-optical inspection system and method
WO1996041299A1 (en) 1995-06-07 1996-12-19 Pressco Technology, Inc. Inspection system for exterior article surfaces
US6018562A (en) * 1995-11-13 2000-01-25 The United States Of America As Represented By The Secretary Of The Army Apparatus and method for automatic recognition of concealed objects using multiple energy computed tomography
US5742037A (en) 1996-03-07 1998-04-21 Cognex Corp. Method and apparatus for high speed identification of objects having an identifying feature
US5936353A (en) 1996-04-03 1999-08-10 Pressco Technology Inc. High-density solid-state lighting array for machine vision applications
US5818443A (en) 1996-05-06 1998-10-06 Cognex Corporation Single step coarse registration and inspection of circular objects
US5987159A (en) * 1996-09-24 1999-11-16 Cognex Corporation System or method for detecting defect within a semi-opaque enclosure
JP3614597B2 (en) 1996-10-24 2005-01-26 三菱原子燃料株式会社 Internal imaging device
US5807449A (en) 1997-01-08 1998-09-15 Hooker; Jeffrey A. Workpiece treating apparatus and method of treating same
JP2000055827A (en) * 1998-08-10 2000-02-25 Hitachi Eng Co Ltd Method and device for inspecting mouth part of glass container, etc.

Also Published As

Publication number Publication date
EP1301775A1 (en) 2003-04-16
US6525333B1 (en) 2003-02-25
WO2002006799A1 (en) 2002-01-24
JP2004525340A (en) 2004-08-19
AU2001270227A1 (en) 2002-01-30

Similar Documents

Publication Publication Date Title
US6525333B1 (en) System and method for inspecting containers with openings with pipeline image processing
US5592286A (en) Container flange inspection system using an annular lens
US5699152A (en) Electro-optical inspection system and method
CA2272494C (en) Inspection of containers employing a single area array sensor and alternately strobed light sources
US7329855B2 (en) Optical inspection of glass bottles using multiple cameras
US4924107A (en) System for inspecting the inside surfaces of a container for defects and method therefor
EP1151283B1 (en) Cylindrical object surface inspection system
US7414716B2 (en) Machine for inspecting glass containers
US7330251B2 (en) Method and apparatus for producing reference image in glass bottle inspecting apparatus
US20080093538A1 (en) Machine for inspecting glass containers
US8058607B2 (en) Machine for inspecting glass containers at an inspection station using an addition of a plurality of illuminations of reflected light
US7541572B2 (en) Machine for inspecting rotating glass containers with light source triggered multiple times during camera exposure time
US6519356B1 (en) System and method for inspecting cans
CN111921912A (en) High-precision visual detection system and method for penicillin bottle
US7317524B2 (en) Method and device for detecting surface defects on the neck ring of a transparent or translucent container of revolution
JP4361156B2 (en) Appearance inspection equipment for articles
US7876951B2 (en) Machine for inspecting glass containers
JPH04147045A (en) Surface inspection device
EP1916515B1 (en) Machine for inspecting glass containers
JPH06118026A (en) Method for inspecting vessel inner surface
JP3055322B2 (en) Circular container inner surface inspection device
JPH10160676A (en) Rice grain inspection device
JPH08285791A (en) Visual inspection apparatus for inner surface of deep object
JPH0581697U (en) Appearance inspection device
JPS58184537A (en) Apparatus for detecting defect of glass bottle

Legal Events

Date Code Title Description
FZDE Dead