US20040047140A1 - Programmable illuminator for vision system - Google Patents

Programmable illuminator for vision system

Info

Publication number
US20040047140A1
US20040047140A1 (application US10/657,286)
Authority
US
United States
Prior art keywords
workpiece
view
image
field
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/657,286
Inventor
Kurt Pelsue
Jonathan Ehrmann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electro Scientific Industries Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US10/657,286
Publication of US20040047140A1
Assigned to ELECTRO SCIENTIFIC INDUSTRIES, INC. reassignment ELECTRO SCIENTIFIC INDUSTRIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GSI GROUP CORPORATION, GSI GROUP INC
Assigned to ELECTRO SCIENTIFIC INDUSTRIES, INC. reassignment ELECTRO SCIENTIFIC INDUSTRIES, INC. CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION SERIAL NUMBER 11776904 PREVIOUSLY RECORDED ON REEL 030582 FRAME 0160. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: GSI GROUP CORPORATION, GSI GROUP INC.
Status: Abandoned (current)

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features

Abstract

The portion (40) of a workpiece (14) that an electro-optical system (28) can image is selected by a field-of-view deflector (38), and an array of light sources (42, 44) illuminates the workpiece. As the field of view (40) moves about the workpiece surface, individual sources (42, 44) in the light-source array are so turned on and off that all sources that could be imaged into the field of view by specular reflection are turned off. In this way, proper dark-field illumination is maintained.

Description

    BACKGROUND OF THE INVENTION
  • The present invention is directed to machine vision and in particular to providing illumination for such systems. [0001]
  • Machine vision has been applied to a number of production and testing tasks. In general, workpieces such as printed-circuit boards, integrated-circuit chips, and other articles of manufacture are brought into the field of view of a camera. The camera typically generates an image in digital form, which digital circuitry, normally a microprocessor and related circuitry, processes in accordance with the task to be performed. [0002]
  • In many cases, the workpiece is too large for a practical-sized camera to image with adequate resolution, but this problem is readily solved by taking an image of only a small part of the workpiece at any single time. This yields the requisite resolution, and images of respective segments of the workpiece can be taken as the workpiece is stepped through the camera's field of view. [0003]
  • Although this approach is acceptable in a number of applications, it can be throughput- and accuracy-limiting in some others. There are often practical limits to the speed at which the workpiece can be advanced through the camera's field of view. Additionally, the need for accurate correlation between successive images can impose severe accuracy requirements on the workpiece-advancing system. To a greater or lesser degree, the same limitations apply regardless of whether it is the camera or the workpiece that is moved. [0004]
  • For some applications, a superior solution is to move neither the camera nor the workpiece, but rather to move the camera's field of view by employing deflector mechanisms. Galvanometer-mounted pivoting mirrors, pivoting prisms, and rotating reflector polygons are among the mechanisms commonly employed in optical systems to perform image deflection. Although these still are moving parts, they are ordinarily relatively small and take advantage of optical leverage to change the field of view faster than systems that move the entire workpiece or camera. [0005]
  • Despite this advantage, there is a class of applications to which workers in this field have been slow to apply the field-of-view-deflection approach. One example of this class is the type of application that involves reading laser-scribed marks on workpieces such as semiconductor wafers or electronic-component packages. Marks of that type are hard to detect reliably because they are quite subtle. So considerable effort has been applied to illuminating the workpiece in such a manner as to minimize noise contributed by surface irregularities in non-marked regions. But achieving this result is greatly complicated in systems that use field-of-view deflectors. In systems that move the workpiece or the camera, the illumination apparatus always has the same position with respect to the field of view, so illumination characteristics need to be optimized for that relationship only. In field-of-view-deflector systems, on the other hand, the lighting system would have to be optimized for a wide range of resultant relationships between the lighting system's position and that of the camera's field of view. For some applications, the difficulty of solving this problem has confounded attempts to employ field-of-view deflection. [0006]
  • SUMMARY OF THE INVENTION
  • But we have recognized that imaging results for such systems can be greatly improved by emphasizing the dark-field-illumination aspects of the problem and adapting to it a method previously used to vary dark-field illumination in response to camera-objective changes. [0007]
  • “Dark-field illumination” is an illumination approach that takes advantage of the fact that a specularly reflecting feature in the midst of a diffuse-reflecting background will appear dark if that feature's specular reflection images the main light source outside the camera's field of view. That is, since the angle of reflection of all light striking a specular reflector equals that light's angle of incidence, the reflected light will not pass through the camera's entrance pupil unless that sole angle of reflection yields that result. But light striking the diffusely reflecting background is reflected in a range of angles, so a substantial amount may enter the camera even if the specular-reflection angle would not result in a ray that does. The specularly reflecting feature is therefore readily identified because it appears dark against a lighter background. [0008]
  • Because of this effect, there is a rich store of work directed to dark-field illumination, and we have recognized that properly adapting it can yield significantly improved results for field-of-view-deflection systems. It had long ago been recognized in systems such as that described in U.S. Pat. No. 4,604,648 to Kley that individual elements of a light-source array should be selectively operated in accordance with the particular objective or zoom position of the imaging camera. We have adapted this concept by so operating elements of a light-source array that the position within the array at which one or more light sources is not lit moves around the array as the deflector changes the field of view's position. [0009]
  • Specifically, as the deflector so moves as to change the field of view on the workpiece, we selectively turn off any elements of the source array that will be imaged into the camera's field if specular reflection occurs in the portion of the workpiece within that field of view. As the field of view moves so that an element previously thus imaged no longer is, the element is typically turned on again.[0010]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention description below refers to the accompanying drawings, of which: [0011]
  • FIG. 1 is a diagram that illustrates specular reflection; [0012]
  • FIG. 2 is a similar diagram that illustrates diffuse reflection; [0013]
  • FIG. 3 is a diagram that illustrates specularly reflecting indicia in a diffusely reflecting background; [0014]
  • FIG. 4 is a diagram of a vision system that employs the present invention's teachings; [0015]
  • FIG. 5 is a block diagram of the control system that an application using the FIG. 4 system may employ; [0016]
  • FIG. 6 is a diagram of the processes that the system of FIG. 5 performs; and [0017]
  • FIG. 7 is a diagram of the system that employs the present invention's teachings in a laser-scribing system.[0018]
  • DETAILED DESCRIPTION OF AN ILLUSTRATIVE EMBODIMENT
  • Before we describe the system of the present invention, we briefly review the concept of dark-field illumination. In FIG. 1, an incident ray 12 strikes a specularly reflecting surface 14 at an angle θi with respect to the normal 16 to that surface. If the surface 14 is a mirror or other specularly reflecting surface, essentially all light that strikes the surface at the angle of incidence θi reflects at an angle of reflection θo equal to the angle of incidence θi. The point 20 at which the light ray strikes the surface 14 may be located within the field of view of a camera, but that light will not contribute to the camera image unless ray 18 extends through that camera's entrance pupil. For the purposes of this discussion, we will assume that it does not. So point 20 is not illuminated so far as the camera is concerned. [0019]
  • Now consider FIG. 2, where we change the specular-reflection assumption and instead assume that the target surface 14 is a diffuse reflector. In that case, the reflection of ray 12 includes not only ray 18, whose angle of reflection equals the angle of incidence, but also a plume of other rays, such as ray 22, whose angles with the normal differ from that of the incident ray. The camera pupil may be so positioned that it receives some of these rays even if it does not receive ray 18. So spot 20 is illuminated from the camera's point of view, even though it would not be if it reflected only specularly. [0020]
  • If the surface 14 includes both FIG. 3's specularly reflecting indicia 24 and its diffuse background 26, an image is formed in which the indicia are readily distinguished if the camera is positioned with an entrance pupil that does not receive the specularly reflected rays but does receive some of the rays that result from diffuse reflection. Of course, the indicia may not have perfectly mirror-like surfaces, as the drawing suggests, but they will be distinguishable so long as they yield light plumes that are significantly more compact than the plumes produced by the indicia's surroundings. [0021]
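  • By way of illustration only (this sketch is not part of the original disclosure), the specular-reflection test that FIGS. 1-3 describe can be written as a short vector computation: reflect the incident ray about the surface normal and ask whether the reflected ray passes through the camera's entrance pupil. A point for which that test fails for every lit source appears dark under purely specular reflection, while diffuse reflection still contributes light over a range of angles. The coordinate conventions, the circular entrance-pupil model, and the function names below are assumptions made for this sketch.

    import numpy as np

    def specular_reflection(incident, normal):
        """Reflect a ray direction about the surface normal (angle out = angle in)."""
        incident = incident / np.linalg.norm(incident)
        normal = normal / np.linalg.norm(normal)
        return incident - 2.0 * np.dot(incident, normal) * normal

    def specular_ray_enters_pupil(source_pos, surface_point, normal, pupil_center, pupil_radius):
        """True if the ray from source_pos, specularly reflected at surface_point
        (ray 18 of FIG. 1), passes within pupil_radius of the entrance-pupil center."""
        source_pos = np.asarray(source_pos, dtype=float)
        surface_point = np.asarray(surface_point, dtype=float)
        pupil_center = np.asarray(pupil_center, dtype=float)
        reflected = specular_reflection(surface_point - source_pos, np.asarray(normal, dtype=float))
        t = np.dot(pupil_center - surface_point, reflected)   # distance along the reflected ray
        if t <= 0.0:                                          # pupil lies behind the reflected ray
            return False
        closest = surface_point + t * reflected               # point on the ray nearest the pupil center
        return np.linalg.norm(closest - pupil_center) <= pupil_radius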
  • FIG. 4 depicts an apparatus for applying the technique of dark-field illumination to a field-of-view-deflection system. In an electro-optical process head 28 is mounted a camera that includes a detector 30, lenses 32, 34, and 36, and a field-of-view deflector depicted for purposes of illustration as including a galvanometer-mounted mirror 38. Detector 30 will typically take the form of an array of charge-coupled devices, whose outputs are converted to digital form for processing in accordance with the particular application to which the system is applied. [0022]
  • Other detection devices can be employed, of course, as can lens systems different from that of FIG. 4. That lens system includes an image-forming lens 32 spaced by its focal length from the detector 30. It also includes collimating lenses 34 and 36, which collimate the light from a target region in which a workpiece 37 is disposed; i.e., lenses 34 and 36 image workpiece 37 at infinity. But other embodiments may, say, include only a single lens, corresponding to lens 32 but positioned to image the workpiece 37 on the detector 30. [0023]
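  • As an aside not found in the original disclosure: if the collimating pair 34, 36 is idealized as a single thin objective of effective focal length f_obj, and lens 32, which sits one focal length f_32 from the detector, is idealized as a tube lens, the arrangement is the usual infinity-corrected form, and its lateral magnification onto the detector is M = f_32 / f_obj (a feature of height h on the workpiece spans about M · h on the detector array). The disclosure specifies no focal lengths, so these symbols are purely illustrative.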
  • Extending downward to infinity from the electro-optical head 28 is a camera field of view consisting of a volume of points in space from which the lenses and field-of-view deflector provide optical paths to the detector 30. The field of view intersects the target region so that the camera can “see” a portion 40 of the workpiece. The field-of-view deflector's position determines where portion 40 falls on the workpiece; i.e., the field viewed by the detector can change without relative movement between the camera and the workpiece, although such movement may occur, too. For instance, the workpiece may be on a conveyor, which typically would not be capable of moving as quickly as the field of view. [0024]
  • Although the drawing depicts the field-of-view deflector as comprising only a single mirror, many embodiments of the present invention will employ two, which deflect the camera's field of view along mutually orthogonal axes. Also, although the drawing shows lenses 34 and 36 between the mirror 38 and the workpiece 37, some embodiments that employ collimating lenses may instead place them between the field-of-view deflector and the detector. [0025]
  • To illuminate the workpiece 37 so that the camera can form an adequate image, the system further includes an array of lamps suspended above the workpiece. For most of the present invention's embodiments, the number of such lamps will be relatively large, but FIG. 4 depicts only two such lamps, 42 and 44, which are controlled by illumination-control circuitry not shown in FIG. 4. Operation of these lamps is coordinated with the position of the mirror 38 and thus of the camera's field of view. As that mirror pivots, the workpiece portion 40 of which the camera can form an image on the detector 30 moves about the workpiece's surface. [0026]
  • In accordance with the present invention, most or a significant portion of the lamps in the array shine on the workpiece at any given time, but selected ones are prevented from doing so as the portion 40 within the field of view moves about the surface of the workpiece 37. As that portion 40 moves, any lamp is prevented from shining on the workpiece if under the assumption of specular reflection it would be imaged into the field of view. That is, any source that the camera could “see” if the workpiece portion 40 were a mirror is prevented from shining on the workpiece so that the camera receives no specular reflection from it. As the mirror 38 continues moving and continues to deflect the camera's field of view, lamps prevented from shining on the workpiece to prevent specular reflection into the camera typically shine on it again after the field of view moves beyond their images. So a dark region of unlit sources moves about against a background of sources that are lit as the scanning process proceeds. [0027]
  • Any way of achieving such a dark region moving about a background of sources can be used. The “sources” can be reflectors, for instance, and “lit” array elements could be the reflectors on which a remote source or sources selectively shine, while the unlit elements would be the reflectors on which the source or sources do not shine. Another approach is to provide continuously operating lamps and selectively operable baffles that can selectively hide lamps from the workpiece; the “lit” elements would be the lamps not hidden. Preferably, though, the array elements consist of respective light-emitting diodes (“LEDs”) that are simply turned on to cause them to shine on the workpiece and turned off to prevent them from doing so, and the following description is based on this assumption. [0028]
  • To determine which lamps are in the “viewed” subset and should thus be turned off, we assume that the workpiece is a mirror. Under this assumption, an example ray 45 emitted by source 44 will be reflected along path 46 to contribute to formation of the image on detector 30. That is, if region 40 were a mirror, at least part of source 44 would be imaged into the camera's field of view. Any source for which this is true is part of the viewed subset and is therefore turned off. Source 42, on the other hand, remains lit because specular reflection of any rays, such as ray 48, that strike region 40 will result in rays, such as ray 50, that do not enter the camera: that source belongs in the “unviewed” set. (We note in passing that the “mirror” 40 need not have the horizontal orientation that the drawing depicts; any expected workpiece-surface angle can be used for determining the viewed and unviewed sets' respective memberships.) Source 42 contributes to the image because any diffuse reflection resulting from ray 48 will result in a plume of rays 52, of which some, such as rays 54 and 56, pass through the camera's entrance pupil. [0029]
  • As deflection continues, the portion 40 viewable by the camera 28 may reach a point from which specular reflection in response to light from source 42 would produce rays that contribute to the image on detector 30 whereas reflection of light from source 44 would no longer do so. When the deflector reaches such a point, source 42 will be turned off and source 44 will typically be turned back on. Note that this operation of turning off selected elements occurs in addition to other illumination-control operations that may be occurring. It may be desirable, for instance, to operate different ones of the sources with differing intensities so as to achieve illumination that is optimally uniform for the currently prevailing camera angle. Also, the sources that “remain” lit may actually be strobed so as to “freeze” the workpiece image despite continuous relative motion between the camera's field of view and the workpiece. [0030]
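  • One concrete way to compute the “viewed” subset for a given deflector position is sketched below. It is illustrative only and makes simplifying assumptions the disclosure does not: the workpiece is treated as a horizontal plane mirror at z = 0 (the passage above notes that any expected surface angle could be used instead), each lamp is treated as a point, and the currently viewed portion 40 is modeled as a disc about the field-of-view center; all names are hypothetical. Lamps not in the returned set stay lit, so as the deflector moves the field-of-view center the returned set changes, producing the moving dark region described above.

    import numpy as np

    def viewed_subset(lamp_positions, pupil_center, fov_center, fov_radius):
        """Return indices of lamps that, if the workpiece were a mirror, would be
        imaged into the camera's field of view and should therefore be switched off."""
        pupil_center = np.asarray(pupil_center, dtype=float)
        fov_center = np.asarray(fov_center, dtype=float)
        off = []
        for i, lamp in enumerate(lamp_positions):
            lamp = np.asarray(lamp, dtype=float)
            lamp_image = np.array([lamp[0], lamp[1], -lamp[2]])   # mirror image below z = 0
            dz = pupil_center[2] - lamp_image[2]                  # pupil above, image below, so dz > 0
            t = pupil_center[2] / dz                              # where the pupil-to-image line meets z = 0
            hit = pupil_center + t * (lamp_image - pupil_center)  # the would-be specular reflection point
            if np.linalg.norm(hit[:2] - fov_center[:2]) <= fov_radius:
                off.append(i)                                     # lamp is "viewed": turn it off
        return off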
  • Controlling the sources can take any of a number of forms. Since it is well within the skill of those familiar with such optical systems to predict which sources can be “seen” at various angles, an algorithm for converting scan-angle position to lamp selection can readily be written, and the determination can accordingly be made algorithmically in real time. Or that calculation can be made ahead of time to populate a look-up table used to make the real-time conversion from field-of-view position to lamp selection. Alternatively, the look-up table could be populated in accordance with experimental results. Regardless of how the look-up table is populated, it can be used in a straight table look-up or as part of a combination of table look-up and, say, real-time interpolation. [0031]
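  • The table-look-up option mentioned above might, for example, be realized as sketched below (an illustration only, not the disclosure's implementation): the deflector's two scan angles are quantized onto a grid, the lamps to switch off are computed or measured once per grid point, and the real-time conversion is a nearest-entry look-up. The helper passed in as lamps_for_position could be a geometric model such as the viewed-subset sketch above or a table of experimental results; every name and signature here is an assumption.

    def build_lamp_table(n_steps, step, angle_to_fov_center, lamps_for_position):
        """Precompute, for each quantized deflector-angle pair (i, j), the lamps to switch off."""
        table = {}
        for i in range(n_steps):
            for j in range(n_steps):
                fov_center = angle_to_fov_center(i * step, j * step)
                table[(i, j)] = frozenset(lamps_for_position(fov_center))
        return table

    def lamps_off_for(table, angle_x, angle_y, step, n_steps):
        """Real-time conversion: snap the commanded scan angles to the nearest table entry."""
        i = min(max(round(angle_x / step), 0), n_steps - 1)
        j = min(max(round(angle_y / step), 0), n_steps - 1)
        return table[(i, j)]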
  • FIGS. 5 and 6 show how machine-vision applications will typically implement the present invention's teachings. Although it is apparent that dedicated “random logic” could be separately designed for each of the functions to be described below, most embodiments will employ programming to configure common microprocessor circuitry to act as the various circuits for performing these operations. So FIG. 5 depicts the system as including a computer 60 that runs a software application 62 for which image data are required. To this end, the application may operate various other software modules such as modules 64, 66, and 68, which respectively control the field-of-view deflector, lights, and detector. Under direction from these modules, the computer 60 communicates with appropriate interface circuitry represented by blocks 70, 72, and 74 to coordinate these operations. [0032]
  • As FIG. 6 indicates, an application requiring image data would perform a routine whose object is to acquire image data at a particular location. FIG. 6's block 76 represents entering such a routine. This routine may concurrently call FIG. 5's several processes 64, 66, and 68. As FIG. 6 indicates, the scan-control process whose entry block 77 represents would typically perform operations such as computing the deflector positions required to place the field of view in the desired location. Block 78 represents this step. Once the appropriate locations are determined, the scan-control process would cause FIG. 5's scanner interface to move, say, a galvanometer to produce the desired field-of-view deflection. Block 80 represents this operation, after which the process terminates in a step 82. That step may include setting a flag to indicate that the movement operation has been completed. [0033]
  • FIG. 6's block 84 represents entering the light-control process, which includes determining from the commanded field-of-view location which lamps need to be switched on or off. Block 86 represents performing this conversion, which, as was explained above, may involve an algorithmic determination, a table look-up, or a combination of both. Once the desired light actuations are determined, they are performed by communications with FIG. 5's lighting interface 72 in a step that FIG. 6's block 88 represents. That drawing's block 90 represents ending the process in a step that may involve setting a flag to indicate that the lights have been properly set. [0034]
  • The image-processing operation, whose entry FIG. 6's block 92 represents, depends on proper illumination and proper positioning of the field-of-view deflector. Block 94 accordingly represents testing the scan-control operation's flag to determine whether the field of view is positioned as required. Once it has determined that the desired position has been reached, the process proceeds to step 96 to determine whether the illumination has been set properly. If it has, the process operates FIG. 5's detector interface 74 to obtain the camera's output data, as block 98 indicates, and the resultant data are supplied to the requesting application, as block 100 indicates. [0035]
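  • The flag-based coordination that blocks 76 through 100 describe might be sketched as follows. This is an illustration only; the three callables stand in for FIG. 5's scanner, lighting, and detector interfaces (blocks 70, 72, and 74), and their names and signatures are assumptions.

    import threading

    def acquire_at(target, move_field_of_view, set_lighting, read_detector):
        """Coordinate one image request (entry at block 76): position the field of
        view and set the lights concurrently, then read the detector."""
        scan_done = threading.Event()      # flag set when the deflector move completes (block 82)
        lights_done = threading.Event()    # flag set when the lights are properly set (block 90)

        def scan_control():                # entry at block 77
            move_field_of_view(target)     # compute positions and move the scanner (blocks 78-80)
            scan_done.set()                # block 82

        def light_control():               # entry at block 84
            set_lighting(target)           # convert position to lamp actuations (blocks 86-88)
            lights_done.set()              # block 90

        threading.Thread(target=scan_control).start()
        threading.Thread(target=light_control).start()

        scan_done.wait()                   # block 94: is the field of view positioned?
        lights_done.wait()                 # block 96: is the illumination set?
        return read_detector()             # block 98; data go to the requesting application (block 100)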
  • The particular nature of whatever application 62 the invention supports is not a feature of the invention, but FIG. 7 illustrates one example of an apparatus in which the present invention is advantageous. The apparatus in FIG. 7 differs from that of FIG. 4 in that it additionally includes a laser source 102 and a beam splitter 104. And, in addition to the program modules depicted in FIG. 5, a system of the FIG. 7 type typically will include a module for controlling the laser as well as a laser interface by which that module exercises such control. To mark the workpiece, the beam splitter 104 cooperates with the field-of-view deflector 38 to direct the laser light to a workpiece location in the camera's field of view. [0036]
  • The application program may employ the vision system to identify fiducial marks or other identifying features previously made on the workpiece and thereby properly locate the position at which the new mark is to be made. The vision system also may be used to perform quality control on the marking process, possibly in a closed-loop fashion so as to adjust laser-beam positioning in accordance with the results of previous observations. [0037]
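  • A closed-loop use of the kind just described might, purely as an illustration, look like the sketch below: the vision system measures where previously made fiducial marks actually are, the offset from their nominal positions is estimated, and the coordinates of the next marks are corrected accordingly. A translation-only model and all names here are assumptions; a real system might also fit rotation and scale.

    import numpy as np

    def estimate_offset(nominal_fiducials, measured_fiducials):
        """Average (dx, dy) between where fiducials were expected and where they were found."""
        nominal = np.asarray(nominal_fiducials, dtype=float)
        measured = np.asarray(measured_fiducials, dtype=float)
        return (measured - nominal).mean(axis=0)

    def corrected_mark_positions(nominal_marks, offset):
        """Shift the positions at which new marks are to be made by the estimated offset."""
        return np.asarray(nominal_marks, dtype=float) + offset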
  • The system of FIG. 7 is more convenient than some prior-art marking systems. In such prior-art systems the marking apparatus was located separately from the imaging system, and there could be relative movement between the workpiece and the camera and lighting systems. In contrast, the illustrated system's marking beam employs the same lens and deflector apparatus as the imaging system, so positions being inspected and marked are readily correlated. [0038]
  • It is another aspect of the present invention to reverse the arrangement described above in order to provide “bright field” illumination. This type of illumination is valuable when the background for the relatively specular indicia is relatively unreflective. In such a situation, it is preferable to emphasize reflection from the relatively specular indicia without unnecessarily illuminating the background. For this purpose, the sources that specular reflection would make visible at the source are the lit ones, and the others are the ones that are not lit. [0039]
  • From the foregoing description, it is apparent that the present invention can be employed in a wide range of embodiments and constitutes a significant advance in the art.[0040]

Claims (14)

What is claimed is:
1. A method of imaging portions of a workpiece located within a field of view of an imaging system, the workpiece having features which are to be detected with the imaging system, the method comprising:
illuminating a first portion of the workpiece from a first combination of illumination positions and reduced illumination positions so as to limit a first distribution of energy reflected specularly from a workpiece location corresponding to the first portion;
generating output signals to produce image data representative of an image of the first portion;
illuminating a second portion of the workpiece from a second combination of illumination positions and reduced illumination positions so as to limit a second distribution of energy reflected specularly from a workpiece location corresponding to the second portion, the second combination being non-identical to the first combination as a result of a position of the workpiece portion within the field of view of the imaging system;
generating output signals to produce image data representative of an image of the second portion; and
detecting the features in images of the first and second image portions based on similarities and differences in the images.
2. The method of claim 1 wherein illuminating the first portion and illuminating the second portion are carried out concurrently.
3. The method of claim 1 further wherein the surface features are machine readable marks.
4. The method of claim 1 further comprising controllably positioning the field of view of the imaging system after illuminating the first portion so as to view the second portion with the imaging system.
5. The method of claim 4 wherein controllably positioning is carried out with a computer-controlled galvanometer-mounted pivotal mirror having a maximum deflection angle, wherein a maximum field of view of the imaging system is limited by the mirror deflection angle.
6. The method of claim 3 further comprising moving the workpiece relative to the imaging system after illuminating the first portion so as to view the second portion with the imaging system.
7. The method of claim 6 wherein moving is carried out with an X-Y stage.
8. The method of claim 1 wherein the features are marks on a semiconductor wafer.
9. The method of claim 1 wherein the features are laser scribed marks on the workpiece, detecting is carried out by means of a machine vision processor, and wherein illuminating the first and second combinations of illumination positions and reduced illumination positions introduces sufficient contrast between the features and a background to detect the features at any angular location within a field of view of the imaging system.
10. The method of claim 1 further including irradiating the workpiece with a laser beam to modify a workpiece surface property wherein a feature is produced by interaction of the laser beam and the workpiece.
11. A method of imaging portions of a workpiece comprising:
illuminating the workpiece from an illumination position so as to produce reflected energy from at least first and second portions of the workpiece;
attenuating, at a first location between an illumination position and an image location, a first portion of the reflected energy so as to limit the distribution of reflected energy incident on an image location corresponding to a first portion of the workpiece;
generating output signals to produce image data representative of an image of the first portion;
attenuating, at a second location between an illumination position and an image location, a second portion of the reflected energy so as to limit the distribution of reflected energy incident on an image location corresponding to a second portion of the workpiece;
generating output signals to produce image data representative of an image of the second portion; and
detecting the features in images of the first and second image portions based on similarities and differences in the images.
12. The method of claim 11 wherein attenuating the first and second portions is carried out concurrently.
13. The method of claim 11 further comprising irradiating the workpiece with a laser beam to modify a workpiece surface property wherein a surface feature is produced by interaction of the laser beam with the workpiece.
14. The method of claim 11 wherein attenuating comprises controllably positioning at least one baffle in a path between an illumination position and an image location.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/657,286 US20040047140A1 (en) 1999-04-27 2003-09-08 Programmable illuminator for vision system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/301,002 US6633338B1 (en) 1999-04-27 1999-04-27 Programmable illuminator for vision system
US10/657,286 US20040047140A1 (en) 1999-04-27 2003-09-08 Programmable illuminator for vision system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/301,002 Continuation US6633338B1 (en) 1999-04-27 1999-04-27 Programmable illuminator for vision system

Publications (1)

Publication Number Publication Date
US20040047140A1 (en) 2004-03-11

Family

ID=28791804

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/301,002 Expired - Lifetime US6633338B1 (en) 1999-04-27 1999-04-27 Programmable illuminator for vision system
US10/657,286 Abandoned US20040047140A1 (en) 1999-04-27 2003-09-08 Programmable illuminator for vision system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/301,002 Expired - Lifetime US6633338B1 (en) 1999-04-27 1999-04-27 Programmable illuminator for vision system

Country Status (1)

Country Link
US (2) US6633338B1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7028899B2 (en) * 1999-06-07 2006-04-18 Metrologic Instruments, Inc. Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target
JP2000090233A (en) * 1998-09-08 2000-03-31 Olympus Optical Co Ltd Image processor
US6788411B1 (en) * 1999-07-08 2004-09-07 Ppt Vision, Inc. Method and apparatus for adjusting illumination angle
GB2356996A (en) * 1999-12-03 2001-06-06 Hewlett Packard Co Improvements to digital cameras
IL133696A (en) * 1999-12-23 2006-04-10 Orbotech Ltd Cam reference inspection of multi-color and contour images
CN100544877C (en) * 2003-10-17 2009-09-30 通明国际科技公司 Flexible scan field
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7344273B2 (en) 2005-03-22 2008-03-18 Binary Works, Inc. Ring light with user manipulable control
US7315361B2 (en) * 2005-04-29 2008-01-01 Gsi Group Corporation System and method for inspecting wafers in a laser marking system
CN101600978B (en) * 2005-08-26 2012-09-05 卡姆特有限公司 Device and method for controlling an angular coverage of a light beam
US7911444B2 (en) 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display
CN101305275B (en) 2005-11-30 2010-08-18 株式会社尼康 Observing device
US7630002B2 (en) * 2007-01-05 2009-12-08 Microsoft Corporation Specular reflection reduction using multiple cameras
US8212857B2 (en) 2007-01-26 2012-07-03 Microsoft Corporation Alternating light sources to reduce specular reflection
DE102007063453B3 (en) * 2007-12-28 2009-10-08 Göpel electronic GmbH Assembled printed circuit board optical inspection arrangement for quality control, has control system connected with color filters to ensure that inspection fields are optically represented in successive manner on image sensor surface
WO2011092493A2 (en) * 2010-01-26 2011-08-04 De Beers Centenary AG Gemstone sparkle analysis
US20120274838A1 (en) * 2010-10-15 2012-11-01 Triune Ip Llc Illumination and image capture
EP2866432B1 (en) * 2012-06-20 2017-01-04 Hitachi, Ltd. Automatic image compositing device
DE102013017795C5 (en) * 2013-10-25 2018-01-04 Lessmüller Lasertechnik GmbH Process monitoring method and apparatus
CA2928878C (en) * 2013-11-04 2020-06-23 Tomra Sorting Nv Inspection apparatus
CN104023179B (en) * 2014-06-27 2017-08-15 北京智谷睿拓技术服务有限公司 Image formation control method and equipment
US10926416B2 (en) * 2018-11-21 2021-02-23 Ford Global Technologies, Llc Robotic manipulation using an independently actuated vision system, an adversarial control scheme, and a multi-tasking deep learning architecture

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4475796A (en) * 1981-03-13 1984-10-09 Olympus Optical Co., Ltd. Epidark illumination system
US4484069A (en) * 1981-10-15 1984-11-20 St. Regis Paper Company Apparatus and method for sensing distance
US4604648A (en) * 1984-10-12 1986-08-05 Kley Victor B Electronic viewing system for integrated circuit packages
US4706168A (en) * 1985-11-15 1987-11-10 View Engineering, Inc. Systems and methods for illuminating objects for vision systems
US4972093A (en) * 1987-10-09 1990-11-20 Pressco Inc. Inspection lighting system
US4918284A (en) * 1988-10-14 1990-04-17 Teradyne Laser Systems, Inc. Calibrating laser trimming apparatus
US4893223A (en) * 1989-01-10 1990-01-09 Northern Telecom Limited Illumination devices for inspection systems
US5038258A (en) * 1989-03-02 1991-08-06 Carl-Zeiss-Stiftung Illuminating arrangement for illuminating an object with incident light
US5129009A (en) * 1990-06-04 1992-07-07 Motorola, Inc. Method for automatic semiconductor wafer inspection
US5185638A (en) * 1991-04-26 1993-02-09 International Business Machines Corporation Computer controlled, multiple angle illumination system
US5737122A (en) * 1992-05-01 1998-04-07 Electro Scientific Industries, Inc. Illumination system for OCR of indicia on a substrate
US5515452A (en) * 1992-12-31 1996-05-07 Electroglas, Inc. Optical character recognition illumination method and system
US5684530A (en) * 1993-02-16 1997-11-04 Northeast Robotics, Inc. Continuous diffuse illumination method and apparatus
US5519496A (en) * 1994-01-07 1996-05-21 Applied Intelligent Systems, Inc. Illumination system and method for generating an image of an object
US5799135A (en) * 1994-06-28 1998-08-25 Fanuc, Ltd. Robot controlling method and apparatus using laser sensor
US5822053A (en) * 1995-04-25 1998-10-13 Thrailkill; William Machine vision light source with improved optical efficiency
US5585616A (en) * 1995-05-05 1996-12-17 Rockwell International Corporation Camera for capturing and decoding machine-readable matrix symbol images applied to reflective surfaces
US5720424A (en) * 1995-05-30 1998-02-24 Kabushiki Kaisha Shinkawa Wire bonding apparatus
US5615013A (en) * 1995-06-27 1997-03-25 Virtek Vision Corp. Galvanometer and camera system
US5724139A (en) * 1996-06-28 1998-03-03 Polaroid Corporation Dark field, photon tunneling imaging probes
US5965042A (en) * 1996-07-24 1999-10-12 Miyachi Technos Corporation Method and apparatus for laser marking with laser cleaning

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050197672A1 (en) * 2000-02-04 2005-09-08 Freeman Gary A. Integrated resuscitation
US20100212182A1 (en) * 2005-06-06 2010-08-26 Krishna Singh Method and apparatus for dehydrating high level waste based on dew point temperature measurements
AU2012244240B2 (en) * 2007-08-29 2015-04-30 Scientific Games, Llc Enhanced scanner design
US20100155379A1 (en) * 2008-12-19 2010-06-24 Applied Materials, Inc. Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication
WO2010080595A2 (en) * 2008-12-19 2010-07-15 Applied Materials, Inc. Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication
WO2010080595A3 (en) * 2008-12-19 2010-10-14 Applied Materials, Inc. Illumination methods and systems for laser scribe detection and alignment in thin film solar cell fabrication
CN104148810A (en) * 2013-01-28 2014-11-19 先进科技新加坡有限公司 Method of radiatively grooving a semiconductor substrate
CN112858169A (en) * 2021-02-01 2021-05-28 苏州维嘉科技股份有限公司 Light source detection device, light source lighting method thereof and light source control device
WO2023166643A1 (en) * 2022-03-03 2023-09-07 三菱電機株式会社 Appearance inspection device and appearance inspection method

Also Published As

Publication number Publication date
US6633338B1 (en) 2003-10-14

Similar Documents

Publication Publication Date Title
US6633338B1 (en) Programmable illuminator for vision system
EP1581781B1 (en) Method and apparatus for simultaneous 2-d and topographical inspection
US6464126B2 (en) Bonding apparatus and bonding method
US4608494A (en) Component alignment apparatus
US20060209299A1 (en) Inspection lighting head system and method of operation
JP2750953B2 (en) Barcode imaging method
KR950024099A (en) How to create images of lighting systems and objects
US20110090333A1 (en) High speed optical inspection system with adaptive focusing
CN101603926B (en) Multi-surface detection system and method
KR20030015207A (en) Imaging system
JP5807772B2 (en) Defect detection apparatus and method
CN1243970C (en) Scanning head and outer inspection method and apparatus capable of using said scanning head
JPH11183389A (en) Observing device
KR20110069058A (en) Apparatus and method for optically converting a three-dimensional object into a two-dimensional planar image
TWI697662B (en) Illumination system, inspection tool with illumination system, method for inspecting an object, and method of operating an illumination system
US20070024846A1 (en) Device for Dark Field Illumination and Method for Optically Scanning of Object
JP2005091049A (en) Light irradiator for image processing and light irradiation method for image processing
KR101581777B1 (en) Multiple surface inspection system and method
JP2000028320A (en) Image recognizing equipment and image recognizing method
JP2001522997A (en) Apparatus and method for detecting the position of a component and / or for detecting the position of a connection of a component, and a mounting head having a device for detecting the position of a component and / or detecting the position of a connection of a component
US20040114035A1 (en) Focusing panel illumination method and apparatus
JPH0545142A (en) Method for inspecting surface state
JPH11242002A (en) Observing device
JP2013007588A (en) Defect detection device and its method
JP2021148633A (en) Illumination device, illumination device control method, exterior appearance inspection device, and program

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: ELECTRO SCIENTIFIC INDUSTRIES, INC., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GSI GROUP CORPORATION;GSI GROUP INC;REEL/FRAME:030582/0160

Effective date: 20130503

AS Assignment

Owner name: ELECTRO SCIENTIFIC INDUSTRIES, INC., OREGON

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION SERIAL NUMBER 11776904 PREVIOUSLY RECORDED ON REEL 030582 FRAME 0160. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:GSI GROUP CORPORATION;GSI GROUP INC.;REEL/FRAME:056424/0287

Effective date: 20130503