WO1997021189A1 - Edge peak boundary tracker - Google Patents

Edge peak boundary tracker Download PDF

Info

Publication number
WO1997021189A1
Authority
WO
WIPO (PCT)
Prior art keywords
edge
finding
line
contour
parallel
Prior art date
Application number
PCT/US1996/012954
Other languages
French (fr)
Inventor
David Michael
Original Assignee
Cognex Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cognex Corporation filed Critical Cognex Corporation
Priority to JP52123797A priority Critical patent/JP2001519934A/en
Publication of WO1997021189A1 publication Critical patent/WO1997021189A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation


Abstract

A method and apparatus for finding an edge contour (44) in an image of an object (10) is provided that is robust against object/background misclassification due to non-uniform illumination across an image, while also being computationally efficient, and avoiding the need to select a classification threshold. The invention can be used to partition an image of a scene into object regions and background regions, or foreground regions and background regions, using edge contours found in the image. The invention is particularly useful for analysis of images of back-lit objects. The edge contour (44) is progressively formed by finding a sequence of one-dimensional edge positions (68), each one-dimensional edge position (68) being determined by processing (82) a set of pixels arranged along at least one imaginary line.

Description

EDGE PEAK BOUNDARY TRACKER
FIELD OF THE INVENTION
This invention relates generally to computer vision, and particularly
to image analysis.
BACKGROUND OF THE INVENTION
Commonly, an image is formed to represent a scene of interest that
can include one or more physical objects on a background. In computer
vision, an image typically consists of a two-dimensional array of picture
elements, called pixels. Pixels can be square, rectangular, or hexagonal,
for example. Each pixel is associated with a gray value, and each gray
value can range from 0 to 255, for example. The gray value of each pixel
is determined by the visual properties of the portion of the scene
represented by that pixel.
An important task involved in analyzing images is differentiating
between an object and background of an image, or between the foreground
generally and the background of an image. The visual properties of an object are usually different from the visual properties of the background.
Thus, the gray values of the portion of the image that represents the object
will on average be different from the gray values of the portion of the
image that represents the background.
It is known to differentiate an object from the background by
comparing each pixel in an image with a single threshold gray value. If
the gray value of the pixel is above the threshold gray value, then it can be
categorized as object or foreground; if the gray value of the pixel is equal
to or below the threshold gray value, then it can be categorized as
background. Note that the threshold is static, non-adaptive, and is applied
globally, in that each and every pixel in the image is compared with the
same threshold.
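By way of illustration, a minimal sketch of this global-threshold classification, assuming the image is held in a NumPy array (the function name and the threshold value are illustrative, not taken from the patent):

```python
import numpy as np

def classify_global_threshold(image: np.ndarray, threshold: int) -> np.ndarray:
    """Label each pixel as object/foreground (True) or background (False)
    by comparing its gray value against one static, global threshold."""
    # Every pixel in the image is compared with the same threshold.
    return image > threshold

# Example: an 8-bit image with a bright square on a dark background.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 30:70] = 200
mask = classify_global_threshold(img, threshold=128)
```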
One drawback to this approach is that, due to non-uniform
illumination across an image which can make one side of the scene darker
than another, object portions of the image can easily be misclassified as
background portions, and vice versa. Another disadvantage is that each
pixel of the image must be processed. Since computation time is
proportional to the number of pixels processed, this approach can present a
computational burden when there are large images, or when many images
must be processed rapidly, for example.
Another way to differentiate an object from the background is to
start with a single pixel on the object/background boundary, and to
compare each pixel in the neighborhood of the pixel to a single threshold
gray value. If the neighborhood pixel is above the threshold, it is an
object pixel; if it is below the threshold, it is a background pixel. The
transition from object pixel to background pixel is the boundary, and it can
be further localized with an interpolation step. The neighborhood pixel
that is closest to the boundary is then selected, and each of its neighboring
pixels is compared to the single threshold gray value. The process is
repeated, resulting in a sequence of pixels that tracks the boundary of an
object. An example of this technique is the Cognex Boundary Tracker,
sold by Cognex Corporation, Natick MA.
Although this method processes fewer pixels than the previous method
that processes every pixel in an image, by only processing pixels in a
narrow band around each object/background boundary, it also suffers from
the problem of classification error due to non-uniform illumination across
an image.
A third way to locate the boundary between object and background is
to first perform "edge detection" on the image. An edge is usefully
defined as a change in gray value from dark to light or from light to dark that can span many pixels. Since the gray values of the image that
represent the object will on average be different from the gray values that
represent the background, there is an extended band or chain of pixels that
have gray values that transition between the gray values of the object and
the background. This extended band or chain of pixels is called an "edge
contour" . Thus, an edge contour indicates the boundary between an object
and the background.
In "edge detection", the entire image is processed so as to label each
pixel of the image as being either on an edge or not on an edge. Since
every pixel in the image is processed, this first step is very computationally
expensive. Next, an "edge linking" step is performed wherein pixels on
edges are combined into "edge contours" or "boundary chains" which
define the boundary between object and background. This step is also
computationally expensive, because every pixel that has been labeled as
being on an edge must be processed to determine whether it can be
included in a boundary chain. Moreover, this method does not provide
sub-pixel accuracy.
PROBLEMS TO BE SOLVED BY THE INVENTION
It is a goal of the invention to provide a method and apparatus for
locating an edge contour that is robust against object/background
misclassification due to non-uniform illumination across an image, and that
is significantly more computationally efficient than the methods and
apparatus of the prior art.
It is a goal of the invention to provide a method and apparatus for
locating an edge contour that avoids the use of a gray value threshold, and
thereby avoids the task of selecting a threshold to optimize classification of
object and background.
It is a goal of the invention to provide a method and apparatus for
edge contour tracking that avoids the use of a threshold gray value, and
thereby avoids the associated misclassification errors due to non-uniform
lighting across a scene.
SUMMARY OF THE INVENTION
A method and apparatus for finding an edge contour in an image of
an object is provided wherein the edge contour is progressively formed by
finding a sequence of one-dimensional edge positions, each one- dimensional edge position being determined by processing a set of pixels
arranged along at least one imaginary line.
More particularly, the invention is a method for finding an edge
contour in an image, where the edge contour is found by finding a
sequence of contour points. The step of determining the position of a
contour point includes locating the position of an edge in a one-dimensional
gray value signal taken along a line across the projected edge contour, and
then advancing to a next contour point based on each successive edge so-
located.
In a preferred embodiment, the step of locating the position of an
edge in a one-dimensional gray value signal includes the step of finding an
interpolated edge peak. Other preferred embodiments find and exploit
more than one one-dimensional edge position to determine the two-
dimensional position and direction of the next edge contour point. It is
preferable that when two one-dimensional edge positions are used to
determine the two-dimensional position and direction of the next edge
contour point, a pair of perpendicular imaginary lines and the associated
pixels are used. The one-dimensional edge positions can be determined by
analyzing an edge enhancement signal derived from the gray value signal
taken along each imaginary line. According to the invention, the two-dimensional position of each edge contour point can be found to sub-pixel
accuracy.
The invention overcomes the drawbacks of the aforementioned
approaches in that it is robust against object/background misclassification
due to non-uniform illumination across an image, and is significantly more
computationally efficient. Moreover, since no gray value threshold is
used, the task of selecting a threshold to optimize classification of object
and background is avoided.
Further, a threshold gray value is not needed to define the boundary,
so the task of selecting a threshold to minimize classification errors is
avoided. Moreover, since there is no threshold gray value, the associated
misclassification errors due to non-uniform lighting across a scene are
thereby avoided.
The invention can be used to partition an image of a scene into
object regions and background regions, or foreground regions and
background regions, using edge contours found in the image according to
the method of the invention.
An edge contour is a continuous curve formed by a sequence of
connected points at the interpolated maximum of the first derivative of the
gray value signal taken across the boundary at each point. Since the edge contour is formed by a sequential local computation, i.e., sequentially
finding the interpolated maximum so as to track a boundary, only pixels at
or near the boundary are processed. Thus, significant computational
efficiency is gained, because many pixels that are not near the boundary
need not be processed.
The invention is particularly useful for analysis of images of back-lit
objects. The invention can be used, for example, to provide information
regarding the physical position and/or orientation of a physical object
represented by an image, and such information can be used to control
hardware resources, such as an aligner.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be more fully understood from the following
detailed description, in conjunction with the accompanying figures,
wherein:
Fig. 1 shows an image of an object having two gray value regions
against a background of three gray value regions, the varied gray values
illustrating non-uniform illumination over the surface of the object and
background;
Fig. 1A shows the same object and background as in Fig. 1, also
including a pair of dashed lines that enclose a region of pixels to be
processed;
Fig. 2 shows the same object and background as in Figs. 1 and 1A,
also showing the boundary between the object and the background, as
indicated by a closed thin black line;
Fig. 3 shows the eight directions in the neighborhood of a single
pixel;
Fig. 4 is a plot of a gray value signal versus position taken across a
boundary, indicating the gray values of four pixels at four locations across
the boundary;
Fig. 4A is a plot of the first derivative of the signal of Fig. 4,
indicating three first derivative values at three locations across the
boundary;
Fig. 4B is a plot showing the three first derivative values of Fig. 4A,
and the parabolic interpolation curve that fits the three points;
Figs. 5 A-M' show a variety of pixel sets which are processed to
determine the position and direction of each new edge contour point;
Fig. 6 is a representation of a boundary, showing a sequence of sets of
four pixels that are processed to obtain the edge contour point along an
arbitrary segment of the boundary; and
Fig. 7 is a flow chart illustrating the method of the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to Fig. 1, the invention finds edge contours, and so it can
be used to find the boundary of an image of an object against a background
12, 12', 12". The method of the invention can discriminate between edges
of various strengths, and so it can find the boundary of the image of the
triangle shown in Fig. 1, without being confused by the two gray value
regions of the triangle, or by the background, here shown as having three
gray value regions 12, 12', 12".
Moreover, Fig. 1 illustrates another advantage of the present
invention; it can find the boundary of an object, even when varied gray
values are present due to non-uniform illumination over the surface of the
object and background.
Fig. 1A shows the same object 10 and background 12, 12', 12" as in
Fig. 1. A pair of dashed lines 14 are included to indicate a region of
pixels to be sequentially processed according to the invention. Thus, the invention can find the boundary of an object without processing every pixel
in an image; only the pixels in a local neighborhood of each point along
the boundary are processed.
Fig. 2 shows the same object and background as in Figs. 1 and 1A.
New to this figure is the boundary 16 between the object 10 and the
background 12, 12', 12", as indicated by a thin black line 16 that extends
around the entire triangular object 10.
An image consists of a regular array of pixels, which can be square,
rectangular, triangular, or hexagonal, for example. Fig. 3 shows the eight
directions in the neighborhood of a single square pixel 18, i.e., E, NE, N,
NW, W, SW, S, and SE.
In the case of an image consisting of square pixels, there are two
coordinate axes, e.g., a coordinate axis in the X-direction (left-to-right),
and a coordinate axis in the Y-direction (down-to-up). These coordinate
axes are also axes of symmetry of the array of square pixels. In addition,
there are two other axes of symmetry of a square-pixel-based image, i.e.,
the diagonal spanning the upper-left corner and the lower-right corner of
each pixel, and the diagonal spanning the lower-left corner and the upper-
right corner of each pixel. In a preferred embodiment, the invention exploits two axes of
symmetry, where the two axes are preferably perpendicular with respect to
each other. Specifically, a first difference signal is computed along one or
both of two perpendicular axes, and the results are used to determine the
direction of each boundary segment, as will be discussed further below.
So, referring again to Fig. 3, the direction of a boundary segment is
preferably expressed, in the case of an image consisting of an array of
square pixels, as one of the eight directions shown, i.e., E, NE, N, NW,
W, SW, S, and SE.
By way of further background, referring to Figs. 4, 4A, and 4B, an
explanation of the terms "gray value signal", "first difference signal", and
"interpolated first difference signal" will now be provided. An image
consists of a regular array of pixels. Each pixel can have a gray value that
falls within a range of allowable gray values, such as between 0 and 255.
Thus, a pixel can be black (0), white (255), or any shade of gray in
between from dark to light (1-254). A "gray value signal" is here defined
as a sequence of gray values of the pixels traversed along a particular path
in the image. The path can be straight or curved.
In the case of Fig. 4, a straight path was traversed, such as a purely
horizontal path (not shown). The gray value of each of four pixels is
indicated as the height of a black dot 20 above a horizontal axis 22. The
horizontal position of each of the four pixels is indicated by a vertical
heavy dotted line 24. A curve 26 is drawn through the four black dots 20
to suggest how the brightness of the underlying object may change along
the horizontal path. Note that to the left the gray values are less (darker)
than the gray values to the right (lighter), and that there is a transition
between dark and light in the middle. The transition is a boundary
between the dark region on the left and the light region on the right.
The character of the transition is further appreciated by plotting the
"first difference signal", i.e., plotting the change in gray value from pixel
to pixel along a given path. Here, in Fig. 4A, the given path is the same
as the horizontal path traversed in Fig. 4. The first difference between
each of three pairs of neighboring pixels is indicated as the height of a
black dot 28 above a horizontal axis 30. The horizontal position of each of
the three difference values is indicated by a vertical light-dotted line 32. A
curve 34 is drawn through the three black dots 28 to suggest how the rate
of change in brightness of the underlying object may vary along the
horizontal path. Note that both to the left and to the right, the gray values do not
change very much, but note well that in the middle, there is a region of
significant change. This region of change is called an "edge", or "edge
contour".
Referring to Fig. 4B, the position of the edge contour is
approximated by the largest of the values 28 of the first difference. A
more accurate position of the edge can be obtained by fitting a curve to the
values 28, and then finding the position of the maximum of this curve, i.e.,
interpolating or extrapolating to find the position of the edge contour to
sub-pixel accuracy. For example, a parabola 36 can be fit to the three
points 28, the maximum of the curve 36 being indicated by the thin vertical
line 38 to the right of the greatest of the three points 28. Note again that
this is a sub-pixel position value. Even greater sub-pixel accuracy can be
achieved if more first difference values are available, using a higher-order
interpolation, such as a fourth-order curve.
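A minimal sketch of this sub-pixel edge locator, assuming the gray value signal is a 1-D NumPy array and that the peak of the first difference has a neighbor on each side (names are illustrative, not from the patent):

```python
import numpy as np

def subpixel_edge_position(gray: np.ndarray) -> float:
    """Locate an edge in a 1-D gray value signal to sub-pixel accuracy by
    fitting a parabola to the peak of the first difference signal."""
    diff = np.abs(np.diff(gray.astype(float)))   # first difference signal
    k = int(np.argmax(diff))                     # index of the largest value
    if k == 0 or k == len(diff) - 1:
        return float(k)                          # no neighbors; no interpolation
    yl, yc, yr = diff[k - 1], diff[k], diff[k + 1]
    denom = yl - 2.0 * yc + yr
    offset = 0.0 if denom == 0.0 else 0.5 * (yl - yr) / denom
    return k + offset    # fractional position along the first difference axis

# Example: four pixels across a dark-to-light boundary, as in Fig. 4.
print(subpixel_edge_position(np.array([10, 40, 200, 220])))
```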
There are many other methods for determining a position of an edge
in a one-dimensional gray value signal that provide the position value to
sub-pixel accuracy. Thus, according to the invention, the following
methods can be used as an alternative to the above-described method of finding the peak in a first difference signal to find the position of the edge
contour to sub-pixel accuracy.
For example, there are linear filtering methods, non-linear filtering
methods, statistical methods, and matched filtering methods. Linear
filtering methods include the first difference operator described above, and
also include a Laplacian operator, and a difference-of-Gaussians operator.
Non-linear methods include the square of the first difference, and the
log of the first difference. An advantage of using the log of the first
difference is that this operator is relatively insensitive to multiplicative
changes in luminance level over an image.
Statistical methods include finding the window position of maximum
variance for a window sliding along the one-dimensional gray value signal.
Matched filtering methods include finding the position that minimizes
the mean squared error between an ideal edge profile and the one-
dimensional gray value signal.
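Two of these alternatives can be sketched briefly; the window width and the ideal edge profile below are illustrative assumptions rather than values prescribed by the patent:

```python
import numpy as np

def max_variance_position(gray: np.ndarray, window: int = 4) -> int:
    """Statistical method: start index of the sliding window with the
    maximum variance along the one-dimensional gray value signal."""
    g = gray.astype(float)
    variances = [g[i:i + window].var() for i in range(len(g) - window + 1)]
    return int(np.argmax(variances))

def matched_filter_position(gray: np.ndarray, profile: np.ndarray) -> int:
    """Matched filtering: offset that minimizes the mean squared error
    between an ideal edge profile and the gray value signal."""
    g = gray.astype(float)
    n = len(profile)
    errors = [np.mean((g[i:i + n] - profile) ** 2)
              for i in range(len(g) - n + 1)]
    return int(np.argmin(errors))

signal = np.array([10, 12, 11, 40, 200, 218, 220, 221])
print(max_variance_position(signal))                          # window spanning the transition
print(matched_filter_position(signal, np.array([10, 10, 220, 220])))
```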
Each of the above methods for finding the position of an edge in a
one-dimensional gray value signal will provide a position value, but the
position value will differ from one method to another. Nevertheless, consistent use of any one of the above methods in the practice of the
present invention will provide the location of an edge contour. Thus, the
description of the preferred embodiment of the present invention that
includes a discussion of the use of determining the position of a contour
point by locating the peak of a first difference signal of a gray value signal
taken across the edge contour, is for purposes of clarity of explanation, and
in no way is meant to limit the scope of the present invention. The best
method for finding the position of an edge in a one-dimensional gray value
signal will depend to a large extent on the character of the image under
analysis, and the particular requirements of the application domain.
Referring to Figs. 5A and 6, the pixel set shown in Fig. 5A will be
used to find an edge contour according to the method of the invention. An
example of a generally horizontal edge contour is shown in Fig. 6. A
region 40 of white pixels meets a region of gray pixels 42 along an edge
contour 44. Each square 46 represents a pixel of an image that represents
the underlying scene presented by regions 40 and 42. The gray value of
each pixel is the average gray value over the area of each square 46.
The example shown in Fig. 6 repeatedly uses the same pixel set to
track the edge contour 44. This set 48 is shown in Fig. 5A, and includes
four pixels 50, 52, 54, and 56. The pixel set 48 is characterized by having an origin 58, here residing at the intersection of the left wall and the mid-
line of the pixel set 48. The origin 58 is used as a reference point for
specifying changes in position of the pixel set 48. The pixel set 48 also
has three first difference positions, indicated by the three Xs 60, 62, and
64. Note that the three Xs are located along an imaginary vertical position
axis of the pixel set 48, such as the vertical midline 66. Of course,
consistent use of the left or right wall of the pixel set as the vertical
position axis would work also. In the example of Fig. 6, the position of
the edge contour point for each position of the pixel set 48 will be
somewhere along the midline 66, which midline is not shown explicitly in
Fig. 6.
In Fig. 6, each position of the edge contour point along the midline
66, for each position of the pixel set 48 along the boundary 44, is indicated
generally by a circle 68, and particularly by the center of the circle 68.
The position of an edge contour point can be anywhere along the midline
66, as is the case when interpolation or extrapolation techniques are
employed. Alternatively, the position of an edge contour point can be
restricted to one of the intersection points 60, 62, and 64. This can be
accomplished, for example, by reporting the maximum of the three first
difference values, and performing no extrapolation or interpolation, or by finding and reporting the nearest intersection point 60, 62, 64 to the found
interpolated or extrapolated point.
In the case where no edge contour point is found along the midline
66, i.e., the magnitude of the one-dimensional edge detection signal does
not exceed a noise-filtering threshold, then one or more of the other pixel
sets, such as 5B - 5M, is used until an edge contour point is found, or until
no such signal is found. In the event that no edge contour point is found,
the edge contour is terminated. The use of other pixel sets, such as 5B -
5M, will be explained further below.
In this example, the position of each edge contour point is
determined by a calculation of the first difference between each
neighboring pair of pixels in the pixel set 48. As stated previously, other
methods can be used that take the pixels of a pixel set as input, and
provide the position of an edge contour point along the vertical position
axis of the pixel set. Figs. 5C, 5D, 5M, and 5M' illustrate that there are
useful pixel sets that are arranged in configurations other than a vertical
column. Thus, the position of an edge contour point is most generally
expressed as the longitudinal position of the edge contour point along the
longitudinal position axis of a pixel set. The invention provides a series of edge contour points that together
indicate the position of an edge contour in the image under analysis. The
edge contour can correspond to a boundary between regions in the
underlying scene. For example, a dark region can represent a
semiconductor wafer, a bright region, a wafer handling stage, and the edge
contour can represent the edge of the wafer. A curve can be fit to the
series of boundary points that can be used to find interpolated positions of
the edge contour between contour points. For example, a circle can be fit
to the edge contour points found in the analysis of the image of a back-lit
semiconductor wafer. Once the closest-fitting circle is found, the position
of the center of that circle can be used as the position of the center of the
wafer.
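As an illustration of the wafer example, a circle can be fit to the found contour points by algebraic least squares; the fit below is one common choice, not a method prescribed by the patent:

```python
import numpy as np

def fit_circle(points: np.ndarray):
    """Least-squares circle fit to an Nx2 array of edge contour points.
    Solves x^2 + y^2 + D*x + E*y + F = 0 for D, E, F."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    center = (-D / 2.0, -E / 2.0)                  # e.g., the wafer center
    radius = np.sqrt(center[0] ** 2 + center[1] ** 2 - F)
    return center, radius

# Example: contour points on a circle of radius 50 centered at (60, 40).
t = np.linspace(0.0, 2.0 * np.pi, 36)
pts = np.column_stack([60 + 50 * np.cos(t), 40 + 50 * np.sin(t)])
print(fit_circle(pts))
```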
Referring to Fig. 7, to provide a series of edge contour points, the
position and direction of an initial edge contour point must first be found
(70). Here, reference numbers within parentheses indicate method steps.
This can be done by a variety of methods, for example, by simple visual
inspection by an operator, or by use of an edge-finding tool, such as the
CALIPER TOOL, sold by Cognex Corporation, Natick MA. Based on the
found position and direction of the initial edge contour point, the invention
selects a plurality of pixels along at least one imaginary line in the image, e.g., 66, where each imaginary line is not parallel to the direction of the
initial edge contour point. In a preferred embodiment, each imaginary
line is preferably non-parallel with respect to each other imaginary line
selected to determine the position and direction of the next contour edge
point.
Referring to Figs. 5A - 5M', a variety of pixel sets are shown.
Each pixel set is shown with a direction arrow 72. The direction arrow 72
indicates the direction of the current edge contour point after which the
pixel set can primarily be used. The pixel set chosen based on the
direction arrow 72 of the current edge contour point is then used to find
the position and direction of the next edge contour point. For example, the
pixel set of Figs. 5A or 5A' can be used to find the sequence of edge
contour points 68 that together indicate the position of the edge contour 44
shown in Fig. 6. Note that the pixel set in Figs. 5A and 5A' can also be
used for both a northeast (NE) and a southeast (SE) directed current edge
contour point, where the use of the pixel set in Fig. 5A is specifically
illustrated in Fig. 6.
Each arrow 72 in Fig. 6 indicates the direction of the current edge
contour point, and each circle 68 indicates the position of the current edge
contour point, both of which are used to determine the vertical position of the next pixel set of Fig. 5A. Once selected, the pixel set in its new
vertical position is used to find the position and direction of the next edge
contour point. In general, the direction and position of the current edge
contour point are used to determine the next pixel set, e.g., one selected
from among those shown in Figs. 5A - 5M', and the position of the selected
pixel set. The position of a pixel set is defined as the position of its origin
58.
For edge contours that potentially include large changes in edge
contour angle over a small number of pixels, use of the pixel sets having
pixels along more than one imaginary line, such as the pixel sets shown in
Figs. 5E - 5L, can advantageously be used.
The pixel sets shown in Figs. 5E - 5L each include two subsets of
pixels, each subset disposed along an imaginary line 74 that is non-parallel
with respect to the direction 72 of the present edge contour point.
Preferably, each imaginary line 74 is parallel with respect to a principal
axis or an axis of symmetry of the pixel grid of the image.
To find the position and direction of the next edge contour point
using all of the pixels of a pixel set as shown in Figs. 5E - 5L, the position
and magnitude of a one-dimensional edge is found along each of the
imaginary lines 74, 74'. In a preferred embodiment, each magnitude of a one-dimensional edge is compared with a noise-filter threshold, the
threshold being set so as to reduce the likelihood that a spurious one-
dimensional edge (one that is due to noise and not related to the underlying
scene that the image represents) is considered.
In a preferred embodiment, to determine the direction of the current
edge contour point, the magnitude of a one-dimensional edge found along a
first imaginary line 74 is compared with the magnitude of a one-
dimensional edge found along a second imaginary line 74'. The position
of the one-dimensional edge of the greatest magnitude is then used to
determine the direction of the current edge contour point. For example, if
the magnitude of the one-dimensional edge found along the horizontal line
74' is found to be the largest, then the direction of the current edge
contour point is determined by ascertaining which of the edge positions 76,
76', 76" (indicated by an 'x') along the horizontal imaginary line 74' is
nearest. If the left-most 'x' is the nearest, then the direction of the current
edge contour point is NW (See Fig.3), and the position of the current edge
contour point moves North one pixel, and West one pixel. If the right-
most 'x' is the nearest, then the direction of the current edge contour point
is NE, and the position of the current edge contour point moves North one
pixel, and East one pixel. Lastly, if the center 'x' is the nearest, then the direction of the current edge contour point is N, and the position of the
current edge contour point moves North one pixel.
Similarly, if the magnitude of the one-dimensional edge found along
the vertical line 74 is found to be the largest, then the direction of the
current edge contour point is determined by ascertaining which of the edge
positions 78, 78', 78"(indicated by an 'x') along the vertical imaginary line
74 is nearest. If the top-most 'x' 78 is the nearest, then the direction of
the current edge contour point is NE (See Fig.3), and the position of the
current edge contour point moves North one pixel, and East one pixel. If
the bottom-most 'x' 78" is the nearest, then the direction of the current
edge contour point is SE, and the position of the current edge contour point
moves South one pixel, and East one pixel. Lastly, if the center 'x' 78' is
the nearest, then the direction of the current edge contour point is E, and
the position of the current edge contour point moves East one pixel.
In an alternate embodiment, to find the position and direction of the
next edge contour point using all of the pixels of a pixel set as shown in
Figs.5E - 5L, the one-dimensional position and magnitude of a one-
dimensional edge is found along each of the imaginary lines 74, 74'. A
position and direction of a proposed next edge contour point is computed according to the method of the previous paragraph for each of the
imaginary lines, given the one-dimensional position and magnitude of the
one-dimensional edge found along each of the imaginary lines 74, 74'. In
the case where the two proposed edge contour points differ in both
direction and in the magnitude of their one-dimensional edges, and one of the
directions of the two edge contour points is the same as the direction of the
previous edge contour point, the following edge contour point selection
rule is operative: the proposed edge contour point having a direction that
is different from the direction of the previous edge contour point is
selected to be the current edge contour point only if the magnitude of its
one-dimensional edge exceeds, by a minimum amount, the magnitude of
the one-dimensional edge of the proposed edge contour point having a
direction that is the same as the direction of the previous edge contour
point. Note that if the minimum amount is zero, then the method
described in this paragraph is the same as the method of the previous
paragraph.
In a further alternate embodiment, for use with each of the
embodiments of the previous two paragraphs, and only in the case where
the two proposed edge contour points are of the same direction, the (2-D)
position of the edge contour point is selected as follows: select a position that is intermediate between the positions of each of the proposed edge contour
points, the selected position being the weighted average of the positions of
each of the proposed edge contour points, the positions being weighted in
accordance with the magnitude of each respective one-dimensional edge.
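A short sketch of that weighted average, with hypothetical names for the two proposed contour points and their edge magnitudes:

```python
def blended_position(p1, m1, p2, m2):
    """Weighted average of two proposed contour-point positions (x, y),
    each weighted by the magnitude of its one-dimensional edge."""
    w = float(m1 + m2)
    return ((p1[0] * m1 + p2[0] * m2) / w,
            (p1[1] * m1 + p2[1] * m2) / w)

# Example: two proposals of the same direction, the first edge twice as strong.
print(blended_position((10.0, 5.0), 2.0, (11.0, 5.0), 1.0))  # -> (~10.33, 5.0)
```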
In each of the previous descriptions of how to find the position and
direction of the next edge contour point using all of the pixels of a pixel set
as shown in Figs.5E - 5L, the position and magnitude of a one-
dimensional edge can be found along each of the imaginary lines 74, 74'
using one of the following four methods: interpolating/extrapolating,
interpolating/not-extrapolating, not-interpolating/extrapolating, and not-
interpolating/not-extrapolating. In applications such as finding the back-lit
outline of a semiconductor wafer, interpolating/not-extrapolating is the
preferred method.
With reference to Fig. 7, in general, the method of the invention for
finding an edge contour in an image, the edge contour including a sequence
of edge contour points, proceeds as follows. First, the two-dimensional (2-
D) position and direction of an initial edge contour point is determined (70)
using an edge detector, such as the Caliper Tool, sold by Cognex
Corporation, Natick MA. Next, using the 2-D position and direction of the initial edge contour point, a plurality of pixels are selected (80) along
at least one imaginary line that is in non-parallel relationship with the
direction of the initial edge contour point, or in subsequent steps, the
current edge contour point. In a preferred embodiment, the imaginary line
is parallel to a coordinate axis of the image.
Then, for each imaginary line, a one-dimensional edge position is
found (82), using a technique such as finding the one-dimensional position
of the peak of the first difference, or finding the zero-crossing of the
second difference, of the gray value signal taken along each imaginary
line. Next, using at least one one-dimensional edge position, the 2-D
position and direction of the new edge contour boundary point are
determined (84). With the 2-D position and direction of the new edge
contour point, another plurality of pixels is then selected (80), and the
process continues iteratively until the end of the edge contour is reached
(86).
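Gathering the numbered steps into one loop, the method of Fig. 7 can be sketched as follows; the helper functions stand in for steps (70), (80), (82), and (84) and are assumptions, not an actual Cognex interface:

```python
def track_edge_contour(image, find_initial_point, select_pixels,
                       find_1d_edge, update_point, max_points=10000):
    """Iteratively track an edge contour per Fig. 7: find an initial point
    (70), select pixels along imaginary lines (80), find 1-D edge positions
    (82), update the 2-D position and direction (84), and stop when no
    further edge is found (86)."""
    position, direction = find_initial_point(image)           # step (70)
    contour = [position]
    for _ in range(max_points):
        lines = select_pixels(image, position, direction)     # step (80)
        edges = [find_1d_edge(line) for line in lines]        # step (82)
        if all(e is None for e in edges):
            break                            # step (86): end of the contour
        position, direction = update_point(position, direction, edges)  # (84)
        contour.append(position)
    return contour
```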
To determine when the end of the edge contour is reached, for each
imaginary line, a one-dimensional edge position is sought (82). If no one-
dimensional edge is found, then the direction of the current edge contour
point is changed (88), and another plurality of pixels is selected along one
or more different imaginary lines (80). For example, if the direction of the current edge contour point is East (E), and if no one-dimensional edge is
found along the corresponding vertical imaginary line, then the direction of
the current edge contour point is changed to North (N). If no one-
dimensional edge is found along the corresponding horizontal imaginary
line, then the direction of the current edge contour point can be changed to
either South (S) or West (W). Alternatively, the direction can be changed
by smaller increments, such as in the sequence NE, N, NW, W, etc., or
NE, SE, N, S, etc., until either a one-dimensional edge is found, or until
all allowed directions are explored, demonstrating that no one-dimensional
edge was found in any direction.
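The incremental search over directions can be expressed as a simple rotation through the compass; this ordering matches the first example sequence above and is only one of the variants described (a sketch, not the patent's required order):

```python
COMPASS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def alternate_directions(current: str):
    """Yield alternative directions in small increments relative to the
    current one (for "E": NE, N, NW, W, ...) until all are explored."""
    i = COMPASS.index(current)
    for step in range(1, len(COMPASS)):
        yield COMPASS[(i + step) % len(COMPASS)]

print(list(alternate_directions("E")))  # ['NE', 'N', 'NW', 'W', 'SW', 'S', 'SE']
```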
In yet another alternate embodiment, pixels along three imaginary
lines can be searched for edge peaks, e.g., imaginary lines perpendicular to
the NW, N, and NE directions. If no edge peak is found, another three
imaginary lines can be searched, such as lines perpendicular to the SW, W,
and NW directions.
In a further alternate embodiment, if no one-dimensional peak is
found along an imaginary line that traverses N pixels, where N is
preferably equal to four, then one or more pixels can be added to the sub-
set of pixels disposed along that imaginary line, and the one-dimensional
peak can be sought again. If a peak is not found after one or more pixels have been added, the direction of the imaginary line can be changed,
as described above.
In a preferred embodiment of the invention, the step of finding a
one-dimensional edge position includes the step of locating the peak of a
first difference signal, and in a further preferred embodiment, includes the
step of using the first difference signal to find an interpolated peak.
Preferably, the first difference signal includes at least three data points.
It should be appreciated that all of the above-described method steps
can advantageously be executed on a general purpose computer, and can
also be advantageously instantiated as specialized hardware, such as a gate
array, an application specific integrated circuit, or as a custom integrated
circuit chip. The invention also includes the method steps explained
herein expressed as computer-readable instructions, and stored on a
magnetic medium, such as a magnetic disk or tape, to provide an article of
manufacture that embodies the invention.
Other modifications and implementations will occur to those skilled
in the art without departing from the spirit and the scope of the invention
as claimed. Accordingly, the above description is not intended to limit the
invention except as indicated in the following claims.

Claims

CLAIMS
What is claimed is:
1. A method for finding an edge contour in an image, the edge contour
including a sequence of contour points, the method comprising the steps of:
determining the position of a contour point by locating the peak of a
first difference signal of a gray value signal taken across the edge contour;
advancing to a next contour point based on each successive peak so-
located; and
returning to the step of determining the position of a contour point.
2. The method of claim 1, wherein the step of locating the peak of a first
difference signal includes the step of:
using said first difference signal to find an interpolated peak.
3. The method of claim 1, wherein the step of locating the peak of a first
difference signal of a gray value signal taken across the edge contour is
taken along a line parallel to a coordinate axis of the image.
4. The method of claim 1, wherein said first difference signal includes at
least three data points.
5. The method of claim 1, wherein said first difference signal of a gray
value signal is taken across the edge contour in a first direction, the
method further including the step of:
locating the peak of a second first difference signal of a second gray
value signal taken across the edge contour in a second direction.
6. A method for finding an edge contour, the method comprising the steps
of:
determining the position and direction of an initial current boundary
point;
selecting a plurality of pixels along a first line that is non-parallel to
the direction of said current boundary point;
computing a first difference signal along said plurality of pixels;
finding the position of a peak of said first difference signal;
using said position of said peak to determine a position and direction
of a new current boundary point; and
returning to the step of selecting a plurality of pixels.
7. The method of claim 6, further including the step of:
selecting a plurality of pixels along a second line that is non-parallel
to the direction of said current boundary point, and that is non-parallel to
the direction of said first line.
8. The method of claim 6, wherein said step of finding the position of a
peak of said first difference signal includes the steps of:
fitting a curve to a plurality of data points of said first difference
signal; and
finding the position of a maximum of said curve.
9. The method of claim 6, further including the step of:
selecting a plurality of pixels along a third line that is non-parallel to
the direction of said current boundary point, that is non-parallel to the direction of said first line, and that is non-parallel to the direction of
said second line.
10. A method for finding a boundary in an image of an object and a
background, the method comprising the steps of:
progressively forming an edge contour by finding a sequence of
interpolated maxima of a first difference taken across the boundary, each
first difference being taken by processing a set of pixels only one pixel
wide.
11. A method for finding an edge contour in an image, the edge contour
including a sequence of contour points, the method comprising the steps of:
determining the position of a contour point including the step of
locating the position of an edge in a one-dimensional gray value signal
taken along a first line across the edge contour;
advancing to a next contour point based on each successive edge so-
located; and
returning to the step of determining the position of a contour point.
12. The method of claim 11, wherein the step of locating the position of
an edge in a one-dimensional gray value signal includes the step of:
finding an interpolated edge peak.
13. The method of claim 11, wherein said line across the edge contour is
a line parallel to a coordinate axis of the image.
14. The method of claim 11, wherein said step of determining the position
of a contour point further includes the step of:
locating the position of an edge in a one-dimensional gray value
signal taken along a second line across the edge contour.
15. The method of claim 14, wherein said second line is perpendicular to
said first line.
16. A method for finding an edge contour in an image, the edge contour
including a plurality of contour points, the method comprising the steps of:
determining the position and direction of an initial current contour
point;
selecting a plurality of pixels along a first line through said current
contour point that is non-parallel to the direction of said current contour
point;
computing an edge enhancement signal along said plurality of pixels;
finding the position of a peak of said edge enhancement signal;
using said position of said peak to determine a position and direction
of a new current contour point; and
returning to the step of selecting a plurality of pixels.
17. The method of claim 16, further including the step of:
selecting a plurality of pixels along a second line through said
current contour point that is non-parallel to the direction of said current
contour point, and that is non-parallel to the direction of said first line.
18. The method of claim 16, wherein said step of finding the position of a
peak of said edge enhancement signal includes the steps of:
fitting a curve to a plurality of data points of said edge enhancement
signal; and
finding the position of a maximum of said curve.
19. The method of claim 16, further including the step of:
selecting a plurality of pixels along a third line through said current
contour point that is non-parallel to the direction of said current contour
point, that is non-parallel to the direction of said first line, and that
is non-parallel to the direction of said second line.
20. A method for finding a boundary in an image of an object and a
background, the method comprising the steps of:
progressively forming an edge contour by finding a sequence of
interpolated maxima of an edge signal taken across the boundary, said edge
signal being taken by processing a set of pixels arranged along a line
across the boundary.
21. An apparatus for finding an edge contour in an image, the edge
contour including a plurality of contour points, the apparatus comprising:
initial determining means for determining the position and direction
of an initial current contour point;
first selecting means, connected to said determining means, for
selecting a plurality of pixels along a first line through said current contour
point that is non-parallel to the direction of said current contour point;
computing means, connected to said first selecting means, for
computing an edge enhancement signal along said plurality of pixels;
finding means, connected to said computing means, for finding the
position of a peak of said edge enhancement signal; and
new determining means, connected to said finding means and to said
first selecting means, for using said position of said peak to determine a
position and direction of a new current contour point.
22. The apparatus of claim 21, further comprising:
second selecting means, connected to said first selecting means, for
selecting a plurality of pixels along a second line through said current
contour point that is non-parallel to the direction of said current contour
point, and that is non-parallel to the direction of said first line.
23. The apparatus of claim 21, wherein said finding means for finding the
position of a peak of said edge enhancement signal includes:
fitting means for fitting a curve to a plurality of data points of said
edge enhancement signal; and
finding means, connected to said fitting means, for finding the
position of a maximum of said curve.
24. The apparatus of claim 21, further comprising:
third selecting means, connected to said first selecting means, for
selecting a plurality of pixels along a third line through said current
contour point that is non-parallel to the direction of said current contour
point, that is non-parallel to the direction of said first line, and that
is non-parallel to the direction of said second line.
PCT/US1996/012954 1995-12-06 1996-08-09 Edge peak boundary tracker WO1997021189A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP52123797A JP2001519934A (en) 1995-12-06 1996-08-09 Edge / peak boundary tracker

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/567,946 US5987172A (en) 1995-12-06 1995-12-06 Edge peak contour tracker
US08/567,946 1995-12-06

Publications (1)

Publication Number Publication Date
WO1997021189A1 true WO1997021189A1 (en) 1997-06-12

Family

ID=24269283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1996/012954 WO1997021189A1 (en) 1995-12-06 1996-08-09 Edge peak boundary tracker

Country Status (3)

Country Link
US (2) US5987172A (en)
JP (1) JP2001519934A (en)
WO (1) WO1997021189A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000022655A1 (en) * 1998-10-15 2000-04-20 Applied Materials, Inc. Detection of wafer fragments in a wafer processing apparatus
WO2000034918A1 (en) * 1998-12-11 2000-06-15 Synapix, Inc. Interactive edge detection markup process
GB2358702A (en) * 2000-01-26 2001-08-01 Robotic Technology Systems Plc Providing information of objects by imaging
US8867847B2 (en) 1998-07-13 2014-10-21 Cognex Technology And Investment Corporation Method for fast, robust, multi-dimensional pattern recognition
US9147252B2 (en) 2003-07-22 2015-09-29 Cognex Technology And Investment Llc Method for partitioning a pattern into optimized sub-patterns

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE38559E1 (en) 1984-12-20 2004-07-27 Orbotech Ltd Automatic visual inspection system
US6259827B1 (en) 1996-03-21 2001-07-10 Cognex Corporation Machine vision methods for enhancing the contrast between an object and its background using multiple on-axis images
US6075881A (en) 1997-03-18 2000-06-13 Cognex Corporation Machine vision methods for identifying collinear sets of points from an image
US6608647B1 (en) 1997-06-24 2003-08-19 Cognex Corporation Methods and apparatus for charge coupled device image acquisition with independent integration and readout
US6381375B1 (en) 1998-02-20 2002-04-30 Cognex Corporation Methods and apparatus for generating a projection of an image
US6639593B1 (en) * 1998-07-31 2003-10-28 Adobe Systems, Incorporated Converting bitmap objects to polygons
US6381366B1 (en) 1998-12-18 2002-04-30 Cognex Corporation Machine vision methods and system for boundary point-based comparison of patterns and images
US6687402B1 (en) 1998-12-18 2004-02-03 Cognex Corporation Machine vision methods and systems for boundary feature comparison of patterns and images
US6636633B2 (en) * 1999-05-03 2003-10-21 Intel Corporation Rendering of photorealistic computer graphics images
US6912304B1 (en) * 1999-08-02 2005-06-28 Applied Materials, Inc. Two-dimensional scatter plot technique for defect inspection
US6684402B1 (en) 1999-12-01 2004-01-27 Cognex Technology And Investment Corporation Control methods and apparatus for coupling multiple image acquisition devices to a digital data processor
US6748104B1 (en) 2000-03-24 2004-06-08 Cognex Corporation Methods and apparatus for machine vision inspection using single and multiple templates or patterns
US6459807B1 (en) * 2000-06-13 2002-10-01 Semiconductor Technologies & Instruments, Inc. System and method for locating irregular edges in image data
DE10047211B4 (en) * 2000-09-23 2007-03-22 Leica Microsystems Semiconductor Gmbh Method and device for determining the position of an edge of a structural element on a substrate
US7006694B1 (en) * 2000-10-05 2006-02-28 Coreco Imaging, Inc. System and method for pattern identification
US8682077B1 (en) 2000-11-28 2014-03-25 Hand Held Products, Inc. Method for omnidirectional processing of 2D images including recognizable characters
US6882958B2 (en) * 2001-06-28 2005-04-19 National Instruments Corporation System and method for curve fitting using randomized techniques
US20060225484A1 (en) * 2001-10-09 2006-10-12 Gleman Stuart M Bolt tension gauging system
US7120286B2 (en) * 2001-11-21 2006-10-10 Mitutoyo Corporation Method and apparatus for three dimensional edge tracing with Z height adjustment
FR2849244B1 (en) * 2002-12-20 2006-03-10 Sagem METHOD FOR DETERMINING THE LIVING CHARACTER OF A CARRIER COMPONENT OF A DIGITAL IMPRINT
US7190834B2 (en) 2003-07-22 2007-03-13 Cognex Technology And Investment Corporation Methods for finding and characterizing a deformed pattern in an image
CN1266950C (en) * 2003-11-10 2006-07-26 华亚微电子(上海)有限公司 System and method for reinforcing video image quality
CN1263313C (en) * 2003-11-10 2006-07-05 华亚微电子(上海)有限公司 System and method for promoting marginal definition of video image
TW200528770A (en) * 2004-02-23 2005-09-01 Altek Corp Method and apparatus for determining edge inclination of an interested pixel in a color filter image array interpolation (CFAI)
EP1782622A1 (en) * 2004-06-23 2007-05-09 Koninklijke Philips Electronics N.V. Pixel interpolation
US7738705B2 (en) * 2004-06-30 2010-06-15 Stefano Casadei Hierarchical method and system for pattern recognition and edge detection
US8437502B1 (en) 2004-09-25 2013-05-07 Cognex Technology And Investment Corporation General pose refinement and tracking tool
WO2006085861A1 (en) * 2005-02-07 2006-08-17 Tankesmedjan Inc Method for recognizing and indexing digital media
US9769354B2 (en) 2005-03-24 2017-09-19 Kofax, Inc. Systems and methods of processing scanned data
US9137417B2 (en) 2005-03-24 2015-09-15 Kofax, Inc. Systems and methods for processing video data
US7416125B2 (en) * 2005-03-24 2008-08-26 Hand Held Products, Inc. Synthesis decoding and methods of use thereof
CN100469103C (en) * 2005-07-01 2009-03-11 鸿富锦精密工业(深圳)有限公司 Image noise filtering system and method
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
US8162584B2 (en) 2006-08-23 2012-04-24 Cognex Corporation Method and apparatus for semiconductor wafer alignment
US8073253B2 (en) * 2006-09-29 2011-12-06 General Electric Company Machine learning based triple region segmentation framework using level set on PACS
JP4818053B2 (en) * 2006-10-10 2011-11-16 株式会社東芝 High resolution device and method
US20080144935A1 (en) * 2006-11-30 2008-06-19 Chav Ramnada Method for segmentation of digital images
US8306261B2 (en) * 2008-06-13 2012-11-06 International Business Machines Corporation Detection of an object in an image
US9767354B2 (en) 2009-02-10 2017-09-19 Kofax, Inc. Global geographic information retrieval, validation, and normalization
US9349046B2 (en) 2009-02-10 2016-05-24 Kofax, Inc. Smart optical input/output (I/O) extension for context-dependent workflows
US8958605B2 (en) 2009-02-10 2015-02-17 Kofax, Inc. Systems, methods and computer program products for determining document validity
US9576272B2 (en) 2009-02-10 2017-02-21 Kofax, Inc. Systems, methods and computer program products for determining document validity
US8774516B2 (en) 2009-02-10 2014-07-08 Kofax, Inc. Systems, methods and computer program products for determining document validity
US8675957B2 (en) 2010-11-18 2014-03-18 Ebay, Inc. Image quality assessment to merchandise an item
US8526709B2 (en) * 2011-01-13 2013-09-03 Lam Research Corporation Methods and apparatus for detecting multiple objects
US8538077B2 (en) 2011-05-03 2013-09-17 Microsoft Corporation Detecting an interest point in an image using edges
JP5959168B2 (en) * 2011-08-31 2016-08-02 オリンパス株式会社 Image processing apparatus, operation method of image processing apparatus, and image processing program
TWI466063B (en) * 2011-09-29 2014-12-21 Altek Corp Processing method for image interpolation
TWI462576B (en) * 2011-11-25 2014-11-21 Novatek Microelectronics Corp Method and circuit for detecting edge of logo
US9058515B1 (en) 2012-01-12 2015-06-16 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US9483794B2 (en) 2012-01-12 2016-11-01 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US10146795B2 (en) 2012-01-12 2018-12-04 Kofax, Inc. Systems and methods for mobile image capture and processing
US9058580B1 (en) 2012-01-12 2015-06-16 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US8855375B2 (en) 2012-01-12 2014-10-07 Kofax, Inc. Systems and methods for mobile image capture and processing
CN103292725A (en) * 2012-02-29 2013-09-11 鸿富锦精密工业(深圳)有限公司 Special boundary measuring system and method
US9208536B2 (en) 2013-09-27 2015-12-08 Kofax, Inc. Systems and methods for three dimensional geometric reconstruction of captured image data
WO2014160426A1 (en) 2013-03-13 2014-10-02 Kofax, Inc. Classifying objects in digital images captured using mobile devices
US9355312B2 (en) 2013-03-13 2016-05-31 Kofax, Inc. Systems and methods for classifying objects in digital images captured using mobile devices
US9118863B2 (en) * 2013-04-18 2015-08-25 Xerox Corporation Method and apparatus for finer control in thin line growth
US20140316841A1 (en) 2013-04-23 2014-10-23 Kofax, Inc. Location-based workflows and services
DE202014011407U1 (en) 2013-05-03 2020-04-20 Kofax, Inc. Systems for recognizing and classifying objects in videos captured by mobile devices
US9679224B2 (en) 2013-06-28 2017-06-13 Cognex Corporation Semi-supervised method for training multiple pattern recognition and registration tool models
JP2016538783A (en) 2013-11-15 2016-12-08 コファックス, インコーポレイテッド System and method for generating a composite image of a long document using mobile video data
US9760788B2 (en) 2014-10-30 2017-09-12 Kofax, Inc. Mobile document detection and orientation based on reference object characteristics
US9501830B2 (en) * 2015-03-18 2016-11-22 Intel Corporation Blob detection in noisy images
US10242285B2 (en) 2015-07-20 2019-03-26 Kofax, Inc. Iterative recognition-guided thresholding and data extraction
US9779296B1 (en) 2016-04-01 2017-10-03 Kofax, Inc. Content-based detection and three dimensional geometric reconstruction of objects in image and video data
US10803350B2 (en) 2017-11-30 2020-10-13 Kofax, Inc. Object detection and image cropping using a multi-detector approach

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4183013A (en) * 1976-11-29 1980-01-08 Coulter Electronics, Inc. System for extracting shape features from an image
US4570180A (en) * 1982-05-28 1986-02-11 International Business Machines Corporation Method for automatic optical inspection
US5081689A (en) * 1989-03-27 1992-01-14 Hughes Aircraft Company Apparatus and method for extracting edges and lines
US5398292A (en) * 1992-04-22 1995-03-14 Honda Giken Kogyo Kabushiki Kaisha Edge detecting apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4972495A (en) * 1988-12-21 1990-11-20 General Electric Company Feature extraction processor

Also Published As

Publication number Publication date
JP2001519934A (en) 2001-10-23
US5987172A (en) 1999-11-16
US5943441A (en) 1999-08-24

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP KR SG

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref country code: JP

Ref document number: 1997 521237

Kind code of ref document: A

Format of ref document f/p: F

122 Ep: pct application non-entry in european phase