USRE41364E1 - Method for determining scan line misalignments - Google Patents

Method for determining scan line misalignments

Info

Publication number
USRE41364E1
USRE41364E1 (US Application No. 11/433,909)
Authority
US
United States
Prior art keywords
boundary
scan line
scanner
pixels
scan
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime, expires
Application number
US11/433,909
Inventor
Yu-Fen Tsai
Te-Chih Chang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Transpacific Optics LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/433,909
Assigned to TRANSPACIFIC OPTICS LLC. Assignors: MUSTEK SYSTEMS, INC.
Assigned to MUSTEK SYSTEMS, INC. Assignors: CHANG, TE-CHIH; TSAI, YU-FEN
Application granted
Publication of USRE41364E1
Adjusted expiration
Expired - Lifetime

Classifications

    • H04N 1/00002: Diagnosis, testing or measuring; detecting, analysing or monitoring not otherwise provided for
    • H04N 1/00018: Diagnosis, testing or measuring relating to particular apparatus or devices; scanning arrangements
    • H04N 1/00031: Methods therefor; testing, i.e. determining the result of a trial
    • H04N 1/00045: Methods therefor using a reference pattern designed for the purpose, e.g. a test chart
    • H04N 1/00053: Methods therefor out of service, i.e. outside of normal operation
    • H04N 1/00063: Methods therefor using at least a part of the apparatus itself, e.g. self-testing
    • H04N 1/1017: Scanning arrangements using flat picture-bearing surfaces, with sub-scanning by translatory movement of the main-scanning components, the components remaining positionally invariant with respect to one another in the sub-scanning direction
    • H04N 1/193: Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using electrically scanned linear arrays, e.g. linear CCD arrays
    • H04N 2201/04703: Detection of scanning velocity or position using the scanning elements as detectors, e.g. by performing a prescan
    • H04N 2201/04717: Detection of scanning velocity or position by detecting marks or the like on the scanned sheet, e.g. a reference sheet
    • H04N 2201/04749: Detecting position relative to a gradient, e.g. using triangular-shaped masks, marks or gratings


Abstract

A test image has a black bias on a white background. The black bias is a line set at about 45 degrees to the scan lines of a scanner. Boundary points of the scanned bias are found. A regression line is calculated from the positions of the boundary points. Differences in the positions of adjacent boundary points, together with the slope reciprocal of the regression line, are used to determine error values. The error values are compared with a gate value to determine if there are any occurrences of scan line misalignment.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention provides a method for determining if an image from a scanner has occurrences of scan line misalignments. More particularly, a software method enabling a program to determine if an image from a scanner has occurrences of scan line misalignments is disclosed.
2. Description of the Prior Art
Scanners are popular computer peripherals that are used to digitize documents or pictures so that they may be stored on a computer. To ensure a high quality of these scanned images, manufacturers strive to increase the resolution of the images, and to make their colors more brilliant. But a key factor affecting the quality of scanned images is the stability of the scanning module. If the stability of the scanning module is insufficient, the images from a scanner may have misalignments or entire deletions of scan lines in the image.
Please refer to FIG. 1. FIG. 1 is a schematic diagram of a prior art method to determine if there are any occurrences of scan line deletions in an image 10. The test is performed after the scanner has been completely manufactured, and involves the scanning of the test image 10. Testing personnel then analyze the results. As shown in FIG. 1, if a missed scan line 12 is observed by the testing personnel, the scanner is returned to the factory for adjustments. This visual method to determine if a scanner misses scan lines is both time consuming and relatively inaccurate.
Please refer to FIG. 2. FIG. 2 is a schematic diagram of another prior art method that uses a scanned test image 20 to determine if a scanner misses scan lines. The test image 20 is produced by the scanner to be tested by scanning a test picture. Each grid element, such as 24 or 26 in FIG. 2, represents a gray-scale value of a scanned test image pixel from the scanner after scanning the test picture, which has an axis of symmetry at 45 degrees. The range of the gray-scale values is from 0 to 255. The smaller the gray-scale values are, the darker the corresponding image pixels are. The larger the gray-scale values are, the brighter the corresponding image pixels are. The region 28 represents an image region after scanning the test picture, and its pixel values are almost all less than 30.
In this prior art, a search is performed within the scanned test image 20 to find the positions of the boundary points of the test image 20, and then pixel values within the boundary points are tested against diagonally adjacent pixel values. For example, I(X, Y) represents the pixel value of the test image 20 at the Xth column and the Yth line. A simple program is used to compare the pixel value I(X, Y) of a point with the pixel value I(X−1, Y+1) of the diagonally adjacent point. If the difference between I(X, Y) and I(X−1, Y+1) is too large, then it is assumed that a scan line 22 is missing between the lines (Y) and (Y+1).
Hence, the prior art compares two adjacent lines and determines if the scanned test image 20 conforms to the expected 45 degree symmetry of the test picture. The minimum unit required to determine if a scan line has been skipped is one pixel. This is not accurate enough to satisfy the requirements of a high-end scanner.
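For illustration only, a minimal sketch of this prior-art check is shown below; the array layout (row Y, column X), the difference threshold, and the function name are assumptions made for the sketch rather than details taken from the prior art itself.

    import numpy as np

    def prior_art_missing_line_check(image, max_diff=60):
        # Flag a possible missing scan line after row Y whenever the diagonally
        # adjacent pixel (X-1, Y+1) differs too much from (X, Y), breaking the
        # expected 45-degree symmetry of the scanned test picture.
        flagged = []
        height, width = image.shape
        for y in range(height - 1):
            for x in range(1, width):
                if abs(int(image[y, x]) - int(image[y + 1, x - 1])) > max_diff:
                    flagged.append(y)
                    break  # one large difference is enough to suspect line Y
        return flagged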
SUMMARY OF THE INVENTION
It is therefore an objective of the present invention to provide a method for determining scan line misalignments of a scanner.
Briefly, the present invention scans a test image that has a black bias on a white background. The black bias is a line set at about 45 degrees to the scan lines of the scanner. The method involves finding boundary points of the scanned bias, calculating a regression line from the positions of the boundary points, using differences in the position of adjacent boundary points together with the slope reciprocal of the regression line to determine error values, and comparing the error values with a gate value to determine if there are any occurrences of scan line misalignment.
It is an advantage that the present invention can detect scan line misalignments with sub-pixel accuracy, thus fulfilling the more rigid requirements for high-level scanners.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment, which is illustrated in the various figures and drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a first prior art method for determining if an image from a scanner has any occurrences of scan line misalignments.
FIG. 2 is a schematic diagram of a second prior art method for determining if an image from a scanner has any occurrences of scan line misalignments.
FIG. 3 is a schematic diagram of a scanner with the present invention method.
FIG. 4 is a schematic diagram of a document with a test pattern of the present invention method.
FIG. 5 is a relationship diagram between a plurality of pixels of a scanning line and their corresponding gray-scale image values used to illustrate the present invention method of searching for the boundary points from the image information in a scan line.
FIG. 6 is a schematic diagram for a method to calculate the boundary points depicted in FIG. 5.
FIG. 7 is a flow chart of the present invention method for determining if an image from a scanner has any occurrences of scan line misalignments.
FIG. 8 is a flow chart of the present invention method to search for boundary points from the image information in a scan line.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Please refer to FIG. 3 and FIG. 4. FIG. 3 is a schematic diagram of a scanner 30 with the present invention method, and FIG. 4 is a schematic diagram of a document 36 for performing the present invention method. The scanner 30 includes a housing 32, a scanning platform 34 onto which is placed the document 36, a scanning module 38 to scan the document 36, and a driving module 40 to drive the scanning module 38. For a preferred embodiment of the present invention, the document 36 has a white background with a test pattern that is a black bias 37. The width of the bias 37 is 2 mm, and it has an angle of 45 degrees with respect to the orientation of the scan lines. The black bias 37 runs from the upper-right portion of the document 36 to the lower-left portion of the document 36.
To determine if there are any occurrences of scan line misalignment, the scanning module 38 is used to scan the document 36. Scan line image information for a plurality of scan lines is collected in order, each scan line containing a portion of the scanned image of the document 36. The image information in each scan line includes a plurality of gray-scale pixels, a portion of which correspond to the bias 37.
A searching method is used to find a boundary point of the bias 37 from the gray-scale image information in each scan line. Because the bias 37 on the document 36 includes two boundary lines, the image information in each scan line will have two boundary points. For convenience, the positions of the boundary points of the left boundary line will be described. The positions of the boundary points on the right side of the bias 37 are found in the same manner. This method is actually quite well known in the field of image processing.
Please refer to FIG. 5. FIG. 5 is a relationship diagram between the plurality of pixels of a scan line and their corresponding gray-scale image values, and is used to illustrate the present invention method for searching for the boundary points in the image information of a scan line. First, the gray-scale image values of a plurality of pixels in a chosen white region 42 of the scan line are averaged. Half of this result is used to define a boundary reference level (VP1). The pixel whose gray-scale image value most closely matches this boundary reference level is selected as a first boundary reference point P1.
Moving forward from the first boundary reference point P1 by a first predetermined number (say, 15) of pixels, a second predetermined number of pixels (again, 15) are selected. The average of the gray-scale values of these chosen pixels is used to define a white reference level (VW). Similarly, by moving backward from the first boundary reference point P1 by the first predetermined number of pixels (15), another group of the second predetermined number of pixels (15) is selected, and their average value is used as a black reference level (VB).
Because the first boundary reference point P1 is located in an interim region 46 between the white region 42 and the black region 44, moving forward or backward by a predetermined number of pixels from the first boundary point P1 ensures that the chosen pixels lie outside the interim region 46, within the white region 42 or the black region 44. This makes the calculation of the white and the black reference levels more accurate.
The average of the white reference level (VW) and the black reference level (VB) is used to define a boundary level (V0). Two adjacent pixels P2 and P3 are then chosen such that the boundary level V0 falls between their gray-scale image values. That is, the gray-scale image value of the boundary level V0 lies between the gray-scale image values of the pixels P2 and P3, satisfying the inequality VP3≦V0≦VP2. These two pixels are used as a second and a third boundary reference point P2 and P3.
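The reference-level search described above can be sketched as follows. This is a non-authoritative illustration in Python: the helper name, the slice passed in as the white region, and the assumed orientation (white region at lower pixel indices, black bias at higher indices, so that VP3≦V0≦VP2 holds) are assumptions, and the offsets default to the 15 pixels mentioned in the text.

    import numpy as np

    def find_reference_points(line, white_region, offset=15, count=15):
        # Locate P1, the white/black reference levels VW and VB, the boundary
        # level V0, and the adjacent pixels P2 and P3 that straddle V0.
        # Edge handling near the ends of the scan line is omitted for brevity.
        values = line.astype(float)
        vp1 = values[white_region].mean() / 2.0                 # boundary reference level VP1
        p1 = int(np.argmin(np.abs(values - vp1)))               # first boundary reference point P1
        vw = values[p1 - offset - count: p1 - offset].mean()    # white reference level VW
        vb = values[p1 + offset: p1 + offset + count].mean()    # black reference level VB
        v0 = (vw + vb) / 2.0                                    # boundary level V0
        for p2 in range(p1 - offset, p1 + offset):              # P2, P3 straddle V0 near P1
            if values[p2 + 1] <= v0 <= values[p2]:
                return p1, p2, p2 + 1, v0
        return p1, None, None, v0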
Please refer to FIG. 6. FIG. 6 is a schematic diagram for a method to calculate the boundary points of FIG. 5. The gray-scale image values VP2 and VP3 of the second and the third boundary reference points are used along with the boundary level V0 to calculate the position X of the boundary point of the scan line. Because P2 and P3 are adjacent pixels, P3 is equal to P2+1. From the linear interpolation relation (X - P2)/(P3 - P2) = (VP2 - V0)/(VP2 - VP3), the position of the boundary point X is equal to X = P2 + (VP2 - V0)/(VP2 - VP3).
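A one-line helper makes the interpolation explicit; the function and argument names are assumptions, but the formula is the one given above.

    def boundary_position(p2, v_p2, v_p3, v0):
        # Sub-pixel boundary position X = P2 + (VP2 - V0) / (VP2 - VP3),
        # with P3 = P2 + 1 and VP3 <= V0 <= VP2.
        return p2 + (v_p2 - v0) / (v_p2 - v_p3)

For instance, with the illustrative values P2 = 120, VP2 = 180, VP3 = 60 and V0 = 150, X evaluates to 120.25.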
In the same manner, a series of different positions of boundary points Xi can be calculated, where i is an integer ranging from 1 to n, n being the number of scan lines.
After determining the boundary point positions of the left boundary line for every scan line, a regression line and its slope can be calculated. The calculation of the regression line can be done using all, or only some, of the previously found boundary points. Because the regression line is a well-known mathematical tool, its equations are only noted here, without undue explanation. With n pairs of numbers (xi, yi), i being a positive integer from 1 to n, chosen to calculate the regression line y = mx + b, the values of m and b can be determined from the following least-squares equations, where each sum runs over i = 1 to n:
m = [nΣxiyi - (Σxi)(Σyi)] / [nΣxi² - (Σxi)²]
b = [(Σyi)(Σxi²) - (Σxi)(Σxiyi)] / [nΣxi² - (Σxi)²]
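The same least-squares fit can be written directly from the equations above. In this sketch, pairing xi with the boundary positions and yi with the scan-line indices is an assumption, chosen so that 1/m approximates the expected per-line shift of the boundary used in the error calculation that follows.

    def regression_line(xs, ys):
        # Least-squares fit of y = m*x + b from the sums in the equations above.
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        denom = n * sxx - sx * sx
        m = (n * sxy - sx * sy) / denom
        b = (sy * sxx - sx * sxy) / denom
        return m, b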
After calculating the regression line, the difference in position between the boundary points of adjacent scan lines is calculated (Δxi = xi - xi+1). This difference, together with the reciprocal of the slope of the regression line (1/m), is used to calculate a corresponding error value (ERRi = |Δxi - 1/m|). The error values can also be interpreted as error ratios, which are equal to ERRi/(1/m).
Finally, each error value ERRi is compared against a predetermined gate value (TD) to determine if the image from the scanner has any occurrences of scan line misalignment. If a specific error value is larger than the gate value, then there must be a scan line misalignment at the corresponding scan line. If the specific error value is less than or equal to the gate value, then there is no occurrence of scan line misalignment at that scan line. Occurrences of scan line misalignment are therefore determined by the choice of the gate value, which must be set by experienced personnel. In the preferred embodiment of the present invention, the gate value is about 0.3.
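A sketch of the error test follows, with the gate value defaulting to the 0.3 mentioned above. The function name is an assumption, and the sign conventions assume that Δxi and 1/m carry the same sign, which the text leaves implicit.

    def misaligned_lines(boundary_xs, m, gate=0.3):
        # Compare each error value ERRi = |dx_i - 1/m| with the gate value TD
        # and return the indices of scan lines flagged as misaligned.
        expected = 1.0 / m                               # slope reciprocal of the regression line
        flagged = []
        for i in range(len(boundary_xs) - 1):
            dx = boundary_xs[i] - boundary_xs[i + 1]     # delta x_i = x_i - x_(i+1)
            if abs(dx - expected) > gate:
                flagged.append(i)
        return flagged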
Please refer to FIG. 7. FIG. 7 is a flow chart of the present invention method to determine if an image from a scanner has any occurrences of scan line misalignment. The method of the present invention includes the following steps (a condensed code sketch of these steps follows the list):
    • Step 50: Begin.
    • Step 52: Scan the black bias 37 on the document 36, and then collect the corresponding image information from the scan lines.
    • Step 54: Search for the position of the black bias 37 from the image information in the scan lines by way of searching for the boundary points.
    • Step 56: Are the required positions of the boundary points found? If not, go back to Step 54.
    • Step 58: Calculate the corresponding regression line using the found positions of the boundary points.
    • Step 60: Calculate the reciprocal of the slope of the regression line (1/m).
    • Step 62: Select an appropriate gate value (TD).
    • Step 64: Calculate the differences in the positions of the boundary points (Δxi=xi−xi+1) of the adjacent scan lines.
    • Step 66: Use the differences of step 64 and the reciprocal of the slope of the regression line of step 60 to calculate corresponding error values (ERRi=|Δxi−1/m|), and the error values can also be interpreted with the use of error ratios, which are equal to ERRi/(1/m).
    • Step 68: Is an error value from step 66 larger than the chosen gate value? If not, go to step 72, otherwise go to Step 70.
    • Step 70: A scan line misalignment has occurred. Go to step 74.
    • Step 72: There are no occurrences of scan line misalignment.
    • Step 74: Have all of the chosen boundary points been calculated? If not, go to step 64, otherwise go to Step 76.
    • Step 76: End.
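Tying the steps of FIG. 7 together, a condensed sketch under the same assumptions as the earlier sketches might look like the following; the helpers find_reference_points, boundary_position, regression_line and misaligned_lines are the illustrative functions defined above, not part of the original disclosure.

    def detect_scan_line_misalignment(scan_lines, white_region, gate=0.3):
        # Steps 52-76 in condensed form: one boundary point per scan line,
        # a regression line fit, then comparison of each error value with the gate.
        xs, ys = [], []
        for i, line in enumerate(scan_lines):               # Steps 52-56
            p1, p2, p3, v0 = find_reference_points(line, white_region)
            if p2 is None:
                continue                                     # boundary not found in this line
            xs.append(boundary_position(p2, float(line[p2]), float(line[p3]), v0))
            ys.append(i)
        m, _ = regression_line(xs, ys)                       # Steps 58-60
        flagged = misaligned_lines(xs, m, gate)              # Steps 62-68
        return len(flagged) > 0, flagged                     # Steps 70, 72, 74, 76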
Please refer to FIG. 8. FIG. 8 is a flow chart of the present invention method of searching for the boundary points in the image information of a scan line. This method includes the following steps:
    • Step 80: Begin.
    • Step 82: Average the gray-scale image values of the pixels in the white region, and define half of this average value as a boundary reference level VP1.
    • Step 84: Find the pixel in the scan line with a gray-scale value that is closest to the boundary reference level, and define this pixel as a first boundary reference point P1.
    • Step 86: After moving forward 15 pixels from the first boundary reference point, select the next 15 pixels ahead, and average the gray-scale image values of these 15 pixels to determine a white reference level VW.
    • Step 88: After moving backward 15 pixels from the first boundary reference point, select the next 15 pixels behind and average the gray-scale image values of these 15 pixels to determine a black reference level VB.
    • Step 90: Average the white reference level VW and the black reference level VB to determine a boundary level V0 = (VB + VW)/2.
    • Step 92: Search for two adjacent pixels P2 and P3 whose gray-scale values straddle V0, so that VP3≦V0≦VP2, and define these two pixels as the second and the third boundary reference points P2 and P3.
    • Step 94: Calculate the position of the boundary point X, using the equation X = P2 + (VP2 - V0)/(VP2 - VP3).
    • Step 96: End.
In the first prior art method, a manual, visual inspection of a scanned test image is performed, which is both subjective (as it depends on the individual experience of the testing staff) and time-consuming. In the second prior art method, comparing the gray-scale image values of two adjacent scan lines to determine if the slope of the scanned test image matches its original value reduces the reliance on personal judgment, but the results are too coarse to fulfill the requirements of high-level scanners. The present invention method, however, searches for the boundary points to determine a regression line, and can calculate the positions of the boundary points to sub-pixel accuracy, along with the error values of all boundary points relative to the regression line. If an error value is larger than a predetermined gate value, a scan line misalignment is determined to have occurred. In light of the discussion above, this method fulfills the requirements of high-level scanners, and the gate value can be chosen by experienced personnel to account for different requirements.
Those skilled in the art will readily observe that numerous modifications and alterations of the device may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.

Claims (22)

1. A method for determining if an image from a scanner has an occurrence of scan line misalignment, the scanner comprising a housing, a scanning platform upon which is placed a document to be scanned, a scanning module to scan the document, and a driving module to drive the scanning module, the method comprising:
scanning a document having a test image and collecting corresponding scan line image information from a plurality of scan lines in order, each scan line image having a portion of the scanned image of the test image;
using a method of searching for a predetermined boundary point to find the position of a boundary point of the test image from the information in every scan line image;
calculating a regression line from the position of the boundary point;
calculating discrepancies of corresponding positions of boundary points and the slope reciprocal of the regression line from the image information of adjacent scan lines and determining corresponding error values from every discrepancy and slope reciprocal; and
comparing every error value with a predetermined gate value to determine if the scan line images from the scanner have any occurrences of scan line misalignment.
2. The method of claim 1 wherein the method uses all found positions of the boundary points to calculate the regression line.
3. The method of claim 1 wherein the method uses a portion of the found positions of the boundary points to calculate the regression line.
4. The method of claim 1 wherein the regression line y = mx + b is calculated by
m = [nΣxiyi - (Σxi)(Σyi)] / [nΣxi² - (Σxi)²]
b = [(Σyi)(Σxi²) - (Σxi)(Σxiyi)] / [nΣxi² - (Σxi)²]
where each sum runs over i = 1 to n, and wherein (xi, yi) are the positions of the boundary points chosen to calculate the regression line.
5. The method of claim 1 wherein the document has a white background, and the test pattern is a black bias from an upper-right position to a lower-left position, each scan line comprising gray-scale image information from a plurality of pixels, and the method of searching for a predetermined boundary point proceeds from up to down and from left to right to search for the boundary points of the bias from the gray-scale image information in each scan line.
6. The method of claim 5 wherein the method of searching for a predetermined boundary point comprises:
averaging the gray-scale values of a plurality of pixels in a chosen white region of a scan line, half of the averaging result being a boundary reference level (VP1), and defining the pixel whose gray-scale value is closest to the boundary reference level as a first boundary reference point (P1);
moving forward a first predetermined number of pixels from the first boundary reference point to select a second predetermined number of pixels, the average of the gray-scale values of the second predetermined number of pixels being a white reference level (VW);
moving backward the first predetermined number of pixels from the first boundary reference point to select the second predetermined number of pixels, the average of the gray-scale values of the second predetermined number of pixels being a black reference level (VB);
averaging the white and the black reference levels to determine a boundary level (V0);
searching for two adjacent pixels (P2, P3) from the plurality of pixels of the scan line such that the boundary level lies between the gray-scale values of the two adjacent pixels, satisfying (VP3≦V0≦VP2), and setting these two pixels as a second and a third boundary reference point (P2, P3); and
using the gray-scale values of the second and the third reference points (VP2, VP3) and the boundary level (V0) to calculate the boundary point (X) mathematically by X = P2 + (VP2 - V0)/(VP2 - VP3).
7. The method of claim 6 wherein the first and the second predetermined numbers are both 15.
8. The method of claim 1 wherein the width of the black bias is 2 mm, and the black bias has an angle of about 45 degrees with respect to the scan lines.
9. The method of claim 1 wherein the predetermined gate value is 0.3.
10. A method for identifying scan line misalignment in a scanner, comprising:
scanning a test image to obtain scanned image data corresponding to a plurality of scan lines of the scanner;
locating one or more boundary points of the scanned test image corresponding to at least a portion of the plurality of scan lines;
determining a regression line for at least a portion of the one or more boundary points;
determining a corresponding error value for at least a portion of the one or more boundary points, based at least in part on the regression line;
comparing one or more corresponding error values with a gate value; and
determining whether the scanner has a scan line misalignment based at least in part on the comparing.
11. The method of claim 10, wherein locating one or more boundary points further comprises:
for a particular scan line, identifying a first boundary reference point VP;
determining a white reference level VW;
determining a black reference level VB;
averaging the white and the black reference levels to determine a boundary level V0;
selecting two pixels as a second and a third boundary reference point P2 and P3 that satisfy the relationship VP3≦V0≦VP2, wherein VP2 and VP3 comprise second and third boundary reference points; and
calculating a boundary point (x) mathematically by x = P2 + (VP2 - V0)/(VP2 - VP3).
12. The method of claim 10, wherein determining a corresponding error value includes:
determining a difference in position between a boundary point corresponding with a first scan line and a boundary point corresponding with a second scan line;
determining a reciprocal of the slope of the regression line at the first scan line; and
determining an error value corresponding with the first scan line based at least in part on the difference between the determined difference in position between the boundary point corresponding with the first scan line and the boundary point corresponding with the second scan line and the determined reciprocal of the slope of the regression line at the first scan line.
13. The method of claim 12, wherein the first and second scan lines comprise adjacent scan lines.
14. The method of claim 10, wherein the gate value comprises a predetermined value.
15. The method of claim 10, wherein the gate value is determined based on properties of the scanner.
16. The method of claim 10, wherein determining whether the scanner has a scan line misalignment includes identifying a misalignment if the error value is numerically greater than the gate value.
17. A scanner having a plurality of scan lines, the scanner comprising:
a housing;
a scanning platform positioned at least partially in the housing;
a scanning module positioned at least partially in the housing and configured to obtain a scanned image of the document; and
a driving module positioned at least partially in the housing and configured to drive the scanning module to scan a document positioned on the scanning platform, wherein the scanner is configured to:
locate one or more boundary points of the scanned image;
determine a regression line for at least a portion of the one or more boundary points;
determine corresponding error values for the one or more boundary points, based at least in part on the regression line; and
compare one or more corresponding error values with a gate value.
18. The scanner of claim 17, wherein said driving module is further configured to identify a misalignment if the error value is numerically greater than the gate value.
19. The scanner of claim 17, wherein said document comprises an image of a test pattern.
20. The scanner of claim 19, wherein said image of a test pattern comprises a black bias that runs from the upper-right portion of the document to the lower-left portion of the document.
21. The scanner of claim 17, wherein said scanner is further configured to:
identify a first boundary reference point VP for a particular scan line;
determine a white reference level VW;
determine a black reference level VB;
average the white and the black reference levels to determine a boundary level V0;
select two pixels as a second and a third boundary reference point P2 and P3 that satisfy the relationship VP3≦V0≦VP2, wherein VP2 and VP3 comprise second and third boundary reference points; and
calculate a boundary point (x) mathematically by x = P2 + (VP2 - V0)/(VP2 - VP3).
22. The scanner of claim 17, wherein said scanner is further configured to:
determine a difference in position between a boundary point corresponding with a first scan line and a boundary point corresponding with a second scan line;
determine a reciprocal of the slope of the regression line at the first scan line; and
determine an error value corresponding with the first scan line based at least in part on the difference between the determined difference in position between the boundary point corresponding with the first scan line and the boundary point corresponding with the second scan line and the determined reciprocal of the slope of the regression line at the first scan line.
US11/433,909 2000-02-18 2006-05-11 Method for determining scan line misalignments Expired - Lifetime USRE41364E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/433,909 USRE41364E1 (en) 2000-02-18 2006-05-11 Method for determining scan line misalignments

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
TW89102748 2000-02-18
TW089102748A TW466873B (en) 2000-02-18 2000-02-18 Method to determine the occurrence of line-dropping in the scanned image of scanner
US09/740,898 US6734998B2 (en) 2000-02-18 2000-12-21 Method for determining scan line misalignments
US11/433,909 USRE41364E1 (en) 2000-02-18 2006-05-11 Method for determining scan line misalignments

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/740,898 Reissue US6734998B2 (en) 2000-02-18 2000-12-21 Method for determining scan line misalignments

Publications (1)

Publication Number Publication Date
USRE41364E1 2010-06-01

Family

ID=21658816

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/740,898 Ceased US6734998B2 (en) 2000-02-18 2000-12-21 Method for determining scan line misalignments
US11/433,909 Expired - Lifetime USRE41364E1 (en) 2000-02-18 2006-05-11 Method for determining scan line misalignments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/740,898 Ceased US6734998B2 (en) 2000-02-18 2000-12-21 Method for determining scan line misalignments

Country Status (2)

Country Link
US (2) US6734998B2 (en)
TW (1) TW466873B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100440951B1 (en) * 2001-07-06 2004-07-21 삼성전자주식회사 Method for correcting scanning error in the flatbed scanner and apparatus thereof
KR100573668B1 (en) * 2004-01-19 2006-04-26 삼성전자주식회사 Scan image correction device and method thereof
JP2005311766A (en) * 2004-04-22 2005-11-04 Fuji Xerox Co Ltd Image reading apparatus
US20050276454A1 (en) * 2004-06-14 2005-12-15 Rodney Beatson System and methods for transforming biometric image data to a consistent angle of inclination
US20060224953A1 (en) * 2005-04-01 2006-10-05 Xiaofan Lin Height-width estimation model for a text block
US8300276B2 (en) * 2009-12-11 2012-10-30 Raytheon Company Compensating for misalignment in an image scanner
KR20150089832A (en) 2014-01-28 2015-08-05 삼성전자주식회사 apparatus and method for processing scan data by ESD input
TWI588468B (en) * 2015-07-13 2017-06-21 All Ring Tech Co Ltd Method for searching for area of ​​interest of electronic components and method and device for detecting defect of electronic components using the same
CN106022190A (en) * 2016-05-19 2016-10-12 成都陌云科技有限公司 Image scanning method for mobile phone client

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5563403A (en) * 1993-12-27 1996-10-08 Ricoh Co., Ltd. Method and apparatus for detection of a skew angle of a document image using a regression coefficient
US5978102A (en) * 1995-11-24 1999-11-02 Minolta Co., Ltd. Image reading apparatus
US5892854A (en) * 1997-01-21 1999-04-06 Xerox Corporation Automatic image registration using binary moments
US6134028A (en) * 1997-07-04 2000-10-17 Samsung Electronics Co., Ltd. Method for scanning documents
US20020070331A1 (en) * 1998-01-30 2002-06-13 Fumihiro Inui Image sensor
US6534757B2 (en) * 1998-01-30 2003-03-18 Canon Kabushiki Kaisha Image sensor noise reduction
US6178015B1 (en) * 1998-06-05 2001-01-23 Mustek Systems, Inc. Apparatus and method for increasing the scan accuracy and quality of the flatbed scanner by using close loop control
US6512585B1 (en) * 1999-01-20 2003-01-28 Murata Manufacturing Co. Ltd Laser scanning position detecting device
US6303921B1 (en) * 1999-11-23 2001-10-16 Hewlett-Packard Company Method and system for capturing large format documents using a portable hand-held scanner
US20030020821A1 (en) * 2001-07-27 2003-01-30 Tohru Watanabe Imaging apparatus

Also Published As

Publication number Publication date
US20010019431A1 (en) 2001-09-06
US6734998B2 (en) 2004-05-11
TW466873B (en) 2001-12-01

Similar Documents

Publication Publication Date Title
USRE41364E1 (en) Method for determining scan line misalignments
US6459821B1 (en) Simultaneous registration of multiple image fragments
US6275600B1 (en) Measuring image characteristics of output from a digital printer
US7565032B2 (en) Image density-adapted automatic mode switchable pattern correction scheme for workpiece inspection
US20060291719A1 (en) Image processing apparatus
US5500737A (en) Method for measuring the contour of a surface
US7711178B2 (en) Pattern inspection method and its apparatus
US7487491B2 (en) Pattern inspection system using image correction scheme with object-sensitive automatic mode switchability
US7627164B2 (en) Pattern inspection method and apparatus with high-accuracy pattern image correction capability
US6885777B2 (en) Apparatus and method of determining image processing parameter, and recording medium recording a program for the same
US7538750B2 (en) Method of inspecting a flat panel display
US6188801B1 (en) Method and apparatus for automatic image calibration for an optical scanner
US20060110060A1 (en) Method for run-time streak removal
EP0578816B1 (en) Method of inspecting articles
US20030156748A1 (en) Adaptive threshold determination for ball grid array component modeling
EP0440396B1 (en) Method of measuring track displacement on a magnetic tape
US6373983B1 (en) Method for detecting the occurrence of missing lines on images generated by an optical scanner
US20200126203A1 (en) Display panel inspection system, inspection method of display panel and display panel using the same
JP4454075B2 (en) Pattern matching method
JP4530723B2 (en) PATTERN MATCHING METHOD, PATTERN MATCHING DEVICE, AND ELECTRONIC COMPONENT MOUNTING METHOD
EP4343710A1 (en) Mark detection method and computer program
JP3043530B2 (en) Dot pattern inspection equipment
EP0699890A1 (en) An apparatus for measuring the contour of a surface
KR0145255B1 (en) Dot pattern inspecting apparatus
US7515317B2 (en) Compensating a zipper image by a K-value

Legal Events

Date Code Title Description
AS Assignment

Owner name: MUSTEK SYSTEMS, INC.,TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TSAI, YU-FEN;CHANG, TE-CHIH;REEL/FRAME:018749/0826

Effective date: 20001218

Owner name: TRANSPACIFIC OPTICS LLC,DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MUSTEK SYSTEMS, INC.;REEL/FRAME:018749/0835

Effective date: 20051202

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12