US20080095413A1 - Fingerprint recognition system - Google Patents

Fingerprint recognition system

Info

Publication number
US20080095413A1
US20080095413A1 (Application US 11/643,045)
Authority
US
United States
Prior art keywords
fingerprint
minutiae
fpi
depicted
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/643,045
Inventor
Shing-Tung Yau
Xianfeng Gu
Zhiwu Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Geometric Informatics Inc
Original Assignee
Geometric Informatics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Geometric Informatics Inc filed Critical Geometric Informatics Inc
Priority to US11/643,045 priority Critical patent/US20080095413A1/en
Publication of US20080095413A1 publication Critical patent/US20080095413A1/en
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V 10/993 - Evaluation of the quality of the acquired pattern
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 - Fingerprints or palmprints
    • G06V 40/1347 - Preprocessing; Feature extraction
    • G06V 40/1353 - Extracting features related to minutiae or pores
    • G06V 40/1359 - Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop
    • G06V 40/1365 - Matching; Classification

Definitions

  • a biometric is defined as a biological characteristic or trait that is unique to an individual and that can be accurately measured.
  • a biometric that can be stored and accessed in an efficient manner can be used to identify an individual or to verify the identity of an individual.
  • a biometric commonly used to identify human beings is one or more fingerprints belonging to the particular human being.
  • Fingerprint identification of a human being consists of two stages: enrollment and verification/identification.
  • Enrollment of a fingerprint involves taking a fingerprint image (FPI) of an individual and storing the FPI itself or a plurality of data that is representative of the FPI in an FPI database.
  • Identification of a fingerprint involves taking an FPI of an unknown individual and comparing the unknown FPI to the FPIs or FPI data that is stored in the FPI database. An identification is made when a match between the unknown FPI and an FPI stored in the FPI database is found that has a sufficient reliability that the probability of a false positive is below a predetermined threshold.
  • Fingerprint verification or authentication matches an individual to a fingerprint that has been previously enrolled by that individual.
  • identification involves searching for a match between a single unknown FPI and many stored FPIs.
  • the verification process involves matching an unknown or unconfirmed fingerprint minutiae template to a single previously enrolled fingerprint minutiae template. Accordingly, the verification process is a one-to-one matching technique.
  • biometrics to restrict access to secure entities such as computer networks, cryptographic keys, sensitive data, and physical locations
  • smart cards, cards that have a biometric, such as a fingerprint, encoded thereon, can be used to provide transaction security as well.
  • a smart card allows a user to provide the biometric encoded on the card, wherein the encoded biometric data is compared to the biometric measured on the individual. In this way, a smartcard can positively authenticate the identity of the smartcard user.
  • FPI data is based on the set of singularities that can be classified according to the type of singularity, e.g., deltas, arches, or whorls.
  • FPIs contain fingerprint minutiae that are the end point of a ridge curve or a bifurcation point of a ridge curve.
  • FPI images can be classified and matched according to data associated with the fingerprint minutiae. This data can include the position of the minutiae, the tangential direction of the minutiae, and the distance to other minutiae.
  • FPI data can lead to a high false acceptance or identification rate when the unknown FPI has only a few minutiae or if the unknown FPI is only a partial FPI that may or may not include the number of minutiae needed to accurately verify or identify the unknown FPI.
  • a method of analyzing and recognizing fingerprint images that utilizes vector processing of a vector field that is defined as the tangential vector of the fingerprint ridge curves is disclosed.
  • the raw fingerprint image is divided into blocks, each block is filtered to remove noise and the orientation direction of each block is found. This allows the ridge curves to be enhanced and approximated by piece-wise linear approximations.
  • the piece-wise linear approximations to the ridge curves allow the minutiae to be extracted and classified and a fingerprint minutiae template to be constructed.
  • An enrollment process gathers multiple fingerprint images, creates fingerprint minutiae templates corresponding to the fingerprint images, and stores the templates and other data associated with the respective individual or the enrolled fingerprint in a fingerprint database.
  • an unknown raw fingerprint image is obtained via a fingerprint scanner and processed similarly to the enrollment process described above.
  • the fingerprint minutiae template of the unknown fingerprint is compared to one or more previously enrolled fingerprint minutiae templates to identify or verify the identity of the individual associated with the unknown fingerprint.
  • live finger detection can be accomplished in conjunction with the identification or verification processes through analysis of the fingerprint image thus enhancing the security of the overall system.
  • FIG. 1 is a flow chart of a method for acquiring and enrolling fingerprint minutiae templates
  • FIG. 2 is a flow chart of a method for extracting minutiae from a raw fingerprint image and forming a fingerprint minutiae template
  • FIG. 3 is a schematic diagram of a direction filter suitable for use in the present fingerprint analysis method
  • FIG. 4 is a flow chart of a method for identifying/verifying the identity of an individual using the presently described fingerprint analysis method
  • FIG. 5 is a flow chart of a method for comparing an unknown fingerprint minutiae template with a previously enrolled fingerprint minutiae template
  • FIG. 6 is block diagram for a system to control physical access using the fingerprint analysis methods described herein;
  • FIG. 7 is a block diagram for a system to control computer network access using the fingerprint analysis methods described herein;
  • FIG. 8 is a block diagram for a system to control access to a web page across the internet using the fingerprint analysis methods described herein;
  • FIG. 9 is a flow chart for a method of using the presently described fingerprint analysis methods in conjunction with a smartcard
  • FIG. 10 is a flow chart for a method of detecting a live finger by analyzing the binary fingerprint image
  • FIG. 11 is a flow chart for a method of detecting and classifying singularities found in the finger print image.
  • FIG. 12 is a flow chart for a method of estimating the resolution of a raw fingerprint image.
  • a fingerprint image (FPI) acquisition, analysis, storage, and recognition system is disclosed in which FPIs are acquired and a fingerprint template based upon the acquired FPI is created.
  • the fingerprint template is stored and can be used to both identify an unknown FPI and to verify the identity of an FPI.
  • FIG. 1 is a block diagram of the enrollment process used to acquire an FPI and to store the corresponding fingerprint template.
  • the raw FPI is acquired from a fingerprint sensor or scanner or a scanned FPI, as depicted in step 102 .
  • a raw FPI is defined as an original fingerprint image captured by a fingerprint sensor or scanner or a raw fingerprint can be a digitally scanned image of a paper and ink fingerprint.
  • a raw FPI includes a plurality of ridge curves and valleys interspersed between the various ridge curves corresponding to the ridges and valleys of the original fingerprint.
  • the ridge curves and valleys form various structures that include singularities such as whorls, deltas, arches, and also include fingerprint minutiae that are the ending point of ridge curves or bifurcation points of ridge curves.
  • Each of the minutiae has data associated therewith that is indicative of the position of the minutiae, the tangential direction of the minutiae, and the type of minutiae.
  • the raw FPI is processed to enhance the contrast between the ridge curves and valleys contained in the FPI, as depicted in step 104 .
  • the quality of the enhanced FPI is evaluated and if the quality of the FPI is sufficiently high, the minutiae from the FPI are extracted and control is passed to step 108 . If not, control passes to step 102 and another FPI is acquired.
  • the number of minutiae is examined and if there are sufficient minutiae, control is passed to step 110 where the minutiae are extracted from the FPI and an FPI template is formed.
  • the number of minutiae that are required is dependent upon the level of security that is required. A low security application may only require six minutiae that are able to be matched, while a high security application may require 12 or more minutiae that are able to be matched.
  • a fingerprint template is an undirected graph of minutiae extracted from an FPI.
  • Each node in the fingerprint template is an individual minutia and each connecting segment in the graph connects two minutiae (i.e., graph nodes).
  • Each connecting segment also includes data associated therewith, for example, cross points of the connecting segment with ridge curves, and the angles between the direction of the connecting segment and the tangential direction of the ridge curve at the intersecting point.
  • the template can include data on the core and deltas associated with the FPI.
  • the FPI template can include data associated with a core or delta such as the position and direction of respective core and delta.
  • the fingerprint template is associated with the individual and then stored in a fingerprint template database, as depicted in step 112. If there is not a sufficient number of minutiae, control passes to step 102 and another raw FPI is acquired.
  • FIG. 2 is a flowchart that describes the various steps necessary to perform the image processing of the raw FPI, the minutiae extraction, and the FPI template formation. The steps depicted in FIG. 2 can be used to process raw FPIs for enrollment purposes, and raw FPIs for identification or identity verification purposes.
  • a raw FPI is acquired from a fingerprint scanner or from scanning a paper and ink fingerprint, or from a previously digitized FPI, as depicted in step 202 .
  • the raw FPI is separated into an array of non-overlapping blocks, as depicted in step 204 .
  • the block size can be selected based upon various parameters such as the size of the FPI, the amount of data contained therein, and the processor speed.
  • the block size is selected as a function of the resolution of the FPI such that within each block, the ridge curves can be approximated by straight lines.
  • the block size is given by R/25 and rounded to the closest power of 2, where R is the resolution of the FPI in dots/inch.
  • the resolution of a typical fingerprint scanner is approximately 500 dpi, and the FPI is divided into 256 equal-size blocks in a 16×16 block pattern.
  • the block size may be varied within an FPI depending upon the size of the object within the FPI that is to be processed.
  • the blocked image is processed to provide one or more regions of interest, as depicted in step 206 .
  • a region of interest in the FPI is a portion or portions of the FPI containing the ridge curves and valleys of the FPI; the remaining portion or portions of the FPI do not contain any significant fingerprint data.
  • the FPI is separated into foreground blocks and background blocks, as depicted in step 206 .
  • the mean and variance of the pixel intensities are determined for each block.
  • a predetermined mean threshold and variance threshold are selected and a k-nearest neighbor clustering algorithm is used to classify all blocks within the K-nearest neighbors as a foreground block or a background block.
  • a convex hull is formed that includes all of the blocks determined to be foreground blocks.
  • a second check of all background blocks is made to ensure that noise or other interference has not inadvertently switched a foreground block into a background block.
  • a check is made to determine if the center of a previously determined background block is contained within the convex hull formed by the foreground blocks. If so, the background block is converted into a foreground block.
  • the regions of interest in the FPI are filtered to remove random noise in order to form a clearer filtered image, as depicted in step 208.
  • Random noise is typically high frequency noise and accordingly a low pass filter is used to smooth out the high frequency noise from the foreground blocks of the blocked image.
  • the low pass filter is a Gaussian filter.
  • the Gaussian filter can be a 2-dimensional filter mask that when convolved with each pixel, within each of the foreground blocks, removes the high frequency noise contained within the FPI.
  • the orientation angle and magnitude of each of the foreground blocks in the filtered image are found, forming an orientation image, as depicted in step 210 .
  • the orientation angle and magnitude are found by determining the gradient in the x and y directions.
  • a Sobel differential operator is applied to each foreground block to determine the orientation angle and amplitude.
  • a Hough transformation is used to estimate the orientation angle.
  • a plurality of directional filters each corresponding to a foreground block smoothes out the differences along the ridge curves and intensifies the contrast between the ridge curves and valleys within the corresponding block.
  • the directional filter is a 2-dimensional mask having an x and y direction. The y direction of the mask is intended to amplify the fingerprint ridge curves and to negatively amplify the valleys.
  • the directional filter is a Gaussian filter along the ridge direction.
  • a directional filter mask is depicted in FIG. 3 in which the filter mask 300 is a square in which the side length is equal to the period of the signal, or the period of the signal plus 1, whichever is an odd number.
  • the middle rows 302 are selected to enhance the ridges, and the side rows 310 are used to negatively amplify the valleys.
  • the center coefficient, a_0 ( 305 ), of the center row 304 is set to a_0 and the coefficients of the center row 304 are cosine tapered to edge values of a_0/4, forming a symmetric row.
  • a_0 is set to 1000 and the center row 304 is cosine tapered to a value of 250 at each edge.
  • the coefficients of the middle rows 302 are cosine tapered from the value of the center row to a value of a_{0,i}/1.41, where a_{0,i} is the value of the i-th coefficient of the center row 304.
  • the ridges and valleys of the ridge-enhanced FPI are then separated into one of two binary values, a first binary value for a ridge pixel and a second binary value for a valley pixel, forming a binary fingerprint image, as depicted in step 214 .
  • the image binarization is accomplished by establishing a binary threshold and comparing the intensity value of each pixel to the binary threshold.
  • a pixel having a pixel value greater than the binary threshold is set to a first value and a pixel having a pixel value less than the binary threshold is set to a second value.
  • the binary threshold is one-half the maximum pixel intensity or 128.
  • the first value is equal to 255 and the second value is equal to zero.
  • the ridge curves and valleys of the binary FPI are thinned to a predetermined width, which in the illustrated embodiment is a single pixel forming a thinned image, as depicted in step 216 .
  • the thinning may be accomplished with thinning algorithms that are known in the art.
  • the thinned ridge curves and valleys in the thinned image are approximated by piece-wise linear segments forming a piece-wise linear FPI, as depicted in step 218 .
  • the thinned ridge curves are represented by chain code connecting the start and end points of each ridge curve within a corresponding block.
  • a line segment connecting the start and end points of the respective ridge curve is formed and the maximum distance between the line segment and the ridge curve is determined. If this distance is greater than a predetermined maximum value, two line segments approximate the ridge curve.
  • a first line segment is formed from the start point to the point on the ridge curve having the maximum distance from the original line segment.
  • a second line segment is formed from the end point of the first line segment to the end point of the ridge curve. This process is continued iteratively until the distance between the ridge curve and any point on the piece-wise linear approximating segments is less than the predetermined maximum value.
  • the fingerprint minutiae are extracted from the piece-wise linear FPI, as depicted in step 220 .
  • minutiae are classified as either ending minutiae or bifurcation minutiae.
  • Ending minutiae are defined as the end point of a ridge curve in an FPI and bifurcation minutiae are defined as a crossing point of two ridge curves in an FPI.
  • a connection number is computed for each pixel in a corresponding block, wherein the connection number is indicative of whether a pixel is a fingerprint minutia and if so, what type of minutia the corresponding pixel is.
  • the connection number corresponds to the properties detailed in Table 1 (CN value 0: isolated point; 1: end point; 2: continuing point; 3: branching point; 4: crossing point). For a CN value of 1 or 3, the angle of the ending point or the branching point to the associated ridge curve is determined.
  • the minutiae type, the x-y position of the minutiae, and the angle of the minutiae associated with the respective ridge curve are determined and stored.
  • the extracted minutiae are further processed to remove false minutiae leaving true minutiae as depicted in step 222 .
  • a large number of false minutiae can be created and detected during the processing steps prior to this step. These minutiae may be due to small ridge segments, ridge breaks, boundary minutiae, and noise.
  • each minutia is analyzed to see whether it belongs to a broken ridge curve or a noisy link, or whether it is a boundary minutia.
  • a broken ridge curve occurs when two minutiae are within a predetermined distance of one another and the directions of the respective minutiae are opposite to one another. If the number of minutiae within a specified area exceeds a predetermined threshold, the minutiae are considered to be part of a noisy link. If a minutia occurs along the boundary of the FPI, it is considered to be a boundary minutia. In the event that an extracted minutia belongs to one of these three classes, it is deleted from the extracted minutiae list.
  • a fingerprint minutiae template is then formed from the true minutiae, as depicted in step 224 .
  • a fingerprint minutiae template is an undirected graph in which the true minutiae are the corresponding nodes and line segments connected between two-node points form the edges of the graph.
  • Each of the true minutiae is only connected to other true minutiae within a predetermined distance of it.
  • Data associated with the intersection between a graph edge and any of the ridge curves in the FPI is also stored. This data can include the location of the intersection, i.e., the intersection points, and the angles between the graph edge and tangential direction of the ridge curve.
  • FIG. 4 depicts a block diagram of an embodiment of the verification/identification process.
  • a raw FPI is acquired from a fingerprint sensor or scanner, as depicted in step 402.
  • the acquired FPI is processed, as depicted in step 404 , and if the image is suitable for minutiae extraction as depicted in step 406 , the number of minutiae that exist in the FPI is determined, as depicted in step 408 . If sufficient minutiae exist in the FPI, the minutiae are extracted and a fingerprint minutiae template is formed as described with respect to FIG. 2 , as depicted in step 410 . If the image is not suitable to extract minutiae then control passes to step 402 and a new raw FPI is acquired.
  • once the fingerprint minutiae template is formed, one or more of the previously enrolled templates are compared to the fingerprint minutiae template of the raw FPI, as depicted in step 412.
  • a single enrolled template that is known a priori may be compared to the template of the raw FPI in a one-to-one matching scheme, where the alleged identity of the individual to be verified is known.
  • many of the enrolled templates are compared to the template of the raw FPI in a one-to-many matching scheme.
  • the enrolled templates and the template of the raw FPI may be classified according to various characteristics such as the presence of singularities in the FPI to reduce the number of enrolled fingerprint templates to be searched.
  • the number of minutiae that are matched is compared to a predetermined threshold, as depicted in step 414 , and if the number of matched minutiae exceeds the predetermined verification threshold, the enrolled template and the unknown/unverified template of the raw FPI are considered matched, as depicted in step 416 . Accordingly, the person is identified or verified as the individual associated with the enrolled template. If the individual associated with the unknown/unverified FPI is cleared for entry into a secure entity such as a computer, a data network, or a physical space, entry is granted as depicted in step 418 . Otherwise, control is passed back to step 402 for acquisition of another FPI.
  • a secure entity such as a computer, a data network, or a physical space
  • FIG. 5 depicts an embodiment of a matching process suitable for use with the identification/verification methods described herein.
  • Having acquired an enrolled fingerprint template and a fingerprint template to be identified/verified, first find all node pairs (A,B) that are locally matched, as depicted in step 502, where A is a minutiae node from the enrolled template and B is a minutiae node from the template to be identified/verified.
  • A is a minutiae node from the enrolled template
  • B is a minutiae node from the template to be identified/verified.
  • the transformation T(B→A) is formed, as depicted in step 504.
  • the transformation T(B→A) is defined as the translation of B to A and the rotation of B necessary to align B to A.
  • Each node pair (A,B) is further used as an anchor node and a neighborhood match is performed in the neighborhood of the anchor node using the corresponding transformation T(B→A), as depicted in step 506.
  • the transformed minutiae nodes in the neighborhood of the node pair (A,B) in each template are compared with one another and if the differences in position and rotation between corresponding minutiae are less than a predetermined matching threshold, the minutiae are considered to be matched, as depicted in step 508 , 510 , and 512 .
  • the number of matched minutiae are counted, as depicted in step 514 .
  • the number of matched minutiae are compared to a matching threshold, as depicted in step 516 . If the number of matched minutiae exceeds the matching threshold, the fingerprint templates are considered to be matched, as depicted in step 518 , otherwise, control is returned to step 502 .
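  • For illustration, the following is a minimal sketch of the anchor-and-transform comparison described above. The tolerance values, the brute-force search over all anchor pairs, and the use of plain (x, y, angle) tuples are assumptions of this sketch, not details taken from the patent; the local pre-matching of candidate pairs in step 502 is omitted for brevity.

```python
import numpy as np

def match_templates(enrolled, probe, pos_tol=8.0, ang_tol=np.pi / 12, match_thr=12):
    """Count minutiae matched between two templates, each a list of (x, y, angle).

    For every candidate anchor pair (A, B), the probe template is translated and
    rotated so that B lands on A (the transformation T(B->A)); probe minutiae that
    then agree with some enrolled minutia in position and angle within the given
    tolerances are counted as matched.  The templates are declared matched when the
    best anchor yields at least match_thr matches.  All thresholds are illustrative.
    """
    best = 0
    for ax, ay, aa in enrolled:
        for bx, by, ba in probe:
            rot = aa - ba                                     # rotation part of T(B->A)
            cos_r, sin_r = np.cos(rot), np.sin(rot)
            matched = 0
            for px, py, pa in probe:                          # apply T(B->A) to the probe
                qx = ax + cos_r * (px - bx) - sin_r * (py - by)
                qy = ay + sin_r * (px - bx) + cos_r * (py - by)
                qa = pa + rot
                for ex, ey, ea in enrolled:                   # look for a nearby enrolled minutia
                    d_ang = np.angle(np.exp(1j * (qa - ea)))  # wrapped angle difference
                    if np.hypot(qx - ex, qy - ey) < pos_tol and abs(d_ang) < ang_tol:
                        matched += 1
                        break
            best = max(best, matched)
    return best >= match_thr, best
```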
  • FIG. 6 depicts a block diagram of a physical access control system 600 .
  • a fingerprint scanner 602 is used to scan a fingerprint.
  • the scanned FPI is provided to a fingerprint server 606 that contains fingerprint templates of enrolled individuals.
  • the fingerprint server 606 creates a fingerprint minutiae template of the scanned FPI and compares the template to the previously enrolled templates corresponding to the individuals cleared for access to the secure location.
  • a positive match between the fingerprint minutiae template of the scanned FPI and one of the previously enrolled fingerprint minutiae templates will positively identify the individual if enrolled.
  • the fingerprint server 606 provides match/no-match indicia to the physical access device 604, allowing access into the secured area.
  • FIG. 7 depicts a block diagram of a network logon control system 700 .
  • a fingerprint sensor or scanner 702 coupled to a user PC 704 is used to provide scanned fingerprint data across a data network 706 to a fingerprint server 708 .
  • the fingerprint server 708 creates a fingerprint minutiae template of the scanned FPI and compares this template to the previously enrolled fingerprint minutiae templates corresponding to the individuals cleared for access to the computer network. A match between the newly created fingerprint minutiae template and one or more of the previously enrolled fingerprint minutiae templates indicates that the individual is allowed access to the computer network.
  • the fingerprint server 708 can positively identify the particular individual seeking access and, once verified, provide the identity and the relevant data of the individual to the network server 710 .
  • FIG. 8 depicts a block diagram of an internet logon control system 800 .
  • a fingerprint sensor or scanner 802 coupled to a user PC 804 is used to provide a scanned fingerprint across the internet 806 to a fingerprint server 808 that may be associated with a particular web page or associated with a secure financial transaction that occurs over the internet.
  • the fingerprint server 808 creates a fingerprint minutiae template of the scanned FPI and compares this template to the previously enrolled fingerprint minutiae templates corresponding to the individuals cleared for access to the computer network. A match between the newly created fingerprint minutiae template and the previously enrolled fingerprint minutiae templates indicates that the individual is allowed access to the associated web page or that the financial transaction is properly authorized.
  • the fingerprint server 808 can positively identify the particular individual seeking access, and once verified, provide the identity of the individual to the application servers 810 along with personal data associated with the particular individual.
  • FIG. 9 depicts a flow chart for a method of comparing a fingerprint minutiae template with a fingerprint minutiae template previously stored on a smartcard.
  • a smartcard can be used both at a point of service transaction location or across a network such as the internet to positively identify the individual that is authorized to use the smart card.
  • An FPI is obtained from a fingerprint sensor or scanner, as depicted in step 902 .
  • the FPI is processed, as depicted in step 904, and the image is checked for sufficient quality, as depicted in step 906, and for sufficient minutiae, as depicted in step 908.
  • if so, the FPI is analyzed and processed as described above according to FIG. 2 and a fingerprint minutiae template is created, as depicted in step 910. Otherwise, a new FPI is obtained and control is passed to step 902.
  • the extracted minutiae and the fingerprint minutiae template formed from the acquired FPI are compared to the fingerprint minutiae template stored on the smartcard, as depicted in step 912 . If a match occurs, as depicted in step 914 , the identity of the smartcard holder is verified, as depicted in step 916 , otherwise control is passed to step 902 , and a new FPI is obtained.
  • FIG. 10 depicts a flow chart for a live finger detection method that may be used in conjunction with the identification and verification methods described herein.
  • the binary fingerprint image of step 214 in FIG. 2 is further analyzed to detect the presence and size of sweat pores contained within the fingerprint image.
  • the binary image is provided, as depicted in step 1002 .
  • the boundaries of the binary image are traced and chain coded, as depicted in step 1004 .
  • All clockwise closed chains are detected, as depicted in step 1006 , and the area and arc length of the detected closed chains are measured as depicted in step 1008 .
  • although clockwise closed chains are used here to identify sweat pores, counter-clockwise closed chains can also be used.
  • the measured area is compared to a sweat pore threshold and, if it is greater than the sweat pore threshold, the closed chain is a detected sweat pore. If the number of detected sweat pores exceeds a certain live finger sweat pore threshold, the finger is flagged as live, as depicted in step 1010.
  • the sweat pore threshold is four pixels.
  • otherwise, the finger is flagged as non-living, no further processing is employed, and the identity of the individual is not confirmed. If the finger is living, the measured arc length is compared to a hole threshold and, if it is less than the hole threshold, the chain is removed, as depicted in step 1012. In this manner, closed chains having an arc length less than the hole threshold are considered to be noise and are therefore removed.
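  • A rough sketch of the sweat-pore test follows. The patent traces and chain-codes the ridge boundaries; this sketch takes a shortcut and finds enclosed background regions (holes inside ridges) with connected-component labeling instead, so the function name, the use of scipy.ndimage, and the live-if-any-pore rule are all assumptions of the sketch.

```python
import numpy as np
from scipy import ndimage

def detect_sweat_pores(binary_ridges: np.ndarray, pore_thr: int = 4):
    """Rough live-finger check on a binary fingerprint image (ridge pixels == 1).

    Background regions that do not touch the image border are holes enclosed by
    ridges; holes of at least pore_thr pixels (the patent's four-pixel threshold)
    are counted as sweat pores, and the finger is flagged live if any are found.
    """
    background = binary_ridges == 0
    labels, n = ndimage.label(background)
    # Labels that appear on the border belong to open background, not to pores.
    border_labels = np.unique(
        np.concatenate([labels[0], labels[-1], labels[:, 0], labels[:, -1]])
    )
    pores = 0
    for lab in range(1, n + 1):
        if lab in border_labels:
            continue
        if int(np.count_nonzero(labels == lab)) >= pore_thr:
            pores += 1
    return pores > 0, pores
```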
  • FIG. 11 depicts a flow chart of a method of identifying the location of the cores and deltas, estimating the directions, and classifying the FPI.
  • the orientation field corresponding to an FPI from step 210 of FIG. 2 is provided, as depicted in step 1102 .
  • the orientation field is refined, as depicted in step 1104, by subdividing each block into four sub-blocks, as depicted in step 1106.
  • the orientation of each sub-block is predicted, using the original orientation direction as the predictor, as depicted in step 1106 .
  • An octagonal core mask is created that is a vector valued 2-dimensional matrix having as a value a unit vector radial from the center of the corresponding sub-block, as depicted in step 1108 .
  • the center of the core mask is aligned with the corresponding sub-block and is convolved with the sub-blocks in the FPI, as depicted in step 1110 .
  • the convolution result of the core mask and the sub-blocks is normalized, as depicted in step 1112 , and core and delta regions are identified as having large convolution results, i.e. the singularities of the FPI, as depicted in step 1114 .
  • the Poincare index is determined for all areas of the FPI having a convolution result greater than a predetermined curve threshold, as depicted in step 1116 .
  • the Poincare index is found by surrounding each area by a closed curve and a direction integration is performed. If the direction integration equals zero, as depicted in step 1118 , the diameter of the closed curve is reduced, as depicted in step 1120 , and the direction integration is performed again. This step is repeated until the radius is one, as depicted in step 1122 , or the integration is non-zero, as depicted in step 1118 .
  • the singularities of the FPI are classified according to the value of the corresponding Poincare index, as depicted in step 1116 .
  • the singularities are classified as whorls and are clustered according to the corresponding Euclidean distance from the arbitrary origin. If there is more than one whorl cluster, the biggest cluster is selected and the smaller clusters are deleted.
  • the singularities are cores, and are clustered according to the corresponding Euclidean distance from the arbitrary origin. If there are more than three clusters of cores, the largest two are kept and the remaining core clusters are deleted.
  • the singularities are classified as deltas and are clustered according to the corresponding Euclidean distance from the arbitrary origin. If there is one whorl cluster and 3 or more delta clusters, the largest two delta clusters are kept and the remaining delta clusters are deleted. If there is no whorl cluster and 1 or more delta clusters, the largest two delta clusters are kept and the remaining delta clusters deleted.
  • the directions of the cores are estimated, as depicted in step 1118.
  • the core mask from step 1106 is convolved with the core singularity and the direction is estimated from the result, in that the displacement from the core center to the mass center of all zero sub-blocks lies along the main direction of the core.
  • cores near the boundary of the FPI are estimated.
  • the cores near the boundary are estimated by treating as a core singularity sub-blocks near the boundary having a convolution value in the top 20% of values.
  • the cores are processed as described above.
  • the FPI is then classified as a whorl, right loop, left loop, arch, or double loop.
  • An FPI having a single whorl cluster is classified as a whorl.
  • An FPI having a core cluster and one or fewer delta clusters is a loop. If the cross product of the vector from the core to the delta with the main direction of the core is along the normal direction of the fingerprint plane, the fingerprint is a right loop. Otherwise, if the cross product is against the normal direction, the fingerprint is a left loop. If the cross product is nearly zero, the fingerprint is an arch. If there are two core clusters and two or fewer delta clusters, the fingerprint is a double loop. If there is no core, then the fingerprint is an arch.
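  • A minimal sketch of the Poincare index used above to classify singularities is shown below. The patent integrates the direction field around a closed curve of shrinking radius; this sketch shows only the common discrete version over a fixed 3×3 loop of orientation blocks, so the loop size and the function name are assumptions.

```python
import numpy as np

def poincare_index(orientation: np.ndarray, row: int, col: int) -> float:
    """Discrete Poincare index of an orientation field (angles in radians, mod pi).

    The angles of the 8 blocks surrounding (row, col) are visited in a closed loop
    and the wrapped orientation differences are summed; the result is expressed in
    turns.  Roughly, +1/2 marks a core, -1/2 a delta, and +1 a whorl.
    """
    loop = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    angles = [orientation[row + dr, col + dc] for dr, dc in loop]
    total = 0.0
    for i in range(8):
        d = angles[(i + 1) % 8] - angles[i]
        # Orientations are defined modulo pi, so wrap differences into (-pi/2, pi/2].
        while d > np.pi / 2:
            d -= np.pi
        while d <= -np.pi / 2:
            d += np.pi
        total += d
    return total / (2 * np.pi)                  # index in turns
```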
  • the raw fingerprint images from which one or more fingerprint minutiae templates are formed are obtained from fingerprint scanners or sensors that have different resolutions.
  • the automated fingerprint identification/verification process described herein assumes that all of the raw FPIs are of the same resolution. Although this may be true for most fingerprint scanners, if the FPI has been previously digitized from film, the resolution information may not have been included with the FPI. Without a-priori knowledge of the resolution of the FPI, extra processing is required to ensure that the images being processed are of similar resolution.
  • FIG. 12 depicts a method for use with the methods described herein to determine the resolution of an FPI having an unknown resolution.
  • the raw FPI acquired in step 1202 is divided into 16 blocks, as depicted in step 1204 .
  • the Fourier transform is computed as depicted in step 1206 .
  • the magnitude of the Fourier coefficients is determined, as depicted in step 1208 .
  • the Fourier coefficients are classified according to the corresponding spatial frequency, as depicted in step 1210 .
  • the average magnitude of the components for each spatial frequency is determined, as depicted in step 1212 .
  • the spatial frequency having the largest average magnitude is an estimation of the ridge distance of the raw FPI, as depicted in step 1214 , and may be used to adjust the processing to allow for FPIs of similar resolution to be compared.
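  • As a sketch of the resolution/ridge-distance estimation just described, the routine below performs the per-block Fourier analysis; the radial binning of the spectrum, the assumption of square blocks, and the function name are illustrative choices, not details specified in the patent.

```python
import numpy as np

def estimate_ridge_distance(block: np.ndarray) -> float:
    """Estimate the ridge distance (pixels per ridge period) of one square FPI block.

    The block's 2-D Fourier magnitudes are grouped by radial spatial frequency and
    averaged; the frequency bin with the largest average magnitude is taken as the
    ridge frequency, whose reciprocal is the ridge distance.  The patent averages
    this measurement over the 16 blocks of the raw FPI.
    """
    n = block.shape[0]                                   # assumes a square block
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(block - block.mean())))
    cy, cx = n // 2, block.shape[1] // 2
    yy, xx = np.indices(spectrum.shape)
    radius = np.hypot(yy - cy, xx - cx).astype(int)      # radial spatial-frequency bin
    avg = np.bincount(radius.ravel(), weights=spectrum.ravel()) / np.bincount(radius.ravel())
    avg[:2] = 0.0                                        # suppress the DC / lowest bins
    peak_bin = int(np.argmax(avg))
    cycles_per_pixel = peak_bin / n
    return 1.0 / cycles_per_pixel                        # ridge distance in pixels
```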

Abstract

A method of analyzing and recognizing fingerprint images that utilizes vector processing of a vector field that is defined as the tangential vector of the fingerprint ridge curves is disclosed. The raw fingerprint image is divided into blocks, filtered to remove noise, and the orientation direction of each block is found. This allows the ridge curves to be enhanced and approximated by piece-wise linear approximations. The piece-wise linear approximations to the ridge curves allow the minutiae to be extracted and classified and a fingerprint minutiae template to be constructed. An enrollment process gathers multiple fingerprint images, creates fingerprint minutiae templates corresponding to the acquired fingerprint images, and stores the templates and other data associated with the respective individual or the enrolled fingerprint in a fingerprint database. In an identification process, an unknown raw fingerprint image is obtained via a fingerprint scanner and processed similarly to the enrollment process such that the fingerprint minutiae template of the unknown fingerprint is compared to one or more previously enrolled fingerprint minutiae templates. The identity of the individual associated with the unknown fingerprint is thereby ascertained. In addition, live finger detection can be accomplished in conjunction with the verification or identification process through analysis of the fingerprint image thus enhancing the security of the overall system.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of Ser. No. 10/156,447 filed May 28, 2002 which claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application Ser. No. 60/293,487 filed May 25, 2001 and U.S. Provisional Patent Application Ser. No. 60/338,949 filed Oct. 22, 2001.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • N/A
  • BACKGROUND OF THE INVENTION
  • A biometric is defined as a biological characteristic or trait that is unique to an individual and that can be accurately measured. A biometric that can be stored and accessed in an efficient manner can be used to identify an individual or to verify the identity of an individual. A biometric commonly used to identify human beings is one or more fingerprints belonging to the particular human being.
  • Fingerprint identification of a human being consists of two stages: enrollment and verification/identification. Enrollment of a fingerprint involves taking a fingerprint image (FPI) of an individual and storing the FPI itself or a plurality of data that is representative of the FPI in an FPI database. Identification of a fingerprint involves taking an FPI of an unknown individual and comparing the unknown FPI to the FPIs or FPI data that is stored in the FPI database. An identification is made when a match between the unknown FPI and an FPI stored in the FPI database is found that has a sufficient reliability that the probability of a false positive is below a predetermined threshold. Fingerprint verification or authentication matches an individual to a fingerprint that has been previously enrolled by that individual. Thus, identification involves searching for a match between a single unknown FPI and many stored FPIs. The verification process involves matching an unknown or unconfirmed fingerprint minutiae template to a single previously enrolled fingerprint minutiae template. Accordingly, the verification process is a one-to-one matching technique.
  • The use of biometrics to restrict access to secure entities such as computer networks, cryptographic keys, sensitive data, and physical locations is well known. In addition, smart cards, cards that have a biometric, such as a fingerprint, encoded thereon can be used to provide transaction security as well. A smart card allows a user to provide the biometric encoded on the card, wherein the encoded biometric data is compared to the biometric measured on the individual. In this way, a smartcard can positively authenticate the identity of the smartcard user.
  • However, traditional FPI data is based on the set of singularities that can be classified according to the type of singularity, e.g., deltas, arches, or whorls. In addition, FPIs contain fingerprint minutiae that are the end point of a ridge curve or a bifurcation point of a ridge curve. FPI images can be classified and matched according to data associated with the fingerprint minutiae. This data can include the position of the minutiae, the tangential direction of the minutiae, and the distance to other minutiae. These types of FPI data can lead to a high false acceptance or identification rate when the unknown FPI has only a few minutiae or if the unknown FPI is only a partial FPI that may or may not include the number of minutiae needed to accurately verify or identify the unknown FPI.
  • Therefore, what is needed is a method and apparatus to collect, analyze, and store FPI data such that an unknown or unverified FPI can be accurately verified or identified, whether the FPI is a complete print or only a partial print.
  • BRIEF SUMMARY OF THE INVENTION
  • A method of analyzing and recognizing fingerprint images that utilizes vector processing of a vector field that is defined as the tangential vector of the fingerprint ridge curves is disclosed. The raw fingerprint image is divided into blocks, each block is filtered to remove noise and the orientation direction of each block is found. This allows the ridge curves to be enhanced and approximated by piece-wise linear approximations. The piece-wise linear approximations to the ridge curves allow the minutiae to be extracted and classified and a fingerprint minutiae template to be constructed. An enrollment process gathers multiple fingerprint images, creates fingerprint minutiae templates corresponding to the fingerprint images, and stores the templates and other data associated with the respective individual or the enrolled fingerprint in a fingerprint database. In an identification or verification process an unknown raw fingerprint image is obtained via a fingerprint scanner and processed similarly to the enrollment process described above. The fingerprint minutiae template of the unknown fingerprint is compared to one or more previously enrolled fingerprint minutiae templates to identify or verify the identity of the individual associated with the unknown fingerprint. In addition, live finger detection can be accomplished in conjunction with the identification or verification processes through analysis of the fingerprint image thus enhancing the security of the overall system.
  • Other forms, features, and aspects of the above-described methods and system are described in the detailed description that follows.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • The invention will be more fully understood from the following detailed description taken in conjunction with the accompanying drawings in which:
  • FIG. 1 is a flow chart of a method for acquiring and enrolling fingerprint minutiae templates;
  • FIG. 2 is a flow chart of a method for extracting minutiae from a raw fingerprint image and forming a fingerprint minutiae template;
  • FIG. 3 is a schematic diagram of a direction filter suitable for use in the present fingerprint analysis method;
  • FIG. 4 is a flow chart of a method for identifying/verifying the identity of an individual using the presently described fingerprint analysis method;
  • FIG. 5 is a flow chart of a method for comparing an unknown fingerprint minutiae template with a previously enrolled fingerprint minutiae template;
  • FIG. 6 is block diagram for a system to control physical access using the fingerprint analysis methods described herein;
  • FIG. 7 is a block diagram for a system to control computer network access using the fingerprint analysis methods described herein;
  • FIG. 8 is a block diagram for a system to control access to a web page across the internet using the fingerprint analysis methods described herein;
  • FIG. 9 is a flow chart for a method of using the presently described fingerprint analysis methods in conjunction with a smartcard;
  • FIG. 10 is a flow chart for a method of detecting a live finger by analyzing the binary fingerprint image;
  • FIG. 11 is a flow chart for a method of detecting and classifying singularities found in the finger print image; and
  • FIG. 12 is a flow chart for a method of estimating the resolution of a raw fingerprint image.
  • DETAILED DESCRIPTION OF THE INVENTION
  • A fingerprint image (FPI) acquisition, analysis, storage, and recognition system is disclosed in which FPIs are acquired and a fingerprint template based upon the acquired FPI is created. The fingerprint template is stored and can be used to both identify an unknown FPI and to verify the identity of an FPI.
  • FIG. 1 is a block diagram of the enrollment process used to acquire an FPI and to store the corresponding fingerprint template. In particular, the raw FPI is acquired from a fingerprint sensor or scanner or a scanned FPI, as depicted in step 102. As used herein, a raw FPI is defined as an original fingerprint image captured by a fingerprint sensor or scanner or a raw fingerprint can be a digitally scanned image of a paper and ink fingerprint. A raw FPI includes a plurality of ridge curves and valleys interspersed between the various ridge curves corresponding to the ridges and valleys of the original fingerprint. The ridge curves and valleys form various structures that include singularities such as whorls, deltas, arches, and also include fingerprint minutiae that are the ending point of ridge curves or bifurcation points of ridge curves. Each of the minutiae has data associated therewith that is indicative of the position of the minutiae, the tangential direction of the minutiae, and the type of minutiae.
  • The raw FPI is processed to enhance the contrast between the ridge curves and valleys contained in the FPI, as depicted in step 104. As depicted in step 106, the quality of the enhanced FPI is evaluated and if the quality of the FPI is sufficiently high, the minutiae from the FPI are extracted and control is passed to step 108. If not, control passes to step 102 and another FPI is acquired. As depicted in step 108, the number of minutiae is examined and if there are sufficient minutiae, control is passed to step 110 where the minutiae are extracted from the FPI and an FPI template is formed. In general, the number of minutiae that are required is dependent upon the level of security that is required. A low security application may only require six minutiae that are able to be matched, while a high security application may require 12 or more minutiae that are able to be matched.
  • As used herein a fingerprint template is an undirected graph of minutiae extracted from an FPI. Each node in the fingerprint template is an individual minutia and each connecting segment in the graph connects two minutiae (i.e., graph nodes). Each connecting segment also includes data associated therewith, for example, cross points of the connecting segment with ridge curves, and the angles between the direction of the connecting segment and the tangential direction of the ridge curve at the intersecting point. In addition, the template can include data on the core and deltas associated with the FPI. For example, the FPI template can include data associated with a core or delta such as the position and direction of respective core and delta.
  • The fingerprint template is associated with the individual and then stored in a fingerprint template database, as depicted in step 112. If there is not a sufficient number of minutiae, control passes to step 102 and another raw FPI is acquired.
  • FIG. 2 is a flowchart that describes the various steps necessary to perform the image processing of the raw FPI, the minutiae extraction, and the FPI template formation. The steps depicted in FIG. 2 can be used to process raw FPIs for enrollment purposes, and raw FPIs for identification or identity verification purposes.
  • As depicted in FIG. 2, a raw FPI is acquired from a fingerprint scanner or from scanning a paper and ink fingerprint, or from a previously digitized FPI, as depicted in step 202. The raw FPI is separated into an array of non-overlapping blocks, as depicted in step 204. The block size can be selected based upon various parameters such as the size of the FPI, the amount of data contained therein, and the processor speed. Preferably, the block size is selected as a function of the resolution of the FPI such that within each block, the ridge curves can be approximated by straight lines. In one preferred embodiment, the block size is given by R/25 and rounded to the closest power of 2, where R is the resolution of the FPI in dots/inch. In the illustrated embodiment, the resolution of a typical fingerprint scanner is approximately 500 dpi, and the FPI is divided into 256 equal-size blocks in a 16×16 block pattern. In another embodiment, the block size may be varied within an FPI depending upon the size of the object within the FPI that is to be processed.
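  • For illustration, a minimal sketch of the block-size rule and the blocking step follows; the function names, the NumPy tiling approach, and the log-space rounding to the nearest power of 2 are assumptions of this sketch rather than details from the patent.

```python
import numpy as np

def block_size_for_resolution(resolution_dpi: int) -> int:
    """Block side length: R/25 rounded to the closest power of 2 (e.g., 500 dpi -> 16)."""
    raw = resolution_dpi / 25.0
    exponent = int(round(np.log2(raw)))
    return 2 ** exponent

def split_into_blocks(image: np.ndarray, block: int) -> np.ndarray:
    """Split a grayscale image into non-overlapping block x block tiles (edges cropped)."""
    h, w = image.shape
    h_crop, w_crop = (h // block) * block, (w // block) * block
    tiles = image[:h_crop, :w_crop].reshape(h_crop // block, block, w_crop // block, block)
    return tiles.swapaxes(1, 2)          # shape: (block rows, block cols, block, block)

# 500 dpi scanner: 500 / 25 = 20, and the closest power of 2 is 16.
print(block_size_for_resolution(500))    # 16
```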
  • The blocked image is processed to provide one or more regions of interest, as depicted in step 206. A region of interest in the FPI is a portion or portions of the FPI containing the ridge curves and valleys of the FPI; the remaining portion or portions of the FPI do not contain any significant fingerprint data. To determine the regions of interest, the FPI is separated into foreground blocks and background blocks, as depicted in step 206. In one embodiment, the mean and variance of the pixel intensities are determined for each block. A predetermined mean threshold and variance threshold are selected and a k-nearest neighbor clustering algorithm is used to classify all blocks within the k-nearest neighbors as a foreground block or a background block. In a preferred embodiment, a convex hull is formed that includes all of the blocks determined to be foreground blocks. A second check of all background blocks is made to ensure that noise or other interference has not inadvertently switched a foreground block into a background block. A check is made to determine if the center of a previously determined background block is contained within the convex hull formed by the foreground blocks. If so, the background block is converted into a foreground block.
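  • The sketch below illustrates this foreground/background segmentation. It replaces the k-nearest-neighbor classifier with a plain variance threshold and uses SciPy's Delaunay triangulation for the convex-hull recovery check, so the threshold value and those implementation choices are assumptions of the sketch.

```python
import numpy as np
from scipy.spatial import Delaunay

def segment_foreground(tiles: np.ndarray, var_threshold: float = 100.0) -> np.ndarray:
    """Label blocks as foreground (True) or background (False).

    `tiles` has shape (rows, cols, block, block).  Ridge/valley blocks have a high
    pixel-intensity variance, blank background varies little, so a simple variance
    threshold stands in for the patent's mean/variance k-nearest-neighbor classifier.
    """
    variance = tiles.var(axis=(2, 3))
    foreground = variance > var_threshold

    # Second pass: any background block whose center lies inside the convex hull
    # of the foreground-block centers is switched back to foreground.
    fg_centers = np.argwhere(foreground)
    if len(fg_centers) >= 3:
        hull = Delaunay(fg_centers)                    # triangulates the hull interior
        bg_centers = np.argwhere(~foreground)
        inside = hull.find_simplex(bg_centers) >= 0    # -1 means outside the hull
        foreground[tuple(bg_centers[inside].T)] = True
    return foreground
```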
  • The regions of interest in the FPI are filtered to remove random noise in order to form a clearer filtered image, as depicted in step 208. Random noise is typically high frequency noise and accordingly a low pass filter is used to smooth out the high frequency noise from the foreground blocks of the blocked image. In one embodiment, the low pass filter is a Gaussian filter. The Gaussian filter can be a 2-dimensional filter mask that, when convolved with each pixel within each of the foreground blocks, removes the high frequency noise contained within the FPI.
  • The orientation angle and magnitude of each of the foreground blocks in the filtered image are found, forming an orientation image, as depicted in step 210. In general, the orientation angle and magnitude are found by determining the gradient in the x and y directions. In one embodiment, a Sobel differential operator is applied to each foreground block to determine the orientation angle and amplitude. In the event that the orientation amplitude is below a predetermined threshold, a Hough transformation is used to estimate the orientation angle.
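  • A minimal sketch of the per-block orientation estimate follows. The Sobel gradients match the text above; combining them with the double-angle (least-squares) formula is a standard choice assumed by this sketch, and the Hough-transform fallback for weak blocks is omitted.

```python
import numpy as np
from scipy.ndimage import sobel

def block_orientation(block: np.ndarray):
    """Estimate the dominant ridge orientation and a strength value for one block.

    Gradients come from the Sobel operator, as in the patent; the double-angle
    averaging used to turn pixel gradients into one block angle is an assumed,
    commonly used formula.
    """
    gx = sobel(block.astype(float), axis=1)           # gradient in x
    gy = sobel(block.astype(float), axis=0)           # gradient in y
    vx = 2.0 * np.sum(gx * gy)
    vy = np.sum(gx ** 2 - gy ** 2)
    angle = 0.5 * np.arctan2(vx, vy) + np.pi / 2.0    # ridge direction, radians
    magnitude = np.hypot(vx, vy)                      # coherence-like strength
    return angle, magnitude
```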
  • The contrast between the ridge curves and the valleys in the orientation image is increased forming a ridge-enhanced FPI, as depicted in step 212. In particular, a plurality of directional filters each corresponding to a foreground block smoothes out the differences along the ridge curves and intensifies the contrast between the ridge curves and valleys within the corresponding block. In one embodiment, the directional filter is a 2-dimensional mask having an x and y direction. The y direction of the mask is intended to amplify the fingerprint ridge curves and to negatively amplify the valleys. In one embodiment, the directional filter is a Gaussian filter along the ridge direction.
  • A directional filter mask is depicted in FIG. 3 in which the filter mask 300 is a square in which the side length is equal to the period of the signal, or the period of the signal plus 1, whichever is an odd number. The middle rows 302 are selected to enhance the ridges, and the side rows 310 are used to negatively amplify the valleys. There may be transition rows 308 between the middle rows 302 and the side rows 310 that have coefficients equal to zero. The center coefficient, a_0 ( 305 ), of the center row 304 is set to a_0, and the coefficients of the center row 304 are cosine tapered to edge values of a_0/4, forming a symmetric row. In the illustrated embodiment, a_0 is set to 1000 and the center row 304 is cosine tapered to a value of 250 at each edge. The coefficients of the middle rows 302 are cosine tapered from the value of the center row to a value of a_{0,i}/1.41, where a_{0,i} is the value of the i-th coefficient of the center row 304. The value of each coefficient of the side rows 310 is given by b_i = -(1/2) Σ_{j=1}^{m} a_{i,j},
    where i is the i-th coefficient of the side row and m is the number of middle rows. Once the directional filter mask for a block has been determined, the directional filter mask is convolved with the pixels in the corresponding block.
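  • The sketch below builds one such mask. The text does not fully specify how many middle, transition, and side rows the mask has, so the layout here (one middle row above and below the center, one zero transition row, one side row at each edge) and the cosine-taper formula are assumptions of the sketch.

```python
import numpy as np

def directional_filter_mask(period: int, a0: float = 1000.0) -> np.ndarray:
    """Build a cosine-tapered directional filter mask in the spirit of FIG. 3.

    The side length is the ridge period rounded up to an odd number.  The center row
    tapers from a0 at the middle column to a0/4 at the edges, the adjacent middle
    rows carry the center-row values divided by 1.41, and each side-row coefficient
    is -(1/2) times the sum of the middle-row coefficients in that column.  The row
    layout is an assumption; rows left untouched stay zero (transition rows).
    """
    size = period if period % 2 == 1 else period + 1
    half = size // 2
    t = np.abs(np.arange(size) - half) / half               # 0 at center, 1 at the edges
    center_row = a0 / 4 + (a0 - a0 / 4) * 0.5 * (1 + np.cos(np.pi * t))
    middle_row = center_row / 1.41
    mask = np.zeros((size, size))
    mask[half] = center_row
    mask[half - 1] = mask[half + 1] = middle_row
    # Side rows balance the positive rows column by column (negative valley response).
    mask[0] = mask[-1] = -0.5 * (center_row + 2 * middle_row)
    return mask
```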
  • The ridges and valleys of the ridge-enhanced FPI are then separated into one of two binary values, a first binary value for a ridge pixel and a second binary value for a valley pixel, forming a binary fingerprint image, as depicted in step 214. In particular, the image binarization is accomplished by establishing a binary threshold and comparing the intensity value of each pixel to the binary threshold. A pixel having a pixel value greater than the binary threshold is set to a first value and a pixel having a pixel value less than the binary threshold is set to a second value. In one embodiment in which the maximum pixel intensity is 255, the binary threshold is one-half the maximum pixel intensity or 128. The first value is equal to 255 and the second value is equal to zero.
  • The ridge curves and valleys of the binary FPI are thinned to a predetermined width, which in the illustrated embodiment is a single pixel forming a thinned image, as depicted in step 216. The thinning may be accomplished with thinning algorithms that are known in the art.
  • The thinned ridge curves and valleys in the thinned image are approximated by piece-wise linear segments forming a piece-wise linear FPI, as depicted in step 218. The thinned ridge curves are represented by chain code connecting the start and end points of each ridge curve within a corresponding block. A line segment connecting the start and end points of the respective ridge curve is formed and the maximum distance between the line segment and the ridge curve is determined. If this distance is greater than a predetermined maximum value, two line segments approximate the ridge curve. A first line segment is formed from the start point to the point on the ridge curve having the maximum distance from the original line segment. A second line segment is formed from the end point of the first line segment to the end point of the ridge curve. This process is continued iteratively until the distance between the ridge curve and any point on the piece-wise linear approximating segments is less than the predetermined maximum value.
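  • This iterative split is essentially a recursive polyline simplification; a minimal sketch is given below. It assumes the ridge curve is supplied as an ordered array of (row, col) pixel coordinates, and the distance threshold value is illustrative.

```python
import numpy as np

def piecewise_linear(curve: np.ndarray, max_dist: float = 2.0) -> np.ndarray:
    """Approximate an ordered ridge curve (N x 2 points) by line segments.

    A segment joins the start and end points; if some curve point lies farther than
    max_dist from that segment, the curve is split at the farthest point and both
    halves are processed again, as described in the text.  Returns the indices of
    the retained vertices.
    """
    start, end = curve[0].astype(float), curve[-1].astype(float)
    chord = end - start
    length = np.hypot(*chord)
    if length == 0:                                   # degenerate or closed curve
        dists = np.hypot(*(curve - start).T)
    else:                                             # perpendicular distance to the chord
        dx, dy = chord
        dists = np.abs(dx * (curve[:, 1] - start[1]) - dy * (curve[:, 0] - start[0])) / length
    split = int(np.argmax(dists))
    if dists[split] <= max_dist or len(curve) <= 2:
        return np.array([0, len(curve) - 1])
    left = piecewise_linear(curve[: split + 1], max_dist)
    right = piecewise_linear(curve[split:], max_dist) + split
    return np.concatenate([left[:-1], right])
```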
  • The fingerprint minutiae are extracted from the piece-wise linear FPI, as depicted in step 220. In general, minutiae are classified as either ending minutiae or bifurcation minutiae. Ending minutiae are defined as the end points of ridge curves in an FPI, and bifurcation minutiae are defined as the crossing points of two ridge curves in an FPI. In particular, a connection number is computed for each pixel in a corresponding block, wherein the connection number is indicative of whether a pixel is a fingerprint minutia and, if so, what type of minutia the corresponding pixel is. The connection number is equal to $CN = \tfrac{1}{2}\sum_{i=1}^{8}\left| P_i - P_{i+1} \right|$, where P1 through P8 are the values of the 8 pixels surrounding the pixel of interest, taken in circular order with P9 = P1. The connection number corresponds to the properties detailed in Table 1:
    TABLE 1
    Connection number (CN)    Property
    0                         Pixel is an isolated point
    1                         Pixel is an end point
    2                         Pixel is a continuing point
    3                         Pixel is a branching point
    4                         Pixel is a crossing point

    For a CN value of 1 or 3, the angle of the end point or the branching point relative to the associated ridge curve is determined. The minutia type, the x-y position of the minutia, and the angle of the minutia relative to the respective ridge curve are then determined and stored.
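A sketch of this connection-number computation over a thinned 0/1 image follows; it uses the standard form in which the ring of 8 neighbours is closed (P9 = P1) so that a crossing point can reach CN = 4, and it omits the angle computation for brevity.

```python
import numpy as np

# The 8 neighbours of a pixel, listed in a closed circular order around it.
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def connection_number(skel, r, c):
    """Half the number of 0/1 transitions around pixel (r, c) of a thinned 0/1 image."""
    p = [int(skel[r + dr, c + dc]) for dr, dc in NEIGHBOURS]
    p.append(p[0])                                       # close the ring: P9 = P1
    return sum(abs(p[i] - p[i + 1]) for i in range(8)) // 2

def extract_minutiae(skel):
    """Return (row, col, kind) for every ending and bifurcation pixel of the skeleton."""
    minutiae = []
    for r in range(1, skel.shape[0] - 1):
        for c in range(1, skel.shape[1] - 1):
            if skel[r, c] != 1:
                continue
            cn = connection_number(skel, r, c)
            if cn == 1:
                minutiae.append((r, c, 'end'))           # ending minutia
            elif cn == 3:
                minutiae.append((r, c, 'branch'))        # bifurcation minutia
    return minutiae
```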
  • The extracted minutiae are further processed to remove false minutiae, leaving true minutiae, as depicted in step 222. As can be appreciated, a large number of false minutiae can be created and detected during the processing steps prior to this step. These false minutiae may arise from small ridge segments, ridge breaks, minutiae at the image boundary, and noise.
  • Every minutia extracted in step 220 is analyzed to determine whether it belongs to a broken ridge curve or a noisy link, or whether it is a boundary minutia. A broken ridge curve occurs when two minutiae are within a predetermined distance of one another and the directions of the respective minutiae are opposite to one another. If the number of minutiae within a specified area exceeds a predetermined threshold, the minutiae are considered to be part of a noisy link. If a minutia occurs along the boundary of the FPI, it is considered to be a boundary minutia. In the event that an extracted minutia belongs to one of these three classes, it is deleted from the extracted minutiae list.
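A rough sketch of these three tests is given below; the distance, density, and border thresholds (break_dist, noisy_radius, noisy_count, border) are illustrative values chosen here, not figures from the description above.

```python
import numpy as np

def remove_false_minutiae(minutiae, image_shape, break_dist=8.0,
                          noisy_radius=12.0, noisy_count=4, border=10):
    """Filter a list of (row, col, angle, kind) minutiae, dropping broken-ridge
    pairs, noisy links, and boundary minutiae."""
    rows, cols = image_shape
    pts = np.array([(m[0], m[1]) for m in minutiae], dtype=float)
    angs = np.array([m[2] for m in minutiae], dtype=float)
    keep = np.ones(len(minutiae), dtype=bool)
    for i in range(len(minutiae)):
        r, c = pts[i]
        # Boundary minutia: too close to the edge of the FPI.
        if r < border or c < border or r > rows - border or c > cols - border:
            keep[i] = False
            continue
        d = np.linalg.norm(pts - pts[i], axis=1)
        diff = (angs - angs[i] + np.pi) % (2 * np.pi) - np.pi    # wrapped angle difference
        # Broken ridge: a nearby minutia pointing in roughly the opposite direction.
        if np.any((d > 0) & (d < break_dist) & (np.abs(np.pi - np.abs(diff)) < 0.3)):
            keep[i] = False
            continue
        # Noisy link: too many minutiae crowded into a small neighbourhood.
        if np.count_nonzero(d < noisy_radius) > noisy_count:
            keep[i] = False
    return [m for m, k in zip(minutiae, keep) if k]
```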
  • A fingerprint minutiae template is then formed from the true minutiae, as depicted in step 224. In particular, a fingerprint minutiae template is an undirected graph in which the true minutiae are the nodes and the line segments connecting pairs of nodes form the edges of the graph. Each true minutia is connected only to other true minutiae within a predetermined distance of it. Data associated with the intersections between a graph edge and any of the ridge curves in the FPI is also stored. This data can include the locations of the intersections, i.e., the intersection points, and the angles between the graph edge and the tangential direction of the ridge curve.
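One possible in-memory representation of such a template is sketched below; the MinutiaNode/MinutiaTemplate containers and the max_edge_len parameter are assumptions for illustration, and the ridge-intersection data mentioned above would be attached to each edge in a fuller implementation.

```python
import math
from dataclasses import dataclass, field

@dataclass
class MinutiaNode:
    x: float
    y: float
    angle: float
    kind: str                                   # 'end' or 'branch'

@dataclass
class MinutiaTemplate:
    nodes: list = field(default_factory=list)
    edges: list = field(default_factory=list)   # (i, j, length), i < j

def build_template(minutiae, max_edge_len=60.0):
    """Connect every pair of true minutiae closer than max_edge_len into an
    undirected graph whose nodes are the minutiae and whose edges are the
    connecting line segments."""
    tpl = MinutiaTemplate(nodes=list(minutiae))
    for i in range(len(minutiae)):
        for j in range(i + 1, len(minutiae)):
            d = math.hypot(minutiae[i].x - minutiae[j].x,
                           minutiae[i].y - minutiae[j].y)
            if d <= max_edge_len:
                tpl.edges.append((i, j, d))
    return tpl
```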
  • FIG. 4 depicts a block diagram of an embodiment of the verification/identification process. A raw FPI is acquired from a fingerprint sensor or scanner, as depicted in step 402. The acquired FPI is processed, as depicted in step 404, and if the image is suitable for minutiae extraction, as depicted in step 406, the number of minutiae that exist in the FPI is determined, as depicted in step 408. If sufficient minutiae exist in the FPI, the minutiae are extracted and a fingerprint minutiae template is formed as described with respect to FIG. 2, as depicted in step 410. If the image is not suitable for minutiae extraction, control passes back to step 402 and a new raw FPI is acquired.
  • If the fingerprint minutiae template is formed, one or more of the previously enrolled templates are compared to the fingerprint minutiae template of the raw FPI, as depicted in step 412. In the verification process, a single enrolled template that is known a priori may be compared to the template of the raw FPI in a one-to-one matching scheme, where the alleged identity of the individual to be verified is known. In the identification process, many of the enrolled templates are compared to the template of the raw FPI in a one-to-many matching scheme. As discussed in more detail below, the enrolled templates and the template of the raw FPI may be classified according to various characteristics, such as the presence of singularities in the FPI, to reduce the number of enrolled fingerprint templates to be searched. The number of minutiae that are matched is compared to a predetermined verification threshold, as depicted in step 414, and if the number of matched minutiae exceeds the predetermined verification threshold, the enrolled template and the unknown/unverified template of the raw FPI are considered matched, as depicted in step 416. Accordingly, the person is identified or verified as the individual associated with the enrolled template. If the individual associated with the unknown/unverified FPI is cleared for entry into a secure entity such as a computer, a data network, or a physical space, entry is granted, as depicted in step 418. Otherwise, control is passed back to step 402 for acquisition of another FPI.
  • FIG. 5 depicts an embodiment of a matching process suitable for use with the identification/verification methods described herein. Having acquired an enrolled fingerprint template and a fingerprint template to be identified/verified, the process first finds all node pairs (A,B) that are locally matched, as depicted in step 502, where A is a minutiae node from the enrolled template and B is a minutiae node from the template to be identified/verified. For each identified node pair (A,B), a transformation T(B→A) is formed, as depicted in step 504. The transformation T(B→A) is defined as the translation of B to A and the rotation of B necessary to align B to A. Each node pair (A,B) is further used as an anchor node, and a neighborhood match is performed in the neighborhood of the anchor node using the corresponding transformation T(B→A), as depicted in step 506. The transformed minutiae nodes in the neighborhood of the node pair (A,B) in each template are compared with one another, and if the differences in position and rotation between corresponding minutiae are less than a predetermined matching threshold, the minutiae are considered to be matched, as depicted in steps 508, 510, and 512. For each node pair (A,B), the number of matched minutiae is counted, as depicted in step 514. The number of matched minutiae is compared to a matching threshold, as depicted in step 516. If the number of matched minutiae exceeds the matching threshold, the fingerprint templates are considered to be matched, as depicted in step 518; otherwise, control is returned to step 502.
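The anchor-and-transform matching loop can be sketched as follows, with each minutia represented as an (x, y, angle, kind) tuple; the position and angle tolerances and the match threshold are illustrative values rather than the predetermined thresholds referred to above.

```python
import math

def transform(node, anchor_b, anchor_a):
    """Apply T(B->A): rotate and translate a minutia (x, y, angle, kind) from
    template B's frame so that anchor_b lands exactly on anchor_a."""
    dtheta = anchor_a[2] - anchor_b[2]
    cos_t, sin_t = math.cos(dtheta), math.sin(dtheta)
    dx, dy = node[0] - anchor_b[0], node[1] - anchor_b[1]
    return (anchor_a[0] + dx * cos_t - dy * sin_t,
            anchor_a[1] + dx * sin_t + dy * cos_t,
            (node[2] + dtheta) % (2 * math.pi),
            node[3])

def count_neighbourhood_matches(minutiae_a, minutiae_b, anchor_a, anchor_b,
                                pos_tol=8.0, ang_tol=0.3):
    """Number of minutiae of B that land on a minutia of A after T(B->A)."""
    matched = 0
    for nb in minutiae_b:
        tx, ty, ta, tk = transform(nb, anchor_b, anchor_a)
        for na in minutiae_a:
            d = math.hypot(na[0] - tx, na[1] - ty)
            da = abs((na[2] - ta + math.pi) % (2 * math.pi) - math.pi)
            if na[3] == tk and d < pos_tol and da < ang_tol:
                matched += 1
                break
    return matched

def match_templates(minutiae_a, minutiae_b, match_threshold=12, **tols):
    """Try every locally compatible node pair (A, B) as an anchor and report a
    match when any anchor yields at least match_threshold agreeing minutiae."""
    best = 0
    for a in minutiae_a:
        for b in minutiae_b:
            if a[3] != b[3]:                  # only pair minutiae of the same type
                continue
            best = max(best, count_neighbourhood_matches(minutiae_a, minutiae_b, a, b, **tols))
            if best >= match_threshold:
                return True, best
    return False, best
```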
  • FIG. 6 depicts a block diagram of a physical access control system 600. A fingerprint scanner 602 is used to scan a fingerprint. The scanned FPI is provided to a fingerprint server 606 that contains fingerprint templates of enrolled individuals. The fingerprint server 606 creates a fingerprint minutiae template of the scanned FPI and compares the template to the previously enrolled templates corresponding to the individuals cleared for access to the secure location. A positive match between the fingerprint minutiae template of the scanned FPI and one of the previously enrolled fingerprint minutiae templates will positively identify the individual if enrolled. The fingerprint server 606 provides match/no-match indicia to the physical access device 604, which allows access into the secured area. Note that the actual identity of the person seeking to gain entrance does not have to be ascertained, although it may be; only the occurrence of a match between one of the group of enrolled fingerprint templates and the fingerprint minutiae template of the scanned FPI is required. However, in a further embodiment, additional conventional identification-establishing technologies may be implemented in conjunction with the fingerprint analysis and identification/verification described herein.
  • FIG. 7 depicts a block diagram of a network logon control system 700. A fingerprint sensor or scanner 702 coupled to a user PC 704 is used to provide scanned fingerprint data across a data network 706 to a fingerprint server 708. The fingerprint server 708 creates a fingerprint minutiae template of the scanned FPI and compares this template to the previously enrolled fingerprint minutiae templates corresponding to the individuals cleared for access to the computer network. A match between the newly created fingerprint minutiae template and one or more of the previously enrolled fingerprint minutiae templates indicates that the individual is allowed access to the computer network. In addition, the fingerprint server 708 can positively identify the particular individual seeking access and, once verified, provide the identity and the relevant data of the individual to the network server 710.
  • FIG. 8 depicts a block diagram of an internet logon control system 800. A fingerprint sensor or scanner 802 coupled to a user PC 804 is used to provide a scanned fingerprint across the internet 806 to a fingerprint server 808 that may be associated with a particular web page or associated with a secure financial transaction that occurs over the internet. The fingerprint server 808 creates a fingerprint minutiae template of the scanned FPI and compares this template to the previously enrolled fingerprint minutiae templates corresponding to the individuals cleared for access to the computer network. A match between the newly created fingerprint minutiae template and the previously enrolled fingerprint minutiae templates indicates that the individual is allowed access to the associated web page or that the financial transaction is properly authorized. In addition, the fingerprint server 808 can positively identify the particular individual seeking access, and once verified, provide the identity of the individual to the application servers 810 along with personal data associated with the particular individual.
  • FIG. 9 depicts a flow chart for a method of comparing a fingerprint minutiae template with a fingerprint minutiae template previously stored on a smartcard. A smartcard can be used either at a point-of-service transaction location or across a network such as the internet to positively identify the individual that is authorized to use the smartcard. An FPI is obtained from a fingerprint sensor or scanner, as depicted in step 902. The FPI is processed, as depicted in step 904. If the image is of sufficient quality, as depicted in step 906, and sufficient minutiae are identified, as depicted in step 908, the FPI is analyzed and processed as described above with respect to FIG. 2, the minutiae are extracted from the FPI, and a fingerprint minutiae template is created, as depicted in step 910. Otherwise, a new FPI is obtained and control is passed to step 902. The extracted minutiae and the fingerprint minutiae template formed from the acquired FPI are compared to the fingerprint minutiae template stored on the smartcard, as depicted in step 912. If a match occurs, as depicted in step 914, the identity of the smartcard holder is verified, as depicted in step 916; otherwise, control is passed to step 902 and a new FPI is obtained.
  • The verification and identification functions described herein are based on the premise that a finger being presented and scanned by the fingerprint scanner is a live finger and not a prosthetic or severed finger bearing a false fingerprint. FIG. 10 depicts a flow chart for a live finger detection method that may be used in conjunction with the identification and verification methods described herein. The binary fingerprint image of step 214 in FIG. 2 is further analyzed to detect the presence and size of sweat pores contained within the fingerprint image. The binary image is provided, as depicted in step 1002. The boundaries of the binary image are traced and chain coded, as depicted in step 1004. All clockwise closed chains are detected, as depicted in step 1006, and the area and arc length of the detected closed chains are measured, as depicted in step 1008. Although clockwise closed chains are used to identify sweat pores, counter-clockwise closed chains can also be used. The measured area is compared to a sweat pore threshold, and if it is greater than the sweat pore threshold, the closed chain is a detected sweat pore. If a detected sweat pore exceeds the live finger sweat pore threshold, the finger is flagged as live, as depicted in step 1010. In the illustrated embodiment, in which the fingerprint sensor/scanner has a 500 dpi resolution, the sweat pore threshold is four pixels. Otherwise, the finger is flagged as non-living, no further processing is employed, and the identity of the individual is not confirmed. If the finger is living, the measured arc length is compared to a hole threshold, and if it is less than the hole threshold, the chain is removed, as depicted in step 1012. In this manner, chains having an arc length less than the hole threshold are considered to be noise and are therefore removed.
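A sketch of this pore-based liveness test using OpenCV contour tracing is shown below; holes inside the ridges are taken as the closed chains, the four-pixel pore area follows the 500 dpi example, and the remaining thresholds and the 8-bit binary input are assumptions.

```python
import cv2
import numpy as np

def detect_live_finger(binary_fpi, pore_area_px=4.0, noise_arc_len=3.0):
    """Trace closed boundaries of the binary fingerprint image (uint8, 0/255)
    and look for sweat pores, i.e. small holes inside the ridges.

    Returns (is_live, pore_count, kept_chains)."""
    found = cv2.findContours(binary_fpi, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE)
    contours, hierarchy = found[-2], found[-1]          # OpenCV 3/4 compatible unpacking
    pores, kept = 0, []
    if hierarchy is None:
        return False, 0, kept
    for cnt, h in zip(contours, hierarchy[0]):
        if h[3] < 0:
            kept.append(cnt)                            # outer ridge boundary, not a hole
            continue
        if cv2.arcLength(cnt, True) < noise_arc_len:
            continue                                    # tiny closed chain: treated as noise, removed
        kept.append(cnt)
        if cv2.contourArea(cnt) > pore_area_px:
            pores += 1                                  # hole large enough to count as a sweat pore
    return pores > 0, pores, kept
```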
  • In some circumstances, it may be desirable to identify the locations of the cores and deltas in the FPI, estimate the main directions of the cores, and classify the FPI into one of various categories. FIG. 11 depicts a flow chart of a method of identifying the locations of the cores and deltas, estimating the directions, and classifying the FPI. The orientation field corresponding to an FPI from step 210 of FIG. 2 is provided, as depicted in step 1102. The orientation field is refined, as depicted in step 1104, by subdividing each block into four sub-blocks, as depicted in step 1106. The orientation of each sub-block is predicted, using the original orientation direction as the predictor, as depicted in step 1106. An octagonal core mask is created that is a vector-valued 2-dimensional matrix having as its value a unit vector radial from the center of the corresponding sub-block, as depicted in step 1108. The center of the core mask is aligned with the corresponding sub-block and is convolved with the sub-blocks in the FPI, as depicted in step 1110.
  • The convolution result of the core mask and the sub-blocks is normalized, as depicted in step 1112, and core and delta regions are identified as regions having large convolution results, i.e., the singularities of the FPI, as depicted in step 1114. The Poincare index is determined for all areas of the FPI having a convolution result greater than a predetermined curve threshold, as depicted in step 1116. The Poincare index is found by surrounding each area with a closed curve and performing a direction integration along the curve. If the direction integration equals zero, as depicted in step 1118, the diameter of the closed curve is reduced, as depicted in step 1120, and the direction integration is performed again. This step is repeated until the radius is one, as depicted in step 1122, or the integration is non-zero, as depicted in step 1118.
  • The singularities of the FPI are classified according to the value of the corresponding Poincare index, as depicted in step 1116. For a Poincare index of 1, the singularities are classified as whorls and are clustered according to the corresponding Euclidean distance from an arbitrary origin. If there is more than one whorl cluster, the biggest cluster is selected and the smaller clusters are deleted. For a Poincare index of 0.5, the singularities are classified as cores and are clustered according to the corresponding Euclidean distance from the arbitrary origin. If there are more than three clusters of cores, the largest two are kept and the remaining core clusters are deleted. For a Poincare index of −0.5, the singularities are classified as deltas and are clustered according to the corresponding Euclidean distance from the arbitrary origin. If there is one whorl cluster and three or more delta clusters, the largest two delta clusters are kept and the remaining delta clusters are deleted. If there is no whorl cluster and one or more delta clusters, the largest two delta clusters are kept and the remaining delta clusters are deleted.
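The Poincare-index computation and the classification by index value can be sketched as follows, assuming a block orientation field in radians (defined modulo pi); the tolerance used to bucket the index values is an assumption.

```python
import numpy as np

# The 8-neighbour path around a block, closed back to its starting point.
RING = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

def poincare_index(orientation, r, c):
    """Sum the wrapped orientation differences along the closed ring of blocks
    around (r, c) and express the result in turns (multiples of 2*pi)."""
    total = 0.0
    for k in range(8):
        a0 = orientation[r + RING[k][0], c + RING[k][1]]
        a1 = orientation[r + RING[k + 1][0], c + RING[k + 1][1]]
        d = a1 - a0
        # Ridge orientations are only defined modulo pi, so wrap to (-pi/2, pi/2].
        while d > np.pi / 2:
            d -= np.pi
        while d <= -np.pi / 2:
            d += np.pi
        total += d
    return total / (2 * np.pi)

def classify_singularity(index, tol=0.1):
    """Map the Poincare index onto the singularity classes used above."""
    if abs(index - 1.0) < tol:
        return 'whorl'
    if abs(index - 0.5) < tol:
        return 'core'
    if abs(index + 0.5) < tol:
        return 'delta'
    return 'none'
```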
  • For any cores detected in step 1116, the directions of the cores are estimated, as depicted in step 1118. The core mask from step 1106 is convolved with the core singularity, and the direction is estimated from the result: the displacement from the core center to the mass center of all zero sub-blocks lies along the main direction of the core.
  • If no cores or whorl clusters are identified, cores near the boundary of the FPI are estimated. The cores near the boundary are estimated by treating as core singularities those sub-blocks near the boundary having a convolution value in the top 20% of values. The cores are then processed as described above.
  • The FPI is then classified as a whorl, right loop, left loop, arch, or double loop. An FPI having a single whorl cluster is classified as a whorl. An FPI having a core cluster and one or fewer delta clusters is a loop. If the cross product of the vector from the core to the delta with the main direction of the core is along the normal direction of the fingerprint plane, the fingerprint is a right loop. Otherwise, if the cross product is against the normal, the fingerprint is a left loop. If the cross product is nearly zero, the fingerprint is an arch. If there are two core clusters and two or fewer delta clusters, the fingerprint is a double loop. If there is no core, the fingerprint is an arch.
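A small sketch of the cross-product test described above; the sign convention depends on the image coordinate system, and the 'nearly zero' tolerance is an illustrative value.

```python
import numpy as np

def classify_loop(core_xy, core_direction, delta_xy, flat_tol=0.05):
    """Classify a single-core fingerprint as a right loop, left loop, or arch
    from the z-component of (delta - core) x core_direction."""
    v = np.asarray(delta_xy, dtype=float) - np.asarray(core_xy, dtype=float)
    d = np.asarray(core_direction, dtype=float)
    cross_z = v[0] * d[1] - v[1] * d[0]                  # z-component of the 2-D cross product
    norm = np.linalg.norm(v) * np.linalg.norm(d) + 1e-12
    if abs(cross_z) / norm < flat_tol:
        return 'arch'                                    # cross product nearly zero
    return 'right loop' if cross_z > 0 else 'left loop'
```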
  • In some circumstances, the raw fingerprint images from which one or more fingerprint minutiae templates are formed are obtained from fingerprint scanners or sensors that have different resolutions. Generally, the automated fingerprint identification/verification process described herein assumes that all of the raw FPIs are of the same resolution. Although this may be true for most fingerprint scanners, if the FPI has been previously digitized from film, the resolution information may not have been included with the FPI. Without a priori knowledge of the resolution of the FPI, extra processing is required to ensure that the images being processed are of similar resolution.
  • FIG. 12 depicts a method for use with the methods described herein to determine the resolution of an FPI having an unknown resolution. The raw FPI acquired in step 1202 is divided into 16 blocks, as depicted in step 1204. For each block, the Fourier transform is computed as depicted in step 1206. The magnitude of the Fourier coefficients is determined, as depicted in step 1208. The Fourier coefficients are classified according to the corresponding spatial frequency, as depicted in step 1210. The average magnitude of the components for each spatial frequency is determined, as depicted in step 1212. The spatial frequency having the largest average magnitude is an estimation of the ridge distance of the raw FPI, as depicted in step 1214, and may be used to adjust the processing to allow for FPIs of similar resolution to be compared.
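The block-FFT estimate of ridge spacing might look like the following sketch, assuming a 2-D grayscale FPI; the number of spatial-frequency buckets (n_bins) is an assumption introduced here to average magnitudes per frequency.

```python
import numpy as np

def estimate_ridge_period(fpi, grid=4, n_bins=64):
    """Estimate the dominant ridge period (pixels per ridge cycle) of a raw FPI
    by averaging FFT magnitudes over a grid x grid division of the image."""
    h, w = fpi.shape
    bh, bw = h // grid, w // grid
    sums = np.zeros(n_bins)
    counts = np.zeros(n_bins)
    # Spatial frequency (cycles/pixel) of every FFT bin of a block.
    fy = np.fft.fftfreq(bh)[:, None]
    fx = np.fft.fftfreq(bw)[None, :]
    radial = np.sqrt(fy ** 2 + fx ** 2)
    bins = np.minimum((radial / radial.max() * (n_bins - 1)).astype(int), n_bins - 1)
    for by in range(grid):
        for bx in range(grid):
            block = fpi[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw].astype(float)
            mag = np.abs(np.fft.fft2(block - block.mean()))   # drop the DC component first
            sums += np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
            counts += np.bincount(bins.ravel(), minlength=n_bins)
    avg = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
    avg[0] = 0.0                                              # ignore the near-DC bucket
    best_bin = int(np.argmax(avg))
    best_freq = max(best_bin / (n_bins - 1) * radial.max(), 1e-6)   # cycles per pixel
    return 1.0 / best_freq                                    # ridge spacing in pixels
```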
  • Those of ordinary skill in the art should further appreciate that variations to and modifications of the above-described methods for identifying and verifying fingerprints can be made. Accordingly, the invention should be viewed as limited solely by the scope and spirit of the appended claims.

Claims (1)

1. A method for fingerprint recognition, the method comprising the steps of:
acquiring a fingerprint image;
dividing the fingerprint image into blocks, thereby forming a blocked fingerprint image;
separating the blocked fingerprint image into foreground blocks and background blocks;
determining an orientation angle and amplitude for each of the foreground blocks, thereby forming an orientation field of the fingerprint;
creating a ridge-enhanced image of the fingerprint;
creating a binary image of the fingerprint;
creating a piecewise linear image of the fingerprint;
extracting minutiae of the fingerprint from the piecewise linear image of the fingerprint;
creating a fingerprint template of the fingerprint from the extracted minutiae; and
storing the fingerprint template as an enrolled fingerprint template.
US11/643,045 2001-05-25 2006-12-19 Fingerprint recognition system Abandoned US20080095413A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/643,045 US20080095413A1 (en) 2001-05-25 2006-12-19 Fingerprint recognition system

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US29348701P 2001-05-25 2001-05-25
US33894901P 2001-10-22 2001-10-22
US10/156,447 US6876757B2 (en) 2001-05-25 2002-05-28 Fingerprint recognition system
US11/081,213 US20050157913A1 (en) 2001-05-25 2005-03-16 Fingerprint recognition system
US11/643,045 US20080095413A1 (en) 2001-05-25 2006-12-19 Fingerprint recognition system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/081,213 Continuation US20050157913A1 (en) 2001-05-25 2005-03-16 Fingerprint recognition system

Publications (1)

Publication Number Publication Date
US20080095413A1 true US20080095413A1 (en) 2008-04-24

Family

ID=26967980

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/156,447 Expired - Fee Related US6876757B2 (en) 2001-05-25 2002-05-28 Fingerprint recognition system
US11/081,213 Abandoned US20050157913A1 (en) 2001-05-25 2005-03-16 Fingerprint recognition system
US11/643,045 Abandoned US20080095413A1 (en) 2001-05-25 2006-12-19 Fingerprint recognition system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US10/156,447 Expired - Fee Related US6876757B2 (en) 2001-05-25 2002-05-28 Fingerprint recognition system
US11/081,213 Abandoned US20050157913A1 (en) 2001-05-25 2005-03-16 Fingerprint recognition system

Country Status (3)

Country Link
US (3) US6876757B2 (en)
AU (1) AU2002318165A1 (en)
WO (1) WO2002096181A2 (en)

Families Citing this family (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001041032A1 (en) * 1999-11-30 2001-06-07 David Russell Methods, systems, and apparatuses for secure interactions
AU2002318165A1 (en) * 2001-05-25 2002-12-09 Biometric Informatics Technology, Inc. Fingerprint recognition system
DE10126369A1 (en) * 2001-05-30 2002-12-05 Giesecke & Devrient Gmbh Procedure for checking a fingerprint
CN1393823A (en) * 2001-07-02 2003-01-29 国际商业机器公司 Apparatus and method for auxiliary recognizing of human biological character
JP2003075135A (en) * 2001-08-31 2003-03-12 Nec Corp Fingerprint image input device and organism discrimination method by fingerprint image
US7237115B1 (en) * 2001-09-26 2007-06-26 Sandia Corporation Authenticating concealed private data while maintaining concealment
US7272247B2 (en) * 2001-10-10 2007-09-18 Activcard Ireland Limited Method and system for fingerprint authentication
KR100453220B1 (en) * 2001-12-05 2004-10-15 한국전자통신연구원 Apparatus and method for authenticating user by using a fingerprint feature
US7142699B2 (en) * 2001-12-14 2006-11-28 Siemens Corporate Research, Inc. Fingerprint matching using ridge feature maps
EP1543457A4 (en) * 2002-07-12 2009-03-25 Privaris Inc Personal authentication software and systems for travel privilege assignation and verification
AU2003258067A1 (en) 2002-08-06 2004-02-23 Privaris, Inc. Methods for secure enrollment and backup of personal identity credentials into electronic devices
DE10237011A1 (en) * 2002-08-13 2004-02-26 Philips Intellectual Property & Standards Gmbh Fingerprint lines encoding method, involves encoding one line at time by vertices that include starting and end points, such that connecting line segments between adjacent vertices are no more than given distance away from line
CN100352399C (en) * 2002-09-13 2007-12-05 富士通株式会社 Biosensing instrument and method and identifying device having biosensing function
US7120280B2 (en) * 2002-09-27 2006-10-10 Symbol Technologies, Inc. Fingerprint template generation, verification and identification system
DE10254327A1 (en) * 2002-11-21 2004-06-03 Philips Intellectual Property & Standards Gmbh Method for determining the contact area in images of skin prints
US7072496B2 (en) * 2002-12-20 2006-07-04 Motorola, Inc. Slap print segmentation system and method
SE526678C2 (en) * 2003-02-24 2005-10-25 Precise Biometrics Ab Fingerprint representation creation method for checking person's identity using smart card, involves creating unique pairs of minutiae points identified in fingerprint and representing that pairs in predetermined manner
EP1624412B1 (en) * 2003-05-15 2008-12-17 Fujitsu Limited Biological information measuring device
US8171304B2 (en) * 2003-05-15 2012-05-01 Activcard Ireland Limited Method, system and computer program product for multiple biometric template screening
US20040239648A1 (en) 2003-05-30 2004-12-02 Abdallah David S. Man-machine interface for controlling access to electronic devices
US20050036664A1 (en) * 2003-07-03 2005-02-17 Cross Match Technologies, Inc. Polygonal ridge flow classification
US20050152586A1 (en) * 2004-01-13 2005-07-14 Tri-D Systems, Inc. Print analysis
US7356170B2 (en) * 2004-02-12 2008-04-08 Lenovo (Singapore) Pte. Ltd. Fingerprint matching method and system
KR100601453B1 (en) * 2004-03-10 2006-07-14 엘지전자 주식회사 Fingerprint recognition method
US20050249388A1 (en) * 2004-05-07 2005-11-10 Linares Miguel A Three-dimensional fingerprint identification system
US20060104224A1 (en) * 2004-10-13 2006-05-18 Gurminder Singh Wireless access point with fingerprint authentication
US20060104484A1 (en) * 2004-11-16 2006-05-18 Bolle Rudolf M Fingerprint biometric machine representations based on triangles
US7565548B2 (en) * 2004-11-18 2009-07-21 Biogy, Inc. Biometric print quality assurance
US20060120578A1 (en) * 2004-12-02 2006-06-08 Tri-D Systems, Inc. Minutiae matching
KR100752640B1 (en) * 2005-01-05 2007-08-29 삼성전자주식회사 Method and apparatus for segmenting fingerprint region using directional gradient filters
CA2592749C (en) 2005-03-24 2015-02-24 Privaris, Inc. Biometric identification device with smartcard capabilities
JP4911317B2 (en) * 2005-06-30 2012-04-04 日本電気株式会社 Fingerprint image background detection apparatus and detection method
KR100825773B1 (en) * 2005-08-23 2008-04-28 삼성전자주식회사 Method and apparatus for estimating orientation
WO2007043169A1 (en) * 2005-10-12 2007-04-19 Takashi Kuraishi Fingerprinting means and fingerprinting system
ES2347687T3 (en) 2006-04-26 2010-11-03 Aware, Inc. QUALITY AND SEGMENTATION OF THE PREVIEW OF A DACTILAR FOOTPRINT.
US7813531B2 (en) * 2006-05-01 2010-10-12 Unisys Corporation Methods and apparatus for clustering templates in non-metric similarity spaces
KR100792374B1 (en) 2006-08-24 2008-01-08 주식회사 이노와이어리스 Monitoring system and method for trunk gateway
CN100573553C (en) * 2007-01-18 2009-12-23 中国科学院自动化研究所 Method for detecting living body fingerprint based on thin plate spline deformation model
US20080273770A1 (en) * 2007-05-03 2008-11-06 Upek, Inc. Fast Fingerprint Identification And Verification By Minutiae Pair Indexing
JP2009077049A (en) * 2007-09-19 2009-04-09 Canon Inc Image reader
US7953256B2 (en) * 2007-09-21 2011-05-31 International Business Machines Corporation Method and system for detecting fingerprint spoofing
KR20230116073A (en) 2007-09-24 2023-08-03 애플 인크. Embedded authentication systems in an electronic device
US8125458B2 (en) * 2007-09-28 2012-02-28 Microsoft Corporation Detecting finger orientation on a touch-sensitive device
US8052060B2 (en) * 2008-09-25 2011-11-08 Utc Fire & Security Americas Corporation, Inc. Physical access control system with smartcard and methods of operating
TW201023055A (en) * 2008-12-12 2010-06-16 Moredna Technology Co Ltd Highly efficient method for processing fingerprint images
WO2010087886A1 (en) * 2009-01-27 2010-08-05 Gannon Technologies Group Llc Systems and methods for graph-based pattern recognition technology applied to the automated identification of fingerprints
WO2010125653A1 (en) * 2009-04-28 2010-11-04 富士通株式会社 Biometric authentication device, biometric authentication method and biometric authentication program
EA014284B1 (en) * 2009-05-04 2010-10-29 Павел Анатольевич Зайцев Method for improving quality of fingerprint image
EP2472469A1 (en) 2009-08-25 2012-07-04 Nec Corporation Striped pattern image examination support device, striped pattern image examination support method and program
MY168008A (en) * 2010-11-02 2018-10-11 Mimos Berhad Surveillance and monitoring method
US9042607B2 (en) 2011-05-02 2015-05-26 Omnicell, Inc. System and method for user access of dispensing unit
CN103150541A (en) * 2011-12-06 2013-06-12 富泰华工业(深圳)有限公司 Fingerprint identification device and fingerprint identification method
US9384518B2 (en) * 2012-03-26 2016-07-05 Amerasia International Technology, Inc. Biometric registration and verification system and method
AU2013262488A1 (en) 2012-05-18 2014-12-18 Apple Inc. Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
US9965607B2 (en) 2012-06-29 2018-05-08 Apple Inc. Expedited biometric validation
US9396382B2 (en) * 2012-08-17 2016-07-19 Flashscan3D, Llc System and method for a biometric image sensor with spoofing detection
GB2507540A (en) 2012-11-02 2014-05-07 Zwipe As Enrolling fingerprints by combining image strips to obtain sufficient width
GB2507539A (en) 2012-11-02 2014-05-07 Zwipe As Matching sets of minutiae using local neighbourhoods
CN103077377B (en) * 2012-12-31 2015-07-29 清华大学 Based on the fingerprint correction method of field of direction distribution
KR101419784B1 (en) * 2013-06-19 2014-07-21 크루셜텍 (주) Method and apparatus for recognizing and verifying fingerprint
US20150071508A1 (en) * 2013-09-09 2015-03-12 Apple Inc. Background Enrollment and Authentication of a User
US9928355B2 (en) 2013-09-09 2018-03-27 Apple Inc. Background enrollment and authentication of a user
KR102187833B1 (en) * 2014-01-02 2020-12-07 삼성전자 주식회사 Method for executing a function and Electronic device using the same
JP6431044B2 (en) * 2014-03-25 2018-11-28 富士通フロンテック株式会社 Biometric authentication device, biometric authentication method, and program
EP3125194B1 (en) * 2014-03-25 2021-10-27 Fujitsu Frontech Limited Biometric authentication device, biometric authentication method, and program
JP6069582B2 (en) * 2014-03-25 2017-02-01 富士通フロンテック株式会社 Biometric authentication device, biometric authentication method, and program
CN103996056B (en) * 2014-04-08 2017-05-24 浙江工业大学 Tattoo image classification method based on deep learning
US9558392B2 (en) * 2015-02-12 2017-01-31 Korecen Co., Ltd. Finger vein authentication system
US10528789B2 (en) 2015-02-27 2020-01-07 Idex Asa Dynamic match statistics in pattern matching
US9940502B2 (en) 2015-02-27 2018-04-10 Idex Asa Pre-match prediction for pattern testing
US10157306B2 (en) 2015-02-27 2018-12-18 Idex Asa Curve matching and prequalification
US10868672B1 (en) 2015-06-05 2020-12-15 Apple Inc. Establishing and verifying identity using biometrics while protecting user privacy
US11140171B1 (en) 2015-06-05 2021-10-05 Apple Inc. Establishing and verifying identity using action sequences while protecting user privacy
KR102204307B1 (en) 2015-06-11 2021-01-18 삼성전자주식회사 Method for pre-processing image comprising biological inforamtion
EP3109793B1 (en) 2015-06-22 2020-07-22 Nxp B.V. Fingerprint sensing system
US10146981B2 (en) 2015-09-10 2018-12-04 Qualcomm Incorporated Fingerprint enrollment and matching with orientation sensor input
US9935948B2 (en) * 2015-09-18 2018-04-03 Case Wallet, Inc. Biometric data hashing, verification and security
CN105447454B (en) * 2015-11-13 2018-05-01 广东欧珀移动通信有限公司 Fingerprint template improving method, device and terminal device
US10713697B2 (en) 2016-03-24 2020-07-14 Avante International Technology, Inc. Farm product exchange system and method suitable for multiple small producers
US10121054B2 (en) * 2016-11-10 2018-11-06 Synaptics Incorporated Systems and methods for improving spoof detection based on matcher alignment information
EP3321846A1 (en) * 2016-11-15 2018-05-16 Mastercard International Incorporated Systems and methods for secure biometric sample raw data storage
CN108537098A (en) * 2017-03-01 2018-09-14 重庆邮电大学 A kind of fingerprint identification method
TWI629666B (en) * 2017-03-27 2018-07-11 銘傳大學 Block-based error measurement method for image object segmentation
US11704418B2 (en) * 2018-11-27 2023-07-18 Shanghai Harvest Intelligence Technology Co., Ltd. Fingerprint encryption method and device, fingerprint decryption method and device, storage medium and terminal
US11651060B2 (en) * 2020-11-18 2023-05-16 International Business Machines Corporation Multi-factor fingerprint authenticator
CN115995098B (en) * 2023-03-23 2023-05-30 北京点聚信息技术有限公司 Flow data verification method for online signing of electronic contract

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4185270A (en) * 1976-07-19 1980-01-22 Fingermatrix, Inc. Fingerprint identification method and apparatus
US4135147A (en) * 1976-09-10 1979-01-16 Rockwell International Corporation Minutiae pattern matcher
CA1199732A (en) * 1982-06-28 1986-01-21 Koh Asai Method and device for matching fingerprints with precise minutia pairs selected from coarse pairs
EP0159037B1 (en) * 1984-04-18 1993-02-10 Nec Corporation Identification system employing verification of fingerprints
US4790564A (en) * 1987-02-20 1988-12-13 Morpho Systemes Automatic fingerprint identification system including processes and apparatus for matching fingerprints
DE68905237T2 (en) * 1988-05-24 1993-07-29 Nec Corp METHOD AND DEVICE FOR COMPARING FINGERPRINTS.
WO2000070544A1 (en) * 1999-05-14 2000-11-23 Biolink Technologies International, Inc. Biometric system for biometric input, comparison, authentication and access control and method therefor
US20020031245A1 (en) * 1999-05-14 2002-03-14 Roman Rozenberg Biometric authentification method
US6763127B1 (en) * 2000-10-06 2004-07-13 Ic Media Corporation Apparatus and method for fingerprint recognition system
AU2002318165A1 (en) * 2001-05-25 2002-12-09 Biometric Informatics Technology, Inc. Fingerprint recognition system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4208651A (en) * 1978-05-30 1980-06-17 Sperry Corporation Fingerprint identification by ridge angle and minutiae recognition
US5040223A (en) * 1988-02-17 1991-08-13 Nippondenso Co., Ltd. Fingerprint verification method employing plural correlation judgement levels and sequential judgement stages
US5109428A (en) * 1988-12-06 1992-04-28 Fujitsu Ltd Minutia data extraction in fingerprint identification
US5631971A (en) * 1994-05-24 1997-05-20 Sparrow; Malcolm K. Vector based topological fingerprint matching
US5659626A (en) * 1994-10-20 1997-08-19 Calspan Corporation Fingerprint identification system
US5631972A (en) * 1995-05-04 1997-05-20 Ferris; Stephen Hyperladder fingerprint matcher
US6314196B1 (en) * 1995-10-05 2001-11-06 Fujitsu Denso Ltd. Fingerprint registering method and fingerprint checking device
US5917928A (en) * 1997-07-14 1999-06-29 Bes Systems, Inc. System and method for automatically verifying identity of a subject
US5982914A (en) * 1997-07-29 1999-11-09 Smarttouch, Inc. Identification of individuals from association of finger pores and macrofeatures
US6049621A (en) * 1997-08-22 2000-04-11 International Business Machines Corporation Determining a point correspondence between two points in two respective (fingerprint) images
US6263091B1 (en) * 1997-08-22 2001-07-17 International Business Machines Corporation System and method for identifying foreground and background portions of digitized images

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080144894A1 (en) * 2006-08-11 2008-06-19 Vladimir Nickolaevich Bichigov Method for filtering a fingerprint image continuation-in-part
US20120087553A1 (en) * 2006-10-10 2012-04-12 West Virginia University Research Corporation, WVU Office of Technology Transfer & WYU Business Multi-resolutional texture analysis fingerprint liveness systems and methods
US8498458B2 (en) * 2006-10-10 2013-07-30 West Virginia University Fingerprint liveness analysis
US9367729B2 (en) 2006-10-10 2016-06-14 West Virginia University Multi-resolutional texture analysis fingerprint liveness systems and methods
US8526687B2 (en) * 2006-11-08 2013-09-03 Vladimir Nickolaevich Bichigov Method for filtering a fingerprint image continuation-in-part
US20080209226A1 (en) * 2007-02-28 2008-08-28 Microsoft Corporation User Authentication Via Biometric Hashing
US20080209227A1 (en) * 2007-02-28 2008-08-28 Microsoft Corporation User Authentication Via Biometric Hashing
US8712114B2 (en) * 2008-12-19 2014-04-29 Texas Instruments Incorporated Elegant solutions for fingerprint image enhancement
US20100158329A1 (en) * 2008-12-19 2010-06-24 Shajil Asokan Thaniyath Elegant Solutions for Fingerprint Image Enhancement
US20130121607A1 (en) * 2008-12-19 2013-05-16 Texas Instruments Incorporated Elegant Solutions for Fingerprint Image Enhancement
US20120101822A1 (en) * 2010-10-25 2012-04-26 Lockheed Martin Corporation Biometric speaker identification
US8719018B2 (en) * 2010-10-25 2014-05-06 Lockheed Martin Corporation Biometric speaker identification
CN103778621A (en) * 2012-04-20 2014-05-07 帝麦克斯(苏州)医疗科技有限公司 Identification of foreign object debris
US20130279750A1 (en) * 2012-04-20 2013-10-24 Dmetrix, Inc. Identification of foreign object debris
EP3940583A4 (en) * 2019-03-15 2022-09-07 Arcsoft Corporation Limited Methods for fingerprint image enhancement, fingerprint recognition and application startup
US11874907B2 (en) 2019-03-15 2024-01-16 Arcsoft Corporation Limited Method for enhancing fingerprint image, identifying fingerprint and starting-up application program
WO2022125058A1 (en) * 2020-12-07 2022-06-16 Google Llc Fingerprint-based authentication using touch inputs
US20220398304A1 (en) * 2020-12-07 2022-12-15 Google Llc Fingerprint-Based Authentication Using Touch Inputs
TWI795930B (en) * 2020-12-07 2023-03-11 美商谷歌有限責任公司 Fingerprint-based authentication using touch inputs
WO2024030105A1 (en) * 2022-08-02 2024-02-08 Havelsan Hava Elektronik San. Ve Tic. A.S. Multi-stage fusion matcher for dirty fingerprint and dirty palm

Also Published As

Publication number Publication date
AU2002318165A1 (en) 2002-12-09
US20030039382A1 (en) 2003-02-27
WO2002096181A3 (en) 2003-04-03
WO2002096181A2 (en) 2002-12-05
US20050157913A1 (en) 2005-07-21
US6876757B2 (en) 2005-04-05

Similar Documents

Publication Publication Date Title
US6876757B2 (en) Fingerprint recognition system
Jain et al. Fingerprint classification and matching
Prabhakar et al. Learning fingerprint minutiae location and type
Raja Fingerprint recognition using minutia score matching
Afsar et al. Fingerprint identification and verification system using minutiae matching
US7072523B2 (en) System and method for fingerprint image enhancement using partitioned least-squared filters
EP1066589A2 (en) Fingerprint identification/verification system
Hemalatha A systematic review on Fingerprint based Biometric Authentication System
WO2001048681A9 (en) Automated fingerprint identification system
Sharma et al. Two-stage quality adaptive fingerprint image enhancement using Fuzzy C-means clustering based fingerprint quality analysis
Chaudhari et al. Implementation of minutiae based fingerprint identification system using crossing number concept
Patil et al. A novel approach for fingerprint matching using minutiae
Pakutharivu et al. A comprehensive survey on fingerprint recognition systems
Jain et al. Matching and classification: a case study in fingerprint domain
Kanjan et al. A comparative study of fingerprint matching algorithms
Sharma Fingerprint biometric system: a survey
Gil et al. Access control system with high level security using fingerprints
Barham et al. Fingerprint recognition using MATLAB
Kulshrestha et al. Finger print recognition: survey of minutiae and gabor filtering approach
Shobha et al. Development of palmprint verification system using biometrics.
Jaiswal et al. Biometric Recognition System (Algorithm)
Munir et al. Fingerprint matching using ridge patterns
Kovac et al. Multimodal biometric system based on fingerprint and finger vein pattern
Kommini et al. Scale and rotation independent fingerprint recognition
Popović et al. Fingerprint minutiae filtering based on multiscale directional information

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION