WO2004029862A1 - Method and apparatus for palmprint identification - Google Patents


Publication number
WO2004029862A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2003/000816
Other languages
French (fr)
Inventor
Dapeng David Zhang
Xuan Niu
Guang Ming Lu
Wai-Kin Adams Kong
Keung Ming Wong
Original Assignee
The Hong Kong Polytechnic University
Priority claimed from US10/253,912 external-priority patent/US20040057606A1/en
Priority claimed from US10/253,914 external-priority patent/US7466846B2/en
Application filed by The Hong Kong Polytechnic University filed Critical The Hong Kong Polytechnic University
Priority to JP2004538660A priority Critical patent/JP4246154B2/en
Priority to AU2003269671A priority patent/AU2003269671A1/en
Publication of WO2004029862A1 publication Critical patent/WO2004029862A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction

Definitions

  • An annular light source 6 is mounted around the lens 5 to illuminate an image in window 8.
  • Mounting arms 7 support the annular light source 6 and screws 9 are used to mount the CCD firmly on the mounting arms 7.
  • A palmprint image is formed along the optical path from lens 5 to CCD 4, and the digitized image data are transferred to an external processor such as a personal computer (not shown) for processing and manipulation.
  • Figure 21 shows a plan view of the lens 5 and light source 6 through section A-A' of Figure 19.
  • the lens 5 is at the centre of the annular light source 6.
  • the lens 5 is mounted on the top of the CCD 4.
  • Adjacent window 8 in surface 2 are a plurality of protuberances in the form of pegs 3 which are used to correctly position a hand on surface 2 with the palm area over the window 8.
  • a person places their hand on the surface 2 to locate pegs 3 between the fingers and thumb of the hand. This ensures that the hand is placed correctly on the apparatus for the capture of the optimal area of the palm through window 8.
  • Figure 22 shows an image of the target palm area captured through window 8. Using an opaque surface 2 with a target window 8 ensures that only the area of interest on the palm is captured.
  • This image is acquired from CCD 4 by the personal computer for further processing.
  • a palmprint obtained by the apparatus is suitable for use in biometrics identification. The features and characteristics of the palmprint can be obtained and then compared to the database record to identify an individual.
  • a number of techniques can be used to determine the characteristics of the palm in the image.
  • One suitable technique is texture analysis. Texture analysis is suitable because it gives a high level of accuracy on low-resolution images.
  • the protuberances adjacent the window 8 are pegs 3.
  • the surface 2 with window 8 is made with a depression or concavity into which the hand can be placed palm down.
  • the apparatus can be used to capture an image for use in the method described.

Abstract

A method of palmprint identification includes analyzing an area from an image of a palm to obtain texture data for the skin surface within the area. The texture data is compared to reference information in a database to determine the identity of an individual. An apparatus for capturing an image of a palm includes an enclosure with a window in it, and an image sensor and light source disposed within the enclosure and arranged to capture an image through the window. Protuberances are provided on the surface. The protuberances are arranged to be in known juxtaposition to a hand suitably placed on the window for capture of an image that includes the palm area of the hand.

Description

Method and Apparatus for Palmprint Identification
Background to the Invention
1. Field of the Invention
The invention relates to biometrics identification, and in particular to a method for analyzing a palmprint for the identification of an individual. The invention also relates to apparatus for capturing a palmprint image for the identification of an individual.
2. Background Information
Palmprint recognition is a new biometrics technology for personal identification that can replace fingerprints. Known methods analyze a palmprint to identify singular points, minutiae, and wrinkles in a palmprint image. These known methods require a high-resolution image, as illustrated in Figure 1, which can be obtained by way of inked palmprints. However, inked palmprints are messy and cannot be obtained passively for real-time identification.
To overcome the problem of inked palmprints, some companies have developed high-resolution palmprint scanners and identification systems. However, these devices capturing high-resolution images are costly and rely on high-performance computers to fulfil the requirements of real-time identification.
One solution to the above problems is the use of low-resolution images. Figure 2 illustrates low-resolution images corresponding to Figure 1. In low-resolution palmprint images, however, singular points and minutiae cannot be observed easily, so the more easily identifiable wrinkles must play an important role in the identification. It is noted from Figure 2, however, that only a small proportion of wrinkles are significantly clear, and it is questionable whether they provide sufficient distinctiveness to reliably identify individuals amongst a large population.
Summary of the Invention
It is an object of the present invention to provide a method of biometrics identification, and in particular a method for analyzing a palmprint for the identification of an individual, which overcomes or ameliorates problems with prior art methods. It is a further object of the present invention to provide an apparatus for capturing a palmprint image, which overcomes or ameliorates disadvantages with prior art apparatus or at least which provides the public with a useful alternative. According to a first aspect of the invention there is provided a method of biometrics identification including: obtaining an image of an area of skin surface from an individual, analyzing the image to extract texture features on the area of skin surface, and comparing the texture features with reference information in a database.
According to a second aspect of the invention there is provided a method of biometrics identification including: obtaining an image of a portion of an inner surface of a hand of an individual, obtaining a sub-image of skin surface within a defined area of the inner surface of the hand, analyzing the sub-image to obtain texture data for the skin surface, and comparing the texture data with reference information in a database.
Preferably, the defined area is dependent on one or more characteristics of the hand.
Preferably, the one or more characteristics are the areas between fingers of the hand. Preferably, the sub-image is obtained by steps including: identifying at least two points representing the areas between fingers of the hand, determining a coordinate system having a first and a second axis, wherein the two points are located on the first axis and are equidistant from the second axis, and determining parameters of the sub-image within the coordinate system using the distance between the two points.
Preferably, the parameters of the sub-image include points in the coordinate system represented by:
(0.25D, 0.5D), (1.25D, 0.5D), (0.25D, -0.5D) and (1.25D, -0.5D) where D is the distance between the two points.
Preferably, there is a further step of normalizing the sub-image.
Preferably, analyzing the sub-image includes using a Gabor Filter.
Preferably, analyzing the sub-image includes segmenting layers of the sub-image with low resolution using Gabor analysis. Preferably, the sub-image is segmented into two parts, a real part and an imaginary part, each part being stored as a vector.
Preferably, comparing the texture data with reference information in the database is based on a hamming distance of the form:
D_0 = ( Σ_{i=1..N} Σ_{j=1..N} ( P_R(i,j) ⊗ Q_R(i,j) + P_I(i,j) ⊗ Q_I(i,j) ) ) / (2N²)
where P_R (Q_R) and P_I (Q_I) are the real part and the imaginary part of P (Q).
According to a third aspect of the invention there is provided a palmprint image capture apparatus including: an enclosure with a window in it, an image sensor disposed within the enclosure and arranged to capture an image through the window, a light source disposed to illuminate the window, and at least one protuberance adjacent the window, wherein the protuberance (s) is/are arranged to be in known juxtaposition to a hand suitably placed on the window for capture of an image that includes the palm area of the hand.
According to a fourth aspect of the invention there is provided a palmprint image capture apparatus including: an enclosure having a window in it, an image sensor disposed within the enclosure and arranged to capture an image through the window, a light source disposed to illuminate the window, a controller operable to control the image sensor and light source for capturing an image, and at least one protuberance, wherein the protuberance(s) is/are arranged to be in known juxtaposition to a hand suitably placed on the window for capture of an image that includes the palm area of the hand.
Preferably, the protuberances are pegs or pins disposed to be between the two or more fingers of the hand suitably placed on the window.
Preferably, the light source is an annulus with the image sensor at its center.
Preferably, the image sensor is a Charge-Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensor.
Further aspects of the invention will become apparent from the following description, which is given by way of example only. Brief Description of the Drawings
Embodiments of the invention will now be described with reference to the accompanying drawings in which:
Figure 1 illustrates typical high-resolution palmprint images,
Figure 2 illustrates typical low-resolution palmprint images,
Figures 3 to 8 illustrate preprocessing of an image of the inside of a hand,
Figures 9 and 10 illustrate incorrect placement of a hand on a palm reader and the corresponding preprocessed image,
Figures 11 to 14 illustrate the preprocessed image, real and imaginary parts and the masks,
Figures 15 and 16 illustrate the difference in image quality between first and second collected images,
Figures 17 and 18 show verification test results for a method according to the invention,
Figure 19 illustrates a schematic of a palmprint image capture device according to the invention,
Figure 20 is a plan view of the image capture surface of the device,
Figure 21 is a sectional view through A-A' of Figure 19, in which the CCD camera is encircled by the ring light source, and
Figure 22 illustrates a raw palm image captured by the apparatus .
Description of the Preferred Embodiments
A method of palmprint identification according to the invention comprises three parts: 1) obtaining an image of the palmprint of an individual, 2) analyzing the skin texture data from the image and 3) matching the skin texture data with information stored in a database. These steps are described in more detail below.
1) Obtaining an image of the palmprint of an individual
Referring to Figure 3, a low-resolution image of a portion of the inside surface of a hand is obtained in known manner using a CCD camera. In order to extract identification data from the image, a repeatable sub-image of the palm area must be identified using characteristics of the hand. In the preferred embodiment the holes between fingers are identified and used as the parameters to build a coordinate system in which parameters that define the sub-image can be found. The preferred embodiment has six main steps, which are given below.
Referring to Figure 4, the first step is to apply a lowpass filter L(u, v), such as a Gaussian, to the original image O(x, y). Then a threshold T_p is used to convert the convolved image to a binary image B(x, y).
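The smoothing-and-thresholding step above can be sketched as follows. This is a minimal illustration, assuming a grayscale image held in a NumPy array and using SciPy's Gaussian filter; the sigma and threshold values are illustrative, as the patent does not specify them.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def binarize_hand_image(original, sigma=2.0, rel_threshold=0.5):
    """Apply a Gaussian lowpass filter L(u, v) to the original image
    O(x, y), then threshold the result to a binary image B(x, y).
    rel_threshold is a fraction of the smoothed image's maximum."""
    smoothed = gaussian_filter(original.astype(float), sigma=sigma)
    return (smoothed > rel_threshold * smoothed.max()).astype(np.uint8)
```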
Referring to Figure 5, the second step is to obtain the boundaries of the holes between the fingers, (F_i x_j, F_i y_j) where i = 1, 2, using a boundary-tracking algorithm. The boundary of the hole between the ring and middle fingers is not extracted since it is not useful for the following processing.
Referring to Figure 6, the third step is to compute the common tangent of the two holes (F_i x_j, F_i y_j). If (x_1, y_1) and (x_2, y_2) are two points on (F_1 x_j, F_1 y_j) and (F_2 x_j, F_2 y_j) respectively, the line y = mx + c passing through these two points satisfies the inequality F_i y_j ≤ m F_i x_j + c for all i and j. The line y = mx + c is the tangent of the two holes. This line, represented by numeral 2 in Figure 6, is the Y-axis of the coordinate system for determining the location of the sub-image 1.
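The common-tangent search can be illustrated with a brute-force sketch: try the line through one point of each boundary and keep the first line that leaves every boundary point on or below it. This is a hypothetical helper for clarity, not the patent's own algorithm, and it skips vertical candidate lines.

```python
import numpy as np

def common_tangent(boundary1, boundary2):
    """Find m, c of a line y = m*x + c through one point of each hole
    boundary such that every boundary point satisfies y <= m*x + c.
    boundary1/2 are (k, 2) arrays of (x, y) points."""
    pts = np.vstack([boundary1, boundary2])
    for x1, y1 in boundary1:
        for x2, y2 in boundary2:
            if x1 == x2:
                continue  # ignore vertical lines in this simple sketch
            m = (y2 - y1) / (x2 - x1)
            c = y1 - m * x1
            # tangency test: all boundary points on or below the line
            if np.all(pts[:, 1] <= m * pts[:, 0] + c + 1e-9):
                return m, c
    return None
```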
The fourth step is to find a line 3, passing through the midpoint of the two points and perpendicular to line 2, to determine the X-axis and origin of the coordinate system. The two points lie on the Y-axis, equidistant from the X-axis.
The fifth step is to extract a sub-image 1 with a dynamic size on the basis of the coordinate system. The size and location of the sub-image 1 are based on the Euclidean distance (D) between the two points (x_1, y_1) and (x_2, y_2). The points 4, 5, 6, 7 representing the corners of the sub-image 1 in the coordinate system are (0.25D, 0.5D), (1.25D, 0.5D), (0.25D, -0.5D) and (1.25D, -0.5D) respectively. Thus the sub-image 1 is square, with each side equal in length to the Euclidean distance, and symmetrical about the X-axis line 3. Because the sub-image is based on features of the hand (the areas between the fingers) it is repeatable for each individual hand.
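In image coordinates, the corner computation amounts to building unit axes from the two gap points and offsetting from their midpoint. A sketch, assuming the two points are given as (x, y) pixel coordinates; the sign of the X-axis direction (toward the palm) may need flipping depending on hand orientation:

```python
import numpy as np

def subimage_corners(p1, p2):
    """Return the four corners of the square sub-image, given the two
    gap points p1 and p2 that lie on the Y-axis of the coordinate
    system. Uses the corner parameters (0.25D, 0.5D), (1.25D, 0.5D),
    (0.25D, -0.5D), (1.25D, -0.5D) from the text."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)                 # Euclidean distance D
    origin = (p1 + p2) / 2.0                    # midpoint: coordinate origin
    y_axis = (p2 - p1) / d                      # unit vector along line 2
    x_axis = np.array([y_axis[1], -y_axis[0]])  # perpendicular line 3
    params = [(0.25 * d, 0.5 * d), (1.25 * d, 0.5 * d),
              (0.25 * d, -0.5 * d), (1.25 * d, -0.5 * d)]
    return [origin + cx * x_axis + cy * y_axis for cx, cy in params]
```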
Figure 7 shows the x and y axes 2, 3 of the coordinate system and the sub-image 1 overlaid on the raw image of Figure 3.
The sixth step is to extract and normalize the sub-image 1 to a standard size using bilinear interpolation for feature extraction. Figure 8 shows the extracted and normalized sub-image 1.
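The bilinear normalization of the sixth step can be sketched in a few lines of NumPy; the 128x128 standard size is an assumed value, as this passage does not fix one.

```python
import numpy as np

def normalize_subimage(img, size=128):
    """Resample the extracted sub-image onto a size x size grid using
    bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, size)
    xs = np.linspace(0, w - 1, size)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None]; fx = (xs - x0)[None, :]
    # four neighbour grids, then blend horizontally and vertically
    tl = img[np.ix_(y0, x0)]; tr = img[np.ix_(y0, x1)]
    bl = img[np.ix_(y1, x0)]; br = img[np.ix_(y1, x1)]
    top = tl * (1 - fx) + tr * fx
    bot = bl * (1 - fx) + br * fx
    return top * (1 - fy) + bot * fy
```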
Once the palm sub-image 1 is obtained the next part of the method is undertaken.
2) Analyzing the skin texture of the image
The circular Gabor filter is an effective tool for texture analysis, and has the following general form,

G(x, y, θ, u, σ) = (1 / (2πσ²)) exp{-(x² + y²) / (2σ²)} exp{2πi(ux cos θ + uy sin θ)}    (1)

where i = √(-1); u is the frequency of the sinusoidal wave; θ controls the orientation of the function and σ is the standard deviation of the Gaussian envelope. Gabor filters are widely used in texture analysis and thus the skilled addressee will be familiar with their use for such purpose. In order to make the texture analysis more robust to variations in image brightness, a discrete Gabor filter, G[x, y, θ, u, σ], is turned to zero DC with the application of the following formula:

G'[x, y, θ, u, σ] = G[x, y, θ, u, σ] - ( Σ_{i=-n..n} Σ_{j=-n..n} G[i, j, θ, u, σ] ) / (2n + 1)²    (2)

where (2n + 1)² is the size of the filter. In fact, the imaginary part of the Gabor filter automatically has zero DC because of odd symmetry. The adjusted Gabor filter G' is used to filter the preprocessed images. Then, the phase information is coded by the following inequalities,

b_r = 1 if Re( Σ_{y=-n..n} Σ_{x=-n..n} G'[x, y, θ, u, σ] I(x + x_0, y + y_0) ) > 0    (3)
b_r = 0 if Re( Σ_{y=-n..n} Σ_{x=-n..n} G'[x, y, θ, u, σ] I(x + x_0, y + y_0) ) < 0    (4)
b_i = 1 if Im( Σ_{y=-n..n} Σ_{x=-n..n} G'[x, y, θ, u, σ] I(x + x_0, y + y_0) ) > 0    (5)
b_i = 0 if Im( Σ_{y=-n..n} Σ_{x=-n..n} G'[x, y, θ, u, σ] I(x + x_0, y + y_0) ) < 0    (6)

where I(x, y) is a preprocessed image and (x_0, y_0) is the center of filtering.
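A minimal sketch of the adjusted filter and the phase coding of equations (2) to (6); the filter size and the u, sigma and theta values here are illustrative, not values specified in the patent.

```python
import numpy as np

def circular_gabor(n=8, u=0.1, sigma=5.0, theta=np.pi / 4):
    """Discrete circular Gabor filter on a (2n+1) x (2n+1) grid,
    adjusted to zero DC as in equation (2)."""
    y, x = np.mgrid[-n:n + 1, -n:n + 1]
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    carrier = np.exp(2j * np.pi * u * (x * np.cos(theta) + y * np.sin(theta)))
    g = envelope * carrier
    return g - g.sum() / g.size  # subtract the DC term of equation (2)

def phase_bits(gabor, patch):
    """Code the real/imaginary phase bits (b_r, b_i) of inequalities
    (3)-(6) for one filtering position; patch is the image window
    centered at (x_0, y_0)."""
    response = np.sum(gabor * patch)
    return int(response.real > 0), int(response.imag > 0)
```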
Referring to Figures 9 and 10, since it is expected that some users will not place their hand correctly, some non-palmprint pixels will be contained in the palm sub-image. A mask is generated to identify the location of the non-palmprint pixels. Because the image source can be considered a semi-closed environment, the non-palmprint pixels come from the black boundaries of the image background, so a threshold can be used to segment them. Typically, the feature size, including the mask and palmprint features, is 384 bytes.
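Since the stray pixels come from the dark background, the mask generation reduces to an intensity threshold. A sketch, with an illustrative threshold value:

```python
import numpy as np

def palmprint_mask(subimage, dark_threshold=10):
    """Return a mask that is 1 where the pixel belongs to the palmprint
    and 0 for non-palmprint (dark background) pixels."""
    return (np.asarray(subimage) > dark_threshold).astype(np.uint8)
```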
Figure 11 depicts the preprocessed images, Figure 12 the real part of the corresponding texture features, Figure 13 the imaginary part of the corresponding texture features, and Figure 14 the corresponding masks.
A useful discussion on the use of Gabor filters for texture analysis can be found in the following two publications: A. Jain and G. Healey, "A multiscale representation including opponent color features for texture recognition," IEEE Transactions on Image Processing, vol. 7, no. 1, pp. 124-128, 1998; and D. Dunn and W. E. Higgins, "Optimal Gabor filters for texture segmentation," IEEE Transactions on Image Processing, vol. 4, no. 4, pp. 947-964, 1995.
3) Palmprint Matching
The real and imaginary features are represented as vectors, which are compared to vectors of stored palmprint data. Palmprint matching is based on a normalized hamming distance. For example, let P and Q be two palmprint feature matrixes. The normalized hamming distance can be described as,
D_0 = ( Σ_{i=1..N} Σ_{j=1..N} P_M(i,j) ∩ Q_M(i,j) ∩ ( P_R(i,j) ⊗ Q_R(i,j) + P_I(i,j) ⊗ Q_I(i,j) ) ) / ( 2 Σ_{i=1..N} Σ_{j=1..N} P_M(i,j) ∩ Q_M(i,j) )    (7)

where P_R (Q_R), P_I (Q_I) and P_M (Q_M) are the real part, the imaginary part and the mask of P (Q), respectively; the result of the Boolean operator ⊗ is equal to zero if and only if the two bits P_{R(I)}(i,j) and Q_{R(I)}(i,j) are equal; ∩ represents an AND operator; and the size of the feature matrixes is N×N. It is noted that D_0 is between 0 and 1. For perfect matching, the matching score is zero. Because of imperfect preprocessing, the features need to be vertically and horizontally translated and then matched again; the range of vertical and horizontal translation is -2 to 2. The minimum of the D_0 values obtained by translated matching is taken as the final matching score.
The following experimental results illustrate the effectiveness of a system according to the invention.
Palmprint images were collected from 154 subjects using a palmprint scanner. Approximately 65% of the subjects were male. The age distribution of the subjects is shown in the following Table 1.
[Table 1, giving the age distribution of the subjects, is not reproduced here.]
Each subject provided two groups of images. Each group contained 10 images for the left palm and 10 images for the right palm. In total, each subject provided 40 images, creating an image database containing 6191 images from 308 different palms. The average time difference between the collection of the first and second groups of images from each subject was 57 days. The maximum and minimum time differences were 90 and 4 days respectively. After finishing the first collection, the light source was changed and the focus of the CCD camera adjusted so as to simulate image collection by two different palmprint scanners. Figures 15 and 16 show corresponding hand images captured in the first and second groups for one subject. The collected images were in two sizes, 384x284 and 768x568. The larger images were resized to 384x284; consequently, the size of all the test images in the following experiments is 384x284, with 75 dpi resolution.
To obtain the verification accuracy of the palmprint system, each palmprint image was matched with all other palmprint images in the database. A match was counted as correct when the two palmprint images came from the same palm of the same subject. The total number of comparisons was 19,161,145, of which 59,176 were correct matches.
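The comparison count is consistent with matching each of the 6191 images against every other image exactly once, i.e. 6191 x 6190 / 2 pairs:

```python
from math import comb

# each of the 6191 images is matched once against every other image
total_comparisons = comb(6191, 2)   # n * (n - 1) / 2
print(total_comparisons)            # 19161145
```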
The probability distributions for genuine and impostor matching, estimated from the correct and incorrect matchings respectively, are shown in Figure 17. Figure 18 depicts the corresponding Receiver Operating Characteristic (ROC) curve, a plot of genuine acceptance rate against false acceptance rate for all possible operating points. From Figure 18 it is estimated that a method according to the invention can operate at a 96% genuine acceptance rate and a 0.1% false acceptance rate; the corresponding threshold is 0.35. This result is comparable with prior-art palmprint approaches and other hand-based biometric technologies, including hand geometry and fingerprint verification.
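At a given distance threshold, the two rates plotted on the ROC curve can be read directly off the genuine and impostor score distributions. A minimal sketch follows; the score arrays are toy values for illustration, not the experimental data.

```python
import numpy as np

def operating_point(genuine_scores, impostor_scores, threshold):
    """GAR and FAR when accepting matches with distance <= threshold.

    Scores are normalized Hamming distances in [0, 1]; lower means
    more similar, so acceptance is score <= threshold.
    """
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    gar = np.mean(genuine <= threshold)    # genuine acceptance rate
    far = np.mean(impostor <= threshold)   # false acceptance rate
    return gar, far

# toy example: well-separated genuine and impostor score clusters
gar, far = operating_point([0.1, 0.2, 0.3, 0.4], [0.45, 0.5, 0.6], 0.35)
print(gar, far)   # 0.75 0.0
```

Sweeping the threshold over [0, 1] and plotting GAR against FAR traces out the full ROC curve.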
A method according to the invention utilizes low-resolution images and has low computational cost. The verification accuracy is found to be comparable with known high-performance methods using high-resolution images.
The method can be used for access control, ATMs and various security systems.
Figures 19 and 20 illustrate a palmprint image capture apparatus according to the invention. The apparatus includes a housing 1 with a flat top surface 2 on which a hand is placed, palm down, for the capture of the palmprint image. The surface 2 is opaque with a window 8 through which the image is captured. In the preferred embodiment the window 8 contains a glass panel. In alternative embodiments the window 8 may contain other transparent coverings, a lens, or nothing (i.e. an open window).
An image sensor such as a charge-coupled device (CCD) 4 is mounted within housing 1. A lens 5 is screwed onto the CCD 4, with its aperture orientated towards window 8 in surface 2.
An annular light source 6 is mounted around the lens 5 to illuminate an image in window 8. Mounting arms 7 support the annular light source 6, and screws 9 mount the CCD firmly on the mounting arms 7. A palmprint image is formed along the optical path from lens 5 to CCD 4, and the digitized image data are then transferred to an external processor such as a personal computer (not shown) for processing and manipulation.
Referring to Figure 21, a plan view of the lens 5 and light source 6 through section A-A' of Figure 19 is shown. The lens 5 is at the centre of the annular light source 6 and is mounted on top of the CCD 4.
Adjacent window 8 in surface 2 are a plurality of protuberances in the form of pegs 3 which are used to correctly position a hand on surface 2 with the palm area over the window 8. In use, a person places their hand on the surface 2 to locate pegs 3 between the fingers and thumb of the hand. This ensures that the hand is placed correctly on the apparatus for the capture of the optimal area of the palm through window 8.
Figure 22 shows an image of the target palm area captured through window 8. Using an opaque surface 2 with a target window 8 ensures that the area of interest on the palm is obtained. This image is acquired from CCD 4 by the personal computer for further processing. A palmprint obtained by the apparatus is suitable for use in biometrics identification: the features and characteristics of the palmprint can be extracted and then compared with database records to identify an individual. A number of techniques can be used to determine the characteristics of the palm in the image. One suitable technique is texture analysis, which gives a high level of accuracy on low-resolution images.
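A minimal sketch of such texture analysis binarizes the real and imaginary responses of a complex Gabor filter into bit planes, as in the feature representation described earlier. The kernel parameters and the naive zero-padded correlation here are illustrative choices, not the patent's tuned values.

```python
import numpy as np

def gabor_kernel(size=9, sigma=3.0, theta=np.pi / 4, freq=0.1):
    """Complex 2-D Gabor kernel; parameter values are illustrative."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)        # coordinate along orientation theta
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.exp(2j * np.pi * freq * xr)

def texture_bits(image, kernel):
    """Correlate image with kernel (zero padding) and binarize the
    real and imaginary responses into two bit planes."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image.astype(complex), ((ph, ph), (pw, pw)))
    resp = np.empty(image.shape, dtype=complex)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            resp[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return (resp.real > 0).astype(np.uint8), (resp.imag > 0).astype(np.uint8)
```

The two bit planes, together with a validity mask, are exactly the kind of real/imaginary feature matrices compared by the normalized Hamming distance of equation (7).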
The described embodiment uses a CCD image sensor. In an alternative embodiment a Complementary Metal Oxide Semiconductor (CMOS) sensor is used. The CMOS sensor yields lower resolution at a lower cost. However, this is ameliorated if texture analysis is used.
In the preferred embodiment the protuberances adjacent the window 8 are pegs 3. In an alternative embodiment the surface 2 with window 8 is made with a depression or concavity into which the hand can be placed palm down.
The apparatus can be used to capture an image for use in the method described.
Where in the foregoing description reference has been made to integers or elements having known equivalents then such are included as if individually set forth herein. Embodiments of the invention have been described, however it is understood that variations, improvements or modifications can take place without departure from the spirit of the invention or scope of the appended claims.

Claims

What Is Claimed Is:
1. A method of biometrics identification including: obtaining an image of an area of skin surface from an individual, analyzing the image to extract texture features on the area of skin surface, and comparing the texture features with reference information in a database.
2. A method of biometrics identification including: obtaining an image of a portion of an inner surface of a hand of an individual, obtaining a sub-image of skin surface within a defined area of the inner surface of the hand, analyzing the sub-image to obtain texture data for the skin surface, and comparing the texture data with reference information in a database.
3. The method of claim 2 wherein the defined area is dependent on one or more characteristics of the hand.
4. The method of claims 2 or 3 wherein the one or more characteristics are the areas between fingers of the hand.
5. The method of any preceding claim wherein the sub-image is obtained by steps including: identifying at least two points representing the areas between fingers of the hand, determining a coordinate system having a first and a second axis, wherein the two points are located on the first axis and are equidistant from the second axis, and determining parameters of the sub-image within the coordinate system using the distance between the two points.
6. The method of claim 5 wherein the parameters of the sub-image include points in the coordinate system represented by:
(0.25D, 0.5D), (1.25D, 0.5D), (0.25D, -0.5D) and (1.25D, -0.5D) where D is the distance between the two points.
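To make the geometry of claims 5 and 6 concrete, the following sketch maps the two between-finger points to the four sub-image corners in image coordinates. The choice of midpoint origin and axis directions is one reading of claim 5, not language from the patent.

```python
import math

def subimage_corners(p1, p2):
    """Corners of the palm sub-image, in image coordinates.

    p1, p2: the two between-finger points (x, y). The local frame's
    first axis runs from p1 to p2; the perpendicular axis points into
    the palm. Corner offsets follow claim 6:
    (0.25D, 0.5D), (1.25D, 0.5D), (0.25D, -0.5D), (1.25D, -0.5D).
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    D = math.hypot(dx, dy)                 # distance between the two points
    ux, uy = dx / D, dy / D                # unit vector along p1 -> p2
    nx, ny = uy, -ux                       # perpendicular unit vector
    mx, my = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2   # origin at the midpoint
    corners = []
    for lx, ly in [(0.25 * D, 0.5 * D), (1.25 * D, 0.5 * D),
                   (0.25 * D, -0.5 * D), (1.25 * D, -0.5 * D)]:
        corners.append((mx + lx * nx + ly * ux, my + lx * ny + ly * uy))
    return corners
```

Because all offsets scale with D, the extracted sub-image grows and shrinks with the hand, which is what makes the region placement consistent across subjects.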
7. The method of claim 5 or 6 including a further step of normalizing the sub-image.
8. The method of any preceding claim wherein analyzing the sub-image includes using a Gabor Filter.
9. The method of any preceding claim wherein analyzing the sub-image includes segmenting layers of the sub-image with low resolution using Gabor analysis.
11. The method of any preceding claim wherein the sub-image is segmented into two parts, a real part and an imaginary part, each part being stored as a vector.
12. The method of claim 11 wherein comparing the texture data with reference information in the database is based on a hamming distance of the form:
D_0 = [ Σ_{i=1}^{N} Σ_{j=1}^{N} ( P_R(i,j) ⊗ Q_R(i,j) + P_I(i,j) ⊗ Q_I(i,j) ) ] / (2N²)
where P_R (Q_R) and P_I (Q_I) are the real part and the imaginary part, respectively.
13. A palmprint image capture apparatus including: an enclosure with a window in it, an image sensor disposed within the enclosure and arranged to capture an image through the window, a light source disposed to illuminate the window, and at least one protuberance adjacent the window, wherein the protuberance(s) is/are arranged to be in known juxtaposition to a hand suitably placed on the window for capture of an image that includes the palm area of the hand.
14. The apparatus of claim 13 wherein the protuberances are pegs or pins disposed to be between the two or more fingers of the hand suitably placed on the window.
15. The apparatus of claims 13 or 14 wherein the light source is an annulus with the image sensor at its center.
16. The apparatus of any one of claims 13 to 15 wherein the image sensor is a Charge-Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensor.
17. A palmprint image capture apparatus including: an enclosure having a window in it, an image sensor disposed within the enclosure and arranged to capture an image through the window, a light source disposed to illuminate the window, a controller operable to control the image sensor and light source for capturing an image, and at least one protuberance, wherein the protuberance(s) is/are arranged to be in known juxtaposition to a hand suitably placed on the window for capture of an image that includes the palm area of the hand.
18. The apparatus of claim 17 wherein the protuberances are pegs or pins disposed to be between the two or more fingers of the hand suitably placed on the window.
19. The apparatus of claims 17 or 18 wherein the light source is an annulus with the image sensor at its center.
20. The apparatus of any one of claims 17 to 19 wherein the image sensor is a Charge-Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensor.





