WO2000070542A1 - Biometric system for biometric input, comparison, authentication and access control and method therefor - Google Patents

Biometric system for biometric input, comparison, authentication and access control and method therefor

Info

Publication number
WO2000070542A1
Authority
WO
WIPO (PCT)
Prior art keywords
minutia
data
image data
data set
biometric
Prior art date
Application number
PCT/US2000/013322
Other languages
French (fr)
Inventor
Roman Rozenberg
Tatyana Rozenberg
Sergey Olegovich Novikov
Boris Ivanovich Kotenev
Viacheslav Nikolajevch Koptelov
Jurig Jakovlevich Kharon
Igor Viadimirovich Matveev
Mukafat Davudogly Vagabov
Oleg Mikhailovich Chernomordik
Original Assignee
Biolink Technologies International, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Biolink Technologies International, Inc. filed Critical Biolink Technologies International, Inc.
Priority to JP2000618914A priority Critical patent/JP2003536121A/en
Priority to AU51350/00A priority patent/AU5135000A/en
Publication of WO2000070542A1 publication Critical patent/WO2000070542A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/83Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0336Mouse integrated fingerprint sensor

Definitions

  • the present invention relates to a system and method for biometric input, comparison, and authentication and, more particularly, to a biometric input device having a scanning window with a ridge structure, illuminated prism, an image detector and scanning electronics operable in conjunction with biometric data comparison system for comparing directional and minutia data.
  • the biometric data comparison system provides for controlled access to a computing system based upon comparison of inputted biometric data with biometric data stored in a database.
  • the system and method of the present invention further provides for secure communication of biometric data over public lines.
  • Biometric input devices are known for use with computing systems. Such biometric input devices include computer mouse designs. Existing designs for such biometric input devices have scanning windows lacking efficient positioning structure for scanning positioning and protection from ambient light, and do not provide mechanical integration of a position sensing ball assembly with an optical scanning assembly maximizing reliability of position sensing ball operation. Biometric data comparison methods and systems are known. Such known systems and methods suffer from various drawbacks including intensive computing power requirements, intensive memory requirements, slow data transfer, slow comparison, and comparison reliability reduction due to environmental and physiological factors. Known systems also fail to provide for secure communication of biometric data over public lines.
  • the present invention provides a biometric input device, system and method which includes a biometric input device having a scanning window surrounded by a ridge for ensuring positive positioning of a biometric sample such as a thumb.
  • the biometric input device includes an optical assembly having a prism with a focusing lens disposed on a side thereof and optionally integrally formed therewith.
  • a biometric comparison method for comparing data from said biometric input device with data from a database using both directional image comparison and clusterized minutia location and direction comparison.
  • a further system is provided for allowing access to computer functions based on the outcome of the comparison method.
  • the present invention also provides a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body wall defining an aperture and an optical assembly for scanning the fingerprint disposed in the device body.
  • the optical assembly has a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly.
  • a ridge surrounds a portion of a periphery of the aperture such that the ridge engages the opposing tip sides and tip end such as to position the fingerprint on the scanning surface and block ambient light.
  • a further feature of the present invention includes the aforesaid biometric input device having a device body with a bottom surface opposing a substrate upon which the device body is placed, a device body length and a front portion, a middle portion and a heel portion.
  • a movement detection device for detecting movement of the device body relative the substrate is provided and the bottom surface defines a bottom surface aperture through which the movement detection device detects movement of the device body relative the substrate.
  • the bottom surface aperture is disposed in the heel portion of the device body and the optical assembly is disposed in the middle portion of the device body.
  • the movement detection device has a ball protruding through the bottom surface aperture for engaging the substrate to register the movement of the device body relative the substrate.
  • a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body side wall defining an aperture, an optical assembly for scanning the fingerprint disposed in the device body.
  • the optical assembly includes an imaging component for converting a light image into pixel output and a lens for focusing the light image into the imaging component.
  • the optical assembly includes a prism with first, second and third sides and a top side wherein the first side forms a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly, the second side has the lens for focusing the light image into the imaging component disposed thereon, and the third side has a light absorbing layer.
  • the present invention also includes the above embodiment wherein, in the alternative or in combination with one another, the lens is formed integrally with the prism and a light emitting device is disposed to emit light into the prism from the top side of the prism to illuminate the fingerprint when disposed at the scanning surface.
  • a biometric comparison method comprising a series of steps beginning with (a) scanning in a fingerprint and digitizing scanning signals to produce a matrix of print image data representing pixels. Next the method proceeds with (b) dividing the print image data into cells, each including a number of pixel data for contiguous pixels, and (c) calculating a matrix of directional image data DI using gradient statistics applied to the cells wherein the directional image data DI includes for each of the cells a cell position indicator and one of a cell vector indicative of a direction of ridge lines and an unidirectional flag indicative of a nondirectional calculation result.
  • Processing then continues with (d) skeletonizing the print image data, and (e) extracting minutia from the print image data and producing a minutia data set comprised of data triplets for each minutia extracted including minutia position data and minutia direction data.
  • a comparing process is initiated by (f) providing reference fingerprint data from a database wherein the reference fingerprint data includes reference directional image data DI and a reference minutia data set, and (g) performing successive comparisons of the directional image data DI with the reference directional image data DI and determining a directional difference DifDI for each of the successive comparisons wherein for each of the successive comparisons one of the directional image data DI and the reference directional image data DI is positional shifted by adding position shift data.
  • a next stage of the comparison process proceeds with (i) positional shifting minutia data by applying the initial minutia shift data to one of the minutia data set and the reference minutia data set to initially positional shift the minutia position data and the minutia orientation data, then (j) performing successive comparisons of the minutia data set with the reference minutia data set following the positional shifting minutia data and determining matching minutia based on a minutia distance criteria, a number of matching minutia, and a similarity measure indicative of correspondence of the matching minutia for each of the successive comparisons wherein, for each of the successive comparisons, one of the minutia data set and the reference minutia data set is positional shifted within a minutia shift range R by adding minutia position shift data, and finally (k) determining a maximum similarity measure of the similarity measures of the successive comparisons.
  • the present invention also includes the above method wherein, as an alternative, the calculation of the directional image data includes (c1) identifying a directional group of cells comprising all cells of the cells that do not have the unidirectional flag associated therewith; and then excluding from the successive comparisons of minutia the minutia of one of the minutia data set and the reference minutia data set located in or positional aligned with the cells that have the unidirectional flag associated therewith.
  • the present invention further provides a feature for use in conducting the successive comparisons of minutia comprising dividing the minutia data set into minutia data set clusters formed on contiguous ones of the cells and each including a predetermined number of the minutia before conducting the successive comparisons, conducting the successive comparisons for each of the minutia data set clusters and determining for each of the minutia data set clusters a maximum similarity measure, and finally determining the maximum similarity measure as a sum of the maximum similarity measures of each of the minutia data set clusters.
  • the present invention also provides for the above comparison method excluding from further processing pairs of the minutia located within a minutia exclusion distance of one another and having minutia direction data within a direction exclusion limit of being in opposite directions.
  • the present invention further provides a feature wherein in the above comparison method the minutia extraction step extracts minutia limited to ends and bifurcations. Still further there is provided a feature wherein the minutia data set excludes data distinguishing ends and bifurcations.
  • biometric comparison system comprising a computer having a memory including a reference fingerprint data and at least one of file data and application software, a display, an apparatus for representing at least one of file data and application software as icons on the display, and a biometric input device for scanning a fingerprint and storing fingerprint data representing the fingerprint into the memory.
  • a comparison engine is provided for comparing the fingerprint data with the reference fingerprint data and determining a match if a similarity threshold is satisfied.
  • An access control icon generator permits a user to move an access control icon on the display and an access control means is provided for controlling access to the at least one of file data and application software when a user moves the access control icon onto the icon representing the at least one of file data and application software whereby access to the at least one of file data and application software is permitted only if a user scans a fingerprint producing fingerprint data which the comparison means determines matches the reference fingerprint data.
  • Figure 1a is a block diagram of a system of the present invention.
  • Figure 1b is a block diagram of an alternative system of the present invention
  • Figure 2a is a top plan simplified view of a biometric input device of the present invention
  • Figure 2b is a side elevation view of the biometric input device of Figure 2a showing internal components in dashed lines;
  • Figure 3a is a side elevation view of the biometric input device of Figure 2a showing surface contours
  • Figure 3B is a bottom perspective view of the biometric input device of Figure 2a showing surface contours and dimensional disposition of features;
  • FIG 4 is a block schematic of the biometric input device of Figure 2a;
  • FIG. 5 is a flow chart for operation of the biometric input device of Figure 2a;
  • Figure 6 is a flow chart of the comparison method of the present invention
  • Figure 7 is an illustration of a directional image analysis
  • Figure 8 (a) is an image of the fingerprint based on data received from an optical scanning assembly
  • Figure 8 (b) is an image of the fingerprint of Figure 8 (a) following low pass filtering
  • Figure 8(c) is an image of the fingerprint of Figure 8(a) following directional filtering and binarization;
  • Figure 8 (d) is an image of the fingerprint of Figure 8 (a) following skeletonization
  • Figure 9 (a) is a depiction of a bifurcation
  • Figure 9(b) is a depiction of an end
  • Figure 10 is a depiction of an analysis of two minutia for exclusion purposes
  • Figure 11 is a simplified depiction of a fingerprint image data FP1 divided into clusters.
  • Figure 12 is a simplified depiction of the clusters of Figure 11 applied with individual shifts to print image data FP2.
  • a computer 50 has a keyboard 52 and a biometric input device 54 with a scanning window 56 for accepting biometric input.
  • the computer 50 may take the form of a personal computer, a dedicated device such as an ATM machine, a dumb terminal, or a computer on the order of a workstation, minicomputer or mainframe.
  • the computer 50 is connected to a remote computer 51 via a link 53 which may be a direct link via phone lines or direct cabling, or via a network such as a LAN, WAN, intranet or Internet.
  • In order to gain access to use of the computer 50, or remote computer 51, for all or only specified functions, a user must provide a biometric input to the biometric input device 54 via the scanning window 56.
  • the computer 50 will be referred to, however, it is understood that the remote computer 51 may optionally perform the functions ascribed to the computer 50 with the computer 50 functioning as a terminal.
  • reference to gaining access to use of the computer 50 is understood to include the alternative of access to use of the remote computer 51.
  • the computer 50 compares biometric data, representing the biometric input, with stored biometric data and determines if the biometric data corresponds to any stored biometric data held in a data base. If a correspondence exists, the user is given authorization, that is, the user is allowed access to the computer 50 for performance of the specified functions or for use of the computer 50 in general.
  • the biometric input device 54 is connected to the computer 50 via an input cord 72.
  • an embodiment of the present invention has a port adaptor connector 57 connecting the input cord 72 to a corresponding port on the computer 50.
  • a stand-alone adaptor unit 58 channels data via the input cord 72 and a cable 59 to and from the computer 50.
  • the scanning window 56 and associated structure is incorporated in either the computer 50 or the keyboard 52.
  • the stand-alone biometric input device 54 is omitted and functions thereof are performed by the computer 50 or by circuitry incorporated in the keyboard 52. It is understood that functions discussed herein with respect to the biometric input device 54 and the computer 50 may optionally be distributed between the biometric input device 54 and the computer 50 as is practical .
  • the biometric input device 54 is shown in the form of a computer mouse 60.
  • the biometric input device may take the form of another type of input device such as a track ball, joystick, touch pad or other variety of input device.
  • the computer mouse 60 includes a left button 62, a right button 64, a ball 66, an X sensor 68, and a Y sensor 70.
  • Various means may be used to effect input from these devices including mechanical, optical or other.
  • optical means may be substituted for the ball 66 to detect mouse movement.
  • the input cord 72 connects to the computer 50 for effecting data transfer.
  • the input cord 72 is replaced by wireless means for effecting data transfer which operate using optical or electromagnetic transmission.
  • An optical assembly 80 includes a prism 82, a first lens 84, a mirror 86, a CCD assembly 88, and LEDs 89.
  • the prism 82 has first, second and third sides, 90, 92 and 94, respectively.
  • the first side 90 forms the surface of the scanning window 56. Coatings or a transparent plate may optionally be used to protect the first side 90.
  • the second side 92 has the first lens 84 fixed thereon or formed integrally with the prism 82.
  • the prism 82 is molded integrally with the first lens 84 which provides for reducing part count and simplifying assembly of the biometric input device 54.
  • the third side 94 has a light absorbing coating 96.
  • the CCD assembly 88 includes a CCD sensor 102 and a second lens 104 which functions as an object lens.
  • the first and second lenses 84 and 104 function in conjunction with the mirror 86, as shown by light ray tracings, to focus an image at the first surface 90 onto the CCD sensor 102.
  • Various other lens assemblies may optionally be realized by those of ordinary skill in the art and are considered to be within the scope and spirit of the present invention.
  • In order to input biometric data, a user holds the computer mouse 60 with the index, middle or third finger extended to operate the left and right buttons, 62 and 64, and with the thumb contacting the scanning window 56 to permit an image of a thumb print to be focused onto the CCD sensor 102. The user then operates any of the left and right buttons, 62 or 64, or other input device to initiate scanning of the thumb print. Alternatively, scanning may be automatically initiated by circuitry in the biometric input device 54 or the computer 50.
  • a front portion 109 of the computer mouse 60 refers to an end portion of the computer mouse 60 where the input cord 72 extends from and the left and right buttons, 62 and 64, are situated, a heel portion 110 comprises a rear end portion where a user's palm typically rests, and a middle portion 111 is an area where balls of the user's hand typically are situated.
  • the front portion 109, the heel portion 110, and the middle portion 111 are situated to define thirds of a length L of the computer mouse 60 extending from a front end of the front portion 109 to a rear end of the heel portion 110.
  • the scanning window 56 is situated on a side of the middle portion 111 and has a ridge 120 framing at least three sides of the scanning window 56.
  • the ridge 120 is configured to accept a perimeter of a user's thumb thereby defining a scanning position of the user's thumb in the scanning window 56. Furthermore, the ridge 120 serves to shield the scanning window 56 from ambient light during the scanning process and also to protect the scanning window 56 from damage.
  • the ball 66 is disposed with a center thereof within the heel portion 110 of the computer mouse 60. Such disposition of the ball 66 provides advantageous situation of the ball 66 under the palm of the user so that pressure from the palm during operation ensures positive contact of the ball 66 with a substrate upon which the computer mouse 60 is used.
  • the ball 66 is optionally disposed rearward of a mid-position in the computer mouse 60 wherein the mid-position is a middle of the length L of the computer mouse 60.
  • the ball 66 is situated either in the middle portion, forward of the mid-position in the computer mouse, or in the front portion.
  • a circuit board 140 contains circuitry for effecting scanning operation of the optical assembly 80.
  • a contact detection assembly may be realized wherein the scanning window 56 takes the form of a silicon contact sensor. In either configuration, a thumb print of the user is represented by data of an array of pixels.
  • the LEDs 89 are mounted on the circuit board 140 in a position above a top surface of the prism 82 to radiate light into the prism 82 for scanning the thumb print.
  • the embodiment shown has two LEDs, but it is realized that a single LED may be used or alternative light generating devices may be substituted therefor.
  • while the embodiment shown provides the LEDs 89 mounted on the circuit board 140, the LEDs 89 are alternatively mounted on the prism 82 or molded into the prism 82 at the top side in the same operation wherein the first lens 84 is molded integrally with the prism 82.
  • perspective depictions of the computer mouse 60 illustrate the length L of the computer mouse 60, the disposition of the ball 66 and ridge structure of the ridge 120.
  • the ridge 120 has an outer surface 122 extending outwardly from a side surface 126 of the computer mouse 60 and an inner surface 124 extending from a peak of the ridge structure to the scanning surface 56.
  • the ridge 120 is raised from the side surface 126 on at least three sides of the scanning window 56, that is, front, top and bottom sides.
  • a rise of the ridge 120 from the side surface 126 is optionally omitted to permit ease of insertion of the thumb against the scanning window 56.
  • the location of the ridge 120 on the three sides of the scanning window 56 ensures positive location of the thumb for scanning purposes to minimize scan to scan variations in positioning of the thumb print thereby facilitating thumb print comparisons.
  • the center of the ball 66 is shown rearward of the mid-position, the middle portion 111 which includes the middle third of the computer mouse 60, and the three quarter length position.
  • the outer surface 122 is concave but may optionally be flat or convex.
  • the inner surface 124 is concave but may optionally be flat or convex.
  • the outer surface 122 may be omitted with the inner surface 124 serving alone to position the thumb wherein the inner surface 124 defines a recess in the side surface 126.
  • the rising of the outer surface 122 from the side surface 126 provides for the side surface 126 protruding less outwardly from a mouse body centerline CL1 of the computer mouse 60, shown in Figure 2a, thereby providing for a functionally less cumbersome device.
  • a surface of the scanning window 56 is inclined with respect to the mouse body centerline CL1 to define an acute angle with respect thereto in the range of 5° to 25°, and preferably in the range of 10° to 20°.
  • a front edge of the scanning surface 56 is recessed inwardly toward the mouse body centerline CL1 from a position of the side wall 126 relative to the mouse body centerline CL1. Such positioning provides for an ergonomically advantageous positioning of the thumb when the computer mouse 60 is held.
  • the scanning window 56 has a length of about 30 mm and a width of about 18 mm.
  • the scanning window 56 is inclined in the vertical plane with respect to the substrate upon which the computer mouse 60 rests such that a longitudinal center line CL2 of the scanning surface defines an acute angle with respect to the substrate in the range of 0° to 25°, and preferably in the range of 5° to 15°.
  • the prism 82 is a right angle prism with a forward acute angle in the range of 40° to 60° and preferably in the range of 45° to 55°.
  • the mirror 86 serves to redirect light to the CCD assembly 88 thereby providing for a compact arrangement of the optical assembly 80. In one embodiment the forward angle is about 50°.
  • a microcontroller 150 is interfaced with a CCD controller 152, a ROM 154, a RAM 156, and an A/D converter 158. Output from the CCD sensor 102 is input to the A/D converter 158 where it is digitized.
  • the CCD controller 152 effects scanning of the CCD sensor 102 to transfer sensed levels of the pixels of the CCD sensor 102.
  • the microcontroller 150 further controls intensity of light produced by the LED 89.
  • An interface controller 160 is interfaced with the microcontroller 150 to effect communication with a serial port of the computer 50. Other interfaces may be employed permitting data communication with the computer 50.
  • the microcontroller 150 may optionally receive mouse input from the left and right mouse buttons, 62 and 64, and the x and y sensors, 68 and 70, and transmit the mouse input to the computer 50 to effect combined functions of thumb print scanning and mouse control.
  • the microcontroller 150 is optionally in the form of a programmable logic device (PLD) .
  • One such device is the FLEX10K from Altera.
  • the microcontroller 150 controls the CCD controller 152, determines a size and position of a frame, records image data of the frame into the RAM 156, and supports communication protocol with the interface controller 160, such as the RS-232 interface, the PS-2 interface, or the USB interface.
  • the ROM 154 stores program codes for the microcontroller 150 and may be programmed to effect operations over various interfaces. While discrete ICs are shown, it is realized that the functions of the ICs may be integrated in a single IC.
  • the CCD controller 152 effects reading of successive pixels and lines of the CCD sensor 102.
  • a matrix of data from the pixel array of the CCD sensor 102 forms the frame and is stored in the RAM 156.
  • the frame consists of data representative of the thumb print image and preferably excludes data from pixels not representative of the thumb print image. Thus, the frame represents a subset of data from a complete scanning of the CCD sensor 102. Accordingly, the amount of data to be processed and sent to the computer 50 is optionally reduced from that of an entire scan of the CCD sensor 102.
  • the interface controller 160 may be incorporated into an interface unit 162 for connecting the input cord 72 to the computer to permit operation over various interfaces by substitution of the interface unit 162 having the desired interface controller 160.
  • the interface unit 162 may be in a separate housing connectable to a desired input port, as shown in Figure 1a as the stand-alone adapter unit 58, or a connector housing itself as shown in Figure 1a as the port adapter connector 57. Implementation of the interface unit 162 is dictated by the type of port to be interfaced.
  • a parallel printer port interface, that is, a PS2 port interface, may be effected using a microcontroller and a PLD, for example, a ZILOG Corp. Z86E02 in conjunction with a FLEX8K PLD from Altera Corp.
  • the interface connector 162 is a separate housing which is connected to the computer's printer port with a cable and has a connector for the input cord 72 and for a parallel printer cable through which a printer may be interfaced to the computer 50. Power is supplied to the interface connector 162 and the computer mouse 60 via the PS2 port from the computer 50.
  • Data exchange for the computer mouse 60's usual mouse input, that is, input from the left and right buttons, 62 and 64, and the x and y sensors, 68 and 70, is effected using standard protocol for the PS2 mouse interface and the PLD based on output from the microcontroller 150 of the computer mouse 60.
  • a full speed USB interface at 12 MBaud may be effected using a processor in the interface unit 162, such as an Intel Corp. 930, which has built-in USB functions.
  • the interface unit 162 is optionally a separate housing in the form of a stand-alone adapter unit 58 which is connected to the computer's USB port with a cable 59, as shown in Figure 1a, and has a connector for the input cord 72. Power is supplied from the computer 50 for the interface unit 162 and the computer mouse 60 via the USB port.
  • a serial port interface, that is, a COM port interface, functioning at 115.2 Kbaud may be effected using a processor in the interface unit 162, such as an Atmel AT29C2051, and an RS232 voltage converter.
  • the interface unit 162 is optionally incorporated in a connector for connecting the input cord 72 to the computer 50's serial port. Power is supplied from the computer 50 via a further connector and is processed by the voltage converter to drive the computer mouse 60.
  • a flow chart is shown of operation of the computer mouse 60. Operation begins at a start point 200 and proceeds to decision step 205 to determine whether a read print command is received from the computer 50, referred to as "PC" in the flow chart, to read in a thumb print. If a "read print" command is received, the LED 89 is lit to a maximum level in step 210. Next, in step 215, data from the CCD sensor 102 is read. Following reading CCD data, a decision step 220 is executed to determine whether a finger is detected. When a finger is detected operation proceeds to a decision step 225 to determine whether the light level is acceptable, and if it is not the level is adjusted and operation returns to step 215.
  • operation proceeds to transmission step 230 wherein a message is sent to the computer 50 indicating that print data is to be sent.
  • a line of print data from the CCD sensor 102 is sent to the computer 50.
  • Operation then proceeds to a decision step 240 wherein it is determined whether the end of the image data has been sent to the computer 50. If transmission of the image data is not complete, a check is made in a status verification step 245 to see whether there is any mouse input, such as data from any of the left button 62, right button 64, X sensor 68, or Y sensor 70 input by the user and, if such data has been input, it is sent to the computer 50 in a transmission step 250.
  • Operation returns to the transmission step 235 wherein a next line of CCD data is sent to the computer 50 after the mouse input is sent to the computer 50 or if no mouse input is detected. If it is determined in the decision step 240 that transmission of image data is complete, operation returns to the beginning of the flow chart below the start step 200.
  • In step 205, if no read print command is received, operation proceeds to a status verification step 255 to see whether any mouse input has been input by the user and, if such data has been input, it is sent to the computer 50 in transmission step 260.
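The polling loop of Figure 5 can be summarized in a few lines of code. The following Python sketch is a hypothetical model of that loop only; the device and PC objects and their methods (read_print_command, read_ccd_frame, poll_mouse_input and so on) are invented placeholders rather than interfaces defined by the patent, and the sketch simply mirrors the ordering of steps 205 through 260 described above.

```python
# Hypothetical model of the Figure 5 polling loop (steps 200-260).
# All device and PC methods below are illustrative placeholders.

def device_loop(dev, pc):
    while True:
        if pc.read_print_command():                  # decision step 205
            dev.set_led_level(dev.MAX_LEVEL)         # step 210: LED to maximum
            frame = dev.read_ccd_frame()             # step 215: read CCD data
            if not dev.finger_detected(frame):       # decision step 220
                continue
            while not dev.light_level_ok(frame):     # decision step 225
                dev.adjust_led_level(frame)
                frame = dev.read_ccd_frame()         # back to step 215
            pc.send_message("PRINT_DATA_FOLLOWS")    # transmission step 230
            for line in frame.lines():               # steps 235-250: interleave
                pc.send_print_line(line)             # print lines with mouse input
                mouse = dev.poll_mouse_input()       # buttons, X/Y sensors
                if mouse:
                    pc.send_mouse_input(mouse)
        else:                                        # steps 255-260
            mouse = dev.poll_mouse_input()
            if mouse:
                pc.send_mouse_input(mouse)
```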
  • image data is also referred to as print data in reference to the input of a thumb print.
  • other types of biometric input may be used and the present invention may optionally be used to process such other data.
  • examples of such other data include a print image of any of the other digits or images of other unique biometric data such as retinal images.
  • Finger print image analysis may effect comparison of images.
  • the present invention further provides an analysis algorithm that effects comparison of special points maps which indicate where special points, also known as minutia, of a fingerprint are located. The fingerprint analysis algorithm considers a fingerprint not as a determined object but as a stochastic object.
  • Factors that randomize print image data include elasticity of skin, humidity, level of impurity, skin temperature, and individual characteristics of the user's finger-touch, among many other factors.
  • the basic generation of a special points map optionally includes multiple finger touches of the same finger, that is, a user's thumb print is optionally scanned multiple times.
  • Each image data from each scanning is referred to herein as a "standard.”
  • the term "reliability,” as used above, relates a probability of recognizing a registered user, that is, matching a user's thumb print data with thumb print data in the data base after one touch.
  • Figure 6 a flow chart of a fingerprint analyzing algorithm of the present invention is shown. The algorithm is described below wherein the following definitions apply:
  • UnDir (>Pi): mask value to detect the absence of FP in a current cell, for the n-th FP
  • In imaging step 300 the user's thumb print is scanned by the CCD sensor 102 and then digitized in step 305, wherein analog levels for each pixel of the CCD sensor 102 are digitized to form one byte per pixel.
  • the analog levels of the pixels are successively digitized by the A/D converter 158 and stored in the RAM 156.
  • a sequence of filtering and contrasting transformations is executed on the initial matrix of intensity data. The aim is to get a more "stable" image of the fingerprint (while touching).
  • the print image data FP is optionally transferred to the computer 50 as indicated in Figure 5.
  • the filtering and contrasting transformations may be executed by the microcontroller 150 in the computer mouse 60.
  • the matrix of intensity data from the CCD sensor 102, that is, the print image data FP, includes the fingerprint and surrounding "garbage".
  • a border between the print image and the "garbage" is defined and the "garbage" is excluded so that only the internal part of the print image, that is, the portion which includes ridge lines, takes part in the further analysis.
  • preprocessing of the print image data FP is carried out beginning with a scale normalization step 310 in which the scale of the print image data FP is normalized using standard routines.
  • the print image data FP is then used to calculate directional image data DI using gradient statistics in directional calculation step 315, wherein the print image is divided into cells having a size defined by Fx and Fy.
  • the print image data FP is divided into cells as shown by a grid superimposed on the print image and a vector normal to the direction of ridge lines in each cell is calculated. These vectors form the directional image data DI.
  • an array of directional image data F(i,j) is generated where i and j denote the cell and the value of F(i,j) is between 0 and Pi for directional cells or is set to UnDir for cells wherein a directional gradient cannot be determined such as for isolated pixels or pixel groups lacking directionality.
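A minimal Python sketch of the gradient-statistics calculation of the directional image described above is given below. The cell size, the coherence test used to decide when a cell is marked UnDir, and the sentinel value chosen for UnDir are assumptions of this sketch rather than values taken from the patent.

```python
import numpy as np

UNDIR = -1.0  # assumed sentinel for cells with no dominant ridge direction

def directional_image(fp, Fx=16, Fy=16, coh_thresh=0.1):
    """Gradient-statistics estimate of the directional image DI (step 315).

    For every Fx-by-Fy cell the dominant direction (an angle in [0, pi),
    normal to the ridge lines) is estimated from the averaged squared
    gradients; cells whose gradient coherence is too low are left UNDIR.
    Cell size and coherence threshold are illustrative values.
    """
    gy, gx = np.gradient(fp.astype(float))
    rows, cols = fp.shape[0] // Fy, fp.shape[1] // Fx
    di = np.full((rows, cols), UNDIR)
    for i in range(rows):
        for j in range(cols):
            sl = (slice(i * Fy, (i + 1) * Fy), slice(j * Fx, (j + 1) * Fx))
            gxx = np.sum(gx[sl] ** 2)
            gyy = np.sum(gy[sl] ** 2)
            gxy = np.sum(gx[sl] * gy[sl])
            denom = gxx + gyy
            if denom == 0:
                continue                            # flat cell stays UNDIR
            coherence = np.hypot(gxx - gyy, 2 * gxy) / denom
            if coherence < coh_thresh:
                continue                            # no dominant direction
            di[i, j] = (0.5 * np.arctan2(2 * gxy, gxx - gyy)) % np.pi
    return di
```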
  • the directional image data DI is then subjected to a smoothing process and its quality factor Q is determined in a smoothing and quality processing step 320.
  • the smoothing process includes first applying a low-pass filter and then a low-cut filter, after which a directional smoothing along the directions defined for each cell is effected.
  • Scale normalization, low-pass filtering, low-cut filtering, directional image calculation and smoothing are processes that are realizable by those of ordinary skill in the art. Accordingly, detailed discussions thereof are omitted.
  • the quality Q of a print image data FP is then calculated by determining a ratio of cells that remain substantially unchanged following the smoothing and quality processing step 320 to the total number of cells. This ratio is then squared and multiplied by the area of the print image data FP divided by the area of the entire scanned image. Thus, both the quality of the print image data FP and absence of image data corresponding to a fingerprint are taken into consideration.
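As a rough illustration of the quality factor Q just described, the sketch below squares the ratio of cells left substantially unchanged by the smoothing step and multiplies it by the ratio of the print image area to the entire scanned area. The tolerance used to decide that a cell is "substantially unchanged" is an assumed value.

```python
import numpy as np

UNDIR = -1.0  # same sentinel as in the directional-image sketch above

def quality_q(di_before, di_after, print_area, scan_area, tol=0.1):
    """Illustrative computation of the quality factor Q (steps 320-325).

    A cell counts as substantially unchanged if its direction moved by
    less than tol radians during smoothing; tol is an assumed tolerance.
    """
    valid = (di_before != UNDIR) & (di_after != UNDIR)
    unchanged = np.abs(di_before - di_after) < tol
    ratio = np.count_nonzero(valid & unchanged) / di_before.size
    return ratio ** 2 * (print_area / scan_area)
```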
  • Quality decision step 325 is then executed to determine whether the quality Q of the print image FP is above a given quality threshold. When the quality Q is below the given quality threshold, the process returns to the imaging step 300 for input of new data. This is because it is determined that the quality of the fingerprint is insufficient to base matching upon. If the quality is above the given threshold, processing proceeds to a binarization step 330.
  • the image data FP shown in Figure 8(a) is subjected to preliminary binarization using subtraction of low-pass filtering, producing the image shown in Figure 8(b), followed by directional filtering and binarization resulting in the image of Figure 8(c).
  • Processing continues with execution of a skeletonization step 335 wherein the image data FP is subjected to a thinning and skeletonization processing wherein all ridge lines are reduced to a width of one pixel, which results in the image shown in Figure 8(d). In this stage visible ridge lines that are several pixels in width are changed to lines one pixel in width.
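The patent does not prescribe a particular thinning algorithm for step 335; a standard library routine produces the same one-pixel-wide skeleton, as in this sketch (scikit-image is used here purely as a stand-in).

```python
from skimage.morphology import skeletonize

def skeletonize_print(binary_fp):
    """Reduce binarized ridge lines to one-pixel-wide lines (step 335).

    binary_fp is a boolean array that is True on ridge pixels; the choice
    of scikit-image's skeletonize is an assumption of this sketch.
    """
    return skeletonize(binary_fp.astype(bool))
```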
  • a minutia extraction step 340 is next executed upon the image data FP that has been skeletonized. Fingerprints are characterized by various minutia which are particular patterns of the ridges. Two basic types of minutia are a bifurcation 400, or branch, shown in Figure 9(a), wherein a ridge line 402 divides into two ridge lines, 403 and 405, and an end 410, shown in Figure 9(b), wherein a ridge line 412 ends.
  • Each minutia is characterized as a vector represented by a minutia data triplet X, Y, and A wherein X and Y represent the location of the minutia and A is an angle of a vector of the direction of the minutia as shown in Figures 9(a) and 9(b).
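One common way to locate ends and bifurcations on a skeleton is to count the 8-neighbours of each ridge pixel; the sketch below uses that rule to build the (X, Y, A) triplets. Taking the angle A from the directional-image cell containing the point, and the neighbour-counting rule itself, are simplifying assumptions of this sketch rather than steps recited in the patent.

```python
import numpy as np

UNDIR = -1.0  # sentinel from the directional-image sketch above

def extract_minutia(skel, di, Fx=16, Fy=16):
    """Sketch of minutia extraction (step 340) on a skeletonized image.

    A skeleton pixel with exactly one 8-neighbour is treated as an end,
    and one with three neighbours as a bifurcation; consistent with the
    text, the two kinds are not distinguished in the output triplets.
    """
    sk = skel.astype(np.uint8)
    minutia = []
    for y in range(1, sk.shape[0] - 1):
        for x in range(1, sk.shape[1] - 1):
            if not sk[y, x]:
                continue
            neighbours = sk[y - 1:y + 2, x - 1:x + 2].sum() - 1
            if neighbours in (1, 3):                  # end or bifurcation
                i, j = y // Fy, x // Fx
                if i < di.shape[0] and j < di.shape[1] and di[i, j] != UNDIR:
                    minutia.append((x, y, float(di[i, j])))  # triplet (X, Y, A)
    return minutia
```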
  • In the present invention, distinctions between end minutia 410 and bifurcation minutia 400 are not made. It is found that exclusion of such distinction results in reduction of data, reduced processing needs and time, while still providing acceptable reliability of fingerprint comparison. Alternatively, distinction may be made with an associated increase in processing.
  • the minutia extraction step 340 further proceeds with exclusion of minutia that are too closely located.
  • two end minutia at (x1, y1) and (x2, y2), respectively, and represented by vectors (p1,q1) and (p2,q2), respectively, are shown.
  • This threshold distance is optionally a distance r used to determine matching minutia, discussed below, a fixed distance, or another distance based on mean ridge line separation distance.
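A sketch of the exclusion of too-closely located minutia might look as follows; the numeric exclusion distance and the angular limit for "nearly opposite" directions are illustrative assumptions, not values given in the text.

```python
import numpy as np

def exclude_close_opposites(minutia, d_excl=10.0, a_excl=0.3):
    """Drop pairs of minutia closer than d_excl whose directions are
    within a_excl radians of being exactly opposite (typically a broken
    ridge rather than two genuine minutia).  Both limits are assumed
    illustrative values.
    """
    drop = set()
    for i in range(len(minutia)):
        for j in range(i + 1, len(minutia)):
            x1, y1, a1 = minutia[i]
            x2, y2, a2 = minutia[j]
            if np.hypot(x1 - x2, y1 - y2) >= d_excl:
                continue
            diff = abs(a1 - a2) % (2 * np.pi)
            if abs(diff - np.pi) < a_excl:           # nearly opposite directions
                drop.update((i, j))
    return [m for k, m in enumerate(minutia) if k not in drop]
```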
  • the minutia extraction is advantageous in reducing the amount of data to be processed and thereby reducing the processing time and requirements.
  • FP1 now refers to the image data of the input fingerprint and FP2 refers to print image data of a fingerprint retrieved from the database in database retrieval step 347.
  • other variables are appended with 1 or 2 to represent the respective fingerprint.
  • the values of fa, fdx, fdy are iteratively varied and for each permutation thereof the transformation of F1(fa, fdx, fdy)(i,j) is made and compared with F2(i,j) to find a DifDI for each set of fa, fdx, fdy values.
  • a set of fa, fdx, fdy values is then chosen for which DifDI is minimal.
  • the chosen set of fa, fdx, fdy represents the best shifting parameters for shifting the directional image D1 to effect the best matching directional alignment of D1 and D2.
  • BI is determined as the number of cells (i,j) of either D1 or D2 that are not UnDir.
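The search over the shift parameters fa, fdx, fdy can be sketched as below. For brevity the rotation fa is applied only to the cell angles and not to the cell grid itself, the shift is applied with a simple array roll, and the search ranges are assumed values; DifDI is taken here as the mean angular difference over cells that are directional in both images.

```python
import itertools
import numpy as np

UNDIR = -1.0  # sentinel from the directional-image sketch above

def best_directional_shift(f1, f2, max_shift=3, angles=np.linspace(-0.3, 0.3, 7)):
    """Sketch of the search over (fa, fdx, fdy) in matching step 345.

    f1 and f2 are directional images (angles in [0, pi) or UNDIR).  The
    (fa, fdx, fdy) giving the minimal DifDI is returned along with that
    minimal difference.
    """
    best_dif, best_params = np.inf, (0.0, 0, 0)
    shifts = range(-max_shift, max_shift + 1)
    for fa, fdx, fdy in itertools.product(angles, shifts, shifts):
        shifted = np.roll(f1, (fdy, fdx), axis=(0, 1))
        rotated = np.where(shifted == UNDIR, UNDIR, (shifted + fa) % np.pi)
        both = (rotated != UNDIR) & (f2 != UNDIR)
        if not both.any():
            continue
        d = np.abs(rotated[both] - f2[both])
        d = np.minimum(d, np.pi - d)                 # angular distance modulo pi
        dif_di = d.mean()
        if dif_di < best_dif:
            best_dif, best_params = dif_di, (fa, fdx, fdy)
    return best_dif, best_params
```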
  • a directional difference decision step 350 is next executed wherein the minimal DifDI for the chosen set of fa, fdx, fdy is compared against a threshold DifDI_TH which may be a set threshold or a threshold based on BI. If DifDI exceeds the threshold DifDI_TH, then it is determined that the correspondence level, or matching level, between the directional images is insufficient to warrant further comparison of FP1 and FP2, a different fingerprint image data is chosen for FP2, and processing returns to the beginning of the matching process step 345. If DifDI is less than the threshold, operation proceeds to similarity measure calculation step 355.
  • the chosen set of fa, fdx, fdy for orthogonal transformation is applied as (fdx*Fstepx, fdy*Fstepy and fa) to the minutia data triplets X1(k), Y1(k), and A1(k) of FP1, where k represents the k-th minutia.
  • the transformed minutia data triplets of print image data FP1 are then grouped into clusters each containing not less than a given number of minutia, preferably seven.
  • FP1 is illustrated as being divided into four clusters CS1, CS2, CS3, and CS4, each of which contains the given number of minutia (not shown).
  • Figure 11 is a simplified depiction of the process in that the clusters do not necessarily cover square regions of the print image and the number of clusters is not limited to four. The clusters may be thought of as regional groupings of minutia.
  • X1(k), Y1(k) of the minutia of the given cluster are all iteratively shifted in the x and y directions by values dr, wherein dr is varied within a range R, such that abs(dr) < R, and a comparison of the shifted X1(k), Y1(k), A1(k) is made against all minutia in a BI grouping of FP2 for each set of dr values to identify minutia of FP1 matching those of FP2.
  • a pair of minutia are considered matched when a distance between them is less than a threshold r discussed below.
  • the BI grouping of FP2 is the group of cells in FP2 that are not UnDir.
  • For each set of dr values a similarity measure Smt is taken, which is the sum, over each pair of matched minutia in the cluster, of a term that depends on the distance d between the matched minutia and on parameters a and R. In one embodiment a is 150, R is set equal to R1, where R1 equals 30, or R2, where R2 equals 20, R1 and R2 being discussed below, and a further parameter is set equal to 4.
  • the set of dr values yielding the greatest similarity measure Smt is selected and the total sum of the greatest similarity measure of each cluster is taken to find a similarity measure Smt(FP1, FP2) for the comparison of FP1 to FP2.
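A simplified sketch of the per-cluster shifting and matching is shown below. The similarity term for a matched pair is reduced to a count of 1 per match (the distance-dependent term of the preceding paragraph is not reproduced), and the matching radius r is an assumed value; only the structure of the search over dr and the summing of the best per-cluster measures follows the text.

```python
import numpy as np

def cluster_similarity(cluster, fp2_minutia, R=30, r=8):
    """Per-cluster matching sketch: try every integer shift (dx, dy) with
    |dx|, |dy| < R, count cluster minutia landing within distance r of an
    FP2 minutia, and keep the shift giving the largest similarity.  The
    unit similarity term and the value of r are assumptions of this sketch.
    """
    if not fp2_minutia:
        return 0.0
    p2 = np.array([(x, y) for x, y, _ in fp2_minutia], dtype=float)
    best = 0.0
    for dx in range(-R + 1, R):
        for dy in range(-R + 1, R):
            smt = 0.0
            for x, y, _ in cluster:
                d = np.hypot(p2[:, 0] - (x + dx), p2[:, 1] - (y + dy))
                if d.min() < r:
                    smt += 1.0
            best = max(best, smt)
    return best

def total_similarity(clusters, fp2_minutia, **kw):
    """Smt(FP1, FP2): sum of the best per-cluster similarity measures."""
    return sum(cluster_similarity(c, fp2_minutia, **kw) for c in clusters)
```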
  • comparison of fingerprints is often hampered by various environmental and physiological factors.
  • the division of FP1 into clusters provides compensation in part for such factors as stretching and shrinking of the skin.
  • the total distance difference due to stretching or shrinkage between two minutia is limited due to the limited size of the cluster area.
  • adverse effects of shrinking and stretching are minimized.
  • individual cluster shifting and comparison are a preferred embodiment of the present invention.
  • division of FPl into clusters may be omitted and shifting and comparison of FPl as a whole effected.
  • the maximum similarity measure Smt(FP1, FP2) is generated for the best comparisons of all clusters of FP1 with FP2, along with a number Nmat of matched minutia, and a number Ntot which is the total number of minutia within the BI grouping of FP1.
  • An overall similarity measure for the comparison of FP1 with FP2 is calculated as follows:
  • Nmt(R, r, BI, Ntot) = Smt(FP1, FP2) - DifDI
  • Smt(FP1, FP2) is a sum of the best Smt of each cluster.
  • Nmt(R, r, BI, Ntot) is compared with a threshold Thr(R, r, BI, Ntot). If Nmt(R, r, BI, Ntot) is greater than the threshold Thr(R, r, BI, Ntot), it is determined that FP1 matches FP2 and a match is indicated in match indication step 365.
  • If Nmt(R, r, BI, Ntot) is less than or equal to the threshold Thr(R, r, BI, Ntot), it is determined that FP1 does not match FP2 and execution proceeds to the database retrieval step 347 for the selection of another set of print data from the database for use as FP2, and the process returns to the matching process step 345. Indication of a match is then used to permit access to the computer 50 in general or to specific functions thereof.
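Under the reconstruction of the formula above, the decision of steps 360 through 365 reduces to a single comparison; the sketch below assumes Nmt is simply Smt minus DifDI, which is how the garbled formula has been read here.

```python
def is_match(smt_total, dif_di, threshold):
    """Decision of steps 360-365: compare the overall measure
    Nmt = Smt(FP1, FP2) - DifDI with the trained threshold Thr."""
    nmt = smt_total - dif_di
    return nmt > threshold
```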
  • In a preferred embodiment of the invention, the threshold Thr(R, r, BI, Ntot) is determined on the basis of threshold training using a sample pool of fingerprints from a number of individuals.
  • the sample pool is composed of a number of samples, or standards, from each individual in the pool.
  • the number of samples from each individual in one example is 4 and the number of individuals is in a range of 100 to 1000.
  • the number of samples and individuals may be varied from the exemplary values and range without departing from the scope and spirit of the present invention.
  • the process steps 305 through 355 of Figure 6 are then executed for each print with every print being compared to every other print. Since the sample pool is known, comparisons of prints from a same individual and comparisons of prints from different individuals are known.
  • MID is the mean inter-ridge distance of the prints in the sample pool. The following values are found:
  • NmtS(R1, r1, BI, Ntot), NmtA(R1, r1, BI, Ntot), and NmtS(R2, r2, BI, Ntot), NmtA(R2, r2, BI, Ntot), where NmtS is the number of matched minutia for prints compared from the same individual while NmtA is the number of matching minutia resulting from the comparison of fingerprints from different individuals.
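The threshold training described above can be sketched as an all-pairs comparison over the sample pool. The rule used here for placing the threshold, halfway between the highest impostor score and the lowest genuine score, is an assumption; the patent only states that the threshold is derived from the NmtS and NmtA populations.

```python
import itertools

def train_threshold(samples, compare):
    """Sketch of threshold training over the sample pool.

    `samples` maps an individual identifier to a list of print data sets
    (for example four standards each); `compare` returns the Nmt score for
    a pair of prints.  Every print is compared with every other print, the
    scores are split into genuine (same individual, NmtS) and impostor
    (different individuals, NmtA) populations, and the threshold is placed
    halfway between them.
    """
    genuine, impostor = [], []
    prints = [(pid, fp) for pid, fps in samples.items() for fp in fps]
    for (id1, fp1), (id2, fp2) in itertools.combinations(prints, 2):
        score = compare(fp1, fp2)
        (genuine if id1 == id2 else impostor).append(score)
    return (min(genuine) + max(impostor)) / 2.0
```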
  • the similarity decision step 360 produces a positive match indication if for the current BI, Ntot:
  • the complete description to be stored in the database is a multilevel structure of 4 (or more) FP data sets taken from the different applications of the same FP.
  • Each level of the structure corresponds to minutia appearance frequencies for all FP codes.
  • Instead of using thresholds for the similarity comparison as discussed above, fixed values may be chosen and used as threshold values.
  • the data base of fingerprints of individuals for whom identification is required is created by a registration process.
  • the registration process entails a given individual having their fingerprints scanned a number of times, for example four. Of the four scannings, the scanning producing the greatest number of minutia is then selected for the database.
  • the present invention further includes use of the above fingerprint minutia extraction and comparison process in conjunction with a cryptographic protection process.
  • the computer 50, also referred to as the client, communicates with the remote computer 51, also referred to as the server, via the link 53, which may be, for example, a link over the Internet.
  • security protection for data sent over the link 53 is required.
  • In order to use the cryptographic process, the user must first register his fingerprint with the server. In order to maintain security, the fingerprint data must be encrypted to prevent unauthorized interception thereof. The following steps are used:
  • User fills in a registration form including a UserID.
  • Other information such as Name, E-mail address, etc. may be included.
  • the image data is typically on the order of 64 KB.
  • This data set is also referred to herein as a passport.
  • components of the data set may be omitted, such as F(i,j), so the passport may be shortened to about 1.2 KB.
  • the client, computer 50, then sends a request for the public key to the server via the link 53.
  • Server sends its public key KE via the link 53.
  • Client encrypts its passport and UserID using the RSA algorithm and the public key KE.
  • the length of the key is 512 bits.
  • the computer 50 sends C to the remote computer 51 via the link 53.
  • the remote computer 51 decrypts the message using its secret key KL.
  • the remote computer 51 then adds the UserID and passport to the database.
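A sketch of the registration exchange, using the Python cryptography package, is given below. Because raw RSA can only encrypt a block smaller than the modulus, the roughly 1.2 KB passport is split into chunks and each chunk is encrypted with PKCS#1 v1.5 padding; the chunking, the padding choice and the helper names are assumptions of this sketch rather than details recited in the patent, and a 512-bit key is far too small for real use.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding

def make_keys(bits=512):
    """Server side: generate the RSA key pair (512 bits as in the text)."""
    priv = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    return priv, priv.public_key()

def encrypt_passport(public_key, user_id: bytes, passport: bytes):
    """Client side: encrypt UserID and passport with the server's public key.

    The passport is split into chunks small enough for one RSA block; the
    chunking and the PKCS#1 v1.5 padding are assumptions of this sketch.
    """
    data = user_id + b"\x00" + passport
    block = public_key.key_size // 8 - 11          # PKCS#1 v1.5 overhead
    return [public_key.encrypt(data[i:i + block], padding.PKCS1v15())
            for i in range(0, len(data), block)]

def decrypt_passport(private_key, blocks):
    """Server side: recover UserID and passport for storage in the database."""
    data = b"".join(private_key.decrypt(b, padding.PKCS1v15()) for b in blocks)
    user_id, passport = data.split(b"\x00", 1)
    return user_id, passport
```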
  • the user authorization process is used where a user wishes to gain access to the remote computer on the basis of his fingerprint matching one in the database.
  • the computer converts the image of the finger to the passport using processing steps 310 through 340 shown in Figure 6.
  • the computer 50 sends a request over the link 53 to the remote computer 51, the server, for the public key.
  • the remote computer 51 sends its public key KE to the computer 50.
  • the computer 50 sends C to the remote computer 51 via the link 53.
  • the remote computer 51 searches the database for the UserID, finds the corresponding passport, and executes steps 345 through 365 of Figure 6 using the passport retrieved from the database as FP2.
  • step 350 is omitted. If the comparison of step 360 is positive, access is authorized. If the UserID does not exist or the comparison result of step 360 is negative, authorization for access is refused. III. Installation of the server and addition of new users is effected by the following steps:
  • a further aspect of the present invention provides software for working in the Windows environment.
  • a protection icon is provided which an authorized user, one whose passport has produced a positive comparison, may move and drop on a file or program object to require that future access thereto be permitted only when a positive fingerprint comparison has been executed.
  • the user may input a list of UserIDs for whom access will be allowed.

Abstract

A biometric input device (54), system and method includes a biometric input device (54) having a scanning window (56) surrounded by a ridge (120) ensuring positive positioning of a biometric sample such as a thumb. The biometric input device (54) includes an optical assembly (80) having a prism (82) with a focusing lens (84) disposed on a side thereof and optionally integrally formed therewith. A biometric comparison method is provided for comparing data (300) from said biometric input device (54) with data from a database (347) using both directional image comparison (315) and clusterized minutia location (340) and direction comparison (350). A further system is provided for allowing access to computer functions based on the outcome of the comparison method.

Description

Description
BIOMETRIC SYSTEM FOR BIOMETRIC INPUT, COMPARISON,
AUTHENTICATION AND ACCESS CONTROL AND METHOD THEREFOR
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates to a system and method for biometric input, comparison, and authentication and, more particularly, to a biometric input device having a scanning window with a ridge structure, illuminated prism, an image detector and scanning electronics operable in conjunction with a biometric data comparison system for comparing directional and minutia data. The biometric data comparison system provides for controlled access to a computing system based upon comparison of inputted biometric data with biometric data stored in a database. The system and method of the present invention further provides for secure communication of biometric data over public lines.
Description of the Related Art
Biometric input devices are known for use with computing systems. Such biometric input devices include computer mouse designs. Existing designs for such biometric input devices have scanning windows lacking efficient positioning structure for scanning positioning and protection from ambient light, and do not provide mechanical integration of a position sensing ball assembly with an optical scanning assembly maximizing reliability of position sensing ball operation. Biometric data comparison methods and systems are known. Such known systems and methods suffer from various drawbacks including intensive computing power requirements, intensive memory requirements, slow data transfer, slow comparison, and comparison reliability reduction due to environmental and physiological factors. Known systems also fail to provide for secure communication of biometric data over public lines.
Summary of the Invention
Accordingly, it is an object of the invention to provide a system and method for biometric input and comparison which overcomes the drawbacks of the prior art. It is a further object of the invention to provide an ergonomically advantageous biometric input device which ensures increased precision in sampling biometric data.
It is still a further object of the invention to provide a biometric data comparison method which controls access to computers or data networks.
It is yet another object of the invention to provide a fingerprint comparison method which provides for accurate and rapid comparison of fingerprints while compensating for environmental and physiological factors. An object of the present invention is also to provide a biometric based access control system for use on computers which permits a user to graphically apply biometric access control features to data and applications by the use of a user manipulated biometric protection icon. Briefly stated, the present invention provides a biometric input device, system and method which includes a biometric input device having a scanning window surrounded by a ridge for ensuring positive positioning of a biometric sample such as a thumb. The biometric input device includes an optical assembly having a prism with a focusing lens disposed on a side thereof and optionally integrally formed therewith. A biometric comparison method is provided for comparing data from said biometric input device with data from a database using both directional image comparison and clusteπzed mmutia location and direction comparison. A further system is provided for allowing access to computer functions base on the outcome of the comparison method.
The present invention also provides a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body wall defining an aperture and an optical assembly for scanning the fingerprint disposed m the device body. The optical assembly has a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly. A ridge surrounds a portion of a periphery of the aperture such that the ridge engages the opposing tip sides and tip end such as to position the fingerprint on the scanning surface and block ambient light.
A further feature of the present invention includes the aforesaid biometric input device having a device body with a bottom surface opposing a substrate upon which the device body is placed, a device body length and a front portion, a middle portion and a heel portion. A movement detection device for detecting movement of the device body relative to the substrate is provided, and the bottom surface defines a bottom surface aperture through which the movement detection device detects movement of the device body relative to the substrate. The bottom surface aperture is disposed in the heel portion of the device body and the optical assembly is disposed in the middle portion of the device body. In an embodiment of the present invention the movement detection device has a ball protruding through the bottom surface aperture for engaging the substrate to register the movement of the device body relative to the substrate.
According to a feature of the invention, there is further provided a biometric input device for accepting a fingerprint of a finger tip having opposing tip sides and a tip end, comprising a device body having a body side wall defining an aperture, an optical assembly for scanning the fingerprint disposed in the device body. The optical assembly includes an imaging component for converting a light image into pixel output and a lens for focusing the light image into the imaging component. The optical assembly includes a prism with first, second and third sides and a top side wherein the first side forms a scanning surface at the aperture upon which the finger tip is placed for scanning of the fingerprint by the optical assembly, the second side has the lens for focusing the light image into the imaging component disposed thereon, and the third side has a light absorbing layer.
The present invention also includes the above embodiment wherein, in the alternative or in combination with one another, the lens is formed integrally with the prism and a light emitting device is disposed to emit light into the prism from the top side of the prism to illuminate the fingerprint when disposed at the scanning surface.
According to a still further feature of the invention, there is provided a biometric comparison method comprising a series of steps beginning with (a) scanning in a fingerprint and digitizing scanning signals to produce a matrix of print image data representing pixels. Next the method proceeds with (b) dividing the print image data into cells, each including a number of pixel data for contiguous pixels, and (c) calculating a matrix of directional image data DI using gradient statistics applied to the cells wherein the directional image data DI includes for each of the cells a cell position indicator and one of a cell vector indicative of a direction of ridge lines and a unidirectional flag indicative of a nondirectional calculation result. Processing then continues with (d) skeletonizing the print image data, and (e) extracting minutia from the print image data and producing a minutia data set comprised of data triplets for each minutia extracted including minutia position data and minutia direction data. Next, a comparing process is initiated by (f) providing reference fingerprint data from a database wherein the reference fingerprint data includes reference directional image data DI and a reference minutia data set, and (g) performing successive comparisons of the directional image data DI with the reference directional image data DI and determining a directional difference DifDI for each of the successive comparisons wherein for each of the successive comparisons one of the directional image data DI and the reference directional image data DI is positionally shifted by adding position shift data. In a next step (h) it is determined for which of the successive comparisons the directional difference DifDI is the least and the position shift data thereof is selected as initial minutia shift data. A next stage of the comparison process proceeds with (i) positionally shifting minutia data by applying the initial minutia shift data to one of the minutia data set and the reference minutia data set to initially positionally shift the minutia position data and the minutia orientation data, then (j) performing successive comparisons of the minutia data set with the reference minutia data set following the positional shifting of the minutia data and determining matching minutia based on a minutia distance criteria, a number of matching minutia, and a similarity measure indicative of correspondence of the matching minutia for each of the successive comparisons wherein, for each of the successive comparisons, one of the minutia data set and the reference minutia data set is positionally shifted within a minutia shift range R by adding minutia position shift data, and finally (k) determining a maximum similarity measure of the similarity measures of the successive comparisons. The comparison method concludes with (l) determining whether the maximum similarity measure is above a similarity threshold and indicating the reference fingerprint data and the fingerprint data are from the same fingerprint when the maximum similarity measure is above the similarity threshold.
The present invention also includes the above method wherein, as an alternative, the calculation of the directional image data includes (c1) identifying a directional group of cells comprising all cells of the cells that do not have the unidirectional flag associated therewith, and then excluding from the successive comparisons of minutia those minutia of one of the minutia data set and the reference minutia data set located in or positionally aligned with the cells that have the unidirectional flag associated therewith.
The present invention further provides a feature for use in conducting the successive comparisons of minutia comprising dividing the minutia data set into minutia data set clusters formed of contiguous ones of the cells and each including a predetermined number of the minutia before conducting the successive comparisons, conducting the successive comparisons for each of the minutia data set clusters and determining for each of the minutia data set clusters a maximum similarity measure, and finally determining the maximum similarity measure as a sum of the maximum similarity measures of each of the minutia data set clusters.
The present invention also provides for the above comparison method excluding from further processing pairs of the minutia located within a minutia exclusion distance of one another and having minutia direction data within a direction exclusion limit of being in opposite directions.
The present invention further provides a feature wherein in the above comparison method the minutia extraction step extracts minutia limited to ends and bifurcations. Still further there is provided a feature wherein the minutia data set excludes data distinguishing ends and bifurcations.
Yet another feature of the present invention is a biometric comparison system comprising a computer having a memory including reference fingerprint data and at least one of file data and application software, a display, an apparatus for representing at least one of file data and application software as icons on the display, and a biometric input device for scanning a fingerprint and storing fingerprint data representing the fingerprint into the memory. A comparison engine is provided for comparing the fingerprint data with the reference fingerprint data and determining a match if a similarity threshold is satisfied. An access control icon generator permits a user to move an access control icon on the display and an access control means is provided for controlling access to the at least one of file data and application software when a user moves the access control icon onto the icon representing the at least one of file data and application software, whereby access to the at least one of file data and application software is permitted only if a user scans a fingerprint producing fingerprint data which the comparison means determines matches the reference fingerprint data.
The above, and other objects, features and advantages of the present invention will become apparent from the following description read in conjunction with the accompanying drawings, in which like reference numerals designate the same elements.
Brief Description of the Drawings
For a fuller understanding of the nature of the present invention, reference should be had to the following detailed description taken in connection with the accompanying drawings in which:
Figure la is a block diagram of a system of the present invention;
Figure 1b is a block diagram of an alternative system of the present invention;
Figure 2a is a top plan simplified view of a biometric input device of the present invention;
Figure 2b is a side elevation view of the biometric input device of Figure 2a showing internal components in dashed lines;
Figure 3a is a side elevation view of the biometric input device of Figure 2a showing surface contours;
Figure 3B is a bottom perspective view of the biometric input device of Figure 2a showing surface contours and dimensional disposition of features;
Figure 4 is a block schematic of the biometric input device of Figure 2a;
Figure 5 is a flow chart for operation of the biometric input device of Figure 2a;
Figure 6 is a flow chart of the comparison method of the present invention;
Figure 7 is an illustration of a directional image analysis;
Figure 8(a) is an image of the fingerprint based on data received from an optical scanning assembly;
Figure 8(b) is an image of the fingerprint of Figure 8(a) following low pass filtering;
Figure 8(c) is an image of the fingerprint of Figure 8(a) following directional filtering and binarization;
Figure 8(d) is an image of the fingerprint of Figure 8(a) following skeletonization;
Figure 9(a) is a depiction of a bifurcation;
Figure 9(b) is a depiction of an end;
Figure 10 is a depiction of an analysis of two minutia for exclusion purposes;
Figure 11 is a simplified depiction of a fingerprint image data FP1 divided into clusters; and
Figure 12 is a simplified depiction of the clusters of Figure 11 individually shifted and applied to print image data FP2.
Like reference numerals refer to like parts throughout the several views of the drawings.
Detailed Description of the Preferred Embodiment
Referring to Figure 1A, a computer 50 has a keyboard 52 and a biometric input device 54 with a scanning window 56 for accepting biometric input. The computer 50 may take the form of a personal computer, a dedicated device such as an ATM, a dumb terminal, or a computer on the order of a workstation, minicomputer or mainframe. Optionally, the computer 50 is connected to a remote computer 51 via a link 53 which may be a direct link via phone lines or direct cabling, or via a network such as a LAN, WAN, intranet or the Internet. In order to gain access to use of the computer 50, or remote computer 51, for all or only specified functions, a user must provide a biometric input to the biometric input device 54 via the scanning window 56. Hereinafter the computer 50 will be referred to; however, it is understood that the remote computer 51 may optionally perform the functions ascribed to the computer 50, with the computer 50 functioning as a terminal. Likewise, reference to gaining access to use of the computer 50 is understood to include the alternative of access to use of the remote computer 51.
The computer 50 compares biometric data, representing the biometric input, with stored biometric data and determines if the biometric data corresponds to any stored biometric data held in a data base. If a correspondence exists, the user is given authorization, that is, the user is allowed access to the computer 50 for performance of the specified functions or for use of the computer 50 in general. The biometric input device 54 is connected to the computer 50 via an input cord 72. Alternatively, depending upon the type of port the biometric input device 54 uses to communicate with the computer 50, an embodiment of the present invention has a port adaptor connector 57 connecting the input cord 72 to a corresponding port on the computer 50. A still further alternative provides an embodiment of the present invention wherein a stand-alone adaptor unit 58 channels data via the input cord 72 and a cable 59 to and from the computer 50.
Referring to Figure 1B, an alternative configuration is shown wherein the scanning window 56 and associated structure are incorporated in either the computer 50 or the keyboard 52. In such instances, the stand-alone biometric input device 54 is omitted and the functions thereof are performed by the computer 50 or by circuitry incorporated in the keyboard 52. It is understood that functions discussed herein with respect to the biometric input device 54 and the computer 50 may optionally be distributed between the biometric input device 54 and the computer 50 as is practical.
Referring to Figures 2A and 2B, the biometric input device 54 is shown in the form of a computer mouse 60. Alternatively, the biometric input device may take the form of another type of input device such as a track ball, joystick, touchpad or other variety of input device. The computer mouse 60 includes a left button 62, a right button 64, a ball 66, an X sensor 68, and a Y sensor 70. Various means may be used to effect input from these devices including mechanical, optical or other means. For example, optical means may be substituted for the ball 66 to detect mouse movement. The input cord 72 connects to the computer 50 for effecting data transfer. Optionally, the input cord 72 is replaced by wireless means for effecting data transfer which operate using optical or electromagnetic transmission.
An optical assembly 80 includes a prism 82, a first lens 84, a mirror 86, a CCD assembly 88, and LEDs 89. The prism 82 has first, second and third sides, 90, 92 and 94, respectively. The first side 90 forms the surface of the scanning window 56. Coatings or a transparent plate may optionally be used to protect the first side 90. The second side 92 has the first lens 84 fixed thereon or formed integrally with the prism 82. Preferably, the prism 82 is molded integrally with the first lens 84, which provides for reducing part count and simplifying assembly of the biometric input device 54. The third side 94 has a light absorbing coating 96. The CCD assembly 88 includes a CCD sensor 102 and a second lens 104 which functions as an object lens. The first and second lenses 84 and 104 function in conjunction with the mirror 86, as shown by light ray tracings, to focus an image at the first side 90 onto the CCD sensor 102. Various other lens assemblies may optionally be realized by those of ordinary skill in the art and are considered to be within the scope and spirit of the present invention.
In order to input biometric data, a user holds the computer mouse 60 with the index, middle or third finger extended to operate the left and right buttons, 62 and 64, and with the thumb contacting the scanning window 56 to permit an image of a thumb print to be focussed onto the CCD sensor 102. The user then operates any of the left and right buttons, 62 or 64, or other input device to initiate scanning of the thumb print. Alternatively, scanning may be automatically initiated by circuitry in the biometric input device 54 or the computer 50.
The structural configuration of the computer mouse 60 is detailed below wherein a front portion 109 of the computer mouse 60 refers to an end portion of the computer mouse 60 where the input cord 72 extends from and the left and right buttons, 62 and 64, are situated, a heel portion 110 comprises a rear end portion where a user's palm typically rests, and a middle portion 111 is an area where the balls of the user's hand typically are situated. The front portion 109, the heel portion 110, and the middle portion 111 are situated to define thirds of a length L of the computer mouse 60 extending from a front end of the front portion 109 to a rear end of the heel portion 110.
The scanning window 56 is situated on a side of the middle portion 111 and has a ridge 120 framing at least three sides of the scanning window 56. The ridge 120 is configured to accept a perimeter of a user's thumb, thereby defining a scanning position of the user's thumb in the scanning window 56. Furthermore, the ridge 120 serves to shield the scanning window 56 from ambient light during the scanning process and also to protect the scanning window 56 from damage.

The ball 66 is disposed with a center thereof within the heel portion 110 of the computer mouse 60. Such disposition of the ball 66 provides advantageous situation of the ball 66 under the palm of the user so that pressure from the palm during operation ensures positive contact of the ball 66 with a substrate upon which the computer mouse 60 is used. The ball 66 is optionally disposed rearward of a mid-position in the computer mouse 60 wherein the mid-position is a middle of the length L of the computer mouse 60. In conventional configurations the ball 66 is situated either in the middle portion, forward of the mid-position in the computer mouse, or in the front portion. Such a construction is prone to intermittent contact of the ball with the substrate due to the user applying excessive downward force to the heel portion of the mouse, resulting in the front and middle portions rising from the substrate.

A circuit board 140 contains circuitry for effecting scanning operation of the optical assembly 80. As an alternative to the optical assembly 80, a contact detection assembly may be realized wherein the scanning window 56 takes the form of a silicon contact sensor. In either configuration, a thumb print of the user is represented by data of an array of pixels. The LEDs 89 are mounted on the circuit board 140 in a position above a top surface of the prism 82 to radiate light into the prism 82 for scanning the thumb print. The embodiment shown has two LEDs, but it is realized a single LED may be used or alternative light generating devices may be substituted therefor. Furthermore, although the embodiment shown provides the LEDs 89 mounted on the circuit board 140, the LEDs 89 are alternatively mounted on the prism 82 or molded into the prism 82 at the top side in the same operation wherein the first lens 84 is molded integrally with the prism 82.

Referring to Figures 3A and 3B, perspective depictions of the computer mouse 60 illustrate the length L of the computer mouse 60, the disposition of the ball 66 and the ridge structure of the ridge 120. The ridge 120 has an outer surface 122 extending outwardly from a side surface 126 of the computer mouse 60 and an inner surface 124 extending from a peak of the ridge structure to the scanning window 56. The ridge 120 is raised from the side surface 126 on at least three sides of the scanning window 56, that is, front, top and bottom sides. On a fourth side, a rear side, a rise of the ridge 120 from the side surface 126 is optionally omitted to permit ease of insertion of the thumb against the scanning window 56. The location of the ridge 120 on the three sides of the scanning window 56 ensures positive location of the thumb for scanning purposes to minimize scan to scan variations in positioning of the thumb print, thereby facilitating thumb print comparisons. The center of the ball 66 is shown rearward of the mid-position, the middle portion 111 which includes the middle third of the computer mouse 60, and the three quarter length position.
The outer surface 122 is concave but may optionally be flat or convex. Likewise, the inner surface 124 is concave but may optionally be flat or convex. Furthermore, the outer surface 122 may be omitted with the inner surface 124 serving alone to position the thumb wherein the inner surface 124 defines a recess in the side surface 126. However, the rising of the outer surface 122 from the side surface 126 provides for the side surface 126 protruding less outwardly from a mouse body centerline CL1 of the computer mouse 60, shown in Figure 2a, thereby providing for a functionally less cumbersome device.
Referring again to Figure 2a, a surface of the scanning window 56 is inclined with respect to the mouse body centerline CL1 to define an acute angle with respect thereto in the range of 5° to 25°, and preferably in the range of 10° to 20°. A front edge of the scanning window 56 is recessed inwardly toward the mouse body centerline CL1 from a position of the side surface 126 relative to the mouse body centerline CL1. Such positioning provides for an ergonomically advantageous positioning of the thumb when the computer mouse 60 is held. In one embodiment of the invention the scanning window 56 has a length of about 30 mm and a width of about 18 mm.
Referring again to Figure 2b, the scanning window 56 is inclined in the vertical plane with respect to the substrate upon which the computer mouse 60 rests such that a longitudinal center line CL2 of the scanning surface defines an acute angle with respect to the substrate in the range of 0° to 25°, and preferably in the range of 5° to 15°. Such positioning provides for a further ergonomically advantageous positioning of the thumb when the computer mouse 60 is held. The prism 82 is a right angle prism with a forward acute angle in the range of 40° to 60° and preferably in the range of 45° to 55°. The mirror 86 serves to redirect light to the CCD assembly 88, thereby providing for a compact arrangement of the optical assembly 80. In one embodiment the forward angle is about 50°.

Referring to Figure 4, an embodiment of the circuitry provided on the circuit board 140 is shown. A microcontroller 150 is interfaced with a CCD controller 152, a ROM 154, a RAM 156, and an A/D converter 158. Output from the CCD sensor 102 is input to the A/D converter 158 where it is digitized. The CCD controller 152 effects scanning of the CCD sensor 102 to transfer sensed levels of the pixels of the CCD sensor 102. The microcontroller 150 further controls the intensity of light produced by the LEDs 89. An interface controller 160 is interfaced with the microcontroller 150 to effect communication with a serial port of the computer 50. Other interfaces may be employed permitting data communication with the computer 50. Furthermore, the microcontroller 150 may optionally receive mouse input from the left and right mouse buttons, 62 and 64, and the X and Y sensors, 68 and 70, and transmit the mouse input to the computer 50 to effect combined functions of thumb print scanning and mouse control.
The microcontroller 150 is optionally in the form of a programmable logic device (PLD). One such device is the FLEX10K from Altera. The microcontroller 150 controls the CCD controller 152, determines a size and position of a frame, records image data of the frame into the RAM 156, and supports a communication protocol with the interface controller 160, such as the RS-232 interface, the PS-2 interface, or the USB interface.
The ROM 154 stores program codes for the microcontroller 150 and may be programmed to effect operations over various interfaces. While discrete ICs are shown, it is realized that the functions of the ICs may be integrated in a single IC. The CCD controller 152 effects reading of successive pixels and lines of the CCD sensor 102. A matrix of data from the pixel array of the CCD sensor 102 forms the frame and is stored in the RAM 156. The frame consists of data representative of the thumb print image and preferably excludes data from pixels not representative of the thumb print image. Thus, the frame represents a subset of data from a complete scanning of the CCD sensor 102. Accordingly, the amount of data to be processed and sent to the computer 50 is optionally reduced from that of an entire scan of the CCD sensor 102.
In an embodiment of the invention, the interface controller 160 may be incorporated into an interface unit 162 for connecting the input cord 72 to the computer 50 to permit operation over various interfaces by substitution of the interface unit 162 having the desired interface controller 160. The interface unit 162 may be in a separate housing connectable to a desired input port, as shown in Figure 1a as the stand-alone adapter unit 58, or in a connector housing itself as shown in Figure 1a as the port adapter connector 57. Implementation of the interface unit 162 is dictated by the type of port to be interfaced.
A parallel printer port interface (LPT), that is, a PS2 port interface, may be effected using a microcontroller and a PLD, for example, a ZILOG Corp. Z86E02 in conjunction with a FLEX8K PLD from Altera Corp. In such an instance the interface unit 162 is a separate housing which is connected to the computer's printer port with a cable and has a connector for the input cord 72 and for a parallel printer cable through which a printer may be interfaced to the computer 50. Power is supplied to the interface unit 162 and the computer mouse 60 via the PS2 port from the computer 50. Data exchange for the usual mouse input of the computer mouse 60, that is, input from the left and right buttons, 62 and 64, and the X and Y sensors, 68 and 70, is effected using the standard protocol for a PS2 mouse interface and the PLD based on output from the microcontroller 150 of the computer mouse 60.
A full speed USB interface at 12 MBaud may be effected using a processor in the interface unit 162, such as an Intel Corp. 930, which has built-in USB functions. In such an instance the interface unit 162 is optionally a separate housing in the form of a stand-alone adapter unit 58 which is connected to the computer's USB port with a cable 59, as shown in Figure 1a, and has a connector for the input cord 72. Power is supplied from the computer 50 for the interface unit 162 and the computer mouse 60 via the USB port.
A serial port interface, that is, a COM port interface, functioning at 115.2 KBaud may be effected using a processor in the interface unit 162, such as an Atmel AT29C2051, and an RS232 voltage converter. In such an instance the interface unit 162 is optionally incorporated in a connector for connecting the input cord 72 to the serial port of the computer 50. Power is supplied from the computer 50 via a further connector and is processed by the voltage converter to drive the computer mouse 60.
Referring to Figure 5, a flow chart is shown of operation of the computer mouse 60. Operation begins at a start point 200 and proceeds to decision step 205 to determine whether a read print command is received from the computer 50, referred to as "PC" in the flow chart, to read in a thumb print. If a "read print" command is received, the LEDs 89 are lit to a maximum level in step 210. Next, in step 215, data from the CCD sensor 102 is read. Following reading of the CCD data, a decision step 220 is executed to determine whether a finger is detected. When a finger is detected operation proceeds to a decision step 225 to determine whether the light level is acceptable, and if it is not the level is adjusted and operation returns to step 215. If the light level is acceptable, operation proceeds to transmission step 230 wherein a message is sent to the computer 50 indicating that print data is to be sent. In another transmission step 235 a line of print data from the CCD sensor 102 is sent to the computer 50. Operation then proceeds to a decision step 240 wherein it is determined whether the end of the image data has been sent to the computer 50. If transmission of the image data is not complete, a check is made in a status verification step 245 to see whether there is any mouse input, such as data from any of the left button 62, right button 64, X sensor 68, or Y sensor 70 input by the user and, if such data has been input, it is sent to the computer 50 in a transmission step 250. Operation returns to the transmission step 235, wherein a next line of CCD data is sent to the computer 50, after the mouse input is sent to the computer 50 or if no mouse input is detected. If it is determined in the decision step 240 that transmission of image data is complete, operation returns to the beginning of the flow chart below the start step 200.
In step 205, if no read print command is received, operation proceeds to a status verification step 255 to see whether any mouse input has been input by the user and, if such data has been input, it is sent to the computer 50 in transmission step 260.
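The flow of Figure 5 can be illustrated with a small, self-contained simulation. The following Python sketch is not part of the specification; the hardware objects and all names are hypothetical mocks, and the point is only to show how lines of print data and ordinary mouse input are interleaved on the single link to the computer 50.

    import random

    class MockHardware:
        """Hypothetical stand-in for the CCD sensor 102, LEDs 89 and mouse sensors."""
        def __init__(self, lines=4):
            self.lines = lines
            self.led_level = 0
        def command(self):
            return "READ_PRINT"            # pretend the PC issued a read print command (step 205)
        def read_frame(self):
            # one byte per pixel; a few short lines of fake CCD data (step 215)
            return [[random.randint(0, 255) for _ in range(8)] for _ in range(self.lines)]
        def finger_detected(self, frame):  # step 220
            return True
        def light_ok(self, frame):         # step 225
            return self.led_level >= 2
        def mouse_input(self):             # steps 245 and 255
            return None

    def scan_once(hw, send):
        if hw.command() != "READ_PRINT":
            evt = hw.mouse_input()         # steps 255-260: pass through ordinary mouse input
            if evt:
                send(("MOUSE", evt))
            return
        hw.led_level = 3                   # step 210: LEDs to maximum level
        frame = hw.read_frame()
        if not hw.finger_detected(frame):
            return
        while not hw.light_ok(frame):      # step 225: adjust light level and rescan
            hw.led_level += 1
            frame = hw.read_frame()
        send(("HEADER", "print data follows"))      # step 230
        for line in frame:                 # steps 235-250: send lines, interleaving mouse input
            send(("LINE", line))
            evt = hw.mouse_input()
            if evt:
                send(("MOUSE", evt))

    if __name__ == "__main__":
        scan_once(MockHardware(), lambda msg: print(msg[0], msg[1]))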
Once a complete set of image, or print, data is sent to the computer 50, the computer 50 then proceeds to process the data. In the present description, image data is also referred to as print data in reference to the input of a thumb print. However, it is realized that other types of biometric input may be used and that the present invention may optionally be used to process such other data. Examples of such other data include a print image of any of the other digits or images of other unique biometric data such as retinal images. Thus, such applications are considered to be within the scope and spirit of the present invention.
After the thumb print image is scanned in and the image data thereof transferred to the computer 50, the image data is then processed and added to a database of print image data or used to gain access to use of the computer 50 by comparison to previously stored print image data in the database. Hereinafter, using image data to gain access is referred to as an authorization process while entering print image data into the database is referred to as a registration process. Fingerprint image analysis may effect comparison of images. Alternatively, the present invention further provides an analysis algorithm that effects comparison of special points maps which indicate where special points, also known as minutia, of a fingerprint are located. The fingerprint analysis algorithm considers a fingerprint not as a determined object but as a stochastic object. There is a philosophical analogy, like Laplace's determinism and the stochastic picture of the world. Another analogy is that the first practically significant results in speech recognition appeared as soon as the first stochastic models of human speech had appeared. A discussion of standard approaches is found in the paper "A real-time matching system for large fingerprint databases," N.K. Ratha, K. Karu, S. Chen, and A.K. Jain, IEEE Trans. on PAMI, Aug. 1996, vol. 18, no. 8, pp. 799-813, which is incorporated herein by reference for its teaching relating to fingerprint analysis and modeling.
Factors that randomize print image data include elasticity of skin, humidity, level of impurity, skin temperature, and individual characteristics of the user's finger touch, among many other factors. The basic generation of a special points map optionally includes multiple finger touches of the same finger, that is, a user's thumb print is optionally scanned multiple times. Each image data from each scanning is referred to herein as a "standard." The greater the number of standards of a user stored in the database, the higher the reliability of the recognition. The shorter the process of studying multiple standards, the lower the reliability of recognition.
Applicants have conducted experiments showing that the reliability of recognition and the quantity of the standards exhibit the following relationship:

    Quantity of Standards    Reliability
    1                        89%
    3                        92%
    5                        95%
    7                        98%
    12                       99.5%
    20                       99.9%
The term "reliability," as used above, relates to the probability of recognizing a registered user, that is, matching a user's thumb print data with thumb print data in the database after one touch. Referring to Figure 6, a flow chart of a fingerprint analyzing algorithm of the present invention is shown. The algorithm is described below wherein the following definitions apply:
    VARIABLE                              DEFINITION
    Xn(i), Yn(i), An(i)                   description of the i-th minutia, wherein Xn is the x coordinate of the i-th minutia, Yn is the y coordinate of the i-th minutia, and An is the angle of the i-th minutia
    FP                                    fingerprint
    N                                     number of minutia of a fingerprint after extraction
    FPn                                   n-th fingerprint
    MID                                   mean inter-ridge distance
    DI                                    directional image
    Xmax, Ymax                            linear sizes of an input image
    Fx, Fy                                linear sizes (numbers of cells) of the directional image
    Fstepx = Xmax/Fx, Fstepy = Ymax/Fy    linear sizes of the cells onto which the initial image is distributed to get the directional image
    Fn(i,j)                               directional image for the n-th fingerprint
    Pi                                    discrete upper bound for 180 degrees
    BI                                    number of cells of the directional image that are not UnDir
    UnDir (>Pi)                           mask value to detect the absence of FP in a current cell, for the n-th FP
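For reference in the following steps, the quantities defined above can be collected into a simple record. The Python sketch below is illustrative only and is not taken from the specification; the field names are assumptions.

    from dataclasses import dataclass, field
    from typing import List

    PI_LEVELS = 180          # "Pi": discrete upper bound for 180 degrees
    UNDIR = PI_LEVELS + 1    # "UnDir": mask value > Pi for cells with no fingerprint

    @dataclass
    class Minutia:
        x: int        # Xn(i)
        y: int        # Yn(i)
        angle: int    # An(i), quantized to 0..PI_LEVELS

    @dataclass
    class FingerprintData:
        quality: float                     # Q
        directional: List[List[int]]       # F(i,j): values 0..PI_LEVELS or UNDIR
        minutiae: List[Minutia] = field(default_factory=list)   # N = len(minutiae)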
In imaging step 300, the user's thumb print is scanned by the CCD sensor 102 and then digitized in step 305, wherein analog levels for each pixel of the CCD sensor 102 are digitized to form one byte per pixel. Although depicted as separate operations, it is understood from the schematic of Figure 4 that the analog levels of the pixels are successively digitized by the A/D converter 158 and stored in the RAM 156. Next, a sequence of filtering and contrasting transformations is executed on the initial matrix of intensity data. The aim is to get a more "stable" image of the fingerprint (while touching). Following storage in the RAM 156, the print image data FP is optionally transferred to the computer 50 as indicated in Figure 5. However, in an alternative embodiment of the invention the filtering and contrasting transformations may be executed by the microcontroller 150 in the computer mouse 60. The matrix of intensity data from the CCD sensor 102, that is, the print image data FP, includes the fingerprint and surrounding "garbage". In an optional process a border between the print image and the "garbage" is defined and the "garbage" is excluded so that only the internal part of the print image, that is, the portion which includes ridge lines, takes part in the further analysis.
After the print image data FP is acquired, preprocessing of the print image data FP is carried out beginning with a scale normalization step 310 in which the scale of the print image data FP is normalized using standard routines. After the scale normalization step 310 the print image data FP is then used to calculate directional image data DI using gradient statistics in directional calculation step 315, wherein the print image is divided into cells having a size defined by Fx and Fy. Referring to Figure 7, the print image data FP is divided into cells as shown by a grid superimposed on the print image, and a vector normal to the direction of the ridge lines in each cell is calculated. These vectors form the directional image data DI. Thus, an array of directional image data F(i,j) is generated where i and j denote the cell and the value of F(i,j) is between 0 and Pi for directional cells or is set to UnDir for cells wherein a directional gradient cannot be determined, such as for isolated pixels or pixel groups lacking directionality. The directional image data DI is then subjected to a smoothing process and its quality factor Q is determined in a smoothing and quality processing step 320. The smoothing process includes first applying a low-pass filter and then a low-cut filter, after which a directional smoothing along the directions defined for each cell is effected. Scale normalization, low-pass filtering, low-cut filtering, directional image calculation and smoothing are processes that are realizable by those of ordinary skill in the art. Accordingly, detailed discussions thereof are omitted.
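Gradient statistics for step 315 can be realized in several ways; the sketch below is one assumed realization, not the patented code, in which each cell's dominant direction is estimated from averaged gradient products and cells without a dominant direction are flagged UnDir. The cell size and coherence threshold are illustrative.

    import numpy as np

    PI_LEVELS = 180
    UNDIR = PI_LEVELS + 1

    def directional_image(img, cell=16, coherence_min=0.2):
        """Estimate one direction per cell of `img` from gradient statistics."""
        img = img.astype(np.float64)
        gy, gx = np.gradient(img)                 # pixel gradients along rows and columns
        h, w = img.shape
        fy, fx = h // cell, w // cell
        di = np.full((fy, fx), UNDIR, dtype=int)
        for i in range(fy):
            for j in range(fx):
                sy = slice(i * cell, (i + 1) * cell)
                sx = slice(j * cell, (j + 1) * cell)
                gxx = np.sum(gx[sy, sx] ** 2)
                gyy = np.sum(gy[sy, sx] ** 2)
                gxy = np.sum(gx[sy, sx] * gy[sy, sx])
                denom = gxx + gyy
                coherence = np.hypot(gxx - gyy, 2 * gxy) / denom if denom > 0 else 0.0
                if coherence < coherence_min:
                    continue                      # no dominant direction: leave the cell UnDir
                theta = 0.5 * np.arctan2(2 * gxy, gxx - gyy)   # radians in (-pi/2, pi/2]
                di[i, j] = int(round(np.degrees(theta) % 180)) % PI_LEVELS
        return di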
The quality Q of the print image data FP is then calculated by determining a ratio of cells that remain substantially unchanged following the smoothing and quality processing step 320 to the total number of cells. This ratio is then squared and multiplied by the area of the print image data FP divided by the area of the entire scanned image. Thus, both the quality of the print image data FP and the absence of image data corresponding to a fingerprint are taken into consideration. Quality decision step 325 is then executed to determine whether the quality Q of the print image FP is above a given quality threshold. When the quality Q is below the given quality threshold, the process returns to the imaging step 300 for input of new data. This is because it is determined that the quality of the fingerprint is insufficient to base matching upon. If the quality is above the given threshold, processing proceeds to a binarization step 330.
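As a rough illustration of the quality measure Q described above, the following sketch compares the directional image before and after smoothing; the tolerance used to decide that a cell is "substantially unchanged" is an assumption, not a value from the specification.

    import numpy as np

    def quality(di_raw, di_smoothed, print_area, scan_area, tol=10, undir=181):
        """Q = (fraction of cells essentially unchanged by smoothing)^2 * (print area / scanned area)."""
        valid = (di_raw != undir) & (di_smoothed != undir)
        diff = np.abs(di_raw - di_smoothed)
        diff = np.minimum(diff, 180 - diff)        # directions are periodic modulo 180 degrees
        unchanged = valid & (diff <= tol)
        ratio = unchanged.sum() / di_raw.size
        return (ratio ** 2) * (print_area / scan_area)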
In the binarization step 330, the image data FP shown in Figure 8(a) is subjected to preliminary binarization using subtraction of low-pass filtering from the image data FP, producing the image shown in Figure 8(b), followed by directional filtering and binarization resulting in the image of Figure 8(c). Processing continues with execution of a skeletonization step 335 wherein the image data FP is subjected to thinning and skeletonization processing wherein all ridge lines are reduced to a width of one pixel, which results in the image shown in Figure 8(d). In this stage visible ridge lines that are some pixels in width are changed to lines one pixel in width. The values on the ridge lines are 1 and for all other areas the values are 0. Now the matrix consists of only two values. Detailed discussions of the filtering and skeletonization processes are omitted as such are realizable by those of ordinary skill in the art given the present disclosure.
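The preliminary binarization of step 330 amounts to removing the slowly varying background by subtracting a low-pass filtered copy of the image and thresholding the remainder. The sketch below assumes a simple box filter and dark ridges; the subsequent directional filtering and the skeletonization of step 335 are not shown.

    import numpy as np

    def binarize(img, kernel=9):
        """Preliminary binarization: subtract a box low-pass filter, then threshold."""
        img = img.astype(np.float64)
        pad = kernel // 2
        padded = np.pad(img, pad, mode="edge")
        lowpass = np.zeros_like(img)
        for dy in range(kernel):                  # simple box low-pass filter
            for dx in range(kernel):
                lowpass += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        lowpass /= kernel * kernel
        highpass = img - lowpass
        return (highpass < 0).astype(np.uint8)    # ridges assumed darker than background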
A minutia extraction step 340 is next executed upon the image data FP that has been skeletonized. Fingerprints are characterized by various minutia which are particular patterns of the ridges. Two basic types of minutia are a bifurcation 400, or branch, shown in Figure 9(a), wherein a ridge line 402 divides into two ridge lines, 403 and 405, and an end 410, shown in Figure 9(b), wherein a ridge line 412 ends. Each minutia is characterized as a vector represented by a minutia data triplet X, Y, and A wherein X and Y represent the location of the minutia and A is an angle of a vector of the direction of the minutia as shown in Figures 9(a) and 9(b).
In a preferred embodiment of the present invention, distinction between end minutia 410 and bifurcation minutia 400 is not made. It is found that exclusion of such distinction results in a reduction of data and reduced processing needs and time, while still providing acceptable reliability of fingerprint comparison. Alternatively, distinction may be made with an associated increase in processing.
The minutia extraction step 340 further proceeds with exclusion of minutia that are too closely located. Referring to Figure 10, two end minutia at (x1, y1) and (x2, y2), respectively, and represented by vectors (p1,q1) and (p2,q2), respectively, are shown. First, a determination is made as to whether the two minutia are within a threshold distance. This threshold distance is optionally a distance r used to determine matching minutia and discussed below, a fixed distance, or another distance based on the mean ridge line separation distance. When two minutia are within the given threshold distance, a determination is made whether the angle between the two vectors (p1,q1) and (p2,q2) is within a given threshold of 180° and the angle between (p2,q2) and (x2-x1, y2-y1) is within a given threshold of 0. If two minutia satisfy the aforesaid criteria they are excluded because they are too close and aligned in a nearly straight line. As a result of the minutia extraction process, the print image FP is now represented by a data set defined as FP={Q, N, F(i,j), X(k), Y(k), A(k)} wherein N is the total number of minutia for the fingerprint FP, and X(k), Y(k) and A(k) are the data triplet representing the k-th minutia. The minutia extraction is advantageous in reducing the amount of data to be processed and thereby reducing the processing time and requirements.
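The exclusion test of Figure 10 can be sketched as follows; the distance and angle tolerances are illustrative assumptions rather than values taken from the specification.

    import math

    def angle_between(ax, ay, bx, by):
        """Unsigned angle in degrees between vectors (ax, ay) and (bx, by)."""
        dot = ax * bx + ay * by
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        if na == 0 or nb == 0:
            return 0.0
        c = max(-1.0, min(1.0, dot / (na * nb)))
        return math.degrees(math.acos(c))

    def should_exclude(m1, m2, dist_thresh=10.0, opp_tol=30.0, line_tol=30.0):
        """True when two minutia are close, nearly opposite and nearly in line."""
        (x1, y1, p1, q1), (x2, y2, p2, q2) = m1, m2   # positions and direction vectors
        if math.hypot(x2 - x1, y2 - y1) > dist_thresh:
            return False
        nearly_opposite = abs(angle_between(p1, q1, p2, q2) - 180.0) <= opp_tol
        nearly_in_line = angle_between(p2, q2, x2 - x1, y2 - y1) <= line_tol
        return nearly_opposite and nearly_in_line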
Processing next proceeds to a matching process step 345 wherein the print image data FP is compared to image data in the database. FP1 now refers to the image data of the input fingerprint and FP2 refers to print image data of a fingerprint retrieved from the database in database retrieval step 347. Likewise, in this description other variables are appended with 1 or 2 to represent the respective fingerprint.
It is necessary to find the best alignment of the directional images DI1 and DI2 of F1(i,j) and F2(i,j). Data F1(fa, fdx, fdy)(i,j) is now calculated wherein a rotation by angle fa and a shift by distances fdx and fdy is effected in an orthogonal transformation of F1(i,j). After the transformation of F1, a comparison of F1(fa, fdx, fdy)(i,j) with F2(i,j) is then made wherein the difference in orientations of corresponding cells of the directional images DI1 and DI2 is calculated as DifDI. DifDI is calculated as the sum of all angular differences between corresponding cells. The values of fa, fdx, fdy are iteratively varied and for each permutation thereof the transformation of F1(fa, fdx, fdy)(i,j) is made and compared with F2(i,j) to find a DifDI for each set of fa, fdx, fdy values. A set of fa, fdx, fdy values is then chosen for which DifDI is minimal. The chosen set of fa, fdx, fdy represents the best shifting parameters for shifting the directional image DI1 to effect the best matching directional alignment of DI1 and DI2. Subsequent alignment of minutia for matching purposes uses the chosen set of fa, fdx, fdy as a starting point for adjustments. Additionally, BI is determined as the number of cells (i,j) of either DI1 or DI2 that are not UnDir.
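One assumed realization of the alignment search is an exhaustive scan over a small grid of rotations and shifts, as sketched below. Note that the specification defines DifDI as a plain sum of angular differences; the sketch normalizes by the number of overlapping cells so that comparisons with different overlaps remain comparable, which is an adjustment of this example rather than part of the described method. The angle and shift ranges are illustrative.

    import math

    UNDIR = 181

    def ang_diff(a, b):
        d = abs(a - b) % 180
        return min(d, 180 - d)

    def transform_cell(i, j, fa_deg, fdx, fdy, shape):
        """Rotate cell (i, j) about the grid centre by fa_deg, then shift by (fdx, fdy)."""
        ci, cj = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
        r = math.radians(fa_deg)
        di_, dj_ = i - ci, j - cj
        ti = ci + di_ * math.cos(r) - dj_ * math.sin(r) + fdy
        tj = cj + di_ * math.sin(r) + dj_ * math.cos(r) + fdx
        return int(round(ti)), int(round(tj))

    def best_alignment(f1, f2, angles=range(-20, 21, 5), shifts=range(-4, 5)):
        """Return ((fa, fdx, fdy), DifDI) minimizing the per-cell directional difference."""
        rows, cols = len(f1), len(f1[0])
        best = (None, float("inf"))
        for fa in angles:
            for fdy in shifts:
                for fdx in shifts:
                    total, count = 0, 0
                    for i in range(rows):
                        for j in range(cols):
                            if f1[i][j] == UNDIR:
                                continue
                            ti, tj = transform_cell(i, j, fa, fdx, fdy, (rows, cols))
                            if 0 <= ti < rows and 0 <= tj < cols and f2[ti][tj] != UNDIR:
                                total += ang_diff((f1[i][j] + fa) % 180, f2[ti][tj])
                                count += 1
                    if count:
                        dif_di = total / count
                        if dif_di < best[1]:
                            best = ((fa, fdx, fdy), dif_di)
        return best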
A directional difference decision step 350 is next executed wherein the minimal DifDI for the chosen set of fa, fdx, fdy is compared against a threshold DifDI_TH which may be a set threshold or a threshold based on BI. If DifDI exceeds the threshold DifDI_TH, then it is determined that the correspondence level, or matching level, between the directional images is insufficient to warrant further comparison of FP1 and FP2, a different fingerprint image data is chosen for FP2, and processing returns to the beginning of the matching process step 345. If DifDI is less than the threshold, operation proceeds to similarity measure calculation step 355. Next, the chosen set of fa, fdx, fdy for the orthogonal transformation is applied as (fdx*Fstepx, fdy*Fstepy and fa) to the minutia data triplets X1(k), Y1(k), and A1(k) of FP1, where k represents the k-th minutia. The transformed minutia data triplets of print image data FP1 are then grouped into clusters each containing not less than a given number of minutia, preferably seven. Referring to Figure 11(a), FP1 is illustrated as being divided into four clusters CS1, CS2, CS3, and CS4, which each contain the given number of minutia (not shown). Figure 11(a) is a simplified depiction of the process in that the clusters do not necessarily cover square regions of the print image and the number of clusters is not limited to four. The clusters may be thought of as regional groupings of minutia.
Referring now to Figure 11(b), for each of the clusters CS1, CS2, CS3, and CS4 on a cluster by cluster basis, X1(k), Y1(k) of the minutia of the given cluster are all iteratively shifted in the x and y directions by values dr, wherein dr is varied within a range R, such that abs(dr) < R, and a comparison of the shifted X1(k), Y1(k), A1(k) is made against all minutia in a BI grouping of FP2 for each set of dr values to identify minutia of FP1 matching those of FP2. A pair of minutia are considered matched when the distance between them is less than a threshold r discussed below. The BI grouping of FP2 is the group of cells in FP2 that are not UnDir. For each shift of a cluster, a similarity measure Smt is taken, which is the sum of the following term for each set of matched minutia in the cluster:

    m(x1,y1; x2,y2) = a * ∫^d exp(-z/2) dz + δ,

where d = (x1-x2)^2 + (y1-y2)^2 and a, δ and σ are empirical values. In an embodiment of the invention, a is 150, δ is set equal to R1, where R1 equals 30, or R2, where R2 equals 20, R1 and R2 being discussed below, and σ is set equal to 4. These values are exemplary and alterable without departing from the scope and spirit of the present invention. For each cluster, the set of dr values yielding the greatest similarity measure Smt is selected and the total sum of the greatest similarity measures of each cluster is taken to find a similarity measure Smt(FP1, FP2) for the comparison of FP1 to FP2. As noted above, comparison of fingerprints is often hampered by various environmental and physiological factors. The division of FP1 into clusters provides compensation in part for such factors as stretching and shrinking of the skin. For a given cluster, the total distance difference due to stretching or shrinkage between two minutia is limited due to the limited size of the cluster area. Thus, adverse effects of shrinking and stretching are minimized. Accordingly, individual cluster shifting and comparison are a preferred embodiment of the present invention. Alternatively, division of FP1 into clusters may be omitted and shifting and comparison of FP1 as a whole effected.
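The cluster-wise shifting and the similarity term can be sketched as follows. Because the integration limits of the term m(x1,y1;x2,y2) are only partially legible in the source, the sketch substitutes the closed form a*exp(-d/2) + δ for the integral term; that substitution, together with the cluster contents, r and R values, is an assumption of this example rather than the patented formula.

    import math

    def match_cluster(cluster, reference, r=8.0, R=4, a=150.0, delta=30.0):
        """Best similarity obtained by shifting one cluster of FP1 minutia over FP2.

        cluster and reference are lists of (x, y, angle) triplets; angles are not
        used in this simplified, distance-only matching."""
        best = 0.0
        for dx in range(-R, R + 1):
            for dy in range(-R, R + 1):
                smt = 0.0
                for (x1, y1, _a1) in cluster:
                    for (x2, y2, _a2) in reference:
                        d = (x1 + dx - x2) ** 2 + (y1 + dy - y2) ** 2
                        if d <= r * r:                 # pair counted as matched
                            smt += a * math.exp(-d / 2.0) + delta
                            break
                best = max(best, smt)
        return best

    def overall_similarity(clusters, reference, dif_di):
        smt_total = sum(match_cluster(c, reference) for c in clusters)
        return smt_total - dif_di          # Nmt = Smt(FP1, FP2) - DifDI, as described below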
The maximum similarity measure Smt(FP1, FP2) is generated for the best comparisons of all clusters of FP1 with FP2, along with a number Nmat of matched minutia, and a number Ntot which is the total number of minutia within the BI grouping of FP1. An overall similarity measure for the comparison of FP1 with FP2 is calculated as follows:
Nmt(R, r, BI, Ntot) = Smt(FP1, FP2) - DifDI, where Smt(FP1, FP2) is the sum of the best Smt of each cluster. Thus, this takes into account the maximal number of matched minutia, DifDI and the statistical peculiarities of the distribution of distances.
Processing then proceeds to similarity decision step 360 wherein Nmt(R, r, BI, Ntot) is compared with a threshold Thr(R, r, BI, Ntot). If Nmt(R, r, BI, Ntot) is greater than the threshold Thr(R, r, BI, Ntot), it is determined that FP1 matches FP2 and a match is indicated in match indication step 365. If Nmt(R, r, BI, Ntot) is less than or equal to the threshold Thr(R, r, BI, Ntot), it is determined that FP1 does not match FP2 and execution proceeds to the database retrieval step 347 for the selection of another set of print data from the database for use as FP2, and the process returns to the matching process step 345. Indication of a match is then used to permit access to the computer 50 in general or to specific functions thereof.
In a preferred embodiment of the invention, the threshold Thr(R, r, BI, Ntot) is determined on the basis of threshold training using a sample pool of fingerprints from a number of individuals. The sample pool is composed of a number of samples, or standards, from each individual in the pool. The number of samples from each individual in one example is 4 and the number of individuals is in a range of 100 to 1000. The number of samples and individuals may be varied from the exemplary values and range without departing from the scope and spirit of the present invention. The process steps 305 through 355 of Figure 6 are then executed for each print with every print being compared to every other print. Since the sample pool is known, comparisons of prints from the same individual and comparisons of prints from different individuals are known.
In performing the threshold training, n variations of R and r are used and are shown below as R1, R2 and r1, r2 for an example where n=2. For example, values are set such that R1 < R2 and r1 < r2 where R1 = 2*MID, r1 = MID, R2 = 3.5 to 4*MID, and r2 = 2*MID. MID is the mean inter-ridge distance of the prints in the sample pool. The following values are found:
NmtS(R1, r1, BI, Ntot), NmtA(R1, r1, BI, Ntot), and NmtS(R2, r2, BI, Ntot), NmtA(R2, r2, BI, Ntot), where NmtS is the number of matched minutia for prints compared from the same individual while NmtA is the number of matching minutia resulting from the comparison of fingerprints from different individuals. For a given BI, Ntot (within a subrange of appropriate quantization), BestA(n, BI, Ntot) is set to the maximum NmtA(Rn, rn, BI, Ntot) of all the comparisons of fingerprints from different individuals, and MinNmtS(Rn, rn, BI, Ntot) is set to the minimum NmtS(Rn, rn, BI, Ntot) of all comparisons of fingerprints from the same individual, for n = 1, 2, etc. The thresholds are then calculated as follows: Thr(n, BI, Ntot) = (BestA(n, ...) + MinNmtS(Rn, rn, ...)) / 2, where NmtS(Rn, ...) > BestA(Rn, ...) / 2. In conjunction with the above discussion of threshold calculations, the similarity decision step 360 produces a positive match indication if, for the current BI, Ntot:
Nmt(R1, r1, BI, Ntot) > Thr(1, BI, Ntot), or Nmt(R2, r2, BI, Ntot) > Thr(2, BI, Ntot). If this condition is not found, then a dichotomy analysis gives some correction. The results of identical and non-identical matchings are considered as two classes of patterns and the pairs of values Nmt(R1, r1, ...), Nmt(R2, r2, ...) as feature coordinates. The dichotomies are performed by second order threshold functions which are calculated according to chapter 2.3 in the classical book by J. Tou and R. Gonzalez, "Pattern Recognition Principles," Addison-Wesley Publ., 1974, which is incorporated herein by reference for its relevant dichotomy teachings.
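The decision of step 360 with the two (R, r) settings can be summarized in a few lines. In this sketch, thr1 and thr2 stand for Thr(1, BI, Ntot) and Thr(2, BI, Ntot), and the fallback dichotomy on the pair of Nmt values is only indicated by a callable placeholder, not implemented.

    def is_match(nmt1, nmt2, thr1, thr2, dichotomy=None):
        """Positive match if either Nmt exceeds its trained threshold; otherwise
        optionally fall back to a second order threshold (dichotomy) function."""
        if nmt1 > thr1 or nmt2 > thr2:
            return True
        if dichotomy is not None:
            return dichotomy(nmt1, nmt2)
        return False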
The complete description to be stored in the database is a multilevel structure of 4 (or more) FP data sets taken from the different applications of the same FP. Each level of the structure corresponds to minutia appearance frequencies for all FP codes.
Optionally, instead of using thresholds for the similarity comparison as discussed above, fixed values may be chosen and used as threshold values.
The database of fingerprints of individuals for whom identification is required is created by a registration process. The registration process entails a given individual having their fingerprints scanned a number of times, for example four. Of the four scannings, the scanning producing the greatest number of minutia is then selected for the database.
The present invention further includes use of the above fingerprint minutia extraction and comparison process in conjunction with a cryptographic protection process. For this aspect of the invention, the computer 50, also referred to as the client, will send fingerprint data to the remote computer 51, also referred to as the server, over the link 53 which may be, for example, a link over the Internet. Thus, security protection for data sent over the link 53 is required.
There are three different cryptographic procedures used in the cryptographic process. As they are not used simultaneously, they are described below separately. All cryptographic parts are written in italic font. The cryptographic method employed is RSA encryption.
I. User registration
In order to use the cryptographic process, the user must first register his fingerprint with the server. In order to maintain security, the fingerprint data must be encrypted to prevent unauthorized interception thereof. The following steps are used:
1. User fills in a registration form including a UserID. Other information such as Name, E-mail address, etc. may be included.
2. User scans his fingerprint into the computer 50 via the biometric input device where it is stored as image data. The image data is typically on the order of 64 KB.
3. The computer 50 then converts the image data of the finger to the data set defined as FP={Q, N, F(i,j), X(k), Y(k), A(k)} using processing steps 310 through 340 shown in Figure 6. This data set is also referred to herein as a passport. Optionally, components of the data set may be omitted, such as F(i,j), so the passport may be shortened to about 1.2 KB.
4. The client, computer 50, then sends a request for the public key to the server via the link 53.
5. Server sends its public key KE via the link 53.
6. The client encrypts its passport and its UserID using the RSA algorithm and the public key KE. In a preferred embodiment the length of the key is 512 bits:
C = RSA.EncodePublic(KE, passport, UserID)
7. The computer 50 sends C to the remote computer 51 via the link 53.
8. The remote computer 51 decrypts the message using its secret key KD:
M = passport + UserID = RSA.EncodeSecret(KD, C)
9. The remote computer 51 then adds the UserID and passport to the database.
II. User authorization
The user authorization process is used where a user wishes to gain access to the remote computer on the basis of his fingerprint matching one in the database.
1. User scans his fingerprint image data into the computer 50.
2. The computer 50 converts the image of the finger to the passport using processing steps 310 through 340 shown in Figure 6.
3. The computer 50 sends a request over the link 53 to the remote computer 51, the server, for the server's public key.
4. The remote computer 51 sends its public key KE to the computer 50.
5. The computer 50 encrypts the passport and UserID using the RSA algorithm with the public key KE: C = RSA.EncodePublic(KE, passport, UserID)
6. The computer 50 sends C to the remote computer 51 via the link 53.
7. The remote computer 51 decrypts the message using its secret key KD: M = passport + UserID = RSA.EncodeSecret(KD, C)
8. The remote computer 51 then searches the database for the UserID, finds the corresponding passport, and executes steps 345 through 365 of Figure 6 using the passport retrieved from the database as FP2. Optionally, step 350 is omitted. If the comparison of step 360 is positive, access is authorized. If the UserID does not exist or the comparison result of step 360 is negative, authorization for access is refused.
III. Installation of the server and addition of new users is effected by the following steps:
1. Installation of normal Web-server components.
2. Generation of the public and secret keys for the administrator of the server: first a random integer is generated, possibly based on the administrator's fingerprint, which is in part random; then a deterministic algorithm is run to determine the public and secret keys.
3. When a new user is being registered, the server takes its UserID and passport and encrypts them with the administrator's public key. Usage of two different keys makes it more difficult to corrupt fingerprint data since an intruder must obtain both public and private keys to complete his attack. Different servers will have different keys to ensure that corrupted fingerprint data (i.e., stolen from some server) could not be used on other servers. The 512-bit RSA keys are extremely difficult to crack. In fact, keys of that length are not known to have been broken, so current cryptography declares them as keys for long-term secret information (30-50 years or longer). The average time of encryption of a passport (client side) is less than a second. The average time of decryption of a passport (server side) is about 2 seconds, so it is reasonable to predict that network delays would be more significant. Besides, servers are usually more powerful than the client computers.
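The registration and authorization exchanges above can be illustrated with a deliberately tiny, self-contained textbook RSA example. It is only a sketch: the primes are toy values, there is no padding, the passport is a short placeholder string, and a real deployment would use a vetted RSA implementation with the key lengths discussed above.

    def make_keys():
        p, q, e = 61, 53, 17                  # toy primes for demonstration; n = 3233
        n, phi = p * q, (p - 1) * (q - 1)
        d = pow(e, -1, phi)                   # secret exponent (KD)
        return (e, n), (d, n)                 # public key KE, secret key KD

    def encrypt(public_key, data: bytes):
        e, n = public_key
        return [pow(b, e, n) for b in data]   # byte-by-byte, demonstration only

    def decrypt(secret_key, blocks):
        d, n = secret_key
        return bytes(pow(c, d, n) for c in blocks)

    if __name__ == "__main__":
        KE, KD = make_keys()                          # server generates its key pair
        passport = b"Q;N;minutia-triplets..."         # client-side passport (about 1.2 KB in practice)
        user_id = b"user42"
        C = encrypt(KE, passport + b"|" + user_id)    # client encrypts with the public key
        M = decrypt(KD, C)                            # server decrypts with its secret key
        assert M == passport + b"|" + user_id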
A further aspect of the present invention provides software for working in the Windows environment. In particular, a protection icon is provided which an authorized user, one whose passport has produced a positive comparison, may move and drop on a file or program object to require that future access thereto be permitted only when a positive fingerprint comparison has been executed. Optionally, the user may input a list of UserIDs for whom access will be allowed.

Having described preferred embodiments of the invention with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention as defined in the appended claims.
Now that the invention has been described,

Claims

1. A biometric comparison method comprising the steps of:
(a) scanning a fingerprint and digitizing scanning signals to produce a matrix of print image data representing pixels;
(b) dividing said print image data into cells, each including a number of pixel data for contiguous pixels;
(c) calculating a matrix of directional image data DI using gradient statistics applied to said cells wherein said directional image data DI includes for each of said cells a cell position indicator and one of a cell vector indicative of a direction of ridge lines and a unidirectional flag indicative of a nondirectional calculation result;
(d) skeletonizing said print image data;
(e) extracting minutia from said print image data and producing a minutia data set comprised of data triplets for each minutia extracted including minutia position data and minutia direction data;
(f) providing reference fingerprint data from a database wherein said reference fingerprint data includes reference directional image data DI and a reference minutia data set;
(g) performing successive comparisons of said directional image data DI with said reference directional image data DI and determining a directional difference DifDI for each of said successive comparisons wherein for each of said successive comparisons one of said directional image data DI and said reference directional image data DI is positionally shifted by adding position shift data;
(h) determining for which of said successive comparisons said directional difference DifDI is the least and selecting said position shift data thereof as initial minutia shift data;
(i) positionally shifting minutia data by applying said initial minutia shift data to one of said minutia data set and said reference minutia data set to initially positionally shift said minutia position data and said minutia orientation data;
(j) performing successive comparisons of said minutia data set with said reference minutia data set following said positionally shifting minutia data and determining matching minutia based on a minutia distance criteria, a number of matching minutia, and a similarity measure indicative of correspondence of said matching minutia for each of said successive comparisons wherein, for each of said successive comparisons, one of said minutia data set and said reference minutia data set is positionally shifted within a minutia shift range R by adding minutia position shift data;
(k) determining a maximum similarity measure of said similarity measures of said successive comparisons; and
(l) determining whether said maximum similarity measure is above a similarity threshold and indicating said reference fingerprint data and said fingerprint data are from the same fingerprint when said maximum similarity measure is above said similarity threshold.
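The following Python sketch is offered only as an illustration of the two-stage matching recited in steps (g) through (l): the directional images are aligned over a set of trial shifts, the shift with the least directional difference DifDI is taken as the initial minutia shift, and the similarity measure is then maximized within a minutia shift range R. The coarse directional-image alignment makes the more expensive minutia search cheap, since only shifts near the best DifDI need be tried. The data layout ({(cell_x, cell_y): angle or None} for DI, (x, y, direction) triplets for minutia), the tolerances, and the similarity measure itself are simplified assumptions, and cell shifts are treated as if already scaled to minutia coordinates.

    import math
    from itertools import product

    def ang_diff(a, b, period=math.pi):
        # Smallest difference between two directions, treating them as periodic.
        d = abs(a - b) % period
        return min(d, period - d)

    def dif_di(di_a, di_b, dx, dy):
        # Directional difference DifDI: mean angular difference over overlapping
        # cells; nondirectional cells are stored as None and skipped.
        total, n = 0.0, 0
        for (cx, cy), ang_a in di_a.items():
            ang_b = di_b.get((cx + dx, cy + dy))
            if ang_a is None or ang_b is None:
                continue
            total += ang_diff(ang_a, ang_b)
            n += 1
        return total / n if n else float("inf")

    def initial_shift(di_a, di_b, max_shift=4):
        # Steps (g)-(h): try every trial shift and keep the one with the least DifDI.
        return min(product(range(-max_shift, max_shift + 1), repeat=2),
                   key=lambda s: dif_di(di_a, di_b, *s))

    def similarity(min_a, min_b, dx, dy, dist_tol=8.0, dir_tol=0.3):
        # Step (j): fraction of probe minutia that find a match in the reference
        # under the shift, using a distance criterion and a direction tolerance
        # (minutia directions are periodic over 2*pi).
        matched = sum(
            1 for (x, y, t) in min_a
            if any(math.hypot(x + dx - u, y + dy - v) <= dist_tol
                   and ang_diff(t, p, period=2 * math.pi) <= dir_tol
                   for (u, v, p) in min_b)
        )
        return matched / max(len(min_a), 1)

    def same_finger(di_a, min_a, di_b, min_b, R=3, threshold=0.4):
        # Step (i): apply the initial shift from the directional images; steps
        # (j)-(l): refine within range R, take the maximum similarity measure and
        # compare it against the similarity threshold.
        bx, by = initial_shift(di_a, di_b)
        best = max(similarity(min_a, min_b, bx + rx, by + ry)
                   for rx, ry in product(range(-R, R + 1), repeat=2))
        return best > threshold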
2. The biometric comparison method of claim 1 further comprising the steps of:
(c1) identifying as part of step (c) a directional group of cells comprising all cells of said cells that do not have said unidirectional flag associated therewith; and
(j1) excluding from said successive comparisons in step (j) minutia of said minutia data set and said reference minutia data set located in or positionally aligned with said cells that have said unidirectional flag associated therewith.
3. The biometric comparison method of claim 1 wherein: step (j) includes: dividing said minutia data set into minutia data set clusters formed of contiguous ones of said cells and each including a predetermined number of said minutia before conducting said successive comparisons; and conducting said successive comparisons for each of said minutia data set clusters and determining for each of said minutia data set clusters a maximum similarity measure; and step (k) includes determining said maximum similarity measure as a sum of said maximum similarity measures of each of said minutia data set clusters.
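A sketch of the clustering refinement of claim 3 might look as follows; cluster_minutia, clustered_similarity, the cell_size and per_cluster values, and the similarity_fn callback (taking a cluster, the reference minutia set and a trial shift, as in the sketch after claim 1) are all assumptions:

    from collections import defaultdict

    def cluster_minutia(minutia, cell_size=32, per_cluster=8):
        # Group minutia triplets (x, y, direction) by the cell region they fall in,
        # then chop each region into clusters of at most per_cluster minutia.
        regions = defaultdict(list)
        for m in minutia:
            regions[(int(m[0] // cell_size), int(m[1] // cell_size))].append(m)
        clusters = []
        for group in regions.values():
            for i in range(0, len(group), per_cluster):
                clusters.append(group[i:i + per_cluster])
        return clusters

    def clustered_similarity(minutia_a, minutia_b, similarity_fn, shifts):
        # Claim 3: match each cluster separately over the trial shifts and sum the
        # per-cluster maxima to obtain the overall maximum similarity measure.
        shifts = list(shifts)  # allow the shifts to be reused for every cluster
        return sum(
            max(similarity_fn(cluster, minutia_b, dx, dy) for dx, dy in shifts)
            for cluster in cluster_minutia(minutia_a)
        )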
4. The biometric comparison method of claim 1 wherein step (e) includes excluding from further processing pairs of said minutia located within a minutia exclusion distance of one another and having minutia direction data within a direction exclusion limit of being in opposite directions.
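As an illustrative sketch only of the exclusion rule of claim 4 (the function name prune_opposed_pairs, the exclusion distance and the direction limit are assumptions), pairs of minutia that lie within the exclusion distance of one another and point in roughly opposite directions, as typically produced by ridge breaks, are removed before matching:

    import math

    def prune_opposed_pairs(minutia, excl_dist=6.0, dir_limit=0.35):
        # Drop pairs of minutia that lie within the exclusion distance of each
        # other and whose directions are within the exclusion limit of being
        # exactly opposite (pi radians apart); such pairs usually stem from ridge
        # breaks rather than true ridge ends or bifurcations.
        drop = set()
        for i, (x1, y1, t1) in enumerate(minutia):
            for j in range(i + 1, len(minutia)):
                x2, y2, t2 = minutia[j]
                close = math.hypot(x1 - x2, y1 - y2) <= excl_dist
                d = abs(t1 - t2) % (2 * math.pi)
                opposed = abs(min(d, 2 * math.pi - d) - math.pi) <= dir_limit
                if close and opposed:
                    drop.update((i, j))
        return [m for k, m in enumerate(minutia) if k not in drop]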
5. The biometric comparison method of claim 1 wherein said minutia extraction step (e) extracts minutia limited to ends and bifurcations.
6. The biometric comparison method of claim 5 wherein, in said minutia extraction step (e), said minutia data set excludes data distinguishing ends and bifurcations.
PCT/US2000/013322 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor WO2000070542A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2000618914A JP2003536121A (en) 1999-05-14 2000-05-12 Apparatus and method for inputting, comparing, authenticating and controlling access to biometric data
AU51350/00A AU5135000A (en) 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/312,002 US6282304B1 (en) 1999-05-14 1999-05-14 Biometric system for biometric input, comparison, authentication and access control and method therefor
US09/312,002 1999-05-14

Publications (1)

Publication Number Publication Date
WO2000070542A1 true WO2000070542A1 (en) 2000-11-23

Family

ID=23209426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/013322 WO2000070542A1 (en) 1999-05-14 2000-05-12 Biometric system for biometric input, comparison, authentication and access control and method therefor

Country Status (4)

Country Link
US (2) US6282304B1 (en)
JP (1) JP2003536121A (en)
AU (1) AU5135000A (en)
WO (1) WO2000070542A1 (en)

Families Citing this family (135)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8882666B1 (en) 1998-05-08 2014-11-11 Ideal Life Inc. Personal health monitoring and/or communication system
US6807291B1 (en) * 1999-06-04 2004-10-19 Intelligent Verification Systems, Inc. Animated toy utilizing artificial intelligence and fingerprint verification
US6981016B1 (en) * 1999-06-11 2005-12-27 Visage Development Limited Distributed client/server computer network
JP3679953B2 (en) * 1999-09-14 2005-08-03 富士通株式会社 Personal authentication system using biometric information
US8479012B1 (en) 1999-10-19 2013-07-02 Harris Technology, Llc Using biometrics as an encryption key
AU4137601A (en) 1999-11-30 2001-06-12 Barry Johnson Methods, systems, and apparatuses for secure interactions
US6742129B1 (en) * 1999-12-08 2004-05-25 Carrier Corporation Software security mechanism
EP1237091A4 (en) * 1999-12-10 2006-08-23 Fujitsu Ltd Personal authentication system and portable electronic device having personal authentication function using body information
US7761715B1 (en) * 1999-12-10 2010-07-20 International Business Machines Corporation Semiotic system and method with privacy protection
US6453301B1 (en) 2000-02-23 2002-09-17 Sony Corporation Method of using personal device with internal biometric in conducting transactions over a network
US20020095608A1 (en) * 2000-11-06 2002-07-18 Slevin Richard S. Access control apparatus and method for electronic device
US20040069846A1 (en) * 2000-11-22 2004-04-15 Francis Lambert Method and apparatus for non-intrusive biometric capture
US8462994B2 (en) * 2001-01-10 2013-06-11 Random Biometrics, Llc Methods and systems for providing enhanced security over, while also facilitating access through, secured points of entry
US20020091937A1 (en) * 2001-01-10 2002-07-11 Ortiz Luis M. Random biometric authentication methods and systems
US7921297B2 (en) 2001-01-10 2011-04-05 Luis Melisendro Ortiz Random biometric authentication utilizing unique biometric signatures
EP1359496B1 (en) * 2001-02-09 2017-07-19 Sony Corporation Input device
US20110090047A1 (en) * 2001-02-20 2011-04-21 Patel Pankaj B Biometric switch and indicating means
US20020124190A1 (en) * 2001-03-01 2002-09-05 Brian Siegel Method and system for restricted biometric access to content of packaged media
US20020133499A1 (en) * 2001-03-13 2002-09-19 Sean Ward System and method for acoustic fingerprinting
US6603462B2 (en) * 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
US20020145507A1 (en) * 2001-04-04 2002-10-10 Foster Ronald R. Integrated biometric security system
US20020146157A1 (en) * 2001-04-09 2002-10-10 Goodman Mitchell E. Fingerprint acquisition assembly using prism and camera
US20040085300A1 (en) * 2001-05-02 2004-05-06 Alec Matusis Device and method for selecting functions based on intrinsic finger features
JP4602606B2 (en) * 2001-08-15 2010-12-22 ソニー株式会社 Authentication processing system, authentication processing method, authentication device, and computer program
WO2003017244A1 (en) * 2001-08-17 2003-02-27 Multidigit, Inc. System and method for selecting actions based on the identification of user's fingers
US7103576B2 (en) * 2001-09-21 2006-09-05 First Usa Bank, Na System for providing cardless payment
JP2003141267A (en) * 2001-11-05 2003-05-16 Sony Corp System and method for correspondence education
US8065713B1 (en) 2001-12-12 2011-11-22 Klimenty Vainstein System and method for providing multi-location access management to secured items
US7565683B1 (en) 2001-12-12 2009-07-21 Weiqing Huang Method and system for implementing changes to security policies in a distributed security system
US7380120B1 (en) 2001-12-12 2008-05-27 Guardian Data Storage, Llc Secured data format for access control
US8006280B1 (en) 2001-12-12 2011-08-23 Hildebrand Hal S Security system for generating keys from access rules in a decentralized manner and methods therefor
US7921284B1 (en) 2001-12-12 2011-04-05 Gary Mark Kinghorn Method and system for protecting electronic data in enterprise environment
US7921450B1 (en) 2001-12-12 2011-04-05 Klimenty Vainstein Security system using indirect key generation from access rules and methods therefor
US10360545B2 (en) 2001-12-12 2019-07-23 Guardian Data Storage, Llc Method and apparatus for accessing secured electronic data off-line
US7921288B1 (en) 2001-12-12 2011-04-05 Hildebrand Hal S System and method for providing different levels of key security for controlling access to secured items
US10033700B2 (en) 2001-12-12 2018-07-24 Intellectual Ventures I Llc Dynamic evaluation of access rights
US7930756B1 (en) 2001-12-12 2011-04-19 Crocker Steven Toye Multi-level cryptographic transformations for securing digital assets
US7260555B2 (en) 2001-12-12 2007-08-21 Guardian Data Storage, Llc Method and architecture for providing pervasive security to digital assets
US7178033B1 (en) 2001-12-12 2007-02-13 Pss Systems, Inc. Method and apparatus for securing digital assets
US7950066B1 (en) 2001-12-21 2011-05-24 Guardian Data Storage, Llc Method and system for restricting use of a clipboard application
US8176334B2 (en) 2002-09-30 2012-05-08 Guardian Data Storage, Llc Document security system that permits external users to gain access to secured files
US7418255B2 (en) * 2002-02-21 2008-08-26 Bloomberg Finance L.P. Computer terminals biometrically enabled for network functions and voice communication
US7181048B2 (en) * 2002-06-28 2007-02-20 Hewlett-Packard Development Company, L.P. Biometric capture adapter for digital imaging devices
US7155416B2 (en) * 2002-07-03 2006-12-26 Tri-D Systems, Inc. Biometric based authentication system with random generated PIN
CA2494491C (en) * 2002-07-29 2010-11-02 C-Signature Ltd. Method and apparatus for electro-biometric identity recognition
US20030191764A1 (en) * 2002-08-06 2003-10-09 Isaac Richards System and method for acoustic fingerpringting
EP3547599A1 (en) 2002-08-06 2019-10-02 Apple Inc. Methods for secure enrollment and backup of personal identity credentials into electronic devices
CN1238809C (en) * 2002-09-04 2006-01-25 长春鸿达光电子与生物统计识别技术有限公司 Fingerprint identification method as well as fingerprint controlling method and system
US20040050929A1 (en) * 2002-09-16 2004-03-18 Fayfield Robert W. Extranet security system and method
US20040080492A1 (en) * 2002-10-23 2004-04-29 Chen Hung Hua Fingerprint access control mouse
US7046234B2 (en) 2002-11-21 2006-05-16 Bloomberg Lp Computer keyboard with processor for audio and telephony functions
US20040101172A1 (en) * 2002-11-26 2004-05-27 Stmicroelectronics, Inc. Imaging system with locator bar for accurate fingerprint recognition
US20040172403A1 (en) * 2002-11-26 2004-09-02 Steele Rhea L. Method and system for automated tracking of persons at remote activities
US20040101171A1 (en) * 2002-11-26 2004-05-27 Stmicroelectronics, Inc. Imaging system with guide for accurate fingerprint recognition
FR2849244B1 (en) * 2002-12-20 2006-03-10 Sagem METHOD FOR DETERMINING THE LIVING CHARACTER OF A CARRIER COMPONENT OF A DIGITAL IMPRINT
US20040125993A1 (en) * 2002-12-30 2004-07-01 Yilin Zhao Fingerprint security systems in handheld electronic devices and methods therefor
US9678967B2 (en) * 2003-05-22 2017-06-13 Callahan Cellular L.L.C. Information source agent systems and methods for distributed data storage and management using content signatures
US20070276823A1 (en) * 2003-05-22 2007-11-29 Bruce Borden Data management systems and methods for distributed data storage and management using content signatures
US8707034B1 (en) 2003-05-30 2014-04-22 Intellectual Ventures I Llc Method and system for using remote headers to secure electronic files
CA2724292C (en) 2003-05-30 2014-09-30 Privaris, Inc. An in-circuit security system and methods for controlling access to and use of sensitive data
US8034294B1 (en) 2003-07-15 2011-10-11 Ideal Life, Inc. Medical monitoring/consumables tracking device
US8571880B2 (en) * 2003-08-07 2013-10-29 Ideal Life, Inc. Personal health management device, method and system
US20050044909A1 (en) * 2003-08-28 2005-03-03 Volker Lange Knob cylinder with biometrical sensor
CA2440778A1 (en) * 2003-09-12 2005-03-12 Code Incorporated Digital identification kit
US7245218B2 (en) * 2003-09-12 2007-07-17 Curtis Satoru Ikehara Input device to continuously detect biometrics
US8127366B2 (en) 2003-09-30 2012-02-28 Guardian Data Storage, Llc Method and apparatus for transitioning between states of security policies used to secure electronic documents
WO2005034020A1 (en) * 2003-09-30 2005-04-14 Ultra-Scan Corporation Finger scanner and method of scanning a finger
US7703140B2 (en) 2003-09-30 2010-04-20 Guardian Data Storage, Llc Method and system for securing digital assets using process-driven security policies
SG113483A1 (en) * 2003-10-30 2005-08-29 Ritronics Components S Pte Ltd A biometrics parameters protected usb interface portable data storage device with usb interface accessible biometrics processor
CN1305001C (en) * 2003-11-10 2007-03-14 北京握奇数据系统有限公司 Finger print characteristic matching method in intelligent card
CN1299230C (en) * 2003-11-10 2007-02-07 北京握奇数据系统有限公司 Finger print characteristic matching method based on inter information
US20050149738A1 (en) * 2004-01-02 2005-07-07 Targosky David G. Biometric authentication system and method for providing access to a KVM system
RU2279129C2 (en) * 2004-02-13 2006-06-27 Футроник Технолоджис Компани Лтд Method for recognition of papillary patterns
RU2279128C2 (en) * 2004-02-13 2006-06-27 Футроник Технолоджис Компани Лтд. Method for recognition of papillary pattern
US20050184855A1 (en) * 2004-02-24 2005-08-25 Burchette Robert L.Jr. Fingerprint vehicle access system
US7325727B2 (en) * 2004-09-02 2008-02-05 Weaver Howard C Personal account protection system
JP4340618B2 (en) * 2004-10-08 2009-10-07 富士通株式会社 Biometric information authentication apparatus and method, biometric information authentication program, and computer-readable recording medium recording the biometric information authentication program
US7369700B2 (en) * 2004-10-14 2008-05-06 The Secretary Of State For The Home Department Identifier comparison
EP1800239A1 (en) * 2004-10-14 2007-06-27 Forensic Science Service Ltd A process to improve the quality the skeletonisation of a fingerprint image
US20060083413A1 (en) * 2004-10-14 2006-04-20 The Secretary Of State For The Home Department Identifier investigation
US20060088225A1 (en) 2004-10-26 2006-04-27 The Secretary Of State For The Home Department Comparison
US7624281B2 (en) * 2004-12-07 2009-11-24 Video Products, Inc. System and method for providing access to a keyboard video and mouse drawer using biometric authentication
US7406186B2 (en) * 2005-01-25 2008-07-29 Ruei-Bin Lin Dermatoglyph test system
GB0502990D0 (en) * 2005-02-11 2005-03-16 Sec Dep For The Home Departmen Improvements in and relating to identifier investigation
JP4922288B2 (en) * 2005-03-24 2012-04-25 プリバリス,インコーポレイテッド Biometric device with smart card function
US20060271788A1 (en) * 2005-05-24 2006-11-30 An-Sheng Chang Access method for wireless authentication login system
US20070171027A1 (en) * 2006-01-25 2007-07-26 Odi Security; Llc Biometric anti-theft system and method
US7813531B2 (en) * 2006-05-01 2010-10-12 Unisys Corporation Methods and apparatus for clustering templates in non-metric similarity spaces
US8051468B2 (en) 2006-06-14 2011-11-01 Identity Metrics Llc User authentication system
US7818290B2 (en) 2006-06-14 2010-10-19 Identity Metrics, Inc. System to associate a demographic to a user of an electronic system
US8161530B2 (en) * 2006-07-11 2012-04-17 Identity Metrics, Inc. Behaviormetrics application system for electronic transaction authorization
US8843754B2 (en) * 2006-09-15 2014-09-23 Identity Metrics, Inc. Continuous user identification and situation analysis with identification of anonymous users through behaviormetrics
US8452978B2 (en) * 2006-09-15 2013-05-28 Identity Metrics, LLC System and method for user authentication and dynamic usability of touch-screen devices
EP2947592B1 (en) 2007-09-24 2021-10-27 Apple Inc. Embedded authentication systems in an electronic device
US20090121834A1 (en) * 2007-11-13 2009-05-14 Ari Huostila Biometric association model
US20090150993A1 (en) * 2007-12-10 2009-06-11 Symbol Technologies, Inc. Mobile Device with Frequently Operated Biometric Sensors
DE102008024320B4 (en) * 2008-05-20 2017-02-23 Infineon Technologies Ag Safe procedure for the biometric verification of a person
US20100060419A1 (en) * 2008-09-05 2010-03-11 Smith Gaylan S Biometric Control System and Method For Machinery
US8902044B2 (en) * 2008-09-05 2014-12-02 Gaylon Smith Biometric control system and method for machinery
GB0819069D0 (en) 2008-10-17 2008-11-26 Forensic Science Service Ltd Improvements in and relating to methods and apparatus for comparison
US8300902B2 (en) * 2008-10-20 2012-10-30 Union Community Co., Ltd. Apparatus for distinguishing forged fingerprint and method thereof
JP5382127B2 (en) * 2009-09-11 2014-01-08 富士通株式会社 Biometric authentication device, biometric authentication system, and biometric authentication method
US8351668B2 (en) * 2010-01-11 2013-01-08 Utah State University System and method for automated particle imaging analysis
CN101787824B (en) * 2010-01-28 2013-04-03 南京信息工程大学 Intelligent anti-theft lock system
US8520903B2 (en) * 2010-02-01 2013-08-27 Daon Holdings Limited Method and system of accounting for positional variability of biometric features
US8041956B1 (en) 2010-08-16 2011-10-18 Daon Holdings Limited Method and system for biometric authentication
US9357024B2 (en) * 2010-08-05 2016-05-31 Qualcomm Incorporated Communication management utilizing destination device user presence probability
KR20120018685A (en) * 2010-08-23 2012-03-05 주식회사 팬택 Termianl for recogniging multi user input and control method thereof
US8724861B1 (en) * 2010-12-06 2014-05-13 University Of South Florida Fingertip force, location, and orientation sensor
US9002322B2 (en) 2011-09-29 2015-04-07 Apple Inc. Authentication with secondary approver
US8769624B2 (en) 2011-09-29 2014-07-01 Apple Inc. Access control utilizing indirect authentication
US8750852B2 (en) 2011-10-27 2014-06-10 Qualcomm Incorporated Controlling access to a mobile device
US9066125B2 (en) 2012-02-10 2015-06-23 Advanced Biometric Controls, Llc Secure display
BR112014028774B1 (en) 2012-05-18 2022-05-10 Apple Inc Method, electronic device, computer readable storage medium and information processing apparatus
US9165129B2 (en) * 2012-06-26 2015-10-20 Intel Corporation Keyboard as biometric authentication device
US20160034772A1 (en) * 2013-03-15 2016-02-04 Ellis I. Betensky Method and apparatus for acquiring biometric image
US9330513B2 (en) * 2013-05-31 2016-05-03 Microsoft Technology Licensing, Llc Resource management based on biometric data
US9898642B2 (en) 2013-09-09 2018-02-20 Apple Inc. Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
EP3125193B1 (en) * 2014-03-25 2020-12-23 Fujitsu Frontech Limited Biometric authentication device, biometric authentication method, and program
EP3125195B1 (en) * 2014-03-25 2020-03-11 Fujitsu Frontech Limited Biometric authentication device, biometric authentication method, and program
WO2015145590A1 (en) * 2014-03-25 2015-10-01 富士通フロンテック株式会社 Biometric authentication device, biometric authentication method, and program
WO2015145588A1 (en) 2014-03-25 2015-10-01 富士通フロンテック株式会社 Biometric authentication device, biometric authentication method, and program
US9443072B2 (en) * 2014-03-28 2016-09-13 Sony Corporation Methods and devices for granting access to and enabling passcode protection for a file
US20160350607A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Biometric authentication device
US11140171B1 (en) 2015-06-05 2021-10-05 Apple Inc. Establishing and verifying identity using action sequences while protecting user privacy
US10868672B1 (en) 2015-06-05 2020-12-15 Apple Inc. Establishing and verifying identity using biometrics while protecting user privacy
US9626549B1 (en) * 2015-11-16 2017-04-18 MorphoTrak, LLC Derived virtual quality parameters for fingerprint matching
DK179186B1 (en) 2016-05-19 2018-01-15 Apple Inc REMOTE AUTHORIZATION TO CONTINUE WITH AN ACTION
US9530042B1 (en) * 2016-06-13 2016-12-27 King Saud University Method for fingerprint classification
US11152085B2 (en) * 2016-06-27 2021-10-19 International Business Machines Corporation Using sensors and location to trigger events and share data
US10127429B2 (en) 2016-11-10 2018-11-13 Synaptics Incorporated Systems and methods for spoof detection based on local interest point locations
US10650212B2 (en) * 2016-12-30 2020-05-12 Beyond Time Invetments Limited Optical identification method and optical identification system
KR102143148B1 (en) 2017-09-09 2020-08-10 애플 인크. Implementation of biometric authentication
TWM558396U (en) * 2017-12-18 2018-04-11 精元電腦股份有限公司 Mouse with capacitive fingerprint reader
CN110632094B (en) * 2019-07-24 2022-04-19 北京中科慧眼科技有限公司 Pattern quality detection method, device and system based on point-by-point comparison analysis

Family Cites Families (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3940795A (en) 1970-04-21 1976-02-24 Lemelson Jerome H Verification apparatus using a card scanning means
US4210899A (en) 1975-06-23 1980-07-01 Fingermatrix, Inc. Fingerprint-based access control and identification apparatus
US4152056A (en) 1977-09-02 1979-05-01 Fowler Randall C Fingerprinting arrangement
US4322163A (en) 1977-10-25 1982-03-30 Fingermatrix Inc. Finger identification
US4428670A (en) 1980-08-11 1984-01-31 Siemens Corporation Fingerprint sensing device for deriving an electric signal
US4438824A (en) 1981-04-22 1984-03-27 Siemens Corporation Apparatus and method for cryptographic identity verification
US4537484A (en) 1984-01-30 1985-08-27 Identix Incorporated Fingerprint imaging apparatus
EP0244498B1 (en) 1986-05-06 1991-06-12 Siemens Aktiengesellschaft Arrangement and process for determining the authenticity of persons by verifying their finger prints
US5067162A (en) 1986-06-30 1991-11-19 Identix Incorporated Method and apparatus for verifying identity using image correlation
FI893028A (en) 1988-06-23 1989-12-24 Fujitsu Ltd Device for reading data from an uneven surface
US4993068A (en) 1989-11-27 1991-02-12 Motorola, Inc. Unforgeable personal identification system
EP0449242A3 (en) 1990-03-28 1992-10-28 National Semiconductor Corporation Method and structure for providing computer security and virus prevention
US5077795A (en) * 1990-09-28 1991-12-31 Xerox Corporation Security system for electronic printing systems
US5131038A (en) 1990-11-07 1992-07-14 Motorola, Inc. Portable authentification system
US5229764A (en) 1991-06-20 1993-07-20 Matchett Noel D Continuous biometric authentication matrix
GB9125540D0 (en) 1991-11-30 1992-01-29 Davies John H E Access control systems
US5335278A (en) 1991-12-31 1994-08-02 Wireless Security, Inc. Fraud prevention system and process for cellular mobile telephone networks
US5335288A (en) 1992-02-10 1994-08-02 Faulkner Keith W Apparatus and method for biometric identification
US5889474A (en) 1992-05-18 1999-03-30 Aeris Communications, Inc. Method and apparatus for transmitting subject status information over a wireless communications network
EP0593386A3 (en) 1992-10-16 1996-07-31 Ibm Method and apparatus for accessing touch screen desktop objects via fingerprint recognition
US5954583A (en) 1992-11-05 1999-09-21 Com21 Limited Secure access control system
US5502759A (en) 1993-05-13 1996-03-26 Nynex Science & Technology, Inc. Apparatus and accompanying methods for preventing toll fraud through use of centralized caller voice verification
US5416573A (en) 1993-09-10 1995-05-16 Indentix Incorporated Apparatus for producing fingerprint images which are substantially free of artifacts attributable to moisture on the finger being imaged
US5526428A (en) 1993-12-29 1996-06-11 International Business Machines Corporation Access control apparatus and method
US5528355A (en) 1994-03-11 1996-06-18 Idnetix Incorporated Electro-optic palm scanner system employing a non-planar platen
JPH09510636A (en) 1994-03-24 1997-10-28 ミネソタ マイニング アンド マニュファクチャリング カンパニー Biometric personal identification system
US5926533A (en) 1994-04-19 1999-07-20 Opus Telecom, Inc. Computer-based method and apparatus for controlling, monitoring, recording and reporting telephone access
US5655013A (en) 1994-04-19 1997-08-05 Gainsboro; Jay L. Computer-based method and apparatus for controlling, monitoring, recording and reporting telephone access
US5666400A (en) 1994-07-07 1997-09-09 Bell Atlantic Network Services, Inc. Intelligent recognition
GB9415627D0 (en) 1994-08-01 1994-09-21 Marshall James Verification apparatus
US5659626A (en) * 1994-10-20 1997-08-19 Calspan Corporation Fingerprint identification system
US5546471A (en) 1994-10-28 1996-08-13 The National Registry, Inc. Ergonomic fingerprint reader apparatus
US5613012A (en) 1994-11-28 1997-03-18 Smarttouch, Llc. Tokenless identification system for authorization of electronic transactions and electronic transmissions
US5764789A (en) 1994-11-28 1998-06-09 Smarttouch, Llc Tokenless biometric ATM access system
US5838306A (en) 1995-05-05 1998-11-17 Dell U.S.A., L.P. Mouse with security feature
CA2156236C (en) 1995-08-16 1999-07-20 Stephen J. Borza Biometrically secured control system for preventing the unauthorized use of a vehicle
US5825474A (en) 1995-10-27 1998-10-20 Identix Corporation Heated optical platen cover for a fingerprint imaging system
US5650842A (en) 1995-10-27 1997-07-22 Identix Incorporated Device and method for obtaining a plain image of multiple fingerprints
CH690048A5 (en) 1995-11-28 2000-03-31 C Sam S A En Formation C O Jue Safety device controlling access to a computer or a network terminal.
EP0788069A3 (en) 1996-02-01 2000-01-19 Kaba Schliesssysteme AG Wearable identification carrier
US5848231A (en) 1996-02-12 1998-12-08 Teitelbaum; Neil System configuration contingent upon secure input
US5781651A (en) 1996-04-15 1998-07-14 Aetex Biometric Corporation Compact fingerprint recognizing apparatus illuminated with electroluminescent device
US5748766A (en) 1996-04-30 1998-05-05 Identix Incorporated Method and device for reducing smear in a rolled fingerprint image
US5764222A (en) 1996-05-28 1998-06-09 International Business Machines Corporation Virtual pointing device for touchscreens
US5748184A (en) 1996-05-28 1998-05-05 International Business Machines Corporation Virtual pointing device for touchscreens
US5680205A (en) 1996-08-16 1997-10-21 Dew Engineering And Development Ltd. Fingerprint imaging apparatus with auxiliary lens
US5867795A (en) 1996-08-23 1999-02-02 Motorola, Inc. Portable electronic device with transceiver and visual image display
US5963657A (en) 1996-09-09 1999-10-05 Arete Associates Economical skin-pattern-acquisition and analysis apparatus for access control; systems controlled thereby
US6035403A (en) 1996-09-11 2000-03-07 Hush, Inc. Biometric based method for software distribution
WO1998011501A2 (en) 1996-09-11 1998-03-19 Rao D Ramesh K Embeddable module for fingerprint capture and matching
US5872834A (en) 1996-09-16 1999-02-16 Dew Engineering And Development Limited Telephone with biometric sensing device
US5881226A (en) 1996-10-28 1999-03-09 Veneklase; Brian J. Computer security system
US5844497A (en) * 1996-11-07 1998-12-01 Litronic, Inc. Apparatus and method for providing an authentication system
US5963908A (en) 1996-12-23 1999-10-05 Intel Corporation Secure logon to notebook or desktop computers
JP2944557B2 (en) * 1997-02-27 1999-09-06 日本電気ソフトウェア株式会社 Stripe pattern matching device
US5930804A (en) 1997-06-09 1999-07-27 Philips Electronics North America Corporation Web-based biometric authentication system and method
US6059842A (en) * 1998-04-14 2000-05-09 International Business Machines Corp. System and method for optimizing computer software and hardware

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4696046A (en) * 1985-08-02 1987-09-22 Fingermatrix, Inc. Matcher
US4876726A (en) * 1986-01-07 1989-10-24 De La Rue Printrak, Inc. Method and apparatus for contextual data enhancement
US5105467A (en) * 1989-11-28 1992-04-14 Kim Bong I Method of fingerprint verification
US5519785A (en) * 1993-08-31 1996-05-21 Nec Corporation Correcting method for directional data of streaked patterns and information processing apparatus for executing it and correcting method for pitch data of streaked patterns and information processing apparatus for executing it
US5883971A (en) * 1996-10-23 1999-03-16 International Business Machines Corporation System and method for determining if a fingerprint image contains an image portion representing a smudged fingerprint impression
US5982913A (en) * 1997-03-25 1999-11-09 The United States Of America As Represented By The National Security Agency Method of verification using a subset of claimant's fingerprint

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084494A1 (en) * 2000-04-28 2001-11-08 Precise Biometrics Ab Biometric identity check
US7333637B2 (en) 2000-04-28 2008-02-19 Precise Biometrics Ab Biometric identity check
SG122737A1 (en) * 2000-06-14 2006-06-29 Univ Singapore Apparatus and method for compressing and decompressing fingerprint information
US8086868B2 (en) 2004-06-08 2011-12-27 Nec Corporation Data communication method and system
US10812891B2 (en) 2016-10-24 2020-10-20 Sony Corporation Sound output apparatus and method of executing function of sound output apparatus

Also Published As

Publication number Publication date
US6282304B1 (en) 2001-08-28
JP2003536121A (en) 2003-12-02
AU5135000A (en) 2000-12-05
US6487662B1 (en) 2002-11-26

Similar Documents

Publication Publication Date Title
US6282304B1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
US20020031245A1 (en) Biometric authentification method
EP0976087B1 (en) Biometric recognition using a master pattern set
US7853054B2 (en) Fingerprint template generation, verification and identification system
US6941001B1 (en) To a combined fingerprint acquisition and control device
Rowe et al. A multispectral whole-hand biometric authentication system
US6072891A (en) Method of gathering biometric information
EP1825418B1 (en) Fingerprint biometric machine
US4896363A (en) Apparatus and method for matching image characteristics such as fingerprint minutiae
US20040125993A1 (en) Fingerprint security systems in handheld electronic devices and methods therefor
EP1399874A1 (en) Method and system for transforming an image of a biological surface
WO1990012371A1 (en) Finger profile identification system
US20020031244A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
WO2007018545A2 (en) Protometric authentication system
Olagunju et al. Staff attendance monitoring system using fingerprint biometrics
JPH10275233A (en) Information processing system, pointing device and information processor
WO2000070545A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor
WO2000070543A1 (en) Biometric system for biometric input, comparison, authentication and access control and method therefor biometric
JP2002279413A (en) Device for identifying dummy fingerprint and device for collating fingerprint
EP1208528B1 (en) Method and arrangement for registering and verifying fingerprint information
JP4270842B2 (en) Fingerprint verification device
CN111709312A (en) Local feature face recognition method based on joint main mode
KR100384294B1 (en) Fingerprint recognition system and method thereof
JPH01211184A (en) Person himself collating device
You et al. Parallel biometrics computing using mobile agents

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 618914

Kind code of ref document: A

Format of ref document f/p: F

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase