US20060294390A1 - Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge


Info

Publication number: US20060294390A1 (application US11/159,722)
Authority: US (United States)
Prior art keywords: user, challenge, false, verification, authentication result
Legal status: Abandoned
Application number: US11/159,722
Inventor
Jiri Navratil
Ryan Osborn
Jason Pelecanos
Ganesh Ramaswamy
Ran Zilca
Current Assignee: International Business Machines Corp
Original Assignee: International Business Machines Corp
Application filed by International Business Machines Corp
Priority to US11/159,722
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignors: NAVRATIL, JIRI; OSBORN, RYAN L.; PELECANOS, JASON W.; RAMASWAMY, GANESH N.; ZILCA, RAN D.
Priority to JP2006169543A
Priority to CN200610093157.9A
Publication of US20060294390A1
Priority to US12/057,470

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00: Network architectures or network communication protocols for network security
    • H04L 63/08: Network security arrangements for authentication of entities
    • H04L 63/0861: Authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • H04L 63/10: Network security arrangements for controlling access to devices or network resources
    • H04L 63/104: Grouping of entities
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 2221/00: Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21: Indexing scheme relating to G06F21/00 and subgroups
    • G06F 2221/2103: Challenge-response

Definitions

  • the present invention is generally related to user authentication techniques and, more particularly, to techniques for providing sequential user authentication.
  • Authenticating the identity claim of a user is an important step in ensuring the security of systems, networks, services and facilities, both for physical and for logical access.
  • Existing user authentication is often performed on the basis of a user's knowledge of a single verification object, e.g., a password or a personal identification number (PIN) or on the basis of possession of a single verification object, e.g., a key or a card.
  • Other existing authentication techniques include the use of a biometric feature as the verification object, e.g., a fingerprint, a voiceprint, an iris scan or a face scan.
  • Verification is typically done by comparing the verification object obtained from the user at the time of attempted access to previously stored objects.
  • Biometric systems typically produce a similarity score measuring how close an input biometric is to a reference biometric template. A threshold is then applied to the score to make a binary decision about whether to accept or reject a given user.
  • Possession-based user authentication systems make a binary accept/reject decision based on the presence of a physical device (e.g., a key) or a virtual device (e.g., a digital certificate). For knowledge verification, a single challenge will result in a binary decision based on the correctness of the user's response.
  • Sequential user authentication may be accomplished by using a sequence of authentication challenges from the same mode (e.g., presenting only knowledge verification questions), or by using multiple verification modes (e.g., presenting both random knowledge challenges and asking for one or more physical keys). Sequential authentication based on biometrics may be possible, depending on the type of biometric. For example, fingerprints are consistent, so sequential challenges would not be beneficial, since each challenge would capture the same fingerprint. The human voice, however, does change, and therefore sequential voice biometrics (“speaker recognition”) is beneficial.
  • a policy is the set of rules that specify, at each turn, whether to accept the user, reject the user, or present the user with a new challenge.
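  • Such a policy can be sketched as a simple turn-level decision function. In this hedged illustration, the thresholds, turn limit, and return values are invented for the example and are not taken from the patent:

```python
# Hypothetical sketch of a turn-level policy; the thresholds, turn limit,
# and return labels are illustrative, not the patent's specified values.
def policy_decision(cumulative_score, turn, accept_threshold=2.0,
                    reject_threshold=-2.0, max_turns=5):
    """Decide after each turn whether to accept, reject, or challenge again."""
    if cumulative_score >= accept_threshold:
        return "ACCEPT"
    if cumulative_score <= reject_threshold or turn >= max_turns:
        return "REJECT"
    return "CHALLENGE"  # present the user with a new challenge
```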
  • while conversational biometric techniques provide improved authentication frameworks with a high degree of flexibility, accuracy, convenience and robustness, they suffer from a number of limitations which, if overcome, could further improve the efficiency and security of such user authentication techniques.
  • the above-described techniques for conversational biometrics yield a binary decision for each challenge.
  • the continuous-value score allows for fusing of multiple biometric systems prior to setting a security level.
  • a user is challenged with at least one knowledge challenge to obtain an intermediate authentication result; and the user challenges continue until a cumulative authentication result satisfies one or more criteria.
  • the intermediate authentication result is based, for example, on log likelihood ratio and the cumulative authentication result is, for example, a sum of individual log likelihood ratios.
  • the intermediate authentication result is based, for example, on one or more of false accept and false reject error probabilities for each knowledge challenge.
  • a false accept error probability describes a probability of a different user answering the knowledge challenge correctly.
  • a false reject error probability describes a probability of a genuine user not answering the knowledge challenge correctly.
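  • Under these definitions, a per-challenge log likelihood ratio can be derived directly from the challenge's FA and FR probabilities. The sketch below is one plausible formulation, not the patent's specified computation:

```python
import math

def challenge_llr(answered_correctly, fa, fr):
    """Per-challenge log likelihood ratio (genuine vs. impostor), derived from
    the challenge's false-accept (fa) and false-reject (fr) probabilities."""
    if answered_correctly:
        # genuine users answer correctly with probability 1 - fr,
        # impostors with probability fa
        return math.log((1.0 - fr) / fa)
    # genuine users fail with probability fr, impostors with probability 1 - fa
    return math.log(fr / (1.0 - fa))
```

A correct answer yields a positive contribution (evidence for the genuine user), an incorrect answer a negative one.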
  • one or more of the false accept and false reject error probabilities are adapted based on field data or known information about a given challenge.
  • the FA and FR values may be changed by adapting to field data reflecting the measured FA and FR values.
  • the FA and FR values may also be changed to reflect expected security breaches.
  • the continuous scores provided by the present invention allow easier adaptation, as they imply a statistical model that has parameters, such as FA and FR.
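  • One simple way to adapt an FA value toward measured field data is to blend the prior value with observed impostor outcomes; the pseudo-count weighting below is purely illustrative and not a scheme specified by the patent:

```python
def adapt_fa(prior_fa, impostor_trials, impostor_correct, weight=10.0):
    """Blend a prior FA value with measured field data. The prior counts as
    `weight` pseudo-trials; the mixing scheme is illustrative only."""
    return (prior_fa * weight + impostor_correct) / (weight + impostor_trials)
```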
  • the continuous scores provided by the present invention allow one or more of the intermediate authentication results and the cumulative authentication result to be combined with a result from another verification method (such as biometric or possession based authentication).
  • FIG. 1 is a block diagram illustrating a client-server architecture of an authentication system for implementing sequential authentication in accordance with the present invention
  • FIG. 2 is a block diagram illustrating an exemplary computing system environment for implementing sequential authentication in accordance with the present invention
  • FIG. 3 is a diagram illustrating an exemplary specification of multiple verification objects, according to one embodiment of the invention.
  • FIG. 4 is a diagram illustrating an exemplary specification of user models including multiple verification objects, according to one embodiment of the invention.
  • FIG. 5 is a block diagram illustrating a sequential authentication system incorporating features of the present invention.
  • FIG. 6 is a flow chart describing an exemplary implementation of a sequential authentication process incorporating features of the present invention.
  • the present invention provides a sequential authentication system.
  • the disclosed sequential authentication system is based on knowledge verification for the purpose of measuring a similarity score for every interaction turn.
  • the disclosed sequential authentication system continuously estimates the probability that the user's identity claim is genuine and the probability that the user is not who he or she claims to be.
  • a series of challenges is presented to the user, and each user response is compared to one or more models, resulting in an intermediate authentication decision (such as a log likelihood ratio (LLR)).
  • the intermediate decisions from the individual turns are combined (such as a sum of LLRs) to create a cumulative authentication result to ultimately either accept or reject the user's identity claim.
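  • The accumulate-and-threshold loop described above resembles Wald's sequential probability ratio test; a hedged sketch, with bounds derived from illustrative target error rates (names, defaults, and the fail-closed fallback are assumptions, not the patent's specification):

```python
import math

def sequential_authenticate(challenges, ask, target_fa=0.01, target_fr=0.01):
    """Sum per-challenge log likelihood ratios until a Wald-style bound is
    crossed. `challenges` is a list of (fa, fr) pairs; `ask(i)` returns True
    if the user answers challenge i correctly."""
    accept_bound = math.log((1.0 - target_fr) / target_fa)
    reject_bound = math.log(target_fr / (1.0 - target_fa))
    total = 0.0
    for i, (fa, fr) in enumerate(challenges):
        if ask(i):
            total += math.log((1.0 - fr) / fa)   # correct answer
        else:
            total += math.log(fr / (1.0 - fa))   # incorrect answer
        if total >= accept_bound:
            return "ACCEPT"
        if total <= reject_bound:
            return "REJECT"
    return "REJECT"  # still undecided after all challenges: fail closed
```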
  • the models used for the sequential authentication process may also be adapted from the user data during authentication, and may also be used in conjunction with voice biometric models performing speaker recognition (or another modality) to complete the user authentication task.
  • the user or background models, or both comprise False Accept (FA) and False Reject (FR) error rates that characterize each security challenge.
  • the FA describes the probability of a different user answering correctly to the security challenge
  • the FR describes the probability of the genuine user not answering correctly.
  • the FA and FR assigned to each challenge may reside only in the background model, thus assuming that all users share the same FA and FR values; alternatively, user-specific FA and FR values may additionally be assigned to each challenge and stored in the user model.
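  • The background-model/user-model split can be sketched as a lookup with per-user overrides; the data layout here is an assumption for illustration, not the patent's storage format:

```python
def challenge_error_rates(challenge_id, background_model, user_model):
    """Return (fa, fr) for a challenge: user-specific rates, when present,
    override the shared background-model rates."""
    background = background_model[challenge_id]  # shared default for all users
    return user_model.get(challenge_id, background)
```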
  • the user model also includes the correct responses to the security challenges.
  • the FA and FR values may be changed by adapting to field data reflecting the measured FA and FR values.
  • the FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected.
  • the continuous scores provided by the present invention allow easier adaptation, as they imply a statistical model that has parameters, such as FA and FR.
  • the invention is illustrated using an exemplary client-server system architecture. It should be understood, however, that the invention is not limited to use with any particular system architecture. The invention is instead more generally applicable to any system architecture in which it is desirable to provide an authentication framework that provides a high degree of flexibility, accuracy, convenience and/or robustness. That is, the techniques of the present invention may be implemented on a single computer system or on multiple computer systems coupled by a suitable network, examples of which will be described below.
  • the interaction design is based on authentication policies implemented as a statistical state machine using XML (Extensible Markup Language).
  • the entire authentication interaction is determined dynamically based on the authentication policy in effect (selected based on user preferences and transaction or application requirements), using operations on the shared context, further utilizing the authentication objects in effect and the user profile of interest.
  • the authentication techniques of the present invention utilize the following components: (1) verification objects and verification engines; (2) verification policies and a verification policy manager; and (3) user models.
  • Verification objects are objects that can be used for the purpose of verifying the identity of users, such as the user's biometric characteristics (e.g., voiceprint, fingerprint, face scan, iris scan, handwritten signature and/or keyboard dynamics), the user's knowledge (e.g., passwords, passphrases, and/or answers to personal questions), and the user's possessions (e.g., keys, cards, tokens, certificates, cellular telephone or home telephone transmitting caller-id information, personal or handheld computer with client software and/or user's location).
  • Verification engines are used to match the verification objects with the representation stored in a user model.
  • Examples of verification engines include a fingerprint recognition system to match the user's fingerprint, a conversational system to evaluate spoken answers to questions such as a voice response system, a conversational system such as a speech or voiceprint recognition system (that may include natural understanding techniques) to extract and recognize a user's spoken utterances (wherein the conversational system may also include a speech synthesis system for generating synthesized questions and prompts), a caller-id recognition system to extract and match the user's telephone number, a badge reader to scan the user's badge or card, a PIN confirmation system to confirm a user's PIN, a face recognition system to extract and match a user's face scan, an iris recognition system to extract and match a user's iris scan, a handwriting recognition system to recognize a user's handwriting, a keyboard dynamic recognizer to match a user's keyboard dynamics, as well as other modality-specific engines discussed herein.
  • verification engines typically perform user verification by comparing user input to the user's model that was created when the user enrolled
  • the invention is not restricted to verification engines that require user enrollment.
  • Unsupervised verification engines that do not require the user to enroll, may also be used.
  • when unsupervised verification engines are used, a single user model may be employed, including the user attributes as measured by the verification engines.
  • the following verification engines can be used: acoustic accent recognition, language identification, and face features detection (e.g., color of eyes, glasses detection). In this case, none of the individual verification engines require user enrollment, and one user model is used, stating the user's spoken accent, language, eye color, and whether he or she wears glasses.
  • the invention realizes that, while individual verification engines can be used to perform simple verification steps that operate in a predefined static manner, a more general framework is necessary when multiple verification objects are used to perform dynamic user authentication, in order to achieve a greater degree of accuracy and flexibility.
  • the present invention provides such an improved authentication framework.
  • the present invention employs verification policies that govern the interaction between the user and the overall system, including the authentication system, and between the various verification engines. Any number of verification policies could be written to satisfy a wide variety of user-specific, transaction-specific or application-specific authentication needs, including needs that change in real-time.
  • verification policies are managed by a verification policy manager which uses operations on a common context shared across all verification objects to achieve maximum programmability of the authentication system.
  • User models are typically created when a user enrolls in the system, using the inputs provided by the user (e.g., samples of voice, samples of fingerprint, and/or answers to personal questions), or acquired through other means (such as details of past transactions, balance in most recent bill, serial number of a key or badge issued, and/or encryption key contained in a smartcard or a client software).
  • the user models may be updated in real-time when needed, such as when a new bill is issued and the balance changes or when more voice samples are available.
  • An individual user model contains information regarding all verification objects relevant to that user, including any user preferences related to the verification objects (e.g., a user may prefer questions regarding colors rather than numbers).
  • User models also preferably support nontrivial manipulations of the verification objects, such as asking the user to add the first and third digits of his social security number. Again, any of the above-mentioned examples are not intended to limit the invention.
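  • A nontrivial manipulation like the digit-sum example could be computed as follows; the function name and the 1-indexed position convention are illustrative assumptions:

```python
def derived_digit_challenge(stored_number, positions=(1, 3)):
    """Expected answer for a derived challenge: the sum of selected digits
    (1-indexed) of a stored number, e.g. a social security number."""
    digits = [int(c) for c in stored_number if c.isdigit()]
    return sum(digits[p - 1] for p in positions)
```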
  • Referring to FIG. 1, a block diagram illustrates a client-server architecture of an authentication system for implementing sequential authentication, according to one embodiment of the invention.
  • the authentication system 100 comprises a verification client device 102 and a verification server 104 , coupled via a network adapter 106 .
  • the verification client 102 has context 108 and application 110 associated therewith.
  • the verification server 104 comprises a verification policy manager 112 and a plurality of verification engines 114 - 1 through 114 -N, where N can be any integer 2, 3, 4 . . . , and represents the number of verification object families or types that the particular implementation of the invention can support.
  • the authentication system 100 further comprises a data manager 116 , a verification objects store 118 , a verification policies store 120 and a user models store 122 . While the data manager 116 and data stores 118 , 120 and 122 are shown outside of the verification server box, it is to be understood that they may be implemented on the verification server.
  • the verification client device 102 is responsible for interfacing with the user and collecting the inputs from the user, communicating with the verification server 104 through the network adapter 106 , and communicating with the application 110 . In one embodiment of the invention, the verification client device 102 is also responsible for acquiring and maintaining the context 108 .
  • the context 108 may be stored on a central database (not shown), accessible by other components of the system 100 .
  • Such an implementation allows for a stateless operation between the verification client device 102 and the verification server 104 , such that different servers could be used for different turns in the verification process, thereby providing protection against a particular server going down in the middle of a verification process, and also allowing for improved load balancing of the server resources.
  • the context 108 records all relevant variables for the verification process, such as: (1) the user name; (2) the current state in the verification policy that is in effect; (3) the history pertaining to the verification objects that have been invoked and the scores and outcomes associated with the invocations; (4) transaction-specific requirements (e.g., desired level of accuracy or nature of the transaction); (5) user-specific requirements (e.g., a user having a cold may prefer not to rely on voiceprint match); and (6) other physical and logical variables (e.g., type of network connection—remote or local, or quality of a voice channel).
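  • The context variables enumerated above might be grouped into a record like the following; all field names are assumptions for illustration, not the patent's schema:

```python
from dataclasses import dataclass, field

@dataclass
class VerificationContext:
    """Shared context passed between client and server on each turn."""
    user_name: str
    policy_state: str = "START"                   # current state in the policy
    history: list = field(default_factory=list)   # (object_id, score, outcome)
    transaction_requirements: dict = field(default_factory=dict)
    user_requirements: dict = field(default_factory=dict)
    channel: dict = field(default_factory=dict)   # e.g. network type, audio quality

    def record(self, object_id, score, outcome):
        """Append one verification-object invocation to the history."""
        self.history.append((object_id, score, outcome))
```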
  • the context 108 may also record other variables that represent verification scores from external verification sources (not shown). For example, a customer entering a bank may have done so after swiping his bank card at the entrance, and that information could be included in the context 108 as an external score and be used for subsequent authentication processes at the counter or at the automated teller machine.
  • variables initially included in the context 108 are system default variables relevant to the verification objects and other known requirements at the time of the initial build. However, as additional verification objects are added to the system 100 or as new requirements are discovered, user-defined variables may be added to the context 108 .
  • the network adapter 106 enables communication between the client device 102 and the verification server 104 .
  • the network adapter 106 implements network transport protocols, such as the standard Transmission Control Protocol (TCP)/Internet Protocol (IP) or the Secure Sockets Layer (SSL) protocol. It is to be understood that in an embodiment where the authentication system 100 is implemented on a single computer system, a network adapter is not required.
  • the verification server 104 comprises a verification policy manager 112 and a set of verification engines 114 - 1 through 114 -N.
  • Each verification engine operates on a given verification object or a family (type) of verification objects.
  • a fingerprint verification engine may operate on a particular fingerprint or different types of fingerprints (e.g., thumbprint or index-fingerprint).
  • a knowledge verification engine may operate on different types of challenge-response questions.
  • Verification engines to be added could be of a new type or an existing type.
  • a face recognition engine could be added to a verification server that previously comprised voiceprint and fingerprint recognition engines, or a second voiceprint recognition engine (which could be from a different manufacturer, for example) could be added.
  • new verification objects could be added to new verification engines or existing verification engines (such as adding a new question to an existing knowledge verification engine).
  • the verification policy manager 112 interprets a verification policy for a given user model, and drives the entire authentication process.
  • the policy manager 112 receives the current context 108 from the verification client device 102 , operates on the context, incorporates updated status of current verification objects, and returns an updated context to the verification client device 102 along with the specification of the next step to be taken during the verification process.
  • the verification policy manager 112 can optionally be responsible for invoking states in a finite state machine, interpreting the conditions of the state machine and branching to the next state.
  • the verification policy manager 112 is the entity that makes the final accept or reject decision for the authentication process, and in some cases may also make intermediate decisions if the current transaction requires such decisions, provided the verification policy in effect permits it.
  • the data manager 116 controls the external storage resources, including verification objects store 118 , verification policies store 120 and user models store 122 . These resources may be accessed directly by the verification server 104 (either by the verification policy manager 112 or by the individual verification engines 114 - 1 through 114 -N). In an alternative embodiment, such resources may be accessed by the verification client device 102 and shipped to the verification server 104 through the network adapter 106 .
  • the application 110 is the application for which user authentication is required prior to granting access.
  • Example applications include banking applications, travel applications and e-mail applications.
  • the application 110 is responsible for providing application-specific and transaction-specific information and requirements. It is to be understood that the invention is not limited to any particular application.
  • the verification client device 102 communicates with the verification server 104 using an XML message interface.
  • the components associated with the verification server may themselves communicate with one another over the network adapter 106 .
  • one or more of the verification engines 114 may communicate with the verification policy manager 112 over the network adapter 106 .
  • a similar distributed arrangement may exist with respect to the verification policy manager 112 and the data manager 116 , and with the data manager 116 and the data stores 118 , 120 and 122 .
  • the interconnectivity of components shown in FIG. 1 is intended to be illustrative and, therefore, other suitable interconnections may be implemented to provide the authentication functionality of the present invention.
  • the computing system 200 may represent at least a portion of a distributed computing system wherein a user communicates via a computer system 202 (referred to illustratively as a “client” or client device) with another computer system 204 (referred to illustratively as a “server”) via a network 206 .
  • the network may be any suitable network across which the computer systems can communicate, e.g., the Internet or World Wide Web, or a local area network.
  • the invention is not limited to any particular type of network. In fact, it is to be understood that the computer systems may be directly linked without a network.
  • the network may link a plurality of client devices and a plurality of servers.
  • the techniques of the invention may be implemented on a single computer system wherein, for example, the user interacts directly with the computer system that performs the authentication operations.
  • the client device 102 may be implemented via computer system 202 , and that the verification server 104 (and its components), the data manager 116 and the respective object, policy and user model stores ( 118 , 120 and 122 ) may be implemented via the computer system 204 .
  • Network adapter 106 would therefore be implemented in accordance with network 206 .
  • FIG. 2 generally illustrates an exemplary architecture for each computer system communicating over the network.
  • the computer system 202 comprises a processor 208 -A, memory 210 -A and I/O devices 212 -A, all coupled via a computer bus 214 -A.
  • the computer system 204 comprises a processor 208 -B, memory 210 -B and I/O devices 212 -B, all coupled via a computer bus 214 -B.
  • processor as used herein is intended to include one or more processing devices, including a central processing unit (CPU) or other processing circuitry.
  • memory as used herein is intended to include memory associated with a processor or CPU, such as RAM, ROM, a fixed, persistent memory device (e.g., hard drive), or a removable, persistent memory device (e.g., diskette or CD-ROM).
  • I/O devices as used herein is intended to include one or more input devices (e.g., a keyboard or mouse) for inputting data to the processing unit, as well as one or more output devices (e.g., a display) for providing results associated with the processing unit.
  • the I/O devices associated with the computer system 202 are understood to include those devices necessary to collect the particular data associated with the verification objects supported by the authentication system, e.g., a microphone to capture voice data for voiceprint recognition and/or answers to questions posed, a speaker to output such questions to the user, a face scanner, an iris scanner, and/or a fingerprint scanner.
  • the client computer system illustrated in FIG. 2 may comprise a computer system programmed to implement the inventive techniques such as a personal computer, a personal digital assistant, or a cellular phone.
  • the server computer system illustrated in FIG. 2 may comprise a computer system programmed to implement the inventive techniques such as a personal computer, a microcomputer, or a minicomputer.
  • the invention is not limited to any particular computer architecture.
  • software instructions or code for performing the methodologies of the invention, as described herein, may be stored in one or more of the associated memory devices, e.g., ROM, fixed or removable memory, and, when ready to be utilized, loaded into RAM and executed by the CPU.
  • Referring to FIG. 3, an example is shown of a registry of verification objects.
  • the registry 300 is represented using XML and stored in the verification objects store 118 ( FIG. 1 ).
  • the specification contains a description of all registered verification objects, which can be updated as new verification objects are added.
  • the first object ( 302 ) in this example is the Date-of-Birth (DOB) object, which is of the type Question-Answer (QA) and the verification engine responsible for operating on this object is the knowledge verification engine.
  • a suggested prompt may also be included to prompt the user for the required response when this object is invoked, but the prompt may be modified or replaced by the verification client, if necessary.
  • the “perplexity” is a quantity that represents the difficulty associated with the verification object and may optionally be used by the verification policy manager in making verification decisions.
  • the second object ( 304 ) in this example is Caller-ID, which, in the case of a telephony connection, attempts to match the telephone number of the telephone originating the call with the telephone number in the relevant user model. No prompt is specified since this information may be obtained automatically from telephony infrastructure without any explicit input from the user.
  • the third object ( 306 ) in this example is the Voiceprint object, and in this case no type is specified, since the voiceprint verification engine operates on one type of verification object. Given that voiceprints are a biometric feature that may not be stolen, a high perplexity is specified in this example.
  • the fourth and fifth objects ( 308 and 310 ) illustrate the hierarchical nature of the specification, whereby the CAR_COLOR object inherits default properties from the parent object COLOR.
  • the last two objects ( 312 and 314 ) in this example are examples of dynamic verification objects, whereby the intended response changes dynamically, and in this example, the correct responses are obtained from the application, rather than from the user model.
  • the current balance (CUR_BALANCE) object ( 312 ) is an application-specific object of the type numeric (APP_NUM) and the last transaction date (LAST_TRANSACTION_DATE) object ( 314 ) is an application-specific object of the type string.
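The registry 300 described above can be sketched as a small data structure. The patent represents it as XML; the Python dict layout, field names, and perplexity values below are illustrative assumptions, not the actual schema.

```python
# Illustrative sketch of the verification-object registry (the patent stores
# the registry 300 as XML in the verification objects store 118; this dict
# layout and the perplexity values are assumptions).
REGISTRY = [
    {"id": "DOB", "type": "QA", "engine": "knowledge",
     "prompt": "Please say your date of birth.", "perplexity": 365},
    {"id": "CALLER_ID", "engine": "caller_id",
     "prompt": None, "perplexity": 100},     # no prompt: number comes from telephony
    {"id": "VOICEPRINT", "engine": "voiceprint",
     "perplexity": 10000},                   # biometric: high perplexity
    {"id": "COLOR", "type": "QA", "engine": "knowledge", "perplexity": 10},
    {"id": "CAR_COLOR", "parent": "COLOR"},  # inherits defaults from COLOR
    {"id": "CUR_BALANCE", "type": "APP_NUM", "engine": "knowledge",
     "dynamic": True},                       # correct response comes from the app
    {"id": "LAST_TRANSACTION_DATE", "type": "APP_STRING", "engine": "knowledge",
     "dynamic": True},
]

def resolve(obj_id):
    """Return an object's properties with any parent defaults applied."""
    entry = next(o for o in REGISTRY if o["id"] == obj_id)
    if "parent" in entry:
        merged = dict(resolve(entry["parent"]))  # start from parent defaults
        merged.update(entry)                     # child fields override
        return merged
    return entry

print(resolve("CAR_COLOR")["perplexity"])  # prints 10 (inherited from COLOR)
```

The `resolve` helper illustrates the hierarchical inheritance of objects 308 and 310: the child CAR_COLOR entry only names its parent and picks up the remaining properties from COLOR.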
  • Referring now to FIG. 4 , an example is shown of a user model 400 . The user model 400 is represented using XML and stored in the user models store 122 ( FIG. 1 ).
  • the user model contains a description of verification objects for which the user has provided enrollment data.
  • the first object ( 402 ) is the Caller-ID object, for which this user's correct response is 914-945-3000 in this example.
  • the user's preference for this object may be optionally included and used by the verification policy in selecting objects with higher preference when possible.
  • the second and third objects are similar.
  • the fourth object ( 408 ) is the CAR_COLOR object, which holds the color of the user's car.
  • the fifth object ( 410 ) is the voiceprint object, for which model parameters are needed; these parameters may be stored in a file, whose filename is included.
  • the last two objects (CUR_BALANCE 412 and LAST_TRANSACTION_DATE 414 ) do not have any correct responses included because they are dynamic verification objects, and the current correct responses have to be obtained from the application.
  • any of the objects can be updated or deleted in real-time, and new objects can be added in real-time.
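A user model with these properties can be sketched similarly. The patent stores it as XML, so the dict fields, the CAR_COLOR response, and the hypothetical PET_NAME object below are assumptions for illustration.

```python
# Illustrative sketch of a user model 400 (stored as XML in the user models
# store 122 in the patent; field names and the CAR_COLOR value are assumptions).
user_model = {
    "CALLER_ID": {"response": "914-945-3000", "preference": 0.9},
    "CAR_COLOR": {"response": "blue"},            # assumed example response
    "VOICEPRINT": {"model_file": "user1234.vp"},  # parameters kept in a file
    "CUR_BALANCE": {},             # dynamic: correct response comes from the app
    "LAST_TRANSACTION_DATE": {},   # dynamic as well
}

# Objects can be added, updated, or deleted in real time:
user_model["PET_NAME"] = {"response": "rex"}  # add (hypothetical object)
user_model["CAR_COLOR"]["response"] = "red"   # update
del user_model["PET_NAME"]                    # delete
```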
  • a user model in accordance with the present invention comprises False Accept (FA) and False Reject (FR) error rates that characterize each security challenge.
  • the FA describes the probability of a different user answering the security challenge correctly, and the FR describes the probability of the genuine user not answering correctly.
  • the FA and FR assigned to each challenge may reside only in the background model, thus assuming that all users share the same FA and FR; alternatively, user-specific FA and FR values may additionally be assigned to each challenge. For example, a particular challenge may have an FA value of 0.001 and an FR value of 0.07.
  • a higher FA rate in the user model 400 may reflect, for example, that the user easily gives out the answer to the challenge.
  • a higher FA rate in the background model 540 for a social security number challenge may reflect, for example, that a repository of social security numbers has been stolen.
  • a higher FR rate in the user model 400 may reflect, for example, that the user often forgets the answer to a particular challenge.
  • a higher FR rate in the background model 540 for a particular challenge may reflect, for example, that a number of users tend to forget the answer to the challenge.
  • the user model also includes the correct responses to the security challenges, as shown in FIG. 4 .
  • the FA and FR values may be changed by adapting to field data reflecting the measured FA and FR in practice. For example, if a number of users tend to forget the answer to the challenge, the FR value should be increased.
  • the FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected.
  • FIG. 5 is a block diagram illustrating a sequential authentication system 500 incorporating features of the present invention.
  • the sequential authentication system 500 employs a user model 400 to describe the expected behavior of each user.
  • the sequential authentication system 500 optionally employs a background model 540 that describes the expected behavior of the general population of users.
  • the background model 540 may be, for example, a collection of individual user models 400 .
  • the user model 400 and background model 540 may be created using any known technique, as would be apparent to a person of ordinary skill.
  • the user model 400 and background model 540 may be created using statistical generative models, such as Gaussian Mixture Models (GMM) and Hidden Markov Models (HMM).
  • discriminative models such as Artificial Neural Networks (ANN) and Support Vector Machines (SVM) may also be used.
  • a security challenge is presented to a user.
  • the challenge is typically a question that the user must answer.
  • an intermediate decision is computed at stage 510 using the background model 540 and the user model 400 .
  • the intermediate decision generated at stage 510 is then passed to a module 520 that aggregates the intermediate results to form a cumulative result 525 by which a user accept/reject decision is made. If a final decision cannot be made, the module 520 produces a third state cumulative result (“undecided”), meaning that additional challenges need to be presented to the user.
  • the user and/or background model may be adapted at stage 515 to reflect the new user input.
  • a Log Likelihood Ratio (LLR) score is computed for each interaction turn (forming an intermediate result) and is summed over turns (to form a cumulative user accept/reject result 525 ).
  • the sum of the LLR scores can be compared to two thresholds to make a decision, for example, based on the Wald SPRT theory. See, for example, A. Wald, “Sequential Analysis,” (J. Wiley, 1947).
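One way to set the two thresholds, following Wald's SPRT, is from the overall false-accept rate α and false-reject rate β that the deployment is willing to tolerate; the α/β values below are assumed examples.

```python
import math

def sprt_thresholds(alpha, beta):
    """Wald SPRT thresholds for a cumulative log-likelihood ratio.
    alpha: tolerated overall false-accept rate; beta: tolerated overall
    false-reject rate (standard approximations from Wald, 1947)."""
    upper = math.log((1 - beta) / alpha)  # accept the claim if sum-LLR >= upper
    lower = math.log(beta / (1 - alpha))  # reject the claim if sum-LLR <= lower
    return lower, upper

# Assumed example: tolerate 1% false accepts and 5% false rejects overall.
low, high = sprt_thresholds(alpha=0.01, beta=0.05)
# Between low and high the test is undecided and another challenge is presented.
```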
  • the LLR score may be computed as follows:
  • p i is the false accept (FA) rate for the challenge (e.g., the probability of guessing the answer or having the answer compromised);
  • q i is the false rejection (FR) rate for the challenge (e.g., the probability of forgetting the answer or not knowing).
  • the observation for the entire dialog may be represented by a binary vector x where every bit in x is either 1 for a correct answer or 0 for an incorrect answer for a particular challenge.
  • LLR(N) = log[ P(x1, x2, . . . , xN | genuine) / P(x1, x2, . . . , xN | impostor) ] = Σ (i = 1 to N) log[ P(xi | genuine) / P(xi | impostor) ], assuming independent turns, where P(xi = 1 | genuine) = 1 − qi, P(xi = 0 | genuine) = qi, P(xi = 1 | impostor) = pi, and P(xi = 0 | impostor) = 1 − pi.
  • the value of the LLR therefore increases with the number of turns, and since it is a sum, the distribution of this sum becomes more Gaussian-like assuming independent turn based LLR estimates.
  • the knowledge biometric score may be the score returned from the speaker verification engine (which is an estimate of the biometric LLR), or a probabilistic interpretation of this score.
  • the thresholds are then applied to the combined score 525 .
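In terms of the p i and q i defined above, a correct answer at turn i contributes log((1 − q i )/p i ) to the cumulative LLR and an incorrect answer contributes log(q i /(1 − p i )), assuming independent turns. A minimal sketch with assumed example rates:

```python
import math

def turn_llr(correct, fa, fr):
    """LLR contribution of one interaction turn.
    fa = p_i (false accept rate), fr = q_i (false reject rate)."""
    if correct:
        return math.log((1 - fr) / fa)  # genuine users usually answer correctly
    return math.log(fr / (1 - fa))      # genuine users rarely answer incorrectly

# Binary observation vector x over a dialog (1 = correct, 0 = incorrect),
# with a (fa, fr) pair per challenge; the rates are assumed examples.
x = [1, 1, 0]
challenge_rates = [(0.001, 0.07), (0.01, 0.05), (0.1, 0.2)]
cumulative = sum(turn_llr(xi, fa, fr)
                 for xi, (fa, fr) in zip(x, challenge_rates))
```

Note that a correct answer to a hard challenge (low FA) yields a much larger positive contribution than a correct answer to an easily guessed one, which is exactly what the continuous score buys over a binary per-challenge decision.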
  • FIG. 6 is a flow chart describing an exemplary implementation of a sequential authentication process 600 incorporating features of the present invention.
  • the sequential authentication process 600 initially presents a challenge to the user during step 610 .
  • the user response is then compared to the user model 400 and/or background model 540 during step 620 to generate an intermediate authentication decision.
  • the cumulative authentication decision 525 is then updated during step 630 by adding the current intermediate authentication decision (e.g., an LLR value) to the sum of LLR values.
  • the user model 400 and/or background model 540 are updated, if necessary, during step 640 based on the user data.
  • the FA and FR values may be changed during step 640 by adapting to field data reflecting the measured FA and FR values.
  • the FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected.
  • a test is performed during step 650 to compare the cumulative authentication decision (sum of the LLR scores) 525 to the established thresholds.
  • the sum of the LLR scores can be compared to two thresholds to make a decision. If the cumulative LLR score exceeds the high threshold, the user is accepted (Branch A from step 650 ); if it falls below the low threshold, the user is rejected (Branch B from step 650 ); and if it lies between the two thresholds, the interaction continues (Branch C from step 650 ).
  • the methods and apparatus discussed herein may be distributed as an article of manufacture that itself comprises a computer readable medium having computer readable code means embodied thereon.
  • the computer readable program code means is operable, in conjunction with a computer system, to carry out all or some of the steps to perform the methods or create the apparatuses discussed herein.
  • the computer readable medium may be a recordable medium (e.g., floppy disks, hard drives, compact disks, or memory cards) or may be a transmission medium (e.g., a network comprising fiber-optics, the world-wide web, cables, or a wireless channel using time-division multiple access, code-division multiple access, or other radio-frequency channel). Any medium known or developed that can store information suitable for use with a computer system may be used.
  • the computer-readable code means is any mechanism for allowing a computer to read instructions and data, such as magnetic variations on a magnetic media or height variations on the surface of a compact disk.
  • the computer systems and servers described herein each contain a memory that will configure associated processors to implement the methods, steps, and functions disclosed herein.
  • the memories could be distributed or local and the processors could be distributed or singular.
  • the memories could be implemented as an electrical, magnetic or optical memory, or any combination of these or other types of storage devices.
  • the term “memory” should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by an associated processor. With this definition, information on a network is still within a memory because the associated processor can retrieve the information from the network.

Abstract

Methods and apparatus are provided for sequential authentication of a user that employ one or more error rates characterizing each security challenge. According to one aspect of the invention, a user is challenged with at least one knowledge challenge to obtain an intermediate authentication result; and the user challenges continue until a cumulative authentication result satisfies one or more criteria. The intermediate authentication result is based, for example, on one or more of false accept and false reject error probabilities for each knowledge challenge. A false accept error probability describes a probability of a different user answering the knowledge challenge correctly. A false reject error probability describes a probability of a genuine user not answering the knowledge challenge correctly. The false accept and false reject error probabilities can be adapted based on field data or known information about a given challenge.

Description

    FIELD OF THE INVENTION
  • The present invention is generally related to user authentication techniques and, more particularly, to techniques for providing sequential user authentication.
  • BACKGROUND OF THE INVENTION
  • Authenticating the identity claim of a user is an important step in ensuring the security of systems, networks, services and facilities, both for physical and for logical access. Existing user authentication is often performed on the basis of a user's knowledge of a single verification object, e.g., a password or a personal identification number (PIN) or on the basis of possession of a single verification object, e.g., a key or a card. Other existing authentication techniques include the use of a biometric feature as the verification object, e.g., a fingerprint, a voiceprint, an iris scan or a face scan.
  • Verification is typically done by comparing the verification object obtained from the user at the time of attempted access to previously stored objects. Biometric systems, for example, typically produce a similarity score measuring how close an input biometric is to a reference biometric template. A threshold is then applied to the score to make a binary decision about whether to accept or reject a given user. Possession-based user authentication systems make a binary accept/reject decision based on the presence of a physical device (e.g., a key) or a virtual device (e.g., a digital certificate). For knowledge verification, a single challenge will result in a binary decision based on the correctness of the user's response.
  • When multiple challenges are presented to the user for the purpose of authentication, user authentication is said to be sequential. Sequential user authentication may be accomplished by using a sequence of authentication challenges from the same mode (e.g., presenting only knowledge verification questions), or using multiple verification modes (e.g., presenting both random knowledge challenges and asking for one or more physical keys). Sequential authentication based on biometrics may be possible, depending on the type of biometric. For example, fingerprints are consistent, and sequential challenges would not be beneficial since each challenge would capture the same fingerprint. The human voice, however, does change, and therefore sequential voice biometrics (“speaker recognition”) is beneficial.
  • When sequential user authentication is utilized, the set of rules or algorithms for making a binary decision to accept or reject the user may be more complicated than a simple threshold, since the results from individual interaction turns (challenges) may be contradicting. A policy is the set of rules that specify, at each turn, whether to accept the user, reject the user, or present the user with a new challenge.
  • A number of techniques have been proposed or suggested for combining speaker recognition and knowledge verification using conversational biometrics with a policy that governs the user interaction based on both the measured biometric (speaker recognition) and knowledge responses. For example, U.S. patent application Ser. No. 10/283,729, filed Oct. 30, 2002, entitled “Methods and Apparatus for Dynamic User Authentication Using Customizable Context-Dependent Interaction Across Multiple Verification Objects,” assigned to the assignee of the present invention and incorporated by reference herein, discloses an authentication framework that enables a dynamic user authentication that combines multiple authentication objects using a shared context and that permits customizable interaction design to suit varying user preferences and transaction/application requirements. See also, U.S. Pat. No. 6,529,871, entitled “A Way to Identify Using Both Voice Authentication and Personal Queries,” assigned to the assignee of the present invention and incorporated by reference herein.
  • While such conversational biometric techniques provide improved authentication frameworks with a high degree of flexibility, accuracy, convenience and robustness, they suffer from a number of limitations, which if overcome, could further improve the efficiency and security of such user authentication techniques. In particular, the above-described techniques for conversational biometrics yield a binary decision for each challenge.
  • A need therefore exists for methods and apparatus for conversational biometrics that yield a continuous-value score for each challenge. The continuous-value score allows for fusing of multiple biometric systems prior to setting a security level. A further need exists for methods and apparatus that measure similarity scores from knowledge verification systems. Yet another need exists for methods and apparatus that manage a sequential authentication system based on measured knowledge scores.
  • SUMMARY OF THE INVENTION
  • Generally, methods and apparatus are provided for sequential authentication of a user that employ one or more error rates characterizing each security challenge. According to one aspect of the invention, a user is challenged with at least one knowledge challenge to obtain an intermediate authentication result; and the user challenges continue until a cumulative authentication result satisfies one or more criteria. The intermediate authentication result is based, for example, on log likelihood ratio and the cumulative authentication result is, for example, a sum of individual log likelihood ratios.
  • The intermediate authentication result is based, for example, on one or more of false accept and false reject error probabilities for each knowledge challenge. A false accept error probability describes a probability of a different user answering the knowledge challenge correctly. A false reject error probability describes a probability of a genuine user not answering the knowledge challenge correctly.
  • According to another aspect of the invention, one or more of the false accept and false reject error probabilities are adapted based on field data or known information about a given challenge. For example, the FA and FR values may be changed by adapting to field data reflecting the measured FA and FR values. The FA and FR values may also be changed to reflect expected security breaches. The continuous scores provided by the present invention allow easier adaptation, as they imply a statistical model that has parameters, such as FA and FR. In addition, the continuous scores provided by the present invention allow one or more of the intermediate authentication results and the cumulative authentication result to be combined with a result from another verification method (such as biometric or possession based authentication).
  • A more complete understanding of the present invention, as well as further features and advantages of the present invention, will be obtained by reference to the following detailed description and drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating a client-server architecture of an authentication system for implementing sequential authentication in accordance with the present invention;
  • FIG. 2 is a block diagram illustrating an exemplary computing system environment for implementing sequential authentication in accordance with the present invention;
  • FIG. 3 is a diagram illustrating an exemplary specification of multiple verification objects, according to one embodiment of the invention;
  • FIG. 4 is a diagram illustrating an exemplary specification of user models including multiple verification objects, according to one embodiment of the invention;
  • FIG. 5 is a block diagram illustrating a sequential authentication system incorporating features of the present invention; and
  • FIG. 6 is a flow chart describing an exemplary implementation of a sequential authentication process incorporating features of the present invention.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The present invention provides a sequential authentication system. The disclosed sequential authentication system is based on knowledge verification for the purpose of measuring a similarity score for every interaction turn. The disclosed sequential authentication system continuously estimates the probability that the user's identity claim is genuine and the probability that the user is not who he or she claims to be.
  • During a user authentication, a series of challenges is presented to the user, and each user response is compared to one or more models, resulting in an intermediate authentication decision (such as a log likelihood ratio (LLR)). At each interaction turn, the intermediate decisions from the individual turns are combined (such as a sum of LLRs) to create a cumulative authentication result to ultimately either accept or reject the user's identity claim. The models used for the sequential authentication process may also be adapted from the user data during authentication, and may also be used in conjunction with voice biometric models performing speaker recognition (or another modality) to complete the user authentication task.
  • According to one aspect of the invention, the user or background models, or both, comprise False Accept (FA) and False Reject (FR) error rates that characterize each security challenge. The FA describes the probability of a different user answering correctly to the security challenge, and the FR describes the probability of the genuine user not answering correctly. The FA and FR assigned to each challenge may be only in the background model, thus assuming that all users have the same FA and FR, or in addition user specific FA and FR may be assigned to each challenge and stored in the user model. As discussed below in conjunction with FIG. 4, the user model also includes the correct responses to the security challenges.
  • According to a further aspect of the invention, the FA and FR values may be changed by adapting to field data reflecting the measured FA and FR values. The FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected. The continuous scores provided by the present invention allow easier adaptation, as they imply a statistical model that has parameters, such as FA and FR.
  • The invention is illustrated using an exemplary client-server system architecture. It should be understood, however, that the invention is not limited to use with any particular system architecture. The invention is instead more generally applicable to any system architecture in which it is desirable to provide an authentication framework that provides a high degree of flexibility, accuracy, convenience and/or robustness. That is, the techniques of the present invention may be implemented on a single computer system or on multiple computer systems coupled by a suitable network, examples of which will be described below.
  • In one embodiment, the interaction design is based on authentication policies implemented as a statistical state machine using XML (eXtensible Markup Language). In addition, there is a file that specifies the relevant authentication objects (e.g., questions to be asked or actions to be performed) and files that contain user profiles (e.g., user-selected authentication objects and correct responses and/or user preferences), both of which may also be implemented using XML.
  • The entire authentication interaction is determined dynamically based on the authentication policy in effect (selected based on user preferences and transaction or application requirements), using operations on the shared context, further utilizing the authentication objects in effect and the user profile of interest.
  • Such an approach provides significantly improved authentication capabilities as compared with existing authentication systems, and ensures a very high degree of accuracy, flexibility, convenience and robustness.
  • Furthermore, as will be illustratively explained in detail below, the authentication techniques of the present invention utilize the following components: (1) verification objects and verification engines; (2) verification policies and a verification policy manager; and (3) user models.
  • Verification objects are objects that can be used for the purpose of verifying the identity of users, such as the user's biometric characteristics (e.g., voiceprint, fingerprint, face scan, iris scan, handwritten signature and/or keyboard dynamics), the user's knowledge (e.g., passwords, passphrases, and/or answers to personal questions), and the user's possessions (e.g., keys, cards, tokens, certificates, cellular telephone or home telephone transmitting caller-id information, personal or handheld computer with client software and/or user's location). It is to be understood that the lists of example objects above are not intended to be exhaustive and, further, that the invention is not intended to be limited to any particular objects.
  • Verification engines are used to match the verification objects with the representation stored in a user model. Examples of verification engines include a fingerprint recognition system to match the user's fingerprint, a conversational system to evaluate spoken answers to questions such as a voice response system, a conversational system such as a speech or voiceprint recognition system (that may include natural understanding techniques) to extract and recognize a user's spoken utterances (wherein the conversational system may also include a speech synthesis system for generating synthesized questions and prompts), a caller-id recognition system to extract and match the user's telephone number, a badge reader to scan the user's badge or card, a PIN confirmation system to confirm a user's PIN, a face recognition system to extract and match a user's face scan, an iris recognition system to extract and match a user's iris scan, a handwriting recognition system to recognize a user's handwriting, a keyboard dynamic recognizer to match a user's keyboard dynamics, as well as other modality-specific engines discussed herein and/or may otherwise be known. It is to be understood that since these types of engine are well-known, further descriptions of details of such engines are not necessary and therefore are not provided herein. Again, it is to be understood that the list of example engines above is not intended to be exhaustive and, further, that the invention is not intended to be limited to any particular verification engines.
  • While verification engines typically perform user verification by comparing user input to the user's model that was created when the user enrolled, the invention is not restricted to verification engines that require user enrollment. Unsupervised verification engines, which do not require the user to enroll, may also be used. When unsupervised verification engines are used, a single user model may be employed, including the user attributes as measured by the verification engines. For example, the following verification engines can be used: acoustic accent recognition, language identification, and face features detection (e.g., color of eyes, glasses detection). In this case, none of the individual verification engines require user enrollment, and one user model is used, stating the user's spoken accent, language, eye color, and whether he or she wears glasses.
  • Thus, the invention realizes that, while individual verification engines can be used to perform simple verification steps that operate in a predefined static manner, a more general framework is necessary when multiple verification objects are used to perform dynamic user authentication, in order to achieve a greater degree of accuracy and flexibility. The present invention provides such an improved authentication framework.
  • To accomplish this and other goals, the present invention employs verification policies that govern the interaction between the user and the overall system, including the authentication system, and between the various verification engines. Any number of verification policies could be written to satisfy a wide variety of user-specific, transaction-specific or application-specific authentication needs, including needs that change in real-time.
  • As will be seen, such verification policies are managed by a verification policy manager which uses operations on a common context shared across all verification objects to achieve maximum programmability of the authentication system.
  • User models are typically created when a user enrolls in the system, using the inputs provided by the user (e.g., samples of voice, samples of fingerprint, and/or answers to personal questions), or acquired through other means (such as details of past transactions, balance in most recent bill, serial number of a key or badge issued, and/or encryption key contained in a smartcard or a client software).
  • The user models may be updated in real-time when needed, such as when a new bill is issued and the balance changes or when more voice samples are available. An individual user model contains information regarding all verification objects relevant to that user, including any user preferences related to the verification objects (e.g., a user may prefer questions regarding colors rather than numbers). User models also preferably support nontrivial manipulations of the verification objects, such as asking the user to add the first and third digits of his social security number. Again, any of the above-mentioned examples are not intended to limit the invention.
  • Given the above general description of some of the principles and features of the present invention, illustrative embodiments of these principles and features will now be given in the context of the figures.
  • Referring initially to FIG. 1, a block diagram illustrates a client-server architecture of an authentication system for implementing sequential authentication, according to one embodiment of the invention. As shown, the authentication system 100 comprises a verification client device 102 and a verification server 104, coupled via a network adapter 106. The verification client 102 has context 108 and application 110 associated therewith. The verification server 104 comprises a verification policy manager 112 and a plurality of verification engines 114-1 through 114-N, where N can be any integer 2, 3, 4 . . . , and represents the number of verification object families or types that the particular implementation of the invention can support. The authentication system 100 further comprises a data manager 116, a verification objects store 118, a verification policies store 120 and a user models store 122. While the data manager 116 and data stores 118, 120 and 122 are shown outside of the verification server box, it is to be understood that they may be implemented on the verification server.
  • The verification client device 102 is responsible for interfacing with the user and collecting the inputs from the user, communicating with the verification server 104 through the network adapter 106, and communicating with the application 110. In one embodiment of the invention, the verification client device 102 is also responsible for acquiring and maintaining the context 108.
  • In an alternative embodiment, the context 108 may be stored on a central database (not shown), accessible by other components of the system 100. Such an implementation allows for a stateless operation between the verification client device 102 and the verification server 104, such that different servers could be used for different turns in the verification process, thereby providing protection against a particular server going down in the middle of a verification process, and also allowing for improved load balancing of the server resources.
  • The context 108 records all relevant variables for the verification process, such as: (1) the user name; (2) the current state in the verification policy that is in effect; (3) the history pertaining to the verification objects that have been invoked and the scores and outcomes associated with the invocations; (4) transaction-specific requirements (e.g., desired level of accuracy or nature of the transaction); (5) user-specific requirements (e.g., a user having a cold may prefer not to rely on voiceprint match); and (6) other physical and logical variables (e.g., type of network connection—remote or local, or quality of a voice channel).
  • The context 108 may also record other variables that represent verification scores from external verification sources (not shown). For example, a customer entering a bank may have done so after swiping his bank card at the entrance, and that information could be included in the context 108 as an external score and be used for subsequent authentication processes at the counter or at the automated teller machine.
  • The variables initially included in the context 108 are system default variables relevant to the verification objects and other known requirements at the time of the initial build. However, as additional verification objects are added to the system 100 or as new requirements are discovered, user-defined variables may be added to the context 108.
  • The network adapter 106 enables communication between the client device 102 and the verification server 104. The network adapter 106 implements network transport protocols, such as the standard Transmission Control Protocol (TCP)/Internet Protocol (IP) or the Secure Sockets Layer (SSL) protocol. It is to be understood that in an embodiment where the authentication system 100 is implemented on a single computer system, a network adapter is not required.
  • As shown, the verification server 104 comprises a verification policy manager 112 and a set of verification engines 114-1 through 114-N. Each verification engine operates on a given verification object or a family (type) of verification objects. For example, a fingerprint verification engine may operate on a particular fingerprint or different types of fingerprints (e.g., thumbprint or index-fingerprint). Similarly, a knowledge verification engine may operate on different types of challenge-response questions.
  • The flexible architecture allows for easy addition of new verification engines and verification objects. Verification engines to be added could be of a new type or an existing type. For example, a face recognition engine could be added to a verification server that previously comprised voiceprint and fingerprint recognition engines, or a second voiceprint recognition engine (which could be from a different manufacturer, for example) could be added. Similarly, new verification objects could be added to new verification engines or existing verification engines (such as adding a new question to an existing knowledge verification engine).
  • The verification policy manager 112 interprets a verification policy for a given user model, and drives the entire authentication process. The policy manager 112 receives the current context 108 from the verification client device 102, operates on the context, incorporates updated status of current verification objects, and returns an updated context to the verification client device 102 along with the specification of the next step to be taken during the verification process.
  • The verification policy manager 112 can optionally be responsible for invoking states in a finite state machine, interpreting the conditions of the state machine and branching to the next state. The verification policy manager 112 is the entity that makes the final accept or reject decision for the authentication process, and in some cases may also make intermediate decisions if the current transaction requires such decisions, provided the verification policy in effect permits it.
  • The data manager 116 controls the external storage resources, including verification objects store 118, verification policies store 120 and user models store 122. These resources may be accessed directly by the verification server 104 (either by the verification policy manager 112 or by the individual verification engines 114-1 through 114-N). In an alternative embodiment, such resources may be accessed by the verification client device 102 and shipped to the verification server 104 through the network adapter 106.
  • The application 110 is the application for which user authentication is required prior to granting access. Example applications include banking applications, travel applications and e-mail applications. The application 110 is responsible for providing application-specific and transaction-specific information and requirements. It is to be understood that the invention is not limited to any particular application.
  • In one embodiment of the invention, the verification client device 102 communicates with the verification server 104 using an XML message interface.
  • Further, in alternative embodiments, it is to be understood that the components associated with the verification server may themselves communicate with one another over the network adapter 106. Thus, for example, one or more of the verification engines 114 may communicate with the verification policy manager 112 over the network adapter 106. A similar distributed arrangement may exist with respect to the verification policy manager 112 and the data manager 116, and with the data manager 116 and the data stores 118, 120 and 122. Thus, it is to be understood that the interconnectivity of components shown in FIG. 1 is intended to be illustrative and, therefore, other suitable interconnections may be implemented to provide the authentication functionality of the present invention.
  • Referring now to FIG. 2, a block diagram illustrates an exemplary computing system environment for implementing sequential authentication, according to one embodiment of the invention. By way of example, the computing system 200 may represent at least a portion of a distributed computing system wherein a user communicates via a computer system 202 (referred to illustratively as a “client” or client device) with another computer system 204 (referred to illustratively as a “server”) via a network 206. The network may be any suitable network across which the computer systems can communicate, e.g., the Internet or World Wide Web, or a local area network. However, the invention is not limited to any particular type of network. In fact, it is to be understood that the computer systems may be directly linked without a network.
  • Further, while only two computer systems are shown for the sake of simplicity in FIG. 2, it is to be understood that the network may link a plurality of client devices and a plurality of servers. However, it is also to be appreciated that the techniques of the invention may be implemented on a single computer system wherein, for example, the user interacts directly with the computer system that performs the authentication operations.
  • With reference to FIG. 1, it is to be understood that the client device 102 may be implemented via computer system 202, and that the verification server 104 (and its components), the data manager 116 and the respective object, policy and user model stores (118, 120 and 122) may be implemented via the computer system 204. Network adapter 106 would therefore be implemented in accordance with network 206.
  • Thus, it is to be understood that FIG. 2 generally illustrates an exemplary architecture for each computer system communicating over the network. As shown, the computer system 202 comprises a processor 208-A, memory 210-A and I/O devices 212-A, all coupled via a computer bus 214-A. Similarly, the computer system 204 comprises a processor 208-B, memory 210-B and I/O devices 212-B, all coupled via a computer bus 214-B.
  • It should be understood that the term “processor” as used herein is intended to include one or more processing devices, including a central processing unit (CPU) or other processing circuitry. Also, the term “memory” as used herein is intended to include memory associated with a processor or CPU, such as RAM, ROM, a fixed, persistent memory device (e.g., hard drive), or a removable, persistent memory device (e.g., diskette or CD-ROM). In addition, the term “I/O devices” as used herein is intended to include one or more input devices (e.g., a keyboard or mouse) for inputting data to the processing unit, as well as one or more output devices (e.g., a display) for providing results associated with the processing unit. Further, the I/O devices associated with the computer system 202 are understood to include those devices necessary to collect the particular data associated with the verification objects supported by the authentication system, e.g., a microphone to capture voice data for voiceprint recognition and/or answers to questions posed, a speaker to output such questions to the user, a face scanner, an iris scanner, and/or a fingerprint scanner.
  • It is also to be understood that the client computer system illustrated in FIG. 2 may comprise a computer system programmed to implement the inventive techniques such as a personal computer, a personal digital assistant, or a cellular phone. Likewise, the server computer system illustrated in FIG. 2 may comprise a computer system programmed to implement the inventive techniques such as a personal computer, a microcomputer, or a minicomputer. However, the invention is not limited to any particular computer architecture.
  • Accordingly, software instructions or code for performing the methodologies of the invention, as described herein, may be stored in one or more of the associated memory devices, e.g., ROM, fixed or removable memory, and, when ready to be utilized, loaded into RAM and executed by the CPU.
  • Referring now to FIG. 3, an example is shown of a registry of verification objects. In this particular embodiment, the registry 300 is represented using XML and stored in the verification objects store 118 (FIG. 1).
  • The specification contains a description of all registered verification objects, which can be updated as new verification objects are added. The first object (302) in this example is the Date-of-Birth (DOB) object, which is of the type Question-Answer (QA), and the verification engine responsible for operating on this object is the knowledge verification engine. A suggested prompt may also be included to prompt the user for the required response when this object is invoked, but the prompt may be modified or replaced by the verification client, if necessary. The “perplexity” is a quantity that represents the difficulty associated with the verification object and may optionally be used by the verification policy manager in making verification decisions.
  • The second object (304) in this example is Caller-ID, which, in the case of a telephony connection, attempts to match the telephone number of the telephone originating the call with the telephone number in the relevant user model. No prompt is specified since this information may be obtained automatically from telephony infrastructure without any explicit input from the user.
  • The third object (306) in this example is the Voiceprint object, and in this case no type is specified, since the voiceprint verification engine operates on one type of verification object. Given that voiceprints are a biometric feature that may not be stolen, a high perplexity is specified in this example.
  • The fourth and fifth objects (308 and 310) illustrate the hierarchical nature of the specification, whereby the CAR_COLOR object inherits default properties from the parent object COLOR.
  • The last two objects (312 and 314) in this example are examples of dynamic verification objects, whereby the intended response changes dynamically, and in this example, the correct responses are obtained from the application, rather than from the user model. The current balance (CUR_BALANCE) object (312) is an application-specific object of the type numeric (APP_NUM) and the last transaction date (LAST_TRANSACTION_DATE) object (314) is an application-specific object of the type string.
  • Referring now to FIG. 4, an example is shown of a user model. In this particular embodiment, the user model 400 is represented using XML and stored in the user models store 122 (FIG. 1).
  • The user model contains a description of verification objects for which the user has provided enrollment data. The first object (402) is the Caller-ID object, for which this user's correct response is 914-945-3000 in this example. The user's preference for this object may be optionally included and used by the verification policy in selecting objects with higher preference when possible.
  • The second and third objects (DOB 404 and COLOR 406) are similar. The fourth object (color of car, or CAR_COLOR 408) has two responses in this example, since this user has two cars and either response may be accepted as the correct answer. The fifth object (410) is the voiceprint object, for which model parameters are needed; these may be stored in a file, in which case the filename is included. The last two objects (CUR_BALANCE 412 and LAST_TRANSACTION_DATE 414) do not have any correct responses included because they are dynamic verification objects, and the current correct responses have to be obtained from the application.
  • As mentioned above, in accordance with the present invention, any of the objects can be updated or deleted in real-time, and new objects can be added in real-time.
  • As shown in FIG. 4, a user model in accordance with the present invention comprises False Accept (FA) and False Reject (FR) error rates that characterize each security challenge. The FA rate describes the probability of a different user answering the security challenge correctly, and the FR rate describes the probability of the genuine user failing to answer it correctly. The FA and FR rates assigned to each challenge may reside only in the background model, in which case all users are assumed to share the same FA and FR rates; alternatively, user-specific FA and FR values may additionally be assigned to each challenge. For example, a particular challenge may have an FA value of 0.001 and an FR value of 0.07.
  • A higher FA rate in the user model 400 may reflect, for example, that the user easily gives out the answer to the challenge. Likewise, a higher FA rate in the background model 540 for a social security number challenge may reflect, for example, that a repository of social security numbers has been stolen.
  • A higher FR rate in the user model 400 may reflect, for example, that the user often forgets the answer to a particular challenge. Likewise, a higher FR rate in the background model 540 for a particular challenge may reflect, for example, that a number of users tend to forget the answer to the challenge.
  • The user model also includes the correct responses to the security challenges, as shown in FIG. 4. The FA and FR values may be changed by adapting to field data reflecting the measured FA and FR in practice. For example, if a number of users tend to forget the answer to the challenge, the FR value should be increased. The FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected.
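The adaptation of FA and FR values to field data described above can be pictured as a smoothed update of the stored rate toward the measured rate. The following Python sketch is an illustrative assumption; the function name, smoothing weight, and update rule are not specified in the patent:

```python
def adapt_rate(stored_rate, errors, trials, weight=0.1):
    """Blend a stored FA or FR rate toward the rate measured in field data.

    stored_rate: current FA or FR value for the challenge
    errors:      observed errors (false accepts or false rejects)
    trials:      total observed attempts of the relevant kind
    weight:      how strongly to trust the new measurement (0..1)
    """
    if trials == 0:
        return stored_rate  # no field data yet; keep the stored value
    measured = errors / trials
    return (1 - weight) * stored_rate + weight * measured

# Many genuine users forget the answer, so the FR is pushed upward:
fr = adapt_rate(0.07, errors=30, trials=100, weight=0.5)  # 0.5*0.07 + 0.5*0.30 = 0.185
```

A security breach (e.g., a stolen repository of answers) could be handled the same way, by overriding the stored FA with a larger value directly.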
  • FIG. 5 is a block diagram illustrating a sequential authentication system 500 incorporating features of the present invention. As shown in FIG. 5, the sequential authentication system 500 employs a user model 400 to describe the expected behavior of each user. In addition, the sequential authentication system 500 optionally employs a background model 540 that describes the expected behavior of the general population of users. The background model 540 may be, for example, a collection of individual user models 400.
  • The user model 400 and background model 540 may be created using any known technique, as would be apparent to a person of ordinary skill. For example, the user model 400 and background model 540 may be created using statistical generative models, such as Gaussian Mixture Models (GMM) and Hidden Markov Models (HMM). In addition, discriminative models, such as Artificial Neural Networks (ANN) and Support Vector Machines (SVM) may also be used. It is noted that while the user model 400 includes an FR rate for each challenge, and optionally an FA rate, the background model 540 includes an FA rate for each challenge, and optionally an FR rate.
  • As previously indicated, at each interaction turn, a security challenge is presented to a user. The challenge is typically a question that the user must answer. Upon completion of each turn, an intermediate decision is computed at stage 510 using the background model 540 and user authentication model 400. The intermediate decision generated at stage 510 is then passed to a module 520 that aggregates the intermediate results to form a cumulative result 525 by which a user accept/reject decision is made. If a final decision cannot be made, the module 520 produces a third state cumulative result (“undecided”), meaning that additional challenges need to be presented to the user. Optionally, the user and/or background model may be adapted at stage 515 to reflect the new user input.
  • In one embodiment, a Log Likelihood Ratio (LLR) score is computed for each interaction turn (forming an intermediate result) and is summed over turns (to form a cumulative user accept/reject result 525). The sum of the LLR scores can be compared to two thresholds to make a decision, for example, based on the Wald SPRT theory. See, for example, A. Wald, “Sequential Analysis,” (J. Wiley, 1947).
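Under Wald's sequential probability ratio test, the two thresholds follow from the desired overall error probabilities. A brief Python sketch, assuming alpha denotes the target overall false-accept probability and beta the target overall false-reject probability (the function name is illustrative):

```python
import math

def sprt_thresholds(alpha, beta):
    """Wald SPRT log-thresholds for the cumulative LLR.

    alpha: target overall false-accept probability
    beta:  target overall false-reject probability
    Returns (low, high): reject below low, accept above high,
    continue challenging in between.
    """
    high = math.log((1 - beta) / alpha)
    low = math.log(beta / (1 - alpha))
    return low, high

low, high = sprt_thresholds(alpha=0.01, beta=0.05)
```

With alpha = 0.01 and beta = 0.05, for example, this yields low ≈ −2.99 and high ≈ 4.55.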
  • If the LLR score exceeds the high threshold, then the user is accepted. If the cumulative LLR score falls below the low threshold, the user is rejected, and if the LLR score lies between the two thresholds, the interaction continues. Assuming a global FA and FR for each challenge (stored in the background model 540), the LLR scores may be computed as follows:
  • For a given challenge i, two parameters are defined that characterize the challenge:
  • $p_i$ is the false accept (FA) rate for the challenge (e.g., the probability of guessing the answer or having the answer compromised); and
  • $q_i$ is the false rejection (FR) rate for the challenge (e.g., the probability of forgetting the answer or not knowing it).
  • The observation for the entire dialog may be represented by a binary vector x where every bit in x is either 1 for a correct answer or 0 for an incorrect answer for a particular challenge.
  • Defining λ to be the case where the speaker claim is true, and $\bar{\lambda}$ the complementary case where the speaker is attempting to break into another account (i.e., an “imposter”), the following probabilities can be computed for turn number j:

$$P(x_j \mid \lambda) = \begin{cases} q_i, & x_j = 0 \\ 1 - q_i, & x_j = 1 \end{cases} \qquad P(x_j \mid \bar{\lambda}) = \begin{cases} 1 - p_i, & x_j = 0 \\ p_i, & x_j = 1 \end{cases}$$
    and the LLR for each turn is:

$$\mathrm{LLR}(j) = \log\left(\frac{P(x_j \mid \lambda)}{P(x_j \mid \bar{\lambda})}\right)$$
  • Now, assuming that the turns are independent (which could mean in practice that multi-field turns such as date-of-birth should be treated like a single turn), then after turn number N, the Log Likelihood Ratio (LLR) is calculated as:

$$\mathrm{LLR}(N) = \log\left[\frac{P(x_1, x_2, \ldots, x_N \mid \lambda)}{P(x_1, x_2, \ldots, x_N \mid \bar{\lambda})}\right] = \sum_{j=1}^{N} \log\left[\frac{P(x_j \mid \lambda)}{P(x_j \mid \bar{\lambda})}\right] = \sum_{j=1}^{N} \mathrm{LLR}(j)$$
  • The value of the LLR therefore increases with the number of turns, and since it is a sum, its distribution becomes more Gaussian-like, assuming independent turn-based LLR estimates.
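The per-turn and cumulative LLR formulas above translate directly into code. A Python sketch, where the representation of a turn as a (correct, p, q) triple is an illustrative assumption:

```python
import math

def turn_llr(correct, p, q):
    """LLR for one turn of a challenge with FA rate p and FR rate q.

    correct: True if the answer matched (x_j = 1), False otherwise (x_j = 0).
    """
    if correct:
        return math.log((1 - q) / p)  # P(x=1 | genuine) / P(x=1 | imposter)
    return math.log(q / (1 - p))      # P(x=0 | genuine) / P(x=0 | imposter)

def cumulative_llr(turns):
    """Sum of per-turn LLRs, assuming independent turns.

    turns: iterable of (correct, p, q) triples, one per challenge.
    """
    return sum(turn_llr(c, p, q) for c, p, q in turns)

# Two correct answers and one forgotten answer, each with FA=0.001, FR=0.07:
score = cumulative_llr([(True, 0.001, 0.07), (True, 0.001, 0.07), (False, 0.001, 0.07)])
```

A correct answer contributes a large positive term (it is unlikely an imposter guessed it), while a wrong answer contributes a smaller negative term (genuine users do sometimes forget).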
  • A straightforward combination of a biometric and knowledge-based system would be to add or average the two scores. The biometric score may be the score returned from the speaker verification engine (which is an estimate of the biometric LLR), or a probabilistic interpretation of this score. The thresholds are then applied to the combined score 525.
  • FIG. 6 is a flow chart describing an exemplary implementation of a sequential authentication process 600 incorporating features of the present invention. As shown in FIG. 6, the sequential authentication process 600 initially presents a challenge to the user during step 610. The user response is then compared to the user model 400 and/or background model 540 during step 620 to generate an intermediate authentication decision. The Log Likelihood Ratio (LLR) score may be computed for each interaction turn as:

$$\mathrm{LLR}(j) = \log\left(\frac{P(x_j \mid \lambda)}{P(x_j \mid \bar{\lambda})}\right)$$
  • The cumulative authentication decision 525 is then updated during step 630 by adding the current intermediate authentication decision (e.g., an LLR value) to the sum of LLR values. In addition, the user model 400 and/or background model 540 are updated, if necessary, during step 640 based on the user data. For example, the FA and FR values may be changed during step 640 by adapting to field data reflecting the measured FA and FR values. The FA and FR values may also be changed to reflect expected security breaches. For example, in the case where a repository of social security numbers is stolen, the FA assigned to the social security number challenge will be updated to be higher than typically expected.
  • Finally, a test is performed during step 650 to compare the cumulative authentication decision (the sum of the LLR scores) 525 against the established thresholds. In one exemplary implementation, the sum of the LLR scores is compared to two thresholds to make a decision. If the LLR score exceeds the high threshold, then the user is accepted (Branch A from step 650). If the cumulative LLR score falls below the low threshold, the user is rejected (Branch B from step 650), and if the LLR score lies between the two thresholds, the interaction continues (Branch C from step 650).
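Steps 610 through 650 can be sketched end to end as a single loop: challenge, score the response, accumulate the LLR, and compare against both thresholds. In this illustrative Python sketch, the challenge list, default thresholds, and answer-checking callback are assumptions, not part of the patent:

```python
import math

def sequential_authenticate(challenges, check_answer, low=-3.0, high=4.5):
    """Run challenges until the cumulative LLR crosses a threshold.

    challenges:   list of (challenge_id, p, q) with per-challenge FA/FR rates
    check_answer: callback returning True if the user answers correctly
    Returns "accept", "reject", or "undecided" if challenges run out first.
    """
    llr = 0.0
    for challenge_id, p, q in challenges:
        correct = check_answer(challenge_id)          # step 610/620: challenge and score
        if correct:
            llr += math.log((1 - q) / p)
        else:
            llr += math.log(q / (1 - p))
        if llr >= high:                               # step 650, Branch A
            return "accept"
        if llr <= low:                                # step 650, Branch B
            return "reject"
    return "undecided"  # third-state cumulative result: more challenges needed

# A genuine user answering a hard-to-guess challenge correctly is accepted quickly:
result = sequential_authenticate(
    [("DOB", 0.001, 0.07), ("CAR_COLOR", 0.1, 0.05)],
    check_answer=lambda cid: True)  # → "accept"
```

With the example rates above, a single correct answer already contributes log(0.93/0.001) ≈ 6.8, which exceeds the assumed high threshold, so the loop terminates after one turn.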
  • System and Article of Manufacture Details
  • As is known in the art, the methods and apparatus discussed herein may be distributed as an article of manufacture that itself comprises a computer readable medium having computer readable code means embodied thereon. The computer readable program code means is operable, in conjunction with a computer system, to carry out all or some of the steps to perform the methods or create the apparatuses discussed herein. The computer readable medium may be a recordable medium (e.g., floppy disks, hard drives, compact disks, or memory cards) or may be a transmission medium (e.g., a network comprising fiber-optics, the world-wide web, cables, or a wireless channel using time-division multiple access, code-division multiple access, or other radio-frequency channel). Any medium known or developed that can store information suitable for use with a computer system may be used. The computer-readable code means is any mechanism for allowing a computer to read instructions and data, such as magnetic variations on a magnetic media or height variations on the surface of a compact disk.
  • The computer systems and servers described herein each contain a memory that will configure associated processors to implement the methods, steps, and functions disclosed herein. The memories could be distributed or local and the processors could be distributed or singular. The memories could be implemented as an electrical, magnetic or optical memory, or any combination of these or other types of storage devices. Moreover, the term “memory” should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by an associated processor. With this definition, information on a network is still within a memory because the associated processor can retrieve the information from the network.
  • It is to be understood that the embodiments and variations shown and described herein are merely illustrative of the principles of this invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.

Claims (20)

1. A method for sequential authentication of a user, comprising:
challenging said user with at least one knowledge challenge to obtain an intermediate authentication result, wherein said intermediate authentication result is based on one or more of false accept and false reject error probabilities for each knowledge challenge; and
repeating said step of challenging said user with at least one knowledge challenge until a cumulative authentication result satisfies one or more criteria.
2. The method of claim 1, wherein said false accept error probability describes a probability of a different user answering said knowledge challenge correctly.
3. The method of claim 1, wherein said false reject error probability describes a probability of a genuine user not answering said knowledge challenge correctly.
4. The method of claim 1, wherein one or more of said false accept and false reject error probabilities are defined for a population of users.
5. The method of claim 1, wherein one or more of said false accept and false reject error probabilities are defined for said user.
6. The method of claim 1, wherein one or more of said false accept and false reject error probabilities are adapted based on field data.
7. The method of claim 1, wherein one or more of said false accept and false reject error probabilities are adapted based on known information about a given challenge.
8. The method of claim 1, wherein said intermediate authentication result is a continuous score.
9. The method of claim 1, wherein said intermediate authentication result is based on log likelihood ratio.
10. The method of claim 1, wherein said cumulative authentication result is a sum of individual log likelihood ratios.
11. The method of claim 1, wherein one or more of said intermediate authentication results and said cumulative authentication result are combined with a result from a biometric verification method.
12. The method of claim 1, wherein one or more of said intermediate authentication results and said cumulative authentication result are combined with a result from a speaker verification method.
13. A system for sequential authentication of a user, the system comprising:
a memory; and
at least one processor, coupled to the memory, operative to:
challenge said user with at least one knowledge challenge to obtain an intermediate authentication result, wherein said intermediate authentication result is based on one or more of false accept and false reject error probabilities for each knowledge challenge; and
repeat said challenge until a cumulative authentication result satisfies one or more criteria.
14. The system of claim 13, wherein said false accept error probability describes a probability of a different user answering said knowledge challenge correctly and said false reject error probability describes a probability of a genuine user not answering said knowledge challenge correctly.
15. The system of claim 13, wherein one or more of said false accept and false reject error probabilities are defined for a population of users.
16. The system of claim 13, wherein one or more of said false accept and false reject error probabilities are defined for said user.
17. The system of claim 13, wherein one or more of said false accept and false reject error probabilities are adapted based on field data or known information about a given challenge.
18. The system of claim 13, wherein said intermediate authentication result is based on log likelihood ratio and said cumulative authentication result is a sum of individual log likelihood ratios.
19. The system of claim 13, wherein one or more of said intermediate authentication results and said cumulative authentication result are combined with one or more of a result from a biometric verification method and a result from a speaker verification method.
20. An article of manufacture for sequential authentication of a user, comprising a machine readable medium containing one or more programs which when executed implement the steps of:
challenging said user with at least one knowledge challenge to obtain an intermediate authentication result; and
repeating said step of challenging said user with at least one knowledge challenge until a cumulative authentication result satisfies one or more criteria.
US11/159,722 2005-06-23 2005-06-23 Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge Abandoned US20060294390A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US11/159,722 US20060294390A1 (en) 2005-06-23 2005-06-23 Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge
JP2006169543A JP4939121B2 (en) 2005-06-23 2006-06-20 Methods, systems, and programs for sequential authentication using one or more error rates that characterize each security challenge
CN200610093157.9A CN100485702C (en) 2005-06-23 2006-06-22 Method and apparatus for sequential authentication of user
US12/057,470 US8930709B2 (en) 2005-06-23 2008-03-28 Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge


Publications (1)

Publication Number Publication Date
US20060294390A1

Family

ID=37569015

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/159,722 Abandoned US20060294390A1 (en) 2005-06-23 2005-06-23 Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge
US12/057,470 Active 2028-03-18 US8930709B2 (en) 2005-06-23 2008-03-28 Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge


Country Status (3)

Country Link
US (2) US20060294390A1 (en)
JP (1) JP4939121B2 (en)
CN (1) CN100485702C (en)

US11831409B2 (en) 2018-01-12 2023-11-28 Nok Nok Labs, Inc. System and method for binding verifiable claims
US11868995B2 (en) 2017-11-27 2024-01-09 Nok Nok Labs, Inc. Extending a secure key storage for transaction confirmation and cryptocurrency
US11929997B2 (en) 2013-03-22 2024-03-12 Nok Nok Labs, Inc. Advanced authentication techniques and applications

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7895659B1 (en) * 2008-04-18 2011-02-22 The United States Of America As Represented By The Director, National Security Agency Method of assessing security of an information access system
US20090327131A1 (en) * 2008-04-29 2009-12-31 American Express Travel Related Services Company, Inc. Dynamic account authentication using a mobile device
US8312033B1 (en) 2008-06-26 2012-11-13 Experian Marketing Solutions, Inc. Systems and methods for providing an integrated identifier
WO2010009495A1 (en) * 2008-07-21 2010-01-28 Auraya Pty Ltd Voice authentication systems and methods
US9665854B1 (en) 2011-06-16 2017-05-30 Consumerinfo.Com, Inc. Authentication alerts
US10008206B2 (en) * 2011-12-23 2018-06-26 National Ict Australia Limited Verifying a user
CN102684882B (en) * 2012-05-16 2016-08-03 中国科学院计算机网络信息中心 Verification method and checking equipment
US9489950B2 (en) * 2012-05-31 2016-11-08 Agency For Science, Technology And Research Method and system for dual scoring for text-dependent speaker verification
US10664936B2 (en) 2013-03-15 2020-05-26 Csidentity Corporation Authentication systems and methods for on-demand products
US9633322B1 (en) 2013-03-15 2017-04-25 Consumerinfo.Com, Inc. Adjustment of knowledge-based authentication
US9721147B1 (en) 2013-05-23 2017-08-01 Consumerinfo.Com, Inc. Digital identity
CN103475490B (en) * 2013-09-29 2018-02-27 广州网易计算机系统有限公司 A kind of auth method and device
US9705676B2 (en) * 2013-12-12 2017-07-11 International Business Machines Corporation Continuous monitoring of fingerprint signature on a mobile touchscreen for identity management
US10373240B1 (en) 2014-04-25 2019-08-06 Csidentity Corporation Systems, methods and computer-program products for eligibility verification
US9996837B2 (en) 2014-06-06 2018-06-12 Visa International Service Association Integration of secure protocols into a fraud detection system
US9875347B2 (en) * 2014-07-31 2018-01-23 Nok Nok Labs, Inc. System and method for performing authentication using data analytics
WO2016131063A1 (en) * 2015-02-15 2016-08-18 Alibaba Group Holding Limited System and method for user identity verification, and client and server by use thereof
CN105991590B (en) * 2015-02-15 2019-10-18 阿里巴巴集团控股有限公司 A kind of method, system, client and server for verifying user identity
CN106034029A (en) 2015-03-20 2016-10-19 阿里巴巴集团控股有限公司 Verification method and apparatus based on image verification codes
JP6049958B1 (en) 2015-04-30 2016-12-21 真旭 徳山 Terminal device and computer program
JP6077191B1 (en) * 2015-04-30 2017-02-08 真旭 徳山 Terminal device and computer program
US10063533B2 (en) * 2016-11-28 2018-08-28 International Business Machines Corporation Protecting a web server against an unauthorized client application
US11062014B1 (en) * 2018-01-30 2021-07-13 Rsa Security Llc Dynamic challenge question-based authentication
US10911234B2 (en) 2018-06-22 2021-02-02 Experian Information Solutions, Inc. System and method for a token gateway environment
US11941065B1 (en) 2019-09-13 2024-03-26 Experian Information Solutions, Inc. Single identifier platform for storing entity data
CN113128296B (en) * 2019-12-31 2023-05-09 重庆傲雄在线信息技术有限公司 Electronic handwriting signature fuzzy label recognition system
JP2021197031A (en) * 2020-06-17 2021-12-27 オムロン株式会社 Information processing apparatus, permission determination method, and program

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020184538A1 (en) * 2001-05-30 2002-12-05 Fujitsu Limited Combined authentication system
US6529871B1 (en) * 1997-06-11 2003-03-04 International Business Machines Corporation Apparatus and method for speaker verification/identification/classification employing non-acoustic and/or acoustic models and databases
US20040049687A1 (en) * 1999-09-20 2004-03-11 Orsini Rick L. Secure data parser method and system
US20040083394A1 (en) * 2002-02-22 2004-04-29 Gavin Brebner Dynamic user authentication
US20040088587A1 (en) * 2002-10-30 2004-05-06 International Business Machines Corporation Methods and apparatus for dynamic user authentication using customizable context-dependent interaction across multiple verification objects
US20040153656A1 (en) * 2003-01-30 2004-08-05 Cluts Jonathan C. Authentication surety and decay system and method
US20040164139A1 (en) * 2003-02-25 2004-08-26 Hillhouse Robert D. Method and apparatus for biometric verification with data packet transmission prioritization
US20040164848A1 (en) * 2003-01-21 2004-08-26 Samsung Electronics Co., Ltd User authentication method and apparatus
US6857073B2 (en) * 1998-05-21 2005-02-15 Equifax Inc. System and method for authentication of network users
US20050132235A1 (en) * 2003-12-15 2005-06-16 Remco Teunen System and method for providing improved claimant authentication
US7086085B1 (en) * 2000-04-11 2006-08-01 Bruce E Brown Variable trust levels for authentication
US7434063B2 (en) * 2001-10-24 2008-10-07 Kabushiki Kaisha Toshiba Authentication method, apparatus, and system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ATE282990T1 (en) * 1998-05-11 2004-12-15 Citicorp Dev Ct Inc SYSTEM AND METHOD FOR BIOMETRIC AUTHENTICATION OF A USER USING A CHIP CARD
JP2000259278A (en) * 1999-03-12 2000-09-22 Fujitsu Ltd Device and method for performing individual authentication by using living body information
JP3699608B2 (en) * 1999-04-01 2005-09-28 富士通株式会社 Speaker verification apparatus and method
US6591224B1 (en) * 2000-06-01 2003-07-08 Northrop Grumman Corporation Biometric score normalizer
JP2003067340A (en) * 2001-08-28 2003-03-07 Mitsubishi Electric Corp Selection system for authentication, and authentication system
JP4068334B2 (en) * 2001-11-26 2008-03-26 日本電気株式会社 Fingerprint authentication method, fingerprint authentication system, and biometric authentication system
JP4214760B2 (en) * 2002-10-31 2009-01-28 沖電気工業株式会社 Personal authentication system using biometric information
KR100543699B1 (en) * 2003-01-21 2006-01-20 삼성전자주식회사 Method and Apparatus for user authentication
JP2005107592A (en) * 2003-09-26 2005-04-21 Bank Of Tokyo-Mitsubishi Ltd System and method for selecting authentication method

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070094497A1 (en) * 2005-10-21 2007-04-26 Avaya Technology Corp. Secure authentication with voiced responses from a telecommunications terminal
US7694138B2 (en) * 2005-10-21 2010-04-06 Avaya Inc. Secure authentication with voiced responses from a telecommunications terminal
US20160306938A1 (en) * 2005-10-25 2016-10-20 Nxstage Medical, Inc. Safety Features for Medical Devices Requiring Assistance and Supervision
US11783939B2 (en) 2005-10-25 2023-10-10 Nxstage Medical, Inc. Safety features for medical devices requiring assistance and supervision
US9779223B2 (en) 2006-02-23 2017-10-03 Microsoft Technology Licensing, Llc Non-intrusive background synchronization when authentication is required
US20070220590A1 (en) * 2006-02-23 2007-09-20 Microsoft Corporation Non-intrusive background synchronization when authentication is required
US20110093948A1 (en) * 2006-02-23 2011-04-21 Microsoft Corporation Non-intrusive background synchronization when authentication is required
US7877797B2 (en) * 2006-02-23 2011-01-25 Microsoft Corporation Non-intrusive background synchronization when authentication is required
US10162951B2 (en) 2006-02-23 2018-12-25 Microsoft Technology Licensing, Llc Non-intrusive background synchronization when authentication is required
US8621600B2 (en) 2006-02-23 2013-12-31 Microsoft Corporation Non-intrusive background synchronization when authentication is required
US20070260774A1 (en) * 2006-03-30 2007-11-08 Oracle International Corporation Wrapper for Use with Global Standards Compliance Checkers
US7814069B2 (en) * 2006-03-30 2010-10-12 Oracle International Corporation Wrapper for use with global standards compliance checkers
US9959694B2 (en) * 2006-04-24 2018-05-01 Jeffrey Dean Lindsay Security systems for protecting an asset
US20080133296A1 (en) * 2006-12-05 2008-06-05 Electronics And Telecommunications Research Institute Method and system for managing reliability of identification management apparatus for user centric identity management
US20080220872A1 (en) * 2007-03-08 2008-09-11 Timothy Michael Midgley Method and apparatus for issuing a challenge prompt in a gaming environment
WO2008110441A1 (en) * 2007-03-12 2008-09-18 Voice.Trust Ag Digital method and arrangement for authenticating a person
US8600751B2 (en) 2007-03-12 2013-12-03 Voice.Trust Ag Digital method and arrangement for authenticating a person
US20090328200A1 (en) * 2007-05-15 2009-12-31 Phoha Vir V Hidden Markov Model ("HMM")-Based User Authentication Using Keystroke Dynamics
WO2009020482A3 (en) * 2007-05-15 2009-06-25 Louisiana Tech University Res Hidden markov model ('hmm')-based user authentication using keystroke dynamics
WO2009020482A2 (en) * 2007-05-15 2009-02-12 Louisiana Tech University Research Foundation, A Division Of The Louisiana Tech University Foundation, Inc. Hidden markov model ('hmm')-based user authentication using keystroke dynamics
US8136154B2 (en) 2007-05-15 2012-03-13 The Penn State Foundation Hidden markov model (“HMM”)-based user authentication using keystroke dynamics
US20080316984A1 (en) * 2007-06-22 2008-12-25 Kabushiki Kaisha Toshiba Information processing apparatus and control method of an information processing apparatus
US8027540B2 (en) 2008-01-15 2011-09-27 Xerox Corporation Asymmetric score normalization for handwritten word spotting system
US8776198B2 (en) * 2008-02-01 2014-07-08 Oracle International Corporation Techniques for non-unique identity establishment
US20090199282A1 (en) * 2008-02-01 2009-08-06 Zhanna Tsitkova Techniques for non-unique identity establishment
US9311461B2 (en) * 2008-04-16 2016-04-12 International Business Machines Corporation Security system based on questions that do not publicly identify the speaker
US20090265770A1 (en) * 2008-04-16 2009-10-22 Basson Sara H Security system based on questions that do not publicly identify the speaker
US20100071031A1 (en) * 2008-09-15 2010-03-18 Carter Stephen R Multiple biometric smart card authentication
US8392965B2 (en) * 2008-09-15 2013-03-05 Oracle International Corporation Multiple biometric smart card authentication
US20100131279A1 (en) * 2008-11-26 2010-05-27 Voice.Trust Ag Method and arrangement for controlling user access
DE102008058883B4 (en) 2008-11-26 2023-07-27 Lumenvox Corporation Method and arrangement for controlling user access
US8903725B2 (en) * 2008-11-26 2014-12-02 Voice.Trust Ag Method and arrangement for controlling user access
US10013728B2 (en) 2009-05-14 2018-07-03 Microsoft Technology Licensing, Llc Social authentication for account recovery
US8856879B2 (en) 2009-05-14 2014-10-07 Microsoft Corporation Social authentication for account recovery
US20100293608A1 (en) * 2009-05-14 2010-11-18 Microsoft Corporation Evidence-based dynamic scoring to limit guesses in knowledge-based authentication
US9124431B2 (en) * 2009-05-14 2015-09-01 Microsoft Technology Licensing, Llc Evidence-based dynamic scoring to limit guesses in knowledge-based authentication
US10749864B2 (en) * 2012-02-24 2020-08-18 Cirrus Logic, Inc. System and method for speaker recognition on mobile devices
US20180152446A1 (en) * 2012-02-24 2018-05-31 Cirrus Logic International Semiconductor Ltd. System and method for speaker recognition on mobile devices
US11545155B2 (en) 2012-02-24 2023-01-03 Cirrus Logic, Inc. System and method for speaker recognition on mobile devices
US10666648B2 (en) 2013-03-01 2020-05-26 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US9203835B2 (en) * 2013-03-01 2015-12-01 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US9832191B2 (en) 2013-03-01 2017-11-28 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US20220239644A1 (en) * 2013-03-01 2022-07-28 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US20140250515A1 (en) * 2013-03-01 2014-09-04 Bjorn Markus Jakobsson Systems and methods for authenticating a user based on a biometric model associated with the user
US11349835B2 (en) 2013-03-01 2022-05-31 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US11863554B2 (en) * 2013-03-01 2024-01-02 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US11929997B2 (en) 2013-03-22 2024-03-12 Nok Nok Labs, Inc. Advanced authentication techniques and applications
CN108111545A (en) * 2013-06-27 2018-06-01 英特尔公司 Continuous dual factor authentication
US9363263B2 (en) * 2014-08-27 2016-06-07 Bank Of America Corporation Just in time polymorphic authentication
US9619643B2 (en) 2014-08-27 2017-04-11 Bank Of America Corporation Just in time polymorphic authentication
US10313341B2 (en) 2015-05-11 2019-06-04 Genesys Telecommunications Laboratories, Inc. System and method for identity authentication
US9961076B2 (en) * 2015-05-11 2018-05-01 Genesys Telecommunications Laboratories, Inc. System and method for identity authentication
KR102069759B1 (en) * 2015-12-08 2020-02-11 구글 엘엘씨 Dynamic Updates for CAPTCHA Challenges
US10216923B2 (en) 2015-12-08 2019-02-26 Google Llc Dynamically updating CAPTCHA challenges
KR20180079423A (en) * 2015-12-08 2018-07-10 구글 엘엘씨 Dynamic update of CAPTCHA Challenge
US9977892B2 (en) * 2015-12-08 2018-05-22 Google Llc Dynamically updating CAPTCHA challenges
US20170161490A1 (en) * 2015-12-08 2017-06-08 Google Inc. Dynamically Updating CAPTCHA Challenges
CN107221333A (en) * 2016-03-21 2017-09-29 中兴通讯股份有限公司 A kind of identity authentication method and device
US10887107B1 (en) * 2017-10-05 2021-01-05 National Technology & Engineering Solutions Of Sandia, Llc Proof-of-work for securing IoT and autonomous systems
US11868995B2 (en) 2017-11-27 2024-01-09 Nok Nok Labs, Inc. Extending a secure key storage for transaction confirmation and cryptocurrency
US11831409B2 (en) 2018-01-12 2023-11-28 Nok Nok Labs, Inc. System and method for binding verifiable claims
US11100204B2 (en) * 2018-07-19 2021-08-24 Motorola Mobility Llc Methods and devices for granting increasing operational access with increasing authentication factors
US20200265132A1 (en) * 2019-02-18 2020-08-20 Samsung Electronics Co., Ltd. Electronic device for authenticating biometric information and operating method thereof
US11792024B2 (en) 2019-03-29 2023-10-17 Nok Nok Labs, Inc. System and method for efficient challenge-response authentication
US20230008868A1 (en) * 2021-07-08 2023-01-12 Nippon Telegraph And Telephone Corporation User authentication device, user authentication method, and user authentication computer program

Also Published As

Publication number Publication date
US20080222722A1 (en) 2008-09-11
US8930709B2 (en) 2015-01-06
CN1892666A (en) 2007-01-10
CN100485702C (en) 2009-05-06
JP4939121B2 (en) 2012-05-23
JP2007004796A (en) 2007-01-11

Similar Documents

Publication Publication Date Title
US8930709B2 (en) Method and apparatus for sequential authentication using one or more error rates characterizing each security challenge
US8656469B2 (en) Methods and apparatus for dynamic user authentication using customizable context-dependent interaction across multiple verification objects
US11545155B2 (en) System and method for speaker recognition on mobile devices
EP2784710B1 (en) Method and system for validating personalized account identifiers using biometric authentication and self-learning algorithms
US7356168B2 (en) Biometric verification system and method utilizing a data classifier and fusion model
US20180047397A1 (en) Voice print identification portal
US7487089B2 (en) Biometric client-server security system and method
JP4196973B2 (en) Personal authentication apparatus and method
AU2004300140B2 (en) System and method for providing improved claimant authentication
US20070150747A1 (en) Method and apparatus for multi-model hybrid comparison system
JP2003132023A (en) Personal authentication method, personal authentication device and personal authentication system
CA2736133A1 (en) Voice authentication system and methods
US20210366489A1 (en) Voice authentication system and method
Poh et al. Can chimeric persons be used in multimodal biometric authentication experiments?
Maes et al. Conversational speech biometrics
Gupta et al. Text dependent voice based biometric authentication system using spectrum analysis and image acquisition
Bennett Access control by audio‐visual recognition
Pelecanos et al. Conversational biometrics: a probabilistic view
AU2012200605B2 (en) Voice authentication system and methods
Rylov et al. The Discriminant-Stochastic Approach of the Speaker Verification for Entry Control by the Biometrical Technologies
Kounoudes et al. Intelligent Speaker Verification based Biometric System for Electronic Commerce Applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAVRATIL, JIRI;OSBORN, RYAN L.;PELECANOS, JASON W.;AND OTHERS;REEL/FRAME:016491/0618

Effective date: 20050623

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION