US20170004573A1 - Workflow processing and user interface generation based on activity data


Info

Publication number
US20170004573A1
Authority
US
United States
Prior art keywords
user
data
score
payment
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US14/918,169
Inventor
Mikael Hussain
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Klarna Bank AB
Original Assignee
KLARNA AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by KLARNA AB filed Critical KLARNA AB
Priority to US14/918,169
Assigned to KLARNA AB. Assignment of assignors interest (see document for details). Assignors: HUSSAIN, MIKAEL
Priority to US15/167,890 (US10387882B2)
Publication of US20170004573A1
Assigned to KLARNA BANK AB. Change of name (see document for details). Assignors: KLARNA AB

Classifications

    • G06Q40/025
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/12Payment architectures specially adapted for electronic shopping systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2457Query processing with adaptation to user needs
    • G06F16/24578Query processing with adaptation to user needs using ranking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/93Document management systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/552Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/14Payment architectures specially adapted for billing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/03Credit; Loans; Processing thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/12Accounting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities

Definitions

  • FIG. 1 illustrates an example of online merchant webpages communicating with an application programming interface in accordance with an embodiment
  • FIG. 2 illustrates an example of a risk engine in accordance with an embodiment
  • FIG. 3 illustrates an example of different screens presented to different users based on a fidelity score in accordance with an embodiment
  • FIG. 4 illustrates an example of selecting different payment options based on a fidelity score in accordance with an embodiment
  • FIG. 5 is a flowchart that illustrates an example of displaying payment options based on a fidelity score in accordance with an embodiment
  • FIG. 6 is a flowchart that illustrates an example of selecting payment options for display based on a fidelity score in accordance with an embodiment
  • FIG. 7 is a flowchart that illustrates an example of generating a fidelity score in accordance with an embodiment
  • FIG. 8 illustrates an environment in which various embodiments can be implemented.
  • the system of the present disclosure provides executable instructions, capable of capturing measurements of user interactions with the particular user interface, to a computing device hosting the particular interface.
  • the executable instructions may be a set of JavaScript instructions configured to execute in a web browser in association with one or more webpages of an online merchant website.
  • the executable instructions, as a result of being executed, may collect information such as mouse tracking data (e.g., paths of a cursor, elements in the webpage being hovered over by a cursor), identity of the previous webpage, time between receiving input by the user (e.g., time between clicks of a mouse), identities of elements clicked on in the webpages, identity of the current webpage, and so on.
  • the executable instructions may be in the form of a browser plug-in, a Java applet or other embedded executable object, or at least a subset of executable instructions of a standalone application (e.g., software application for a computing device, mobile application for a smart phone, etc.).
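  • As a non-limiting sketch of the measurement collection described above (the endpoint path, field names, session-identifier scheme, and batching interval are assumptions, not taken from the disclosure), client-side JavaScript might look like the following:
    (function () {
      var measurements = [];
      var sessionId = Math.random().toString(36).slice(2); // illustrative session identifier

      function record(action, event) {
        measurements.push({
          sessionId: sessionId,
          action: action,                           // e.g., "click", "mouseover", "mousemove"
          target: event.target && event.target.id ? event.target.id : event.target.tagName,
          x: event.clientX,                         // cursor path sample
          y: event.clientY,
          page: window.location.pathname,           // identity of the current webpage
          referrer: document.referrer,              // identity of the previous webpage
          timestamp: Date.now()                     // enables time-between-clicks calculations
        });
      }

      ['click', 'dblclick', 'mouseover', 'mouseout', 'mousedown', 'mouseup', 'mousemove']
        .forEach(function (type) {
          document.addEventListener(type, function (event) { record(type, event); }, true);
        });

      // Periodically submit the accumulated set of measurements to an assumed collection endpoint.
      setInterval(function () {
        if (measurements.length > 0) {
          navigator.sendBeacon('/collect-measurements', JSON.stringify(measurements.splice(0)));
        }
      }, 5000);
    })();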
  • the system of the present disclosure may accumulate this set of measurements up to a point where the user is ready to apply for credit or select payment options for the transaction.
  • the computing device hosting the particular interface accumulates the set of measurements and, before payment method selection by the user, may provide the set of measurements and personally identifiable information (e.g., first and family name, address, telephone number, email address, etc.) to the system of the present disclosure.
  • the personally identifiable information or set of measurements may also include information such as Internet protocol address of the computing device, browser type, operating system, and so on.
  • the personally identifiable information and set of measurements may be referred to cumulatively as a set of client data; that is, data obtained from the client device of the user.
  • the system may, at this point, obtain a set of internal data associated with the user.
  • the set of internal data may include details stored with this system about previous transactions by the user, such as types of previous purchases, amounts of purchases, payment history, and so on.
  • internal data may include geographic and demographic data.
  • the set of internal data includes specific details about one or more previous transactions, while in other implementations the specific details of one or more previous transactions are aggregated into a summary of attributes (e.g., has the user paid before, and if so, when did the user pay, what was the average payment, has the user previously been late with payment, did user pay debt off early, does the user only pay the minimum payment or less or more than the minimum payment, etc.), while, in still other implementations, the specific details are processed (e.g., by a neural network, by a vector machine (relevance or support), by a random forest, or some other supervised learning algorithm) to yield an internal data score, which may be used in part to generate a fidelity score, described below.
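  • Purely to illustrate the aggregation of previous-transaction details into summary attributes described above, a hedged sketch follows; the transaction fields and attribute names are assumptions:
    // Hypothetical aggregation of a user's previous transactions into summary attributes.
    function summarizeInternalData(previousTransactions) {
      if (previousTransactions.length === 0) {
        return { hasPaidBefore: false };            // absence of history is itself a signal
      }
      var total = previousTransactions.reduce(function (sum, t) { return sum + t.amount; }, 0);
      return {
        hasPaidBefore: previousTransactions.some(function (t) { return t.paidInFull; }),
        averagePayment: total / previousTransactions.length,
        everLateWithPayment: previousTransactions.some(function (t) { return t.wasLate; }),
        paidDebtOffEarly: previousTransactions.some(function (t) { return t.paidEarly; }),
        onlyPaysMinimum: previousTransactions.every(function (t) { return t.paidMinimumOnly; })
      };
    }
    // These summary attributes (or the raw details) could then be fed to a supervised
    // model to yield an internal data score, as described above.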
  • the system may also obtain, from an external source, a set of verification data associated with the user.
  • the set of verification data may include verification data regarding the personally identifiable information.
  • the set of verification data may include simple verification data regarding whether the address provided is a valid address, whether the telephone number is a valid telephone number, whether the provided full name is known by the external source, and so on.
  • the set of verification data may include more complex verification data, such as whether the address is known to be associated with the provided last and/or given name, or whether the provided email address is known to be associated with the user name by the external source.
  • a set of user behaviors and characteristics may be presumed. For example, based on demographic data, certain browser types may be more likely to be associated with certain types of users than others. For example, use of a version of the Internet Explorer browser that is known to have been distributed preinstalled with a computing device operating system may suggest that the user is less fickle than a user using the Google Chrome browser. Similarly, if characteristics of the browser suggest that the user is using an anonymous browser or mimicking another browser type, this may be an indication that the user may be less reliable when making credit commitments. Data received about the type and version of the operating system being used by the client device of the user may be comparably useful.
  • the weights to be associated with the behavioral and characteristic determinations, the set of internal data, and the set of verification data may be used to generate a fidelity score that may reflect a projected likelihood of default on payment by the user.
  • This fidelity score may be generated by a logistic regression, a random forest similar to the supervised models described in U.S. patent application Ser. No. 14/820,468, entitled “INCREMENTAL LOGIN AND AUTHENTICATION TO USER PORTAL WITHOUT USERNAME/PASSWORD,” U.S. patent application Ser. No. 14/830,686, entitled “METHOD FOR USING SUPERVISED MODEL TO IDENTIFY USER,” and U.S. patent application Ser. No. 14/830,690, entitled “METHOD FOR USING SUPERVISED MODEL TO CONFIGURE USER INTERFACE PRESENTATION,” incorporated by reference herein, or by some other classification algorithm.
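  • A minimal sketch of generating a fidelity score with one of the named techniques (logistic regression) follows; the feature names, coefficients, and 0-100 scaling are invented for illustration and are not values from the disclosure:
    // Hypothetical logistic-regression combination of weighted inputs into a fidelity score.
    function generateFidelityScore(internalDataScore, verificationDataScore, behavioralDataScore) {
      var weights = { internal: -1.2, verification: -0.8, behavioral: -0.5, intercept: 0.4 }; // assumed
      var z = weights.intercept +
              weights.internal * internalDataScore +         // better history lowers default risk
              weights.verification * verificationDataScore + // stronger verification lowers default risk
              weights.behavioral * behavioralDataScore;
      var probabilityOfDefault = 1 / (1 + Math.exp(-z));     // logistic function
      return Math.round(probabilityOfDefault * 100);         // assumed 0-100 scale
    }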
  • the resulting fidelity score may suggest that the user has a likelihood of default above a particular threshold, or the information available on the user (i.e., the client data, internal data, and verification data), may be insufficient to generate a reliable fidelity score (e.g., a minimum number of input values are not available), in which case, the system may submit a request to an external entity (e.g., a credit bureau, a bank, etc.) for additional data.
  • This additional data may include information such as a credit score or credit history of the user, verification of a minimum bank balance, and so on.
  • This additional data may be used to generate a new fidelity score, which in some implementations, includes the original fidelity score as a factor in generating the new fidelity score.
  • the system may cause the particular user interface being used by the user to conduct the transaction to update in order to display certain payment or credit options that the user has been determined to qualify for.
  • An e-commerce site to which techniques of the present disclosure can be applied may be a website for buying and/or selling goods or services, such as an interface (e.g., software application residing on a local device, website, or other interface) for a retail outlet, discount house, wholesale outlet, bank, credit provider, currency exchange service, insurance provider, investment services provider, debt resolution service, brokerage, bazaar, auction house, shopping center, boutique, supermarket, chain store, thrift shop, flea market, sales kiosk, concession stand, trade fair, or consignment house.
  • techniques described and suggested in the present disclosure improve the usability of computing systems by reducing the latency of risk determination by making the risk determination without recourse to external data. Moreover, techniques described and suggested in the present disclosure are necessarily rooted in computer technology in order to overcome problems specifically arising with network latency caused by obtaining credit risk data from external sources.
  • FIG. 1 illustrates an aspect of an environment 100 in which an embodiment may be practiced.
  • the environment 100 may include a set of web pages 120 of a merchant system 118 , with at least one of the set of web pages 120 being a checkout page 122 for finalizing a transaction.
  • the set of web pages 120 may be static web pages, may be dynamic web pages generated based on one or more templates or executable instructions (client-side and/or server-side), or may be a combination of static and dynamic web pages.
  • the set of web pages may be designed to provide information to consumers about a merchant of the merchant system 118 and/or products or services being offered by the merchant.
  • Executable code embedded in the checkout page 122 may cause a computing device used by the user in conducting the transaction to collect personally identifiable information (e.g., name, geographic address, telephone number, email address, date of birth, etc.) entered by a user as well as behavioral data relating to interactions between the user and the set of web pages 120 (e.g., time and identity of web page elements clicked on, hovered over, etc.) as client data 124 .
  • the executable code in the checkout page 122 may cause the computing device to make an application programming interface (API) call, such as a GetCredit( ) API 128 of a scoring engine 126 , to begin a process of determining a set of payment or credit options to present to the user in the checkout page 122 .
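  • A hedged sketch of such a call from the checkout page 122 follows; the endpoint path, payload shape, and response handling are assumptions:
    // Hypothetical client-side call standing in for the GetCredit() API 128 of the scoring engine 126.
    function requestPaymentOptions(clientData) {
      return fetch('/api/scoring-engine/get-credit', {        // assumed endpoint path
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          personallyIdentifiableInformation: clientData.pii,  // name, address, telephone, email, etc.
          measurements: clientData.measurements               // behavioral interaction data
        })
      })
        .then(function (response) { return response.json(); })
        .then(function (result) { return result.paymentOptions; }); // used to update the checkout page 122
    }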
  • the data may be captured by calling one or more APIs of the operating system of the computing device.
  • the data may be captured by executing code having direct access to hardware of the computing device (e.g., keyboard, mouse, touch screen, and/or other input devices).
  • the merchant system 118 , in some examples, may be a single device and, in other examples, may be a distributed computer system comprising multiple devices that operate collectively such that the distributed computer system performs the operations described (i.e., all operations of the merchant system 118 may not necessarily be performed by a single device).
  • the merchant system 118 may be configured to present a website or other Internet-accessible platform for providing goods and/or services to users/consumers at a price.
  • the merchant system 118 may have one or more web pages comprising the set of web pages 120 that are configured to interact with a user/consumer. That is, web pages of the set of web pages 120 may be configured to display images of various products, descriptions of the various products, reviews of various products, and prices of the various products.
  • Web pages of the set of web pages may have one or more embedded controls, such as clickable images and HyperText Markup Language (HTML) form elements, configured to allow the user/consumer to navigate the website, search for products, compare products, view larger images of the products, post reviews of the product, add/remove products to an online shopping cart, enter delivery and billing information, log in and manage an account profile with the merchant, and so on.
  • Executable code may be embedded in one or more of the set of web pages 120 to collect details (also referred to as a set of measurements) about user interactions with the webpages.
  • each of the set of web pages 120 may include JavaScript or other client-side executable code that keeps a record of certain actions performed by a user (also referred to herein as a set of measurements).
  • the code may be provided by the merchant or may be embedded in the merchant web pages in an HTML inline frame (iframe), with the iframe source being provided by a payment service provider (which may be the same provider as the provider of system of the present disclosure).
  • the record may be a set of measurements of actions performed by the user, and each measurement of the set of measurements may include information such as a user or session identifier for the particular user/session, a detected action (e.g., onClick, onDblClick, onMouseOver, onMouseOut, onMouseDown, onMouseUp, onMouseMove, and onSubmit event triggers, etc.) performed by the user using a user input device (e.g., mouse, trackball, touch screen, keyboard, light pen, game controller, fingerprint scanner, etc.), and a timestamp for the detected action.
  • gaze detection data is used to generate behavioral data.
  • many mobile devices are configured to detect an area of a screen receiving the focus of the gaze of the user, referred to as gaze detection or eye tracking. Gaze detection data may be obtained from an image sensor or some other optical sensor that senses reflections of light from the eye. The gaze detection data could be used to determine how long the user spends reading certain portions of the terms and conditions.
  • an accelerometer, global positioning system receiver, gyroscope, microphone, and other sensors may provide data useful for generating behavioral data.
  • “behavioral data” may refer to inferences made by the system based on interactions between the user and the user interface. That is, the record of actions performed by the user may be used by the system to infer behavior of the user.
  • the behavioral data and personally identifiable information may be cumulatively referred to as the client data 124 .
  • executable code other than JavaScript code and interfaces other than HTML interfaces may be used by the merchant system 118 .
  • the merchant system 118 may additionally or alternatively provide standalone mobile applications for devices like smartphones and tablet computing devices for conducting online transactions.
  • the merchant corresponding to the merchant system 118 may provide services rather than products; for example, the merchant system 118 may be a system to provide an online presence (i.e., e-commerce site, as described above) for commerce and financial transactions.
  • Examples of the merchant system 118 would include a system for providing a network (e.g., Internet, local area network, wide area network, etc.) accessible site for a bank, credit union, stockbroker, roofing contractor, landscaper, cleaning service, peer-to-peer lender, and so on.
  • At least one web page of the set of web pages 120 may be the checkout page 122 .
  • the checkout page 122 may be one or more pages configured to handle finalizing the transaction.
  • an initial checkout page may allow the user/consumer to review details (e.g., quantities, colors, sizes, prices, etc.) of items in the shopping cart
  • other checkout pages may allow the user/consumer to log into his/her account with the merchant system 118 , to select from a set of delivery options, select gift wrap options, and/or enter delivery and billing addresses.
  • Another checkout page 122 may involve presenting the user/consumer with one or more payment/credit options.
  • the one or more payment/credit options presented to the user/consumer may be presented based on the collected client data 124 .
  • the payment or credit options on the checkout page 122 may be generated by a server of the merchant based on the fidelity score and/or other information received from the scoring engine 126 , or, alternatively, the payment or credit options on the checkout page 122 may be presented through an iframe, application software, or software development kit, provided by a payment service provider.
  • the client data 124 may be fed to the scoring engine 126 through the GetCredit( ) API 128 .
  • the scoring engine 126 of FIG. 1 , upon receiving the client data through the GetCredit( ) API 128 , generates a series of variables 130 and corresponding values based on the client data.
  • a variable “given name” may have the value of “Winston”
  • a variable “family name” may have the value of “Churchill”
  • a variable of “time_spent_reading_terms_and_conditions” may have a value of 32 seconds
  • a variable of “browser_version” may have a value of “8.0”
  • a variable of “time_between_clickdown_submit_and_clickrelease” may have the value of 2103 milliseconds, and so on.
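  • The following sketch illustrates how client data might be mapped to such variables and values; the helper logic and measurement action names are assumptions:
    // Hypothetical derivation of named variables and values from the received client data.
    function elapsed(measurements, startAction, endAction) {
      var start = measurements.find(function (m) { return m.action === startAction; });
      var end = measurements.find(function (m) { return m.action === endAction; });
      return start && end ? end.timestamp - start.timestamp : null; // milliseconds, or null if missing
    }

    function deriveVariables(clientData) {
      var pii = clientData.personallyIdentifiableInformation;
      var m = clientData.measurements;
      return {
        given_name: pii.givenName,                           // e.g., "Winston"
        family_name: pii.familyName,                         // e.g., "Churchill"
        browser_version: clientData.browserVersion,          // e.g., "8.0"
        time_spent_reading_terms_and_conditions:
          elapsed(m, 'terms_opened', 'terms_closed'),        // e.g., 32000 ms (~32 seconds)
        time_between_clickdown_submit_and_clickrelease:
          elapsed(m, 'mousedown:submit', 'mouseup:submit')   // e.g., 2103 ms
      };
    }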
  • the scoring engine 126 may next pass the variables 130 to the risk assessment API 132 in the risk engine 134 to perform a risk assessment of the user/consumer based on the variables 130 and their values.
  • the risk engine 134 may be comprised of one or more computing devices and/or software configured to perform a risk assessment of a user/consumer based on the variables 130 provided.
  • the result of the risk assessment may be a value representing an estimated risk of default of payment by the user/consumer, may be a set of executable code (e.g., executable JavaScript) and/or HTML with payment or credit options, or may be some other output as appropriate based on the determined risk associated with the user/consumer.
  • the risk engine 134 may pass the variables 130 through the risk model, described in further detail with respect to FIG. 2 .
  • the risk engine 134 may reference an external data source 136 of a third-party entity, such as a credit bureau service, in the course of performing its risk assessment.
  • Data obtained from the external data source 136 may be data such as credit scores, confirmation of personally identifiable information (e.g., name, address, telephone number, email address, date of birth, etc.), and so on.
  • Such data about the user/consumer may be stored in the cache 140 , and, in some cases, if such external data about the user/consumer is located in the cache and is not out of date, such information may be retrieved from the cache 140 without resorting to retrieving it from the external data source 136 .
  • personally identifiable information may be used to retrieve details about previous transactions involving the user/consumer from internal data 138 of the scoring engine 126 .
  • the internal data 138 may include details about one or more previous transactions identified as having been conducted by the user/consumer. If the scoring engine 126 has no record of the user/consumer conducting a previous transaction, this lack of information may itself be a factor in computing a risk assessment for the user/consumer. For example, if details of the previous transaction recorded in the internal data 138 indicate that the user/consumer faithfully made payments for the previous purchases and paid in full, these details may weigh toward a favorable risk assessment. However if no records of previous transactions are found, the scoring engine 126 has no payment history for the user/consumer, and consequently this lack of payment history may weigh towards a less favorable risk assessment. On the other hand, records of previous transactions indicating that the user/consumer was unreliable with payments for previous purchases may weigh towards an unfavorable risk assessment.
  • the details about previous customer transactions are obtained from a record matching service 146 of the type described in U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference, configured to store and retrieve records regarding previous transactions by the user/consumer.
  • the record matching service 146 may be a system comprising one or more computing devices configured to at least identify the user/consumer from the client data 124 .
  • for example, if the client data 124 identifies the user/consumer as "Henry Gibson," the record matching service 146 may nonetheless determine that the user/consumer has a low probability of being the same "Henry Gibson" as was found in the internal data 138 .
  • the risk engine may generate its assessment at least in part from the values of the variables 130 . For example, if the variable values indicate that the user/consumer added and removed an expensive item from the shopping cart multiple times, this behavioral data may indicate that the user/consumer is operating at a limit of what the user/consumer is able to afford. Consequently, this may weigh towards a less favorable risk assessment. As another example, if the variables 130 indicate that the user/consumer compared a variety of similar products before finally selecting the product with the best reviews, this behavioral data may indicate that the user/consumer has given careful thought to the purchase, and consequently this may weigh towards a more favorable risk assessment.
  • information in the personally identifiable information may be used to verify the stability of the user/consumer. For example, if the name and address provided by the user is checked against a database, such as the external data source 136 , and the external data source 136 responds that the name and address are indeed known to be associated with each other, then this may weigh towards a more favorable risk assessment. On the other hand, if the external data source 136 is unable to find a match against the name and address provided by the user/consumer this may weigh towards a less favorable risk assessment.
  • a matching family name with a non-matching given name may suggest that a child of the household of that address is placing the order, rather than an adult, and consequently the child may be at a higher risk of default of payment on the transaction.
  • FIG. 2 illustrates another aspect of an environment 200 in which an embodiment may be practiced.
  • the environment 200 may include a user 244 using a computing device 242 to conduct a transaction.
  • client data, such as the client data 124 of FIG. 1 , may be provided by the computing device 242 to the risk engine 234 .
  • the risk engine 234 may be similar to the risk engine 134 of FIG. 1 .
  • the user 244 may be an individual conducting a transaction through the computing device 242 or may be another entity authorized to conduct such a transaction for the individual.
  • the user 244 may be a first-time user in the environment 200 , or may have previously conducted transactions using the risk engine 234 of the environment 200 , in which case, details of the previous transactions may be recorded in the internal data 206 .
  • the computing device 242 may be any type of computing device having one or more processors and memory, such as the computing device 800 of FIG. 8 , capable of receiving input from the user 244 and communicating with components of the risk engine 234 through the network 248 .
  • the network 248 represents the path of communication between the user and the risk engine 234 . Examples of the network 248 include the Internet, a local area network, a wide area network and Wi-Fi.
  • client data may first be passed through a set of rules 202 . That is, the model 204 of the risk engine 234 is configured to provide a risk assessment based on internal data 206 , verification data 208 , and behavioral data 210 , but the set of rules 202 provides an initial screening of the user 244 to determine whether to proceed with the risk assessment. For example, if the user 244 is a returning user who has defaulted on a previous purchase, the set of rules 202 may be configured to immediately reject the user 244 or present the user 244 with a fixed set of payment or credit options, such as "prepayment required." In other words, the set of rules 202 comprises rules whose outcomes would not be overridden by a risk assessment from the model 204 .
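  • An illustrative sketch of such a pre-screening rule check follows; the rule conditions and the return shape are assumptions:
    // Hypothetical pre-screening by the set of rules 202 before the model 204 is consulted.
    function applyRules(internalUserRecord) {
      if (internalUserRecord && internalUserRecord.hasDefaultedBefore) {
        return { proceedToModel: false, paymentOptions: ['Prepayment required'] }; // fixed outcome
      }
      if (internalUserRecord && internalUserRecord.onBlockList) {
        return { proceedToModel: false, paymentOptions: [] };                      // reject outright
      }
      return { proceedToModel: true };                       // continue to the risk assessment
    }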
  • the model 204 may include a combination of hardware and software configured to output a risk assessment based on one or more of the internal data 206 , the verification data 208 , and the behavioral data 210 .
  • An advantage provided by the model 204 is that transactions can be processed and finalized more quickly, because an online transaction system may rely on a risk assessment generated by the model 204 without having to request a credit check of the user 244 by a third party entity, such as a credit bureau. Furthermore, because some credit bureau services charge a fee for credit checking services, cost savings may be achieved by relying on the risk assessment of the model 204 .
  • the risk engine 234 may be further configured to update/retrain the model 204 based on the collected data (e.g., client data, details about the transaction, etc.) by associating the collected data in a data store with payments by the user 244 as they are received in a timely, or, as the case may be, in a not so timely fashion from the user, and regenerating the logistic regression or retraining the neural network, vector machine, random forest, or other supervised learning algorithm upon which the model 204 is based.
  • the internal data 206 used by the model 204 may include, if the user is a returning user, details about previous transactions, such as addresses used in previous transactions, purchase amounts of previous transactions, methods of payment used in previous transactions, whether the user paid more or less than the minimum payment, whether the user accrued any late fees, whether any funds are still owed on previous transactions, and so on. If the user 244 is a new user rather than a returning user, this lack of a purchase history for the user 244 is itself internal data 206 that may be taken into consideration by the model 204 . In some implementations, the user 244 is determined to be a new user if the user 244 creates a new account/user profile with the merchant or payment services provider involved in the transaction.
  • the determination that the user 244 is a new user is made by a record matching service of the type described in U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference.
  • a determination that the user 244 is a new user is made by determining that there is not a match between personally identifiable information provided by the user 244 and records in the internal data 206 .
  • a user profile is created for a new user, and if the user 244 conducts transactions in the future, details of the future transactions may be stored in a data store hosting the internal data 206 .
  • the internal data 206 component may incorporate the record matching service of U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference.
  • the client data provided by the user 244 may be usable by the random forest of the record matching service to make a determination about the identity of the user.
  • the random forest of the record matching service is trained on behavioral data obtained in a manner as described in the present disclosure.
  • an identity of the user 244 may be determinable at least in part based on the user's interactions with the user interface, or may provide additional information usable to make a more certain determination about the identity of the user 244 .
  • the user 244 need not create an account with a merchant, but rather the identity of the user is determined by the record matching service. In other implementations, the user 244 can create an account with the merchant.
  • the record matching service includes detailed information about previous transactions likely conducted by the user 244 based on client data received.
  • the behavioral data for the current transaction and previous transactions are stored in a data store accessible to the record matching service, and with recourse to the previously recorded behavioral data, the model 204 may be able to generate more accurate risk assessments.
  • the verification data 208 may include one or more pieces of data indicating that the user 244 has provided accurate information.
  • the verification data 208 may be based on personally identifiable information submitted with client data by the computing device 242 .
  • personally identifiable information may be a device identifier (ID) for the computing device being used by the user to conduct the transaction. If the internal data 206 contains information that shows that the present user has previously conducted a transaction using the same device (based on the device ID), this may be indicative of stability (e.g., not making purchases using borrowed computing devices) and consequently weigh in favor of lower risk.
  • verification data 208 may be a simple result of true or false for whether the address provided by the user 244 actually exists.
  • each piece of verification data 208 may have an associated strength level. For example, a determination that an address exists may have a strength level of 1, indicating that it is a weak verification of the accuracy of information provided by the user 244 .
  • a piece of verification data 208 indicating that a given name and family name provided by the user 244 has been confirmed as being associated with the physical address, email address, date of birth, and telephone number also provided by the user may have a strength level of 5, indicating that it is a very strong verification of the accuracy of information provided by the user 244 .
  • verification that a given name and family name provided by the user is associated with an address provided by the user has a stronger strength level than a verification that only the family name, but not the given name, provided by the user could be verified against that address.
  • strength levels such as these are provided to the model 204 rather than Boolean values (e.g., yes/no, true/false, etc.).
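  • The strength-level idea might be sketched as follows; the external source's lookup methods and the 1-5 scale details are assumptions:
    // Hypothetical mapping of verification results to strength levels instead of Booleans.
    // The lookup methods on externalSource are stand-ins for queries to an external data source.
    function verificationStrength(pii, externalSource) {
      if (externalSource.nameMatchesAddressEmailDobAndPhone(pii)) return 5; // very strong verification
      if (externalSource.fullNameMatchesAddress(pii)) return 3;             // given + family name match address
      if (externalSource.familyNameMatchesAddress(pii)) return 2;           // family name only
      if (externalSource.addressExists(pii.address)) return 1;              // weak: address merely exists
      return 0;                                                             // nothing could be verified
    }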
  • the verification data 208 may be a set of one or more pieces of data such as described above. In some instances, the verification data may be determined using the internal data 206 . For example, if the user 244 is a returning user, the information provided in the present transaction may be compared with information provided in previous transactions to yield the verification data 208 . If such verification data 208 cannot be obtained from the internal data 206 , the model 204 may resort to obtaining at least some of the verification data 208 from a third-party data source 216 , such as a credit bureau, a telephone directory, or an internet search engine.
  • the behavioral data 210 may additionally or alternatively include data that is more granular than the event records described above.
  • the executable code collecting the actions and the record of actions may collect data indicating that the user 244 scrolled quickly through various portions of the terms and conditions, but paused for 10 minutes on the section relating to consequences of nonpayment.
  • behavioral data 210 examples include whether the personally identifiable information was automatically populated by the browser or browser plug-in (indicating that the user has used the same information in previous transactions, and also indicating that the user is not trying to disguise him/herself, suggesting that the user may be more reliable), whether the user click-held the submit button for an amount of time suggesting that the user was being very thoughtful about the transaction (e.g., such as may be captured by onMouseDown and onMouseUp JavaScript event triggers), and whether the user had to make corrections while inputting the address data (possibly suggesting that the user is entering a fake address or that the user has not resided at that address for very long). Still another example of behavioral data 210 may be if the user repeatedly decreases the amount of items in the shopping cart, which may be suggestive that the user has a limited budget.
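  • Signals like these might be derived from the recorded measurements roughly as follows; the action names, thresholds, and field names are assumptions:
    // Hypothetical derivation of behavioral data 210 signals from the set of measurements.
    function behavioralSignals(measurements) {
      function ofAction(name) {
        return measurements.filter(function (m) { return m.action === name; });
      }
      var submitDown = ofAction('mousedown:submit')[0];
      var submitUp = ofAction('mouseup:submit')[0];
      return {
        addressAutofilled: ofAction('autofill:address').length > 0,          // reused, undisguised identity
        thoughtfulSubmitHold: Boolean(submitDown && submitUp) &&
          (submitUp.timestamp - submitDown.timestamp) > 2000,                // long click-hold on submit
        manyAddressCorrections: ofAction('correction:address').length > 3,   // possibly unfamiliar address
        repeatedCartReductions: ofAction('cart_item_removed').length > 2     // possibly limited budget
      };
    }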
  • the third-party data source 216 may be a third-party entity for providing verification data 208 about the user 244 .
  • verification data 208 could include confirmation that the name provided by the user 244 has been previously known to be associated with an address and/or telephone number also provided by the user 244 in the personally identifiable information.
  • third-party data source 216 could be a telephone directory service, package delivery service, credit bureau, or mapping service.
  • the output may include a value indicating the statistical certainty of a fidelity score generated by the model 204 , based on the information provided to the model and information that was unable to be provided to the model.
  • the statistical certainty value may be a variance indicating whether enough information was provided to the model to generate a reliable fidelity score. That is, available information from the internal data 206 , the verification data 208 , and the behavioral data 210 may be provided to the model, but not all information may be present. For example, the user may have left a telephone number field blank, no record of a previous purchase may be found in internal data, or an external source may be unable to confirm or deny some of the personally identifiable information.
  • the model 204 may make a determination whether, if the missing values were provided, the resulting score would be affected such that payment/credit options would change for the user. If so, in some implementations, the system of the present disclosure queries an external entity for the additional data 214 in order to obtain a more certain fidelity score. Alternatively or additionally, in some implementations, the system responds by updating the user interface being used by the user to prompt the user for additional information (e.g., “Please input your telephone number,” “Your email address is required to proceed further,” “Please enter your mother's maiden name,” etc.).
  • the output 212 of the model 204 may, depending on implementation, take various forms.
  • the output 212 may be a simple “yes” (the user is a good credit risk), “no” (the user is a bad credit risk), or “not enough information.”
  • the output may be a number representing an estimated percentage chance of default by the user 244 . In the latter case, payment/credit options to present to the user 244 may vary based on the generated number. For example, 0-10% may enable display of a “No money down, pay in 30 days” option, 0-20% may enable display of a “Pay by credit card” option, and 0-30% may enable display of a “Pay on delivery” option.
  • 80-90% may only allow a “Pre-pay” option.
  • terms of credit may be determined or altered based on the output. For example, output indicating a 0-5% risk of default may cause the user 244 to be presented with credit terms having an 8% interest rate, output indicating a 5-10% risk of default may cause the user 244 to be presented with credit terms having a 15% interest rate, output indicating a 10-20% risk of default may cause the user 244 to be presented with credit terms having a 29.99% interest rate, and so on. In this manner, credit terms may be determined for the user 244 through a risk assessment of behavioral data obtained from the user 244 .
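  • Mirroring the example ranges and rates above, a hedged sketch of such a mapping follows; the function shape and the treatment of unlisted ranges are assumptions:
    // Hypothetical mapping from an estimated percentage chance of default to options and terms,
    // using only the example bands given above; other ranges are left unspecified (null).
    function optionsAndTermsForRisk(defaultRiskPercent) {
      var options = [];
      if (defaultRiskPercent <= 10) options.push('No money down, pay in 30 days');
      if (defaultRiskPercent <= 20) options.push('Pay by credit card');
      if (defaultRiskPercent <= 30) options.push('Pay on delivery');
      if (defaultRiskPercent >= 80 && defaultRiskPercent <= 90) options = ['Pre-pay'];
      var interestRate = defaultRiskPercent <= 5 ? 8.0 :
                         defaultRiskPercent <= 10 ? 15.0 :
                         defaultRiskPercent <= 20 ? 29.99 : null;
      return { options: options, interestRate: interestRate };
    }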
  • the internal data 206 , behavioral data 210 , and verification data 208 alone may be insufficient information to result in a reliable risk estimate (e.g., a “not enough information” result or a numeric range of 40-60%).
  • the model may retrieve additional data 214 from a credit bureau service in order to make a more reliable risk estimate.
  • the additional data 214 may be a credit score of the user, identifying information about the user (e.g., current and past addresses, date of birth, known employers, name of spouse, national identification number, etc.), credit history of the user (e.g., list of bank accounts, list of credit card accounts, list of outstanding loans, how long an account has been open, credit limits on accounts, identities of co-signers, etc.), public records associated with the user (e.g., list of evictions reported, list of bankruptcies, list of tax liens, list of civil judgements), or other information about the user (e.g., identities of other entities who have requested a credit report on the user).
  • the additional data 214 may alone be sufficient for the risk assessment, and the additional data 214 may be provided as the output 212 .
  • the additional data 214 is fed back into the model 204 , and the additional data 214 in combination with the internal data 206 , the behavioral data 210 , and the verification data 208 may be utilized to generate a new risk assessment for the output 212 .
  • an advantage provided by the system of the present disclosure is that in many cases a reliable risk assessment may be generated without resorting to using the additional data 214 from an external data source.
  • This provides advantages in speed, usability, and cost. For example, not having to contact a credit service bureau to obtain a credit assessment of the user 244 may save 1.3 seconds or more of transaction processing time. Although 1.3 seconds may not seem like much, every second spent by a user waiting for his/her computing device to respond can have a negative effect on the user's overall satisfaction with the transaction process.
  • because credit service bureaus may charge a fee for each credit check, a merchant or payment service provider may reap savings by being able to generate a reliable risk assessment without recourse to the additional data 214 .
  • FIG. 3 illustrates a couple of scenarios 300 from an embodiment of the present disclosure. Specifically, FIG. 3 depicts a first user 344 A and a second user 344 B at a checkout stage of separate transactions.
  • client data for the first user 344 A is transmitted to the rules engine 302 of the system of the present disclosure.
  • the client data may include personally identifiable information and data reflecting interactions of the first user 344 A with the user interfaces during the course of conducting the transaction being finalized.
  • the rules engine 302 determines whether, based on the client data received, any rules exist that would preempt risk processing of the first user 344 A by the model 304 . If not, the client data of the first user 344 A is passed to the model 304 .
  • internal data 306 A, verification data 308 A, and behavioral data 310 A are obtained or generated by the model 304 and processed to yield a first risk assessment 312 A.
  • the internal data 306 A may be used to generate an internal data score based on details about previous transactions conducted by the user 344 A.
  • the verification data 308 A may be used to generate a verification data score indicating a confidence level with the accuracy of information provided by the user 344 A.
  • the behavioral data 310 A may be used to generate a behavioral data score reflecting a level of intent to pay by the user 344 A.
  • Each of these three scores may be weighted and/or fed into a risk algorithm, random forest, regression model, neural network, vector machine, or other supervised learning algorithm to yield the risk assessment 312 A.
  • the internal data 306 A, the verification data 308 A, and the behavioral data 310 A may be processed to yield a set of variables.
  • the set of variables themselves may be weighted and/or fed into a risk algorithm, random forest, regression model, neural network, vector machine, or other supervised learning algorithm to yield the risk assessment 312 A.
  • the risk assessment 312 A suggests that the first user 344 A is estimated to have a low risk of default on payment. Consequently, executable code of the system of the present disclosure, as a result of being executed, may cause the payment user interface 322 A to update to display a list of payment/credit options available (“Pre-Pay,” “Credit,” “Debit,” “Pay in 14 Days,”) to the first user 344 A based on the risk assessment 312 A.
  • a similar process is performed for the second user 344 B. That is, client data for the second user 344 B is transmitted to the rules engine 302 for determination whether, based on the client data, any rules exist that would preempt risk processing of the second user 344 B by the model 304 . If not, the client data of the second user 344 B is passed to the model 304 . As with the first user 344 A, based on this client data, internal data 306 B, verification data 308 B, and behavioral data 310 B are obtained or generated by the model 304 and processed in a manner similar to that described for the first user 344 A to yield a second risk assessment 312 B.
  • the risk assessment 312 B suggests that the second user 344 B is estimated to have a high risk of default on payment. Consequently, the executable code of the system of the present disclosure, as a result of being executed, may cause the payment user interface 322 B to update the list of payment/credit options available to the second user 344 B based on the risk assessment 312 B to allow only one method of payment, which is to require the second user 344 B to pre-pay for the purchase before the merchant will agree to the transaction.
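  • In a browser context, executable code causing such an interface update might resemble the sketch below; the element identifier and generated markup are assumptions:
    // Hypothetical update of a payment user interface (e.g., 322A or 322B) with the enabled options.
    function updatePaymentUserInterface(paymentOptions) {
      var container = document.getElementById('payment-options'); // assumed element in the checkout page
      container.innerHTML = '';
      paymentOptions.forEach(function (option) {
        var label = document.createElement('label');
        var radio = document.createElement('input');
        radio.type = 'radio';
        radio.name = 'payment-option';
        radio.value = option;
        label.appendChild(radio);
        label.appendChild(document.createTextNode(' ' + option));
        container.appendChild(label);
      });
    }
    // For the low-risk user 344A this might be called with ['Pre-Pay', 'Credit', 'Debit',
    // 'Pay in 14 Days']; for the high-risk user 344B, with ['Pre-Pay'] only.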
  • FIG. 4 illustrates an example scenario 400 of an embodiment of the present disclosure.
  • FIG. 4 depicts an example of selecting payment/credit options from a set of payment/credit options 442 for display in a checkout user interface 422 for the user.
  • the set of payment/credit options 442 may be various options for remitting payment for purchases from a particular merchant by users (e.g., credit card, cash on delivery, deferred payment, bank withdrawal, payment service, etc.).
  • Each merchant utilizing the system of the present disclosure may specify which payment/credit options should be included in the set of payment/credit options 442 .
  • each merchant utilizing the system of the present disclosure may also specify the ranges of fidelity scores necessary to enable the particular payment/credit options of the set of payment/credit options 442 . In this manner, each merchant utilizing the system can customize which payment/credit options he or she is willing to accept and how much risk he or she is willing to accept for each particular payment or credit option.
  • FIG. 4 illustrates that, for the particular fidelity score generated for the user currently conducting the checkout, the generated fidelity score for the user falls outside of the specified ranges for at least a first payment/credit option 442 A and a second payment/credit option 442 B, as well as a second-to-last payment/credit option 442 N- 1 . However, the generated fidelity score appears to have fallen within the ranges for a third payment/credit option 442 C, a fourth payment/credit option 442 D, and a last payment/credit option 442 N.
  • the third payment/credit option 442 C, the fourth payment/credit option 442 D, and the last payment/credit option 442 N are displayed to the user as available methods of payment, while the first payment/credit option 442 A, the second payment/credit option 442 B, and the second-to-last payment/credit option 442 N- 1 are not shown to the user.
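  • A sketch of merchant-configurable score ranges used to select from the set of payment/credit options 442 follows; the option names, range values, and score scale are assumptions:
    // Hypothetical per-merchant configuration mapping fidelity-score ranges to options 442A..442N.
    var merchantOptionRanges = [
      { option: 'Pay in 30 days, no money down', minScore: 0, maxScore: 15 },
      { option: 'Monthly installments', minScore: 0, maxScore: 30 },
      { option: 'Pay by card', minScore: 0, maxScore: 70 },
      { option: 'Pay on delivery', minScore: 20, maxScore: 80 },
      { option: 'Pre-pay', minScore: 0, maxScore: 100 }      // acceptable to the merchant at any score
    ];

    function optionsForFidelityScore(fidelityScore) {
      return merchantOptionRanges
        .filter(function (r) { return fidelityScore >= r.minScore && fidelityScore <= r.maxScore; })
        .map(function (r) { return r.option; });
    }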
  • updating the checkout user interface 422 may be achieved with JavaScript.
  • the updated user interface 422 may be dynamically generated using server-side executable instructions as a security measure in order to prevent a user from accessing hidden payment and credit options.
  • server-side executable instructions may be, depending upon implementation, executed by a web server of the merchant or may be executed by a server of a payment service via a call embedded in the merchant website, such as through an HTML iframe, software application, or software development kit.
  • FIG. 5 is a flowchart illustrating an example of a process 500 for determining payment and credit options based on a fidelity score in accordance with various embodiments.
  • Some or all of the process 500 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
  • the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
  • process 500 may be performed by any suitable system, such as a server in a data center or the computing device 800 described in conjunction with FIG. 8 .
  • the process 500 includes a series of operations wherein a request is received by the system performing the process 500 to determine some payment or credit options for a particular user, client data is received regarding the user, and a fidelity score (i.e., risk assessment score) is generated and options are displayed based on the fidelity score.
  • a request to determine a set of payment or credit options to present to a user in an interface is received.
  • the request is received directly from the client device that the user is using to conduct the transaction.
  • executable code may be embedded in the user interface running in a browser or other application on the client device, that, as a result of being executed, causes the request to be made directly to the system performing the process 500 .
  • the request is received from a computing device of the merchant with whom the user is conducting a transaction. That is, the user may have just clicked a button (e.g., “Proceed to checkout”) which may cause the merchant computing system to submit the request for payment or credit options for the next interface screen to the system performing the process 500 .
  • client data comprising personally identifiable information provided by the client device and the user him/herself and a set of information about the interactions between the user and the user interface during the conduct of the transaction may be received.
  • This client data may be used to yield one or more variables and corresponding variable values, from which internal data, verification data, and behavioral data may be obtained, as described in connection with FIGS. 1 and 2 .
  • a fidelity score (also referred to as a risk assessment or risk assessment score) may be generated.
  • the fidelity score may indicate an estimated risk of default by the user conducting the transaction at hand. Consequently, in 508, the system performing the process 500 determines whether the resulting fidelity score is sufficiently reliable to decide which payment or credit options to present to the user without recourse to additional information. That is, a statistical certainty value, generated based on the available client data, may be a variance that indicates whether enough information was provided to the model to generate a reliable fidelity score. For example, it may be that behavioral data included some factors suggesting a high risk of default as well as other factors suggesting a low risk of default by the user.
  • In such a case, the system performing the process 500 may determine that insufficient information is available for a reliable fidelity score determination.
  • the process 500 may determine to obtain additional information if the generated fidelity score, using only the client data received in 504 , is near a threshold value used for determining whether or not to show a payment/credit option. For example, if a threshold value for displaying a particular payment/credit option is 40 and the generated fidelity score is 41, the system performing the process 500 may determine to obtain additional information and regenerate the fidelity score. If the regenerated fidelity score indicates a lower risk of default (e.g., 39), the system performing the process 500 may determine that the regenerated fidelity score is more reflective of the actual security risk of the user, and present the particular payment/credit option to the user.
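  • A brief sketch of the near-threshold check described above is given below; the score function and additional-data fetch are illustrative stand-ins (here mirroring the 41-to-39 example), not the actual model or third-party query of the disclosure.

```javascript
// Sketch: regenerate the fidelity score when it lands near a display threshold.
// scoreFn stands in for the model call and fetchExtraFn for the third-party query.
async function reliableFidelityScore(clientData, scoreFn, fetchExtraFn, threshold = 40, margin = 2) {
  let score = scoreFn(clientData);                    // e.g. 41
  if (Math.abs(score - threshold) <= margin) {        // too close to call reliably
    const extra = await fetchExtraFn(clientData);     // e.g. credit bureau data
    score = scoreFn({ ...clientData, ...extra });     // regenerated, e.g. 39
  }
  return score;
}

// Illustrative stubs reproducing the 41 -> 39 example from the text.
reliableFidelityScore(
  { baseScore: 41 },
  (data) => data.baseScore - (data.creditScoreOnFile ? 2 : 0),
  async () => ({ creditScoreOnFile: true })
).then((score) => console.log(score)); // prints 39
```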
  • the system may proceed to 510 , whereupon the system may make a request to the computing device of a third party entity for additional data; for example, the system may request a credit score of the user from a credit service bureau.
  • the system performing the process 500 may return to 506 to update the fidelity score based at least in part on the additional information.
  • the system performing the process may repeat the operations of 506 - 10 multiple times, each time at 510 querying a different third party entity until a determination on a fidelity score is made. For example, at a first time at 510 the system may query a first third-party entity which may provide additional information about the user (e.g., how long the user has resided at the given address, age of the user, etc.). This information may be obtained quickly and inexpensively, but may be less valuable for risk assessment than more expensively obtained additional information.
  • the system performing the process 500 may query a second third party entity, which may be slower or may require a larger fee than the first third-party entity, for more additional information, and so on.
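  • The escalation across third-party sources might be structured as in the sketch below, where sources are ordered from cheapest/fastest to most expensive; the parameter names are assumptions and the helpers stand in for the operations of 506, 508, and 510.

```javascript
// Sketch: query progressively slower/costlier third-party sources until the
// fidelity score is deemed reliable. scoreFn, querySourceFn, and isReliable
// are placeholders for the model call (506), the third-party query (510),
// and the reliability check (508), respectively.
async function escalateUntilReliable(clientData, sources, scoreFn, querySourceFn, isReliable) {
  let gathered = {};
  let assessment = scoreFn(clientData, gathered);
  for (const source of sources) {                        // cheapest source first
    if (isReliable(assessment)) break;                   // reliable enough: stop querying
    const extra = await querySourceFn(source, clientData);
    gathered = { ...gathered, ...extra };
    assessment = scoreFn(clientData, gathered);          // regenerate with new data
  }
  return assessment;
}
```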
  • the system may determine which payment or credit options to present to the user for selection and cause the client device of the user to display the determined payment/credit options. For example, if, despite attempting to obtain additional information, the system performing the process 500 was still unable to generate a reliable fidelity score, the system may err on the side of caution and only present conservative payment/credit options (e.g., “Pre-pay,” “Cash on Delivery,” etc.). Likewise, such conservative payment/credit options may be determined and displayed if the fidelity score for the user indicates that the user is a high risk of default on payment.
  • if the fidelity score indicates that the user is at low risk of default, more generous payment/credit options may be determined and presented (e.g., "0% down, no payments for 90 days," "Pay in 12 easy monthly installments," etc.). Payment and credit options may be even more granular; for example, an installment payment option on a checkout webpage may include a slider bar allowing the user to select as many installments (e.g., three months, six months, 12 months, 24 months, etc.) as the fidelity score may allow (a sketch of such a mapping follows this paragraph). Note that one or more of the operations performed in 502-20 may be performed in various orders and combinations, including in parallel.
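  • A mapping from fidelity score to the installment terms exposed by such a slider might look like the sketch below; the score bands and term lists are illustrative values, not values taken from the disclosure.

```javascript
// Sketch: map a fidelity score (higher = greater risk in this example) to the
// installment terms, in months, that a checkout slider would allow. Bands are
// illustrative only.
function allowedInstallmentTerms(fidelityScore) {
  if (fidelityScore <= 20) return [3, 6, 12, 24]; // low risk: full range of terms
  if (fidelityScore <= 40) return [3, 6, 12];     // moderate risk
  if (fidelityScore <= 55) return [3, 6];         // higher risk
  return [];                                      // installments not offered
}

console.log(allowedInstallmentTerms(35)); // [ 3, 6, 12 ]
```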
  • FIG. 6 is a flowchart illustrating an example of a process 600 for displaying payment and credit options in a user interface in accordance with various embodiments.
  • Some or all of the process 600 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
  • the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
  • some or all of the process 600 may be performed by any suitable system, such as a server in a data center or by the computing device 800 described in conjunction with FIG. 8 .
  • the process 600 illustrates in further detail the operations performed in 512 of FIG. 5 .
  • the fidelity score generated by the system performing the process 500 of FIG. 5 is obtained by the system performing the process 600 .
  • the systems performing the processes 500 and 600 may be the same or different systems.
  • a set of potential payment options are obtained.
  • Each of the payment/credit options of the set may be associated with a fidelity score range.
  • For example, a fidelity score range of 0-60 may be associated with the option "Pay by credit card." Thus, if the fidelity score of the user falls within the range of 0-60, the option to pay by credit card may be displayed in the interface for the user.
  • the set of payment/credit options may be arranged in various orders, such as with the most conservative payment/credit options first and the most generous payment/credit options last in the set. Thus, starting with the first payment/credit option of the set, the system performing the process may proceed to 606.
  • the system performing the process 600 may determine whether the fidelity score obtained in 602 falls within the range associated with the current payment/credit option being considered.
  • the fidelity score may indicate an estimated risk of default, with a higher number meaning greater risk.
  • the fidelity score may be generated such that a lower number indicates a greater risk.
  • a negative fidelity score may indicate an above average risk of default while a positive fidelity score may indicate a below average risk of default, or vice versa.
  • certain payment and credit options may be shown to or hidden from the user if the fidelity score crosses/exceeds a particular high threshold, while in other implementations, certain payment and credit options may be shown to or hidden from the user if the fidelity score crosses/falls below a particular low threshold.
  • payment and credit options are shown if the fidelity score lies within a range of values (e.g., an upper and lower threshold). If the fidelity score does not fall within the range associated with the current payment/credit option, the system performing the process 600 may return to 604 to obtain the next payment/credit option. However, if the fidelity score falls within the range of the current payment/credit option, the system performing the process 600 may proceed to 608 to select the payment/credit option for display in the checkout user interface.
  • the system performing the process 600 determines whether it has assessed all of the possible payment and credit options in the set of payment/credit options. If not, the system may return to 604 to retrieve the next payment/credit option and repeat the operations of 604-10. However, if the last payment/credit option has been processed, the system may proceed to 612, whereupon the selected payment/credit options from 608 may be displayed in a checkout user interface for finalizing the transaction. For example, if the fidelity score is 40 (i.e., a 40% chance of default of payment), and a first payment/credit option has a range of 0-20, the fidelity score can be seen to be out of range of that payment/credit option.
  • Likewise, the operations of 604 may retrieve a second payment/credit option that has a range of 0-35, which again does not cover the fidelity score in this example.
  • If the operations of 604 then retrieve a third payment/credit option that has a range of 0-55, it can be seen that the user qualifies for this payment/credit option. Consequently, in 608 the third payment/credit option would be among the payment/credit options selected.
  • This process may repeat from 604 through 610 until all possible payment and credit options are considered, and then in 612 the third payment/credit option and any others whose range overlapped the fidelity score and were selected in 608 may be displayed on the checkout user interface for the user. It is contemplated that the process 600 is only one of several possible methods for selecting payment and credit options based on a fidelity score. Note also that one or more of the operations performed in 602-20 may be performed in various orders and combinations, including in parallel.
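  • The range check of 604-610 and the worked example above (a score of 40 against ranges 0-20, 0-35, and 0-55) could be sketched as follows; the option labels are assumptions.

```javascript
// Sketch of the selection loop: an option is selected for display when the
// fidelity score falls within the range associated with that option.
function selectOptionsForDisplay(fidelityScore, options) {
  return options.filter(
    (option) => fidelityScore >= option.min && fidelityScore <= option.max
  );
}

const options = [
  { label: 'Pay in 12 monthly installments', min: 0, max: 20 },
  { label: 'Pay in 30 days', min: 0, max: 35 },
  { label: 'Pay by credit card', min: 0, max: 55 },
];

// Fidelity score of 40: only the third option's range (0-55) covers it.
console.log(selectOptionsForDisplay(40, options).map((o) => o.label));
// -> [ 'Pay by credit card' ]
```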
  • FIG. 7 is a flowchart illustrating an example of a process 700 for generating a fidelity score in accordance with various embodiments.
  • Some or all of the process 700 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
  • the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
  • process 700 may be performed by any suitable system, such as a server in a data center or by the computing device 800 described in conjunction with FIG. 8 .
  • the process 700 includes a series of operations wherein sets of behavioral data, verification data, and internal data are transformed into a set of variables and corresponding variable values, which are then input into a data model to generate the fidelity score.
  • behavioral data is transformed into one or more variables and corresponding variable values.
  • behavioral data may include data about the interactions between the user and the user interface being used to conduct a transaction. Examples may include variables such as time_between_clickdown_submit_and_clickrelease, user_misspelled_last_name, and time_spent_reading_terms_and_conditions.
  • verification data may be transformed into one or more variables and corresponding variable values.
  • verification data may be data verifying one or more elements of personally identifiable information provided by the user.
  • variables generated from verification data include variables such as email_associated_with_user_name, user_name_associated_with_phone_number, user_name_associated_with_address, and so on.
  • Values for the verification data variables may be obtained from internal data (e.g., data regarding previous transactions conducted by the user, etc.) or may be obtained with reference to an external data source, such as a credit bureau service or a telephone directory service.
  • internal data is transformed into one or more variables and corresponding variable values.
  • internal data may include details regarding previous transactions conducted by the user.
  • variables generated from internal data may include variables such as number_of_purchases_in_last_3_months, amount_of_last_purchase, payment_method_of_last_purchase, and so on.
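  • The transformations of 702, 704, and 706 might produce a flat map of variables and values as in the sketch below; the raw input field names are assumptions chosen only to line up with the variable names given in the text.

```javascript
// Sketch: transform behavioral, verification, and internal data into the flat
// variable/value pairs consumed by the data model. Input field names are assumed.
function toVariables({ behavioral, verification, internal }) {
  return {
    // 702: behavioral data
    time_between_clickdown_submit_and_clickrelease:
      behavioral.submitReleaseTime - behavioral.submitPressTime,
    time_spent_reading_terms_and_conditions: behavioral.termsReadSeconds,
    user_misspelled_last_name: behavioral.misspelledLastName ? 1 : 0,
    // 704: verification data
    email_associated_with_user_name: verification.emailMatchesName ? 1 : 0,
    user_name_associated_with_address: verification.nameMatchesAddress ? 1 : 0,
    // 706: internal data
    number_of_purchases_in_last_3_months: internal.purchaseCountLast3Months,
    amount_of_last_purchase: internal.lastPurchaseAmount,
  };
}
```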
  • the system performing the process 700 may determine whether additional data is available. For example, if a previous attempt to generate a fidelity score was made, and, per a determination such as the determination made via the operations of 508 of FIG. 5 , the available behavioral data, verification data, and internal data was insufficient to generate a reliable fidelity score, the system of the present disclosure may have made a call to an external data source, such as a credit bureau, to obtain additional information. Thus, if such additional information is available, the system performing the process 700 may proceed to 710 , whereupon the additional data is transformed into one or more variables having corresponding values. An example of such a variable may be credit_score.
  • the various variables and their corresponding values generated in steps 702 - 10 may be passed through a data model.
  • the data model may be any of various types of data models, such as a random forest generated from a large data set of similar data, a neural network, a vector machine, another supervised learning algorithm, or some other regression model generated from a sample set representative of typical user data for conducting transactions in the manner described in the present disclosure.
  • the output of the model may be indicative of a risk of default on payment by the present user.
  • the output may be provided to the merchant, some other entity, or to an internal process for determining which payment and credit options to present to the user in an interface, such as a checkout-type interface.
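  • As one concrete (and simplified) form the data model could take, the sketch below scores the variable map with a logistic regression; the weights and bias are illustrative placeholders for coefficients that would be learned from historical transaction data.

```javascript
// Sketch: a logistic regression over the variable map. The output is treated
// as a probability of default on payment, scaled to 0-100 to match the
// fidelity-score examples above. Weights and bias are illustrative.
function fidelityScore(variables, weights, bias = 0) {
  let z = bias;
  for (const [name, value] of Object.entries(variables)) {
    z += (weights[name] ?? 0) * value;
  }
  const probabilityOfDefault = 1 / (1 + Math.exp(-z)); // logistic function
  return probabilityOfDefault * 100;
}

// Illustrative call with made-up weights: roughly 8.3 (an 8.3% estimated risk).
console.log(
  fidelityScore(
    { user_misspelled_last_name: 1, time_spent_reading_terms_and_conditions: 32 },
    { user_misspelled_last_name: 1.2, time_spent_reading_terms_and_conditions: -0.05 },
    -2
  ).toFixed(1)
);
```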
  • a statistical certainty value may accompany a fidelity score, indicating whether enough information was provided to the model to generate a reliable fidelity score. That is, if some information was unavailable to be provided to the model, the statistical certainty value may indicate whether, if the missing information were available and favored positive or negative risk assessment, the fidelity score would change such that payment and credit options provided to the user would also change.
  • the statistical certainty value may be output in various formats. For example, the statistical certainty value may be in the form of a variance/standard error (e.g., standard deviation) or other value indicative of how the fidelity score could change based on a worst-case/best-case scenario.
  • the model determines a fidelity score for a particular user that indicates that the user has a 5% chance of default on payment.
  • this fidelity score may have been generated by the model without having certain information (i.e., some behavioral data, verification data, and/or internal data may have been unavailable).
  • a statistical certainty value may indicate that, if certain unavailable information favored higher risk (e.g., worst-case information), the user's risk of default could rise to 30%.
  • the system of the present disclosure may make a determination whether, given the potential risk of default of 30%, the cost of obtaining additional data from an external source is justified by the risk it would mitigate. It may be that the system determines that 30% is an acceptable risk when weighed against the costs in time and money associated with obtaining the additional data.
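  • One way to realize the statistical certainty value and the cost/benefit decision described above is to re-score with the missing variables imputed at best-case and worst-case values, as in the sketch below; scoreWith, the imputation bounds, and the 30% acceptable-risk figure (taken from the example) are assumptions.

```javascript
// Sketch: bound the fidelity score by imputing each missing variable at its
// best-case and worst-case value; the spread serves as a certainty indicator.
// scoreWith stands in for the model call described above.
function certaintyBounds(availableVariables, missingBounds, scoreWith) {
  const bestCase = { ...availableVariables };
  const worstCase = { ...availableVariables };
  for (const [name, bounds] of Object.entries(missingBounds)) {
    bestCase[name] = bounds.best;   // value favoring a low risk of default
    worstCase[name] = bounds.worst; // value favoring a high risk of default
  }
  return { low: scoreWith(bestCase), high: scoreWith(worstCase) };
}

// Only pay for external data when the worst case exceeds the acceptable risk,
// e.g. a nominal score of 5% with a worst case of 30%.
function shouldFetchExternalData(bounds, acceptableRisk = 30) {
  return bounds.high > acceptableRisk;
}
```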
  • one or more of the operations performed in 702 - 20 may be performed in various orders and combinations, including in parallel.
  • the system of the present disclosure is configured as a service that provides the fidelity score and, in some cases, the statistical certainty value, based on internal data, verification data, and client data provided by a subscriber to the service.
  • the service may be offered to subscribers as a faster and cheaper alternative for obtaining credit risk data than traditional credit bureau services.
  • the system of the present disclosure is configured to update a checkout page with items other than payment and credit options, or may be configured to update one or more webpages with information different than payment and credit options but based on collected client data.
  • For example, if the client data suggests that the user is within a certain age group, the system of the present disclosure may present product offers believed to be of interest to such an age group, such as clothing of certain styles and so on.
  • Likewise, if the client data suggests that the user is within a certain age group or participates in certain high risk activities (such as conducting transactions while in a moving vehicle), this information may be usable by the system in determining insurance premiums, loan interest rates, whether to offer an option for a reverse mortgage, and so on.
  • Note that executable instructions (also referred to as code, applications, agents, etc.) described as performing operations that "instructions" do not ordinarily perform unaided denote that the instructions are being executed by a machine, thereby causing the machine to perform the specified operations.
  • FIG. 8 is an illustrative, simplified block diagram of an example computing device 800 that may be used to practice at least one embodiment of the present disclosure.
  • the computing device 800 may be used to implement any of the systems illustrated herein and described above.
  • the computing device 800 may be configured for use as a data server, a web server, a portable computing device, a personal computer, or any electronic computing device.
  • the computing device 800 may include one or more processors 802 that may be configured to communicate with, and are operatively coupled to, a number of peripheral subsystems via a bus subsystem 804 .
  • the processors 802 may be utilized for the traversal of decision trees in a random forest of supervised models in embodiments of the present disclosure (e.g., cause the evaluation of inverse document frequencies of various search terms, etc.).
  • These peripheral subsystems may include a storage subsystem 806 , comprising a memory subsystem 808 and a file storage subsystem 810 , one or more user interface input devices 812 , one or more user interface output devices 814 , and a network interface subsystem 816 .
  • The storage subsystem 806 may be used for temporary or long-term storage of information, such as details associated with transactions described in the present disclosure, databases of historical records described in the present disclosure, and decision rules of the supervised models described in the present disclosure.
  • the bus subsystem 804 may provide a mechanism for enabling the various components and subsystems of computing device 800 to communicate with each other as intended. Although the bus subsystem 804 is shown schematically as a single bus, alternative embodiments of the bus subsystem utilize multiple busses.
  • the network interface subsystem 816 may provide an interface to other computing devices and networks.
  • the network interface subsystem 816 may serve as an interface for receiving data from, and transmitting data to, other systems from the computing device 800 .
  • the network interface subsystem 816 may enable a data technician to connect the device to a wireless network such that the data technician may be able to transmit and receive data while in a remote location, such as a user data center.
  • the bus subsystem 804 may be utilized for communicating data, such as details, search terms, and so on to the supervised model of the present disclosure, and may be utilized for communicating the output of the supervised model to the one or more processors 802 and to merchants and/or creditors via the network interface subsystem 816 .
  • the user interface input devices 812 may include one or more user input devices, such as a keyboard, pointing devices such as an integrated mouse, trackball, touchpad, or graphics tablet, a scanner, a barcode scanner, a touch screen incorporated into the display, audio input devices such as voice recognition systems, microphones, and other types of input devices.
  • In general, use of the term "input device" is intended to include all possible types of devices and mechanisms for inputting information to the computing device 800 .
  • the one or more user interface output devices 814 may include a display subsystem, a printer, or non-visual displays such as audio output devices, etc.
  • the display subsystem may be a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), light emitting diode (LED) display, or a projection or other display device.
  • In general, use of the term "output device" is intended to include all possible types of devices and mechanisms for outputting information from the computing device 800 .
  • the one or more output devices 814 may be used, for example, to present user interfaces to facilitate user interaction with applications performing processes described herein and variations therein, where such interaction may be appropriate.
  • the storage subsystem 806 may provide a computer-readable storage medium for storing the basic programming and data constructs that may provide the functionality of at least one embodiment of the present disclosure.
  • The application programs, code modules, and instructions that, as a result of being executed by one or more processors, provide the functionality of one or more embodiments of the present disclosure may be stored in the storage subsystem 806 .
  • These application modules or instructions may be executed by the one or more processors 802 .
  • the storage subsystem 806 may additionally provide a repository for storing data used in accordance with the present disclosure.
  • the storage subsystem 806 may comprise a memory subsystem 808 and a file/disk storage subsystem 810 .
  • the memory subsystem 808 may include a number of memories, including a main random access memory (RAM) 818 for storage of instructions and data during program execution and a read only memory (ROM) 820 in which fixed instructions may be stored.
  • the file storage subsystem 810 may provide a non-transitory persistent (non-volatile) storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a Compact Disk Read Only Memory (CD-ROM) drive, an optical drive, removable media cartridges, and other like storage media.
  • the computing device 800 may include at least one local clock 824 .
  • the local clock 824 may be a counter that represents the number of ticks that have transpired from a particular starting date and may be located integrally within the computing device 800 .
  • the local clock 824 may be used to synchronize data transfers in the processors for the computing device 800 and all of the subsystems included therein at specific clock pulses and may be used to coordinate synchronous operations between the computing device 800 and other systems in a data center.
  • the local clock 824 is an atomic clock.
  • the local clock is a programmable interval timer.
  • the computing device 800 may be of various types, including a portable computer device, tablet computer, a workstation, or any other device described below. Additionally, the computing device 800 may include another device that may be connected to the computing device 800 through one or more ports (e.g., USB, a headphone jack, Lightning connector, etc.). The device that may be connected to the computing device 800 may include a plurality of ports configured to accept fiber-optic connectors. Accordingly, this device may be configured to convert optical signals to electrical signals that may be transmitted through the port connecting the device to the computing device 800 for processing. Due to the ever-changing nature of computers and networks, the description of the computing device 800 depicted in FIG. 8 is intended only as a specific example for purposes of illustrating the preferred embodiment of the device. Many other configurations having more or fewer components from the system depicted in FIG. 8 are possible.
  • the conjunctive phrases "at least one of A, B, and C" and "at least one of A, B and C" refer to any of the following sets: {A}, {B}, {C}, {A, B}, {A, C}, {B, C}, {A, B, C}.
  • conjunctive language is not generally intended to imply that certain embodiments require at least one of A, at least one of B and at least one of C each to be present.
  • Processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context.
  • Processes described herein may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs or one or more applications) executing collectively on one or more processors, by hardware or combinations thereof.
  • the code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors.
  • the computer-readable storage medium may be non-transitory.

Abstract

A system obtains a first set of data associated with interactions by a user conducting a transaction using an interface of a computing device, a second set of data associated with a previous transaction, and a third set of data associated with verification of a user identity. The system generates a first score based at least in part on the first, second, and the third sets of data. If the first score does not cross a threshold, the system updates the interface based at least in part on the first score. However, if the first score crosses the threshold, the system obtains, from an external entity, a fourth set of data associated with the user, generates a second score based at least in part on the fourth set of data, and updates the interface based at least in part on the second score.

Description

    BACKGROUND
  • These days, many transactions are conducted through the Internet, typically by consumers using a browser with a personal computing system or through an application running on a mobile device, such as a smartphone. In order to keep transaction costs low, merchants, creditors, peer-to-peer lenders, and payment services go to considerable effort to mitigate the risk of default on a transaction by a consumer. To this end, some merchants and payment services may contact a private credit bureau to obtain the consumer's credit score. However, contacting a credit bureau during an online transaction is costly, both in terms of time and money, because such credit bureaus typically charge for their services and because of latency involved in contacting and receiving an answer.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Various embodiments in accordance with the present disclosure will be described with reference to the drawings, in which:
  • FIG. 1 illustrates an example of online merchant webpages communicating with an application programming interface in accordance with an embodiment;
  • FIG. 2 illustrates an example of a risk engine in accordance with an embodiment;
  • FIG. 3 illustrates an example of different screens presented to different users based on a fidelity score in accordance with an embodiment;
  • FIG. 4 illustrates an example of selecting different payment options based on a fidelity score in accordance with an embodiment;
  • FIG. 5 is a flowchart that illustrates an example of displaying payment options based on a fidelity score in accordance with an embodiment;
  • FIG. 6 is a flowchart that illustrates an example of selecting payment options for display based on a fidelity score in accordance with an embodiment;
  • FIG. 7 is a flowchart that illustrates an example of generating a fidelity score in accordance with an embodiment; and
  • FIG. 8 illustrates an environment in which various embodiments can be implemented.
  • DETAILED DESCRIPTION
  • In one example, the system of the present disclosure provides executable instructions, capable of capturing measurements of user interactions with the particular user interface, to a computing device hosting the particular interface. For example, the executable instructions may be a set of JavaScript instructions configured to execute in a web browser in association with one or more webpages of an online merchant website. The executable instructions, as a result of being executed, may collect information such as mouse tracking data (e.g., paths of a cursor, elements in the webpage being hovered over by a cursor), identity of the previous webpage, time between receiving input by the user (e.g., time between clicks of a mouse), identities of elements clicked on in the webpages, identity of the current webpage, and so on. In other examples, the executable instructions may be in the form of a browser plug-in, a Java applet or other embedded executable object, or at least a subset of executable instructions of a standalone application (e.g., software application for a computing device, mobile application for a smart phone, etc.).
  • The system of the present disclosure may accumulate this set of measurements up to a point where the user is ready to apply for credit or select payment options for the transaction. Alternatively, in some implementations the computing device hosting the particular interface accumulates the set of measurements and, before payment method selection by the user, may provide the set of measurements and personally identifiable information (e.g., first and family name, address, telephone number, email address, etc.) to the system of the present disclosure. The personally identifiable information or set of measurements may also include information such as Internet protocol address of the computing device, browser type, operating system, and so on. The personally identifiable information and set of measurements may be referred to cumulatively as a set of client data; that is, data obtained from the client device of the user.
  • The system may, at this point, obtain a set of internal data associated with the user. The set of internal data may include details stored with this system about previous transactions by the user, such as types of previous purchases, amounts of purchases, payment history, and so on. In some cases, internal data may include geographic and demographic data. In some implementations, the set of internal data includes specific details about one or more previous transactions, while in other implementations the specific details of one or more previous transactions are aggregated into a summary of attributes (e.g., has the user paid before, and if so, when did the user pay, what was the average payment, has the user previously been late with payment, did user pay debt off early, does the user only pay the minimum payment or less or more than the minimum payment, etc.), while, in still other implementations, the specific details are processed (e.g., by a neural network, by a vector machine (relevance or support), by a random forest, or some other supervised learning algorithm) to yield an internal data score, which may be used in part to generate a fidelity score, described below.
  • The system may also obtain, from an external source, a set of verification data associated with the user. The set of verification data may include verification data regarding the personally identifiable information. For example, the set of verification data may include simple verification data regarding whether the address provided is a valid address, whether the telephone number is a valid telephone number, whether the provided full name is known by the external source, and so on. Additionally or alternatively, the set of verification data may include more complex verification data, such as whether the address is known to be associated with the provided last and/or given name, or whether the provided email address is known to be associated with the user name by the external source.
  • Based on the set of client data, a set of user behaviors and characteristics may be presumed. For example, based on demographic data, certain browser types may be more likely to be associated with certain types of users than others. For example, use of a version of the Internet Explorer browser that is known to have been distributed preinstalled with a computing device operating system may suggest that the user is less fickle than a user using the Google Chrome browser. Similarly, if characteristics of the browser suggest that the user is using an anonymous browser or mimicking another browser type, this may be an indication that the user may be less reliable when making credit commitments. Data received about the type and version of the operating system being used by the client device of the user may be comparably useful. Similarly, if the user hovers the cursor over a submit button for at least a threshold amount of time (as determined from the set of measurements), this may indicate that the user is applying careful consideration before completing the transaction, suggesting that the user may be more reliable in making credit commitments than a user who quickly clicks the submit button. Note that such reliability data is presented for illustrative purposes only, and an amount of weight to give to such behaviors may be based on aggregated transaction and statistical data.
  • The weights to be associated with the behavioral and characteristic determinations, the set of internal data, and the set of verification data may be used to generate a fidelity score that may reflect a projected likelihood of default on payment by the user. This fidelity score may be generated by a logistic regression, a random forest similar to the supervised models described in U.S. patent application Ser. No. 14/820,468, entitled “INCREMENTAL LOGIN AND AUTHENTICATION TO USER PORTAL WITHOUT USERNAME/PASSWORD,” U.S. patent application Ser. No. 14/830,686, entitled “METHOD FOR USING SUPERVISED MODEL TO IDENTIFY USER,” and U.S. patent application Ser. No. 14/830,690, entitled “METHOD FOR USING SUPERVISED MODEL TO CONFIGURE USER INTERFACE PRESENTATION,” incorporated by reference herein, or by some other classification algorithm.
  • In some cases, the resulting fidelity score may suggest that the user has a likelihood of default above a particular threshold, or the information available on the user (i.e., the client data, internal data, and verification data), may be insufficient to generate a reliable fidelity score (e.g., a minimum number of input values are not available), in which case, the system may submit a request to an external entity (e.g., a credit bureau, a bank, etc.) for additional data. This additional data may include information such as a credit score or credit history of the user, verification of a minimum bank balance, and so on. This additional data may be used to generate a new fidelity score, which in some implementations, includes the original fidelity score as a factor in generating the new fidelity score. Based on either the original fidelity score (if the score indicates that the user is sufficiently reliable) or the new fidelity score, the system may cause the particular user interface being used by the user to conduct the transaction to update in order to display certain payment or credit options that the user has been determined to qualify for.
  • In the following description, various techniques will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of possible ways of implementing the techniques. However, it will also be apparent that the techniques described below may be practiced in different configurations without the specific details. Furthermore, well-known features may be omitted or simplified to avoid obscuring the technique being described.
  • Techniques described and suggested in the present disclosure improve the field of computing, specifically the field of e-commerce, by determining risk of default of payment by users based at least in part on interactions between the users and an e-commerce user interface. An e-commerce site to which techniques of the present disclosure can be applied may be a website for buying and/or selling goods or services, such as an interface (e.g., software application residing on a local device, website, or other interface) for a retail outlet, discount house, wholesale outlet, bank, credit provider, currency exchange service, insurance provider, investment services provider, debt resolution service, brokerage, bazaar, auction house, shopping center, boutique, supermarket, chain store, thrift shop, flea market, sales kiosk, concession stand, trade fair, or consignment house. Additionally, techniques described and suggested in the present disclosure improve the usability of computing systems by reducing the latency of risk determination by making the risk determination without recourse to external data. Moreover, techniques described and suggested in the present disclosure are necessarily rooted in computer technology in order to overcome problems specifically arising with network latency caused by obtaining credit risk data from external sources.
  • FIG. 1 illustrates an aspect of an environment 100 in which an embodiment may be practiced. As illustrated in FIG. 1, the environment 100 may include a set of web pages 120 of a merchant system 118, with at least one of the set of web pages 120 being a checkout page 122 for finalizing a transaction. The set of web pages 120 may be static web pages, may be dynamic web pages generated based on one or more templates or executable instructions (client-side and/or server-side), or may be a combination of static and dynamic web pages. The set of web pages may be designed to provide information to consumers about a merchant of the merchant system 118 and/or products or services being offered by the merchant. Executable code embedded in the checkout page 122 may cause a computing device used by the user in conducting the transaction to collect personally identifiable information (e.g., name, geographic address, telephone number, email address, date of birth, etc.) entered by a user as well as behavioral data relating to interactions between the user and the set of web pages 120 (e.g., time and identity of web page elements clicked on, hovered over, etc.) as client data 124. The executable code in the checkout page 122 may cause the computing device to make an application programming interface (API) call, such as a call to a GetCredit( ) API 128 of a scoring engine 126, to begin a process of determining a set of payment or credit options to present to the user in the checkout page 122. In some implementations, the data may be captured by calling one or more APIs of the operating system of the computing device. In other implementations, the data may be captured by executing code having direct access to hardware of the computing device (e.g., keyboard, mouse, touch screen, and/or other input devices).
  • The merchant system 118, in some examples, may be a single device and, in other examples, may be a distributed computer system comprising multiple devices that operate differently such that the distributed computer system performs the operations described (i.e., all operations of the merchant system 118 may not necessarily be performed by a single device). The merchant system 118 may be configured to present a website or other Internet-accessible platform for providing goods and/or services to users/consumers at a price. The merchant system 118 may have one or more web pages comprising the set of web pages 120 that are configured to interact with a user/consumer. That is, web pages of the set of web pages 120 may be configured to display images of various products, descriptions of the various products, reviews of various products, and prices of the various products. Web pages of the set of web pages may have one or more embedded controls, such as clickable images and HyperText Markup Language (HTML) form elements, configured to allow the user/consumer to navigate the website, search for products, compare products, view larger images of the products, post reviews of the product, add/remove products to an online shopping cart, enter delivery and billing information, log in and manage an account profile with the merchant, and so on.
  • Executable code may be embedded in one or more of the set of web pages 120 to collect details (also referred to as a set of measurements) about user interactions with the webpages. For example, each of the set of web pages 120 may include JavaScript or other client-side executable code that keeps a record of certain actions performed by a user (also referred to herein as a set of measurements). The code may be provided by the merchant or may be embedded in the merchant web pages in an HTML inline frame (iframe), with the iframe source being provided by a payment service provider (which may be the same provider as the provider of system of the present disclosure). The record may be a set of measurements of actions performed by the user, and each measurement of the set of measurements may include information such as a user or session identifier for the particular user/session, a detected action (e.g., onClick, onDblClick, onMouseOver, onMouseOut, onMouseDown, onMouseUp, onMouseMove, and onSubmit event triggers, etc.) performed by the user using a user input device (e.g., mouse, trackball, touch screen, keyboard, light pen, game controller, fingerprint scanner, etc.), and a timestamp for the detected action.
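  • A minimal capture script matching the record described above might look like the sketch below; the session identifier and the list of tracked events are assumptions rather than the disclosed implementation.

```javascript
// Sketch: record user-interface events with timestamps in the shape of the
// set of measurements described above. Session ID generation is illustrative.
const measurements = [];
const sessionId = Math.random().toString(36).slice(2);

for (const eventType of ['click', 'dblclick', 'mouseover', 'mouseout', 'mousedown', 'mouseup', 'submit']) {
  document.addEventListener(
    eventType,
    (event) => {
      measurements.push({
        sessionId,
        action: eventType, // detected action, e.g. 'mousedown'
        target: event.target instanceof Element ? (event.target.id || event.target.tagName) : null,
        timestamp: Date.now(), // timestamp for the detected action
      });
    },
    true // capture phase, so events are recorded even if handlers stop propagation
  );
}
```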
  • In some implementations, gaze detection data is used to generate behavioral data. For example, many mobile devices are configured to detect an area of a screen receiving the focus of the gaze of the user, referred to as gaze detection or eye tracking. Gaze detection data may be obtained from an image sensor or some other optical sensor that senses reflections of light from the eye. The gaze detection data could be used to determine how long the user spends reading certain portions of the terms and conditions. Similarly, an accelerometer, global positioning system receiver, gyroscope, microphone, and other sensors may provide data useful for generating behavioral data. For example, if the background noise indicates that the user is conducting the transaction in a noisy environment, such as a public bus or sporting event, or an accelerometer suggests that the user is conducting the transaction in a moving vehicle, this could be indicative that the user is hurriedly conducting the transaction and may not be giving it sufficient consideration. In some examples, "behavioral data" may refer to inferences made by the system based on interactions between the user and the user interface. That is, the record of actions performed by the user may be used by the system to infer behavior of the user. The behavioral data and personally identifiable information may be cumulatively referred to as the client data 124.
  • Note that it is contemplated that executable code other than JavaScript code and interfaces other than HTML interfaces may be used by the merchant system 118. For example, the merchant system 118 may additionally or alternatively provide standalone mobile applications for devices like smartphones and tablet computing devices for conducting online transactions. Note, too, that the merchant corresponding to the merchant system 118 may provide services rather than products; for example, the merchant system 118 may be a system to provide an online presence (i.e., e-commerce site, as described above) for commerce and financial transactions. Examples for the merchant system 118 would include a system for providing a network (e.g., Internet, local area network, wide area network, etc.) accessible site for a bank, credit union, stockbroker, roofing contractor, landscaper, cleaning service, peer-to-peer lender, and so on.
  • At least one web page of the set of web pages 120 may be the checkout page 122. The checkout page 122 may be one or more pages configured to handle finalizing the transaction. For example, an initial checkout page may allow the user/consumer to review details (e.g., quantities, colors, sizes, prices, etc.) of items in the shopping cart, other checkout pages may allow the user/consumer to log into his/her account with the merchant system 118, to select from a set of delivery options, select gift wrap options, and/or enter delivery and billing addresses. Another checkout page 122 may involve presenting the user/consumer with one or more payment/credit options. It is an objective of the present disclosure that the one or more payment/credit options presented to the user/consumer may be presented based on the collected client data 124. Like the other web pages 120, the payment or credit options on the checkout page 122 may be generated by a server of the merchant based on the fidelity score and/or other information received from the scoring engine 126, or, alternatively, the payment or credit options on the checkout page 122 may be presented through an iframe, application software, or software development kit, provided by a payment service provider.
  • In order to determine the one or more payment/credit options to present to the user/consumer, the client data 124 may be fed to the scoring engine 126 through the GetCredit( ) API 128. The scoring engine 126 of FIG. 1, upon receiving the client data through the GetCredit( ) API 128 generates a series of variables 130 and corresponding values based on the client data. For example, a variable “given name” may have the value of “Winston,” a variable “family name” may have the value of “Churchill,” a variable of “time_spent_reading_terms_and_conditions” may have a value of 32 seconds, a variable of “browser_version” may have a value of “8.0,” a variable of “time_between_clickdown_submit_and_clickrelease” may have the value of 2103 milliseconds, and so on.
  • The scoring engine 126 may next pass the variables 130 to the risk assessment API 132 in the risk engine 134 to perform a risk assessment of the user/consumer based on the variables 130 and their values. The risk engine 134 may be comprised of one or more computing devices and/or software configured to perform a risk assessment of a user/consumer based on the variables 130 provided. The result of the risk assessment may be a value representing an estimated risk of default of payment by the user/consumer, may be a set of executable code (e.g., executable JavaScript) and/or HTML with payment or credit options, or may be some other output as appropriate based on the determined risk associated with the user/consumer.
  • The risk engine 134 may pass the variables 130 through the risk model, described in further detail with respect to FIG. 2. The risk engine 134 may reference external data 136 of a third party entity, such as a credit bureau service, in the course of performing its risk assessment. Data obtained from the external data source 136 may be data such as credit scores, confirmation of personally identifiable information (e.g., name, address, telephone number, email address, date of birth, etc.), and so on. Such data about the user/consumer may be stored in the cache 140, and, in some cases, if such external data about the user/consumer is located in the cache and is not out of date, such information may be retrieved from the cache 140 without resorting to retrieving it from the external data source 136. Likewise, personally identifiable information may be used to retrieve details about previous transactions involving the user/consumer from internal data 138 of the scoring engine 126.
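  • The cache lookup described above could be sketched as follows; the freshness window and the queryBureau helper are assumptions.

```javascript
// Sketch: return cached external data when it is present and not out of date;
// otherwise query the external source (queryBureau is a placeholder) and
// refresh the cache.
const ONE_DAY_MS = 24 * 60 * 60 * 1000;
const externalDataCache = new Map(); // keyed by a user identifier

async function externalData(userId, queryBureau, maxAgeMs = 30 * ONE_DAY_MS) {
  const entry = externalDataCache.get(userId);
  if (entry && Date.now() - entry.fetchedAt < maxAgeMs) {
    return entry.data;                     // fresh enough: skip the external call
  }
  const data = await queryBureau(userId);  // e.g. credit score, identity confirmation
  externalDataCache.set(userId, { data, fetchedAt: Date.now() });
  return data;
}
```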
  • As noted, the internal data 138 may include details about one or more previous transactions identified as having been conducted by the user/consumer. If the scoring engine 126 has no record of the user/consumer conducting a previous transaction, this lack of information may itself be a factor in computing a risk assessment for the user/consumer. For example, if details of the previous transaction recorded in the internal data 138 indicate that the user/consumer faithfully made payments for the previous purchases and paid in full, these details may weigh toward a favorable risk assessment. However if no records of previous transactions are found, the scoring engine 126 has no payment history for the user/consumer, and consequently this lack of payment history may weigh towards a less favorable risk assessment. On the other hand, records of previous transactions indicating that the user/consumer was unreliable with payments for previous purchases may weigh towards an unfavorable risk assessment.
  • In some embodiments, the details about previous customer transactions are obtained from a record matching service 146 of the type described in U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference, configured to store and retrieve records regarding previous transactions by the user/consumer. The record matching service 146 may be a system comprising one or more computing devices configured to at least identify the user/consumer from the client data 124. As an example, if the client data 124 include details for a large order of flowers by “Henry Gibson” from the merchant system 118, but records in the internal data 138 from the record matching service 146 corresponding to the name “Henry Gibson” are typically orders for writing materials, the record matching service 146 may determine that the user/consumer has a low probability of being the same “Henry Gibson” as was found in the internal data 138.
  • In addition to the internal and external data, the risk engine may generate its assessment at least in part from the values of the variables 130. For example, if the variable values indicate that the user/consumer added and removed an expensive item from the shopping cart multiple times, this behavioral data may indicate that the user/consumer is operating at a limit of what the user/consumer is able to afford. Consequently, this may weigh towards a less favorable risk assessment. As another example, if the variables 130 indicate that the user/consumer compared a variety of similar products before finally selecting the product with the best reviews, this behavioral data may indicate that the user/consumer has given careful thought to the purchase, and consequently this may weigh towards a more favorable risk assessment. As noted, other information in the personally identifiable information may be used to verify the stability of the user/consumer. For example, if the name and address provided by the user are checked against a database, such as the external data source 136, and the external data source 136 responds that the name and address are indeed known to be associated with each other, then this may weigh towards a more favorable risk assessment. On the other hand, if the external data source 136 is unable to find a match against the name and address provided by the user/consumer, this may weigh towards a less favorable risk assessment. Likewise, if only the family name provided by the user/consumer can be matched to the address provided by the user/consumer, this may weigh towards a more favorable risk assessment but not as much as if the full name provided by the user/consumer had been matched against the address. For example, a matching family name with a non-matching given name may suggest that a child of the household of that address is placing the order, rather than an adult, and consequently the child may be at a higher risk of default of payment on the transaction.
  • FIG. 2 illustrates another aspect of an environment 200 in which an embodiment may be practiced. As illustrated in FIG. 2, the environment 200 may include a user 244 using a computing device 242 to conduct a transaction. At a point before finalizing the transaction, client data, such as the client data 124 of FIG. 1, may be provided to a risk engine 234 in order to determine a risk of default of payment by the user 244. The risk engine 234 may be similar to the risk engine 134 of FIG. 1.
  • The user 244 may be an individual conducting a transaction through the computing device 242 or may be another entity authorized to conduct such a transaction for the individual. The user 244 may be a first-time user in the environment 200, or may have previously conducted transactions using the risk engine 234 of the environment 200, in which case, details of the previous transactions may be recorded in the internal data 206. The computing device 242 may be any type of computing device having one or more processors and memory, such as the computing device 800 of FIG. 8, capable of receiving input from the user 244 and communicating with components of the risk engine 234 through the network 248. The network 248 represents the path of communication between the user and the risk engine 234. Examples of the network 248 include the Internet, a local area network, a wide area network and Wi-Fi.
  • As an initial matter, client data may first be passed through a set of rules 202. That is, the model 204 of the risk engine 234 is configured to provide a risk assessment based on internal data 206, verification data 208, and behavioral data 210, but the set of rules 202 provide an initial screen of the user 244 to determine whether to proceed with the risk assessment. For example, if the user 244 is a returning user who has defaulted on a previous purchase, the set of rules 202 may be configured to immediately reject the user 244 or present the user 244 with a fixed set of payment or credit options, such as "prepayment required." In other words, the set of rules are rules that would not be overcome by a risk assessment from the model 204.
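  • The rule screen could be implemented as a short-circuiting list of predicates evaluated before the model, as in the sketch below; the individual rules shown are illustrative.

```javascript
// Sketch: rules that are not overcome by the model's risk assessment. Each
// rule returns a fixed outcome or null to fall through to the next rule and,
// ultimately, to the model-based assessment.
const rules = [
  (user) => (user.defaultedPreviously ? { options: ['Prepayment required'] } : null),
  (user) => (user.onBlockList ? { options: [] } : null), // illustrative rule
];

function screenUser(user) {
  for (const rule of rules) {
    const outcome = rule(user);
    if (outcome) return outcome; // fixed outcome: the model is not consulted
  }
  return null;                   // proceed to the model-based risk assessment
}

console.log(screenUser({ defaultedPreviously: true })); // { options: [ 'Prepayment required' ] }
```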
  • The model 204 may include a combination of hardware and software configured to output a risk assessment based on one or more of the internal data 206, the verification data 208, and the behavioral data 210. An advantage provided by the model 204 is that transactions can be processed and finalized more quickly, because an online transaction system may rely on a risk assessment generated by the model 204 without having to request a credit check of the user 244 by a third party entity, such as a credit bureau. Furthermore, because some credit bureau services charge a fee for credit checking services, cost savings may be achieved by relying on the risk assessment of the model 204. The risk engine 234 may be further configured to update/retrain the model 204 based on the collected data (e.g., client data, details about the transaction, etc.) by associating the collected data in a data store with payments by the user 244 as they are received in a timely, or, as the case may be, in a not so timely fashion from the user, and regenerating the logistic regression or retraining the neural network, vector machine, random forest, or other supervised learning algorithm upon which the model 204 is based.
  • The internal data 206 used by the model 204 may include, if the user is a returning user, details about previous transactions, such as addresses used in previous transactions, purchase amounts of previous transactions, methods of payment used in previous transactions, whether the user paid more or less than the minimum payment, whether the user accrued any late fees, whether any funds are still owed on previous transactions, and so on. If the user 244 is a new user rather than a returning user, this lack of a purchase history for the user 244 is itself internal data 206 that may be taken into consideration by the model 204. In some implementations, the user 244 is determined to be a new user if the user 244 creates a new account/user profile with the merchant or payment services provider involved in the transaction. In other implementations, the determination that the user 244 is a new user is made by a record matching service of the type described in U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference. In still other implementations, a determination that the user 244 is a new user is made by determining that there is not a match between personally identifiable information provided by the user 244 and records in the internal data 206. In the latter implementations, a user profile is created for a new user, and if the user 244 conducts transactions in the future, details of the future transactions may be stored in a data store hosting the internal data 206.
  • In some embodiments, the internal data 206 component incorporates the record matching service of U.S. patent application Ser. No. 14/820,468, U.S. patent application Ser. No. 14/830,686, and U.S. patent application Ser. No. 14/830,690, incorporated herein by reference. For example, the client data provided by the user 244 may be usable by the random forest of the record matching service to make a determination about the identity of the user. In some implementations, the random forest of the record matching service is trained on behavioral data obtained in a manner as described in the present disclosure. In this way, an identity of the user 244 may be determinable at least in part based on the user's interactions with the user interface, or those interactions may provide additional information usable to make a more certain determination about the identity of the user 244. In this manner, in some implementations, the user 244 need not create an account with a merchant; rather, the identity of the user is determined by the record matching service. In other implementations, the user 244 can create an account with the merchant.
  • In either embodiment, the record matching service includes detailed information about previous transactions likely conducted by the user 244 based on the client data received. Furthermore, in some implementations the behavioral data for the current transaction and previous transactions are stored in a data store accessible to the record matching service, and with recourse to the previously recorded behavioral data, the model 204 may be able to generate more accurate risk assessments. The verification data 208 may include one or more pieces of data indicating that the user 244 has provided accurate information. The verification data 208 may be based on personally identifiable information submitted with client data by the computing device 242. One example of personally identifiable information may be a device identifier (ID) for the computing device being used by the user to conduct the transaction. If the internal data 206 contains information that shows that the present user has previously conducted a transaction using the same device (based on the device ID), this may be indicative of stability (e.g., not making purchases using borrowed computing devices) and consequently weigh in favor of lower risk.
  • Another example of verification data 208 may be a simple result of true or false for whether the address provided by the user 244 actually exists. For example, some of the personally identifiable information (e.g., name, address, etc.) may be transmitted through the network communication to a third party entity, such as a credit bureau, to verify the user 244 associated with that personally identifiable information. Furthermore, each piece of verification data 208 may have an associated strength level. For example, a determination that an address exists may have a strength level of 1, indicating that it is a weak verification of the accuracy of information provided by the user 244. On the other hand, a piece of verification data 208 indicating that a given name and family name provided by the user 244 have been confirmed as being associated with the physical address, email address, date of birth, and telephone number also provided by the user may have a strength level of 5, indicating that it is a very strong verification of the accuracy of information provided by the user 244. As another example, verification that a given name and family name provided by the user are associated with an address provided by the user has a stronger strength level than a verification that only the family name, but not the given name, provided by the user could be verified against that address. In turn, verification that the family name is associated with the address has a stronger strength level than verification that the family name is associated with a postcode provided by the user but could not be verified against the address. In some embodiments, strength levels such as these are provided to the model 204 rather than Boolean values (e.g., yes/no, true/false, etc.).
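  • A minimal sketch of how such strength levels might be represented, assuming hypothetical verification-outcome names that roughly follow the examples above:
      # Illustrative mapping of verification outcomes to strength levels
      # (1 = weak, 5 = very strong). Outcome names are hypothetical.
      VERIFICATION_STRENGTH = {
          "address_exists": 1,
          "family_name_matches_postcode": 2,
          "family_name_matches_address": 3,
          "full_name_matches_address": 4,
          "full_name_matches_address_email_dob_phone": 5,
      }

      def strength_levels(confirmed_outcomes):
          # confirmed_outcomes: verification outcomes that were confirmed for the user
          return [VERIFICATION_STRENGTH[o] for o in confirmed_outcomes
                  if o in VERIFICATION_STRENGTH]

      print(strength_levels(["address_exists", "full_name_matches_address"]))  # [1, 4]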
  • Note that the verification data 208 may be a set of one or more pieces of data such as described above. In some instances, the verification data may be determined using the internal data 206. For example, if the user 244 is a returning user, the information provided in the present transaction may be compared with information provided in previous transactions to yield the verification data 208. If such verification data 208 cannot be obtained from the internal data 206, the model 204 may resort to obtaining at least some of the verification data 208 from a third-party data source 216, such as a credit bureau, a telephone directory, or an internet search engine.
  • The behavioral data 210 may be data associated with one or more actions performed by the user 244 in the course of conducting the transaction with the computing device 242. For example, if a record of actions performed by the user 244 indicates that the user 244 clicked a link to read “terms and conditions” of the transaction at a first time, clicked a checkbox next to the words “I agree with the terms and conditions” at a second time, and the difference between the first time and the second time is 15 minutes, one piece of behavioral data may be a variable and corresponding variable value, “studied_the_terms=15.” On the other hand, if the user clicked the checkbox agreeing to the terms and conditions without actually viewing the terms and conditions, that piece of behavioral data may be stored in a variable similar to “studied_the_terms=0.” The behavioral data 210 may additionally or alternatively include more granular data than this. For example, the executable code collecting the actions and the record of actions may collect data indicating that the user 244 scrolled quickly through various portions of the terms and conditions, but paused for 10 minutes on the section relating to consequences of nonpayment. Such behavioral data may be stored in a variable similar to “studied_default_on_payment_terms=10,” which may indicate that the user is concerned about an inability to pay, and consequently may be a less-favorable factor in risk assessment. Other examples of the behavioral data 210 include whether the personally identifiable information was automatically populated by the browser or a browser plug-in (indicating that the user has used the same information in previous transactions, and also indicating that the user is not trying to disguise him/herself, suggesting that the user may be more reliable), whether the user click-held the submit button for an amount of time suggesting that the user was being very thoughtful about the transaction (e.g., such as may be captured by onMouseDown and onMouseUp JavaScript event triggers), and whether the user had to make corrections while inputting the address data (possibly suggesting that the user is entering a fake address or that the user has not resided at that address for very long). Still another example of behavioral data 210 may be the user repeatedly decreasing the number of items in the shopping cart, which may suggest that the user has a limited budget.
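  • For illustration, a minimal sketch of deriving such timing variables from a recorded action log; the event names are hypothetical stand-ins for whatever the embedded executable code records, and timestamps are in seconds:
      # Illustrative derivation of behavioral variables from a recorded action log.
      def behavioral_variables(events):
          # events: list of (timestamp, action) tuples recorded by the embedded code
          times = {action: ts for ts, action in events}
          variables = {}
          opened = times.get("opened_terms")
          agreed = times.get("checked_agree_to_terms")
          if opened is not None and agreed is not None:
              variables["studied_the_terms"] = round((agreed - opened) / 60)  # minutes
          else:
              variables["studied_the_terms"] = 0
          down = times.get("submit_mousedown")
          up = times.get("submit_mouseup")
          if down is not None and up is not None:
              variables["time_between_clickdown_submit_and_clickrelease"] = up - down
          return variables

      # Example: terms opened, agreed to 15 minutes later, submit button held 3 seconds.
      log = [(0, "opened_terms"), (900, "checked_agree_to_terms"),
             (905, "submit_mousedown"), (908, "submit_mouseup")]
      print(behavioral_variables(log))
      # {'studied_the_terms': 15, 'time_between_clickdown_submit_and_clickrelease': 3}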
  • As noted, the third-party data source 216 may be a third party entity for providing verification data 208 about the user 244. For example, such verification data 208 could include confirmation that the name provided by the user 244 has been previously known to be associated with an address and/or telephone number also provided by the user 244 in the personally identifiable information. Thus, such a third-party data source 216 could be a telephone directory service, package delivery service, credit bureau, or mapping service.
  • The output 212 may include a value indicating a statistical certainty value of a fidelity score generated by the model based on the information provided to the model and information that was unable to be provided to the model 204. The statistical certainty value may be a variance indicating whether enough information was provided to the model to generate a reliable fidelity score. That is, available information from the internal data 206, the verification data 208, and the behavioral data 210 may be provided to the model, but not all information may be present. For example, the user may have left a telephone number field blank, no record of a previous purchase may be found in internal data, or an external source may be unable to confirm or deny some of the personally identifiable information. The model 204 may make a determination whether, if the missing values were provided, the resulting score would be affected such that payment/credit options would change for the user. If so, in some implementations, the system of the present disclosure queries an external entity for the additional data 214 in order to obtain a more certain fidelity score. Alternatively or additionally, in some implementations, the system responds by updating the user interface being used by the user to prompt the user for additional information (e.g., “Please input your telephone number,” “Your email address is required to proceed further,” “Please enter your mother's maiden name,” etc.).
  • The output 212 of the model 204 may, depending on implementation, take various forms. For example, the output 212 may be a simple “yes” (the user is a good credit risk), “no” (the user is a bad credit risk), or “not enough information.” As another example, the output may be a number representing an estimated percentage chance of default by the user 244. In the latter case, payment/credit options to present to the user 244 may vary based on the generated number. For example, 0-10% may enable display of a “No money down, pay in 30 days” option, 0-20% may enable display of a “Pay by credit card” option, and 0-30% may enable display of a “Pay on delivery” option. Likewise, 80-90% may only allow a “Pre-pay” option. Additionally or alternatively, terms of credit may be determined or altered based on the output. For example, output indicating a 0-5% risk of default may cause the user 244 to be presented with credit terms having an 8% interest rate, output indicating a 5-10% risk of default may cause the user 244 to be presented with credit terms having a 15% interest rate, output indicating a 10-20% risk of default may cause the user 244 to be presented with credit terms having a 29.99% interest rate, and so on. In this manner, credit terms may be determined for the user 244 through a risk assessment of behavioral data obtained from the user 244.
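  • A minimal sketch of such a tiering scheme, using the example percentages above; the thresholds, option labels, and rates are configurable values, not fixed features of the system:
      # Illustrative tiering of payment/credit options and interest rates by the
      # model's estimated chance of default (percentages mirror the examples above).
      def options_for_risk(default_pct):
          options = []
          if default_pct <= 10:
              options.append("No money down, pay in 30 days")
          if default_pct <= 20:
              options.append("Pay by credit card")
          if default_pct <= 30:
              options.append("Pay on delivery")
          if 80 <= default_pct <= 90:
              options = ["Pre-pay"]
          return options

      def interest_rate_for_risk(default_pct):
          if default_pct <= 5:
              return 8.0
          if default_pct <= 10:
              return 15.0
          if default_pct <= 20:
              return 29.99
          return None  # no credit terms offered at higher estimated risk

      print(options_for_risk(8))         # all three options above
      print(interest_rate_for_risk(8))   # 15.0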
  • In some cases, however, the internal data 206, behavioral data 210, and verification data 208 alone may be insufficient information to result in a reliable risk estimate (e.g., a “not enough information” result or a numeric range of 40-60%). In such a case, the model may retrieve additional data 214 from a credit bureau service in order to make a more reliable risk estimate. The additional data 214 may be a credit score of the user, identifying information about the user (e.g., current and past addresses, date of birth, known employers, name of spouse, national identification number, etc.), credit history of the user (e.g., list of bank accounts, list of credit card accounts, list of outstanding loans, how long an account has been open, credit limits on accounts, identities of co-signers, etc.), public records associated with the user (e.g., list of evictions reported, list of bankruptcies, list of tax liens, list of civil judgements), or other information about the user (e.g., identities of other entities who have requested a credit report on the user). In some cases, the additional data 214 may alone be sufficient for the risk assessment, and the additional data 214 may be provided as the output 212. In other implementations, the additional data 214 is fed back into the model 204, and the additional data 214 in combination with the internal data 206, the behavioral data 210, and the verification data 208 may be utilized to generate a new risk assessment for the output 212.
  • Note that an advantage provided by the system of the present disclosure is that in many cases a reliable risk assessment may be generated without resorting to using the additional data 214 from an external data source. This provides advantages in speed, usability, and cost. For example, not having to contact a credit service bureau to obtain a credit assessment of the user 244 may save 1.3 seconds or more of transaction processing time. Although 1.3 seconds may not seem like much, every second spent by a user waiting for his/her computing device to respond can have a negative effect on the user's overall satisfaction with the transaction process. Furthermore, because credit service bureaus may charge a fee for each credit check, a merchant or payment service provider may reap savings by being able to generate a reliable risk assessment without recourse to the additional data 214. However, in some cases there may be insufficient amounts of the internal data 206, the behavioral data 210, and the verification data 208 for the model 204 to generate a reliable risk assessment, and in such cases the additional data 214 may be needed.
  • FIG. 3 illustrates a couple of scenarios 300 from an embodiment of the present disclosure. Specifically, FIG. 3 depicts a first user 344A and a second user 344B at a checkout stage of separate transactions. As can be seen, client data for the first user 344A is transmitted to the rules engine 302 of the system of the present disclosure. As noted, the client data may include personally identifiable information and data reflecting interactions of the first user 344A with the user interfaces during the course of conducting the transaction being finalized. The rules engine 302 determines whether, based on the client data received, any rules exist that would preempt risk processing of the first user 344A by the model 304. If not, the client data of the first user 344A is passed to the model 304. Based on the client data, internal data 306A, verification data 308A, and behavioral data 310A are obtained or generated by the model 304 and processed to yield a first risk assessment 312A. Note that there are various methods by which the internal data 306A, the verification data 308A, and the behavioral data 310A may be processed. In one example, the internal data 306A may be used to generate an internal data score based on details about previous transactions conducted by the user 344A. Similarly, the verification data 308A may be used to generate a verification data score indicating a confidence level in the accuracy of information provided by the user 344A. Likewise, the behavioral data 310A may be used to generate a behavioral data score reflecting a level of intent to pay by the user 344A. Each of these three scores may be weighted and/or fed into a risk algorithm, random forest, regression model, neural network, vector machine, or other supervised learning algorithm to yield the risk assessment 312A. Alternatively, the internal data 306A, the verification data 308A, and the behavioral data 310A may be processed to yield a set of variables. An example of the set of variables may be something like: “time_spent_reading_overall_terms=15,” “time_spent_reading_default_terms=1,” “browser=IE,” “browser_version=10.0,” “time_between_clickdown_submit_and_clickrelease=3,” “transaction_amount=308.00,” “current_debt=0,” “transactions_last_5_months=3,” “last_transaction_payment_method=credit,” “paid_in_full=yes,” “last_transaction_amount=12.99,” “verified_address=yes,” “address_matches_last_transaction_address=yes,” “verified_full_name=yes,” “verified_email=yes,” “verified_phone_number=no,” and so on. The set of variables themselves may be weighted and/or fed into a risk algorithm, random forest, regression model, neural network, vector machine, or other supervised learning algorithm to yield the risk assessment 312A.
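  • For illustration, a minimal sketch of the first approach, in which the three intermediate scores are combined with weights; the weights and the 0-100 scale are hypothetical, and the alternative of feeding the raw variables into a supervised learning algorithm is equally contemplated above:
      # Illustrative weighted combination of the internal, verification, and
      # behavioral scores into a single risk assessment (higher = riskier).
      WEIGHTS = {"internal": 0.4, "verification": 0.3, "behavioral": 0.3}

      def risk_assessment(internal_score, verification_score, behavioral_score):
          return (WEIGHTS["internal"] * internal_score
                  + WEIGHTS["verification"] * verification_score
                  + WEIGHTS["behavioral"] * behavioral_score)

      print(risk_assessment(10, 20, 15))  # 14.5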
  • As can be seen from FIG. 3, the risk assessment 312A suggests that the first user 344A is estimated to have a low risk of default on payment. Consequently, executable code of the system of the present disclosure, as a result of being executed, may cause the payment user interface 322A to update to display a list of payment/credit options available (“Pre-Pay,” “Credit,” “Debit,” “Pay in 14 Days”) to the first user 344A based on the risk assessment 312A.
  • A similar process is performed for the second user 344B. That is, client data for the second user 344B is transmitted to the rules engine 302 for a determination whether, based on the client data, any rules exist that would preempt risk processing of the second user 344B by the model 304. If not, the client data of the second user 344B is passed to the model 304. As with the first user 344A, based on this client data, internal data 306B, verification data 308B, and behavioral data 310B are obtained or generated by the model 304 and processed in a manner similar to that described for the first user 344A to yield a second risk assessment 312B. In the case of the second user 344B, an example of the set of variables may be something like: “time_spent_reading_overall_terms=15,” “time_spent_reading_default_terms=10,” “browser=IE,” “browser_version=8.0,” “time_between_clickdown_submit_and_clickrelease=0,” “transaction_amount=1553.00,” “current_debt=0,” “transactions_last_5_months=1,” “last_transaction_payment_method=pay_in_14,” “paid_in_full=no,” “last_transaction_amount=76.95,” “verified_address=yes,” “address_matches_last_transaction_address=no,” “verified_full_name=no,” “verified_email=no,” “verified_phone_number=no,” and so on.
  • As can be seen from FIG. 3, the risk assessment 312B suggests that the second user 344B is estimated to have a high risk of default on payment. Consequently, the executable code of the system of the present disclosure, as a result of being executed, may cause the payment user interface 322B to update the list of payment/credit options available to the second user 344B based on the risk assessment 312B to allow only one method of payment, which is to require the second user 344B to pre-pay for the purchase before the merchant will agree to the transaction.
  • FIG. 4 illustrates an example scenario 400 of an embodiment of the present disclosure. Specifically, FIG. 4 depicts an example of selecting payment/credit options from a set of payment/credit options 442 for display in a checkout user interface 422 for the user. The set of payment/credit options 442 may be various options for remitting payment for purchases from a particular merchant by users (e.g., credit card, cash on delivery, deferred payment, bank withdrawal, payment service, etc.). Each merchant utilizing the system of the present disclosure may specify which payment/credit options should be included in the set of payment/credit options 442. Furthermore, each merchant utilizing the system of the present disclosure may also specify the ranges of fidelity scores necessary to enable the particular payment/credit options of the set of payment/credit options 442. In this manner, each merchant utilizing the system can customize which payment/credit options the merchant is willing to accept and how much risk the merchant is willing to accept for each particular payment or credit option.
  • FIG. 4 illustrates that, for the particular fidelity score generated for the user currently conducting the checkout, the generated fidelity score for the user falls outside of the specified ranges for at least a first payment/credit option 442A and a second payment/credit option 442B, as well as a second-to-last payment/credit option 442N-1. However, the generated fidelity score appears to have fallen within the ranges for a third payment/credit option 442C, a fourth payment/credit option 442D, and a last payment/credit option 442N. As a result, it is seen that after the checkout user interface 422 has been updated, the third payment/credit option 442C, the fourth payment/credit option 442D, and the last payment/credit option 442N are displayed to the user as available methods of payment, while the first payment/credit option 442A, the second payment/credit option 442B, and the second-to-last payment/credit option 442N-1 are not shown to the user. Note that updating the checkout user interface 422 may be achieved with JavaScript. Alternatively, however, the updated user interface 422 may be dynamically generated using server-side executable instructions as a security measure in order to prevent a user from accessing hidden payment and credit options. Such server-side executable instructions may be, depending upon implementation, executed by a web server of the merchant or may be executed by a server of a payment service via a call embedded in the merchant website, such as through an HTML iframe, software application, or software development kit.
  • FIG. 5 is a flowchart illustrating an example of a process 500 for determining payment and credit options based on a fidelity score in accordance with various embodiments. Some or all of the process 500 (or any other processes described, or variations and/or combinations of those processes) may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors. The executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media). For example, some or all of the process 500 may be performed by any suitable system, such as a server in a data center or the computing device 800 described in conjunction with FIG. 8. The process 500 includes a series of operations wherein a request is received by the system performing the process 500 to determine some payment or credit options for a particular user, client data is received regarding the user, a fidelity score (i.e., a risk assessment score) is generated, and options are displayed based on the fidelity score.
  • In 502, a request to determine a set of payment or credit options to present to a user in an interface is received. In some implementations, the request is received directly from the client device that the user is using to conduct the transaction. For example, executable code may be embedded in the user interface running in a browser or other application on the client device, that, as a result of being executed, causes the request to be made directly to the system performing the process 500. In other implementations, the request is received from a computing device of the merchant with whom the user is conducting a transaction. That is, the user may have just clicked a button (e.g., “Proceed to checkout”) which may cause the merchant computing system to submit the request for payment or credit options for the next interface screen to the system performing the process 500.
  • In 504, included with the request of 502 or in another request subsequent to the request of 502, client data may be received comprising personally identifiable information provided by the client device and the user him/herself, as well as a set of information about the interactions between the user and the user interface during the conduct of the transaction. This client data may be used to yield one or more variables and corresponding variable values, from which internal data, verification data, and behavioral data may be obtained, as described in connection with FIGS. 1 and 2.
  • In 506, based on the internal data, verification data, and behavioral data, a fidelity score (also referred to as a risk assessment or risk assessment score) may be generated. The fidelity score may indicate an estimated risk of default by the user conducting the transaction at hand. Consequently, in 508, the system performing the process 500 determines whether the resulting fidelity score is sufficiently reliable to determine which payment or credit options to present to the user without recourse to additional information. That is, a statistical certainty value, generated based on the available client data, may be a variance that indicates whether enough information was provided to the model to generate a reliable fidelity score. For example, it may be that the behavioral data included some factors suggesting a high risk of default as well as other factors suggesting a low risk of default by the user. In such a case, the system performing the process 500 may determine that insufficient information is available for a reliable fidelity score determination. In some implementations, additionally or alternatively, the process 500 may determine to obtain additional information if the generated fidelity score, using only the client data received in 504, is near a threshold value used for determining whether or not to show a payment/credit option. For example, if a threshold value for displaying a particular payment/credit option is 40 and the generated fidelity score is 41, the system performing the process 500 may determine to obtain additional information and regenerate the fidelity score. If the regenerated fidelity score indicates a lower risk of default (e.g., 39), the system performing the process 500 may determine that the regenerated fidelity score is more reflective of the actual credit risk of the user, and present the particular payment/credit option to the user.
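  • A minimal sketch of the near-threshold check described for 508, assuming a hypothetical margin value:
      # Illustrative check: if the fidelity score sits within a small margin of any
      # option's display threshold, request additional data and regenerate the score.
      def needs_additional_data(fidelity_score, option_thresholds, margin=2):
          return any(abs(fidelity_score - t) <= margin for t in option_thresholds)

      # Example matching the text: threshold 40, generated score 41.
      print(needs_additional_data(41, [40]))   # True  -> query a third party, rescore
      print(needs_additional_data(55, [40]))   # False -> proceed with current score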
  • As a result of determining that additional information is needed, the system may proceed to 510, whereupon the system may make a request to the computing device of a third party entity for additional data; for example, the system may request a credit score of the user from a credit service bureau. With this additional information, the system performing the process 500 may return to 506 to update the fidelity score based at least in part on the additional information. It should be noted that there may be one or more third-party entities from which the system can obtain additional information. For example, in 510, the system may query three different credit service bureaus for three different credit scores for the user. Additionally or alternatively, the system performing the process may repeat the operations of 506-510 multiple times, each time at 510 querying a different third party entity until a determination on a fidelity score is made. For example, at a first time at 510 the system may query a first third-party entity, which may provide additional information about the user (e.g., how long the user has resided at the given address, age of the user, etc.). This information may be obtained quickly and inexpensively, but may be less valuable for risk assessment than more expensively obtained additional information. Thus, in this example, if the information from the first third-party entity is determined in 508 to be insufficient for generating a reliable fidelity score, the system performing the process 500 may query a second third-party entity, which may be slower or may require a larger fee than the first third-party entity, for more additional information, and so on.
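  • The escalation described above might be sketched as follows; the provider list and the score_fn/is_reliable callables are hypothetical placeholders for an implementation's own scoring and reliability checks:
      # Illustrative escalation loop for 506-510: query progressively slower or more
      # expensive third-party sources only until the score becomes reliable.
      def score_with_escalation(client_data, providers, score_fn, is_reliable):
          data = dict(client_data)
          score = score_fn(data)
          for provider in providers:          # ordered cheapest/fastest first
              if is_reliable(score):
                  break
              data.update(provider(data))     # fetch additional data about the user
              score = score_fn(data)          # regenerate the fidelity score
          return score

      # Example with stub providers and a trivial stand-in scoring function.
      providers = [lambda d: {"years_at_address": 4}, lambda d: {"credit_score": 700}]
      result = score_with_escalation({"verified_address": 1}, providers,
                                     score_fn=lambda d: 50 - 10 * len(d),
                                     is_reliable=lambda s: s < 25)
      print(result)  # 20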
  • Lastly, in 512, the system may determine which payment or credit options to present to the user for selection and cause the client device of the user to display the determined payment/credit options. For example, if, despite attempting to obtain additional information, the system performing the process 500 was still unable to generate a reliable fidelity score, the system may err on the side of caution and only present conservative payment/credit options (e.g., “Pre-pay,” “Cash on Delivery,” etc.). Likewise, such conservative payment/credit options may be determined and displayed if the fidelity score for the user indicates that the user is at high risk of default on payment. On the other hand, if the generated fidelity score indicates that the user is at low risk of default, more generous payment/credit options may be determined and presented (e.g., “0% down, no payments for 90 days,” “Pay in 12 easy monthly installments,” etc.). Payment and credit options may be even more granular; for example, an installment payment option on a checkout webpage may include a slider bar allowing the user to select as many installments (e.g., three months, six months, 12 months, 24 months, etc.) as the fidelity score may allow. Note that one or more of the operations performed in 502-512 may be performed in various orders and combinations, including in parallel.
  • FIG. 6 is a flowchart illustrating an example of a process 600 for displaying payment and credit options in a user interface in accordance with various embodiments. Some or all of the process 600 (or any other processes described, or variations and/or combinations of those processes) may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors. The executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media). For example, some or all of the process 600 may be performed by any suitable system, such as a server in a data center or by the computing device 800 described in conjunction with FIG. 8. The process 600 illustrates in further detail the operations performed in 512 of FIG. 5.
  • In 602, the fidelity score generated by the system performing the process 500 of FIG. 5 is obtained by the system performing the process 600. Note that the systems performing the processes 500 and 600 may be the same or different systems. In 604, a set of potential payment options is obtained. Each of the payment/credit options of the set may be associated with a fidelity score range. For example, a fidelity score range of 0-60 may be associated with the option “Pay by credit card.” In other words, if the resulting fidelity score falls within a range of 0-60, the option to pay by credit card may be displayed in the interface for the user. The set of payment/credit options may be arranged in various orders, such as with the most conservative payment/credit options first and the most generous payment/credit options last in the set. Thus, starting with the first payment/credit option of the set, the system performing the process may proceed to 606.
  • In 606, the system performing the process 600 may determine whether the fidelity score obtained in 602 falls within the range associated with the current payment/credit option being considered. In some implementations, the fidelity score may indicate an estimated risk of default, with a higher number meaning greater risk. In other implementations, the fidelity score may be generated such that a lower number indicates a greater risk. In still other implementations, a negative fidelity score may indicate an above average risk of default while a positive fidelity score may indicate a below average risk of default, or vice versa. In some implementations, certain payment and credit options may be shown to or hidden from the user if the fidelity score crosses/exceeds a particular high threshold, while in other implementations, certain payment and credit options may be shown to or hidden from the user if the fidelity score crosses/falls below a particular low threshold. In the process 600, payment and credit options are shown if the fidelity score lies within a range of values (e.g., between an upper and a lower threshold). If the fidelity score does not fall within the range associated with the current payment/credit option, the system performing the process 600 may return to 604 to obtain the next payment/credit option. However, if the fidelity score falls within the range of the current payment/credit option, the system performing the process 600 may proceed to 608 to select the payment/credit option for display in the checkout user interface.
  • In 610, the system performing the process 600 determines whether it has assessed all of the possible payment and credit options in the set of payment/credit options. If not, the system may return to 604 to retrieve the next payment/credit option and repeat the operations of 604-610. However, if the last payment/credit option has been processed, the system may proceed to 612, whereupon the selected payment/credit options from 608 may be displayed in a checkout user interface for finalizing the transaction. For example, if the fidelity score is 40 (i.e., a 40% chance of default on payment), and a first payment/credit option has a range of 0-20, the fidelity score can be seen to be out of range of that payment/credit option. Consequently, the operations of 604 retrieve a second payment/credit option, which has a range of 0-35 and again falls short of the fidelity score in this example. However, if the operations of 604 retrieve a third payment/credit option that has a range of 0-55, it can be seen that the user qualifies for this payment/credit option. Consequently, in 608 the third payment/credit option would be among the payment/credit options selected. This process may repeat from 604 through 610 until all possible payment and credit options are considered, and then in 612 the third payment/credit option, along with any others whose range overlapped the fidelity score and that were selected in 608, may be displayed on the checkout user interface for the user. It is contemplated that the process 600 is only one of several possible methods for selecting payment and credit options based on the fidelity score. Note also that one or more of the operations performed in 602-612 may be performed in various orders and combinations, including in parallel.
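  • A minimal sketch of the 604-610 loop, with option labels and ranges partly mirroring the worked example above and otherwise hypothetical:
      # Illustrative selection of payment/credit options whose merchant-configured
      # fidelity-score range contains the generated score.
      def select_options(fidelity_score, options):
          # options: list of (label, (low, high)) pairs in the merchant's order
          return [label for label, (low, high) in options
                  if low <= fidelity_score <= high]

      catalog = [("Pre-pay", (0, 100)),
                 ("Pay in 14 days", (0, 20)),
                 ("Pay by credit card", (0, 35)),
                 ("Pay in monthly installments", (0, 55))]
      print(select_options(40, catalog))  # ['Pre-pay', 'Pay in monthly installments']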
  • FIG. 7 is a flowchart illustrating an example of a process 700 for generating a fidelity score in accordance with various embodiments. Some or all of the process 700 (or any other processes described, or variations and/or combinations of those processes) may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors. The executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media). For example, some or all of process 700 may be performed by any suitable system, such as a server in a data center or by the computing device 800 described in conjunction with FIG. 8. The process 700 includes a series of operations wherein sets of behavioral data, verification data, and internal data are transformed into a set of variables and corresponding variable values, which are then input into a data model to generate the fidelity score.
  • In 702, behavioral data is transformed into one or more variables and corresponding variable values. As noted, behavioral data may include data about the interactions between the user and the user interface being used to conduct a transaction. Examples may include variables such as time_between_clickdown_submit_and_clickrelease, user_misspelled_last_name, and time_spent_reading_terms_and_conditions.
  • In 704, verification data may be transformed into one or more variables and corresponding variable values. As noted, verification data may be data verifying one or more elements of personally identifiable information provided by the user. Examples of variables generated from verification data include variables such as email_associated_with_user_name, user_name_associated_with_phone_number, user_name_associated_with_address, and so on. Values for the verification data variables may be obtained from internal data (e.g., data regarding previous transactions conducted by the user, etc.) or may be obtained with reference to an external data source, such as a credit bureau service or a telephone directory service.
  • In 706, internal data is transformed into one or more variables and corresponding variable values. As noted, internal data may include details regarding previous transactions conducted by the user. Examples of variables generated from internal data may include variables such as number_of_purchases_in_last_3_months, amount_of_last_purchase, payment_method_of_last_purchase, and so on.
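  • For illustration, the transformations of 702-706 might collectively produce a single dictionary of variables; the keys follow the example variable names above, while the input field names and helpers are hypothetical:
      # Illustrative transformation of behavioral, verification, and internal data
      # into one dictionary of variables and values for the data model.
      def to_variables(behavioral, verification, internal):
          variables = {}
          variables["time_spent_reading_terms_and_conditions"] = behavioral.get("terms_seconds", 0)
          variables["user_misspelled_last_name"] = int(behavioral.get("last_name_corrections", 0) > 0)
          variables["user_name_associated_with_address"] = int(verification.get("name_address_match", False))
          variables["user_name_associated_with_phone_number"] = int(verification.get("name_phone_match", False))
          variables["number_of_purchases_in_last_3_months"] = internal.get("purchase_count_90d", 0)
          variables["amount_of_last_purchase"] = internal.get("last_amount", 0.0)
          return variables

      print(to_variables({"terms_seconds": 900}, {"name_address_match": True},
                         {"purchase_count_90d": 3, "last_amount": 12.99}))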
  • In 708, the system performing the process 700 may determine whether additional data is available. For example, if a previous attempt to generate a fidelity score was made, and, per a determination such as the determination made via the operations of 508 of FIG. 5, the available behavioral data, verification data, and internal data was insufficient to generate a reliable fidelity score, the system of the present disclosure may have made a call to an external data source, such as a credit bureau, to obtain additional information. Thus, if such additional information is available, the system performing the process 700 may proceed to 710, whereupon the additional data is transformed into one or more variables having corresponding values. An example of such a variable may be credit_score.
  • In 712, the various variables and their corresponding values generated in steps 702-710 may be passed through a data model. It is contemplated that the data model may be any of various types of data models, such as a random forest generated from a large data set of similar data, a neural network, a vector machine, some other supervised learning algorithm, or some other regression model generated from a sample set representative of typical user data for conducting transactions in the manner described in the present disclosure. The output of the model may be indicative of a risk of default on payment by the present user. Upon generating the output, in 714, the output may be provided to the merchant, some other entity, or to an internal process for determining which payment and credit options to present to the user in an interface, such as a checkout-type interface.
  • As noted, a statistical certainty value may accompany a fidelity score, indicating whether enough information was provided to the model to generate a reliable fidelity score. That is, if some information was unavailable to be provided to the model, the statistical certainty value may indicate, if the missing information were available and favored a positive or negative risk assessment, whether the fidelity score would change such that the payment and credit options provided to the user would also change. The statistical certainty value may be output in various formats. For example, the statistical certainty value may be in the form of a variance/standard error (e.g., standard deviation) or another value indicative of how the fidelity score could change based on a worst-case/best-case scenario. In one example, the model determines a fidelity score for a particular user that indicates that the user has a 5% chance of default on payment. However, this fidelity score may have been generated by the model without certain information (i.e., some behavioral data, verification data, and/or internal data may have been unavailable). A statistical certainty value may indicate that, if certain unavailable information favored higher risk (e.g., worst-case information), the user's risk of default could rise to 30%. In this case, the system of the present disclosure may make a determination whether, based on the potential risk of default of 30%, the cost of obtaining additional data from an external source is worth mitigating the potential risk. It may be that the system determines that 30% is an acceptable risk when weighed against the costs in time and money associated with obtaining the additional data. Note too that one or more of the operations performed in 702-714 may be performed in various orders and combinations, including in parallel.
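  • One way to picture such a worst-case/best-case bound, as a minimal sketch with a hypothetical scoring function and hypothetical variable bounds, is to re-score with each missing variable set to its most and least favorable value:
      # Illustrative best-/worst-case bound on the fidelity score when inputs are missing.
      def certainty_bounds(known, missing_bounds, score_fn):
          # missing_bounds: {variable: (best_value, worst_value)} for absent inputs
          best = dict(known, **{v: b for v, (b, _) in missing_bounds.items()})
          worst = dict(known, **{v: w for v, (_, w) in missing_bounds.items()})
          return score_fn(best), score_fn(worst)

      # Example mirroring the text: nominal 5% risk that could rise to 30% if the
      # unknown phone verification and payment history turned out unfavorably.
      score_fn = lambda d: 5 + 15 * (1 - d["verified_phone_number"]) + 10 * d["missed_payment"]
      low, high = certainty_bounds({}, {"verified_phone_number": (1, 0),
                                        "missed_payment": (0, 1)}, score_fn)
      print(low, high)  # 5 30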
  • In some embodiments, the system of the present disclosure is configured as a service that provides the fidelity score, and, in some cases, the statistical certainty value, based on internal data, verification data, and client data provided by a subscriber to the service. For example, the service may be offered to subscribers as a faster and cheaper alternative for obtaining credit risk data than traditional credit bureau services. In some embodiments, the system of the present disclosure is configured to update a checkout page with items other than payment and credit options, or may be configured to update one or more webpages with information different from payment and credit options but based on collected client data. For example, if the client data (i.e., personally identifiable information and behavioral data) suggests that the user is within a certain demographic, such as an age group, the system of the present disclosure may present product offers believed to be of interest to such an age group, such as clothing of certain styles and so on. As another example, if the client data suggests that the user is within a certain age group or participates in certain high risk activities (such as conducting transactions while in a moving vehicle), this information may be usable by the system in determining insurance premiums, loan interest rates, whether to offer an option for a reverse mortgage, and so on.
  • In the context of describing disclosed embodiments, unless otherwise specified, use of expressions regarding executable instructions (also referred to as code, applications, agents, etc.) performing operations that “instructions” do not ordinarily perform unaided (e.g., transmission of data, calculations, etc.) denotes that the instructions are being executed by a machine, thereby causing the machine to perform the specified operations.
  • Embodiments of the disclosure can be described in view of the following clauses:
      • 1. A computer-implemented method, comprising:
  • under the control of one or more computer systems configured with first executable instructions,
      • providing second executable instructions to a first computing device associated with a user that, as a result of being executed by the first computing device, causes the first computing device to collect, as a set of client data, personally identifiable information and a set of measurements associated with interactions by the user with a user interface of the first computing device;
      • obtaining a set of internal data associated with the user, the set of internal data associated with one or more previous transactions involving the user;
      • obtaining a set of verification data associated with the user, the set of verification data including identification verifying that the personally identifiable information is accurate;
      • transforming the set of client data, the set of internal data, and the set of verification data into one or more variables;
      • generating a fidelity score based at least in part on the one or more variables, the fidelity score indicating a risk of default on the transaction by the user;
      • based at least in part on a comparison of the fidelity score against a threshold, determining not to request, from a second computing device, additional information associated with the user; and
      • updating the user interface based at least in part on the fidelity score.
      • 2. The computer-implemented method of clause 1, wherein:
      • the user is a first user;
      • the set of client data is a first set of client data;
      • the personally identifiable information is first personally identifiable information;
      • the user interface is a first user interface;
      • the set of internal data is a first set of internal data;
      • the set of verification data is a first set of verification data;
      • the one or more variables are one or more first variables;
      • the fidelity score is a first fidelity score; and
      • the method further comprises:
        • providing the second executable instructions to a third computing device associated with a second user that, as a result of being executed by the third computing device, causes the third computing device to collect a second set of client data associated with the second user;
        • obtaining a second set of internal data associated with the second user, the second set of internal data associated with one or more previous transactions involving the second user;
        • obtaining a second set of verification data associated with the user, the second set of verification data including identification verifying that second personally identifiable information in the second set of client data is accurate;
        • transforming the second set of client data, the second set of internal data, and the second set of verification data into one or more second variables;
        • generating a second fidelity score based at least in part on the one or more second variables, the second fidelity score indicating a risk of default on the transaction by the second user;
        • based at least in part on a comparison of the second fidelity score against the threshold, determining to submit a request, to the second computing device, for additional information associated with the second user;
        • receiving the additional information from the second computing device in response to the request;
        • updating the second fidelity score based at least in part on the additional information to yield an updated second fidelity score; and
      • updating a second user interface based at least in part on the updated second fidelity score.
      • 3. The computer-implemented method of clause 1 or 2, further comprising generating a statistical certainty value indicating a variance of the fidelity score, and updating the user interface is further based at least in part on the statistical certainty value.
      • 4. The computer-implemented method of any of clauses 1 to 3, wherein the user interface is updated based at least in part on the fidelity score to provide one or more payment options for the transaction.
      • 5. A system, comprising:
      • one or more processors;
      • memory including instructions that, as a result of being executed by the one or more processors, cause the system to:
        • obtain a set of client data, including personally identifiable information of a user and one or more measurements associated with interactions by the user with an interface;
        • obtain a set of internal data associated with the user;
        • obtain a set of verification data associated with the user;
        • generate a score based at least in part on the set of client data, the set of internal data, and the set of verification data; and
        • determine, based at least in part on the score, whether to:
          • request, from a third party entity, additional information associated with the user; and
          • update the score based at least in part on the additional information; and
      • update the interface based at least in part on the score.
      • 6. The system of clause 5, wherein the set of client data includes one or more of:
      • a type and version of web browser being used by the user,
      • a type of operating system installed on a computing device being used by the user, or
      • a device identifier.
      • 7. The system of clause 5 or 6, wherein the one or more measurements are obtained from one or more of:
      • a user input device,
      • a gyroscope,
      • an image sensor,
      • an accelerometer,
      • a global positioning receiver, or
      • a microphone.
      • 8. The system of any of clauses 5 to 7, wherein the set of internal data includes one or more of:
      • whether the user has made a payment on a previous transaction,
      • a payment type of the previous transaction,
      • an amount owed from the previous transaction, or
      • an amount paid on the previous transaction.
      • 9. The system of any of clauses 5 to 8, wherein the set of verification data includes one or more of:
      • confirmation that a name of the user is associated with a geographical location provided by the user,
      • confirmation that the name of the user is associated with an email address provided by the user, or
      • confirmation that the name of the user is associated with a date of birth provided by the user.
      • 10. The system of any of clauses 5 to 9, wherein the set of client data includes a set of measurements, individual measurements of the set of measurements including:
      • an action performed by the user to an object in the interface;
      • an identity of the object; and
      • a time value indicating a time at which the action was performed on the object.
      • 11. The system of any of clauses 5 to 10, wherein the additional information includes one or more of:
      • a credit score of the user,
      • credit history of the user, or
      • public records associated with the user.
      • 12. The system of any of clauses 5 to 11, wherein the set of verification data is obtained from an entity external to the system.
      • 13. A non-transitory computer-readable storage medium having stored thereon executable instructions that, as a result of being executed by one or more processors of a computer system, cause the computer system to at least:
      • obtain a first set of data associated with interactions by a user conducting a current transaction using an interface of a computing device;
      • obtain a second set of data associated with a previous transaction conducted by the user;
      • obtain a third set of data associated with verification of an identity of the user;
      • generate a first score based at least in part on the first set of data, the second set of data, and the third set of data; and
      • if the first score does not cross a threshold, update the interface based at least in part on the first score; and
      • if the first score crosses the threshold:
        • obtain, from an external entity, a fourth set of data associated with the user;
        • generate a second score based at least in part on the fourth set of data; and
      • update the interface based at least in part on the second score.
      • 14. The non-transitory computer-readable storage medium of clause 13, wherein the executable instructions that cause the computer system to update the interface based at least in part on the first score or the second score further include executable instructions that cause the computer system to:
      • determine a subset of payment options, based at least in part on the first score or second score, from a set of payment options; and
      • update the interface to present the subset of payment options to the user.
      • 15. The non-transitory computer-readable storage medium of clause 13 or 14, wherein the executable instructions that cause the computer system to update the interface based at least in part on the first score or the second score further include executable instructions that cause the computer system to:
      • determine one or more product or service offerings based at least in part on the first score or second score; and
      • update the interface to present the one or more product or service offerings to the user.
      • 16. The non-transitory computer-readable storage medium of any of clauses 13 to 15, wherein the first set of data includes one or more of:
      • a geographic address of the user,
      • a telephone number of the user, or
      • an email address of the user.
      • 17. The non-transitory computer-readable storage medium of any of clauses 13 to 16, wherein:
      • the first set of data is obtained by the computer system by execution of code embedded in a web page of a merchant;
      • the computer system is a computer system of a payment service provider different from the merchant; and
      • the code, as a result of being executed, causes the first set of data to be transmitted to the computer system.
      • 18. The non-transitory computer-readable storage medium of any of clauses 13 to 17, wherein:
      • the interface is a web page hosted by a first entity;
      • the computer system is a computer system of a second entity different from the user and the first entity; and
      • the executable instructions that cause the computer system to update the interface include executable instructions that cause the computer system to provide at least a portion of the web page of the first entity for display on the computing device of the user.
      • 19. The non-transitory computer-readable storage medium of any of clauses 13 to 18, wherein the instructions further include instructions that cause the computer system to generate a variance for the first score.
      • 20. The non-transitory computer-readable storage medium of clause 19, wherein the executable instructions that cause the computer system to obtain the fourth set of data, generate the second score, and update the interface based at least in part on the second score are further executed if the first score combined with the variance crosses the threshold.
  • FIG. 8 is an illustrative, simplified block diagram of an example computing device 800 that may be used to practice at least one embodiment of the present disclosure. In various embodiments, the computing device 800 may be used to implement any of the systems illustrated herein and described above. For example, the computing device 800 may be configured for use as a data server, a web server, a portable computing device, a personal computer, or any electronic computing device. As shown in FIG. 8, the computing device 800 may include one or more processors 802 that may be configured to communicate with, and are operatively coupled to, a number of peripheral subsystems via a bus subsystem 804. The processors 802 may be utilized for the traversal of decision trees in a random forest of supervised models in embodiments of the present disclosure (e.g., cause the evaluation of inverse document frequencies of various search terms, etc.). These peripheral subsystems may include a storage subsystem 806, comprising a memory subsystem 808 and a file storage subsystem 810, one or more user interface input devices 812, one or more user interface output devices 814, and a network interface subsystem 816. Such a storage subsystem 806 may be used for temporary or long-term storage of information, such as details associated with transactions described in the present disclosure, databases of historical records described in the present disclosure, and storage of decision rules of the supervised models in the present disclosure.
  • The bus subsystem 804 may provide a mechanism for enabling the various components and subsystems of computing device 800 to communicate with each other as intended. Although the bus subsystem 804 is shown schematically as a single bus, alternative embodiments of the bus subsystem utilize multiple busses. The network interface subsystem 816 may provide an interface to other computing devices and networks. The network interface subsystem 816 may serve as an interface for receiving data from, and transmitting data to, other systems from the computing device 800. For example, the network interface subsystem 816 may enable a data technician to connect the device to a wireless network such that the data technician may be able to transmit and receive data while in a remote location, such as a user data center. The bus subsystem 804 may be utilized for communicating data, such as details, search terms, and so on to the supervised model of the present disclosure, and may be utilized for communicating the output of the supervised model to the one or more processors 802 and to merchants and/or creditors via the network interface subsystem 816.
  • The user interface input devices 812 may include one or more user input devices, such as a keyboard, pointing devices such as an integrated mouse, trackball, touchpad, or graphics tablet, a scanner, a barcode scanner, a touch screen incorporated into the display, audio input devices such as voice recognition systems, microphones, and other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and mechanisms for inputting information to the computing device 800. The one or more user interface output devices 814 may include a display subsystem, a printer, or non-visual displays such as audio output devices, etc. The display subsystem may be a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), light emitting diode (LED) display, or a projection or other display device. In general, use of the term “output device” is intended to include all possible types of devices and mechanisms for outputting information from the computing device 800. The one or more output devices 814 may be used, for example, to present user interfaces to facilitate user interaction with applications performing processes described herein and variations therein, where such interaction may be appropriate.
  • The storage subsystem 806 may provide a computer-readable storage medium for storing the basic programming and data constructs that may provide the functionality of at least one embodiment of the present disclosure. The applications (programs, code modules, instructions) that, as a result of being executed by one or more processors, provide the functionality of one or more embodiments of the present disclosure may be stored in the storage subsystem 806. These application modules or instructions may be executed by the one or more processors 802. The storage subsystem 806 may additionally provide a repository for storing data used in accordance with the present disclosure. The storage subsystem 806 may comprise a memory subsystem 808 and a file/disk storage subsystem 810.
  • The memory subsystem 808 may include a number of memories, including a main random access memory (RAM) 818 for storage of instructions and data during program execution and a read-only memory (ROM) 820 in which fixed instructions may be stored. The file storage subsystem 810 may provide non-transitory persistent (non-volatile) storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a Compact Disk Read Only Memory (CD-ROM) drive, an optical drive, removable media cartridges, and other like storage media.
  • The computing device 800 may include at least one local clock 824. The local clock 824 may be a counter that represents the number of ticks that have transpired from a particular starting date and may be located integrally within the computing device 800. The local clock 824 may be used to synchronize data transfers in the processors for the computing device 800 and all of the subsystems included therein at specific clock pulses and may be used to coordinate synchronous operations between the computing device 800 and other systems in a data center. In one embodiment, the local clock 824 is an atomic clock. In another embodiment, the local clock is a programmable interval timer.
  • The computing device 800 may be of various types, including a portable computer device, tablet computer, a workstation, or any other device described below. Additionally, the computing device 800 may include another device that may be connected to the computing device 800 through one or more ports (e.g., USB, a headphone jack, Lightning connector, etc.). The device that may be connected to the computing device 800 may include a plurality of ports configured to accept fiber-optic connectors. Accordingly, this device may be configured to convert optical signals to electrical signals that may be transmitted through the port connecting the device to the computing device 800 for processing. Due to the ever-changing nature of computers and networks, the description of the computing device 800 depicted in FIG. 8 is intended only as a specific example for purposes of illustrating the preferred embodiment of the device. Many other configurations having more or fewer components than the system depicted in FIG. 8 are possible.
  • The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. However, it will be evident that various modifications and changes may be made thereunto without departing from the broader spirit and scope of the invention as set forth in the claims.
  • Other variations are within the spirit of the present disclosure. Thus, while the disclosed techniques are susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific form or forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions and equivalents falling within the spirit and scope of the invention, as defined in the appended claims.
  • The use of the terms “a” and “an” and “the” and similar referents in the context of describing the disclosed embodiments (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. The term “connected,” where unmodified and referring to physical connections, is to be construed as partly or wholly contained within, attached to or joined together, even if there is something intervening. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. The use of the term “set” (e.g., “a set of items”) or “subset,” unless otherwise noted or contradicted by context, is to be construed as a nonempty collection comprising one or more members. Further, unless otherwise noted or contradicted by context, the term “subset” of a corresponding set does not necessarily denote a proper subset of the corresponding set, but the subset and the corresponding set may be equal.
  • Conjunctive language, such as phrases of the form “at least one of A, B, and C,” or “at least one of A, B and C,” unless specifically stated otherwise or otherwise clearly contradicted by context, is otherwise understood with the context as used in general to present that an item, term, etc., may be either A or B or C, or any nonempty subset of the set of A and B and C. For instance, in the illustrative example of a set having three members, the conjunctive phrases “at least one of A, B, and C” and “at least one of A, B and C” refer to any of the following sets: {A}, {B}, {C}, {A, B}, {A, C}, {B, C}, {A, B, C}. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of A, at least one of B and at least one of C each to be present.
  • Operations of processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. Processes described herein (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs or one or more applications) executing collectively on one or more processors, by hardware or combinations thereof. The code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors. The computer-readable storage medium may be non-transitory.
  • The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
  • Embodiments of this disclosure are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate and the inventors intend for embodiments of the present disclosure to be practiced otherwise than as specifically described herein. Accordingly, the scope of the present disclosure includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the scope of the present disclosure unless otherwise indicated herein or otherwise clearly contradicted by context.
  • All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

Claims (20)

1. A computer-implemented method, comprising:
under the control of one or more computer systems that execute instructions,
providing second executable instructions to a first computing device associated with a user that, as a result of being executed by the first computing device, causes the first computing device to:
collect a set of client data that includes:
personally identifiable information about the user;
an identifier associated with the first computing device; and
a set of measurements associated with interactions between the user and a user interface of the first computing device, individual measurements of the set of measurements including:
 an action performed by the user to an object in the user interface;
 an identity of the object; and
 a time value indicating a time at which the action was performed on the object; and
provide the set of client data to the one or more computer systems;
obtaining a set of internal data associated with one or more previous transactions involving the user;
obtaining a set of verification data verifying that the personally identifiable information is accurate;
transforming the set of client data, the set of internal data, and the set of verification data into one or more variables;
generating a fidelity score based at least in part on the one or more variables; and
updating the user interface based at least in part on the fidelity score.
2. The computer-implemented method of claim 1, the method further comprising:
providing the second executable instructions to a third computing device associated with a second user that, as a result of being executed by the third computing device, causes the third computing device to collect a second set of client data associated with the second user;
obtaining a second set of internal data associated with the second user, the second set of internal data associated with one or more previous transactions involving the second user;
obtaining a second set of verification data associated with the user, the second set of verification data including identification verifying that second personally identifiable information in the second set of client data is accurate;
transforming the second set of client data, the second set of internal data, and the second set of verification data into one or more second variables;
generating a second fidelity score based at least in part on the one or more second variables, the second fidelity score indicating a risk of default on the transaction by the second user;
based at least in part on a comparison of the second fidelity score against a threshold, determining to submit a request, to the second computing device, for additional information associated with the second user;
receiving the additional information from the second computing device in response to the request;
updating the second fidelity score based at least in part on the additional information to yield an updated second fidelity score; and
updating a second user interface based at least in part on the updated second fidelity score.
3. The computer-implemented method of claim 1, further comprising generating a statistical certainty value indicating a variance of the fidelity score, wherein updating the user interface is further based at least in part on the statistical certainty value.
4. The computer-implemented method of claim 1, wherein the user interface is updated based at least in part on the fidelity score to provide one or more payment or credit options for the transaction.
5. A system, comprising:
one or more processors; and
memory including instructions that, as a result of being executed by the one or more processors, cause the system to:
obtain a set of client data, including:
personally identifiable information of a user;
an identifier associated with a computing device associated with the user; and
one or more measurements associated with interactions by the user with an interface, each of the one or more measurements including:
an action performed by the user to an object in the interface;
an identity of the object; and
a time value indicating a time at which the action was performed on the object;
obtain a set of internal data associated with the user;
obtain a set of verification data associated with the user;
generate a score based at least in part on the set of client data, the set of internal data, and the set of verification data;
determine, based at least in part on the score, whether to:
request, from a third party entity, additional information associated with the user; and
update the score based at least in part on the additional information; and
update the interface based at least in part on the score.
6. The system of claim 5, wherein the set of client data further includes one or more of:
a type and version of web browser being used by the user, or
a type of operating system installed on a computing device being used by the user.
7. The system of claim 5, wherein the one or more measurements are obtained from one or more of:
a user input device,
a gyroscope,
an image sensor,
an accelerometer,
a global positioning receiver, or
a microphone.
8. The system of claim 5, wherein the set of internal data includes one or more of:
whether the user has made a payment on a previous transaction,
a payment type of the previous transaction,
an amount owed from the previous transaction, or
an amount paid on the previous transaction.
9. The system of claim 10, wherein the set of verification data includes one or more of:
confirmation that a name of the user is associated with a geographical location provided by the user,
confirmation that the name of the user is associated with an email address provided by the user, or
confirmation that the name of the user is associated with a date of birth provided by the user.
10. The system of claim 5, wherein the set of verification data provides verification that at least a subset of the personally identifiable information of the user is accurate.
11. The system of claim 5, wherein the additional information includes one or more of:
a credit score of the user,
credit history of the user, or
public records associated with the user.
12. The system of claim 5, wherein the set of verification data is obtained from an entity external to the system.
13. A non-transitory computer-readable storage medium having stored thereon executable instructions that, as a result of being executed by one or more processors of a computer system, cause the computer system to at least:
obtain a first set of data associated with interactions by a user conducting a current transaction using an interface of a computing device associated with the user, the first set of data indicating:
an identifier associated with the computing device;
an action performed by the user to an object in the interface;
an identity of the object; and
a time value indicating a time at which the action was performed on the object;
obtain a second set of data associated with a previous transaction conducted by the user;
obtain a third set of data associated with verification of an identity of the user;
generate a first score based at least in part on the first set of data, the second set of data, and the third set of data;
on a condition that the first score does not reach a value relative to a threshold, update the interface based at least in part on the first score; and
on a condition that the first score reaches a value relative to the threshold:
obtain, from an external entity, a fourth set of data associated with the user;
generate a second score based at least in part on the fourth set of data; and
update the interface based at least in part on the second score.
14. The non-transitory computer-readable storage medium of claim 13, wherein the executable instructions that cause the computer system to update the interface based at least in part on the first score or the second score further include executable instructions that cause the computer system to:
determine a subset of options for payment or credit, based at least in part on the first score or second score, from a set of options for payment or credit; and
update the interface to present the subset of options to the user.
15. The non-transitory computer-readable storage medium of claim 13, wherein the executable instructions that cause the computer system to update the interface based at least in part on the first score or the second score further include executable instructions that cause the computer system to:
determine one or more product or service offerings based at least in part on the first score or second score; and
update the interface to present one or more product or service offerings to the user.
16. The non-transitory computer-readable storage medium of claim 13, wherein the first set of data includes one or more of:
a geographic address of the user,
a telephone number of the user, or
an email address of the user.
17. The non-transitory computer-readable storage medium of claim 13, wherein:
the first set of data is obtained by the computer system by execution of code embedded in a web page of a merchant;
the computer system is a computer system of a payment service provider different from the merchant; and
the code, as a result of being executed, causes the first set of data to be transmitted to the computer system.
18. The non-transitory computer-readable storage medium of claim 13, wherein:
the interface is a web page hosted by a first entity;
the computer system is a computer system of a second entity different from the user and the first entity; and
the executable instructions that cause the computer system to update the interface include executable instructions that cause the computer system to provide at least a portion of the web page of the first entity for display on the computing device of the user.
19. The non-transitory computer-readable storage medium of claim 13, wherein the instructions further include instructions that cause the computer system to generate a variance for the first score.
20. The non-transitory computer-readable storage medium of claim 19, wherein the executable instructions that cause the computer system to obtain the fourth set of data, generate the second score, and update the interface based at least in part on the second score are further executed if the first score combined with the variance crosses the threshold.
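The threshold-gated escalation recited in claims 13, 19, and 20 can be summarized in a short sketch: a first score that, combined with its variance, crosses a threshold triggers a lookup of additional data from an external entity and a rescore, while a score below the threshold updates the interface directly. The helper names, scoring rules, and example values below are hypothetical illustrations rather than the claimed implementation.

```python
def update_interface(score):
    """Placeholder for choosing payment or credit options to present, based on a score."""
    return "full_payment_options" if score >= 0.7 else "restricted_payment_options"

def evaluate(first_score, variance, threshold, fetch_external_data, rescore):
    """Escalate to an external data source only when the first score, combined
    with its variance, crosses the threshold; otherwise use the first score."""
    if first_score + variance >= threshold:
        additional_data = fetch_external_data()
        second_score = rescore(additional_data)
        return update_interface(second_score)
    return update_interface(first_score)

# Hypothetical usage: a borderline first score triggers the external lookup.
result = evaluate(
    first_score=0.62,
    variance=0.10,
    threshold=0.70,
    fetch_external_data=lambda: {"credit_score": 710},
    rescore=lambda data: 0.75 if data["credit_score"] > 700 else 0.55,
)
print(result)  # "full_payment_options" in this illustrative run
```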
US14/918,169 2015-07-01 2015-10-20 Workflow processing and user interface generation based on activity data Pending US20170004573A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/918,169 US20170004573A1 (en) 2015-07-01 2015-10-20 Workflow processing and user interface generation based on activity data
US15/167,890 US10387882B2 (en) 2015-07-01 2016-05-27 Method for using supervised model with physical store

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562187620P 2015-07-01 2015-07-01
US14/918,169 US20170004573A1 (en) 2015-07-01 2015-10-20 Workflow processing and user interface generation based on activity data

Publications (1)

Publication Number Publication Date
US20170004573A1 true US20170004573A1 (en) 2017-01-05

Family

ID=56027851

Family Applications (8)

Application Number Title Priority Date Filing Date
US14/820,468 Active US9904916B2 (en) 2015-07-01 2015-08-06 Incremental login and authentication to user portal without username/password
US14/830,686 Active US9355155B1 (en) 2015-07-01 2015-08-19 Method for using supervised model to identify user
US14/830,690 Active US10417621B2 (en) 2015-07-01 2015-08-19 Method for using supervised model to configure user interface presentation
US14/918,169 Pending US20170004573A1 (en) 2015-07-01 2015-10-20 Workflow processing and user interface generation based on activity data
US15/167,916 Active US9886686B2 (en) 2015-07-01 2016-05-27 Method for using supervised model to identify user
US15/875,977 Abandoned US20180144315A1 (en) 2015-07-01 2018-01-19 Incremental login and authentication to user portal without username/password
US15/885,526 Active US10607199B2 (en) 2015-07-01 2018-01-31 Method for using supervised model to identify user
US16/803,864 Active 2036-01-27 US11461751B2 (en) 2015-07-01 2020-02-27 Method for using supervised model to identify user

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US14/820,468 Active US9904916B2 (en) 2015-07-01 2015-08-06 Incremental login and authentication to user portal without username/password
US14/830,686 Active US9355155B1 (en) 2015-07-01 2015-08-19 Method for using supervised model to identify user
US14/830,690 Active US10417621B2 (en) 2015-07-01 2015-08-19 Method for using supervised model to configure user interface presentation

Family Applications After (4)

Application Number Title Priority Date Filing Date
US15/167,916 Active US9886686B2 (en) 2015-07-01 2016-05-27 Method for using supervised model to identify user
US15/875,977 Abandoned US20180144315A1 (en) 2015-07-01 2018-01-19 Incremental login and authentication to user portal without username/password
US15/885,526 Active US10607199B2 (en) 2015-07-01 2018-01-31 Method for using supervised model to identify user
US16/803,864 Active 2036-01-27 US11461751B2 (en) 2015-07-01 2020-02-27 Method for using supervised model to identify user

Country Status (1)

Country Link
US (8) US9904916B2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170070539A1 (en) * 2015-09-04 2017-03-09 Swim.IT Inc. Method of and system for privacy awarness
US20180287852A1 (en) * 2017-04-03 2018-10-04 Bank Of America Corporation Data Transfer, Over Session or Connection, and Between Computing Device and One or More Servers to Determine Third Party Routing Network For User Device
US20190087692A1 (en) * 2017-09-21 2019-03-21 Royal Bank Of Canada Device and method for assessing quality of visualizations of multidimensional data
US10601718B2 (en) 2017-04-03 2020-03-24 Bank Of America Corporation Data transfer, over session or connection, and between computing device and server associated with a routing network for modifying one or more parameters of the routing network
US10601934B2 (en) 2017-04-03 2020-03-24 Bank Of America Corporation Data transfer, over session or connection, and between computing device and one or more servers for transmitting data to a third party computing device
US10608918B2 (en) 2017-04-03 2020-03-31 Bank Of America Corporation Data transfer, over session or connection, and between computing device and one or more servers to determine likelihood of user device using a routing network
US10609156B2 (en) * 2017-04-03 2020-03-31 Bank Of America Corporation Data transfer, over session or connection, and between computing device and server associated with one or more routing networks in response to detecting activity
US10716060B2 (en) 2017-04-03 2020-07-14 Bank Of America Corporation Data transfer between computing device and user device at different locations and over session or connection to display one or more routing networks to use
US10846383B2 (en) * 2019-07-01 2020-11-24 Advanced New Technologies Co., Ltd. Applet-based account security protection method and system
US10853359B1 (en) 2015-12-21 2020-12-01 Amazon Technologies, Inc. Data log stream processing using probabilistic data structures
US20210049624A1 (en) * 2019-08-16 2021-02-18 The Toronto-Dominion Bank System and Method for Identifying Prospective Entities to Interact With
US11005839B1 (en) * 2018-03-11 2021-05-11 Acceptto Corporation System and method to identify abnormalities to continuously measure transaction risk
US20210150010A1 (en) * 2015-10-14 2021-05-20 Pindrop Security, Inc. Fraud detection in interactive voice response systems
US11075941B2 (en) * 2018-07-11 2021-07-27 Advanced New Technologies Co., Ltd. Risk control method, risk control apparatus, electronic device, and storage medium
US20210240144A1 (en) * 2018-06-07 2021-08-05 Omron Corporation Control system, control method, learning device, control device, learning method, and non-transitory computer-readable recording medium
US11126340B2 (en) * 2019-02-20 2021-09-21 Mastercard International Incorporated Systems and methods for dynamically generating customized web-based payment interfaces
US11144904B2 (en) * 2019-07-08 2021-10-12 Synchrony Bank Post-purchase credit offer and tender switch
US11151535B1 (en) * 2016-06-13 2021-10-19 Square, Inc. Utilizing APIs to facilitate open ticket synchronization
US11252174B2 (en) * 2016-12-16 2022-02-15 Worldpay, Llc Systems and methods for detecting security risks in network pages
US11386490B1 (en) * 2022-01-12 2022-07-12 Chime Financial, Inc. Generating graphical user interfaces comprising dynamic credit value user interface elements determined from a credit value model
US20220360847A1 (en) * 2019-06-25 2022-11-10 The Regents Of The University Of California Systems and methods for characterizing joint attention during real world interaction
US20230033901A1 (en) * 2021-07-28 2023-02-02 Citicorp Credit Services, Inc. (Usa) Dynamic revision of webpages with customized options
US20230058158A1 (en) * 2021-08-19 2023-02-23 Allstate Insurance Company Automated iterative predictive modeling computing platform
US11671462B2 (en) 2020-07-23 2023-06-06 Capital One Services, Llc Systems and methods for determining risk ratings of roles on cloud computing platform
US11790470B1 (en) 2018-03-16 2023-10-17 Block, Inc. Storage service for sensitive customer data
US11966972B2 (en) * 2022-06-24 2024-04-23 Chime Financial, Inc. Generating graphical user interfaces comprising dynamic credit value user interface elements determined from a credit value model

Families Citing this family (156)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9824394B1 (en) 2015-02-06 2017-11-21 Square, Inc. Payment processor financing of customer purchases
US9779432B1 (en) 2015-03-31 2017-10-03 Square, Inc. Invoice financing and repayment
US9904916B2 (en) 2015-07-01 2018-02-27 Klarna Ab Incremental login and authentication to user portal without username/password
US10387882B2 (en) * 2015-07-01 2019-08-20 Klarna Ab Method for using supervised model with physical store
US10643226B2 (en) * 2015-07-31 2020-05-05 Microsoft Technology Licensing, Llc Techniques for expanding a target audience for messaging
US10650325B2 (en) 2015-07-31 2020-05-12 Microsoft Technology Licensing, Llc Deterministic message distribution
US10248783B2 (en) 2015-12-22 2019-04-02 Thomson Reuters (Grc) Llc Methods and systems for identity creation, verification and management
US10055332B2 (en) * 2016-02-12 2018-08-21 International Business Machines Corporation Variable detection in source code to reduce errors
US11720983B2 (en) 2016-03-02 2023-08-08 Up N' Go System to text a payment link
US20170256007A1 (en) * 2016-03-02 2017-09-07 Touradj Barman Text payment system
CA3013371A1 (en) 2016-03-22 2017-09-28 Visa International Service Association Adaptable authentication processing
US10346900B1 (en) * 2016-03-24 2019-07-09 Amazon Technologies, Inc. System for determining content for advance rendering
US11244367B2 (en) 2016-04-01 2022-02-08 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US20220164840A1 (en) 2016-04-01 2022-05-26 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
CN108780390B (en) * 2016-06-06 2022-09-27 金融与风险组织有限公司 System and method for providing identity scores
US10878127B2 (en) 2016-06-10 2020-12-29 OneTrust, LLC Data subject access request processing systems and related methods
US10606916B2 (en) 2016-06-10 2020-03-31 OneTrust, LLC Data processing user interface monitoring systems and related methods
US10997318B2 (en) 2016-06-10 2021-05-04 OneTrust, LLC Data processing systems for generating and populating a data inventory for processing data access requests
US11544667B2 (en) 2016-06-10 2023-01-03 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11416798B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing systems and methods for providing training in a vendor procurement process
US11520928B2 (en) 2016-06-10 2022-12-06 OneTrust, LLC Data processing systems for generating personal data receipts and related methods
US10909265B2 (en) 2016-06-10 2021-02-02 OneTrust, LLC Application privacy scanning systems and related methods
US11277448B2 (en) 2016-06-10 2022-03-15 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11727141B2 (en) 2016-06-10 2023-08-15 OneTrust, LLC Data processing systems and methods for synching privacy-related user consent across multiple computing devices
US11294939B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11586700B2 (en) 2016-06-10 2023-02-21 OneTrust, LLC Data processing systems and methods for automatically blocking the use of tracking tools
US11227247B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems and methods for bundled privacy policies
US11336697B2 (en) 2016-06-10 2022-05-17 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11354434B2 (en) 2016-06-10 2022-06-07 OneTrust, LLC Data processing systems for verification of consent and notice processing and related methods
US11366909B2 (en) 2016-06-10 2022-06-21 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11341447B2 (en) 2016-06-10 2022-05-24 OneTrust, LLC Privacy management systems and methods
US10909488B2 (en) 2016-06-10 2021-02-02 OneTrust, LLC Data processing systems for assessing readiness for responding to privacy-related incidents
US10510031B2 (en) 2016-06-10 2019-12-17 OneTrust, LLC Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11392720B2 (en) 2016-06-10 2022-07-19 OneTrust, LLC Data processing systems for verification of consent and notice processing and related methods
US10467432B2 (en) 2016-06-10 2019-11-05 OneTrust, LLC Data processing systems for use in automatically generating, populating, and submitting data subject access requests
US11328092B2 (en) 2016-06-10 2022-05-10 OneTrust, LLC Data processing systems for processing and managing data subject access in a distributed environment
US11222139B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems and methods for automatic discovery and assessment of mobile software development kits
US11222309B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11625502B2 (en) 2016-06-10 2023-04-11 OneTrust, LLC Data processing systems for identifying and modifying processes that are subject to data subject access requests
US11134086B2 (en) 2016-06-10 2021-09-28 OneTrust, LLC Consent conversion optimization systems and related methods
US11410106B2 (en) 2016-06-10 2022-08-09 OneTrust, LLC Privacy management systems and methods
US11366786B2 (en) 2016-06-10 2022-06-21 OneTrust, LLC Data processing systems for processing data subject access requests
US11475136B2 (en) 2016-06-10 2022-10-18 OneTrust, LLC Data processing systems for data transfer risk identification and related methods
US11418492B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing systems and methods for using a data model to select a target data asset in a data migration
US11651104B2 (en) 2016-06-10 2023-05-16 OneTrust, LLC Consent receipt management systems and related methods
US11651106B2 (en) 2016-06-10 2023-05-16 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10318761B2 (en) 2016-06-10 2019-06-11 OneTrust, LLC Data processing systems and methods for auditing data request compliance
US10282559B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11188862B2 (en) 2016-06-10 2021-11-30 OneTrust, LLC Privacy management systems and methods
US11222142B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems for validating authorization for personal data collection, storage, and processing
US11416589B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11416590B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11188615B2 (en) 2016-06-10 2021-11-30 OneTrust, LLC Data processing consent capture systems and related methods
US11343284B2 (en) 2016-06-10 2022-05-24 OneTrust, LLC Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US11438386B2 (en) 2016-06-10 2022-09-06 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10740487B2 (en) 2016-06-10 2020-08-11 OneTrust, LLC Data processing systems and methods for populating and maintaining a centralized database of personal data
US11210420B2 (en) 2016-06-10 2021-12-28 OneTrust, LLC Data subject access request processing systems and related methods
US11562097B2 (en) 2016-06-10 2023-01-24 OneTrust, LLC Data processing systems for central consent repository and related methods
US11675929B2 (en) 2016-06-10 2023-06-13 OneTrust, LLC Data processing consent sharing systems and related methods
US11461500B2 (en) 2016-06-10 2022-10-04 OneTrust, LLC Data processing systems for cookie compliance testing with website scanning and related methods
US10846433B2 (en) 2016-06-10 2020-11-24 OneTrust, LLC Data processing consent management systems and related methods
US11354435B2 (en) 2016-06-10 2022-06-07 OneTrust, LLC Data processing systems for data testing to confirm data deletion and related methods
US10592648B2 (en) 2016-06-10 2020-03-17 OneTrust, LLC Consent receipt management systems and related methods
US11403377B2 (en) 2016-06-10 2022-08-02 OneTrust, LLC Privacy management systems and methods
US10284604B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing and scanning systems for generating and populating a data inventory
US11238390B2 (en) 2016-06-10 2022-02-01 OneTrust, LLC Privacy management systems and methods
US10685140B2 (en) 2016-06-10 2020-06-16 OneTrust, LLC Consent receipt management systems and related methods
US11301796B2 (en) 2016-06-10 2022-04-12 OneTrust, LLC Data processing systems and methods for customizing privacy training
US11636171B2 (en) 2016-06-10 2023-04-25 OneTrust, LLC Data processing user interface monitoring systems and related methods
US10678945B2 (en) 2016-06-10 2020-06-09 OneTrust, LLC Consent receipt management systems and related methods
US11416109B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11481710B2 (en) 2016-06-10 2022-10-25 OneTrust, LLC Privacy management systems and methods
US11228620B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10949565B2 (en) * 2016-06-10 2021-03-16 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11295316B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems for identity validation for consumer rights requests and related methods
US10027671B2 (en) * 2016-06-16 2018-07-17 Ca, Inc. Restricting access to content based on a posterior probability that a terminal signature was received from a previously unseen computer terminal
WO2017223522A1 (en) * 2016-06-23 2017-12-28 Mohammad Shami Neural network systems and methods for generating distributed representations of electronic transaction information
US10032116B2 (en) * 2016-07-05 2018-07-24 Ca, Inc. Identifying computer devices based on machine effective speed calibration
US10671626B2 (en) * 2016-09-27 2020-06-02 Salesforce.Com, Inc. Identity consolidation in heterogeneous data environment
AU2017361132B2 (en) * 2016-11-21 2022-11-10 Isx Ip Ltd "identifying an entity"
US10645086B1 (en) * 2016-12-30 2020-05-05 Charles Schwab & Co., Inc. System and method for handling user requests for web services
CN108596410B (en) * 2017-03-09 2021-01-22 创新先进技术有限公司 Automatic wind control event processing method and device
US11049101B2 (en) * 2017-03-21 2021-06-29 Visa International Service Association Secure remote transaction framework
US10298401B1 (en) * 2017-03-22 2019-05-21 Amazon Technologies, Inc. Network content search system and method
US10615965B1 (en) 2017-03-22 2020-04-07 Amazon Technologies, Inc. Protected search index
CN108665120B (en) * 2017-03-27 2020-10-20 创新先进技术有限公司 Method and device for establishing scoring model and evaluating user credit
US10592706B2 (en) 2017-03-29 2020-03-17 Valyant AI, Inc. Artificially intelligent order processing system
US10013577B1 (en) 2017-06-16 2018-07-03 OneTrust, LLC Data processing systems for identifying whether cookies contain personally identifying information
US11631125B2 (en) * 2017-06-30 2023-04-18 Meta Platforms, Inc. Calculating bids for content items based on value of a product associated with the content item
US10728760B2 (en) * 2017-07-06 2020-07-28 Bank Of America Corporation Frictionless hardening of digital consent
US11366872B1 (en) * 2017-07-19 2022-06-21 Amazon Technologies, Inc. Digital navigation menus with dynamic content placement
US10673859B2 (en) * 2017-09-12 2020-06-02 International Business Machines Corporation Permission management
WO2019070644A2 (en) * 2017-10-02 2019-04-11 Arconic Inc. Systems and methods for utilizing multicriteria optimization in additive manufacture
WO2019070853A1 (en) * 2017-10-04 2019-04-11 The Dun & Bradstreet Corporation System and method for identity resolution across disparate distributed immutable ledger networks
GB201717251D0 (en) * 2017-10-20 2017-12-06 Palantir Technologies Inc Serving assets in a networked environment
US20190147543A1 (en) 2017-11-14 2019-05-16 International Business Machines Corporation Composite account structure
US10541881B2 (en) * 2017-12-14 2020-01-21 Disney Enterprises, Inc. Automated network supervision including detecting an anonymously administered node, identifying the administrator of the anonymously administered node, and registering the administrator and the anonymously administered node
US11100568B2 (en) * 2017-12-22 2021-08-24 Paypal, Inc. System and method for creating and analyzing a low-dimensional representation of webpage sequences
US11087237B2 (en) * 2018-01-30 2021-08-10 Walmart Apollo, Llc Machine learning techniques for transmitting push notifications
US10887305B1 (en) * 2018-03-30 2021-01-05 Mckesson Corporation Method and apparatus for generating and providing a temporary password to control access to a record created in response to an electronic message
JP6993498B2 (en) * 2018-04-16 2022-01-13 株式会社Nttドコモ Mobile terminal device
US11301939B2 (en) * 2018-05-02 2022-04-12 Gist Technology Inc. System for generating shareable user interfaces using purchase history data
US11544409B2 (en) 2018-09-07 2023-01-03 OneTrust, LLC Data processing systems and methods for automatically protecting sensitive data within privacy management systems
US10803202B2 (en) 2018-09-07 2020-10-13 OneTrust, LLC Data processing systems for orphaned data identification and deletion and related methods
US11301853B2 (en) * 2018-09-13 2022-04-12 Paypal, Inc. Speculative transaction operations for recognized devices
US10567375B1 (en) 2018-10-02 2020-02-18 Capital One Services, Llc Systems and methods for data access control and account management
US11276046B2 (en) * 2018-10-16 2022-03-15 Dell Products L.P. System for insights on factors influencing payment
US11328329B1 (en) * 2018-11-16 2022-05-10 Level 3 Communications, Llc Telecommunications infrastructure system and method
US10984388B2 (en) 2018-12-14 2021-04-20 International Business Machines Corporation Identifying complaints from messages
US20200210775A1 (en) * 2018-12-28 2020-07-02 Harman Connected Services, Incorporated Data stitching and harmonization for machine learning
US11321716B2 (en) * 2019-02-15 2022-05-03 Visa International Service Association Identity-based transaction processing
US11494850B1 (en) * 2019-03-13 2022-11-08 Alight Solutions Llc Applied artificial intelligence technology for detecting anomalies in payroll data
CN110503507B (en) * 2019-07-05 2023-09-22 中国平安财产保险股份有限公司 Insurance product data pushing method and system based on big data and computer equipment
US20210073848A1 (en) * 2019-09-05 2021-03-11 Jpmorgan Chase Bank, N.A. Method and system for offer targeting
US11127073B2 (en) * 2019-10-03 2021-09-21 Capital One Services, Llc Systems and methods for obtaining user parameters of e-commerce users to auto complete checkout forms
US11379092B2 (en) 2019-11-11 2022-07-05 Klarna Bank Ab Dynamic location and extraction of a user interface element state in a user interface that is dependent on an event occurrence in a different user interface
US11726752B2 (en) 2019-11-11 2023-08-15 Klarna Bank Ab Unsupervised location and extraction of option elements in a user interface
US11366645B2 (en) 2019-11-11 2022-06-21 Klarna Bank Ab Dynamic identification of user interface elements through unsupervised exploration
US10839033B1 (en) * 2019-11-26 2020-11-17 Vui, Inc. Referring expression generation
US20210192511A1 (en) * 2019-12-18 2021-06-24 The Toronto-Dominion Bank Systems and methods for configuring data transfers
US11593677B1 (en) * 2019-12-31 2023-02-28 American Express Travel Related Services Company, Inc. Computer-based systems configured to utilize predictive machine learning techniques to define software objects and methods of use thereof
US20230039728A1 (en) * 2019-12-31 2023-02-09 Starkey Laboratories, Inc. Hearing assistance device model prediction
CN111510422B (en) * 2020-01-09 2021-07-09 中国石油大学(华东) Identity authentication method based on terminal information extension sequence and random forest model
US11386356B2 (en) 2020-01-15 2022-07-12 Klarna Bank AB Method of training a learning system to classify interfaces
US10846106B1 (en) 2020-03-09 2020-11-24 Klarna Bank Ab Real-time interface classification in an application
US11880351B1 (en) * 2020-04-14 2024-01-23 Wells Fargo Bank, N.A. Systems and methods for storing and verifying data
US11568128B2 (en) * 2020-04-15 2023-01-31 Sap Se Automated determination of data values for form fields
US11797528B2 (en) 2020-07-08 2023-10-24 OneTrust, LLC Systems and methods for targeted data discovery
US20220020049A1 (en) * 2020-07-14 2022-01-20 Accenture Global Solutions Limited Artificial intelligence (ai) based payments processor
US11277265B2 (en) 2020-07-17 2022-03-15 The Government of the United States of America, as represented by the Secretary of Homeland Security Verified base image in photo gallery
WO2022026564A1 (en) 2020-07-28 2022-02-03 OneTrust, LLC Systems and methods for automatically blocking the use of tracking tools
US20220036414A1 (en) * 2020-07-30 2022-02-03 International Business Machines Corporation Product description-based line item matching
US20230289376A1 (en) 2020-08-06 2023-09-14 OneTrust, LLC Data processing systems and methods for automatically redacting unstructured data from a data subject access request
WO2022060860A1 (en) 2020-09-15 2022-03-24 OneTrust, LLC Data processing systems and methods for detecting tools for the automatic blocking of consent requests
WO2022061270A1 (en) 2020-09-21 2022-03-24 OneTrust, LLC Data processing systems and methods for automatically detecting target data transfers and target data processing
GB2598952A (en) * 2020-09-22 2022-03-23 Mastercard International Inc System and method for prevention of unintended checkout
CA3196696A1 (en) * 2020-10-28 2022-05-05 Piggy Llc Improved secure transaction process utilizing integration layer
WO2022099023A1 (en) 2020-11-06 2022-05-12 OneTrust, LLC Systems and methods for identifying data processing activities based on data discovery results
CN112487411A (en) * 2020-12-15 2021-03-12 中国电子科技集团公司第三十研究所 Password guessing method and system based on random forest
US20220198464A1 (en) * 2020-12-18 2022-06-23 Orolia Usa Inc. Methods for automated predictive modeling to assess customer confidence and devices thereof
WO2022159901A1 (en) 2021-01-25 2022-07-28 OneTrust, LLC Systems and methods for discovery, classification, and indexing of data in a native computing system
WO2022170047A1 (en) 2021-02-04 2022-08-11 OneTrust, LLC Managing custom attributes for domain objects defined within microservices
CN112860959B (en) * 2021-02-05 2021-11-05 哈尔滨工程大学 Entity analysis method based on random forest improvement
WO2022170254A1 (en) 2021-02-08 2022-08-11 OneTrust, LLC Data processing systems and methods for anonymizing data samples in classification analysis
US11687939B2 (en) * 2021-02-09 2023-06-27 Capital One Services, Llc Fraud prevention systems and methods for selectively generating virtual account numbers
US20240098109A1 (en) 2021-02-10 2024-03-21 OneTrust, LLC Systems and methods for mitigating risks of third-party computing system functionality integration into a first-party computing system
US20220253864A1 (en) * 2021-02-10 2022-08-11 Klarna Bank Ab Triggering computer system processes through messaging systems
WO2022178089A1 (en) 2021-02-17 2022-08-25 OneTrust, LLC Managing custom workflows for domain objects defined within microservices
WO2022178219A1 (en) 2021-02-18 2022-08-25 OneTrust, LLC Selective redaction of media content
EP4305539A1 (en) 2021-03-08 2024-01-17 OneTrust, LLC Data transfer discovery and analysis systems and related methods
US11562078B2 (en) 2021-04-16 2023-01-24 OneTrust, LLC Assessing and managing computational risk involved with integrating third party computing functionality within a computing system
US11750686B2 (en) 2021-08-03 2023-09-05 The Toronto-Dominion Bank System and method for enabling one or more transfer features associated with a real-time transfer protocol
US11915313B2 (en) * 2021-08-16 2024-02-27 Capital One Services, Llc Using email history to estimate creditworthiness for applicants having insufficient credit history
US20230066992A1 (en) * 2021-08-31 2023-03-02 The Headcount Inc. Systems and methods for identifying and verifying assets and employment information at a constructions site
US20230281615A1 (en) * 2022-03-04 2023-09-07 Kashish Soien Systems and methods for user identification
US11620142B1 (en) 2022-06-03 2023-04-04 OneTrust, LLC Generating and customizing user interfaces for demonstrating functions of interactive user environments

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010023414A1 (en) * 1998-12-08 2001-09-20 Srihari Kumar Interactive calculation and presentation of financial data results through a single interface on a data-packet-network
US20010032182A1 (en) * 1998-12-08 2001-10-18 Srihari Kumar Interactive bill payment center
US20100042487A1 (en) * 2008-08-12 2010-02-18 Yosef Barazani Apparatus and Method of Monetizing Hyperlinks
US20110131122A1 (en) * 2009-12-01 2011-06-02 Bank Of America Corporation Behavioral baseline scoring and risk scoring
US8171545B1 (en) * 2007-02-14 2012-05-01 Symantec Corporation Process profiling for behavioral anomaly detection
US20130301953A1 (en) * 2012-05-12 2013-11-14 Roland Wescott Montague Rotatable Object System For Visual Communication And Analysis
US8606696B1 (en) * 2012-09-11 2013-12-10 Simplexity, Inc. Assessing consumer purchase behavior in making a financial contract authorization decision
US20140074687A1 (en) * 2012-09-11 2014-03-13 Simplexity, Inc. Assessing consumer purchase behavior in making a financial contract authorization decision
US20140136608A1 (en) * 2012-11-15 2014-05-15 Tencent Technology (Shenzhen) Company Limited Method, device and system for processing client environment data
US20140164218A1 (en) * 2011-01-13 2014-06-12 Lenddo, Limited Risk-related scoring using online social footprint
US20150127628A1 (en) * 2012-04-16 2015-05-07 Onepatont Software Limited Method and System for Display Dynamic & Accessible Actions with Unique Identifiers and Activities
US9355155B1 (en) * 2015-07-01 2016-05-31 Klarna Ab Method for using supervised model to identify user
US9514452B2 (en) * 2012-11-20 2016-12-06 Paypal, Inc. System and method for simplified checkout with payment float
US20170004487A1 (en) * 2015-07-01 2017-01-05 Klarna Ab Method for using supervised model with physical store

Family Cites Families (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6029193A (en) 1996-06-25 2000-02-22 Matsushita Electric Industrial Co., Ltd. Data sending/receiving system, data broadcasting method and data receiving apparatus for television broadcasting
US7260643B2 (en) 2001-03-30 2007-08-21 Xerox Corporation Systems and methods for identifying user types using multi-modal clustering and information scent
US7266840B2 (en) 2001-07-12 2007-09-04 Vignette Corporation Method and system for secure, authorized e-mail based transactions
KR100408892B1 (en) 2002-01-29 2003-12-11 케이비 테크놀러지 (주) System for electronically settling accounts
JP3682529B2 (en) 2002-01-31 2005-08-10 独立行政法人情報通信研究機構 Summary automatic evaluation processing apparatus, summary automatic evaluation processing program, and summary automatic evaluation processing method
JP2003248676A (en) 2002-02-22 2003-09-05 Communication Research Laboratory Solution data compiling device and method, and automatic summarizing device and method
US7805366B2 (en) * 2003-03-21 2010-09-28 Ebay Inc. Method and system to facilitate payments to satisfy payment obligations resulting from purchase transactions
US20050021462A1 (en) 2003-07-21 2005-01-27 Don Teague Method and system to process a billing failure in a network-based commerce facility
US8572391B2 (en) 2003-09-12 2013-10-29 Emc Corporation System and method for risk based authentication
US20050125338A1 (en) 2003-12-09 2005-06-09 Tidwell Lisa C. Systems and methods for assessing the risk of a financial transaction using reconciliation information
US7594121B2 (en) 2004-01-22 2009-09-22 Sony Corporation Methods and apparatus for determining an identity of a user
US7970858B2 (en) 2004-10-29 2011-06-28 The Go Daddy Group, Inc. Presenting search engine results based on domain name related reputation
US7802723B2 (en) 2005-04-19 2010-09-28 American Express Travel Related Services Company, Inc. System and method for nameless biometric authentication and non-repudiation validation
US20120204257A1 (en) 2006-04-10 2012-08-09 International Business Machines Corporation Detecting fraud using touchscreen interaction behavior
US7634464B2 (en) * 2006-06-14 2009-12-15 Microsoft Corporation Designing record matching queries utilizing examples
US20080133407A1 (en) 2006-11-30 2008-06-05 Checkfree Corporation Methods and Systems for Determining and Displaying Payment Options in an Electronic Payment System
BRPI0720718A2 (en) 2006-12-29 2014-04-01 Thomson Reuters Glo Resources METHODS, INFORMATION RECOVERY SYSTEMS AND SOFTWARE WITH CONCEPT-BASED SEARCH AND CLASSIFICATION
US8131759B2 (en) * 2007-10-18 2012-03-06 Asurion Corporation Method and apparatus for identifying and resolving conflicting data records
US8315951B2 (en) 2007-11-01 2012-11-20 Alcatel Lucent Identity verification for secure e-commerce transactions
US9497583B2 (en) * 2007-12-12 2016-11-15 Iii Holdings 2, Llc System and method for generating a recommendation on a mobile device
US8738486B2 (en) 2007-12-31 2014-05-27 Mastercard International Incorporated Methods and apparatus for implementing an ensemble merchant prediction system
US8078651B2 (en) * 2008-01-24 2011-12-13 Oracle International Corporation Match rules to identify duplicate records in inbound data
KR20090126666A (en) 2008-06-05 2009-12-09 주진호 Credit loan method based on cyber-activity
US8543091B2 (en) 2008-06-06 2013-09-24 Ebay Inc. Secure short message service (SMS) communications
US8494897B1 (en) 2008-06-30 2013-07-23 Alexa Internet Inferring profiles of network users and the resources they access
US9324098B1 (en) 2008-07-22 2016-04-26 Amazon Technologies, Inc. Hosted payment service system and method
US8913991B2 (en) 2008-08-15 2014-12-16 At&T Intellectual Property I, L.P. User identification in cell phones based on skin contact
US9471920B2 (en) 2009-05-15 2016-10-18 Idm Global, Inc. Transaction assessment and/or authentication
US8204833B2 (en) * 2009-05-27 2012-06-19 Softroute Corporation Method for fingerprinting and identifying internet users
US8103650B1 (en) 2009-06-29 2012-01-24 Adchemy, Inc. Generating targeted paid search campaigns
US10438181B2 (en) 2009-07-22 2019-10-08 Visa International Service Association Authorizing a payment transaction using seasoned data
US9070146B2 (en) 2010-02-04 2015-06-30 Playspan Inc. Method and system for authenticating online transactions
US20110307381A1 (en) 2010-06-10 2011-12-15 Paul Kim Methods and systems for third party authentication and fraud detection for a payment transaction
US8577399B2 (en) 2010-06-15 2013-11-05 Cox Communications, Inc. Systems and methods for facilitating a commerce transaction over a distribution network
GB2482524A (en) 2010-08-05 2012-02-08 John Henry Pearson Locking system for adjustable prop
US8898159B2 (en) 2010-09-28 2014-11-25 International Business Machines Corporation Providing answers to questions using logical synthesis of candidate answers
US9477826B2 (en) * 2010-11-29 2016-10-25 Biocatch Ltd. Device, system, and method of detecting multiple users accessing the same account
US9183588B2 (en) * 2011-01-20 2015-11-10 Ebay, Inc. Three dimensional proximity recommendation system
US9455982B2 (en) * 2011-05-20 2016-09-27 Steve Smith Identification authentication in a communications network
US20120324367A1 (en) 2011-06-20 2012-12-20 Primal Fusion Inc. System and method for obtaining preferences with a user interface
US9529910B2 (en) 2011-07-13 2016-12-27 Jean Alexandera Munemann Systems and methods for an expert-informed information acquisition engine utilizing an adaptive torrent-based heterogeneous network solution
CA2747153A1 (en) 2011-07-19 2013-01-19 Suleman Kaheer Natural language processing dialog system for obtaining goods, services or information
US10810218B2 (en) * 2011-10-14 2020-10-20 Transunion, Llc System and method for matching of database records based on similarities to search queries
KR101766952B1 (en) 2012-05-02 2017-08-09 유니버시티 오브 매니토바 User identity detection on interactive surfaces
US9571514B2 (en) 2012-06-29 2017-02-14 International Business Machines Corporation Notification of security question compromise level based on social network interactions
US10089639B2 (en) 2013-01-23 2018-10-02 [24]7.ai, Inc. Method and apparatus for building a user profile, for personalization using interaction data, and for generating, identifying, and capturing user data across interactions using unique user identification
US9626629B2 (en) 2013-02-14 2017-04-18 24/7 Customer, Inc. Categorization of user interactions into predefined hierarchical categories
US9544381B2 (en) 2013-03-13 2017-01-10 Arizona Board Of Regents On Behalf Of Arizona State University User identification across social media
US10140664B2 (en) * 2013-03-14 2018-11-27 Palantir Technologies Inc. Resolving similar entities from a transaction database
US20140279509A1 (en) 2013-03-14 2014-09-18 Facebook, Inc. Method for implementing an alternative payment
US20150046302A1 (en) 2013-08-09 2015-02-12 Mastercard International Incorporated Transaction level modeling method and apparatus
WO2015034295A1 (en) 2013-09-05 2015-03-12 Samsung Electronics Co., Ltd. Method and apparatus for configuring and recommending device action using user context
US9628482B2 (en) * 2013-10-31 2017-04-18 Cellco Partnership Mobile based login via wireless credential transfer
US8818910B1 (en) 2013-11-26 2014-08-26 Comrise, Inc. Systems and methods for prioritizing job candidates using a decision-tree forest algorithm
WO2015081086A1 (en) 2013-11-27 2015-06-04 The Johns Hopkins University System and method for medical data analysis and sharing
US9571427B2 (en) 2013-12-31 2017-02-14 Google Inc. Determining strength of association between user contacts
EP3108612B1 (en) * 2014-02-18 2020-07-22 Secureauth Corporation Fingerprint based authentication for single sign on
US9485209B2 (en) 2014-03-17 2016-11-01 International Business Machines Corporation Marking of unfamiliar or ambiguous expressions in electronic messages
US9871714B2 (en) 2014-08-01 2018-01-16 Facebook, Inc. Identifying user biases for search results on online social networks
US9934507B2 (en) 2014-08-11 2018-04-03 International Business Machines Corporation Mapping user actions to historical paths to determine a predicted endpoint
US9384357B2 (en) 2014-10-01 2016-07-05 Quixey, Inc. Providing application privacy information
US20160203485A1 (en) * 2015-01-08 2016-07-14 Ca, Inc. Selective authentication based on similarities of ecommerce transactions from a same user terminal across financial accounts
US20160239837A1 (en) 2015-02-18 2016-08-18 Apriva, Llc Method and system for facilitating a payment transaction with a mobile payment server
US10026097B2 (en) * 2015-02-18 2018-07-17 Oath (Americas) Inc. Systems and methods for inferring matches and logging-in of online users across devices
US10121157B2 (en) 2015-04-17 2018-11-06 GoodData Corporation Recommending user actions based on collective intelligence for a multi-tenant data analysis system

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010023414A1 (en) * 1998-12-08 2001-09-20 Srihari Kumar Interactive calculation and presentation of financial data results through a single interface on a data-packet-network
US20010032182A1 (en) * 1998-12-08 2001-10-18 Srihari Kumar Interactive bill payment center
US8171545B1 (en) * 2007-02-14 2012-05-01 Symantec Corporation Process profiling for behavioral anomaly detection
US20100042487A1 (en) * 2008-08-12 2010-02-18 Yosef Barazani Apparatus and Method of Monetizing Hyperlinks
US20110131122A1 (en) * 2009-12-01 2011-06-02 Bank Of America Corporation Behavioral baseline scoring and risk scoring
US20140164218A1 (en) * 2011-01-13 2014-06-12 Lenddo, Limited Risk-related scoring using online social footprint
US20150127628A1 (en) * 2012-04-16 2015-05-07 Onepatont Software Limited Method and System for Display Dynamic & Accessible Actions with Unique Identifiers and Activities
US20130301953A1 (en) * 2012-05-12 2013-11-14 Roland Wescott Montague Rotatable Object System For Visual Communication And Analysis
US8606696B1 (en) * 2012-09-11 2013-12-10 Simplexity, Inc. Assessing consumer purchase behavior in making a financial contract authorization decision
US20140074687A1 (en) * 2012-09-11 2014-03-13 Simplexity, Inc. Assessing consumer purchase behavior in making a financial contract authorization decision
US20140136608A1 (en) * 2012-11-15 2014-05-15 Tencent Technology (Shenzhen) Company Limited Method, device and system for processing client environment data
US9514452B2 (en) * 2012-11-20 2016-12-06 Paypal, Inc. System and method for simplified checkout with payment float
US9355155B1 (en) * 2015-07-01 2016-05-31 Klarna Ab Method for using supervised model to identify user
US20170004487A1 (en) * 2015-07-01 2017-01-05 Klarna Ab Method for using supervised model with physical store

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10362067B2 (en) * 2015-09-04 2019-07-23 Swim.IT Inc Method of and system for privacy awareness
US10367852B2 (en) 2015-09-04 2019-07-30 Swim.IT Inc. Multiplexed demand signaled distributed messaging
US20170070539A1 (en) * 2015-09-04 2017-03-09 Swim.IT Inc. Method of and system for privacy awareness
US20210150010A1 (en) * 2015-10-14 2021-05-20 Pindrop Security, Inc. Fraud detection in interactive voice response systems
US11748463B2 (en) * 2015-10-14 2023-09-05 Pindrop Security, Inc. Fraud detection in interactive voice response systems
US10853359B1 (en) 2015-12-21 2020-12-01 Amazon Technologies, Inc. Data log stream processing using probabilistic data structures
US11151535B1 (en) * 2016-06-13 2021-10-19 Square, Inc. Utilizing APIs to facilitate open ticket synchronization
US11252174B2 (en) * 2016-12-16 2022-02-15 Worldpay, Llc Systems and methods for detecting security risks in network pages
US10601718B2 (en) 2017-04-03 2020-03-24 Bank Of America Corporation Data transfer, over session or connection, and between computing device and server associated with a routing network for modifying one or more parameters of the routing network
US10716060B2 (en) 2017-04-03 2020-07-14 Bank Of America Corporation Data transfer between computing device and user device at different locations and over session or connection to display one or more routing networks to use
US10798007B2 (en) 2017-04-03 2020-10-06 Bank Of America Corporation Data transfer, over session or connection, and between computing device and server associated with a routing network for modifying one or more parameters of the routing network
US10609156B2 (en) * 2017-04-03 2020-03-31 Bank Of America Corporation Data transfer, over session or connection, and between computing device and server associated with one or more routing networks in response to detecting activity
US10608918B2 (en) 2017-04-03 2020-03-31 Bank Of America Corporation Data transfer, over session or connection, and between computing device and one or more servers to determine likelihood of user device using a routing network
US10601934B2 (en) 2017-04-03 2020-03-24 Bank Of America Corporation Data transfer, over session or connection, and between computing device and one or more servers for transmitting data to a third party computing device
US20180287852A1 (en) * 2017-04-03 2018-10-04 Bank Of America Corporation Data Transfer, Over Session or Connection, and Between Computing Device and One or More Servers to Determine Third Party Routing Network For User Device
US11157781B2 (en) * 2017-09-21 2021-10-26 Royal Bank Of Canada Device and method for assessing quality of visualizations of multidimensional data
US20190087692A1 (en) * 2017-09-21 2019-03-21 Royal Bank Of Canada Device and method for assessing quality of visualizations of multidimensional data
US11005839B1 (en) * 2018-03-11 2021-05-11 Acceptto Corporation System and method to identify abnormalities to continuously measure transaction risk
US11790470B1 (en) 2018-03-16 2023-10-17 Block, Inc. Storage service for sensitive customer data
US20210240144A1 (en) * 2018-06-07 2021-08-05 Omron Corporation Control system, control method, learning device, control device, learning method, and non-transitory computer-readable recording medium
US11681261B2 (en) * 2018-06-07 2023-06-20 Omron Corporation Control system, control method, learning device, control device, learning method for controlling an operation of a subject device on the basis of a determined command value
US11075941B2 (en) * 2018-07-11 2021-07-27 Advanced New Technologies Co., Ltd. Risk control method, risk control apparatus, electronic device, and storage medium
US11126340B2 (en) * 2019-02-20 2021-09-21 Mastercard International Incorporated Systems and methods for dynamically generating customized web-based payment interfaces
US20220360847A1 (en) * 2019-06-25 2022-11-10 The Regents Of The University Of California Systems and methods for characterizing joint attention during real world interaction
US10846383B2 (en) * 2019-07-01 2020-11-24 Advanced New Technologies Co., Ltd. Applet-based account security protection method and system
US11144904B2 (en) * 2019-07-08 2021-10-12 Synchrony Bank Post-purchase credit offer and tender switch
US11893564B2 (en) 2019-07-08 2024-02-06 Synchrony Bank Post-purchase credit offer and tender switch
US20210049624A1 (en) * 2019-08-16 2021-02-18 The Toronto-Dominion Bank System and Method for Identifying Prospective Entities to Interact With
US11671462B2 (en) 2020-07-23 2023-06-06 Capital One Services, Llc Systems and methods for determining risk ratings of roles on cloud computing platform
US20230033901A1 (en) * 2021-07-28 2023-02-02 Citicorp Credit Services, Inc. (Usa) Dynamic revision of webpages with customized options
US20230058158A1 (en) * 2021-08-19 2023-02-23 Allstate Insurance Company Automated iterative predictive modeling computing platform
US11386490B1 (en) * 2022-01-12 2022-07-12 Chime Financial, Inc. Generating graphical user interfaces comprising dynamic credit value user interface elements determined from a credit value model
US20230222578A1 (en) * 2022-01-12 2023-07-13 Chime Financial, Inc. Generating graphical user interfaces comprising dynamic credit value user interface elements determined from a credit value model
US11966972B2 (en) * 2022-06-24 2024-04-23 Chime Financial, Inc. Generating graphical user interfaces comprising dynamic credit value user interface elements determined from a credit value model

Also Published As

Publication number Publication date
US20170004136A1 (en) 2017-01-05
US10607199B2 (en) 2020-03-31
US9355155B1 (en) 2016-05-31
US20180144315A1 (en) 2018-05-24
US20200202317A1 (en) 2020-06-25
US20180158037A1 (en) 2018-06-07
US20170004469A1 (en) 2017-01-05
US9904916B2 (en) 2018-02-27
US11461751B2 (en) 2022-10-04
US10417621B2 (en) 2019-09-17
US20170004582A1 (en) 2017-01-05
US9886686B2 (en) 2018-02-06

Similar Documents

Publication Title
US20170004573A1 (en) Workflow processing and user interface generation based on activity data
US10387882B2 (en) Method for using supervised model with physical store
US10565641B2 (en) Financial gadgets
US10417706B1 (en) Integrating externally-supplied interface component into transaction platform
US20130117154A1 (en) Method and System of Evaluating Credibility of Online Trading User
US10909590B2 (en) Merchant and item ratings
US20160321722A1 (en) Systems and methods for obtaining consumer data
US20170193596A1 (en) System and Method for Providing a Financial Product Using a Customer Product Criteria
US20210090165A1 (en) Application programing interface for providing financial-product eligibility quotation
JP2023546849A (en) Machine learning to predict, recommend, and buy and sell securities in currency markets
JP2019091355A (en) Determination device, determination method and determination program
JP2019185595A (en) Information processor, method for processing information, information processing program, determination device, method for determination, and determination program
JP2018116694A (en) Calculation device, calculation method and calculation program
JP6267812B1 (en) Calculation device, calculation method, and calculation program
US20210217035A1 (en) Fair price estimator
US10956925B1 (en) Method and system for performing transactions using aggregate payment media
WO2016209990A1 (en) Presenting opportunities for instant transactions
KR102171487B1 (en) Method for managing or using platform capable of providing one or more secured loan as customized secured loan to eligible lenders, and server and financial institution terminal by using same
CA3030386A1 (en) Application programing interface for providing financial-product eligibility quotation
JP7337298B1 (en) Information processing method, information processing program, and information processing apparatus
US20210256486A1 (en) Computer Based System and Method for Controlling Third Party Transacting Through a single Interface
US20140258092A1 (en) Systems and methods for analyzing financial accounts and business efforts based on lender information
WO2023007384A1 (en) Systems and methods for asset authentication and management
CN117437044A (en) Medium bond and bond transaction data processing method, device, equipment and medium

Legal Events

Date Code Title Description
AS Assignment
  Owner name: KLARNA AB, SWEDEN
  Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HUSSAIN, MIKAEL;REEL/FRAME:036837/0794
  Effective date: 20151019

STPP Information on status: patent application and granting procedure in general
  Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general
  Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general
  Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general
  Free format text: NON FINAL ACTION MAILED

STCV Information on status: appeal procedure
  Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure
  Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure
  Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

AS Assignment
  Owner name: KLARNA BANK AB, SWEDEN
  Free format text: CHANGE OF NAME;ASSIGNOR:KLARNA AB;REEL/FRAME:055177/0578
  Effective date: 20190927

STCV Information on status: appeal procedure
  Free format text: BOARD OF APPEALS DECISION RENDERED

STPP Information on status: patent application and granting procedure in general
  Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general
  Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general
  Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general
  Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure
  Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure
  Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure
  Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS