WO2001039428A2 - Method and system for protecting of user privacy - Google Patents

Method and system for protecting of user privacy Download PDF

Info

Publication number
WO2001039428A2
Authority
WO
WIPO (PCT)
Prior art keywords
user
personal
data
response
encrypted
Prior art date
Application number
PCT/US2000/042241
Other languages
French (fr)
Other versions
WO2001039428A3 (en)
Inventor
James F. Moore
Original Assignee
Geopartners Research, Inc.
Priority date
Filing date
Publication date
Application filed by Geopartners Research, Inc. filed Critical Geopartners Research, Inc.
Priority to AU30835/01A priority Critical patent/AU3083501A/en
Publication of WO2001039428A2 publication Critical patent/WO2001039428A2/en
Publication of WO2001039428A3 publication Critical patent/WO2001039428A3/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/04 - Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0407 - Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden
    • H04L63/0414 - Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden during transmission, i.e. party's identity is protected against eavesdropping, e.g. by using temporary identifiers, but is known to the other party or parties involved in the communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 - Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 - Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254 - Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2211/00 - Indexing scheme relating to details of data-processing equipment not covered by groups G06F3/00 - G06F13/00
    • G06F2211/007 - Encryption, En-/decode, En-/decipher, En-/decypher, Scramble, (De-)compress

Abstract

A method for individuals to disclose personal data and enter into a mass-customized relationship with a web site, while protecting their personal privacy, by affixing digital signatures and approvals-for-use to individual data items. The items are then sent to an intermediary site, referred to as a 'digital data agency' (DDA), which holds the personal data secure but transmits the personal data to digital data collectors/responders (DDCRs) in accordance with permissions provided by the user. Responses from the DDCRs are then sent to the DDA, which encrypts the responses and forwards them to the user. The user may then decrypt and review the responses.

Description

METHOD AND SYSTEM FOR DISCLOSING PERSONAL DATA WHILE PROTECTING PERSONAL PRIVACY
TECHNICAL FIELD
The invention relates generally to the exposure of personal data and, more particularly, to a method and system that enables individual end users to voluntarily disclose personal data while protecting their personal privacy.
BACKGROUND
Periodically, individual persons will have a need to expose personal data about themselves to other parties such as, for example, marketers of products, providers of services such as health care or financial services, government agencies, other individual persons, and the like. In many cases, the individual wishes to make this data available only in the context of a particular transaction or relationship, or for a limited period of time, or until a specific event occurs. The individual may desire to provide specific permissions and restrictions for the use of such personal data, and prohibit non-approved uses. The individual may also desire prior, concurrent, or subsequent notification to himself or herself of the use or conveyance of each element of personal data conveyed to the other party.
By way of example, a person with a serious medical condition may desire to search and surf the web or visit chat rooms to find answers to issues regarding the medical condition, but the web and chat rooms are inefficient, and expose the person to significant privacy risks. Searching and surfing the web also exposes the user's keystrokes to companies such as Engage™ that generate profiles based on a user's web behavior. At best this generates targeted banner ads that may or may not be desired; at worst these profiles may be linked to the user's name and offline data. Chat rooms also require that the user provide his/her e-mail address or identity, which may invite undesired responses.
Thus, conventional technologies provide no practical method for managing an approval process that relates to collecting, storing, disseminating, and auditing the use of personal data.
On the other hand, so-called "anonymizer" or identity-masking services, such as that provided by Zero-Knowledge Systems™, completely mask a user's identity to web sites, and make it impossible for sites to provide customized automated responses to particular users. Such systems conspicuously lack a technology that encourages the exchange of complex, conditional responses between a user and an automated web site. What is needed is technology that integrates privacy protection with enhancement of person-to-machine dialogue. In addition to the foregoing, some governmental authorities, particularly several European governments and the European Community, have passed laws and regulations that require collectors of personal data to provide individuals with the ability to restrict the use of data about themselves, unless those individuals give specific approval in advance for its wider dissemination and use. At this time, though, there is no practical method for accomplishing this policy aim and complying with these laws.
The collectors of personal data about individuals, such as marketers of products, providers of services such as health care or financial services, government agencies, and the like, face substantial problems in responding to individuals. While they have the technical ability to create sites that are "mass customized" and communicate with customers as a "unit of one," the collectors of data do not have sources of data and insight about customers that are reliable enough to drive such systems effectively. What they have is data that is partial, fragmentary, demographic, and perhaps broadly psychographic, but that does not relate directly to things like customer values, intentions, and specific needs. This lack of data and insight is primarily due to the unwillingness of the customer to provide overt personal data, because of privacy concerns or a lack of effective ways to keep personal data private, and the resulting covert, unverified nature of the data that collectors have. Moreover, even when collectors of personal data obtain valid data that generates useful insights, they cannot easily communicate with individuals in regard to elements of data. They collect increasing amounts of data without the approval of the individuals involved. They seek to use this data to create new value, but are increasingly either constrained by regulation or at risk of offending their customers if they use personal data in new ways without approval. On the other hand, there is no practical way for them to gain and manage such approval. Thus, a need has arisen for methods and systems for protecting privacy while encouraging interaction between individuals and mass-customized, automated web services, and in the process gaining and managing approval from individuals for collecting, storing, disseminating, and auditing the use of their personal data.
Such methods and systems should, among other things, also be effective for implementing the policy aim of, and complying with the laws of, governmental authorities, particularly European governments and the European Community, who have passed laws and regulations requiring collectors of personal data to provide individuals with the ability to restrict the use of data about themselves, unless those individuals give specific approval in advance for its wider dissemination and use.
SUMMARY
The present invention, accordingly, provides a method for enabling an individual end user to disclose personal data and enter into a mass-customized dialogue with one or more web sites, while protecting personal privacy. The method comprises steps performed by an individual end user to fill out questionnaires by way of a software application residing on the end user's computer or similar device, and generating packet messages containing encrypted personal data and an encrypted personal identifier of the user. The packet messages are sent to a digital data agency (DDA), which decrypts the personal data, but leaves the personal ID encrypted, and then forwards the packet messages to one or more digital data collector/responders (DDCRs). The DDA then receives from the DDCRs responses to the packet messages, and decrypts the encrypted personal identifier to determine the individual end user. The DDA then encrypts the responses, and forwards the encrypted responses to an interface for review by the individual end user. By the use of the present invention, personal data of individual persons may be collected, stored, disseminated, and audited in accordance with approvals and permissions provided by the individual. Data elements may also be processed individually (i.e., element-by-element), thereby providing additional privacy to an individual. A user may also differentiate between data elements to provide different levels of protection for each data element. The transmission of data elements in packets also facilitates quick responses. The present invention should, among other things, also be effective for implementing the policy aim of, and complying with the laws of, governmental authorities, particularly European governments and the European Community, who have passed laws and regulations requiring collectors of personal data to provide individuals with the ability to restrict the use of data about themselves, unless those individuals give specific approval in advance for its wider dissemination and use.
The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated by those skilled in the art that the conception and the specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIGURE 1 is a high-level conceptual block diagram illustrating a system embodying features of the present invention;
FIGURE 1A exemplifies a questionnaire that may be used in connection with the system of FIG. 1;
FIGURE 2 is a flow chart illustrating steps executed on the system of FIG. 1 for practicing the present invention;
FIGURE 3 exemplifies entries made by a user for transmission to a digital data agency of FIG. 1; and
FIGURE 4 shows the structure of a data message sent by a user into the system of FIG. 1.
DETAILED DESCRIPTION
In the following discussion, numerous specific details are set forth to provide a thorough understanding of the present invention. However, it will be obvious to those skilled in the art that the present invention may be practiced without such specific details. In other instances, well-known components have been illustrated in schematic or block diagram form in order not to obscure the present invention in unnecessary detail. Additionally, for the most part, and in the interest of conciseness, details concerning the Internet and the like have been omitted inasmuch as such details are not considered necessary to obtain a complete understanding of the present invention, and are within the skills of persons of ordinary skill in the relevant art. It is further noted that, unless indicated otherwise, all functions described herein are performed by a processor, such as a computer or electronic data processor, in accordance with code such as computer program code, software, or integrated circuits that are coded to perform certain functions.
Referring to FIGURE 1 of the drawings, the reference numeral 100 generally designates a system embodying features of the present invention that enables individual persons, i.e., end users (not shown), to disclose personal data while protecting their personal privacy, by entering anonymously into mass-customized, automated dialogues of query and response with selected, preferably automated, web sites. The system 100 includes an interface 102, such as a computer terminal, personal digital assistant (PDA), or the like, through which an individual person (hereinafter "end user" or simply "user") or other provider of personal data may enter personal data. The interface 102 is connected in data communication with a digital data agency (DDA) 104, which acts in an intermediary role between the interface 102 and one or more audited, preferably automated, web sites, referred to herein as digital data collector/responders (DDCR) 106 (e.g., a medical clinic), as discussed in further detail below. The interface 102 further includes an applet 103 (a small application program containing computer code) for execution on the interface 102 for enabling the user to enter personal data as discussed below. The interface 102 still further includes a registry 105, or access to an open, public registry, which contains a list of standard, generic questions, the answers to which would provide a DDCR 106 with sufficient information to enable it to be responsive to the needs of the user. The registry 105 also provides a data element registry number which is assigned to each question for purposes discussed below.
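The registry of standard questions can be pictured as a simple lookup from question text to data element registry number. The following Python sketch is illustrative only: the patent does not define a registry format, and the question texts, registry numbers, and helper function below are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class RegistryEntry:
    registry_number: int  # data element registry number assigned to the question
    question: str         # standard, generic question text

# Hypothetical stand-in for the open, public registry 105 of standard questions.
REGISTRY = [
    RegistryEntry(1001, "What medical condition are you seeking information about?"),
    RegistryEntry(1002, "How long ago were you diagnosed?"),
    RegistryEntry(1003, "What medications are you currently taking?"),
]

def registry_number_for(question_text: str) -> Optional[int]:
    """Correlate a questionnaire question with its data element registry number."""
    for entry in REGISTRY:
        if entry.question == question_text:
            return entry.registry_number
    return None

print(registry_number_for("How long ago were you diagnosed?"))  # -> 1002
```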
FIGURE 2 is a flowchart of steps executed in accordance with the present invention for disclosing a user's personal data while protecting the user's personal privacy. Prior to executing the steps shown in FIG. 2, a personal identity and a digital signature must be established in a public key encryption (e.g., PGP) relationship between the user and the DDA 104. The DDA 104 may optionally request additional identifying information about the user, such as the user's home address, telephone number, and the like. Personal identities, digital signatures, encryption, and the like are considered to be well-known in the art and, therefore, will not be discussed in further detail herein. In step 202, the user obtains a suitable questionnaire from a suitable source, such as a DDCR 106 via the Internet, and completes it. For example, a person with Lupus may obtain a questionnaire to complete that would help him/her determine how he/she should deal with it. FIGURE 1A exemplifies a questionnaire 120 that a user may obtain. As shown, the questionnaire 120 requests that a user enter his/her personal ID in a blank 122, and then respond in blanks 124 to a number of corresponding questions that are relevant, for example, to dealing with Lupus. The questionnaire 120 then asks the user to fill in five approval/permission parameters 126, 128, 130, 132, and 134 relating to the responses 124. In the parameter 126, a user identifies what uses (e.g., medical diagnostics) the responses 124 may be put to. The blank 128 requests that a user identify what uses (e.g., an emergency referral to a health provider), other than those listed in the blank 126, a respective response 124 may be put to. In the parameter 130, the user identifies which parties (e.g., web sites recognized by the user to be highly reliable sources of relevant information, such as the Mayo Clinic, the National Institutes of Health, and Dr. Koop, and who operate mass-customized automated response capabilities in accordance with the present invention) the responses 124 may be disclosed to. In the parameter 132, a user identifies whether any parties, other than those identified in the parameter 130, may receive the responses 124. In the parameter 134, a user identifies a length of time (e.g., three hours) that the approval/permission parameters 126, 128, 130, and 132 apply with respect to the responses 124. The questionnaire 120 may be customized in any of a number of different ways. For example, the parameters 126, 128, 130, 132, and 134 may be applied to each response 124 individually rather than as a group.
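As a rough illustration of the data captured by such a questionnaire, the Python sketch below models the five approval/permission parameters and the answers keyed by data element registry number. The field names and example values are assumptions made for this sketch; the patent specifies the parameters only conceptually.

```python
from dataclasses import dataclass

@dataclass
class ApprovalParameters:
    approved_uses: list[str]       # parameter 126: uses the responses may be put to
    additional_uses: list[str]     # parameter 128: further uses beyond those in 126
    approved_parties: list[str]    # parameter 130: parties the responses may be disclosed to
    other_parties_allowed: bool    # parameter 132: whether parties not listed in 130 may receive them
    valid_for_hours: float         # parameter 134: how long parameters 126-132 remain in force

@dataclass
class CompletedQuestionnaire:
    personal_id: str               # blank 122
    answers: dict[int, str]        # data element registry number -> response (blanks 124)
    approvals: ApprovalParameters  # applied here to the responses as a group

questionnaire = CompletedQuestionnaire(
    personal_id="user-7421",
    answers={1001: "Lupus", 1002: "Two years", 1003: "Hydroxychloroquine"},
    approvals=ApprovalParameters(
        approved_uses=["medical diagnostics"],
        additional_uses=["emergency referral to a health provider"],
        approved_parties=["Mayo Clinic", "National Institutes of Health"],
        other_parties_allowed=False,
        valid_for_hours=3.0,
    ),
)
```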
Upon completion of step 202, execution proceeds to step 204, wherein the completed questionnaire is processed by the applet 103 to generate a table 300 such as exemplified in FIGURE 3. As shown therein, the table 300 includes eight columns, or fields, 302, 304, 306, 308, 310, 312, 314, and 316, and any number of rows 314. The table 300 is generated based on the responses entered into the questionnaire 120 in step 202, and on data stored in the registry 105. Each row 314 corresponds to one response 124. More specifically, the user's personal ID 122 is encrypted and stored in the field 302. Each question corresponding to a respective response 124 is correlated through the registry 105 with a data element registry number, which is then entered into the field 304 of a respective row 314. The user's response 124 corresponding to the respective question, or data element registry number, is entered into the field 306 of a respective row 314. The fields 308, 310, 312, 314, and 316 correspond directly with the parameters 126, 128, 130, 132, and 134, respectively, for each respective response 124. For the questionnaire exemplified in FIG. 1A, the parameters 126, 128, 130, 132, and 134 would be the same for all rows 314. As mentioned above, however, the parameters 126, 128, 130, 132, and 134 may be individualized for each response 124, in which case the fields 308, 310, 312, 314, and 316 may differ for each row 314. The applet 103 then appends the user's aforementioned digital signature in a field 316. The user's e-mail reply address may be entered in the field 318 for facilitating further communications and notifications from the DDA 104 regarding the data entered in the table 300. Upon completion of step 204, execution proceeds to step 206, wherein the applet 103 converts each row 314 of the table 300 of data to a packet message (also referred to as a "digital identity packet") 400, as depicted in FIGURE 4. Each packet message 400 contains eight fields 402, 404, 406, 408, 410, 412, 414, and 416, which correspond directly to the fields 302, 304, 306, 308, 310, 312, 314, and 316, respectively, of a row 314 of the table 300. The fields 402, 404, 406, 408, 410, 412, 414, and 416 of each packet message 400 are then preferably encrypted (hence, the personal ID is preferably encrypted twice), and suitable headers (not shown) and the like, well-known in the art, are appended to the packet message for facilitating transmission of the packet message 400 to the DDA 104. The packet messages 400 are then transmitted from the interface 102 to the DDA 104. In step 208, the DDA 104 receives and decrypts the packet messages 400 (hence rendering the personal ID still singly encrypted). The fields 412 and 414 of the decrypted packet messages 400 are then examined to identify the DDCRs 106 that should receive the packet messages 400. In step 210, the packet messages 400 are transmitted to the DDCRs identified in step 208. Prior to transmitting the packet messages 400 in step 210, the DDA 104 may optionally remove the fields 412 and 414 from the packet messages 400. Alternatively, rather than transmitting the packet messages to the DDCRs, the packet messages may be made available for searching by the DDCRs, which may respond as desired.
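A minimal sketch of steps 204 through 208, assuming the questionnaire structure sketched above is flattened to plain values: each answer becomes one packet whose personal ID is encrypted in an inner layer the DDCR can never open, and the whole packet is encrypted again for transport to the DDA. The patent contemplates public-key (e.g., PGP) encryption between the parties; symmetric Fernet keys from the third-party cryptography package are used here only to keep the example short, and the field names are illustrative.

```python
import json
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

id_key = Fernet(Fernet.generate_key())         # inner layer: shared by the user and the DDA only
transport_key = Fernet(Fernet.generate_key())  # outer layer: protects the packet on its way to the DDA

def build_packets(personal_id, answers, approvals):
    """Applet side (steps 204-206): turn each table row into an encrypted packet message."""
    inner_id = id_key.encrypt(personal_id.encode()).decode()   # stays encrypted end to end
    packets = []
    for registry_number, response in answers.items():
        row = {"personal_id": inner_id,
               "registry_number": registry_number,
               "response": response,
               **approvals}                                    # parameters 126-134 travel with every element
        packets.append(transport_key.encrypt(json.dumps(row).encode()))
    return packets

def dda_unwrap(packet):
    """DDA side (step 208): strip the outer layer; the personal ID remains singly encrypted."""
    return json.loads(transport_key.decrypt(packet))

packets = build_packets(
    "user-7421",
    {1001: "Lupus", 1002: "Two years"},
    {"approved_uses": ["medical diagnostics"],
     "approved_parties": ["Mayo Clinic"],
     "valid_for_hours": 3},
)
first = dda_unwrap(packets[0])
print(first["registry_number"], first["personal_id"][:16], "...")  # the personal ID is still ciphertext
```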
In step 212, each DDCR 106 receives the packet messages 400 and analyzes the fields, namely, the fields 404 and 406, and from such analysis, generates an appropriate response. The DDCR 106 preferably utilizes rule-based software (e.g., expert systems) to quickly generate responses to the packet messages. Each DDCR also notes and respects the use and time parameters identified in the fields 408, 410, and 416. The DDCR 106 may optionally also correlate the packet messages together based on the encrypted personal ID carried within the field 402 of each packet message to thereby perform a better analysis and generate a more meaningful response. It is noted, however, that the DDCR 106 is not enabled to decrypt the encrypted personal ID carried within the field 402 of the packet 400, but does include it in the response that it generates so that the DDA 104 may track the user to whom the response applies. In step 214, each DDCR 106 transmits the response generated in step 212, along with the encrypted personal ID carried within the field 402, to the DDA 104 from which the DDCR received the packet messages 400.
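One way to picture a DDCR's rule-based responder in step 212 is a small table of rules keyed by data element registry number, applied only while the user's time-limit parameter remains in force. This is a hedged sketch built on the packet layout assumed in the previous example; the rules, registry numbers, and messages are invented for illustration, and a real DDCR would use a far richer expert system.

```python
import time
from typing import Optional

# Hypothetical rules keyed by data element registry number (field 404).
RULES = {
    1001: lambda answer: f"Overview material on {answer} has been selected for you.",
    1002: lambda answer: ("A newly-diagnosed information pack applies."
                          if "month" in answer.lower()
                          else "Long-term management resources apply."),
}

def respond(packet: dict, received_at: Optional[float] = None) -> Optional[dict]:
    """DDCR side (step 212): generate a response if the packet's permissions still allow it."""
    received_at = received_at if received_at is not None else time.time()
    # Respect the time-limit parameter: decline once the approval window has lapsed.
    if time.time() > received_at + packet.get("valid_for_hours", 0) * 3600:
        return None
    rule = RULES.get(packet["registry_number"])
    if rule is None:
        return None
    return {
        "personal_id": packet["personal_id"],   # still encrypted; echoed so the DDA can route the reply
        "registry_number": packet["registry_number"],
        "answer": rule(packet["response"]),
    }
```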
In step 216, the DDA 104 receives the responses and associated encrypted personal ID from the DDCRs 106. The DDA 104 then decrypts the personal ID to identify the user that generated the packet messages to which the responses pertain. In step 218, the DDA 104 encrypts the responses received from the DDCRs 106, and forwards the encrypted responses to the interface 102 of the identified user. In step 220, the interface 102 receives the encrypted messages and decrypts the responses. The interface 102 then presents the responses to the user in any conventional manner, such as via monitor or hardcopy. By the use of the present invention, a method and system are provided by which personal data from individual persons may be collected, stored, disseminated, and audited in accordance with approvals and permissions provided by the individual. The use of the table 300 facilitates the handling of each individual data element (e.g., the responses 124 and corresponding fields 304 and 306 of each row 314) with individual (i.e., element-by-element) approvals and permissions. A user may thus differentiate between data elements to provide different levels of protection and approval for each data element. Each data element may also be processed individually, thereby providing additional privacy to an individual user. The transmission of data elements in packets 400 also facilitates quick responses. The present invention should, among other things, also be effective for implementing the policy aim of, and complying with the laws of, governmental authorities, particularly European governments and the European Community, who have passed laws and regulations requiring collectors of personal data to provide individuals with the ability to restrict the use of data about themselves, unless those individuals give specific approval in advance for its wider dissemination and use.
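Steps 216 through 220 close the loop at the DDA and the interface: the DDA alone can decrypt the personal ID echoed by the DDCR, re-encrypts the substance of the response for that user, and the user's interface decrypts and displays it. The sketch below is a simplification under the same assumptions as the earlier examples; symmetric keys and an in-memory key table stand in for the public-key (e.g., PGP) relationship and the DDA's records.

```python
import json
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

id_key = Fernet(Fernet.generate_key())                     # shared by the user's applet and the DDA only
user_keys = {"user-7421": Fernet(Fernet.generate_key())}   # per-user key for responses sent to the interface

def dda_handle_response(ddcr_response: dict):
    """DDA side (steps 216-218): identify the user behind the encrypted ID and re-encrypt the response."""
    personal_id = id_key.decrypt(ddcr_response["personal_id"].encode()).decode()
    sealed = user_keys[personal_id].encrypt(json.dumps(ddcr_response["answer"]).encode())
    return personal_id, sealed                             # forwarded to the interface 102 of that user

def interface_display(personal_id: str, sealed: bytes) -> str:
    """Interface side (step 220): decrypt the response and present it to the user."""
    return json.loads(user_keys[personal_id].decrypt(sealed))

# End-to-end demonstration with a hypothetical DDCR response.
encrypted_id = id_key.encrypt(b"user-7421").decode()
uid, sealed = dda_handle_response({"personal_id": encrypted_id,
                                   "answer": "Long-term management resources apply."})
print(interface_display(uid, sealed))
```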
It is understood that the present invention can take many forms and embodiments. Accordingly, several variations may be made in the foregoing without departing from the spirit or the scope of the invention. For example, should a DDCR 106, after it has provided a response, desire to provide additional responses in the future to an individual end user, the DDCR may query the intermediary DDA 104 to determine whether the individual would be willing to receive additional responses. In a second example, the DDA 104 might query individual end users on its own behalf, to determine if they would be interested in receiving either questionnaires or responses from additional sites. In a third example, a particular DDCR 106 may offer to respond to questionnaires provided by other DDCRs, and could make this offer either by way of an intermediary DDA 104 or by mass appeals directly to potential end users (of course not knowing which or how many of the appeal group are current or past users of the system). In a fourth example, a DDCR's response to an individual end user may itself include an additional questionnaire, thus stimulating additional information-sharing by the end user, and providing more information for the DDCR to use in preparing subsequent responses. Having thus described the present invention by reference to certain of its preferred embodiments, it is noted that the embodiments disclosed are illustrative rather than limiting in nature and that a wide range of variations, modifications, changes, and substitutions are contemplated in the foregoing disclosure and, in some instances, some features of the present invention may be employed without a corresponding use of the other features. Many such variations and modifications may be considered obvious and desirable by those skilled in the art based upon a review of the foregoing description of preferred embodiments. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of the invention.

Claims

1. A method for enabling a user to disclose personal data while protecting personal privacy, the method comprising the steps performed by an intermediary digital data agency of: receiving at least one packet message containing personal data, including an encrypted personal identification (ID); forwarding the at least one packet message to a digital data collector/responder (DDCR); receiving from the DDCR a response to the packet message, the response including the encrypted personal ID; decrypting the encrypted personal ID to identify the user that generated the at least one packet message; encrypting the response received from the DDCR; and forwarding the encrypted response to an interface for review by the user.
2. A method for enabling web sites to establish mass-customized, automated dialogues of query and response with an anonymous individual user, the method comprising the steps of: providing a questionnaire to the user; and the user entering into a mass-customized dialogue with the automated site, utilizing information and an encrypted identification (ID) supplied by the user in response to the questionnaire, and communicating by way of a trusted intermediary digital data agency (DDA).
3. A system for enabling a user to disclose personal data while protecting personal privacy, the system comprising: a) an interface through which the user may enter personal data with encrypted personal identification (ID), and retrieve responses therefrom; b) a digital data agency (DDA) coupled in data communication for receiving the personal data and encrypted personal ID from the interface; and c) a digital data collector/responder coupled in data communication for receiving the personal data from the digital data agency, for generating a response to the personal data, and for transmitting the response to the DDA, which DDA forwards the response to the interface for retrieval by the user.
PCT/US2000/042241 1999-11-24 2000-11-24 Method and system for protecting of user privacy WO2001039428A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU30835/01A AU3083501A (en) 1999-11-24 2000-11-24 Method and system for disclosing personal data while protecting personal privacy

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US16728699P 1999-11-24 1999-11-24
US60/167,286 1999-11-24
US72183800A 2000-11-23 2000-11-23
US09/721,838 2000-11-23

Publications (2)

Publication Number Publication Date
WO2001039428A2 true WO2001039428A2 (en) 2001-05-31
WO2001039428A3 WO2001039428A3 (en) 2002-02-07

Family

ID=26863026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/042241 WO2001039428A2 (en) 1999-11-24 2000-11-24 Method and system for protecting of user privacy

Country Status (2)

Country Link
AU (1) AU3083501A (en)
WO (1) WO2001039428A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1349034A2 (en) * 2002-03-15 2003-10-01 Matsushita Electric Industrial Co., Ltd. Service providing system in which services are provided from service provider apparatus to service user apparatus via network
WO2004047445A1 (en) * 2002-11-15 2004-06-03 Koninklijke Philips Electronics N.V. Usage data harvesting
WO2004046964A2 (en) * 2002-11-15 2004-06-03 Koninklijke Philips Electronics N.V. Accessing on-line services
WO2011107490A3 (en) * 2010-03-01 2015-09-03 Cvon Innovations Ltd User information and distribution system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0855659A1 (en) * 1997-01-22 1998-07-29 Lucent Technologies Inc. System and method for providing anonymous personalized browsing in a network
US5790665A (en) * 1996-01-17 1998-08-04 Micali; Silvio Anonymous information retrieval system (ARS)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5790665A (en) * 1996-01-17 1998-08-04 Micali; Silvio Anonymous information retrieval system (ARS)
EP0855659A1 (en) * 1997-01-22 1998-07-29 Lucent Technologies Inc. System and method for providing anonymous personalized browsing in a network

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1349034A2 (en) * 2002-03-15 2003-10-01 Matsushita Electric Industrial Co., Ltd. Service providing system in which services are provided from service provider apparatus to service user apparatus via network
EP1349034A3 (en) * 2002-03-15 2004-02-25 Matsushita Electric Industrial Co., Ltd. Service providing system in which services are provided from service provider apparatus to service user apparatus via network
US7254705B2 (en) 2002-03-15 2007-08-07 Matsushita Electric Industrial Co., Ltd. Service providing system in which services are provided from service provider apparatus to service user apparatus via network
WO2004047445A1 (en) * 2002-11-15 2004-06-03 Koninklijke Philips Electronics N.V. Usage data harvesting
WO2004046964A2 (en) * 2002-11-15 2004-06-03 Koninklijke Philips Electronics N.V. Accessing on-line services
WO2004046964A3 (en) * 2002-11-15 2004-10-14 Koninkl Philips Electronics Nv Accessing on-line services
JP2006506883A (en) * 2002-11-15 2006-02-23 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Usage data collection method and apparatus
CN100393130C (en) * 2002-11-15 2008-06-04 皇家飞利浦电子股份有限公司 Usage data collection
WO2011107490A3 (en) * 2010-03-01 2015-09-03 Cvon Innovations Ltd User information and distribution system

Also Published As

Publication number Publication date
AU3083501A (en) 2001-06-04
WO2001039428A3 (en) 2002-02-07

Similar Documents

Publication Publication Date Title
US20050038699A1 (en) System and method for targeted advertising via commitment
Chiasson et al. HIV behavioral research online
Bouguettaya et al. Privacy on the Web: facts, challenges, and solutions
LaRose et al. Your privacy is assured-of being disturbed: websites with and without privacy seals
McFarlane et al. Internet-based health promotion and disease control in the 8 cities: Successes, barriers, and future plans
US7930252B2 (en) Method and system for sharing anonymous user information
Best et al. New approaches to assessing opinion: The prospects for electronic mail surveys
US20140372176A1 (en) Method and apparatus for anonymous data profiling
US20020120573A1 (en) Secure extranet operation with open access for qualified medical professional
Baer et al. Obtaining sensitive data through the Web: an example of design and methods
US20050076089A1 (en) Method and system for communication from anonymous sender(s) to known recipient(s) for feedback applications
JP2002533845A (en) Impact of Prescriptions for Consumers and Methods for Healthcare Professional Information
JP2002529839A (en) Remote doctor authentication service
EP0923825A1 (en) Method and system for establishing and maintaining user-controlled anonymous communications
Kulyk et al. Does my smart device provider care about my privacy? Investigating trust factors and user attitudes in IoT systems
US20080294559A1 (en) Transmission of Anonymous Information Through a Communication Network
Chon et al. Determinants of the intention to protect personal information among Facebook users
Stallworth Future imperfect: Googling for principles in online behavioral advertising
WO2001039428A2 (en) Method and system for protecting of user privacy
Scott Protecting Consumer Data While Allowing the Web to Develop Self-Sustaining Architecture: Is a trans-Atlantic browser-based opt-in for behavioral tracking the right solution
Watzlaf et al. VoIP for telerehabilitation: A pilot usability study for HIPAA compliance
Alharthi et al. Location privacy challenges in spatial crowdsourcing
McColgan et al. Internet poses multiple risks to children and adolescents
US20220343018A1 (en) Method for providing a privacy-enabled service to users
Yee et al. Privacy and trust in e-government

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase