WO2007002002A1 - Method and apparatus for grouping spam email messages - Google Patents

Method and apparatus for grouping spam email messages

Info

Publication number
WO2007002002A1
Authority
WO
WIPO (PCT)
Prior art keywords
email messages
probe
spam
messages
probe email
Prior art date
Application number
PCT/US2006/023847
Other languages
French (fr)
Inventor
Sanford Jensen
Original Assignee
Symantec Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Symantec Corporation filed Critical Symantec Corporation
Publication of WO2007002002A1 publication Critical patent/WO2007002002A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/21Monitoring or handling of messages
    • H04L51/212Monitoring or handling of messages using filtering or selective blocking

Definitions

  • the present invention relates to filtering electronic mail (email); more particularly, the present invention relates to creating filters to detect email spam.
  • spam has become a major problem for all Internet users. As the cost of processing power, email address acquisition and email software continue to fall, spam becomes increasingly cost-effective for spammers. Given the negligible cost involved in sending millions of unsolicited email messages, spammers need only capture a small response rate to make a profit. The growth trend of spam shows no sign of abating. According to recent statistics, spam currently accounts for over half of all email traffic in the U.S. This increase in both the volume and percentage of spam is not only worsening a resource drain for IT, it is also affecting how end users view email, which has become the primary form of communication in the enterprise.
  • a method and system for grouping spam email messages are described.
  • the method includes receiving probe email messages indicative of spam and modifying the probe email messages to reduce noise.
  • the method further includes comparing the probe email messages using fuzzy logic to identify similar email messages, and creating groups of similar email messages. Each of the created groups pertains to a distinct spam attack.
  • Figure 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail.
  • Figure 2 is a block diagram of one embodiment of a probe mail processing module.
  • Figure 3 is a block diagram of one embodiment of a grouping module.
  • Figure 4 is a flow diagram of one embodiment of a process for facilitating detection of spam email messages.
  • Figure 5 is a flow diagram of one embodiment of a process for grouping messages.
  • Figure 6 illustrates a process of grouping messages using fuzzy logic, according to one embodiment of the present invention.
  • Figure 7 is a flow diagram of one embodiment of a process for generating mathematical signatures of documents.
  • Figure 8 is a block diagram of an exemplary computer system.
  • the present invention also relates to apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium includes read only memory ("ROM"); random access memory ("RAM"); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
  • FIG. 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail (email).
  • the system includes a control center 102 coupled to a communications network 100 such as a public network (e.g., the Internet, a wireless network, etc.) or a private network (e.g., LAN, Intranet, etc.).
  • the control center 102 communicates with multiple network servers 104 via the network 100.
  • Each server 104 communicates with user terminals 106 using a private or public network.
  • the control center 102 is an anti-spam facility that is responsible for analyzing messages indicative of spam, developing filtering rules for detecting spam, and distributing the filtering rules to the servers 104.
  • a message may be indicative of spam because it was collected via a "probe network" 112.
  • the probe network is formed by fictitious probe email addresses specifically selected to make their way into as many spammer mailing lists as possible.
  • the fictitious probe email addresses may also be selected to appear high up on spammers' lists in order to receive spam mailings early in the mailing process (e.g., using the e-mail address "aardvark@aol.com" ensures relatively high placement on an alphabetical mailing list).
  • the fictitious probe email addresses may include, for example, decoy accounts and expired domains.
  • a certain percentage of assignable e-mail addresses offered by an ISP or private network may be reserved for use as probe email addresses.
  • the probe network 112 may also receive email identified as spam by users of terminals 106.
  • a server 104 may be a mail server that receives and stores messages addressed to users of corresponding user terminals.
  • a server 104 may be a different server (e.g., a gateway of an Internet Service Provider (ISP)) coupled to a mail server.
  • Servers 104 are responsible for filtering incoming messages based on the filtering rules received from the control center 102.
  • Servers 104 operate as clients receiving services of the control center 102.
  • control center 102 includes a probe mail processing module 108 that is responsible for identifying spam email messages resulting from distinct spam attacks, generating filters for the distinct spam attacks, and distributing the filters to the servers 104 for detection of spam email resulting from these spam attacks at the customer sites.
  • Each server 104 includes an email filtering module 110 that is responsible for storing filters received from the control center 102 and detecting spam email using these filters.
  • each server 104 hosts both the probe mail processing module 108 that generates spam filters and the email filtering module 110 that uses the generated filters to detect spam email.
  • FIG. 2 is a block diagram of one embodiment of a probe mail processing module 200.
  • the probe mail processing module 200 includes a probe email parser 202, a noise reduction algorithm 204, a grouping module 206, a filter generator 208 and a filter transmitter 210.
  • the probe email parser 202 is responsible for parsing the body of probe email messages.
  • the noise reduction algorithm 204 is responsible for detecting data indicative of noise and removing noise from probe email messages.
  • Noise represents data invisible to a recipient that was added to an email message to hide its spam nature.
  • Such data may include, for example, formatting data (e.g., HTML tags), numeric character references, character entity references, URL data of predefined categories, etc.
  • Numeric character references specify the code position of a character in the document character set.
  • Character entity references use symbolic names so that authors need not remember code positions. For example, the character entity reference "&aring” refers to the lowercase "a" character topped with a ring.
  • Predefined categories of URL data may include, for example, numerical character references contained in the URL and the URL "password” syntax data that adds characters before an "@" in the URL hostname.
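To illustrate, the reference-translation step can be sketched in a few lines of Python. The patent names no implementation; the standard-library function `html.unescape`, which handles both numeric character references and named character entity references, stands in for it here:

```python
import html

def translate_references(text):
    """Translate numeric character references (e.g. "&#97;") and
    character entity references (e.g. "&aring;") into the characters
    they name, so obfuscated text compares equal to plain text."""
    return html.unescape(text)

# A word obfuscated with numeric character references:
print(translate_references("v&#105;agr&#97;"))  # -> viagra
```

After this step, two messages that differ only in how their characters are encoded produce identical text for the comparison stage.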
  • the grouping module 206 is responsible for grouping probe email messages that are likely to result from distinct spam attacks.
  • the grouping module 206 compares probe email messages or their portions (e.g., message headers, message bodies (or portions of message body), message senders, or any combination of the above) to find similar probe email messages.
  • the comparison may be done using regular expressions or mathematical signatures of probe email messages.
  • Mathematical signatures of probe email messages may consist of checksums, hash values or some other data identifying the message content, and may be created using various algorithms that enable the use of similarity measures in comparing different email messages.
  • the filter generator 208 is responsible for generating filters for individual groups created by the grouping module 206.
  • a filter may include a mathematical signature of a probe email message, a regular expression characterizing a probe email message, one or more URLs extracted from a probe email message, or any other data characterizing probe email messages resulting from a spam attack.
  • the filter transmitter 210 is responsible for distributing filters to participating servers such as servers 104 of Figure 1.
  • each server 104 periodically (e.g., every 5 minutes) initiates a connection (e.g., a secure HTTPS connection) with the control center 102.
  • filters are transmitted from the control center 102 to the relevant server 104 for detection of spam email at relevant customer sites.
  • Figure 3 is a block diagram of one embodiment of a grouping module 300.
  • the grouping module 300 includes a feature extractor 302, a signature generator 304, a signature matching algorithm 306, and a group creator 308.
  • the feature extractor 302 is responsible for determining the size of probe email messages and expanding a probe email message if its size is below a threshold. In particular, if the feature extractor 302 determines that the size of an email message is below a threshold, it identifies a predefined feature in the probe email message and appends this feature to the probe email message to increase its size. The feature may be appended to the probe email message several times until the size of the probe email message reaches a threshold.
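The expansion logic of the feature extractor can be sketched as follows; the threshold value and the separator are illustrative assumptions, not values from the patent:

```python
def expand_message(body, feature, threshold=1024):
    """Append a characteristic feature (e.g. a URL) to a short message
    repeatedly until its size reaches the threshold, so that very short
    messages still yield signatures with enough material to compare."""
    while len(body) < threshold:
        body += " " + feature
    return body

expanded = expand_message("Click now!", "http://fake-domain.com", threshold=100)
```

Because the appended feature dominates the expanded text, two short messages sharing the same URL end up with closely matching signatures.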
  • the feature may be any data contained in the message that characterizes the content of the message or its sender or recipient. For example, the feature may be a URL, a telephone number, a keyword, a company name or a person's name.
  • the signature generator 304 is responsible for generating mathematical signatures for probe email messages.
  • the signature generator 304 may generate mathematical signatures using various algorithms that enable the use of similarity measures in comparing different email messages.
  • the signature generator 304 may generate mathematical signatures using the MD5 algorithm or a character-based algorithm that extracts from a message the most frequently occurring combinations of characters.
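A character-based signature of the kind described might look like the following sketch, which ranks character trigrams by frequency and keeps the most frequent ones as the signature. The parameters `n` and `top_k` and the Jaccard comparison are assumptions for illustration, not details from the patent:

```python
from collections import Counter

def char_signature(text, n=3, top_k=20):
    """Count character n-grams and keep the most frequent ones as the
    message's signature; similar messages share many frequent n-grams
    even after small spammer edits."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    # Rank by frequency, breaking ties alphabetically for determinism.
    ranked = sorted(grams.items(), key=lambda kv: (-kv[1], kv[0]))
    return frozenset(g for g, _ in ranked[:top_k])

def similarity(sig_a, sig_b):
    """Jaccard similarity of two signatures, in [0, 1]."""
    return len(sig_a & sig_b) / len(sig_a | sig_b)
```

Unlike an exact hash such as MD5, this kind of signature degrades gracefully: a one-character edit changes only a few n-grams, so near-duplicate spam still scores as similar.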
  • the signature matching algorithm 306 is responsible for comparing signatures of probe email messages to identify similar probe email messages.
  • the signature matching algorithm 306 operates using fuzzy logic to identify not only matching probe email messages but also probe email messages that are significantly similar and, therefore, are likely to have originated from the same spam attack.
  • the group creator 308 is responsible for grouping similar probe email messages into groups corresponding to distinct spam attacks.
  • the grouping module 300 operates in real time, efficiently grouping a large number of probe email messages (e.g., several million messages per day) to identify spam attacks at an early stage and allow for creation of filters that detect spam email messages resulting from these spam attacks at the customer sites.
  • FIG 4 is a flow diagram of one embodiment of a process 400 for facilitating detection of spam email messages.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a control center 102 of Figure 1.
  • process 400 begins with processing logic receiving probe email messages indicative of spam (processing block 402).
  • Probe email messages are indicative of spam because they are collected via a probe network.
  • processing logic modifies the probe email messages to reduce noise
  • processing logic modifies the probe email messages by removing formatting data, translating numeric character references and character entity references to their ASCII (American Standard Code for Information Interchange) equivalents, and modifying URL data of predefined categories.
  • formatting data is removed if it does not qualify as an exception.
  • HTML formatting does not add anything to the information content of a message.
  • exceptions are the tags that contain useful information for further processing of the message (e.g., the <BODY>, <A>, <IMG>, and <FONT> tags).
  • the <BODY> and <FONT> tags are needed for "white on white" text elimination, and the <A> and <IMG> tags typically contain link information that may be used for passing data to other components of the system.
  • processing logic modifies URL data of predefined categories by removing numerical character references contained in the URL, removing additional characters added before an "@" in the URL hostname, and removing the "query" part of the URL (the portion following the "?" character).
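These three URL modifications can be sketched as below; the ordering of the steps and the regular expression are assumptions, and a deployed system would likely be more careful:

```python
import html
import re

def clean_url(url):
    """Apply the URL noise-reduction rules described above: translate
    numeric character references, drop the "query" part following "?",
    and drop "password" syntax characters added before an "@" in the
    hostname."""
    url = html.unescape(url)               # "&#119;ww" -> "www"
    url = url.split("?", 1)[0]             # remove the query part
    url = re.sub(r"//[^/@]*@", "//", url)  # remove user:pass@ before host
    return url

print(clean_url("http://user:pass@&#119;ww.fake-domain.com/buy?track=123"))
# -> http://www.fake-domain.com/buy
```

After cleaning, URLs that a spammer has randomized with references, fake credentials, or tracking queries collapse to the same canonical form.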
  • processing logic generates mathematical signatures of the modified probe messages.
  • Mathematical signatures of probe email messages may consist of checksums, hash values or some other data identifying the message content, and may be created using various algorithms that enable the use of similarity measures in comparing different email messages.
  • a mathematical signature for an email message is created using a character-based mechanism that identifies the most frequently used character combinations in a document. This mechanism may use an ASCII character set based on the Roman alphabet or an extended character set that also includes non-ASCII characters to cover international email messages using non-Roman alphabets (e.g., Cyrillic alphabet, Greek alphabet, Arabic alphabet, etc.).
  • processing logic expands the size of the email message prior to generating a mathematical signature.
  • the size may be expanded by repeatedly appending to the email messages an important feature extracted from the email message.
  • Such a feature may be a URL, a telephone number, a keyword or a name contained in the email messages.
  • processing logic compares the mathematical signatures of probe email messages using fuzzy logic to identify similar probe email messages.
  • One embodiment of a process for finding similar email messages using fuzzy logic will be discussed in more detail below in conjunction with Figure 6.
  • processing logic creates groups of similar email messages. Each group contains email messages that have likely originated from a distinct spam attack.
  • processing logic creates a filter for each group of email messages.
  • Each filter is intended to detect spam resulting from a corresponding spam attack at the customer sites.
  • a filter may include a mathematical signature of a probe email message, a regular expression characterizing a probe email message, one or more URLs extracted from a probe email message, or any other data characterizing a probe email message.
  • processing logic distributes the filters to the clients for detection of spam at the customer sites.
  • FIG. 5 is a flow diagram of one embodiment of a process 500 for grouping messages.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a control center 102 of Figure 1.
  • process 500 begins with processing logic determining whether the size of a new message is below a threshold (processing block 502).
  • a new message may be an email message or any other message or document in electronic form.
  • a threshold may be a minimum message size that enables a meaningful comparison of similarities in messages. If the size of the new message is equal to, or exceeds, the threshold, processing logic proceeds to processing box 510. Otherwise, if the size of the new message is below the threshold, processing logic proceeds to processing box 504.
  • processing logic finds a predefined feature in the new message.
  • the predefined feature is a URL.
  • the predefined feature may refer to any other data that is known to be an important part of the message content.
  • the predefined feature may be a telephone number, a keyword, a company name, a name of a person, etc.
  • processing logic appends the predefined feature to the message.
  • processing logic extracts top-level information from each URL found in the message and adds the extracted data to the end of the message. For example, processing logic may extract from each URL a host name (e.g., www.google.com), ignoring randomized sub-domains, redirections to target URLs, randomized paths, etc.
  • processing logic determines whether the size of the expanded message reaches the threshold. If not, processing logic repeats the addition of the extracted feature to the message (processing block 506) until the message size reaches the threshold. When the message size reaches the threshold, processing logic proceeds to processing block 510, where it generates a mathematical signature of the new message.
  • One embodiment of a process for creating a mathematical signature of a message will be discussed in more detail below in conjunction with Figure 7.
  • processing logic compares the signature of the new message with signatures of other messages from the existing groups (processing block 512) and determines whether the signature of the new message is similar to any other signatures (processing block 514). If so, processing logic groups the signature of the new message with the similar signatures (processing block 518). If not, processing logic creates a new group for this signature that denotes a new spam attack (processing block 516).
  • Figure 6 illustrates a process of grouping messages using fuzzy logic, according to one embodiment of the present invention.
  • signature 1 is a mathematical signature of a probe email message initially received by the control center 102.
  • Group 1 contains signature 1 and corresponds to a first spam attack characterized by space 600.
  • Space 600 defines an allowable degree of similarity between signatures of group 1.
  • Group 2 contains signature 2 and corresponds to a second spam attack characterized by space 602.
  • Group 3 contains signature 3 and corresponds to a third spam attack characterized by space 604. Space 604 overlaps with spaces 600 and 602.
  • As signatures of newly received messages are compared with existing signatures, they are grouped depending on their similarities with the existing signatures. If a signature is similar to signatures from multiple groups, it is assigned to the oldest group (i.e., the group that was created earlier than the other group or groups). For example, signature 6 belongs both to space 600 and space 604. Because group 1 was created before group 3, signature 6 is assigned to group 1.
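The oldest-group assignment rule can be sketched as follows, with toy set-based signatures and a Jaccard measure standing in for the actual signature comparison (both are illustrative assumptions):

```python
def assign_to_group(signature, groups, similar, threshold=0.5):
    """Assign a signature to the oldest group containing a sufficiently
    similar signature; otherwise create a new group (a new spam attack).
    `groups` is ordered oldest-first; `similar` returns a value in [0, 1]."""
    for group in groups:  # the oldest group is checked first
        if any(similar(signature, s) >= threshold for s in group):
            group.append(signature)
            return group
    groups.append([signature])
    return groups[-1]

# Toy signatures as sets, compared with a Jaccard measure:
jaccard = lambda a, b: len(a & b) / len(a | b)
groups = []
assign_to_group({"x", "y", "z"}, groups, jaccard)  # creates group 1
assign_to_group({"p", "q", "r"}, groups, jaccard)  # creates group 2
assign_to_group({"x", "y", "q"}, groups, jaccard)  # joins group 1 (oldest)
```

Scanning groups oldest-first makes the tie-breaking rule fall out of the loop order: the first sufficiently similar group encountered is, by construction, the oldest one.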
  • FIG. 7 is a flow diagram of one embodiment of a process 700 for generating a mathematical signature of a document.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a control center 102 of Figure 1.
  • process 700 begins with processing logic preprocessing a document (processing block 702).
  • the document is preprocessed by identifying all URLs and email addresses within the document.
  • processing logic divides these URLs and email addresses into elements.
  • Each element is a combination of two or more portions of a relevant URL or email address.
  • an email message may contain the URL "http://www.fake-domain.com/place.html" and the email address "user@badguy.com”.
  • the URL may be divided into "http://www.fake-domain.com", "www.fake-domain.com", "fake-domain.com" and "place.html".
  • the email address may be divided into "user@badguy.com” and "badguy.com”.
  • processing logic creates a mathematical signature of the document by compiling the elements created at processing block 704 into a list.
  • the elements are listed in descending or ascending order.
  • the mathematical signature may include a list of elements resulting from the URL and the email address.
  • the mathematical signature of the document may be compared with other signatures. In one embodiment, the signature is considered sufficiently similar with some other signature if the two signatures include a certain number or percentage of the same list elements.
  • a signature created for a message containing the URL "http://www.fake-domain.com/otherplace.html" and the email address "otheruser@badguy.com" may be considered sufficiently similar to the signature of the above-referenced message that contains the URL "http://www.fake-domain.com/place.html" and the email address "user@badguy.com" because the two signatures include 4 common list elements (about 67 percent).
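The element decomposition and the 67-percent comparison above can be reproduced in a short sketch; the helper names and the use of `urllib.parse` are assumptions for illustration, not details from the patent:

```python
from urllib.parse import urlparse

def url_elements(url):
    """Divide a URL into elements: scheme plus host, the host itself,
    the host minus its leading label, and the final path component."""
    parsed = urlparse(url)
    host = parsed.netloc
    elements = {parsed.scheme + "://" + host, host}
    labels = host.split(".")
    if len(labels) > 2:
        elements.add(".".join(labels[1:]))  # e.g. "fake-domain.com"
    leaf = parsed.path.rsplit("/", 1)[-1]
    if leaf:
        elements.add(leaf)                  # e.g. "place.html"
    return elements

def email_elements(address):
    """Divide an email address into the full address and its domain."""
    return {address, address.split("@", 1)[1]}

def common_fraction(sig_a, sig_b):
    """Fraction of shared elements between two signatures."""
    return len(sig_a & sig_b) / max(len(sig_a), len(sig_b))

sig_a = url_elements("http://www.fake-domain.com/place.html") | email_elements("user@badguy.com")
sig_b = url_elements("http://www.fake-domain.com/otherplace.html") | email_elements("otheruser@badguy.com")
```

Run on the two example messages, each signature has six elements, four of which (the three host-derived elements plus "badguy.com") are shared, giving the roughly 67 percent overlap cited above.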
  • a signature can be created for the message body only, the message header only, a combination of the message body and header, a combination of the message body and some other data (e.g., recipient or sender information), or a combination of the message header and some other data (e.g., recipient or sender information).
  • FIG. 8 is a block diagram of an exemplary computer system 800 that may be used to perform one or more of the operations described herein.
  • the machine may comprise a network router, a network switch, a network bridge, a Personal Digital Assistant (PDA), a cellular telephone, a web appliance or any machine capable of executing a sequence of instructions that specify actions to be taken by that machine.
  • the computer system 800 includes a processor 802, a main memory 804 and a static memory 806, which communicate with each other via a bus 808.
  • the computer system 800 may further include a video display unit 810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 800 also includes an alpha-numeric input device 812 (e.g., a keyboard), a cursor control device 814 (e.g., a mouse), a disk drive unit 816, a signal generation device 820 (e.g., a speaker) and a network interface device 822.
  • the disk drive unit 816 includes a computer-readable medium 824 on which is stored a set of instructions (i.e., software) 826 embodying any one, or all, of the methodologies described above.
  • the software 826 is also shown to reside, completely or at least partially, within the main memory 804 and/or within the processor 802.
  • the software 826 may further be transmitted or received via the network interface device 822.
  • the term "computer-readable medium” shall be taken to include any medium that is capable of storing or encoding a sequence of instructions for execution by the computer and that cause the computer to perform any one of the methodologies of the present invention.
  • the term "computer-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic disks, and carrier wave signals.

Abstract

A method and system for grouping spam email messages are described. In one embodiment, the method includes receiving probe email messages indicative of spam and modifying the probe email messages to reduce noise. The method further includes comparing the probe email messages using fuzzy logic to identify similar email messages, and creating groups of similar email messages. Each of the created groups pertains to a distinct spam attack.

Description

METHOD AND APPARATUS FOR GROUPING SPAM EMAIL MESSAGES
FIELD OF THE INVENTION
[0001] The present invention relates to filtering electronic mail (email); more particularly, the present invention relates to creating filters to detect email spam.
BACKGROUND OF THE INVENTION
[0002] In recent years, spam has become a major problem for all Internet users. As the cost of processing power, email address acquisition and email software continue to fall, spam becomes increasingly cost-effective for spammers. Given the negligible cost involved in sending millions of unsolicited email messages, spammers need only capture a small response rate to make a profit. The growth trend of spam shows no sign of abating. According to recent statistics, spam currently accounts for over half of all email traffic in the U.S. This increase in both the volume and percentage of spam is not only worsening a resource drain for IT, it is also affecting how end users view email, which has become the primary form of communication in the enterprise.
[0003] Presently, there are products for filtering out unwanted email messages. However, these products typically fail to effectively compensate for the escalating volumes of spam.
SUMMARY OF THE INVENTION
[0004] A method and system for grouping spam email messages are described. According to one aspect, the method includes receiving probe email messages indicative of spam and modifying the probe email messages to reduce noise. The method further includes comparing the probe email messages using fuzzy logic to identify similar email messages, and creating groups of similar email messages. Each of the created groups pertains to a distinct spam attack.
[0005] Other features of the present invention will be apparent from the accompanying drawings and from the detailed description that follows.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which, however, should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
[0007] Figure 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail.
[0008] Figure 2 is a block diagram of one embodiment of a probe mail processing module.
[0009] Figure 3 is a block diagram of one embodiment of a grouping module.
[0010] Figure 4 is a flow diagram of one embodiment of a process for facilitating detection of spam email messages.
[0011] Figure 5 is a flow diagram of one embodiment of a process for grouping messages.
[0012] Figure 6 illustrates a process of grouping messages using fuzzy logic, according to one embodiment of the present invention.
[0013] Figure 7 is a flow diagram of one embodiment of a process for generating mathematical signatures of documents.
[0014] Figure 8 is a block diagram of an exemplary computer system.
DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0015] A method and apparatus for grouping spam email messages are described. In the following description, numerous details are set forth. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[0016] Some portions of the detailed descriptions which follow are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0017] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0018] The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
[0019] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0020] A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory ("ROM"); random access memory ("RAM"); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
[0021] Figure 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail (email). The system includes a control center 102 coupled to a communications network 100 such as a public network (e.g., the Internet, a wireless network, etc.) or a private network (e.g., LAN, Intranet, etc.). The control center 102 communicates with multiple network servers 104 via the network 100. Each server 104 communicates with user terminals 106 using a private or public network.
[0022] The control center 102 is an anti-spam facility that is responsible for analyzing messages indicative of spam, developing filtering rules for detecting spam, and distributing the filtering rules to the servers 104. A message may be indicative of spam because it was collected via a "probe network" 112. In one embodiment, the probe network is formed by fictitious probe email addresses specifically selected to make their way into as many spammer mailing lists as possible. The fictitious probe email addresses may also be selected to appear high up on spammers' lists in order to receive spam mailings early in the mailing process (e.g., using the e-mail address "aardvark@aol.com" ensures relatively high placement on an alphabetical mailing list). The fictitious probe email addresses may include, for example, decoy accounts and expired domains. In addition, a certain percentage of assignable e-mail addresses offered by an ISP or private network may be reserved for use as probe email addresses. The probe network 112 may also receive email identified as spam by users of terminals 106.
[0023] A server 104 may be a mail server that receives and stores messages addressed to users of corresponding user terminals. Alternatively, a server 104 may be a different server (e.g., a gateway of an Internet Service Provider (ISP)) coupled to a mail server. Servers 104 are responsible for filtering incoming messages based on the filtering rules received from the control center 102. Servers 104 operate as clients receiving services of the control center 102.
[0024] In one embodiment, the control center 102 includes a probe mail processing module 108 that is responsible for identifying spam email messages resulting from distinct spam attacks, generating filters for the distinct spam attacks, and distributing the filters to the servers 104 for detection of spam email resulting from these spam attacks at the customer sites.
[0025] Each server 104 includes an email filtering module 110 that is responsible for storing filters received from the control center 102 and detecting spam email using these filters.
[0026] In an alternative embodiment, each server 104 hosts both the probe mail processing module 108 that generates spam filters and the email filtering module 110 that uses the generated filters to detect spam email.
[0027] Figure 2 is a block diagram of one embodiment of a probe mail processing module 200. The probe mail processing module 200 includes a probe email parser 202, a noise reduction algorithm 204, a grouping module 206, a filter generator 208 and a filter transmitter 210.
[0028] The probe email parser 202 is responsible for parsing the body of probe email messages.
[0029] The noise reduction algorithm 204 is responsible for detecting data indicative of noise and removing noise from probe email messages. Noise represents data invisible to a recipient that was added to an email message to hide its spam nature. Such data may include, for example, formatting data (e.g., HTML tags), numeric character references, character entity references, URL data of predefined categories, etc. Numeric character references specify the code position of a character in the document character set. Character entity references use symbolic names so that authors need not remember code positions. For example, the character entity reference "&aring;" refers to the lowercase "a" character topped with a ring. Predefined categories of URL data may include, for example, numerical character references contained in the URL and the URL "password" syntax data that adds characters before an "@" in the URL hostname.
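For illustration, the noise-reduction step described above can be sketched in Python. The particular regular expressions, the use of the standard-library `html.unescape`, and the function name `reduce_noise` are assumptions made for this sketch, not the patented implementation:

```python
import html
import re

def reduce_noise(body: str) -> str:
    """Remove data invisible to the recipient that hides a message's
    spam nature: formatting tags, character references, and URL
    "password" syntax."""
    # Remove HTML formatting tags, which are invisible to the recipient.
    body = re.sub(r"<[^>]+>", "", body)
    # Translate numeric character references (e.g., &#105;) and character
    # entity references (e.g., &aring;) to the characters they encode.
    body = html.unescape(body)
    # Strip URL "password" syntax: characters added before an "@" in a
    # URL hostname (e.g., http://junk@spam.example -> http://spam.example).
    return re.sub(r"(https?://)[^/@\s]+@", r"\1", body)
```

After this step, two spam messages that differed only in injected noise compare as equal text.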
[0030] The grouping module 206 is responsible for grouping probe email messages that are likely to result from distinct spam attacks. The grouping module 206 compares probe email messages or their portions (e.g., message headers, message bodies (or portions of message body), message senders, or any combination of the above) to find similar probe email messages. The comparison may be done using regular expressions or mathematical signatures of probe email messages. Mathematical signatures of probe email messages may consist of checksums, hash values or some other data identifying the message content, and may be created using various algorithms that enable the use of similarity measures in comparing different email messages.
[0031] The filter generator 208 is responsible for generating filters for individual groups created by the grouping module 206. A filter may include a mathematical signature of a probe email message, a regular expression characterizing a probe email message, one or more URLs extracted from a probe email message, or any other data characterizing probe email messages resulting from a spam attack.
[0032] The filter transmitter 210 is responsible for distributing filters to participating servers such as servers 104 of Figure 1. In one embodiment, each server 104 periodically (e.g., every 5 minutes) initiates a connection (e.g., a secure HTTPS connection) with the control center 102. Using this pull-based connection, filters are transmitted from the control center 102 to the relevant server 104 for detection of spam email at relevant customer sites.
[0033] Figure 3 is a block diagram of one embodiment of a grouping module 300. The grouping module 300 includes a feature extractor 302, a signature generator 304, a signature matching algorithm 306, and a group creator 308.
[0034] The feature extractor 302 is responsible for determining the size of probe email messages and expanding a probe email message if its size is below a threshold. In particular, if the feature extractor 302 determines that the size of an email message is below a threshold, it identifies a predefined feature in the probe email message and appends this feature to the probe email message to increase its size. The feature may be appended to the probe email message several times until the size of the probe email message reaches a threshold. The feature may be any data contained in the message that characterizes the content of the message or its sender or recipient. For example, the feature may be a URL, a telephone number, a keyword, a company name or a person's name.
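The expansion behavior of the feature extractor 302 can be sketched as follows; the 512-byte threshold, the space separator, and the function name are illustrative assumptions:

```python
def expand_message(body: str, feature: str, threshold: int = 512) -> str:
    """Repeatedly append an extracted feature (e.g., a URL, telephone
    number, keyword, or name) to a small message until its size reaches
    the threshold needed for meaningful comparison."""
    if not feature:
        return body  # nothing to append; leave the message unchanged
    while len(body) < threshold:
        body += " " + feature
    return body
```

A message already at or above the threshold passes through unchanged.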
[0035] The signature generator 304 is responsible for generating mathematical signatures for probe email messages. The signature generator 304 may generate mathematical signatures using various algorithms that enable the use of similarity measures in comparing different email messages. For example, the signature generator 304 may generate mathematical signatures using the MD5 algorithm or a character-based algorithm that extracts from a message the most frequently occurring combinations of characters.
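Both kinds of signature mentioned above can be sketched in Python. The MD5 variant uses the standard `hashlib` module; the character-based variant here keeps the most frequent character trigrams, with the n-gram length and list size chosen arbitrarily for illustration:

```python
import hashlib
from collections import Counter

def md5_signature(body: str) -> str:
    """Exact-match signature: any change to the body changes the hash."""
    return hashlib.md5(body.encode("utf-8")).hexdigest()

def char_signature(body: str, n: int = 3, top: int = 20) -> frozenset:
    """Fuzzy signature: the most frequently occurring character
    n-grams, usable with overlap-based similarity measures."""
    grams = Counter(body[i:i + n] for i in range(len(body) - n + 1))
    return frozenset(g for g, _ in grams.most_common(top))
```

The MD5 form suits duplicate detection; the character form tolerates the small per-message mutations typical of a spam attack.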
[0036] The signature matching algorithm 306 is responsible for comparing signatures of probe email messages to identify similar probe email messages. In one embodiment, the signature matching algorithm 306 uses fuzzy logic to identify not only matching probe email messages but also probe email messages that are significantly similar and, therefore, likely originated from the same spam attack.
[0037] The group creator 308 is responsible for grouping similar probe email messages into groups corresponding to distinct spam attacks.
[0038] The grouping module 300 operates in real time, efficiently grouping a large number of probe email messages (e.g., several million messages per day) to identify spam attacks at an early stage and allow for creation of filters that detect spam email messages resulting from these spam attacks at the customer sites.
[0039] Figure 4 is a flow diagram of one embodiment of a process 400 for facilitating detection of spam email messages. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a control center 102 of Figure 1.
[0040] Referring to Figure 4, process 400 begins with processing logic receiving probe email messages indicative of spam (processing block 402). Probe email messages are indicative of spam because they are collected via a probe network.
[0041] At processing block 404, processing logic modifies the probe email messages to reduce noise. In one embodiment, processing logic modifies the probe email messages by removing formatting data, translating numeric character references and character entity references to their ASCII (American Standard Code for Information Interchange) equivalents, and modifying URL data of predefined categories. In one embodiment, formatting data is removed unless it qualifies as an exception. Typically, HTML formatting adds nothing to the information content of a message. However, a few exceptions exist. These exceptions are the tags that contain useful information for further processing of the message (e.g., the tags <BODY>, <A>, <IMG>, and <FONT>). For example, the <FONT> and <BODY> tags are needed for "white on white" text elimination, and the <A> and <IMG> tags typically contain link information that may be used for passing data to other components of the system.
[0042] In one embodiment, processing logic modifies URL data of predefined categories by removing numerical character references contained in the URL, removing additional characters added before an "@" in the URL hostname, and removing the "query" part of the URL, following a string "?" at the end of the URL.
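A sketch of these three URL modifications in Python; the regular expressions and the function name are illustrative assumptions, and the actual predefined categories may differ:

```python
import re

def normalize_url(url: str) -> str:
    """Apply the three URL modifications described above."""
    # Remove numeric character references contained in the URL.
    url = re.sub(r"&#\d+;", "", url)
    # Remove additional characters added before an "@" in the URL
    # hostname (the "password" syntax).
    url = re.sub(r"^(\w+://)[^/@]*@", r"\1", url)
    # Remove the "query" part of the URL, following a "?" at its end.
    return url.split("?", 1)[0]
```

Normalized this way, the many randomized variants of one spam URL collapse to a single comparable form.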
[0043] At processing block 406, processing logic generates mathematical signatures of the modified probe messages. Mathematical signatures of probe email messages may consist of checksums, hash values or some other data identifying the message content, and may be created using various algorithms that enable the use of similarity measures in comparing different email messages. In one embodiment, a mathematical signature for an email message is created using a character-based mechanism that identifies the most frequently used character combinations in a document. This mechanism may use an ASCII character set based on the Roman alphabet or an extended character set that also includes non-ASCII characters to cover international email messages using non-Roman alphabets (e.g., Cyrillic alphabet, Greek alphabet, Arabic alphabet, etc.). One embodiment of a process for creating a mathematical signature of a document will be discussed in more detail below in conjunction with Figure 7.
[0044] In one embodiment, if a probe email message is small (e.g., if the size of the message is below a threshold), processing logic expands the size of the email message prior to generating a mathematical signature. The size may be expanded by repeatedly appending to the email message an important feature extracted from it. Such a feature may be a URL, a telephone number, a keyword or a name contained in the email message. One embodiment of a process for grouping small messages will be discussed in more detail below in conjunction with Figure 5.
[0045] At processing block 408, processing logic compares the mathematical signatures of probe email messages using fuzzy logic to identify similar probe email messages. One embodiment of a process for finding similar email messages will be discussed in more detail below in conjunction with Figure 6.
[0046] At processing block 410, processing logic creates groups of similar email messages. Each group contains email messages that likely originated from a distinct spam attack.
[0047] At processing block 412, processing logic creates a filter for each group of email messages. Each filter is intended to detect spam resulting from a corresponding spam attack at the customer sites. A filter may include a mathematical signature of a probe email message, a regular expression characterizing a probe email message, one or more URLs extracted from a probe email message, or any other data characterizing a probe email message.
[0048] At processing block 414, processing logic distributes the filters to the clients for detection of spam at the customer sites.
[0049] Figure 5 is a flow diagram of one embodiment of a process 500 for grouping messages. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a control center 102 of Figure 1. [0050] Referring to Figure 5, process 500 begins with processing logic determining whether the size of a new message is below a threshold (processing block 502). A new message may be an email message or any other message or document in electronic form. A threshold may be a minimum message size that enables a meaningful comparison of similarities in messages. If the size of the new message is equal to, or exceeds, the threshold, processing logic proceeds to processing box 510. Otherwise, if the size of the new message is below the threshold, processing logic proceeds to processing box 504.
[0051] At processing box 504, processing logic finds a predefined feature in the new message. In one embodiment, the predefined feature is a URL. Alternatively, the predefined feature may refer to any other data that is known to be an important part of the message content. For example, the predefined feature may be a telephone number, a keyword, a company name, a name of a person, etc.
[0052] At processing block 506, processing logic appends the predefined feature to the message. In one embodiment, in which the predefined feature is a URL, processing logic extracts top-level information from each URL found in the message and adds the extracted data to the end of the message. For example, processing logic may extract from each URL a host name (e.g., www.google.com), ignoring randomized sub-domains, redirections to target URLs, randomized paths, etc.
[0053] At processing block 508, processing logic determines whether the size of the expanded message reaches the threshold. If not, processing logic repeats the addition of the extracted feature to the message (processing block 506) until the message size reaches the threshold. When the message size reaches the threshold, processing logic proceeds to processing block 510. [0054] At processing block 510, processing logic generates a mathematical signature of the new message. One embodiment of a process for creating a mathematical signature of a message will be discussed in more detail below in conjunction with Figure 7.
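Processing blocks 504 through 508 can be sketched as follows. Keeping only the last two host-name labels is a simplification standing in for "ignoring randomized sub-domains"; a production system would consult a public-suffix list. All names and the 256-byte threshold are assumptions for this sketch:

```python
import re
from urllib.parse import urlparse

def extract_hostnames(body: str) -> list:
    """Pull top-level host names out of each URL in the message,
    ignoring randomized sub-domains, paths, and query strings."""
    hosts = []
    for url in re.findall(r"https?://\S+", body):
        host = urlparse(url).hostname or ""
        # Collapse randomized sub-domains by keeping the last two labels.
        hosts.append(".".join(host.split(".")[-2:]))
    return hosts

def expand_to_threshold(body: str, threshold: int = 256) -> str:
    """Append extracted host names until the message size reaches the
    threshold that enables meaningful comparison."""
    hosts = extract_hostnames(body)
    while hosts and len(body) < threshold:
        body += " " + " ".join(hosts)
    return body
```

The expanded message then proceeds to signature generation at block 510.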
[0055] Next, processing logic compares the signature of the new message with signatures of other messages from the existing groups (processing block 512) and determines whether the signature of the new message is similar to any other signatures (processing block 514). If so, processing logic groups the signature of the new message with the similar signatures (processing block 518). If not, processing logic creates a new group for this signature that denotes a new spam attack (processing block 516).
[0056] Figure 6 illustrates a process of grouping messages using fuzzy logic, according to one embodiment of the present invention.
[0057] Referring to Figure 6, signature 1 is a mathematical signature of a probe email message initially received by the control center 102. Group 1 contains signature 1 and corresponds to a first spam attack characterized by space 600. Space 600 defines an allowable degree of similarity between signatures of group 1.
[0058] Group 2 contains signature 2 and corresponds to a second spam attack characterized by space 602.
[0059] Group 3 contains signature 3 and corresponds to a third spam attack characterized by space 604. Space 604 overlaps with spaces 600 and 602.
[0060] Subsequently, when signatures of newly received messages are compared with existing signatures, they are grouped depending on their similarity to the existing signatures. If a signature is similar to signatures from multiple groups, it is assigned to the oldest group (i.e., the group that was created earlier than the other group or groups). For example, signature 6 belongs to both space 600 and space 604. Because group 1 was created before group 3, signature 6 is assigned to group 1.
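The oldest-group-wins assignment rule can be sketched as follows, assuming `groups` is kept in creation order and `similar` is the fuzzy comparison predicate; both names are illustrative:

```python
def assign_to_group(signature, groups, similar):
    """Assign a signature to the oldest group containing a similar
    signature; create a new group (a new spam attack) if none matches.

    `groups` is a list of signature lists ordered oldest-first;
    `similar` is a two-argument fuzzy comparison predicate."""
    for group in groups:  # iteration order = creation order, oldest first
        if any(similar(signature, s) for s in group):
            group.append(signature)
            return groups
    groups.append([signature])  # no match: a new spam attack begins
    return groups
```

Because iteration starts at the oldest group, a signature falling in overlapping spaces (like signature 6 above) lands in the earlier-created group.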
[0061] Figure 7 is a flow diagram of one embodiment of a process 700 for generating a mathematical signature of a document. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a control center 102 of Figure 1.
[0062] Referring to Figure 7, process 700 begins with processing logic pre-processing a document (processing block 702). In one embodiment, the document is pre-processed by identifying all URLs and email addresses within the document.
[0063] At processing block 704, processing logic divides these URLs and email addresses into elements. Each element is a portion, or a combination of portions, of a relevant URL or email address. For example, an email message may contain the URL "http://www.fake-domain.com/place.html" and the email address "user@badguy.com". The URL may be divided into "http://www.fake-domain.com", "www.fake-domain.com", "fake-domain.com" and "place.html". The email address may be divided into "user@badguy.com" and "badguy.com".
[0064] At processing block 706, processing logic creates a mathematical signature of the document by compiling the elements created at processing block 704 into a list. In one embodiment, the elements are listed in descending or ascending order. In the example above, the mathematical signature may include a list of elements resulting from the URL and the email address. [0065] Subsequently, the mathematical signature of the document may be compared with other signatures. In one embodiment, two signatures are considered sufficiently similar if they include a certain number or percentage of the same list elements. For example, a signature created for a message containing the URL "http://www.fake-domain.com/otherplace.html" and the email address "otheruser@badguy.com" may be considered sufficiently similar to the signature of the message referenced above, which contains the URL "http://www.fake-domain.com/place.html" and the email address "user@badguy.com", because the two signatures include 4 common list elements (about 67 percent).
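The element-list signature and the overlap test of paragraphs [0063]-[0065] can be sketched as follows; the element-splitting rules and the 50-percent default ratio are assumptions reconstructed from the worked example above:

```python
def url_elements(url: str) -> list:
    """Divide a URL into elements, following the example above:
    "http://www.fake-domain.com/place.html" yields
    "http://www.fake-domain.com", "www.fake-domain.com",
    "fake-domain.com" and "place.html"."""
    scheme, _, rest = url.partition("://")
    host, _, path = rest.partition("/")
    elems = [scheme + "://" + host, host]
    if "." in host:
        elems.append(host.split(".", 1)[1])  # host minus leading label
    if path:
        elems.append(path.rsplit("/", 1)[-1])  # page name
    return elems

def email_elements(addr: str) -> list:
    """Divide an email address into its elements."""
    return [addr, addr.split("@", 1)[-1]]

def similar(sig_a: set, sig_b: set, ratio: float = 0.5) -> bool:
    """Two signatures are similar if they share enough list elements."""
    return len(sig_a & sig_b) >= ratio * min(len(sig_a), len(sig_b))
```

Run on the two example messages, the signatures share the 4 elements noted above, so `similar` reports a match.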
[0066] It should be noted that a signature can be created for the message body only, the message header only, a combination of the message body and header, a combination of the message body and some other data (e.g., recipient or sender information), or a combination of the message header and some other data (e.g., recipient or sender information).
[0067] It will be understood by one of ordinary skill in the art that various techniques other than those described above may be used by embodiments of the present invention to generate mathematical signatures of email messages.
An Exemplary Computer System
[0068] Figure 8 is a block diagram of an exemplary computer system 800 that may be used to perform one or more of the operations described herein. In alternative embodiments, the machine may comprise a network router, a network switch, a network bridge, a Personal Digital Assistant (PDA), a cellular telephone, a web appliance or any machine capable of executing a sequence of instructions that specify actions to be taken by that machine.
[0069] The computer system 800 includes a processor 802, a main memory 804 and a static memory 806, which communicate with each other via a bus 808. The computer system 800 may further include a video display unit 810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 800 also includes an alpha-numeric input device 812 (e.g., a keyboard), a cursor control device 814 (e.g., a mouse), a disk drive unit 816, a signal generation device 820 (e.g., a speaker) and a network interface device 822.
[0070] The disk drive unit 816 includes a computer-readable medium 824 on which is stored a set of instructions (i.e., software) 826 embodying any one, or all, of the methodologies described above. The software 826 is also shown to reside, completely or at least partially, within the main memory 804 and/or within the processor 802. The software 826 may further be transmitted or received via the network interface device 822. For the purposes of this specification, the term "computer-readable medium" shall be taken to include any medium that is capable of storing or encoding a sequence of instructions for execution by the computer and that cause the computer to perform any one of the methodologies of the present invention. The term "computer-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic disks, and carrier wave signals.
[0071] Whereas many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description, it is to be understood that any particular embodiment shown and described by way of illustration is in no way intended to be considered limiting. Therefore, references to details of various embodiments are not intended to limit the scope of the claims which in themselves recite only those features regarded as essential to the invention.

Claims

We claim:
1. A method comprising: receiving a plurality of probe email messages indicative of spam; modifying the plurality of probe email messages to reduce noise; comparing the plurality of probe email messages using fuzzy logic to identify similar email messages; and creating groups of similar email messages, each of the groups pertaining to a distinct spam attack.
2. The method of claim 1 further comprising: creating a filter for each of the groups; and distributing resulting filters to a plurality of clients for detection of spam email messages at the plurality of clients.
3. The method of claim 1 wherein the plurality of probe email messages are collected using a plurality of fictitious probe email addresses selected to appear on spam email mailing lists.
4. The method of claim 1 wherein modifying the plurality of probe email messages to reduce noise comprises removing data indicative of noise from the plurality of probe email messages, the data indicative of noise being selected from the group consisting of formatting data, numeric character references, character entity references, and uniform resource locator (URL) data of predefined categories.
5. The method of claim 1 wherein comparing the plurality of probe email messages comprises: generating a mathematical signature of each of the plurality of probe email messages; and finding similar mathematical signatures.
6. The method of claim 5 further comprising: determining that a size of one of the plurality of probe email messages is below a threshold; finding a predefined feature in the one of the plurality of probe email messages; appending the predefined feature to the one of the plurality of probe email messages; and if the size of the one of the plurality of probe email messages remains below the threshold, continuing to append the predefined feature to the one of the plurality of probe email messages until the size of one of the plurality of probe email messages reaches the threshold.
7. The method of claim 6 wherein the predefined feature is any one of a uniform resource locator (URL), a phone number, a keyword, and a name.
8. A method comprising: determining that a size of a new message indicative of spam is below a threshold; finding a predefined feature in the new message; appending the predefined feature to the new message until the size of the new message reaches the threshold; determining whether the new message is similar to a message from any of existing message groups; and if the new message is different from messages in the existing message groups, creating a new message group denoting a new spam attack.
9. The method of claim 8 wherein the new message is a probe email message collected using one of a plurality of fictitious probe email addresses selected to appear on spam email mailing lists.
10. The method of claim 8 further comprising: removing data indicative of noise from the new message prior to determining that the size of the new message is below the threshold, the data indicative of noise being selected from the group consisting of formatting data, numeric character references, character entity references, and uniform resource locator (URL) data of predefined categories.
11. The method of claim 8 wherein determining whether the new message is similar to messages from any of the existing message groups comprises: generating a mathematical signature of the new message; and determining whether the mathematical signature of the new message is similar to mathematical signatures of messages from any of the existing groups.
12. The method of claim 8 wherein the predefined feature is any one of a uniform resource locator (URL), a phone number, a keyword, and a name.
13. A system comprising: a probe email parser to receive a plurality of probe email messages indicative of spam; a noise reduction algorithm to modify the plurality of probe email messages to reduce noise; and a grouping module to compare the plurality of probe email messages using fuzzy logic to identify similar email messages, and to create groups of similar email messages, each of the groups pertaining to a distinct spam attack.
14. The system of claim 13 wherein the grouping module comprises a signature generator to generate a mathematical signature of each of the plurality of probe email messages, and a signature matching algorithm to find similar mathematical signatures.
15. The system of claim 14 wherein the grouping module further comprises a feature extractor to determine that a size of one of the plurality of probe email messages is below a threshold, to find a predefined feature in the one of the plurality of probe email messages, to append the predefined feature to the one of the plurality of probe email messages, and to continue appending the predefined feature to the one of the plurality of probe email messages until the size of one of the plurality of probe email messages reaches the threshold.
16. The system of claim 15 wherein the predefined feature is any one of a uniform resource locator (URL), a phone number, a keyword, and a name.
17. The system of claim 13 further comprising: a filter generator to create a filter for each of the groups; and a filter transmitter to distribute resulting filters to a plurality of clients for detection of spam email messages at the plurality of clients.
18. The system of claim 13 wherein the plurality of probe email messages are collected using a plurality of fictitious probe email addresses selected to appear on spam email mailing lists.
19. An apparatus comprising: means for receiving a plurality of probe email messages indicative of spam; means for modifying the plurality of probe email messages to reduce noise; means for comparing the plurality of probe email messages using fuzzy logic to identify similar email messages; and means for creating groups of similar email messages, each of the groups pertaining to a distinct spam attack.
20. A computer readable medium comprising executable instructions which when executed on a processing system cause said processing system to perform a method comprising: receiving a plurality of probe email messages indicative of spam; modifying the plurality of probe email messages to reduce noise; comparing the plurality of probe email messages using fuzzy logic to identify similar email messages; and creating groups of similar email messages, each of the groups pertaining to a distinct spam attack.
PCT/US2006/023847 2005-06-20 2006-06-20 Method and apparatus for grouping spam email messages WO2007002002A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/157,327 2005-06-20
US11/157,327 US7739337B1 (en) 2005-06-20 2005-06-20 Method and apparatus for grouping spam email messages

Publications (1)

Publication Number Publication Date
WO2007002002A1 true WO2007002002A1 (en) 2007-01-04

Family

ID=37025100

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/023847 WO2007002002A1 (en) 2005-06-20 2006-06-20 Method and apparatus for grouping spam email messages

Country Status (2)

Country Link
US (1) US7739337B1 (en)
WO (1) WO2007002002A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8464341B2 (en) * 2008-07-22 2013-06-11 Microsoft Corporation Detecting machines compromised with malware
CN107294834A (en) * 2016-03-31 2017-10-24 阿里巴巴集团控股有限公司 A kind of method and apparatus for recognizing spam

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7832011B2 (en) * 2002-08-30 2010-11-09 Symantec Corporation Method and apparatus for detecting malicious code in an information handling system
US7331062B2 (en) 2002-08-30 2008-02-12 Symantec Corporation Method, computer software, and system for providing end to end security protection of an online transaction
US20050108340A1 (en) * 2003-05-15 2005-05-19 Matt Gleeson Method and apparatus for filtering email spam based on similarity measures
US20100042687A1 (en) * 2008-08-12 2010-02-18 Yahoo! Inc. System and method for combating phishing
US8621614B2 (en) * 2009-05-26 2013-12-31 Microsoft Corporation Managing potentially phishing messages in a non-web mail client context
US9098333B1 (en) 2010-05-07 2015-08-04 Ziften Technologies, Inc. Monitoring computer process resource usage
US8966620B2 (en) * 2010-05-27 2015-02-24 Microsoft Technology Licensing, Llc Campaign detection
US8990316B1 (en) * 2010-11-05 2015-03-24 Amazon Technologies, Inc. Identifying message deliverability problems using grouped message characteristics
US9559868B2 (en) * 2011-04-01 2017-01-31 Onavo Mobile Ltd. Apparatus and methods for bandwidth saving and on-demand data delivery for a mobile device
US8873813B2 (en) 2012-09-17 2014-10-28 Z Advanced Computing, Inc. Application of Z-webs and Z-factors to analytics, search engine, learning, recognition, natural language, and other utilities
US11195057B2 (en) 2014-03-18 2021-12-07 Z Advanced Computing, Inc. System and method for extremely efficient image and pattern recognition and artificial intelligence platform
US9916538B2 (en) 2012-09-15 2018-03-13 Z Advanced Computing, Inc. Method and system for feature detection
US11074495B2 (en) 2013-02-28 2021-07-27 Z Advanced Computing, Inc. (Zac) System and method for extremely efficient image and pattern recognition and artificial intelligence platform
US11914674B2 (en) 2011-09-24 2024-02-27 Z Advanced Computing, Inc. System and method for extremely efficient image and pattern recognition and artificial intelligence platform
US8311973B1 (en) 2011-09-24 2012-11-13 Zadeh Lotfi A Methods and systems for applications for Z-numbers
CN103312585B (en) * 2012-03-08 2016-12-28 中兴通讯股份有限公司 Spam message processing method and system
US10187339B2 (en) * 2014-06-26 2019-01-22 MailWise Email Solutions Ltd. Email message grouping
US10657182B2 (en) 2016-09-20 2020-05-19 International Business Machines Corporation Similar email spam detection
US10447635B2 (en) 2017-05-17 2019-10-15 Slice Technologies, Inc. Filtering electronic messages
US11803883B2 (en) 2018-01-29 2023-10-31 Nielsen Consumer Llc Quality assurance for labeled training data
US10896290B2 (en) * 2018-09-06 2021-01-19 Infocredit Services Private Limited Automated pattern template generation system using bulk text messages
RU2710739C1 (en) * 2019-03-29 2020-01-10 Акционерное общество "Лаборатория Касперского" System and method of generating heuristic rules for detecting messages containing spam

Family Cites Families (124)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5121345A (en) 1988-11-03 1992-06-09 Lentz Stephen A System and method for protecting integrity of computer data and software
CA1321656C (en) 1988-12-22 1993-08-24 Chander Kasiraj Method for restricting delivery and receipt of electronic message
GB8918553D0 (en) 1989-08-15 1989-09-27 Digital Equipment Int Message control system
JPH03117940A (en) 1989-09-25 1991-05-20 Internatl Business Mach Corp <Ibm> Method of managing electronic mail
US5822527A (en) 1990-05-04 1998-10-13 Digital Equipment Corporation Method and apparatus for information stream filtration using tagged information access and action registration
US5734826A (en) * 1991-03-29 1998-03-31 International Business Machines Corporation Variable cyclic redundancy coding method and apparatus for use in a multistage network
GB2271002B (en) 1992-09-26 1995-12-06 Digital Equipment Int Data processing system
US5634005A (en) 1992-11-09 1997-05-27 Kabushiki Kaisha Toshiba System for automatically sending mail message by storing rule according to the language specification of the message including processing condition and processing content
US5440723A (en) 1993-01-19 1995-08-08 International Business Machines Corporation Automatic immune system for computers and computer networks
TW237588B (en) 1993-06-07 1995-01-01 Microsoft Corp
JP3220886B2 (en) 1993-06-23 2001-10-22 株式会社日立製作所 Document search method and apparatus
JP2837815B2 (en) 1994-02-03 1998-12-16 インターナショナル・ビジネス・マシーンズ・コーポレイション Interactive rule-based computer system
US5675507A (en) 1995-04-28 1997-10-07 Bobo, Ii; Charles R. Message storage and delivery system
US5537540A (en) 1994-09-30 1996-07-16 Compaq Computer Corporation Transparent, secure computer virus detection method and apparatus
US5758257A (en) 1994-11-29 1998-05-26 Herz; Frederick System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US5619648A (en) 1994-11-30 1997-04-08 Lucent Technologies Inc. Message filtering techniques
US5649182A (en) 1995-03-17 1997-07-15 Reitz; Carl A. Apparatus and method for organizing timeline data
WO1996035994A1 (en) 1995-05-08 1996-11-14 Compuserve Incorporated Rules based electronic message management system
US5678041A (en) 1995-06-06 1997-10-14 At&T System and method for restricting user access rights on the internet based on rating information stored in a relational database
US5696898A (en) 1995-06-06 1997-12-09 Lucent Technologies Inc. System and method for database access control
US5845263A (en) 1995-06-16 1998-12-01 High Technology Solutions, Inc. Interactive visual ordering system
US5826269A (en) 1995-06-21 1998-10-20 Microsoft Corporation Electronic mail interface for a network server
GB2303947A (en) 1995-07-31 1997-03-05 Ibm Boot sector virus protection in computer systems
US5889943A (en) 1995-09-26 1999-03-30 Trend Micro Incorporated Apparatus and method for electronic mail virus detection and elimination
US6189030B1 (en) 1996-02-21 2001-02-13 Infoseek Corporation Method and apparatus for redirection of server external hyper-link references
US5751956A (en) 1996-02-21 1998-05-12 Infoseek Corporation Method and apparatus for redirection of server external hyper-link references
US5862325A (en) 1996-02-29 1999-01-19 Intermind Corporation Computer-based communication system and method using metadata defining a control structure
US5826022A (en) 1996-04-05 1998-10-20 Sun Microsystems, Inc. Method and apparatus for receiving electronic mail
US5870548A (en) 1996-04-05 1999-02-09 Sun Microsystems, Inc. Method and apparatus for altering sent electronic mail messages
US5809242A (en) 1996-04-19 1998-09-15 Juno Online Services, L.P. Electronic mail system for displaying advertisement at local computer received from remote system while the local computer is off-line the remote system
US5884033A (en) 1996-05-15 1999-03-16 Spyglass, Inc. Internet filtering system for filtering data transferred over the internet utilizing immediate and deferred filtering actions
US5864684A (en) 1996-05-22 1999-01-26 Sun Microsystems, Inc. Method and apparatus for managing subscriptions to distribution lists
US5905863A (en) 1996-06-07 1999-05-18 At&T Corp Finding an e-mail message to which another e-mail message is a response
US5790789A (en) 1996-08-02 1998-08-04 Suarez; Larry Method and architecture for the creation, control and deployment of services within a distributed computer environment
US5978837A (en) 1996-09-27 1999-11-02 At&T Corp. Intelligent pager for remotely managing E-Mail messages
US5930479A (en) 1996-10-21 1999-07-27 At&T Corp Communications addressing system
US5796948A (en) 1996-11-12 1998-08-18 Cohen; Elliot D. Offensive message interceptor for computers
US6146026A (en) 1996-12-27 2000-11-14 Canon Kabushiki Kaisha System and apparatus for selectively publishing electronic-mail
US6173364B1 (en) 1997-01-15 2001-01-09 At&T Corp. Session cache and rule caching method for a dynamic filter
US5995597A (en) 1997-01-21 1999-11-30 Woltz; Robert Thomas E-mail processing system and method
US5956481A (en) 1997-02-06 1999-09-21 Microsoft Corporation Method and apparatus for protecting data files on a computer from virus infection
US6182059B1 (en) 1997-04-03 2001-01-30 Brightware, Inc. Automatic electronic message interpretation and routing system
US6189026B1 (en) 1997-06-16 2001-02-13 Digital Equipment Corporation Technique for dynamically generating an address book in a distributed electronic mail system
US6185551B1 (en) 1997-06-16 2001-02-06 Digital Equipment Corporation Web-based electronic mail service apparatus and method using full text and label indexing
US6023700A (en) 1997-06-17 2000-02-08 Cranberry Properties, Llc Electronic mail distribution system for integrated electronic communication
JP3148152B2 (en) 1997-06-27 2001-03-19 日本電気株式会社 Delivery method of broadcast mail using electronic mail system
US7315893B2 (en) 1997-07-15 2008-01-01 Computer Associates Think, Inc. Method and apparatus for filtering messages based on context
US20050081059A1 (en) 1997-07-24 2005-04-14 Bandini Jean-Christophe Denis Method and system for e-mail filtering
US6073165A (en) 1997-07-29 2000-06-06 Jfax Communications, Inc. Filtering computer network messages directed to a user's e-mail box based on user defined filters, and forwarding a filtered message to the user's receiver
US5919257A (en) 1997-08-08 1999-07-06 Novell, Inc. Networked workstation intrusion detection system
US5999967A (en) 1997-08-17 1999-12-07 Sundsted; Todd Electronic mail filtering by electronic stamp
US6199102B1 (en) 1997-08-26 2001-03-06 Christopher Alan Cobb Method and system for filtering electronic messages
US5983348A (en) 1997-09-10 1999-11-09 Trend Micro Incorporated Computer network malicious code scanner
JP3439330B2 (en) 1997-09-25 2003-08-25 日本電気株式会社 Email server
US6195686B1 (en) 1997-09-29 2001-02-27 Ericsson Inc. Messaging application having a plurality of interfacing capabilities
US6393568B1 (en) 1997-10-23 2002-05-21 Entrust Technologies Limited Encryption and decryption system and method with content analysis provision
US6381592B1 (en) 1997-12-03 2002-04-30 Stephen Michael Reuning Candidate chaser
WO1999032985A1 (en) 1997-12-22 1999-07-01 Accepted Marketing, Inc. E-mail filter and method thereof
US6023723A (en) 1997-12-22 2000-02-08 Accepted Marketing, Inc. Method and system for filtering unwanted junk e-mail utilizing a plurality of filtering mechanisms
US6088804A (en) 1998-01-12 2000-07-11 Motorola, Inc. Adaptive system and method for responding to computer network security attacks
US5999932A (en) 1998-01-13 1999-12-07 Bright Light Technologies, Inc. System and method for filtering unsolicited electronic mail messages using data matching and heuristic processing
US6212552B1 (en) 1998-01-15 2001-04-03 At&T Corp. Declarative message addressing
US5968117A (en) 1998-01-20 1999-10-19 Aurora Communications Exchange Ltd. Device and system to facilitate accessing electronic mail from remote user-interface devices
US6157630A (en) 1998-01-26 2000-12-05 Motorola, Inc. Communications system with radio device and server
US6182227B1 (en) 1998-06-22 2001-01-30 International Business Machines Corporation Lightweight authentication system and method for validating a server access request
US6161130A (en) 1998-06-23 2000-12-12 Microsoft Corporation Technique which utilizes a probabilistic classifier to detect "junk" e-mail by automatically updating a training and re-training the classifier based on the updated training set
US6314454B1 (en) 1998-07-01 2001-11-06 Sony Corporation Method and apparatus for certified electronic mail messages
US6226630B1 (en) 1998-07-22 2001-05-01 Compaq Computer Corporation Method and apparatus for filtering incoming information using a search engine and stored queries defining user folders
US6275850B1 (en) 1998-07-24 2001-08-14 Siemens Information And Communication Networks, Inc. Method and system for management of message attachments
US6112227A (en) 1998-08-06 2000-08-29 Heiner; Jeffrey Nelson Filter-in method for reducing junk e-mail
US6158031A (en) 1998-09-08 2000-12-05 Lucent Technologies, Inc. Automated code generating translator for testing telecommunication system devices and method
US6360254B1 (en) 1998-09-15 2002-03-19 Amazon.Com Holdings, Inc. System and method for providing secure URL-based access to private resources
US6377949B1 (en) 1998-09-18 2002-04-23 Tacit Knowledge Systems, Inc. Method and apparatus for assigning a confidence level to a term within a user knowledge profile
US6757713B1 (en) 1998-09-23 2004-06-29 John W. L. Ogilvie Method for including a self-removing indicator in a self-removing message
US6266774B1 (en) 1998-12-08 2001-07-24 Mcafee.Com Corporation Method and system for securing, managing or optimizing a personal computer
US6499109B1 (en) 1998-12-08 2002-12-24 Networks Associates Technology, Inc. Method and apparatus for securing software distributed over a network
US6546416B1 (en) 1998-12-09 2003-04-08 Infoseek Corporation Method and system for selectively blocking delivery of bulk electronic mail
US6330588B1 (en) 1998-12-21 2001-12-11 Philips Electronics North America Corporation Verification of software agents and agent activities
US6549957B1 (en) 1998-12-22 2003-04-15 International Business Machines Corporation Apparatus for preventing automatic generation of a chain reaction of messages if a prior extracted message is similar to current processed message
US6654787B1 (en) * 1998-12-31 2003-11-25 Brightmail, Incorporated Method and apparatus for filtering e-mail
US6438125B1 (en) 1999-01-22 2002-08-20 Nortel Networks Limited Method and system for redirecting web page requests on a TCP/IP network
US6571275B1 (en) 1999-04-27 2003-05-27 International Business Machines Corporation Method and apparatus for filtering messages in a data processing system
US6640301B1 (en) * 1999-07-08 2003-10-28 David Way Ng Third-party e-mail authentication service provider using checksum and unknown pad characters with removal of quotation indents
US6523120B1 (en) 1999-10-29 2003-02-18 Rstar Corporation Level-based network access restriction
US7673329B2 (en) 2000-05-26 2010-03-02 Symantec Corporation Method and apparatus for encrypted communications to a secure server
US20030159070A1 (en) 2001-05-28 2003-08-21 Yaron Mayer System and method for comprehensive general generic protection for computers against malicious programs that may steal information and/or cause damages
US20020046065A1 (en) 2000-06-15 2002-04-18 Nighan Robert J. Method and system for insuring against loss in connection with an online financial transaction
US6785732B1 (en) 2000-09-11 2004-08-31 International Business Machines Corporation Web server apparatus and method for virus checking
US6901398B1 (en) * 2001-02-12 2005-05-31 Microsoft Corporation System and method for constructing and personalizing a universal information classifier
EP1360585A4 (en) 2001-02-14 2008-04-30 Invicta Networks Inc Systems and methods for creating a code inspection system
US7114177B2 (en) 2001-03-28 2006-09-26 Geotrust, Inc. Web site identity assurance
US20020147780A1 (en) 2001-04-09 2002-10-10 Liu James Y. Method and system for scanning electronic mail to detect and eliminate computer viruses using a group of email-scanning servers and a recipient's email gateway
US7603703B2 (en) 2001-04-12 2009-10-13 International Business Machines Corporation Method and system for controlled distribution of application code and content data within a computer network
US20020174137A1 (en) 2001-05-15 2002-11-21 Wolff Daniel Joseph Repairing alterations to computer files
US6792543B2 (en) 2001-08-01 2004-09-14 Networks Associates Technology, Inc. Virus scanning on thin client devices using programmable assembly language
US20030097451A1 (en) 2001-11-16 2003-05-22 Nokia, Inc. Personal data repository
US7096500B2 (en) 2001-12-21 2006-08-22 Mcafee, Inc. Predictive malware scanning of internet data
US7093121B2 (en) 2002-01-10 2006-08-15 Mcafee, Inc. Transferring data via a secure network connection
US8578480B2 (en) 2002-03-08 2013-11-05 Mcafee, Inc. Systems and methods for identifying potentially malicious messages
US6836272B2 (en) 2002-03-12 2004-12-28 Sun Microsystems, Inc. Frame buffer addressing scheme
US7092995B2 (en) * 2002-06-25 2006-08-15 Microsoft Corporation Testing distributed applications
WO2004010662A1 (en) 2002-07-22 2004-01-29 Fujitsu Limited Electronic mail server, electronic mail delivery relaying method, and computer program
US7331062B2 (en) 2002-08-30 2008-02-12 Symantec Corporation Method, computer software, and system for providing end to end security protection of an online transaction
US7832011B2 (en) 2002-08-30 2010-11-09 Symantec Corporation Method and apparatus for detecting malicious code in an information handling system
US7509679B2 (en) 2002-08-30 2009-03-24 Symantec Corporation Method, system and computer program product for security in a global computer network transaction
US7748039B2 (en) 2002-08-30 2010-06-29 Symantec Corporation Method and apparatus for detecting malicious code in an information handling system
US7072944B2 (en) 2002-10-07 2006-07-04 Ebay Inc. Method and apparatus for authenticating electronic mail
US20040078422A1 (en) 2002-10-17 2004-04-22 Toomey Christopher Newell Detecting and blocking spoofed Web login pages
US20040083270A1 (en) * 2002-10-23 2004-04-29 David Heckerman Method and system for identifying junk e-mail
US6732157B1 (en) * 2002-12-13 2004-05-04 Networks Associates Technology, Inc. Comprehensive anti-spam system, method, and computer program product for filtering unwanted e-mail messages
US7624110B2 (en) 2002-12-13 2009-11-24 Symantec Corporation Method, system, and computer program product for security within a global computer network
US8327442B2 (en) 2002-12-24 2012-12-04 Herz Frederick S M System and method for a distributed application and network security system (SDI-SCAM)
US20040177120A1 (en) 2003-03-07 2004-09-09 Kirsch Steven T. Method for filtering e-mail messages
US20060168006A1 (en) * 2003-03-24 2006-07-27 Mr. Marvin Shannon System and method for the classification of electronic communication
US20050108340A1 (en) * 2003-05-15 2005-05-19 Matt Gleeson Method and apparatus for filtering email spam based on similarity measures
US7272853B2 (en) 2003-06-04 2007-09-18 Microsoft Corporation Origination/destination features and lists for spam prevention
US7051077B2 (en) * 2003-06-30 2006-05-23 Mx Logic, Inc. Fuzzy logic voting method and system for classifying e-mail using inputs from multiple spam classifiers
US7200637B2 (en) 2003-07-16 2007-04-03 Thomas John Klos System for processing electronic mail messages with specially encoded addresses
US7421498B2 (en) 2003-08-25 2008-09-02 Microsoft Corporation Method and system for URL based filtering of electronic communications and web pages
US7451487B2 (en) 2003-09-08 2008-11-11 Sonicwall, Inc. Fraudulent message detection
US7257564B2 (en) * 2003-10-03 2007-08-14 Tumbleweed Communications Corp. Dynamic message filtering
US7395657B2 (en) 2003-10-20 2008-07-08 General Electric Company Flade gas turbine engine with fixed geometry inlet
US20050137980A1 (en) 2003-12-17 2005-06-23 Bank Of America Corporation Active disablement of malicious code in association with the provision of on-line financial services
US8010609B2 (en) 2005-06-20 2011-08-30 Symantec Corporation Method and apparatus for maintaining reputation lists of IP addresses to detect email spam

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6052709A (en) * 1997-12-23 2000-04-18 Bright Light Technologies, Inc. Apparatus and method for controlling delivery of unsolicited electronic mail
WO2001053965A1 (en) * 2000-01-20 2001-07-26 Odyssey Development Pty Ltd E-mail spam filter
US20050060643A1 (en) * 2003-08-25 2005-03-17 Miavia, Inc. Document similarity detection and classification system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RUBIN P: "Re: spam (hash codes)", 20 February 1995 (1995-02-20), XP002229452 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8464341B2 (en) * 2008-07-22 2013-06-11 Microsoft Corporation Detecting machines compromised with malware
CN107294834A (en) * 2016-03-31 2017-10-24 阿里巴巴集团控股有限公司 Method and apparatus for recognizing spam email

Also Published As

Publication number Publication date
US7739337B1 (en) 2010-06-15

Similar Documents

Publication Publication Date Title
US7739337B1 (en) Method and apparatus for grouping spam email messages
US7831667B2 (en) Method and apparatus for filtering email spam using email noise reduction
US8145710B2 (en) System and method for filtering spam messages utilizing URL filtering module
US7941490B1 (en) Method and apparatus for detecting spam in email messages and email attachments
US7668921B2 (en) Method and system for phishing detection
US7882189B2 (en) Using distinguishing properties to classify messages
KR101045452B1 (en) Advanced spam detection techniques
US8010609B2 (en) Method and apparatus for maintaining reputation lists of IP addresses to detect email spam
EP1738519B1 (en) Method and system for url-based screening of electronic communications
US8874658B1 (en) Method and apparatus for simulating end user responses to spam email messages
US9246860B2 (en) System, method and computer program product for gathering information relating to electronic content utilizing a DNS server
US8473556B2 (en) Apparatus, a method, a program and a system for processing an e-mail
US8135778B1 (en) Method and apparatus for certifying mass emailings
JP2005135024A (en) Anti-spam method and anti-spam program
Nor Improving Antispam Techniques by Embracing Pattern-based Filtering
JP2009110423A (en) Information processor, program, and address leak web site identification method
JP2010092251A (en) Information processing apparatus, information processing method, and information processing program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 06785122

Country of ref document: EP

Kind code of ref document: A1