US8036432B2 - System and method of saving digital content classified by person-based clustering - Google Patents

System and method of saving digital content classified by person-based clustering

Info

Publication number
US8036432B2
Authority
US
United States
Prior art keywords
digital content
person
data structure
face
cluster
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/025,109
Other versions
US20090028393A1 (en)
Inventor
Yong-Sung Kim
Duck-hoon Kim
Hee-seon Park
Hye-Soo Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority to US12/025,109
Assigned to SAMSUNG ELECTRONICS CO., LTD. (assignment of assignors' interest). Assignors: KIM, DUCK-HOON; KIM, YONG-SUNG; LEE, HYE-SOO; PARK, HEE-SEON
Publication of US20090028393A1
Application granted
Publication of US8036432B2
Legal status: Active (current)
Expiration: adjusted

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/40 - Data acquisition and logging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28 - Databases characterised by their database models, e.g. relational or object models
    • G06F16/284 - Relational databases
    • G06F16/285 - Clustering or classification
    • G06F16/287 - Visualization; Browsing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/55 - Clustering; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification

Definitions

  • aspects of the present invention relate to a system and method of saving digital content classified by person-based clustering, and more particularly, to a system and method of classifying digital content by person-based clustering and saving the classified content in a database.
  • photos may be classified into the user's own photos, photos of the user's wife, photos of the user's son, photos of the user's parents, photos of the user's friend, and so forth, and videos may be classified according to the appearance of characters in various scenes.
  • aspects of the present invention provide a system and method of saving digital content classified by person-based clustering when an addition or deletion of the digital content is performed.
  • Additional aspects of the present invention provide a system and method of classifying digital content by person-based clustering and efficiently saving the classified digital content in a database by introducing flags through a cluster grouping when an addition or deletion of the digital content is performed.
  • a system for saving digital content classified by person-based clustering includes a database to save a plurality of digital content classified by person-based clustering; a data structure generation unit to generate a data structure including a plurality of nodes using the plurality of digital content; a face recognition unit to extract a face descriptor of new digital content to be saved in the database; a cluster classification unit to classify the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and a data structure update unit to update the data structure according to the classification.
  • a system for saving digital content classified by person-based clustering includes a data structure generation unit to generate a data structure including a plurality of nodes using a plurality of digital content; a cluster classification unit to classify the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted; a data structure update unit to update the data structure according to the classification; and a database to update and save the remaining digital content by reflecting the updated data structure.
  • a method of saving digital content classified by person-based clustering includes generating a data structure including a plurality of nodes using a schema of a database that stores a plurality of digital content; extracting a face descriptor of new digital content to be saved in the database; classifying the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and updating the data structure according to the classification.
  • a method of saving digital content classified by person-based clustering which includes generating a data structure including a plurality of nodes using a plurality of digital content; classifying the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted; updating the data structure according to the classification; and updating and saving the remaining digital content in a database by reflecting the updated data structure.
  • FIG. 1 is a block diagram illustrating a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 2 is a view illustrating a database schema in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 3 is an exemplary view illustrating a data structure in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 4 is a block diagram illustrating the construction of a face recognition unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 5 is a block diagram illustrating the construction of a cluster classification unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIGS. 6 to 9 are views illustrating examples of cluster grouping performed by a cluster transform unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 10 is a block diagram illustrating the construction of a data structure update unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 11 is a view explaining flag setting performed by a flag setting unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIGS. 12 to 15 are views explaining processes of updating a data structure performed by a data structure application unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention
  • FIG. 16 is a flowchart illustrating a method of saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • FIG. 17 is a flowchart illustrating a method of updating a data structure in a method of saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • These computer program instructions may also be stored in a computer usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer usable or computer-readable memory implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions that execute on the computer or other programmable apparatus provide operations to implement the functions specified in the flowchart block or blocks.
  • Each block of the flowchart illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions to implement the specified logical function(s).
  • the functions noted in the blocks may occur out of order.
  • two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in reverse order, depending upon the functionality involved.
  • the term “unit”, as used herein, may be implemented as a kind of module.
  • module refers to, but is not limited to, a software or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks.
  • a module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors.
  • a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • the functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules.
  • FIG. 1 shows a system 100 for saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • FIG. 2 shows a database schema in the system 100 , according to an embodiment of the present invention.
  • FIG. 3 shows a data structure in the system 100 , according to an embodiment of the present invention.
  • FIG. 4 shows a face recognition unit 130 in the system 100 , according to an embodiment of the present invention.
  • FIG. 5 shows a cluster classification unit 150 in the system 100 , according to an embodiment of the present invention.
  • the system 100 includes a data structure generation unit 120 , the face recognition unit 130 , the cluster classification unit 150 , a data structure update unit 170 , a database 180 , and a content browsing unit 190 .
  • the content browsing unit 190 shows information about a plurality of digital content classified by person-based clustering, which are saved in the database 180 , so that a user can see the digital content.
  • the user can browse the saved digital content by person-based clustering while the user views the digital content information on a display (not shown).
  • the database 180 saves a plurality of digital content.
  • Digital content indicates content that includes a person's images, such as photos, moving images, and so forth.
  • the database 180 can save the digital content classified by person-based clustering.
  • the digital content may include photos.
  • a schema as shown in FIG. 2 may be applied.
  • aspects of the present invention may also be applied to other types of content that may or may not include faces, such as motion pictures, movies, or text.
  • the schema which may be applied to save photos classified by person-based clustering, may include a photo information table 123 for saving the photo, a person information table 125 for saving person information, and a face information table 127 for saving face information.
  • the photo information table 123 saves a photo identifier PhotoID as a primary key for properly classifying the content, and also saves content-related information, such as a file name of the content.
  • the person information table 125 saves a person identifier PersonID as a primary key for properly classifying the person, and also saves person-related information, such as a name of the person.
  • the face information table 127 which is a table for saving characters, content, and parts of the content in which the characters appear, saves a face identifier FaceID as a primary key for properly classifying respective faces, primary keys of the photo information table and the person information table as foreign keys in order to classify the persons and the content in which the faces appear, and face-related information, such as locations of the content where the faces appear.
  • a face descriptor may be included as a quantitative value of the corresponding face.
  • the schema may be applied in order to save the content classified by person-based clustering in the database. A similar schema may be used to classify motion picture content, as motion picture content can be seen as a series of still images.
  • the data structure generation unit 120 generates a data structure using a plurality of digital content provided from the database.
  • the data structure generation unit 120 can generate the data structure classified by person-based clustering through the use of the schema.
  • a data structure classified by person-based clustering is shown with respect to the plurality of digital content saved in the database by the data structure generation unit 120 .
  • person nodes 315 are generated and linked by person-based clustering, and face nodes 325 classified as the corresponding person are generated and linked to the respective person node.
  • the data structure generation unit 120 generates the linked list using the schema of the database classified by person-based clustering.
  • the face recognition unit 130 recognizes a face in an image of newly input digital content.
  • the face recognition unit 130 recognizes a person in the image included in the new digital content and extracts a face descriptor from the recognized person so that the content can be classified by person-based clustering.
  • General face recognition techniques may be divided into a technique of pre-registering a reference face and matching a given face to the most similar reference face, and a clustering technique for grouping similar faces when a reference face is not registered.
  • the digital content may be classified by person through a cluster grouping in which faces estimated to correspond to the same person are grouped by a clustering technique.
  • Face recognition techniques are described in detail in Korean Unexamined Patent Publication Nos. 2005-0013467, 2006-0008814, and 2007-0047063. Other face recognition and clustering techniques may be employed as well.
  • the face recognition unit 130 includes a face extraction unit 210 , a face normalization unit 220 , a feature point extraction unit 230 , and a face descriptor computation unit 240 .
  • the face extraction unit 210 extracts a face region by extracting a face and constituent element of the face from an image included in the digital content.
  • the face extraction unit 210 detects a rough face from the image, extracts eyes, a nose, a mouth, and so forth, which are specified constituent elements of the face, and then extracts the face region based on the extracted face constituent elements. For example, a process of cutting the face region from the input image based on the distance between the two eyes, as a pre-process for face recognition, can minimize the influence of the background of the image, the change of the hair style of the person, and so forth.
  • the face normalization unit 220 normalizes the size of the face region using the extracted face region information. By normalizing the size of the face region, specified personal appearance of the face region, such as the distance between two eyes, the distance between an eye and a nose, and so forth, can be maintained.
  • the feature point extraction unit 230 extracts feature points, such as eyes, from the normalized face region.
  • the feature points may be selected as points on the face region that enlarge the difference between images of different input faces, through experiments for a plurality of persons. For example, when applying a Gabor filter to the selected feature points, the feature points that improve the face recognition capability may be selected by making results of filter application with respect to the face images differ clearly.
  • the face descriptor computation unit 240 computes a face descriptor using the extracted feature points.
  • the face descriptor can be computed by selecting and using PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis), and so forth, adopted by the MPEG-7 standard.
  • the face descriptor is a measured value for the face extracted from the image included in the digital content through the use of the extracted feature points, for example, a feature vector, a feature matrix, and so forth.
  • the computation of the face descriptor is described in detail in Korean Unexamined Patent Publication Nos. 2005-0013467, 2006-0008814, and 2007-0047063 as described above.
  • the face descriptor may also be computed by a general technique of recognizing a face through extraction of the feature points from the face image, or by other techniques.
  • the cluster classification unit 150 classifies the digital content by persons through a cluster grouping.
  • the cluster classification unit 150 includes a face descriptor distance comparison unit 310 and a cluster transform unit 320 .
  • the face descriptor distance comparison unit 310 compares the distances between face descriptors of faces computed by the face recognition unit 130 . For example, when representing the face descriptor in the form of a feature vector or a feature matrix, the face descriptor, such as the feature vector or the feature matrix, of an image included in the digital content previously saved in the database can be computed through the same process. Accordingly, if the faces are recognized to be similar to each other, the distance between the feature vectors or feature matrices, which are represented in space or dimensions, becomes below the threshold value.
  • the face descriptor distance comparison unit 310 computes the distance between the face descriptors and determines whether the distance is below the threshold value.
  • the threshold value corresponds to the maximum value of the distance differences between the face descriptors computed from the face images which are determined to correspond to the same person, and can be selected through an analysis or simulation process.
  • the cluster transform unit 320 classifies the digital content by clusters using the distance information of the face descriptors.
  • the classification by clusters is performed in a manner that the face image having the face descriptor in a specified range is determined as the same cluster, i.e., the same person.
  • FIGS. 6 to 9 show examples of cluster grouping performed by a cluster transform unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • cluster grouping refers to a process of generating, extending, or combining the clusters that represent the range where the face images are judged to correspond to the same person.
  • a new face is added to a collection of previously registered faces.
  • the faces may be considered as points in a vector space. If the distance between the points is smaller than a threshold value, the new face can be considered to correspond to the same person. If no previously registered face exists within the threshold distance of the newly added face, as shown in FIG. 6 , the newly registered face remains in an unclassified state.
  • a new cluster 500 is generated. Due to the generation of the new cluster, the face previously registered as an unclassified face and the face included in the newly added digital content are estimated to correspond to the same person, and thus the digital content is classified accordingly.
  • the face of the newly added content may have correlation information with the newly generated cluster, and the faces previously registered as unclassified faces may be updated to be correlated with the newly generated cluster.
  • the distance between the face descriptor of the face of the newly added digital content and the face descriptor of the neighboring person (i) classified as an unclassified or specified person may be below the threshold value.
  • the face of the newly added digital content, as well as the faces of other unclassified digital content may be classified as the same person as the person (i) to extend the cluster 500 .
  • a recognized face of new digital content is located between a person (i) and a person (j) classified as different persons in two clusters 470 and 480 for the person (i) and the person (j), respectively. If the distance between the face descriptor of the recognized face of the new digital content and the face descriptor of the person (i) or the person (j) is below the threshold value, the person (i), the person (j), and the recognized face of the new digital content are classified as the same person. By inputting the recognized face of the new digital content, the existing clusters are combined to extend the cluster 500.
  • the cluster classification unit 150 can generate, extend, and combine the cluster 500 through cluster grouping.
  • the face of the newly added digital content and the face previously saved in the database can be classified by person through the cluster grouping that is performed by the cluster classification unit 150 .
  • the cluster grouping can be applied in the case of adding and deleting the digital content. When deleting the digital content, the cluster grouping can be performed in an opposite manner to that as shown in FIGS. 6 to 9 .
  • FIG. 10 shows the data structure update unit 170 in the system 100 , according to an embodiment of the present invention.
  • the data structure update unit 170 updates the data structure of the digital content classified by person through the cluster classification unit 150 .
  • the data structure update unit 170 introduces flags to apply the cluster transform, such as cluster generation, deletion, or combination, which is performed by the cluster classification unit 150 , to the data structure.
  • the data structure update unit 170 includes a flag setting unit 410 and a data structure application unit 420 .
  • the flag setting unit 410 sets flags of nodes that arrange the digital content.
  • the flags are information that reflects changes to the nodes resulting from the person-based cluster grouping performed by the cluster classification unit 150.
  • the flag setting unit 410 may save the flags in a separate storage device or temporarily in a memory.
  • the data structure application unit 420 automatically updates the data structure according to the set flag.
  • FIG. 11 shows flag setting performed by the flag setting unit 410 in the system 100 , according to an embodiment of the present invention.
  • FIGS. 12 to 15 show processes of updating a data structure performed by a data structure application unit in the system 100 , according to an embodiment of the present invention.
  • the digital content may include photos and have a data structure classified by persons as shown in FIG. 3 .
  • the flag may be set to “Modified”, “Added”, or “None”, and diverse flags may be used by a designer or a user. All nodes may have an initial state where the flag is set to “None”.
  • the “Face 3 - 3 ” node 1120 is added to the “Person 3 ” node 1100 . Accordingly, the flag 1170 in the “Person 3 ” node 1100 is set to “Modified”, and the flag 1170 in the “Face 3 - 3 ” node 1120 is set to “Added”.
  • the respective node may include an information table 1150 for saving correlation information. The flag 1170 may be saved in the information table 1150 or a separate storage device.
  • FIG. 12 shows the addition of a new “Face” node 1220 . If a new face is added as shown in FIG. 6 , but is not judged to be a specified person, the new face may be added to the “Face” node 1220 that is a lower node of the “Unclassified” node 1200 . Accordingly, the flag setting unit 410 sets the flag 1170 in the “Unclassified” node 1200 to “Modified”, and the flag in the “Face” node 1220 to “Added”. The data structure application unit 420 updates the data structure according to the set flag 1170 as illustrated in FIG. 12 .
  • FIG. 13 shows the flag setting and the update of the data structure performed when the new cluster 500 is generated and a new “Person” node 1300 is added, as in the case shown in FIG. 7 .
  • the “Face” nodes 1310 , 1320 , 1330 , and 1220 which are classified as the same person, are arranged in the new “Person” node 1300 .
  • the flag setting unit 410 sets the flag in the new “Person” node 1300 to “Added”, the flags in the “Face” nodes 1310 , 1320 , and 1330 that are lower nodes of the existing “Unclassified” node 1200 to “Modified”, and the flag in the new “Face” node 1220 to “Added”, respectively.
  • the data structure application unit 420 deletes the “Face” nodes 1310 , 1320 , and 1330 from the “Unclassified” node 1200 and adds them to the new “Person” node 1300 , according to the set flags. Accordingly, the flag setting according to the generation of the new cluster 500 and the update of the data structure through the data structure application unit 420 can be performed. Boxes in FIG. 13 indicated by dotted lines refer to boxes that are moved to other locations with the deletion of the nodes.
  • FIG. 14 shows the flag setting and the update of the data structure performed in the event that the cluster is extended as shown in FIG. 8 .
  • faces of other photos that remain unclassified are classified as the same person.
  • the “Face” nodes 1410 and 1320 of the “Unclassified” node 1200 are modified into lower nodes 1410 and 1320 of the “Person 3 ” node 1400 , and the “Face” node 1220 of a new photo is added to the “Person 3 ” node 1400 .
  • the flag of the “Person 3 ” node 1400 is set to “Modified”, the flags of the “Face” nodes 1410 and 1320 that are lower nodes of the “Person 3 ” node 1400 are set to “Modified”, and the flag of the new “Face” node 1220 is set to “Added”.
  • the data structure can be updated according to the set flag. Boxes in FIG. 14 indicated by dotted lines refer to boxes that are moved to other locations with the deletion of the nodes.
  • FIG. 15 shows the flag setting and the update of the data structure being performed in the event that the clusters are combined as shown in FIG. 9 .
  • two persons judged as different persons are combined due to the addition of the new face. For example, if “Person 2 ” and “Person 3 ” are determined to be the same person, two face nodes 1510 and 1520 linked to the “Person 2 ” node 1500 are linked to the “Person 3 ” node 1400 .
  • the new “Face” node 1220 of the new photo is also added to the “Person 3 ” node 1400 .
  • the flag setting unit 410 sets the flags of the “Person 3 ” node 1400 and “Face” nodes 1510 and 1520 to “Modified” according to the cluster combination, and sets the flag of the new “Face” node 1220 to “Added”.
  • the data structure can be updated according to the set flag. Boxes indicated by dotted lines refer to boxes that are moved to other locations with the deletion of the nodes.
  • the face recognition unit 130 extracts a face descriptor from new content inputted thereto, and the cluster classification unit 150 classifies the digital content by persons through the cluster grouping.
  • the data structure of the digital content classified by person-based clustering can be efficiently updated through the flag setting of the classified digital content. By updating only the changed part of the data structure and saving the updated part of the data structure in the database, the details of the change according to the cluster grouping can be efficiently reflected in the database.
  • the flag is described as set in the node.
  • the set flags may also be saved in the face and person information tables or in a predefined storage device.
  • a plurality of digital content is classified by person-based clustering, and thus a user can easily retrieve and browse the digital content.
  • the classification of the digital content can be performed even without a reference image.
  • even if digital content is newly added or deleted, such addition or deletion of the digital content can be efficiently reflected in the database through the flag setting.
  • FIG. 16 shows a process of saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • a data structure in which faces are arranged by persons is generated at operation S 1600 .
  • the generated data structure may be a two-dimensional (2D) linked list.
  • the schema may include a content (photo) information table, a person information table, and a face information table.
  • a face descriptor is extracted from a face image included in new digital content to be saved in the database at operation S 1610 .
  • a face is extracted from the face image included in the new digital content, and feature points are extracted from the extracted face to compute the face descriptor.
  • the face descriptor may be a feature vector.
  • the new digital content and the plurality of digital content saved in the database are classified by person through a cluster grouping at operation S 1620 .
  • in order to determine whether the face images are within a range where they are considered to correspond to the same person, whether the face descriptors are located within the same cluster range is determined through comparison of the distances between the face descriptors.
  • the cluster can be generated, extended, or combined through the cluster grouping. As described above, even where new digital content is added through the cluster grouping or the existing digital content is deleted, the digital content can be classified by person-based clustering.
  • the data structure can be updated by setting flags at operation S 1630 .
  • for example, if the generated data structure is the 2D linked list, a flag is set to indicate that the location of a node is changed or that a node is newly added through the cluster grouping. The flag may have, for example, one value of “Added”, “Modified”, or “None”.
  • the generated data structure is updated according to the set flag. As the data structure is updated, a new node may be generated or the location of the existing node may be changed.
  • the digital content is updated and saved in the database according to the updated data structure at operation S 1640 .
  • the update and saving of the digital content in the database can be performed through the update of the schema of the database. Accordingly, using the updated schema, the saved digital content can be easily browsed and retrieved by persons.
  • when adding or deleting digital content, the digital content can be classified by person-based clustering, and such classification of the digital content can be efficiently reflected in the database.
  • the classification of the digital content by person-based clustering can be automatically performed through the generation, extension, or combination of the clusters.
  • FIG. 17 shows a process of updating a data structure in a method of saving digital content classified by person-based clustering according to an embodiment of the present invention.
  • in this example, the digital content is a photo, and a 2D linked list linking face nodes under person nodes, as shown in FIG. 3 , is generated as the data structure.
  • aspects of the present invention may also be applied to other types of digital content, as discussed above.
  • a first person node is selected from the linked list at operation S 1700 . Whether the flag of the first person node is set to “Added” or “Modified” is determined at operation S 1710 . If the flag is not set to “Added”, whether the flag is set to “Modified” is determined at S 1720 . If the flag is not set to “Modified”, whether the present person node is the last person node is determined at operation S 1760 . If the present person node is not the last person node, the next person node is selected at operation S 1770 .
  • if the flag is set to “Added”, this indicates that a new person is registered, and thus a person node is added at operation S 1730 . If the flag is not set to “Added” but is set to “Modified”, a face node is selected under the corresponding person node at operation S 1740 . The flag setting is determined as the face node is scanned, and the face node is modified or added according to the set flag at operation S 1750 . If the scanning of the face nodes is completed, whether the present person node is the last person node is determined at operation S 1760 , and if the present person node is the last person node, the update of the data structure ends. (A code sketch of this flag-driven scan is given at the end of this list.)
  • the data structure is updated according to the flag setting, and the details of the change are reflected in the database. Accordingly, the digital content saved in the database can be classified by person-based clustering and efficiently managed.
  • in a variable environment where digital content is added or deleted, the digital content can be classified by person-based clustering, and the classified digital content can be efficiently reflected in the database.
  • since a plurality of digital content is classified by person-based clustering even in an environment where the digital content is added or deleted, a user can easily retrieve and browse the digital content.
  • since the digital content is variably classified by person-based clustering through generation, extension, or combination of clusters, the classification of the digital content by person-based clustering can be performed even without a reference image.
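
As a rough sketch of the flag-driven scan of FIG. 17 summarized in the items above, the following Python fragment walks a list of person nodes and collects only the flagged changes that need to be reflected in the database tables of FIG. 2. The dictionary-based node layout and the operation names are assumptions made for illustration, not the patent's internal representation.

```python
def collect_updates(person_nodes):
    """Walk the person-node list as in FIG. 17 and gather only the flagged
    changes. Nodes are plain dicts ({"id", "flag", "faces"}); this layout is
    an illustrative assumption."""
    updates = []
    for person in person_nodes:                      # S1700/S1760/S1770: scan person nodes in turn
        if person["flag"] == "Added":                # S1730: a new person was registered
            updates.append(("insert_person", person["id"]))
        if person["flag"] in ("Added", "Modified"):  # S1740/S1750: scan this person's face nodes
            for face in person["faces"]:
                if face["flag"] == "Added":
                    updates.append(("insert_face", face["id"], person["id"]))
                elif face["flag"] == "Modified":     # face re-linked under this person
                    updates.append(("relink_face", face["id"], person["id"]))
                face["flag"] = "None"                # reset once the change is recorded
        person["flag"] = "None"
    return updates


# Example: an existing person gains a face (as in FIG. 11) and a brand-new person appears.
people = [
    {"id": 3, "flag": "Modified", "faces": [{"id": "3-3", "flag": "Added"}]},
    {"id": 4, "flag": "Added", "faces": [{"id": "4-1", "flag": "Added"}]},
]
print(collect_updates(people))
```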

Abstract

A system and method of saving digital content classified by person-based clustering. The system for saving digital content classified by person-based clustering, includes a database to save a plurality of digital content classified by person-based clustering; a data structure generation unit to generate a data structure composed of a plurality of nodes using the plurality of digital content; a face recognition unit to extract a face descriptor of new digital content to be saved in the database; a cluster classification unit to classify the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor, and a data structure update unit to update the data structure according to the classification.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application No. 60/951,568 filed in the United States Patent and Trademark Office on Jul. 24, 2007 and Korean Patent Application No. 2007-107427 filed in the Korean Intellectual Property Office on Oct. 24, 2007, the disclosures of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
Aspects of the present invention relate to a system and method of saving digital content classified by person-based clustering, and more particularly, to a system and method of classifying digital content by person-based clustering and saving the classified content in a database.
2. Description of the Related Art
Recently, as use of mass digital storage devices, such as HDDs, flash memory, and so forth, has spread, the amount of digital content that can be stored in such digital storage devices has been increasing geometrically. In addition, with the spread of P2P (Peer-to-Peer) sharing, online shopping, UCC (User Created Content), and so forth, many kinds of digital content have been produced, which can be managed by multimedia devices such as digital cameras, MP3 players, digital TVs, PVRs (Personal Video Recorders), PMPs (Portable Multimedia Players), and so forth.
Accordingly, there is an increasing demand to browse or retrieve content in diverse forms, and diverse retrieving techniques have been used for such content browsing. However, as highly integrated, subminiature memories have become widespread due to the development of memory technology, and digital image compression technology that does not greatly damage the picture quality has been developed, a large amount of digital content can be stored in multimedia devices, and thus a system and method for effectively managing the stored digital content is required.
Users have a tendency to classify and arrange digital content by times, persons, or specified categories. Such classified digital content can be browsed by a user or easily shared with another user. However, it takes a great deal of time and effort to group or label a large amount of digital content one by one, and another user must reclassify the digital content even after the content has been classified by the original user.
As described above, with the increasing demand to browse or retrieve content in diverse forms, person information included in the content may be used as a primary tool in browsing or retrieving content. For example, photos may be classified into the user's own photos, photos of the user's wife, photos of the user's son, photos of the user's parents, photos of the user's friend, and so forth, and videos may be classified according to the appearance of characters in various scenes.
Accordingly, there is a need for a system and method for classifying digital content by person-based clustering and efficiently saving the classified content in a database. In addition, there is a need for a system and method for automatically classifying digital content by person-based clustering and saving the classified content in a database even where the digital content is added or deleted.
SUMMARY OF THE INVENTION
Aspects of the present invention provide a system and method of saving digital content classified by person-based clustering when an addition or deletion of the digital content is performed.
Additional aspects of the present invention provide a system and method of classifying digital content by person-based clustering and efficiently saving the classified digital content in a database by introducing flags through a cluster grouping when an addition or deletion of the digital content is performed.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
According to an aspect of the invention, a system for saving digital content classified by person-based clustering is provided. The system includes a database to save a plurality of digital content classified by person-based clustering; a data structure generation unit to generate a data structure including a plurality of nodes using the plurality of digital content; a face recognition unit to extract a face descriptor of new digital content to be saved in the database; a cluster classification unit to classify the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and a data structure update unit to update the data structure according to the classification.
According to another aspect of the present invention, a system for saving digital content classified by person-based clustering is provided. The system includes a data structure generation unit to generate a data structure including a plurality of nodes using a plurality of digital content; a cluster classification unit to classify the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted; a data structure update unit to update the data structure according to the classification; and a database to update and save the remaining digital content by reflecting the updated data structure.
According to another aspect of the present invention, a method of saving digital content classified by person-based clustering is provided. The method includes generating a data structure including a plurality of nodes using a schema of a database that stores a plurality of digital content; extracting a face descriptor of new digital content to be saved in the database; classifying the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and updating the data structure according to the classification.
According to another aspect of the present invention, there is provided a method of saving digital content classified by person-based clustering, which includes generating a data structure including a plurality of nodes using a plurality of digital content; classifying the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted; updating the data structure according to the classification; and updating and saving the remaining digital content in a database by reflecting the updated data structure.
Additional aspects and/or advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of the invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a block diagram illustrating a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 2 is a view illustrating a database schema in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 3 is an exemplary view illustrating a data structure in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 4 is a block diagram illustrating the construction of a face recognition unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating the construction of a cluster classification unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIGS. 6 to 9 are views illustrating examples of cluster grouping performed by a cluster transform unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 10 is a block diagram illustrating the construction of a data structure update unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 11 is a view explaining flag setting performed by a flag setting unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIGS. 12 to 15 are views explaining processes of updating a data structure performed by a data structure application unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention;
FIG. 16 is a flowchart illustrating a method of saving digital content classified by person-based clustering according to an embodiment of the present invention; and
FIG. 17 is a flowchart illustrating a method of updating a data structure in a method of saving digital content classified by person-based clustering according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Reference will now be made in detail to the present embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.
Aspects of the present invention will be described herein with reference to the accompanying drawings illustrating block diagrams and flowcharts for explaining a system and method of saving digital content classified by person-based clustering according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions specified in the flowchart block or blocks.
These computer program instructions may also be stored in a computer usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer usable or computer-readable memory implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions that execute on the computer or other programmable apparatus provide operations to implement the functions specified in the flowchart block or blocks.
Each block of the flowchart illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions to implement the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of order. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in reverse order, depending upon the functionality involved.
According to aspects of the present invention, the term “unit”, as used herein, may be implemented as a kind of module. The term “module” refers to, but is not limited to, a software or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks. A module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors. Thus, a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules.
FIG. 1 shows a system 100 for saving digital content classified by person-based clustering according to an embodiment of the present invention. FIG. 2 shows a database schema in the system 100, according to an embodiment of the present invention. FIG. 3 shows a data structure in the system 100, according to an embodiment of the present invention. FIG. 4 shows a face recognition unit 130 in the system 100, according to an embodiment of the present invention. FIG. 5 shows a cluster classification unit 150 in the system 100, according to an embodiment of the present invention. As shown in FIG. 1, the system 100 includes a data structure generation unit 120, the face recognition unit 130, the cluster classification unit 150, a data structure update unit 170, a database 180, and a content browsing unit 190.
The content browsing unit 190 shows information about a plurality of digital content classified by person-based clustering, which are saved in the database 180, so that a user can see the digital content. The user can browse the saved digital content by person-based clustering while the user views the digital content information on a display (not shown).
The database 180 saves a plurality of digital content. Digital content indicates content that includes a person's images, such as photos, moving images, and so forth. The database 180 can save the digital content classified by person-based clustering. For example, the digital content may include photos. In order to save photos by person-based clustering, a schema as shown in FIG. 2 may be applied. Although described herein with respect to photos and faces, aspects of the present invention may also be applied to other types of content that may or may not include faces, such as motion pictures, movies, or text.
The schema, which may be applied to save photos classified by person-based clustering, may include a photo information table 123 for saving the photo, a person information table 125 for saving person information, and a face information table 127 for saving face information. The photo information table 123 saves a photo identifier PhotoID as a primary key for properly classifying the content, and also saves content-related information, such as a file name of the content. The person information table 125 saves a person identifier PersonID as a primary key for properly classifying the person, and also saves person-related information, such as a name of the person. The face information table 127, which is a table for saving characters, content, and parts of the content in which the characters appear, saves a face identifier FaceID as a primary key for properly classifying respective faces, primary keys of the photo information table and the person information table as foreign keys in order to classify the persons and the content in which the faces appear, and face-related information, such as locations of the content where the faces appear. In each face identifier FaceID, a face descriptor may be included as a quantitative value of the corresponding face. The schema may be applied in order to save the content classified by person-based clustering in the database. A similar schema may be used to classify motion picture content, as motion picture content can be seen as a series of still images.
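As a concrete illustration of this arrangement, the schema of FIG. 2 might be realized as three relational tables, for example with SQLite from Python as sketched below. The sketch only mirrors the description above: column names other than PhotoID, PersonID, and FaceID (FileName, Name, Location, Descriptor) are assumptions, not columns prescribed by the patent.

```python
import sqlite3

# A minimal sketch of the FIG. 2 schema using SQLite. Column names other than
# PhotoID, PersonID, and FaceID are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Photo (
    PhotoID   INTEGER PRIMARY KEY,   -- primary key identifying the content
    FileName  TEXT                   -- content-related information
);
CREATE TABLE Person (
    PersonID  INTEGER PRIMARY KEY,   -- primary key identifying the person
    Name      TEXT                   -- person-related information
);
CREATE TABLE Face (
    FaceID     INTEGER PRIMARY KEY,                  -- primary key identifying the face
    PhotoID    INTEGER REFERENCES Photo(PhotoID),    -- foreign key: content in which the face appears
    PersonID   INTEGER REFERENCES Person(PersonID),  -- foreign key: person the face is classified as
    Location   TEXT,                                 -- where the face appears in the content
    Descriptor BLOB                                  -- quantitative face descriptor
);
""")
conn.commit()
```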
Referring again to FIG. 1, the data structure generation unit 120 generates a data structure using a plurality of digital content provided from the database. The data structure generation unit 120 can generate the data structure classified by person-based clustering through the use of the schema.
FIG. 3 shows a data structure, classified by person-based clustering, that the data structure generation unit 120 generates with respect to the plurality of digital content saved in the database. For example, when using a data structure of a two-dimensional (2D) linked list, person nodes 315 are generated and linked by person-based clustering, and face nodes 325 classified as the corresponding person are generated and linked to the respective person node. The data structure generation unit 120 generates the linked list using the schema of the database classified by person-based clustering.
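A minimal sketch of such a 2D linked list, assuming simple Python classes (the names PersonNode, FaceNode, and their fields are illustrative, not taken from the patent), might look as follows.

```python
class FaceNode:
    """A face classified under a person; links to the next face of the same person."""
    def __init__(self, face_id, flag="None"):
        self.face_id = face_id
        self.flag = flag      # "None", "Added", or "Modified"
        self.next = None      # next FaceNode belonging to the same person


class PersonNode:
    """A person cluster; links to the next person and to its chain of face nodes."""
    def __init__(self, person_id, flag="None"):
        self.person_id = person_id
        self.flag = flag
        self.next = None      # next PersonNode in the person list
        self.faces = None     # head of this person's FaceNode chain

    def add_face(self, face_node):
        face_node.next = self.faces
        self.faces = face_node


# Toy usage mirroring FIG. 3: one person node holding two linked face nodes.
person3 = PersonNode("Person 3")
person3.add_face(FaceNode("Face 3-1"))
person3.add_face(FaceNode("Face 3-2"))
```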
Referring again to FIG. 1, the face recognition unit 130 recognizes a face in an image of newly input digital content. The face recognition unit 130 recognizes a person in the image included in the new digital content and extracts a face descriptor from the recognized person so that the content can be classified by person-based clustering. General face recognition techniques may be divided into a technique of pre-registering a reference face and matching a given face to the most similar reference face, and a clustering technique for grouping similar faces when a reference face is not registered.
According to one embodiment of the present invention, the digital content may be classified by person through a cluster grouping in which faces estimated to correspond to the same person are grouped by a clustering technique. Face recognition techniques are described in detail in Korean Unexamined Patent Publication Nos. 2005-0013467, 2006-0008814, and 2007-0047063. Other face recognition and clustering techniques may be employed as well.
Referring to FIG. 4, the face recognition unit 130 includes a face extraction unit 210, a face normalization unit 220, a feature point extraction unit 230, and a face descriptor computation unit 240. The face extraction unit 210 extracts a face region by extracting a face and constituent element of the face from an image included in the digital content. The face extraction unit 210 detects a rough face from the image, extracts eyes, a nose, a mouth, and so forth, which are specified constituent elements of the face, and then extracts the face region based on the extracted face constituent elements. For example, a process of cutting the face region from the input image based on the distance between the two eyes, as a pre-process for face recognition, can minimize the influence of the background of the image, the change of the hair style of the person, and so forth.
The face normalization unit 220 normalizes the size of the face region using the extracted face region information. By normalizing the size of the face region, specified personal appearance of the face region, such as the distance between two eyes, the distance between an eye and a nose, and so forth, can be maintained.
The feature point extraction unit 230 extracts feature points, such as eyes, from the normalized face region. The feature points may be selected as points on the face region that enlarge the difference between images of different input faces, through experiments for a plurality of persons. For example, when applying a Gabor filter to candidate feature points, the points that improve the face recognition capability may be selected as those whose filter responses differ clearly between different face images.
The face descriptor computation unit 240 computes a face descriptor using the extracted feature points. The face descriptor can be computed using PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis), and so forth, as adopted by the MPEG-7 standard. The face descriptor is a value, for example a feature vector or a feature matrix, measured for the face extracted from the image included in the digital content through the use of the extracted feature points. The computation of the face descriptor is described in detail in Korean Unexamined Patent Publication Nos. 2005-0013467, 2006-0008814, and 2007-0047063 noted above. The face descriptor may also be computed by a general technique of recognizing a face through extraction of feature points from the face image, or by other techniques.
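As a rough sketch only, a feature-vector face descriptor in the spirit of the PCA/LDA approaches mentioned above might be computed as follows; the patch sampling, the projection matrix (assumed to be learned offline from training faces), and the function names are illustrative assumptions rather than the embodiment's implementation.

import numpy as np

def compute_face_descriptor(feature_patches, projection):
    """Compute a compact face descriptor from feature-point patches.

    feature_patches : list of small 2-D image patches sampled around the
                      feature points of a normalized face region.
    projection      : (d x k) matrix learned offline, e.g. by PCA or LDA,
                      mapping the raw measurement vector to a k-dim descriptor.
    """
    # Stack the patch samples into a single raw measurement vector of length d.
    raw = np.concatenate([np.asarray(p, dtype=float).ravel() for p in feature_patches])
    # Project into the low-dimensional descriptor space (the feature vector).
    return raw @ projection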
As shown in FIG. 5, the cluster classification unit 150 classifies the digital content by persons through a cluster grouping. The cluster classification unit 150 includes a face descriptor distance comparison unit 310 and a cluster transform unit 320.
The face descriptor distance comparison unit 310 compares the distances between the face descriptors computed by the face recognition unit 130. For example, when the face descriptor is represented as a feature vector or a feature matrix, the face descriptor of an image included in digital content previously saved in the database can be computed through the same process. Accordingly, if two faces are similar to each other, the distance between their feature vectors or feature matrices, regarded as points in a feature space, falls below a threshold value. The face descriptor distance comparison unit 310 computes the distance between the face descriptors and determines whether the distance is below the threshold value. The threshold value corresponds to the maximum distance between face descriptors computed from face images that are determined to correspond to the same person, and can be selected through an analysis or simulation process.
The cluster transform unit 320 classifies the digital content by clusters using the distance information of the face descriptors. The classification is performed such that face images whose face descriptors fall within a specified range are determined to belong to the same cluster, i.e., the same person.
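A minimal sketch of the distance comparison, assuming a Euclidean distance between feature-vector descriptors and an illustrative threshold value (in practice the threshold would be chosen through analysis or simulation, as noted above):

import numpy as np

# Illustrative value only; chosen in practice by analysis or simulation.
SAME_PERSON_THRESHOLD = 0.6

def is_same_person(descriptor_a, descriptor_b, threshold=SAME_PERSON_THRESHOLD):
    """Judge two faces to correspond to the same person if their descriptors are close."""
    distance = np.linalg.norm(np.asarray(descriptor_a, dtype=float)
                              - np.asarray(descriptor_b, dtype=float))
    return distance < threshold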
FIGS. 6 to 9 show examples of cluster grouping performed by a cluster transform unit in a system for saving digital content classified by person-based clustering according to an embodiment of the present invention. The term “cluster grouping” refers to a process of generating, extending, or combining the clusters that represent the range where the face images are judged to correspond to the same person.
In the example shown in FIG. 6, a new face is added to a collection of previously registered faces. The faces may be considered as points in a vector space. If the distance between the points is smaller than a threshold value, the new face can be considered to correspond to the same person. If no previously registered face exists within the threshold distance of the newly added face, as shown in FIG. 6, the newly registered face remains in an unclassified state.
As shown in FIG. 7, if the distance between the face descriptor of the face included in the newly added digital content and the face descriptor previously registered as an unclassified face is within the threshold distance, then a new cluster 500 is generated. Due to the generation of the new cluster, the face previously registered as an unclassified face and the face included in the newly added digital content are estimated to correspond to the same person, and thus the digital content is classified accordingly. In this case, the face of the newly added content may have correlation information with the newly generated cluster, and the faces previously registered as unclassified faces may be updated to be correlated with the newly generated cluster.
In the example shown in FIG. 8, the distance between the face descriptor of the face of the newly added digital content and the face descriptor of the neighboring person (i) classified as an unclassified or specified person may be below the threshold value. In this case, the face of the newly added digital content, as well as the faces of other unclassified digital content, may be classified as the same person as the person (i) to extend the cluster 500.
In the example shown in FIG. 9, a recognized face of new digital content is located between a person (i) and a person (j) classified as different persons in two clusters 470 and 480 for the person (i) and the person (j), respectively. If the distance between the face descriptor of the recognized face of the new digital content and the face descriptor of the person (i) or the person (j) is below the threshold value, the person (i), the person (j), and the recognized face of the new digital content are classified as the same person. By inputting the recognized face of the new digital content, the existing clusters are combined to extend the cluster 500.
As described above, the cluster classification unit 150 can generate, extend, and combine the cluster 500 through cluster grouping. The face of the newly added digital content and the faces previously saved in the database can be classified by person through the cluster grouping performed by the cluster classification unit 150. In addition, the cluster grouping can be applied when digital content is added or deleted. When digital content is deleted, the cluster grouping can be performed in a manner opposite to that shown in FIGS. 6 to 9.
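Reusing is_same_person from the previous sketch, the four cluster-grouping outcomes of FIGS. 6 to 9 (remaining unclassified, cluster generation, cluster extension, and cluster combination) can be sketched roughly as follows; the container layout and names are assumptions made for illustration.

def cluster_grouping(new_face, clusters, unclassified, threshold=SAME_PERSON_THRESHOLD):
    """Classify a newly added face by generating, extending, or combining clusters.

    new_face     : dict with at least a "descriptor" entry
    clusters     : list of clusters; each cluster is a list of face dicts
    unclassified : list of face dicts not yet assigned to any person
    """
    def close(face):
        return is_same_person(new_face["descriptor"], face["descriptor"], threshold)

    # Clusters that contain at least one member within the threshold of the new face.
    near = [c for c in clusters if any(close(m) for m in c)]

    if near:
        # FIG. 8: extend the nearby cluster; FIG. 9: if the new face bridges
        # several clusters, merge them all into the first one.
        target = near[0]
        for other in near[1:]:
            target.extend(other)
        clusters[:] = [c for c in clusters if not any(c is o for o in near[1:])]
    else:
        matches = [f for f in unclassified if close(f)]
        if not matches:
            # FIG. 6: nothing within the threshold; the face stays unclassified.
            unclassified.append(new_face)
            return None
        # FIG. 7: the new face matches previously unclassified faces -> new cluster.
        target = list(matches)
        clusters.append(target)
        unclassified[:] = [f for f in unclassified if not any(f is m for m in matches)]

    target.append(new_face)
    return target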
FIG. 10 shows the data structure update unit 170 in the system 100, according to an embodiment of the present invention. The data structure update unit 170 updates the data structure of the digital content classified by person through the cluster classification unit 150. The data structure update unit 170 uses flags to apply the cluster transforms performed by the cluster classification unit 150, such as cluster generation, deletion, or combination, to the data structure. The data structure update unit 170 includes a flag setting unit 410 and a data structure application unit 420.
The flag setting unit 410 sets flags on the nodes that arrange the digital content. The flags are information reflecting the changes to the nodes caused by the person-based cluster grouping performed by the cluster classification unit 150. The flag setting unit 410 may save the flags in a separate storage device or temporarily in a memory. The data structure application unit 420 automatically updates the data structure according to the set flags.
FIG. 11 shows flag setting performed by the flag setting unit 410 in the system 100, according to an embodiment of the present invention. FIGS. 12 to 15 show processes of updating a data structure performed by a data structure application unit in the system 100, according to an embodiment of the present invention.
For example, the digital content may include photos and have a data structure classified by persons as shown in FIG. 3. The flag may be set to “Modified”, “Added”, or “None”, and other flag values may be defined by a designer or a user. All nodes may have an initial state in which the flag is set to “None”.
As shown in FIG. 11, if “Face3-3”, a new photo to be saved in the database, is judged to correspond to “Person3” by the face recognition unit 130 and the cluster classification unit 150, the “Face3-3” node 1120 is added to the “Person3” node 1100. Accordingly, the flag 1170 in the “Person3” node 1100 is set to “Modified”, and the flag 1170 in the “Face3-3” node 1120 is set to “Added”. Each node may include an information table 1150 for saving correlation information. The flag 1170 may be saved in the information table 1150 or in a separate storage device.
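Continuing the earlier PersonNode/FaceNode sketch (hypothetical names), the flag setting of the FIG. 11 example can be expressed as follows.

def attach_face(person, face):
    """Attach a newly classified face to a person node and set the flags,
    mirroring the FIG. 11 example ("Face3-3" added under "Person3")."""
    face.flag = "Added"        # the face node itself is new
    person.flag = "Modified"   # the person node changed because it gained a child
    person.add_face(face)

# Illustrative usage (the file name and descriptor values are hypothetical):
person3 = PersonNode("Person3")
face3_3 = FaceNode(content_id="photo_0123.jpg", descriptor=[0.12, 0.48, 0.31])
attach_face(person3, face3_3)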
FIG. 12 shows the addition of a new “Face” node 1220. If a new face is added as shown in FIG. 6, but is not judged to be a specified person, the new face may be added to the “Face” node 1220 that is a lower node of the “Unclassified” node 1200. Accordingly, the flag setting unit 410 sets the flag 1170 in the “Unclassified” node 1200 to “Modified”, and the flag in the “Face” node 1220 to “Added”. The data structure application unit 420 updates the data structure according to the set flag 1170 as illustrated in FIG. 12.
FIG. 13 shows the generation of the new cluster 500 and the addition of a new “Person” node 1300, corresponding to the case shown in FIG. 7. As the new cluster 500 is generated, the “Face” nodes 1310, 1320, 1330, and 1220, which are classified as the same person, are arranged under the new “Person” node 1300. Accordingly, the flag setting unit 410 sets the flag in the “Person” node 1300 to “Added”, the flags in the “Face” nodes 1310, 1320, and 1330 that were lower nodes of the existing “Unclassified” node 1200 to “Modified”, and the flag in the new “Face” node 1220 to “Added”, respectively. According to the set flags, the data structure application unit 420 deletes the “Face” nodes 1310, 1320, and 1330 from the “Unclassified” node 1200 and adds them to the new “Person” node 1300. In this manner, the flag setting according to the generation of the new cluster 500 and the update of the data structure through the data structure application unit 420 are performed. Boxes in FIG. 13 indicated by dotted lines refer to nodes that are deleted from their original positions and moved to other locations.
FIG. 14 shows the flag setting and the update of the data structure performed when the cluster is extended as shown in FIG. 8. As a new photo is added, faces of other photos that remained unclassified are classified as the same person. The “Face” nodes 1410 and 1320 under the “Unclassified” node 1200 become lower nodes of the “Person3” node 1400, and the “Face” node 1220 of the new photo is added to the “Person3” node 1400. The flag of the “Person3” node 1400 is set to “Modified”, the flags of the “Face” nodes 1410 and 1320 that are now lower nodes of the “Person3” node 1400 are set to “Modified”, and the flag of the new “Face” node 1220 is set to “Added”. The data structure can then be updated according to the set flags. Boxes in FIG. 14 indicated by dotted lines refer to nodes that are deleted from their original positions and moved to other locations.
FIG. 15 shows the flag setting and the update of the data structure being performed in the event that the clusters are combined as shown in FIG. 9. In this case, as a new photo is added, two persons judged as different persons are combined due to the addition of the new face. For example, if “Person2” and “Person3” are determined to be the same person, two face nodes 1510 and 1520 linked to the “Person2node 1500 are linked to the “Person3node 1400. In addition, the new “Face” node 1220 of the new photo is also added to the “Person3node 1400.
The flag setting unit 410 sets the flags of the “Person3node 1400 and “Face” nodes 1510 and 1520 to “Modified” according to the cluster combination, and sets the flag of the new “Face” node 1220 to “Added”. The data structure can be updated according to the set flag. Boxes indicated by dotted lines refer to boxes that are moved to other locations with the deletion of the nodes.
As described above, the face recognition unit 130 extracts a face descriptor from new content inputted thereto, and the cluster classification unit 150 classifies the digital content by persons through the cluster grouping. The data structure of the digital content classified by person-based clustering can be efficiently updated through the flag setting of the classified digital content. By updating only the changed part of the data structure and saving the updated part of the data structure in the database, the details of the change according to the cluster grouping can be efficiently reflected in the database.
The flags are described above as being set in the nodes. However, according to other aspects of the present invention, the set flags may also be saved in the face and person information tables or in a predefined storage device.
As described above, in an embodiment of the present invention, a plurality of digital content is classified by person-based clustering, and thus a user can easily retrieve and browse the digital content. By variably classifying the digital content by persons through the generation, extension, or combination of the clusters, the classification of the digital content can be performed even without a reference image. In addition, even if digital content is newly added or deleted by the flag setting, such addition or deletion of the digital content can be efficiently reflected in the database.
FIG. 16 shows a process of saving digital content classified by person-based clustering according to an embodiment of the present invention. Using a plurality of digital content saved in a database, a data structure in which faces are arranged by persons is generated at operation S1600. For example, the generated data structure may be a two-dimensional (2D) linked list. Using a schema saved in the database, the data structure for the plurality of digital content previously classified by person-based clustering is generated. The schema may include a content (photo) information table, a person information table, and a face information table.
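As a hedged illustration of what such a schema might look like, the SQLite sketch below defines a content (photo) information table, a person information table, and a face information table linking the two; the table and column names are assumptions, since the embodiment does not specify them.

import sqlite3

# Illustrative schema only; column names are assumptions for this sketch.
conn = sqlite3.connect("contents.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS content_info (
    content_id INTEGER PRIMARY KEY,
    file_path  TEXT,
    created_at TEXT
);
CREATE TABLE IF NOT EXISTS person_info (
    person_id  INTEGER PRIMARY KEY,
    name       TEXT                 -- e.g. 'Person3' or 'Unclassified'
);
CREATE TABLE IF NOT EXISTS face_info (
    face_id    INTEGER PRIMARY KEY,
    content_id INTEGER REFERENCES content_info(content_id),
    person_id  INTEGER REFERENCES person_info(person_id),
    descriptor BLOB                 -- serialized feature vector
);
""")
conn.commit()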
A face descriptor is extracted from a face image included in new digital content to be saved in the database at operation S1610. In order to extract the face descriptor from the face image included in the new digital content, a face is extracted from the face image included in the new digital content, and feature points are extracted from the extracted face to compute the face descriptor. The face descriptor may be a feature vector.
The new digital content and the plurality of digital content saved in the database are classified by person through cluster grouping at operation S1620. To determine whether face images correspond to the same person, the distances between their face descriptors are compared to determine whether the descriptors fall within the same cluster range. Clusters can be generated, extended, or combined through the cluster grouping. As described above, the digital content can be classified by person-based clustering through the cluster grouping even when new digital content is added or existing digital content is deleted.
After the classification of the digital content by person-based clustering, the data structure can be updated by setting flags at operation S1630. For example, if the generated data structure is the 2D linked list, a flag is set to indicate that the location of the node is changed or that a node is newly added through the cluster grouping. The flag may have, for example, one value of “Added”, “Modified”, or “None”. The generated data structure is updated according to the set flag. As the data structure is updated, a new node may be generated or the location of the existing node may be changed.
After the data structure is updated, the digital content is updated and saved in the database according to the updated data structure at operation S1640. The update and saving of the digital content in the database can be performed through the update of the schema of the database. Accordingly, using the updated schema, the saved digital content can be easily browsed and retrieved by persons.
As described above, when adding or deleting digital content, the digital content can be classified by person-based clustering, and such classification of the digital content can be efficiently reflected in the database. In addition, the classification of the digital content by person-based clustering can be automatically performed through the generation, extension, or combination of the clusters.
FIG. 17 shows a process of updating a data structure when saving digital content classified by person-based clustering according to an embodiment of the present invention. In one example, the digital content is a photo, and a 2D linked list for linking face nodes based on person nodes, as shown in FIG. 3, is generated as the data structure. However, aspects of the present invention may also be applied to other types of digital content, as discussed above.
In updating the data structure according to the flag setting, a first person node is selected from the linked list at operation S1700. Whether the flag of the first person node is set to “Added” or “Modified” is determined at operation S1710. If the flag is not set to “Added”, whether the flag is set to “Modified” is determined at S1720. If the flag is not set to “Modified”, whether the present person node is the last person node is determined at operation S1760. If the present person node is not the last person node, the next person node is selected at operation S1770.
If the flag is set to “Added”, this indicates that a new person is registered, and thus a person node is added at operation S1730. If the flag is not set to “Added” but is set to “Modified”, a face node is selected under the corresponding person node at operation S1740. The flag setting is determined as the face node is scanned, and the face node is modified or added according to the set flag at operation S1750. If the scanning of the face nodes is completed, whether the present person node is the last person node is determined at operation S1760, and if the present person node is the last person node, the update of the data structure ends.
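Using the earlier PersonNode/FaceNode sketch, the flag-driven traversal of FIG. 17 might look roughly as follows; the database-side operations are stood in for by print statements.

def update_data_structure(person_head):
    """Walk the person list and apply flag-driven updates (FIG. 17 flow)."""
    person = person_head                    # S1700: start from the first person node
    while person is not None:
        if person.flag == "Added":
            # S1730: a newly registered person; persist the new person node.
            print(f"add person node: {person.name}")
        elif person.flag == "Modified":
            # S1740-S1750: scan this person's face nodes and apply their flags.
            face = person.faces
            while face is not None:
                if face.flag == "Added":
                    print(f"add face node {face.content_id} under {person.name}")
                elif face.flag == "Modified":
                    print(f"relink face node {face.content_id} under {person.name}")
                face.flag = "None"          # reset once the change is reflected
                face = face.next
        person.flag = "None"
        person = person.next                # S1760/S1770: continue to the last person node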
In the linked-list data structure, the data structure is thus updated according to the flag settings, and the details of the change are reflected in the database. Accordingly, the digital content saved in the database can be classified by person-based clustering and efficiently managed.
As described above, according to aspects of the present invention, in a variable environment where digital content is added or deleted, the digital content can be classified by person-based clustering, and the classified digital content can be efficiently reflected in the database. In addition, since a plurality of digital content is classified by person-based clustering in the environment where the digital content is added or deleted, a user can easily retrieve and browse the digital content. Further, since the digital content is variably classified by person-based clustering through generation, extension, or combination of clusters, the classification of the digital content by person-based clustering can be performed even without a reference image.
Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in this embodiment without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (35)

1. A system for saving digital content classified by person-based clustering, comprising:
a database located on a non-transitory computer readable medium to save a plurality of digital content classified by person-based clustering;
a data structure generation unit located on the non-transitory computer readable medium to generate a data structure including a plurality of nodes using the plurality of digital content;
a face recognition unit located on the non-transitory computer readable medium to extract a face descriptor of new digital content to be saved in the database;
a cluster classification unit located on the non-transitory computer readable medium to classify the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and
a data structure update unit located on the non-transitory computer readable medium to update the data structure according to the classification.
2. The system of claim 1, wherein the face recognition unit comprises:
a face extraction unit to extract a face region from an image of a person in the new digital content; and
a feature point extraction unit to extract feature points from the extracted face region.
3. The system of claim 1, wherein the face descriptor is a feature vector extracted from an image of a person in the digital content.
4. The system of claim 3, wherein the cluster classification unit classifies the new digital content and the plurality of digital content by the person-based clustering through comparison of feature vector distances using the feature vectors.
5. The system of claim 1, wherein the cluster classification unit classifies the digital content using cluster generation, cluster extension, or cluster combination.
6. The system of claim 1, wherein the data structure is a two-dimensional linked list data structure that comprises person nodes linked together by persons and face nodes linked to the person nodes to save face information of the digital content that is determined to be the same face.
7. The system of claim 1, wherein the data structure update unit updates the data structure of the new digital content and the plurality of the digital content by setting flags of person nodes and face nodes.
8. The system of claim 7, wherein the flag has a value of “Added”, “Modified”, or “None”.
9. The system of claim 1, wherein the database saves the new digital content and the plurality of digital content according to a database schema that includes a content information table, a person information table, and a face information table.
10. The system of claim 1, wherein the database updates and saves the new digital content and the plurality of digital content by reflecting the updated data structure.
11. A system for saving digital content classified by person-based clustering, comprising:
a data structure generation unit located on one or more processors to generate a data structure including a plurality of nodes using a plurality of digital content;
a cluster classification unit located on the one or more processors to classify the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted;
a data structure update unit located on the one or more processors to update the data structure according to the classification; and
a database located on the one or more processors to update and save the remaining digital content by reflecting the updated data structure.
12. The system of claim 11, wherein the data structure update unit updates the data structure of the new digital content and the plurality of the digital content by setting flags of person nodes and face nodes.
13. The system of claim 12, wherein the flag has a value of “Added”, “Modified”, or “None”.
14. The system of claim 11, wherein the database saves the remaining digital content according to a database schema that is provided with a content information table, a person information table, and a face information table.
15. A method of saving digital content classified by person-based clustering, comprising:
generating, on a non-transitory computer readable medium, a data structure composed of a plurality of nodes using a schema of a database that stores a plurality of digital content;
extracting, on the non-transitory computer readable medium, a face descriptor of new digital content to be saved in the database;
classifying, on the non-transitory computer readable medium, the new digital content and the plurality of digital content by the person-based clustering using the extracted face descriptor; and
updating, on the non-transitory computer readable medium, the data structure according to the classification.
16. The method of claim 15, wherein the schema comprises a content information table, a person information table, and a face information table.
17. The method of claim 15, further comprising updating and saving the new digital content and the plurality of digital content in the database by reflecting the updated data structure.
18. The method of claim 15, wherein the extracting comprises:
extracting a face region from the new digital content; and
computing a feature vector by extracting feature points from the extracted face region;
wherein the classifying comprises determining whether the face descriptor is within a cluster range through comparison of distances between the computed feature vector and feature vectors of faces included in the plurality of digital content using the computed feature vector.
19. The method of claim 15, wherein the classifying comprises classifying the digital content by cluster generation, cluster extension, or cluster combination.
20. The method of claim 15, wherein the updating comprises updating the data structure of the new digital content and the plurality of the digital content by setting flags of person nodes and face nodes.
21. The method of claim 20, wherein the flag has one value of “Added”, “Modified”, and “None”.
22. A computer readable recording medium recorded with computer-readable instructions to cause a computer to execute the method of claim 15.
23. A method of saving digital content classified by person-based clustering, comprising:
generating, on a processor, a data structure including a plurality of nodes using a plurality of digital content;
classifying, on the processor, the remaining digital content by the person-based clustering using a face descriptor if a part of the plurality of digital content is deleted;
updating, on the processor, the data structure according to the classification; and
updating and saving, on the processor, the remaining digital content in a database by reflecting the updated data structure.
24. The method of claim 23, wherein the classifying comprises classifying the remaining digital content by cluster generation, cluster extension, or cluster combination.
25. The method of claim 23, wherein the updating comprises updating the data structure of the remaining digital content by setting flags of person nodes and face nodes.
26. A multimedia device to automatically and efficiently classify content stored therein, comprising:
a storage unit located on a processor to store a plurality of digital content;
a data structure generation unit located on the processor to generate a data structure having a plurality of nodes, based on the content stored in the storage unit;
an extraction unit located on the processor to extract a descriptor of the new digital content to be stored in the storage unit;
a cluster classification unit located on the processor to classify the new digital content and the plurality of digital content using a clustering scheme and the extracted descriptor; and
a data structure update unit located on the processor to update the data structure based on the classification.
27. The multimedia device of claim 26, wherein the cluster classification reclassifies the plurality of digital content when digital content is deleted from the storage unit.
28. The multimedia device of claim 26, wherein the data structure update unit only updates changed portions of the data structure.
29. The multimedia device of claim 26, wherein the cluster classification unit classifies the new digital content through one of generating of a plurality of the clusters, extending of the plurality of the clusters, combining of the plurality of the clusters, or a combination thereof, without using a reference value.
30. The multimedia device of claim 29, wherein the reference value is a reference image.
31. The multimedia device of claim 26, wherein the cluster classification unit classifies the plurality of digital content into clusters, determines a distance between the descriptor and each of the clusters, and classifies the new digital content based on the distance.
32. The multimedia device of claim 31, wherein the cluster classification unit classifies the new digital content as part of a cluster if the descriptor is within a predetermined distance from the cluster, and classifies the new digital content as unclassified if the descriptor is not within the predetermined distance from any of the clusters.
33. The multimedia device of claim 31, wherein, if the descriptor is within a predetermined distance from one or more unclassified descriptors corresponding to one or more of the plurality of digital content, the cluster classification unit generates a new cluster including the new digital content and the one or more of the plurality of digital content corresponding to the unclassified descriptors.
34. The multimedia device of claim 31, wherein, if the descriptor is within a predetermined distance from a cluster and within a predetermined distance from one or more unclassified descriptors corresponding to one or more of the plurality of digital content, the cluster classification unit classifies the new digital content and the one or more of the plurality of digital content corresponding to the unclassified descriptors as part of the cluster.
35. The multimedia device of claim 31, wherein, if the descriptor is within a predetermined distance from two or more clusters, the cluster classification unit combines the two clusters into a single cluster and classifies the new digital content as part of the single cluster.
US12/025,109 2007-07-24 2008-02-04 System and method of saving digital content classified by person-based clustering Active 2030-08-12 US8036432B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/025,109 US8036432B2 (en) 2007-07-24 2008-02-04 System and method of saving digital content classified by person-based clustering

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US95156807P 2007-07-24 2007-07-24
KR2007-107427 2007-10-24
KR1020070107427A KR101428715B1 (en) 2007-07-24 2007-10-24 System and method for saving digital contents classified with person-based clustering
KR10-2007-107427 2007-10-24
US12/025,109 US8036432B2 (en) 2007-07-24 2008-02-04 System and method of saving digital content classified by person-based clustering

Publications (2)

Publication Number Publication Date
US20090028393A1 US20090028393A1 (en) 2009-01-29
US8036432B2 true US8036432B2 (en) 2011-10-11

Family

ID=40489957

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/025,109 Active 2030-08-12 US8036432B2 (en) 2007-07-24 2008-02-04 System and method of saving digital content classified by person-based clustering

Country Status (4)

Country Link
US (1) US8036432B2 (en)
EP (2) EP2168064A4 (en)
KR (1) KR101428715B1 (en)
WO (1) WO2009014323A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080187231A1 (en) * 2005-03-10 2008-08-07 Koninklijke Philips Electronics, N.V. Summarization of Audio and/or Visual Data
KR101658413B1 (en) * 2009-09-22 2016-09-22 삼성전자주식회사 Apparatus and method for extracting character information in a motion picture
KR101747299B1 (en) * 2010-09-10 2017-06-15 삼성전자주식회사 Method and apparatus for displaying data object, and computer readable storage medium
US9342569B2 (en) * 2010-12-15 2016-05-17 Sap Se System and method of adding user interface element groups
WO2012140315A1 (en) * 2011-04-15 2012-10-18 Nokia Corporation Method, apparatus and computer program product for providing incremental clustering of faces in digital images
US9552376B2 (en) 2011-06-09 2017-01-24 MemoryWeb, LLC Method and apparatus for managing digital files
KR101180471B1 (en) * 2011-09-27 2012-09-07 (주)올라웍스 Method, apparatus and computer-readable recording medium for managing reference face database to improve efficiency of face recognition using small-capacity memory
KR101498944B1 (en) * 2011-11-28 2015-03-06 세종대학교산학협력단 Method and apparatus for deciding product seller related document
JP5836095B2 (en) * 2011-12-05 2015-12-24 キヤノン株式会社 Image processing apparatus and image processing method
US20140015855A1 (en) * 2012-07-16 2014-01-16 Canon Kabushiki Kaisha Systems and methods for creating a semantic-driven visual vocabulary
CN102917126A (en) * 2012-10-11 2013-02-06 中兴通讯股份有限公司 Method and system for processing photos
KR101297736B1 (en) * 2013-05-07 2013-08-23 주식회사 파이브지티 Face recognition method and system
CA2839761A1 (en) * 2014-01-20 2015-07-20 Yp-It Ltd. Content digitization and digitized content characterization systems and methods
US9519826B2 (en) * 2014-05-08 2016-12-13 Shutterfly, Inc. Automatic image product creation for user accounts comprising large number of images
CN105678127A (en) * 2014-11-21 2016-06-15 阿里巴巴集团控股有限公司 Verification method and device for identity information
US9785699B2 (en) * 2016-02-04 2017-10-10 Adobe Systems Incorporated Photograph organization based on facial recognition
US10403016B2 (en) * 2017-06-02 2019-09-03 Apple Inc. Face syncing in distributed computing environment
KR102060110B1 (en) * 2017-09-07 2019-12-27 네이버 주식회사 Method, apparatus and computer program for classifying object in contents
CN109102010B (en) * 2018-07-27 2021-06-04 北京以萨技术股份有限公司 Image classification method based on bidirectional neural network structure

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5774576A (en) * 1995-07-17 1998-06-30 Nec Research Institute, Inc. Pattern recognition by unsupervised metric learning
KR19980024383A (en) 1996-09-05 1998-07-06 겐마 아키라 Classification method of facial features and map showing facial features
US20030210808A1 (en) * 2002-05-10 2003-11-13 Eastman Kodak Company Method and apparatus for organizing and retrieving images containing human faces
KR20040039788A (en) 2002-11-04 2004-05-12 삼성전자주식회사 System and method for detecting face
JP2004305560A (en) 2003-04-09 2004-11-04 Glory Ltd System for managing game customer information
KR20050013467A (en) 2003-07-28 2005-02-04 삼성전자주식회사 Method for feature extraction using locally linear transformations, and method and apparatus for image recognition employing the same
KR20060008814A (en) 2004-07-24 2006-01-27 삼성전자주식회사 Method for recognizing face image of various pose
JP2006092214A (en) 2004-09-22 2006-04-06 Fuji Xerox Co Ltd Program for supporting compilation of classification structure of document information, information classification structure compilation support method, and information classification structure compilation support device
US7085783B1 (en) 1999-11-22 2006-08-01 Kabushiki Kaisha Toshiba Electronic catalog maintenance system for enabling out-of-standard electronic catalog changes
JP2006243849A (en) 2005-02-28 2006-09-14 Toshiba Corp Equipment control device and its method
JP2006323621A (en) 2005-05-19 2006-11-30 Noritsu Koki Co Ltd Electronic album system
US7152097B1 (en) * 2000-06-19 2006-12-19 Diskxpress Us, Inc. System and methods of updating compact discs and graphical user interface for updating same
US7187786B2 (en) 2002-04-23 2007-03-06 Samsung Electronics Co., Ltd. Method for verifying users and updating database, and face verification system using the same
KR20070047063A (en) 2005-11-01 2007-05-04 삼성전자주식회사 Method and apparatus for reducing the number of photo in photo album, semi automatic enrollment method and apparatus of photo album, and photo album system using them
US7551755B1 (en) * 2004-01-22 2009-06-23 Fotonation Vision Limited Classification and organization of consumer digital images using workflow, and face detection and recognition

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6819783B2 (en) * 1996-09-04 2004-11-16 Centerframe, Llc Obtaining person-specific images in a public venue
US7565139B2 (en) * 2004-02-20 2009-07-21 Google Inc. Image-based search engine for mobile phones with camera
WO2005096213A1 (en) * 2004-03-05 2005-10-13 Thomson Licensing Face recognition system and method
US7734067B2 (en) * 2004-12-07 2010-06-08 Electronics And Telecommunications Research Institute User recognition system and method thereof

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090263007A1 (en) * 2005-07-26 2009-10-22 Sharp Kabushiki Kaisha Stereoscopic image recording device and program
US20100098341A1 (en) * 2008-10-21 2010-04-22 Shang-Tzu Ju Image recognition device for displaying multimedia data
US20100226584A1 (en) * 2009-03-06 2010-09-09 Cyberlink Corp. Method of Grouping Images by Face
US8121358B2 (en) * 2009-03-06 2012-02-21 Cyberlink Corp. Method of grouping images by face
US20100238191A1 (en) * 2009-03-19 2010-09-23 Cyberlink Corp. Method of Browsing Photos Based on People
US8531478B2 (en) 2009-03-19 2013-09-10 Cyberlink Corp. Method of browsing photos based on people
US11295114B2 (en) 2014-04-28 2022-04-05 Microsoft Technology Licensing, Llc Creation of representative content based on facial analysis
US11282249B2 (en) * 2015-06-05 2022-03-22 International Business Machines Corporation System and method for perspective preserving stitching and summarizing views

Also Published As

Publication number Publication date
US20090028393A1 (en) 2009-01-29
EP4113325A2 (en) 2023-01-04
EP2168064A1 (en) 2010-03-31
EP2168064A4 (en) 2012-08-15
EP4113325A3 (en) 2023-01-25
KR101428715B1 (en) 2014-08-11
KR20090010855A (en) 2009-01-30
WO2009014323A1 (en) 2009-01-29

Similar Documents

Publication Publication Date Title
US8036432B2 (en) System and method of saving digital content classified by person-based clustering
US10346677B2 (en) Classification and organization of consumer digital images using workflow, and face detection and recognition
US9430719B2 (en) System and method for providing objectified image renderings using recognition information from images
US7564994B1 (en) Classification system for consumer digital images using automatic workflow and face detection and recognition
US7555148B1 (en) Classification system for consumer digital images using workflow, face detection, normalization, and face recognition
US9875303B2 (en) System and process for building a catalog using visual objects
US7551755B1 (en) Classification and organization of consumer digital images using workflow, and face detection and recognition
US8897505B2 (en) System and method for enabling the use of captured images through recognition
US7587068B1 (en) Classification database for consumer digital images
US7809192B2 (en) System and method for recognizing objects from images and identifying relevancy amongst images and information
US8027549B2 (en) System and method for searching a multimedia database using a pictorial language
US7809722B2 (en) System and method for enabling search and retrieval from image files based on recognized information
US8107689B2 (en) Apparatus, method and computer program for processing information
Zhang et al. Efficient propagation for face annotation in family albums
WO2012073421A1 (en) Image classification device, image classification method, program, recording media, integrated circuit, and model creation device
JP4336813B2 (en) Image description system and method
Cavalcanti et al. A survey on automatic techniques for enhancement and analysis of digital photography
Wong et al. A Knowledge framework for histogram-based image retrieval

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, YONG-SUNG;KIM, DUCK-HOON;PARK, HEE-SEON;AND OTHERS;REEL/FRAME:020494/0270

Effective date: 20080114

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12