WO2001029769A2 - Method and apparatus for aligning and comparing images of the face and body from different imagers - Google Patents


Info

Publication number
WO2001029769A2
Authority
WO
WIPO (PCT)
Prior art keywords
minutiae
image
images
infrared
face
Prior art date
Application number
PCT/US2000/041320
Other languages
French (fr)
Other versions
WO2001029769A9 (en)
WO2001029769A3 (en)
Inventor
Francine J. Prokoski
Original Assignee
Prokoski Francine J
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Prokoski Francine J filed Critical Prokoski Francine J
Priority to JP2001532489A priority Critical patent/JP2003512684A/en
Priority to EP00986798A priority patent/EP1194893A4/en
Priority to CA002354594A priority patent/CA2354594A1/en
Publication of WO2001029769A2 publication Critical patent/WO2001029769A2/en
Publication of WO2001029769A3 publication Critical patent/WO2001029769A3/en
Publication of WO2001029769A9 publication Critical patent/WO2001029769A9/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/41 Detecting, measuring or recording for evaluating the immune or lymphatic systems
    • A61B5/414 Evaluating particular organs or parts of the immune or lymphatic systems
    • A61B5/415 Evaluating particular organs or parts of the immune or lymphatic systems the glands, e.g. tonsils, adenoids or thymus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons
    • A61B5/1171 Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1176 Recognition of faces
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/41 Detecting, measuring or recording for evaluating the immune or lymphatic systems
    • A61B5/414 Evaluating particular organs or parts of the immune or lymphatic systems
    • A61B5/418 Evaluating particular organs or parts of the immune or lymphatic systems lymph vessels, ducts or nodes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Immunology (AREA)
  • Vascular Medicine (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Endocrinology (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A method and apparatus for comparing (26) an infrared image of a person (12) to a database of visual images of persons (6) and calculating the probability that each is a match (42) to the infrared image is characterized by extracting minutiae from the infrared image and extracting visible minutiae from the visible images. Coincident minutiae which occur in both spectra are used to scale and register the infrared and the visible images. Other minutiae are spectrum-dependent (8), but must obey rules relative to minutiae of the other spectrum, due to the anatomical structure (34) of the human face and body. The primary application is for identification of persons seen in infrared surveillance imagery, using a reference database of visual images. Other applications include compression of talking head video and animation of synthetic faces. The method and apparatus can also be applied to areas of the body other than the face, to compare images from different spectra including images from medical sensors.

Description

METHOD AND APPARATUS FOR ALIGNING AND COMPARING IMAGES OF THE FACE AND BODY FROM DIFFERENT IMAGERS
This application claims the benefit of provisional application No. 60/105,217, filed October 22, 1998.
BACKGROUND OF THE INVENTION
There are common features between infrared (IR) and visual images of the human body. Using the face as an example, head shape and size, and the relative location, shape, and size of features such as the eyes, mouth, and nostrils are the same in both imaging modes. A database of images can be segmented into classes using metrics derived from those common features, and the same classification will be obtained from either visual or infrared images. Height can also be used as a classification measure when it can be inferred from the collected image or from separate sensor data. An infrared image of an unclothed area of the body, such as the face, presents much more detailed and person-specific information than does a visible image. However, visible images are more commonly collected, and large historical databases of visual images exist. It is therefore desirable to automate a process for comparing imagery from both the visual and infrared modes.
Infrared images are unique to each person, even for identical twins. Visual images are not unique because many people look similar and can disguise themselves to look enough like one another that an automated identification system cannot distinguish them. Therefore, in a large database, it is not possible to automatically perform a one-to-one linkage between infrared and visual images because the visual images are not sufficiently unique. However, for each infrared image, an automated system can eliminate all visual images which cannot be a match due to insufficient correspondence between minutiae characteristics. In general, it is estimated that more than 95% of a visual database can be eliminated as a match to a given infrared image. This has application to the use of infrared surveillance cameras to identify wanted persons for whom only visual images are on file. The infrared-visual matching system compares each person it sees in infrared and classifies him as either a possible match to someone on the visual image watch list or not a match. Persons who are possible matches can then receive greater attention from immigration or security authorities. This allows the use of infrared surveillance imagery to proceed without waiting until a large database of infrared images is established.
The use of infrared imagery also provides for the detection of disguises, whether worn or surgical, which may not be detectable from visible imagery. For example, artificial facial hair such as a mustache is readily detectable in an infrared image although it appears natural in visible images. The fact that infrared surveillance imagery shows a man with a fake mustache provides a clue to consider in matching against a visible image database. Surgical disguises such as a face lift leave telltale short- and longer-term variations in the facial thermogram, while the visual image may appear to be a different person and show no sign of surgery. The ability to detect in IR images that surgical changes have been made to a particular area of the face permits an automated system to broaden the parameters for searching for possible matching visual images in a historical database.
High definition visual images of the face and body are routinely produced and stored for medical, diagnostic and forensic use. Common examples are the photographing of criminal suspects through booking stations producing "mug shots", driver's license photographs produced by each state, and passport photos used by the State Department.
Many such large facial image databases exist, in hardcopy and in electronic form, and there is increasing research ongoing into automated matching of newly taken images with those databases. For example, there are frequent attempts to match surveillance images of a person using a stolen credit card at an ATM with photographs of persons previously convicted of similar crimes. Visual imagery, particularly from surveillance cameras, is often of poor quality due to dim illumination at the scene. Low light level or infrared cameras are expected to become more widely used for surveillance as their cost diminishes. There is therefore a need to correlate between newly acquired infrared images and existing databases of video images. Even in the future, when simultaneous collection of video and IR images will generate correlated databases, there will always be a need to match images taken in one spectral domain with images taken in another. This can include matching images taken in one IR band (such as 3-5 micron) with images taken in another IR band (such as 8-12 micron). Since IR cameras are passive, emitting no radiation and therefore presenting no health hazards, they may be used in conjunction with other imaging medical devices such as x-ray, sonogram, CAT scan devices, etc. Minutiae derived from the IR image may then be superimposed or annotated onto the resulting medical image. This presents a standard technique for generating standardized reference points on all medical imagery. Subsequently, the method and apparatus of this invention can be used to search a database of annotated medical images to find a match with a current IR image or current medical image annotated with IR minutiae.
Regions of Interest (ROI) may be utilized instead of minutiae, where the ROI may be elemental or other shapes including fractal or wavelet-derived structures, segments of blood vessels, locations underneath or otherwise relative to tattoos, moles, freckles, or other distinguishable features, or wiremesh or finite elements used for thermodynamic or visible modeling of the body. Rules may relate the shapes and positions of such elements, their centroids and other features. Time sequences of minutiae and ROIs may be compared, with the decision as to a possible match made on the basis of cumulative thresholds and rule tolerances over the sequence. Facial expression and speech modeling has application to synthetic videoconferencing and face animation. Substantial bandwidth and storage reduction can result. Use of IR minutiae offers more precise modeling than current use of visual images. The present invention provides a technique by which IR images can be tied to the visual image being displayed.
BRIEF DESCRIPTION OF THE PRIOR ART
The identification of persons from infrared images is known in the art as evidenced by the Prokoski et al U.S. patent No. 5,163,094, which discloses a method and apparatus for analyzing closed thermal contours, called "elemental shapes", which are created by the vascular system interacting with the anatomical structure. Fifty or more elemental shapes can be identified, for example, in a human face imaged with an IR camera which has an NETD (noise equivalent thermal difference) of 0.07°C and a spatial resolution of 256x256 pixels. Characteristics of those shapes, such as the centroid location and ratio of area to perimeter, remain relatively constant regardless of the absolute temperature of the face, which varies with ambient and physiological conditions. Two infrared images are compared by comparing the characteristics of corresponding shapes. A distance metric is defined and calculated for each pair of images. If the value is within a threshold, the two images are considered to be from the same person.
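The elemental-shape comparison summarized above can be illustrated with a short sketch. The following Python fragment is a minimal rendering, assuming scipy is available for connected-component labeling; the perimeter estimate, the nearest-centroid pairing, and the function names are illustrative assumptions, not the patented algorithm.

    import numpy as np
    from scipy import ndimage

    def shape_features(thermal, threshold):
        """Label closed regions hotter than `threshold`; return per-shape
        (centroid_y, centroid_x, area/perimeter) descriptors."""
        labels, n = ndimage.label(thermal > threshold)
        feats = []
        for i in range(1, n + 1):
            region = labels == i
            area = region.sum()
            # Crude perimeter: region pixels with at least one 4-neighbour outside.
            eroded = ndimage.binary_erosion(region)
            perimeter = max((region & ~eroded).sum(), 1)
            cy, cx = ndimage.center_of_mass(region)
            feats.append((cy, cx, area / perimeter))
        return np.array(feats)

    def shape_distance(feats_a, feats_b):
        """Pair each shape in A with its nearest centroid in B and average the
        differences of the area/perimeter ratios (one plausible distance metric)."""
        total = 0.0
        for cy, cx, ratio in feats_a:
            d = np.hypot(feats_b[:, 0] - cy, feats_b[:, 1] - cx)
            total += abs(ratio - feats_b[int(d.argmin()), 2])
        return total / max(len(feats_a), 1)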
In the Prokoski et al U.S. patent application S/N 08/514,456, there is disclosed a method and apparatus for extracting and comparing thermal minutiae corresponding to specific vascular and other subsurface anatomical locations from two images. Minutiae may be derived from thermal contours, or may be absolutely associated with specific anatomical locations which can be seen in the thermal image, such as the branching of blood vessels. Each minutia is then associated with a relative position in the image and with characteristics such as apparent temperature, the type of branching or other anatomical feature, vector directions of the branching, and its relation to other minutiae. The comparison of thermal minutiae from two facial images is analogous to the comparison of sets of fingerprint minutiae, in that two images are said to identify the same person if a significant subset of the two sets are found to correspond sufficiently in relative positions and characteristics. Classification of the facial thermograms can be performed to partition a database and reduce the search for matching facial patterns. Alternately, encoding of the minutiae patterns offers a unique FaceCode which may be repeatably derived from each person, minimizing the need for searching a database. Infrared imaging can be used to locate minutiae points over the entire body surface which correspond to specific anatomical locations such as intersection points and branch points of the underlying blood vessels. The thermal minutiae technique and apparatus utilizes a built-in set of whole-body registration points viewable in IR on the face and body surface. The registration points can then be used to compare infrared images taken with different equipment at different times of different people and under different conditions to facilitate comparison of those images.
The IR camera is totally passive, emitting no energy or other radiation of its own, but merely collecting and focusing the thermal radiation spontaneously and continuously emitted from the surface of the human body. Current IR cameras operating in the mid to long wavelength region of 3-12 microns record patterns caused by superficial blood vessels which lie up to 4 cm below the skin surface. Future cameras will have increased sensitivity which will translate into even more defined minutiae. With current IR cameras, approximately 175 thermal facial minutiae may be identified in thermal images from superficial blood vessels in the face. More than 1000 thermal minutiae may be identified over the whole body surface. Using more sensitive infrared cameras, additional minutiae from deeper vascular structures may be identified in the thermal images. The normal body is basically thermally bilaterally symmetric. Side to side variations are typically less than 0.25 degrees Celsius. This fact is used in assigning axes to the body's image. Where the skin surface is unbroken, there is a gradual variation of temperatures across blood vessels, with the highest temperatures across the body surface being directly on top of major blood vessels. Major thermal discontinuities occur at entrances to body cavities such as the eye sockets, nostrils, or mouth. These provide global reference points for automatic orientation of the thermal image. Local and relatively minor discontinuities in the skin surface occur at scars, moles, burns, and areas of infection. The thermal surface can be distorted through pressures and activities such as eating, exercising, wearing tight hats and other clothing, sinus inflammation, infection, weight gain and loss, and body position. However, the minutiae points remain constant with respect to their position relative to the underlying anatomy.
The technique for thermal minutiae extraction and matching can be summarized as follows: 1. The current thermal image is digitized.
2. The current image is divided into pixels, where the size of the pixel relates to the resolution or quality of the result desired.
3. Certain pixels are selected as minutiae points.
4. Each minutia is assigned characteristics such as one or more vectors having magnitude and directional information in relation to the surrounding areas of the thermal image about that minutia, absolute or relative temperature at or around the minutia location, shape of the surrounding thermal area or areas, curvature of the related shape or shapes, size of the surrounding shape or shapes, location of the minutia relative to the body, distance to other minutiae, vector length and direction to other minutiae, number of crossings of thermal contours between it and other minutiae, number of other minutiae within a certain range and direction, the type of minutiae such as the apparent end point of a blood vessel, a point of maximum curvature of a thermal contour, all points on an anatomical element such as a blood vessel which can be distinguished by thresholding or range gating or focusing the thermal camera or image, the centroid of a lymph node, or the centroid or other reference of an anatomical structure with distinguishing thermal capacitance. Either active or passive infrared imaging can be used. For active imaging, the subject can be subjected to heat or cold by external application of hot or cold air, illumination, dehumidification, ingestion of hot or cold foodstuffs, or ingestion of materials which cause vasodilation or vasoconstriction.
5. A set of minutiae characteristics of the current image is compared by computer to the set of minutiae characteristics of other images.
6. The comparison results are used to determine corresponding minutiae from the two images, and to morph or mathematically adjust one image with respect to the other to facilitate comparison.
7. The differences between the current image and database images are computed for the entire image or for areas of interest.
8. The differences are compared to a threshold and image pairs which exceed the threshold are considered impossible matches. (A toy rendering of these eight steps appears below.)
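The eight steps can be read as a single pipeline. The following is a toy, self-contained Python rendering of that pipeline; it stands in bare local maxima for the vascular minutiae described later, and its quantization, threshold, and distance measure are assumptions made only for illustration.

    import numpy as np

    def digitize(img, levels=16):
        """Steps 1-2: quantize the thermal image to a fixed grey scale."""
        lo, hi = img.min(), img.max()
        return np.round((img - lo) / max(hi - lo, 1e-9) * (levels - 1))

    def local_maxima(img):
        """Step 3: pick pixels hotter than all 4-neighbours as stand-in minutiae."""
        p = np.pad(img, 1, constant_values=-np.inf)
        core = p[1:-1, 1:-1]
        hot = (core > p[:-2, 1:-1]) & (core > p[2:, 1:-1]) \
            & (core > p[1:-1, :-2]) & (core > p[1:-1, 2:])
        return np.argwhere(hot)

    def impossible_match(current, candidate, threshold=2.0):
        """Steps 5-8: compare minutiae sets; pairs whose mean nearest-neighbour
        distance exceeds `threshold` pixels are declared impossible matches."""
        a = local_maxima(digitize(current))
        b = local_maxima(digitize(candidate))
        if len(a) == 0 or len(b) == 0:
            return True
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).min(axis=1)
        return d.mean() > threshold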
Infrared facial minutiae may be derived from elemental shapes (such as by using the centroids of each shape or the zero locations resulting from wavelet compression and expansion). Particularly when high quality infrared images are used, absolute minutiae can be directly extracted without the computationally intensive analysis required for template or shape comparisons.
It is also known in the prior art to compare visible images through fiducial points involving definition of face metrics which may be considered to have aspects in common with the present invention. For example, the Tal U.S. patent No. 4,975,969 discloses a method and apparatus for uniquely identifying individuals by measurement of particular physical characteristics viewable by the naked eye or by imaging in the visible spectrum. Tal defined facial parameters which are the distances between identifiable parameters on the human face, and/or ratios of the facial parameters, and used them to identify an individual since he claims that the set of parameters for each individual is unique. Particular parameters such as the distance between the eye retinas, the distance from each eye retina to the nose bottom and to the mouth center, and the distance from the nose bottom to the mouth center are set forth, as they may be particularly defined due to the shadowed definable points at each end.
The approach disclosed in the Tal patent utilizes visible features on the face from which a unique set of measurements and ratios allegedly can be developed for each individual. This approach is not particularly satisfactory, nor does it pertain to identical twins. In addition, the "rubber sheeting" effect caused by changes in facial expression, and the aging effects which cause lengthening of the nose, thinning of the lips, wrinkles, and deepening of the creases on the sides of the nose, would all cause changes in the parameters and in their ratios. Therefore, very few measurements which can be made on a human face are constant over time, and the paucity of such constant measurements makes it improbable that facial metrics in visible images can be useful for identification of sizable populations. The Tal patent does not deal with comparison of images from other than visible detectors, and so does not consider the specific focus of the present invention, which is the comparison of images from different spectral bands. Moreover, the Tal patent does not specifically caution about varying lighting conditions, which could severely limit the utility of the technique, even for classification.
Visible face metrics may be useful as a classification technique, but the visible features can be modified cosmetically or surgically without detection, resulting in misclassification. By contrast, the technique of the present invention utilizes hidden micro parameters which lie below the skin surface, and which cannot be forged. The current patent's use of underlying features which are fixed into the face at birth and remain relatively unaffected by aging provides for less inherent variability in the values of the parameters over time than is provided by the prior art.
Visible metrics require ground truth distance measurements unless they rely strictly upon ratios of measurements. They can be fooled by intentional disguises, and they are subject to variations caused by facial expressions, makeup, sunburns, shadows and similar unintentional disguises. Detecting disguises and distinguishing between identical twins may or may not be possible from visible imagery if sufficient resolution and controlled lighting are available. However, the level of resolution which may be required significantly increases the computational complexity of the identification task, and makes the recognition accuracy vulnerable to unintentional normal variations.
The use of eigenanalysis of visual faces to develop a set of characteristic features is disclosed in Pentland (MIT Media Laboratory Perceptual Computing Section, Technical Report No. 245, View-Based and Modular Eigenspaces for Face Recognition). Faces are then described in terms of weighting of those features. The approach claims to accommodate head position changes and the wearing of glasses, as well as changes in facial expressions. A representative sample of 128 faces was used from a database of 7,562 images of approximately 3000 people. A principal components analysis was performed on a representative sample. The first 20 eigenvectors were used. Each image was annotated by hand as to sex, race, approximate age, facial expression, etc. Pentland does not deal with comparing images from different spectral bands. Nor does his technique perform well in the case of visible images obtained under differing lighting conditions.
Pentland discloses that pre-processing for registration is essential to eigenvector recognition systems. The processing required to establish the eigenvector set is extensive, especially for large databases. Addition of new faces to the database requires the re-running of the eigenanalysis. Pentland and other "eigenface" approaches are database-dependent and computationally intensive. In contrast, the proposed minutiae comparison of the present invention is independent of the database context of any two images. Minutiae are directly derived from each image, visible or IR, and compared using fixed rules, regardless of the number or content of other images in the database.
An approach for comparing two sets of image feature points to determine if they are from two similar objects is disclosed in Sclaroff (Sclaroff and Pentland, MIT Media Laboratory, Perceptual Computing Technical Report #304). He suggests that first a body-centered coordinate frame be determined for each object, and then an attempt be made to match up the feature points. Many methods for finding a body-centered frame have been suggested, including moment of inertia methods, symmetry finders, and polar Fourier descriptors. These methods generally suffer from three difficulties: sampling error, parameterization error, and non-uniqueness.
Sclaroff introduces a shape description that is relatively robust with respect to sampling by using Galerkin interpolation, which is the mathematical underpinning of the finite element method. Next, he introduces a new type of Galerkin interpolation based on Gaussians that allows efficient derivation of shape parameterization directly from the data. Third, he uses the eigenmodes of this shape description to obtain a canonical, frequency-ordered orthogonal coordinate system. This coordinate system is considered the shape's generalized symmetry axes. By describing feature point locations in the body-centered coordinate system, it is straightforward to match corresponding points, and to measure the similarity of different objects.
Applicant has previously utilized a principal components analysis of thermal shapes found in facial thermograms. The resulting accuracy of 97% from IR images equals or surpasses the results reported by Pentland with visible facial images. Applicant's training database, furthermore, included identical twins and involved non-cooperative imaging of about 200 persons. Thus, the head sizes and orientations were not pre-determined as they were in the Pentland study. As a result, the use of eigenanalysis of thermal shapes is more robust than the use of eigenanalysis of visual facial features. However, the basic requirements of eigenanalysis still pertain to their use in matching of thermal images by consideration of inherent elemental shapes. That is, the approach is computationally intensive, requires a pre-formed database, and requires standardization of the images through pre-processing.
The present invention differs from prior visible and IR recognition approaches in that it does not merely sample a finite number of points on an image grid. It extracts points which have particular meaning in each spectrum and automatically distinguishes between cross-spectrum minutiae which are coincident and those which are related by rules associated with anatomical bases. It assigns a difference or feature space distance to each pair of coincident minutiae, with a total distance calculated over all such pairs. This first step may be used to eliminate candidate matches which produce distances above a threshold. Then the spectrum-dependent minutiae are compared relative to anatomical rules to further eliminate impossible candidate matches. The prior art has not addressed alignment and comparison of visual/IR or IR/IR human images based upon anatomical rules and the characteristics of features viewable in the IR image.
SUMMARY OF THE INVENTION

It is a primary object of the present invention to provide a method and apparatus for identifying visual images which may be a match to infrared images of faces or bodies. A thermal image of a portion of the individual's body is generated and is processed to produce a set of minutiae points, together with characteristics which describe each such point and its relation to other minutiae. That combination of minutiae and characteristics is considered unique to the individual and essentially persistent in spite of ambient, physiological, emotional, and other variations which occur on a daily basis. Any portion of the body can be utilized, but the face is preferred due to its availability. Since parts of the face may be blocked by glasses, facial hair, or orientation to the sensor, such as a camera, the system and method allows for identification based on partial faces. Candidate visual images are processed to extract minutiae characteristic of the subject and the visual spectrum. The IR and visual images are scaled to the same standard and aligned based upon minutiae which are coincident in the two spectra. A measure of the amount of warping required to accomplish the alignment is calculated. Then other spectrum-dependent minutiae are compared, with relation to certain rules which would be met if the two images were of the same person, based upon anatomical structures of the human face and body. A measure of the degree of compliance with the rules is calculated. The decision to include or exclude a given visual image from the class of possible matching images to the infrared image is made based upon these measures relative to thresholds which are established to control possible errors in the system. Just as locating the center of a fingerprint is essential to certain fingerprint matching algorithms, establishing axes for the facial minutiae is also essential. In an interactive system, human operators establish face axes, similar to fingerprint examiners setting the orientation of latents. A human demarcates the eye pupils, canthi and/or nostrils by manipulating a cursor on the system display. Axes are then automatically generated vertically through the center of mass of the eye pupils or canthi and nostrils and horizontally through the pupils or canthi centroids. If the axes are not perpendicular, the vertical axis can be adjusted to not necessarily bisect the nostrils. The human operator also indicates any unusual features, such as a missing eye or eye patch, wearing of bandages, tattoos, deformation of the lips or other visible gross thermal asymmetries of the face. An automated system can perform these as well. The unknown face is partitioned into segments, and corresponding segments matched. This will accommodate matching of partial faces when faces are partially disguised or hidden behind other faces in a crowd.
In the full-frontal face, the thermal image is grossly symmetrical bilaterally. The canthi or sinus areas in normal individuals are the hottest extended areas of the face. When glasses are not worn, it is a simple process to locate the canthi in the thermal image and use them to establish axes for the face. Other features which may be used are the nostrils, which may present alternately hot and cold bilaterally symmetric areas as the individual breathes in and out. The horizontal axis may be drawn through the outer corners of each eye, which are readily distinguishable in the infrared images or through the pupils which may be seen in some IR imagery. The vertical axis may then be drawn through the bow of the upper lip, or through the center point of the two nostrils, or at the midpoint between the eye corners. The intersection of the two axes will occur at the center of the two eyes. The midpoint between the horizontal through the eyes is defined as the center of the face. If the person is wearing glasses, the pattern of the glasses, which block the infrared emissions from the face and thereby produce an extended cold area with sharp cut-off thermally, can be used to approximate the facial axes. If a sufficient number of minutiae are obtainable from portions of the face not blocked by glasses, facial hair, or other concealments, a person may be identifiable. Alternatively, if fewer than a minimum number of minutiae specified for a particular scenario are extracted by an automated system for a particular person, that person may be considered by the system to be a potential match, but be tagged as having a low number of minutiae.
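The axis construction described above lends itself to a short sketch. In the Python fragment below, the pupil or canthus and nostril coordinates are assumed to be supplied by an operator or a detector; the function name and vector conventions are illustrative.

    import numpy as np

    def face_axes(left_eye, right_eye, nostrils):
        """Return (center, horizontal_dir, vertical_dir) for the face.
        `left_eye`/`right_eye` are (x, y) pupil or canthus points;
        `nostrils` is a list of (x, y) points."""
        left_eye = np.asarray(left_eye, float)
        right_eye = np.asarray(right_eye, float)
        eye_center = (left_eye + right_eye) / 2.0   # center of the face, per the text
        h = right_eye - left_eye
        h /= np.linalg.norm(h)                      # horizontal axis through the eyes
        nose = np.mean(np.asarray(nostrils, float), axis=0)
        v = nose - eye_center
        v /= np.linalg.norm(v)                      # vertical axis toward the nostril midpoint
        # Note: the axes are deliberately NOT forced to be perpendicular; a
        # non-perpendicular eyeline is itself an identifying characteristic.
        return eye_center, h, v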
Various perturbations, such as facial expression changes, can distort the relative locations of minutiae points to an extent. This is analogous to the deformations which occur in fingerprints due to movement between the fingers and the print surface. The minutiae matching algorithms allow for variations in the position and characteristics of the minutiae, as well as in the subset of minutiae which are seen due to the field of view of the camera and to possible obstruction of certain areas of the face in the image.
The face surface presents a smooth continuum of thermal levels, and reflects metabolic activity, ambient and internal temperatures, and ambient sources of thermal energy. Discontinuities occur at breaks in the skin continuum, such as those caused by the nostrils, the mouth opening, the eyes, facial hair, moles or other skin disturbances, and any applique such as bandages.
According to a preferred embodiment of the invention, minutiae are used from the face. The minutiae are referenced to axes derived from specific physiological features. Although many different approaches may be used to obtain repeatable minutiae from facial thermograms, the preferred approach uses a number of extraction routines to produce a plurality of minutiae sufficient for an intended purpose. Thus, for a relatively low order of required security, on the order of ten minutiae may be extracted using absolute anatomical positions such as branch locations of the carotid and facial arteries. For a high security requirement, on the order of 100 derived minutiae may be extracted using additional computations to identify further derived and absolute minutiae. The minutiae extraction and characterization procedure locates the position of each minutia. In addition it may note characteristics of each point such as: a vector indicating the orientation of the corresponding blood vessel; a second vector indicating the relative orientation of the branching blood vessel; the normalized apparent temperature; and the apparent width of the corresponding blood vessels. As with some fingerprint minutiae matching machines, use of the characteristics can enhance the speed and accuracy of identification. Furthermore, it can improve the accuracy and speed of automatic fusion of medical imagery. This basic technique can be employed on an area-by-area basis when portions of the body cannot be seen or when significant changes have occurred in portions of the thermogram, such as when portions of the body have suffered external wounds. This would be done by segmenting the thermogram to consider only the portions of the body in which minutiae can be detected. Functionally this is equivalent to matching a latent partial fingerprint found at a crime scene to a full rolled print filed in the FBI system. The set of minutiae points, together with characteristics which describe each such point and its relation to other minutiae, is considered unique to the individual and persistent, for both contact fingerprints and thermal minutiae.
Verification that two images from different spectra may be from the same person can be an end goal in itself or the first step in further processing the two images to extract comparison data.
A change in facial expression or the action of speech causes movements in affected areas of the face, particularly the lips, but also the eye, chin, forehead, and cheek areas. Encoding of facial expressions and facial movements during speech is currently being studied for bandwidth reduction in the transmission of "talking head" video for applications such as videophone, videoconferencing, video email, synthetic speech, and face animation. The intent is to transmit a baseline image followed by encoded changes to that image, with reconstruction of the animated face at the receiving end. This process offers significant bandwidth reduction, but may produce imagery in which the talking face seems stiff and unnatural or does not appear to be synchronized with the audio, giving the unacceptable look of a dubbed foreign film.
All such studies involve modeling the facial movements based upon the relocation of certain observable points of the face, such as the corners of the mouth. The various models differ in the extent to which they consider the underlying facial muscles and nerves. There are few observable reference points on a generalized face, especially under uncontrolled lighting conditions. In particular, there are no observable reference points in the cheek areas, and none in the forehead area except possibly skin creases. When the talking head is that of a dark skinned person, the reconstructed image may show further degradation of subtle facial features.
Use of an IR camera in conjunction with a video camera, or use of a dualband camera at the transmission end, offers the potential for marked improvements. Infrared minutiae are more numerous than visible markers and are present throughout the face, including areas of the cheeks and forehead and chin where no visible minutiae may be present. Therefore, modeling of the movements of infrared minutiae can provide finer detailed replication of expressions and speech than modeling based upon visual references.
At the transmitting end, a visual baseline image of the subject face is sent, followed by transmission of only the movement vectors of those infrared minutiae which move from frame to frame. At the receiving end, the baseline face is animated based upon overlaying the IR minutiae movements on the visual image.
Early results indicate a minimum of 150:1 compression for highly energetic faces, to 400:1 for mildly mobile faces, when 30 frames per second are processed. A primary application for this technique is videoconferencing, where the goal is to provide acceptable quality imagery over dial-up lines, at acceptable cost.
Video e-mail and videophone could also utilize the significant bandwidth reduction and automated re-synchronization of voice and image.
By processing sequences of images taken from known expressions and/or known speech elements, a sequence of movements of infrared minutiae can be extracted which corresponds to that expression or speech element for that person or for persons in general. Subsequently, when the same sequence of movements of infrared minutiae is seen, it can be inferred that the person is displaying the same expression or speech element as during the initial sequence. This enables the automated determination of expression or speech, allowing for compression of transmitted video in conjunction with audio. The combination may offer additional composite compression and improved synchronization. The same basic technique can also be used to create a dictionary of facial expressions and speech elements for use in animation of a synthetic face.
The talking head video compression system will have both video and IR cameras, and can be used to recognize and/or generate facial expressions and/or speech-related facial movements from the IR image and superimpose them on a contemporaneous visual image. The use of correlated infrared and video facial images offers significantly better fidelity of expression and speech-related variations in compression and reconstruction of talking head video, while also ensuring the authenticity of the related transmissions.
BRIEF DESCRIPTION OF THE FIGURES

Figs. 1a and 1b are a visual image and facial thermogram, respectively, taken of the same face from a distance of 15 feet showing coincident minutiae for each modality;
Figs. 2a-2d are visual images of four different faces, respectively, showing coincident minutiae;
Figs. 3a-3d are images of the vascular structure and feature images from infrared minutiae of the visual images of Figs. 1a, 2a, and 2b, respectively, generated by thresholding the IR image and using all pixels hotter than threshold;
Fig. 4 is an infrared image of an individual with a scar which is not detectable in a visible image owing to make-up on the individual;
Fig. 5a illustrates an overlay of the IR image of Fig. 3a onto the corresponding visual image of Fig. 1a to illustrate the alignment of coincident minutiae;
Fig. 5b illustrates an overlay of the IR image of Fig. 3b onto the visual image of Fig. 1a to illustrate the misalignment of coincident minutiae;
Figs. 6a-6c are thresholded infrared images of the frontal face, side face, and neck, respectively, of an individual taken with an indium antimonide focal plane array camera;

Figs. 7a and 7b are images of vascular structure minutiae for an individual smiling and frowning, respectively;
Fig. 8 is a flow diagram showing the method according to the invention;

Figs. 9a and 9b are illustrations of two different visual images overlaid with a thermal image of vascular minutiae showing a match and mismatch, respectively; and

Fig. 10 is a block diagram showing the apparatus according to the invention.
DETAILED DESCRIPTION
The preferred method for aligning and comparing images of the face and body from different imagers according to the invention will now be described. The vascular system supplying the human face typically exhibits thermal variations on the order of 7°C across the facial surface. Certain general features, such as hot patches in the sinus areas, relatively cool cheeks, and cold hair, pertain to all facial thermograms. Other features, such as specific thermal shapes in certain areas of the face, are characteristic of a particular person. Variations in temperature across the facial surface can be imaged by thermal cameras sensitive to wavelengths in the 3-5, 8-12, or 2-15 micron ranges. Current commercially available cameras provide thermal resolution of 0.025°C and spatial resolution of better than .02", resulting in 65,000 to 265,000 discrete thermal measurements across the surface of the face. For most cameras, the thermal map is regenerated 30 times per second to produce either a standard video output which can then be recorded and processed on standard videotape equipment, or a direct digital signal which can be input to a computer.
In Figs. 1a and 1b, there are shown the visible and infrared images of the same individual taken via a conventional camera and an infrared camera, respectively. These images contain minutiae 2. Similarly, Figs. 2a-2d are visual images of different people, each image having identifiable minutiae points 2. Figs. 3a-3c are thermal or infrared images of the individuals shown in Figs. 1a, 2a, and 2b.
In addition to branch points of superficial blood vessels, various other types of minutiae may be automatically extracted, including: (1) the centroid of each constant thermal area;
(2) points of maximum curvature on constant thermal contours;
(3) anastomoses;
(4) lymph nodes, glands, other anatomical areas of distinguishable thermal capacitance; (5) head outline and hairlines;
(6) scars, tattoos, and other marks which may or may not be visible in normal photographs;
(7) undefined locations generated by wavelet or fractal-based compression and expansion of the thermal image; and (8) apparent end points where the blood vessel goes too deep to be seen.
Use of various combinations of minutiae types can provide additional resolution and accuracy, and can also increase the security of identification systems by using a particular and undisclosed set of minutiae and characteristics.
Since every pixel in an IR image represents a thermal measurement of the skin at that corresponding location on the body, every pixel in an IR image can be considered a minutia. In particular, thresholding an IR image and considering all hotter points to be minutiae leads to a simple realization of the preferred embodiment of the invention. There is a tradeoff to be made in constructing operational systems based on this invention: whether to utilize fewer minutiae which are selected with more computational complexity, or to use more minutiae from less selective processing. The methods according to the invention are the same whether the analysis is done more at the minutiae-extraction stage or at the minutiae-comparison stage.
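That simple realization reduces to a few lines of Python. In the sketch below, the percentile used as the threshold is an assumption; the text leaves the threshold choice open.

    import numpy as np

    def threshold_minutiae(thermal, percentile=90):
        """Return (row, col) coordinates of all pixels hotter than the threshold;
        each such pixel is treated as a minutia."""
        return np.argwhere(thermal > np.percentile(thermal, percentile))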
Simply taking obvious facial feature landmarks such as head outline, hairlines, the center of each nostril, pupil spacing, and the corners of each eye provides a rudimentary set of metrics for classification or verification of a face. The Tal patent No. 4,975,969 discloses such a method for identifying faces based upon a limited number of measurements between visible features, such as the ends of the mouth, and ratios between those measurements. According to Tal, no two persons have the same set of such measurements. However, variations in such measurements for a given individual at different times appear to often be larger than the variations between persons. Positive identification of individuals, especially when one individual is attempting to appear to be another, requires the matching of a greater number of minutiae points than are available in the video image. For high security applications, it is desirable that the number of minutiae points extracted be such that it is virtually impossible to locate two individuals who would have identical minutiae sets.
Scars 4, tattoos, and other marks which are visible in photographs should be selected as shown in Fig. 4. All related pixels can be used as visible minutiae, or a procedure can be established wherein certain features, such as the centroid or outline, are selected as representative minutiae. The infrared image will in general contain more details than will the visible image. Particularly when the visible image is not high resolution, the IR image can be used to distinguish between brands and tattoos and temporary marks better than can a photograph. When makeup is worn, there may be no apparent visible mark.
Also, since it is of interest to identify faces seen in crowds, or faces turned at any angle, a significant number of minutiae points must be extractable for those applications so that even a partial face can be used for identification. Comparison or alignment of sets of minutiae in two images requires a number of steps. First, the face axes are located. Overlaying the two sets of axes provides the initial approximate correspondence between two different images. In the full-frontal face, the thermal image is grossly symmetrical bilaterally. The canthi or sinus areas in normal individuals are the hottest extended areas of the face. When glasses are not worn, it is normally a simple process to locate the canthi in the thermal image and use them to establish axes for the face. Other features which may be used are the nostrils, which may present alternately hot and cold bilaterally symmetric areas as the individual breathes in and out. The horizontal axis is drawn through the pupils or canthi, which are readily distinguishable in the infrared images. The vertical axis is then drawn through the bow of the upper lip, or through the center point of the two nostrils, to the midpoint between the eyes. The intersection of the two axes occurs at the center of the two eyes, which is defined as the center of the face. Axes for the visible face images are similarly drawn. Axes can be forced to be perpendicular. However, many people have an eyeline which is not perpendicular to the vertical axis of their head. Allowing the axes to vary in relative orientation preserves a useful identifying characteristic.
Next, all images are scaled to a standard size prior to comparison. If there is sufficient ground truth for all images in the database, the scaling is done in terms of actual size. In general, however, actual size cannot be precisely determined after the fact for all images in a database. Therefore the scaling is done by enforcing a standard distance between specific minutiae. For visible images, one good metric for scaling is the distance between pupils of the eyes. This distance is approximately the same for all adults at about 7 cm. For infrared images, in which eye pupils cannot be distinguished, a good metric is the shortest line between canthi which is parallel to the horizontal axis of the face. This is approximately the same for all adults at about 3 cm. Infrared minutiae are categorized as absolute if they are directly extractable from the thermal image, and derived if they result from some level of image transformation. Visible minutiae are all assumed to be absolute. Methods for their extraction are set forth below. Other methods may be used within the scope of this invention.
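The scaling rule can be sketched as follows. The pixels-per-centimeter target is an assumption; only the 7 cm pupil and 3 cm canthus standards come from the text.

    import numpy as np

    def scale_to_standard(points, ref_a, ref_b, standard_cm, pixels_per_cm=10.0):
        """Scale minutiae `points` (Nx2 array) so that the distance between the
        two reference features equals `standard_cm` at the chosen resolution."""
        points = np.asarray(points, float)
        current = np.linalg.norm(np.asarray(ref_b, float) - np.asarray(ref_a, float))
        return points * (standard_cm * pixels_per_cm / current)

    # Usage: visible images scale on the pupils, IR images on the canthi.
    # scaled_vis = scale_to_standard(vis_minutiae, pupil_l, pupil_r, standard_cm=7.0)
    # scaled_ir  = scale_to_standard(ir_minutiae, canthus_l, canthus_r, standard_cm=3.0)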
Infrared Minutiae
Infrared minutiae are selected. The number of minutiae obtained is a function of the sensitivity and resolution of the infrared camera. Candidate minutiae include:
1. Absolute minutiae directly extractable from the thermal image, such as: head outline, hairlines, branch points, and apparent end points of the superficial blood vessels.
2. Derived minutiae requiring processing of the image, including the following: A. the centroid of each constant thermal area:
1. Where the digitized thermal image has N bits of grey scale, begin by dividing the image into two slices (thresholding) about the average grey value. The resulting image will have some number of areas of constant value. Locate the centroid of each, which is labeled as a minutiae point.
2. Increase the number of slices to 4, and repeat the above step, labeling the resulting centroids as minutiae.
3. Continue increasing the number of slices by a factor of 2, and labeling the resulting minutiae, until 2**N slices are obtained.
4. If additional minutiae are desired, continue the process using odd numbers of slices.
5. The minutiae set consists of the centroids labeled as (x, y, z) where (x,y) is the location on the face relative to the face axes with (0,0) at the designated face center, and z is the corresponding thermal value. B. the points of maximum curvature on constant thermal contours, either concave or convex cusps having less than a given radius of curvature.
1. Consider all thermal contours in the digitized image. If the data is considered noisy, reduce the number of grey levels to represent true differences in the thermal data.
2. Establish a radius of curvature such that any portion of any contour line which has a tighter curvature will generate a minutiae point.
3. The added minutiae set will consist of the maximum inflection points labeled as (x, y, z, a, D), where (x, y) is the location of the minutia point relative to the facial axes, z is the thermal value at that point, a is the angle subtended by a tangent to the thermal contour at the minutia point, and D is the range of thermal values (equal to the number of constant thermal contours crossed) between the minutia point and the centroid of its thermal contour. C. run length encoding start and stop locations.
1. Perform run length encoding of the thermal image.
2. Each stop/start location generates a minutia point.
3. The added minutiae set will consist of the (x, y, z) value associated with those points. D. undefined locations generated by compression and subsequent expansion.
1. Perform wavelet or fractal-based compression on the thermal image.
2. Expand the compressed image and compare it with the original.
3. The added minutiae set will consist of the undefined locations and will be labeled as (x, y, z,w) where (x,y) is the location of the point relative to the facial axes, z is the thermal value at that location in the original thermal image, and w is a set of wavelet coefficients.
E. All pixels above a selected threshold, or all pixels within a selected thermal range and distance from other defined pixels. (A sketch of derived-minutiae method A appears after this list.)
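Derived-minutiae method A above (progressive thresholding with centroid labeling) can be rendered directly. The sketch below assumes scipy for region labeling and stops at 2**N equal slices; the odd-numbered-slice continuation of step 4 is omitted for brevity.

    import numpy as np
    from scipy import ndimage

    def centroid_minutiae(thermal, n_bits=4):
        """Method A: accumulate (x, y, z) centroid minutiae over slicings of
        2, 4, ..., 2**n_bits thermal levels."""
        thermal = np.asarray(thermal, float)
        lo, hi = thermal.min(), thermal.max()
        minutiae = []
        slices = 2
        while slices <= 2 ** n_bits:
            # Quantize the image into `slices` equal thermal bands.
            bands = np.minimum((thermal - lo) / max(hi - lo, 1e-9) * slices,
                               slices - 1).astype(int)
            for band in range(slices):
                mask = bands == band
                labels, n = ndimage.label(mask)
                if n == 0:
                    continue
                for cy, cx in ndimage.center_of_mass(mask, labels, range(1, n + 1)):
                    # z is sampled at the centroid; for non-convex areas this is
                    # only an approximation of the area's thermal value.
                    minutiae.append((cx, cy, thermal[int(cy), int(cx)]))
            slices *= 2
        return minutiae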
Visible Minutiae
Visible minutiae are selected depending on the resolution, contrast, and clarity of the visible images. Candidate minutiae include: head outline, hairlines, pupils, eye inner and outer corners, nostrils, mouth corners, lip bow, and tip of nose.
Tables are then created of the infrared minutiae and the visible minutiae. Table entries include the location of each minutia relative to the face axes. Coincident minutiae are linked either manually or automatically. Coincident minutiae include: pupils, inner and outer eye corners, nostrils, head outline, hairlines, and ear-head connection points.
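One plausible shape for these minutiae tables is a small record type with a link field for the coincident minutia in the other spectrum. The field set and the pairing-by-kind strategy below are assumptions drawn from the characteristics listed earlier in the text.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Minutia:
        x: float                          # location relative to the face axes
        y: float
        spectrum: str                     # "IR" or "visible"
        kind: str                         # e.g. "pupil", "canthus", "vessel_branch"
        z: Optional[float] = None         # thermal value, IR images only
        link: Optional["Minutia"] = None  # coincident minutia in the other spectrum

    COINCIDENT_KINDS = ("pupil", "eye_corner", "nostril",
                        "head_outline", "hairline", "ear_head_connection")

    def link_coincident(ir_table, vis_table):
        """Pair IR and visible minutiae of the same coincident kind, in order."""
        for kind in COINCIDENT_KINDS:
            irs = [m for m in ir_table if m.kind == kind]
            viss = [m for m in vis_table if m.kind == kind]
            for a, b in zip(irs, viss):
                a.link, b.link = b, a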
After selection of coincident minutiae, they are matched. Various perturbations, such as facial expression changes, can distort the relative locations of minutiae points to an extent. This is analogous to the deformations which occur in fingerprints due to movement between the fingers and the print surface. The minutiae matching algorithms allow for variations in the position and characteristics of the minutiae, as well as in the subset of minutiae which are seen due to the field of view of the camera and to possible obstruction of certain areas of the face in the image. The difference between locations of available coincident minutiae is calculated relative to the face axes. Different methods can be used to evaluate the difference between the two sets of minutiae.
One such method is standard graph matching, with tolerances established for errors due to imperfect knowledge of head position and distance, errors associated with treating the head/face as a two-dimensional surface or as a sphere, and residual errors which remain even if a true three-dimensional model of the head is made using laser interferometry or other techniques.
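A greedy stand-in for such tolerance-based matching is sketched below; it only counts coincident pairs that agree within a positional tolerance and reports the mean residual, whereas true graph matching would also enforce pairwise consistency. The tolerance value is an assumption.

    import numpy as np

    def match_score(ir_pts, vis_pts, tolerance=4.0):
        """`ir_pts`, `vis_pts`: Nx2 arrays in the common scaled axis frame.
        Returns (fraction matched within tolerance, mean residual of matches)."""
        ir_pts = np.asarray(ir_pts, float)
        vis_pts = np.asarray(vis_pts, float)
        d = np.linalg.norm(ir_pts[:, None, :] - vis_pts[None, :, :], axis=-1)
        nearest = d.min(axis=1)              # distance to closest counterpart
        matched = nearest <= tolerance
        residual = nearest[matched].mean() if matched.any() else np.inf
        return matched.mean(), residual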
Another method is Flash Correlation® as described in the Prokoski U.S. patent No. 5,583,950. Large circular areas at each minutiae location are used, where the size of the area or dot represents the uncertainty associated with the exact minutiae location, due to facial expression changes, camera resolution, and other factors.
A further method for evaluating the difference between two sets of minutiae is analogous to fingerprint minutiae, using any of the many minutiae comparison techniques developed to compare location and characteristics of sets of minutiae. In Figs. 5a and 5b, matching of coincident minutiae is illustrated. More particularly, in Fig. 5a, the infrared image of Fig. 3a is overlaid onto the corresponding visual image of Fig. 1a to illustrate the alignment of coincident minutiae and thus a match of individuals. In Fig. 5b, the infrared image of Fig. 3b is overlaid onto the visual image of Fig. 1a to illustrate the misalignment of coincident minutiae, thus indicating no match of the individuals.
For high security applications, or where the database subjects may have been disguised, may have aged or changed their weight or appearance, the matching algorithm considers such possible variations in deciding possible matches.
Whichever minutiae extraction and comparison techniques are used, they produce a metric which can be compared to a threshold which is set or determined adaptively by considering databases where the images are of known persons. The threshold (CM) is set for the desired trade-off of the rates of false positive and false negative results.
If consideration of coincident minutiae leads to the conclusion that a match is possible, that decision can be refined by consideration of the spectrum-dependent minutiae. The two images are optimally aligned according to the face axes, and warped so that the coincident minutiae are overlaid. Then each spectrum-dependent minutia is considered relative to a rule which relates it to the other image. The rule also assigns a point value to the degree of compliance with the rule. Next the system confirms adherence or violation of the rules and computes the cumulative score associated with all of the rules.
An Exclusion Test is the simplest rule. It states that no vascular structure or minutiae seen in the IR image can be overlaid outside the head outline of the visual image, or inside of the eye, mouth or nostril areas.
Anatomical rules include the following:
1. the facial vein and the facial artery must lie outside nose boundaries, must not go through mouth or eyes or nostrils, and must be inside the face from the ears; 2. the supraorbital and ophthalmic arteries must lie above the eyes;
3. the transverse facial vein and artery must lie below the eyes;
4. the transverse vein must lie inside face area between the eyes, and outside the area of the nose; and
5. the labial vein and artery must surround the mouth. A particular class of problems which is of interest includes images taken over long periods of time, whether of children or adults. In these cases, the set of coincident minutiae and the rules governing spectrum-dependent minutiae will vary to accommodate anatomical changes associated with growth and aging. Either of the images being compared may be artificially aged to the other, prior to minutiae being extracted for comparison.
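The Exclusion Test and the cumulative rule scoring described above can be sketched with boolean region masks. The masks are assumed to come from the aligned images; the per-rule point values are assumptions, since the text leaves them to the system designer.

    import numpy as np

    def exclusion_test(ir_vessel_mask, head_mask, eye_mask, mouth_mask, nostril_mask):
        """True if the IR vascular pixels pass the Exclusion Test: none fall
        outside the head outline or inside the eye, mouth, or nostril areas."""
        forbidden = ~head_mask | eye_mask | mouth_mask | nostril_mask
        return not np.any(ir_vessel_mask & forbidden)

    def rule_score(rule_results):
        """`rule_results`: list of (passed, points) pairs, one per anatomical
        rule; returns the cumulative compliance score."""
        return sum(points for passed, points in rule_results if passed)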
From the standpoint of evidentiary use, it might be argued that the application of eigenanalysis to a very large database of faces, such as all mug shots in the FBI files, would be considered so esoteric by the public at large that automated matches based upon its use will not readily be acceptable to a jury as convincing evidence of identity. By comparison, the proposed facial minutiae matching technique, being analogous to fingerprint identification, is expected to find a more understanding reception by the law enforcement community, and to be more acceptable for evidentiary purposes within a reasonable number of years after its introduction.
A threshold is set or determined adaptively, such that pairs of images having a calculated value within the threshold are considered to be possible matches. The decision algorithm utilizes a cumulative rule score, or simply excludes any image which breaks any rule. The quality of the imagery used and the possibility of disguise are considered in establishing the decision algorithm to determine possible or impossible matches. Figs. 6a-6c show the threshold infrared image of the front face, side face, and neck of an individual. Two alternative embodiments of the method for aligning and comparing images of the face and body from different imagers according to the invention will now be described.
For compression of talking head video, a dualband IR/visual camera is used. The processor at the transmitting end continuously extracts IR minutiae from each frame of the
IR video. It locates and tracks the face axes, detecting when there is significant head movement. A visual baseline image of the subject is sent, followed by transmission of only the movement vectors of those infrared minutiae which move from frame to frame. If significant head movement occurs, then a new baseline video image is transmitted, followed again by transmission sequences of only the movement vectors.
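A transmit-side loop for this scheme might look like the sketch below, which emits a baseline marker (standing in for a full visual frame) and thereafter only the movement vectors of IR minutiae, starting a new baseline whenever the tracked face axes move beyond the permitted range. The channel abstraction, the movement limit, and the movement test are illustrative assumptions.

```python
import numpy as np

def transmit_talking_head(frames_ir_minutiae, face_axes_per_frame,
                          movement_limit=15.0, send=print):
    """Send a baseline marker, then only per-frame IR-minutiae movement
    vectors; start a new baseline whenever the face axes move beyond the
    permitted range. Assumes the minutiae are already tracked, i.e. each
    frame lists the same minutiae in the same order."""
    baseline_axes = baseline_minutiae = None
    for t, (minutiae, axes) in enumerate(zip(frames_ir_minutiae, face_axes_per_frame)):
        minutiae = np.asarray(minutiae, float)
        axes = np.asarray(axes, float)
        if baseline_axes is None or np.linalg.norm(axes - baseline_axes) > movement_limit:
            baseline_axes, baseline_minutiae = axes, minutiae
            send(("BASELINE", t))                 # full visual frame goes here
        else:
            deltas = minutiae - baseline_minutiae
            moved = deltas[np.linalg.norm(deltas, axis=1) > 0.5]
            send(("VECTORS", t, moved.tolist())) # only minutiae that moved

frames = [[(10, 10), (30, 40)], [(11, 10), (30, 41)], [(40, 10), (60, 40)]]
axes = [(50, 50), (51, 50), (80, 50)]
transmit_talking_head(frames, axes)  # baseline, vectors, new baseline
```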
At the receiving end, the baseline face is animated based upon overlaying the IR minutiae movements on the visual image. Morphing techniques are used to smooth the transition to a new baseline image. If the morphing indicates too much change in the new baseline, then a signal is sent back to the transmission end to reduce the allowed head movement before a new baseline is transmitted. The technique allows for greater bandwidth compression for talking heads with little movement, while allowing automated accommodation of very mobile faces. Separate IR and visual cameras can be used, but the processing time required is greater. Muscles of the face involved in facial expression and speech are shown in Figs. 7a and 7b. Change in expression or the action of speech causes movements in affected areas of the face, distorting the locations of the infrared spectrum-dependent minutiae, and also distorting visible minutiae. However, the infrared minutiae are more numerous and are present in areas where no visible minutiae are present. Therefore, modeling of the infrared minutiae provides finer detailed modeling of expressions and speech than does modeling based upon visual minutiae.
Based upon processing sequences of images taken during known expressions and/or known speech elements, a sequence of movements of infrared minutiae can be extracted which corresponds to that expression or speech element for that person.
Subsequently, when the same sequence of movements of infrared minutiae is seen, it can be inferred that the person is displaying the same expression or speech element as during the initial sequence. This enables the automated determination of expression or speech, allowing for compression of transmitted video. A baseline image of the person can be transmitted, and then a code for the expression or speech element is transmitted. At the receiving end, the expression or speech element is reconstructed and a simulated animation of the face presented.
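Recognition of a stored expression or speech element can be sketched as nearest-template matching over sequences of minutiae movement vectors, with the matched code transmitted in place of video. The flat frame-by-frame distance below is a deliberately simple stand-in for a sequence comparison such as dynamic time warping; all names and data are illustrative.

```python
import numpy as np

def sequence_distance(seq_a, seq_b):
    """Mean displacement between two sequences of per-frame minutiae
    movement vectors (a simple stand-in for DTW or similar)."""
    a, b = np.asarray(seq_a, float), np.asarray(seq_b, float)
    n = min(len(a), len(b))
    return float(np.mean(np.linalg.norm(a[:n] - b[:n], axis=-1)))

def recognize_expression(observed_seq, dictionary, max_distance=2.0):
    """Return the code of the stored expression/speech element whose
    minutiae movement sequence best matches the observed one, or None."""
    best_code, best_d = None, max_distance
    for code, template in dictionary.items():
        d = sequence_distance(observed_seq, template)
        if d < best_d:
            best_code, best_d = code, d
    return best_code

# Tiny dictionary built from training sequences of one person (illustrative).
dictionary = {
    "smile": [[(0, 0), (1, -1)], [(0, 0), (2, -2)]],
    "vowel_a": [[(0, 1), (0, 0)], [(0, 2), (0, 0)]],
}
print(recognize_expression([[(0, 0), (1, -1)], [(0, 0), (2, -1)]], dictionary))
# -> smile
```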
This technique can also be used to create a dictionary of facial expressions and speech elements for use in animation of a synthetic face. An overview of the method of the invention will be described with reference to Fig. 8.
First, a database of images of known individuals is generated 6. The images can include infrared, visual, or hyperspectral images, or medical images which have been annotated with infrared minutiae. Each image is scaled to a common reference. Next, the images in the database are processed for spectrum-dependent features and minutiae 8. The processing locates IR minutiae annotated onto other sensor images, assigns face axes, counts the number of minutiae, tags the image with the resulting data, and assigns a quality measure to the image based on the number of minutiae identified and their quality as reported by the minutiae extraction process. In the process reference step 10, selected images of a threshold quality are stored. The image of an unknown individual is captured 12 using an infrared camera or other sensor. This image is processed 14 to locate the face axes, scale the image, locate IR minutiae, and assign a quality measure, similar to process step 8.
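The quality measure is not given a formula in the description; one plausible sketch combines the minutiae count with the extractor's confidence values, as below. The weighting and the minimum count are assumptions.

```python
def image_quality(minutiae, extraction_confidences, min_count=20):
    """Combine the minutiae count with the extractor's mean confidence into
    a single quality measure in [0, 1]; images below a set threshold would
    not be stored in the reference database (process reference step 10)."""
    if not minutiae:
        return 0.0
    count_term = min(len(minutiae) / min_count, 1.0)
    confidence_term = sum(extraction_confidences) / len(extraction_confidences)
    return 0.5 * count_term + 0.5 * confidence_term

print(image_quality([(1, 2)] * 30, [0.9] * 30))  # 0.95
```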
The captured image is classified 16, as is the database image 18, to reduce search time. Appropriate classification techniques include the use of principal component parameters or symmetry waveforms when both captured and reference databases include only IR images; coincident minutiae metrics when both databases include IR and visual images; or IR minutiae metrics when both databases include images annotated with IR minutiae. The specific application of a classification technique will depend on the size of the database. Using distance metrics computed from coincident IR and visual minutiae, for example, twelve measurements may be taken which are the same in both IR and visual images. Very large databases can be partitioned effectively using such metrics.
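Partitioning a large database with such metrics can be sketched as quantizing the measurement vector into a bin key and indexing entries by key, so that a probe image is compared only against its own (and perhaps neighboring) bins. The bin width and the three-element vectors below are illustrative; the description speaks of twelve measurements.

```python
def partition_key(measurements, bin_width=2.0):
    """Quantize a vector of distance metrics (e.g. the measurements common
    to IR and visual images) into a hashable bin key for indexing."""
    return tuple(int(m // bin_width) for m in measurements)

def build_index(database):
    """Group database entries by partition key so a probe is compared only
    against entries in its own (and possibly neighboring) bins."""
    index = {}
    for name, measurements in database:
        index.setdefault(partition_key(measurements), []).append(name)
    return index

db = [("person_a", [10.1, 4.2, 7.7]), ("person_b", [10.4, 4.0, 7.9]),
      ("person_c", [22.0, 9.5, 3.1])]
index = build_index(db)
print(index[partition_key([10.2, 4.1, 7.8])])  # ['person_a', 'person_b']
```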
Next, the classified captured image and the database images are compared to select a potential match 20 from the database. If no potential matches are found, this is the end result. However, if a potential match is found, further processing occurs to verify a match. The captured image is positioned 22 to determine the rotation, tip, and tilt thereof. The database image is similarly positioned 24. If necessary, corrections in position are made so that the images to be compared are similarly oriented. Next, the captured and database images are overlaid in alignment 26. This is shown in Figs. 9a and 9b. The distances between coincident minutiae (those which occur in both image modes) are calculated. For each minutia area of the face, an error band is established which represents the possible variation in position of that minutia due to facial expression change or speech-related movement.
Those pairs of coincident minutiae where the captured and database images' minutiae are both within the error band of the other are counted 28. The count is compared to a pre-established threshold. If the count is below the threshold, that database image is not considered a possible match and the next sequential image from the database is selected 20 for comparison. If the count is equal to or greater than the threshold, the process continues.
Next, the composite distance between pairs of coincident minutiae is measured and compared to a pre-determined threshold 30. If the measure is greater than the threshold, that database image is not considered a possible match and the next sequential image from the database is selected 20 for comparison. If the measure is equal to or less than the threshold, the process continues.
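Steps 28 and 30 can be sketched together as a short cascade over already-paired coincident minutiae: count the pairs lying within each other's error band, then check the composite distance, rejecting as soon as either test fails. The thresholds, the symmetric-band simplification, and the assumption that pairing has already been done are illustrative.

```python
import numpy as np

def coincident_minutiae_test(captured, reference, error_bands,
                             min_count=8, max_composite_distance=40.0):
    """Steps 28 and 30 of the flow: count coincident minutiae pairs whose
    members lie within the per-region error band, then check the composite
    (summed) distance. Returns False as soon as a test fails."""
    captured = np.asarray(captured, float)
    reference = np.asarray(reference, float)
    distances = np.linalg.norm(captured - reference, axis=1)
    within = distances <= np.asarray(error_bands)   # symmetric band check
    if within.sum() < min_count:
        return False                                # too few coincident pairs
    if distances[within].sum() > max_composite_distance:
        return False                                # pairs too far apart overall
    return True

cap = [(10, 10), (30, 40), (55, 20)] * 3   # nine paired minutiae
ref = [(11, 10), (30, 42), (54, 20)] * 3
bands = [3.0] * 9                          # looser bands near mouth/eyes in practice
print(coincident_minutiae_test(cap, ref, bands))  # True
```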
Next, an exclusion zone for the database image is established 32, in which the eyes, nostrils, mouth, and outside boundaries of the face are set as exclusion zones to form a mask of the database image. The mask is aligned with and superimposed on the captured image. If any IR minutiae in the captured image fall within the exclusion zones, it is considered a violation, and that database image is no longer considered a possible match and the next image is selected. If no violations occur, the process continues with testing for anatomical rules 34 governing where specific IR minutiae may be located. Those rules are tested against the database images using the captured image. For example, the facial artery must lie between the nose and the ear. When the captured and database images are aligned and overlaid, each anatomical rule is tested. Any violation results in that database image no longer being considered, and the next image is selected. If no violations occur, the process continues. From the database images which progress through the processing steps, a candidate list is created 36. The results are weighed 38 in accordance with certain factors such as the database size and completeness. For example, if the database is known to include several images of all employees of a company, that fact will influence the reliability of a match when multiple database images of the same person are found as possible matches to the captured image.
Based on the weighed results, the candidate matching images from the database are ranked 40 and output 42.
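The weighing and ranking of candidates might be sketched as below, where repeated hits on the same person are accumulated and then normalized by how many images of that person the database is known to hold. The normalization rule is an assumption made for illustration.

```python
def weigh_candidates(candidates, images_per_person, expected_images_per_person=1):
    """Accumulate raw match scores per person, then normalize by how many
    images of that person the database holds, so that multiple hits in a
    complete database reinforce rather than inflate an identification."""
    weighed = {}
    for person, score in candidates:
        weighed[person] = weighed.get(person, 0.0) + score
    for person in weighed:
        expected = images_per_person.get(person, expected_images_per_person)
        weighed[person] /= max(expected, 1)   # normalize by available images
    return sorted(weighed.items(), key=lambda kv: kv[1], reverse=True)

candidates = [("alice", 0.9), ("alice", 0.8), ("bob", 0.95)]
print(weigh_candidates(candidates, {"alice": 2, "bob": 3}))
# -> [('alice', 0.85), ('bob', 0.316...)]
```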
The apparatus according to the invention will be described with reference to Fig. 10. The apparatus includes a digital storage device 44 for the capture of infrared images. Connected therewith are a standardization processor 46 which standardizes the image and a minutiae processor 48 which extracts and analyzes minutiae for each IR image.
The apparatus also includes a digital database 50 which stores a plurality of reference images. A standardization processor 52 standardizes the images, which are delivered to a database 54 containing standardized reference images. A minutiae processor 56 extracts and analyzes spectrum-independent minutiae and IR minutiae superimposed on medical images.
The minutiae processor 48 for the captured image and the minutiae processor for the database image are connected with a selector comparator device 58 which determines whether a match exists between the images to identify the individual from which the captured image was taken. More particularly, the selector aligns the images to determine if there is an initial match. If not, the comparator compares the coincident minutiae within the images. A first comparison is made by counting the number of coincident minutiae. If the number exceeds a predetermined threshold, the processing continues. If the threshold count is not reached, then the database image is rejected and the next image in the database is selected for comparison. A second comparison is made of the measured distance between coincident minutiae. If the distance exceeds a threshold, the database image is rejected and the next database image is selected for comparison. If the measured distance is below the threshold, processing continues.
An evaluator 60 tests the database image for exclusion zones and anatomical rules. If any minutiae of the captured image fall within the exclusion zone, a violation occurs and the database image is rejected. The anatomical rules specify where specific infrared minutiae may be located. When the captured and database images are overlaid and aligned, each anatomical rule is tested. If a violation occurs, the database image is rejected.
The database images which pass through the comparison and evaluation stages are weighed according to the strength of match. The ranked potential matches are then output through the output device 62.
The method and apparatus of the invention can be extended to the comparison of images other than visual images, such as, for example, x-rays or sonograms. The x-ray and sonogram images can be aligned by first annotating each with coincident IR minutiae, then morphing the two sets of IR minutiae as overlays onto the medical images, or morphing each medical image to a standard IR image. The morphing can be in three dimensions when depth information is provided for the IR minutiae.
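As a minimal sketch of aligning two medical images via their annotated IR minutiae, corresponding landmark pairs can drive a least-squares affine transform. The morphing contemplated by the invention would in practice be non-rigid (and, with depth information, three-dimensional), so this linear version is only illustrative; all names and data are assumptions.

```python
import numpy as np

def affine_from_landmarks(src_pts, dst_pts):
    """Least-squares affine transform mapping annotated IR-minutiae
    landmarks in one medical image onto the corresponding landmarks in
    the other (a thin-plate spline would give a true non-rigid warp)."""
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    A = np.hstack([src, np.ones((len(src), 1))])   # [x, y, 1] design matrix
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params                                   # 3x2: dst ~ [x, y, 1] @ params

def apply_affine(params, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params

xray_minutiae = [(10, 10), (50, 12), (30, 60), (12, 55)]
sono_minutiae = [(12, 14), (52, 17), (33, 64), (15, 58)]
T = affine_from_landmarks(xray_minutiae, sono_minutiae)
print(apply_affine(T, [(30, 30)]).round(1))  # x-ray point mapped into sonogram
```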

Claims

WHAT IS CLAIMED IS
1. A method for aligning images from two different spectral images, comprising the steps of
(a) identifying spectrum-dependent minutiae in each image,
(b) identifying coincident minutiae which occur in both images, and
(c) overlaying the coincident minutiae by morphing one image to the other image to determine whether there is a match of the two images.
2. A method as defined in claim 1, wherein one of said images comprises an image of standard anatomy.
3. A method as defined in claim 1, and further comprising the step of calculating the degree of morphing required to achieve optimal alignment, said morphing step including the steps of stretching, warping, and shrinking said one image with respect to said other image.
4. A method for identifying an unknown person from a first image taken in one spectral band by comparing it with a plurality of second images of known persons taken in a different spectral band and stored in a database, comprising the steps of
(a) identifying spectrum-dependent minutiae in each image,
(b) identifying coincident minutiae which occur in the images,
(c) overlaying the coincident minutiae by morphing said first image onto each of said second images to determine whether there is a match of images,
(d) selecting one of said second images requiring the least morphing as the most likely matching image from said database, and
(e) comparing the degree of morphing with a threshold to determine whether the match is sufficient to identify the unknown person as one from the database.
5. A method as defined in claim 4, and further comprising the step of assigning a level of confidence to the match based on the difference between the threshold and the degree of morphing.
6. A method for encoding facial expression and movement, comprising the steps of
(a) generating simultaneous and corresponding visual and infrared images of a face over an interval of time,
(b) identifying and extracting spectrum-dependent minutiae from each image at selected intervals within said time period,
(c) tracking the movement of said minutiae during said time period,
(d) aligning said visual and infrared images via said coincident minutiae for each image; and
(e) calculating the locations of coincident minutiae in the visual image corresponding to the infrared minutiae.
7. A method as defined in claim 6, and further comprising the step of storing said images in a database in accordance with the locations of coincident minutiae.
8. Apparatus for aligning images from two different spectral images, comprising
(a) means for generating images from different spectra;
(b) means for identifying coincident minutiae which occur in both images; and
(c) means for overlaying the coincident minutiae by morphing one image to the other image to compare the minutiae and determine whether there is a match of the two images.
9. Apparatus as defined in claim 8, and further comprising means for calculating the degree of morphing required to achieve optimal alignment, said overlaying means stretching, warping, and shrinking said one image with respect to said other image.
10. Apparatus as defined in claim 9, wherein said image generating means comprise a camera for generating a visual image and an infrared camera for generating an infrared image.
11. Apparatus as defined in claim 10, and further comprising a database for storing a plurality of visual images of known individuals, whereby an image from an unknown individual can be compared with said stored images to identify the unknown individual.
12. A method for compression of talking head video, comprising the steps of
(a) taking simultaneous and corresponding visual and infrared video images of a face;
(b) extracting infrared minutiae from each infrared frame;
(c) determining the face axes of the infrared face;
(d) transmitting a baseline video frame;
(e) tracking the movement direction and extent of each infrared minutia frame-to-frame;
(f) tracking the movement direction and extent of the face axes frame-to-frame;
(g) establishing a range of face movement to be permitted without sending a new baseline video image;
(h) transmitting the movement vectors for all facial infrared minutiae when the face movement is within the permitted range;
(i) transmitting a new visual baseline image when the face movement is outside the permitted range;
(j) displaying the baseline video image at a receiving end; and
(k) distorting the displayed video image by superimposing morphing in accordance with the transmitted infrared minutiae vectors frame-to-frame.
13. A method as defined in claim 12, further comprising the steps of
(a) morphing between a new baseline image and the last presented image in order to smooth the transition to a new baseline;
(b) determining the amount of morphing needed to accomplish a smooth transition; and
(c) sending a signal to the transmission end to change the permitted range of face movement.
14. Apparatus for talking head video compression, comprising
(a) a dualband infrared/visual camera;
(b) an infrared minutiae extraction and face axes subsystem connected with said camera;
(c) a face movement tracker connected with said subsystem;
(d) an infrared spectral minutiae tracker connected with said face movement tracker;
(e) a transmitter of baseline visual image;
(f) a transmitter of infrared spectral minutiae movements within baseline connected with said image tracker;
(g) a receiver of minutiae movement changes connected with said minutiae movement tracker;
(h) a display of baseline and animated visual image connected with said receiver;
(i) a visual face animator using infrared minutiae vectors;
(j) a receiver of new head baseline position when head movement exceeds permitted range;
(k) a morpher of new baseline head position and last constructed head position; and
(l) a tuner which calculates the amount of morphing needed in response to a new baseline.
PCT/US2000/041320 1999-10-21 2000-10-20 Method and apparatus for aligning and comparing images of the face and body from different imagers WO2001029769A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2001532489A JP2003512684A (en) 1999-10-21 2000-10-20 Method and apparatus for aligning and comparing face and body images from different imagers
EP00986798A EP1194893A4 (en) 1999-10-21 2000-10-20 Method and apparatus for aligning and comparing images of the face and body from different imagers
CA002354594A CA2354594A1 (en) 1999-10-21 2000-10-20 Method and apparatus for aligning and comparing images of the face and body from different imagers

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/422,273 US6496594B1 (en) 1998-10-22 1999-10-21 Method and apparatus for aligning and comparing images of the face and body from different imagers
US09/422,273 1999-10-21

Publications (3)

Publication Number Publication Date
WO2001029769A2 true WO2001029769A2 (en) 2001-04-26
WO2001029769A3 WO2001029769A3 (en) 2002-01-10
WO2001029769A9 WO2001029769A9 (en) 2002-08-01

Family

ID=23674141

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/041320 WO2001029769A2 (en) 1999-10-21 2000-10-20 Method and apparatus for aligning and comparing images of the face and body from different imagers

Country Status (5)

Country Link
US (2) US6496594B1 (en)
EP (1) EP1194893A4 (en)
JP (1) JP2003512684A (en)
CA (1) CA2354594A1 (en)
WO (1) WO2001029769A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008130905A2 (en) * 2007-04-17 2008-10-30 Mikos, Ltd. System and method for using three dimensional infrared imaging to provide detailed anatomical structure maps
WO2011112422A2 (en) 2010-03-10 2011-09-15 Elc Management Llc System for skin treatment analysis using spectral image data to generate 3d rgb model
WO2014004179A1 (en) * 2012-06-26 2014-01-03 Qualcomm Incorporated Systems and method for facial verification
US9996726B2 (en) 2013-08-02 2018-06-12 Qualcomm Incorporated Feature identification using an RGB-NIR camera pair
US11033188B2 (en) 2014-11-27 2021-06-15 Koninklijke Philips N.V. Imaging device and method for generating an image of a patient

Families Citing this family (176)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7343351B1 (en) 1999-08-31 2008-03-11 American Express Travel Related Services Company, Inc. Methods and apparatus for conducting electronic transactions
US7953671B2 (en) 1999-08-31 2011-05-31 American Express Travel Related Services Company, Inc. Methods and apparatus for conducting electronic transactions
US7505941B2 (en) * 1999-08-31 2009-03-17 American Express Travel Related Services Company, Inc. Methods and apparatus for conducting electronic transactions using biometrics
US7889052B2 (en) 2001-07-10 2011-02-15 Xatra Fund Mx, Llc Authorizing payment subsequent to RF transactions
JP4396873B2 (en) * 1999-10-01 2010-01-13 株式会社資生堂 How to choose lipstick or eye shadow
WO2001061523A1 (en) * 2000-02-18 2001-08-23 Cedere Corporation Automated on-line business bandwidth planning methodology
US6807290B2 (en) * 2000-03-09 2004-10-19 Microsoft Corporation Rapid computer modeling of faces for animation
JP2001283216A (en) * 2000-04-03 2001-10-12 Nec Corp Image collating device, image collating method and recording medium in which its program is recorded
US7044602B2 (en) 2002-05-30 2006-05-16 Visx, Incorporated Methods and systems for tracking a torsional orientation and position of an eye
US6963659B2 (en) * 2000-09-15 2005-11-08 Facekey Corp. Fingerprint verification system utilizing a facial image-based heuristic search method
US6792136B1 (en) * 2000-11-07 2004-09-14 Trw Inc. True color infrared photography and video
JP4310916B2 (en) * 2000-11-08 2009-08-12 コニカミノルタホールディングス株式会社 Video display device
AU2002230449A1 (en) * 2000-11-15 2002-05-27 Mark Frigon Method and apparatus for processing objects in online images
US7020305B2 (en) * 2000-12-06 2006-03-28 Microsoft Corporation System and method providing improved head motion estimations for animation
WO2002061666A1 (en) * 2001-01-29 2002-08-08 Siemens Aktiengesellschaft Recognising people using a mobile appliance
US6654018B1 (en) * 2001-03-29 2003-11-25 At&T Corp. Audio-visual selection process for the synthesis of photo-realistic talking-head animations
US7725427B2 (en) 2001-05-25 2010-05-25 Fred Bishop Recurrent billing maintenance with radio frequency payment devices
US9024719B1 (en) 2001-07-10 2015-05-05 Xatra Fund Mx, Llc RF transaction system and method for storing user personal data
US20040236699A1 (en) 2001-07-10 2004-11-25 American Express Travel Related Services Company, Inc. Method and system for hand geometry recognition biometrics on a fob
US7735725B1 (en) 2001-07-10 2010-06-15 Fred Bishop Processing an RF transaction using a routing number
US8284025B2 (en) 2001-07-10 2012-10-09 Xatra Fund Mx, Llc Method and system for auditory recognition biometrics on a FOB
US20040239481A1 (en) * 2001-07-10 2004-12-02 American Express Travel Related Services Company, Inc. Method and system for facial recognition biometrics on a fob
US8294552B2 (en) 2001-07-10 2012-10-23 Xatra Fund Mx, Llc Facial scan biometrics on a payment device
US20040236700A1 (en) * 2001-07-10 2004-11-25 American Express Travel Related Services Company, Inc. Method and system for keystroke scan recognition biometrics on a fob
US20040233038A1 (en) * 2001-07-10 2004-11-25 American Express Travel Related Services Company, Inc. Method and system for retinal scan recognition biometrics on a fob
US20050116810A1 (en) * 2001-07-10 2005-06-02 American Express Travel Related Services Company, Inc. Method and system for vascular pattern recognition biometrics on a fob
US9031880B2 (en) 2001-07-10 2015-05-12 Iii Holdings 1, Llc Systems and methods for non-traditional payment using biometric data
US9454752B2 (en) 2001-07-10 2016-09-27 Chartoleaux Kg Limited Liability Company Reload protocol at a transaction processing entity
US7668750B2 (en) 2001-07-10 2010-02-23 David S Bonalle Securing RF transactions using a transactions counter
US7360689B2 (en) 2001-07-10 2008-04-22 American Express Travel Related Services Company, Inc. Method and system for proffering multiple biometrics for use with a FOB
US7705732B2 (en) 2001-07-10 2010-04-27 Fred Bishop Authenticating an RF transaction using a transaction counter
US8001054B1 (en) 2001-07-10 2011-08-16 American Express Travel Related Services Company, Inc. System and method for generating an unpredictable number using a seeded algorithm
US8548927B2 (en) 2001-07-10 2013-10-01 Xatra Fund Mx, Llc Biometric registration for facilitating an RF transaction
US7303120B2 (en) 2001-07-10 2007-12-04 American Express Travel Related Services Company, Inc. System for biometric security using a FOB
US7249112B2 (en) 2002-07-09 2007-07-24 American Express Travel Related Services Company, Inc. System and method for assigning a funding source for a radio frequency identification device
US20030063781A1 (en) * 2001-09-28 2003-04-03 Koninklijke Philips Electronics N.V. Face recognition from a temporal sequence of face images
DE60228744D1 (en) * 2001-10-09 2008-10-16 Sirf Tech Inc METHOD AND SYSTEM FOR SENDING POSITION-CODED IMAGES ON A WIRELESS NETWORK
JP3849517B2 (en) * 2001-12-14 2006-11-22 ソニー株式会社 Photo booth, image processing method, recording medium, and program
US7221809B2 (en) * 2001-12-17 2007-05-22 Genex Technologies, Inc. Face recognition system and method
JP4023716B2 (en) * 2001-12-27 2007-12-19 シャープ株式会社 Resolution correction apparatus, resolution correction program, and computer-readable recording medium recording resolution correction program
US7286692B2 (en) * 2001-12-27 2007-10-23 Amnart Kanarat Automatic celebrity face matching and attractiveness rating machine
US20040008223A1 (en) * 2002-03-16 2004-01-15 Catherine Britton Electronic healthcare management form navigation
IL164685A0 (en) 2002-04-22 2005-12-18 Marcio Marc Aurelio Martins Ab Apparatus and method for measuring biologic parameters
US8328420B2 (en) * 2003-04-22 2012-12-11 Marcio Marc Abreu Apparatus and method for measuring biologic parameters
US9848815B2 (en) 2002-04-22 2017-12-26 Geelux Holdings, Ltd. Apparatus and method for measuring biologic parameters
SE0201529D0 (en) * 2002-05-21 2002-05-21 Flir Systems Ab Method and apparatus for IR camera inspections
JP2003346149A (en) * 2002-05-24 2003-12-05 Omron Corp Face collating device and bioinformation collating device
JP4036051B2 (en) * 2002-07-30 2008-01-23 オムロン株式会社 Face matching device and face matching method
US6805287B2 (en) 2002-09-12 2004-10-19 American Express Travel Related Services Company, Inc. System and method for converting a stored value card to a credit card
US7149358B2 (en) * 2002-11-27 2006-12-12 General Electric Company Method and system for improving contrast using multi-resolution contrast based dynamic range management
US7619626B2 (en) * 2003-03-01 2009-11-17 The Boeing Company Mapping images from one or more sources into an image for display
US7711155B1 (en) * 2003-04-14 2010-05-04 Videomining Corporation Method and system for enhancing three dimensional face modeling using demographic classification
US7242807B2 (en) * 2003-05-05 2007-07-10 Fish & Richardson P.C. Imaging of biometric information based on three-dimensional shapes
US7421097B2 (en) * 2003-05-27 2008-09-02 Honeywell International Inc. Face identification verification using 3 dimensional modeling
JP4366119B2 (en) * 2003-05-29 2009-11-18 キヤノン株式会社 Document processing device
US7458683B2 (en) * 2003-06-16 2008-12-02 Amo Manufacturing Usa, Llc Methods and devices for registering optical measurement datasets of an optical system
US7317816B2 (en) * 2003-08-19 2008-01-08 Intel Corporation Enabling content-based search of objects in an image database with reduced matching
WO2005020030A2 (en) * 2003-08-22 2005-03-03 University Of Houston Multi-modal face recognition
US20050111705A1 (en) * 2003-08-26 2005-05-26 Roman Waupotitsch Passive stereo sensing for 3D facial shape biometrics
US7990384B2 (en) * 2003-09-15 2011-08-02 At&T Intellectual Property Ii, L.P. Audio-visual selection process for the synthesis of photo-realistic talking-head animations
JP4615272B2 (en) * 2003-09-29 2011-01-19 富士フイルム株式会社 Authentication system, program, and building
US20050123182A1 (en) * 2003-12-03 2005-06-09 Avision Inc. Temperature sensor
US10227063B2 (en) 2004-02-26 2019-03-12 Geelux Holdings, Ltd. Method and apparatus for biological evaluation
US20050226509A1 (en) * 2004-03-30 2005-10-13 Thomas Maurer Efficient classification of three dimensional face models for human identification and other applications
JP4059224B2 (en) * 2004-04-13 2008-03-12 株式会社デンソー Driver appearance recognition system
US7325724B2 (en) 2004-07-01 2008-02-05 American Express Travel Related Services Company, Inc. Method for registering a biometric for use with a smartcard
US20060016872A1 (en) * 2004-07-01 2006-01-26 American Express Travel Related Services Company, Inc. Method and system for iris scan recognition biometrics on a smartcard
US20060000895A1 (en) * 2004-07-01 2006-01-05 American Express Travel Related Services Company, Inc. Method and system for facial recognition biometrics on a smartcard
US20060016873A1 (en) * 2004-07-01 2006-01-26 American Express Travel Related Services Company, Inc. Method and system for retinal scan recognition biometrics on a smartcard
US7318550B2 (en) 2004-07-01 2008-01-15 American Express Travel Related Services Company, Inc. Biometric safeguard method for use with a smartcard
US7341181B2 (en) 2004-07-01 2008-03-11 American Express Travel Related Services Company, Inc. Method for biometric security using a smartcard
US7314165B2 (en) 2004-07-01 2008-01-01 American Express Travel Related Services Company, Inc. Method and system for smellprint recognition biometrics on a smartcard
JP4128600B2 (en) * 2004-07-09 2008-07-30 株式会社アイ・ピー・ビー Biological information acquisition method using millimeter wave electromagnetic wave, and apparatus for acquiring and displaying biological information
EP1769637A2 (en) * 2004-07-09 2007-04-04 Emitall Surveillance S.A. Smart video surveillance system ensuring privacy
JP4340618B2 (en) * 2004-10-08 2009-10-07 富士通株式会社 Biometric information authentication apparatus and method, biometric information authentication program, and computer-readable recording medium recording the biometric information authentication program
EP1815426B1 (en) * 2004-11-10 2011-07-27 Koninklijke Philips Electronics N.V. System and method for registration of medical images
US7469060B2 (en) * 2004-11-12 2008-12-23 Honeywell International Inc. Infrared face detection and recognition system
US7469074B2 (en) * 2004-11-17 2008-12-23 Lexmark International, Inc. Method for producing a composite image by processing source images to align reference points
US8049594B1 (en) 2004-11-30 2011-11-01 Xatra Fund Mx, Llc Enhanced RFID instrument security
US20060140444A1 (en) * 2004-12-27 2006-06-29 Yih-Ran Sheu Human face identification means in security system
EP2164056A2 (en) 2004-12-27 2010-03-17 Emitall Surveillance S.A. Efficient scrambling of regions of interests in an image or video to preserve privacy
US7548776B2 (en) * 2005-02-08 2009-06-16 Southern Taiwan University Of Technology Method and system for performing fever triage
US7925391B2 (en) * 2005-06-02 2011-04-12 The Boeing Company Systems and methods for remote display of an enhanced image
GB0512869D0 (en) * 2005-06-24 2005-08-03 Ibm Method and system for facial recognition in groups
US20070038059A1 (en) * 2005-07-07 2007-02-15 Garrett Sheffer Implant and instrument morphing
JP4624882B2 (en) * 2005-08-01 2011-02-02 株式会社東海理化電機製作所 Personal authentication device and personal authentication method
JP4750520B2 (en) * 2005-09-21 2011-08-17 富士フイルム株式会社 Human image correction apparatus and method
KR101370985B1 (en) 2005-10-24 2014-03-10 마시오 마크 아우렐리오 마틴스 애브리우 Apparatus and method for measuring biologic parameters
US8064716B2 (en) * 2005-11-08 2011-11-22 Soundstarts, Inc. Apparatus and methods for enhancing digital images
US7953253B2 (en) * 2005-12-31 2011-05-31 Arcsoft, Inc. Face detection on mobile devices
US7643659B2 (en) * 2005-12-31 2010-01-05 Arcsoft, Inc. Facial feature detection on mobile devices
US7822259B2 (en) * 2006-05-26 2010-10-26 Areva Np Inc. Method for positive identification of inner surface and outer surface of sample flakes
WO2008022210A2 (en) * 2006-08-15 2008-02-21 The Board Of Regents Of The University Of Texas System Methods, compositions and systems for analyzing imaging data
US7921120B2 (en) * 2006-11-30 2011-04-05 D&S Consultants Method and system for image recognition using a similarity inverse matrix
US8380558B1 (en) * 2006-12-21 2013-02-19 Videomining Corporation Method and system for analyzing shopping behavior in a store by associating RFID data with video-based behavior and segmentation data
US8665333B1 (en) * 2007-01-30 2014-03-04 Videomining Corporation Method and system for optimizing the observation and annotation of complex human behavior from video sources
JP5061645B2 (en) * 2007-02-26 2012-10-31 ソニー株式会社 Information extraction method, information extraction device, program, registration device, and verification device
GB2448050A (en) * 2007-03-22 2008-10-01 Artnix Inc A method and apparatus for extracting face images from video data and performing recognition matching for identification of people.
US7848548B1 (en) * 2007-06-11 2010-12-07 Videomining Corporation Method and system for robust demographic classification using pose independent model from sequence of face images
US7857452B2 (en) * 2007-08-27 2010-12-28 Catholic Healthcare West Eye movements as a way to determine foci of covert attention
JP4466702B2 (en) * 2007-09-12 2010-05-26 カシオ計算機株式会社 Imaging apparatus and imaging control program
US7996762B2 (en) * 2007-09-21 2011-08-09 Microsoft Corporation Correlative multi-label image annotation
WO2009062062A1 (en) * 2007-11-09 2009-05-14 Imacor, Llc Superimposed display of image contours
US8194933B2 (en) * 2007-12-12 2012-06-05 3M Innovative Properties Company Identification and verification of an unknown document according to an eigen image process
US8540158B2 (en) * 2007-12-12 2013-09-24 Yiwu Lei Document verification using dynamic document identification framework
DE102009014437B4 (en) 2008-03-26 2023-01-19 Continental Autonomous Mobility Germany GmbH Object Recognition System and Method
US8477147B2 (en) * 2008-04-01 2013-07-02 The United States Of America, As Represented By The Secretary Of The Navy Methods and systems of comparing face models for recognition
US20090319601A1 (en) * 2008-06-22 2009-12-24 Frayne Raymond Zvonaric Systems and methods for providing real-time video comparison
US8204340B2 (en) * 2008-09-29 2012-06-19 Two Pic Mc Llc Methods and apparatus for dot marker matching
WO2010064249A1 (en) * 2008-12-04 2010-06-10 Real Imaging Ltd. Method apparatus and system for determining a thermal signature
WO2010109384A1 (en) * 2009-03-27 2010-09-30 Koninklijke Philips Electronics N.V. Improvements to medical imaging
EP2490584B1 (en) * 2009-10-20 2019-02-20 Dignity Health Eye movements as a way to determine foci of covert attention
US8634900B2 (en) 2009-11-11 2014-01-21 Koninklijke Philips N.V. Mask comfort diagnostic method
US8509482B2 (en) * 2009-12-21 2013-08-13 Canon Kabushiki Kaisha Subject tracking apparatus, subject region extraction apparatus, and control methods therefor
WO2011079208A1 (en) 2009-12-24 2011-06-30 Flir Systems, Inc. Cameras with on-board reporting capabilities
US20110182493A1 (en) * 2010-01-25 2011-07-28 Martin Huber Method and a system for image annotation
WO2011152842A1 (en) * 2010-06-01 2011-12-08 Hewlett-Packard Development Company, L.P. Face morphing based on learning
US8311337B2 (en) 2010-06-15 2012-11-13 Cyberlink Corp. Systems and methods for organizing and accessing feature vectors in digital images
US8786698B2 (en) * 2010-09-23 2014-07-22 Sony Computer Entertainment Inc. Blow tracking user interface system and method
US8638364B2 (en) * 2010-09-23 2014-01-28 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
US10456209B2 (en) * 2010-10-13 2019-10-29 Gholam A. Peyman Remote laser treatment system with dynamic imaging
US11309081B2 (en) 2010-10-13 2022-04-19 Gholam A. Peyman Telemedicine system with dynamic imaging
JP5648452B2 (en) * 2010-12-03 2015-01-07 富士通株式会社 Image processing program and image processing apparatus
US20120243751A1 (en) * 2011-03-24 2012-09-27 Zhihong Zheng Baseline face analysis
US9143703B2 (en) * 2011-06-10 2015-09-22 Flir Systems, Inc. Infrared camera calibration techniques
FR2979728B1 (en) * 2011-09-01 2016-05-13 Morpho FRAUD DETECTION FOR BIOMETRIC TYPE ACCESS CONTROL SYSTEM
US20130148811A1 (en) * 2011-12-08 2013-06-13 Sony Ericsson Mobile Communications Ab Electronic Devices, Methods, and Computer Program Products for Determining Position Deviations in an Electronic Device and Generating a Binaural Audio Signal Based on the Position Deviations
US9088697B2 (en) 2011-12-13 2015-07-21 Google Inc. Processing media streams during a multi-user video conference
US9088426B2 (en) * 2011-12-13 2015-07-21 Google Inc. Processing media streams during a multi-user video conference
US8655152B2 (en) 2012-01-31 2014-02-18 Golden Monkey Entertainment Method and system of presenting foreign films in a native language
US9020192B2 (en) * 2012-04-11 2015-04-28 Access Business Group International Llc Human submental profile measurement
US9224248B2 (en) * 2012-07-12 2015-12-29 Ulsee Inc. Method of virtual makeup achieved by facial tracking
KR101415848B1 (en) 2012-12-12 2014-07-09 휴앤에스(주) Monitering apparatus of school-zone using detection of human body and vehicle
US9390149B2 (en) * 2013-01-16 2016-07-12 International Business Machines Corporation Converting text content to a set of graphical icons
US8924735B2 (en) * 2013-02-15 2014-12-30 Microsoft Corporation Managed biometric identity
JP2016520336A (en) 2013-03-12 2016-07-14 リー, スティーブン ピー.LEE, Steven P. Computer-controlled refraction and astigmatism determination
US9367909B2 (en) 2013-07-10 2016-06-14 Canon Kabushiki Kaisha Devices, systems, and methods for classifying materials based on a bidirectional reflectance distribution function
US9274052B2 (en) * 2013-07-10 2016-03-01 Canon Kabushiki Kaisha Feature vector for classifying specular objects based on material type
CN105814419A (en) 2013-10-11 2016-07-27 马尔西奥·马克·阿布雷乌 Method and apparatus for biological evaluation
US9347899B2 (en) 2013-12-06 2016-05-24 Rolls-Royce Corporation Thermographic inspection techniques
CA2936235A1 (en) 2014-01-10 2015-07-16 Marcio Marc Abreu Devices to monitor and provide treatment at an abreu brain tunnel
WO2015106137A1 (en) 2014-01-10 2015-07-16 Marcio Marc Abreu Device for measuring the infrared output of the abreu brain thermal tunnel
CA2936247A1 (en) 2014-01-22 2015-07-30 Marcio Marc Abreu Devices and methods for transdermal drug delivery
JP6213663B2 (en) 2014-03-13 2017-10-18 日本電気株式会社 Detection device, detection method, and program
KR101601475B1 (en) * 2014-08-25 2016-03-21 현대자동차주식회사 Pedestrian detection device and method for driving vehicle at night
JP6426433B2 (en) * 2014-10-27 2018-11-21 株式会社日立製作所 Image processing apparatus, image processing method, POI information creation system, warning system, and guidance system
US10395227B2 (en) 2015-01-14 2019-08-27 Tactilis Pte. Limited System and method for reconciling electronic transaction records for enhanced security
US10037528B2 (en) 2015-01-14 2018-07-31 Tactilis Sdn Bhd Biometric device utilizing finger sequence for authentication
US9607189B2 (en) 2015-01-14 2017-03-28 Tactilis Sdn Bhd Smart card system comprising a card and a carrier
US11872018B2 (en) 2015-03-10 2024-01-16 Brain Tunnelgenix Technologies Corp. Devices, apparatuses, systems, and methods for measuring temperature of an ABTT terminus
WO2016154123A2 (en) 2015-03-21 2016-09-29 Mine One Gmbh Virtual 3d methods, systems and software
US10853625B2 (en) 2015-03-21 2020-12-01 Mine One Gmbh Facial signature methods, systems and software
US9430697B1 (en) * 2015-07-03 2016-08-30 TCL Research America Inc. Method and system for face recognition using deep collaborative representation-based classification
JP6564271B2 (en) * 2015-08-07 2019-08-21 キヤノン株式会社 Imaging apparatus, image processing method, program, and storage medium
US10614288B2 (en) * 2015-12-31 2020-04-07 Cerner Innovation, Inc. Methods and systems for detecting stroke symptoms
JP6829001B2 (en) * 2016-03-18 2021-02-10 医療法人社団皓有会 Blood pressure information estimation device
US10210320B2 (en) * 2016-09-21 2019-02-19 Lextron Systems, Inc. System and method for secure 5-D user identification
US10398316B1 (en) * 2016-09-30 2019-09-03 Vium, Inc. Method and apparatus for determining physiological characteristics of experimental animals based upon infrared and visible light images
JP6809262B2 (en) * 2017-02-08 2021-01-06 トヨタ自動車株式会社 Driver status detector
US10275425B2 (en) * 2017-02-14 2019-04-30 Henry Edward Kernan Method for compressing, slicing, and transmitting image files for display and interpretation
US10412286B2 (en) 2017-03-31 2019-09-10 Westboro Photonics Inc. Multicamera imaging system and method for measuring illumination
CN108875474A (en) * 2017-07-18 2018-11-23 北京旷视科技有限公司 Assess the method, apparatus and computer storage medium of face recognition algorithms
US10289899B2 (en) * 2017-08-31 2019-05-14 Banuba Limited Computer-implemented methods and computer systems for real-time detection of human's emotions from visual recordings
CN115937776A (en) * 2017-09-15 2023-04-07 杭州海康威视数字技术股份有限公司 Monitoring method, device, system, electronic equipment and computer readable storage medium
TWI625679B (en) * 2017-10-16 2018-06-01 緯創資通股份有限公司 Live facial recognition method and system
US10643446B2 (en) 2017-12-28 2020-05-05 Cerner Innovation, Inc. Utilizing artificial intelligence to detect objects or patient safety events in a patient room
CN112580423A (en) * 2019-09-30 2021-03-30 托比股份公司 Method of processing an image of a user's eyes, associated system and computer program
IT201900018230A1 (en) * 2019-10-08 2021-04-08 Zoeen S R L DISPLAY DEVICE FOR SUB-SURFACE STRUCTURES AND METHOD OF VISUALIZATION OF SUCH SUB-SURFACE STRUCTURES
KR102203786B1 (en) * 2019-11-14 2021-01-15 오로라월드 주식회사 Method and System for Providing Interaction Service Using Smart Toy
JP2020074241A (en) * 2020-02-12 2020-05-14 日本電気株式会社 Detector
US11402273B2 (en) 2020-03-27 2022-08-02 Ecb Consulting Llc Systems and approaches for improving accuracy of temperature measurements using thermal imaging
US11326956B2 (en) * 2020-08-06 2022-05-10 Motorola Solutions, Inc. Face and inner canthi detection for thermographic body temperature measurement
US11204281B1 (en) * 2020-09-03 2021-12-21 Sensormatic Electronics, LLC Enhanced temperature measurement techniques
RU203731U1 (en) * 2020-12-21 2021-04-19 федеральное государственное бюджетное образовательное учреждение высшего образования "Кемеровский государственный университет" (КемГУ) SOFTWARE DEVICE FOR AUTOMATED CONTACTLESS DETECTION OF INCREASED TEMPERATURE ON THE BODY SURFACE AND BIOMETRIC IDENTIFICATION
CN117058131B (en) * 2023-10-11 2024-03-19 深圳市鹰瞳智能技术有限公司 Method for carrying out visual positioning on facial artery based on high-resolution thermal imaging
CN117084654B (en) * 2023-10-19 2024-01-12 深圳市鹰瞳智能技术有限公司 Method and system for evaluating ocular artery collateral blood flow dynamics based on facial thermal image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5890808A (en) * 1996-01-05 1999-04-06 Mcdonnell Douglas Corporation Image processing method and apparatus for correlating a test image with a template
US6111978A (en) * 1996-12-13 2000-08-29 International Business Machines Corporation System and method for determining ridge counts in fingerprint image processing

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4975969A (en) * 1987-10-22 1990-12-04 Peter Tal Method and apparatus for uniquely identifying individuals by particular physical characteristics and security system utilizing the same
US5163094A (en) 1991-03-20 1992-11-10 Francine J. Prokoski Method for identifying individuals from analysis of elemental shapes derived from biosensor data
US5659626A (en) * 1994-10-20 1997-08-19 Calspan Corporation Fingerprint identification system
US5687259A (en) * 1995-03-17 1997-11-11 Virtual Eyes, Incorporated Aesthetic imaging system
US6173068B1 (en) * 1996-07-29 2001-01-09 Mikos, Ltd. Method and apparatus for recognizing and classifying individuals based on minutiae
US5775806A (en) * 1996-09-12 1998-07-07 The United States Of America As Represented By The Secretary Of The Air Force Infrared assessment system
US5991429A (en) * 1996-12-06 1999-11-23 Coffin; Jeffrey S. Facial recognition system for security access and identification
US6072895A (en) * 1996-12-13 2000-06-06 International Business Machines Corporation System and method using minutiae pruning for fingerprint image processing
KR19990016896A (en) * 1997-08-20 1999-03-15 전주범 Eye region detection method in face image
US6049621A (en) * 1997-08-22 2000-04-11 International Business Machines Corporation Determining a point correspondence between two points in two respective (fingerprint) images
US6142876A (en) * 1997-08-22 2000-11-07 Cumbers; Blake Player tracking and identification system
US6002782A (en) * 1997-11-12 1999-12-14 Unisys Corporation System and method for recognizing a 3-D object by generating a 2-D image of the object from a transformed 3-D model
US6134340A (en) * 1997-12-22 2000-10-17 Trw Inc. Fingerprint feature correlator

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5890808A (en) * 1996-01-05 1999-04-06 Mcdonnell Douglas Corporation Image processing method and apparatus for correlating a test image with a template
US6111978A (en) * 1996-12-13 2000-08-29 International Business Machines Corporation System and method for determining ridge counts in fingerprint image processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1194893A2 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008130905A2 (en) * 2007-04-17 2008-10-30 Mikos, Ltd. System and method for using three dimensional infrared imaging to provide detailed anatomical structure maps
WO2008130905A3 (en) * 2007-04-17 2009-12-30 Mikos, Ltd. System and method for using three dimensional infrared imaging to provide detailed anatomical structure maps
WO2011112422A2 (en) 2010-03-10 2011-09-15 Elc Management Llc System for skin treatment analysis using spectral image data to generate 3d rgb model
EP2545531A4 (en) * 2010-03-10 2016-11-09 Elc Man Llc System for skin treatment analysis using spectral image data to generate 3d rgb model
WO2014004179A1 (en) * 2012-06-26 2014-01-03 Qualcomm Incorporated Systems and method for facial verification
US10452894B2 (en) 2012-06-26 2019-10-22 Qualcomm Incorporated Systems and method for facial verification
US9996726B2 (en) 2013-08-02 2018-06-12 Qualcomm Incorporated Feature identification using an RGB-NIR camera pair
US11033188B2 (en) 2014-11-27 2021-06-15 Koninklijke Philips N.V. Imaging device and method for generating an image of a patient

Also Published As

Publication number Publication date
WO2001029769A9 (en) 2002-08-01
US6751340B2 (en) 2004-06-15
EP1194893A2 (en) 2002-04-10
US6496594B1 (en) 2002-12-17
WO2001029769A3 (en) 2002-01-10
CA2354594A1 (en) 2001-04-26
US20030108223A1 (en) 2003-06-12
EP1194893A4 (en) 2004-05-26
JP2003512684A (en) 2003-04-02

Similar Documents

Publication Publication Date Title
US6751340B2 (en) Method and apparatus for aligning and comparing images of the face and body from different imagers
Prokoski History, current status, and future of infrared identification
AU2007284299B2 (en) A system for iris detection, tracking and recognition at a distance
Nowara et al. Ppgsecure: Biometric presentation attack detection using photopletysmograms
Burge et al. Ear biometrics in computer vision
Burge et al. Ear biometrics
US6529617B1 (en) Method and apparatus for positioning an instrument relative to a patients body during a medical procedure
EP0533891B1 (en) Method for identifying individuals from analysis of elemental shapes derived from biosensor data
CN109558764A (en) Face identification method and device, computer equipment
US6920236B2 (en) Dual band biometric identification system
US8494227B2 (en) System and method for using three dimensional infrared imaging to identify individuals
Prokoski et al. Infrared identification of faces and body parts
KR102554391B1 (en) Iris recognition based user authentication apparatus and method thereof
Sathish et al. Multi-algorithmic iris recognition
Yoshino Recent advances in facial image identification
Heenaye et al. A study of dorsal vein pattern for biometric security
Jain et al. Iris Recognition
Barra Design of a Multi-biometric Platform, based on physical traits and physiological measures: Face, Iris, Ear, ECG and EEG
Ibitayo et al. Development Of Iris Based Age And Gender Detection System
Nair et al. Facial scan change detection
Prokoski et al. 9 INFRARED IDENTIFICATION OF
Burge et al. 13EAR BIOMETRICS
Wang Investigation of infrared hand vein pattern biometrics

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

ENP Entry into the national phase

Ref document number: 2354594

Country of ref document: CA

Ref country code: CA

Ref document number: 2354594

Kind code of ref document: A

Format of ref document f/p: F

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2001 532489

Kind code of ref document: A

Format of ref document f/p: F

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2000986798

Country of ref document: EP

AK Designated states

Kind code of ref document: A3

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WWP Wipo information: published in national office

Ref document number: 2000986798

Country of ref document: EP

AK Designated states

Kind code of ref document: C2

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

COP Corrected version of pamphlet

Free format text: PAGES 1/14-14/14, DRAWINGS, REPLACED BY NEW PAGES 1/12-12/12

WWW Wipo information: withdrawn in national office

Ref document number: 2000986798

Country of ref document: EP