US20070265507A1 - Visual attention and emotional response detection and display system - Google Patents
- Publication number
- US20070265507A1 (application US 11/685,552)
- Authority
- US
- United States
- Prior art keywords
- stimulus
- subject
- emotional response
- information
- emotional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Definitions
- the invention relates to computer-implemented systems and methods for determining and displaying visual attention and other physiological signal measurements (e.g., emotional response information of a person in response to presented stimuli) by collecting and analyzing eye movement, other eye properties and/or other data.
- Eye tracking systems in general are known.
- Emotional response detection systems in general are known.
- However, various limitations and drawbacks exist with these known systems.
- One aspect of the invention relates to a system and method of determining and displaying visual attention information and emotional response information related to stimuli presented to a subject (e.g. a person being tested).
- a fixation point may be a point or area of a stimulus (e.g., visual image) on which a subject focused for at least a minimum amount of time.
- a fixation point may also refer to a fixation area identified by multiple fixation points and saccades.
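The dwell-time definition of a fixation point above can be sketched in code. The following is an illustrative reconstruction only, not the algorithm disclosed in the application; the (timestamp, x, y) sample format, the 100 ms minimum duration, and the 30-pixel dispersion radius are all assumed values.

```python
import math

def detect_fixations(samples, min_duration=0.1, max_radius=30.0):
    """Group (t, x, y) gaze samples into fixation points.

    A fixation is reported when consecutive samples stay within
    max_radius pixels of their running centroid for at least
    min_duration seconds.  All thresholds are illustrative assumptions.
    Returns a list of (center_x, center_y, dwell_seconds) tuples.
    """
    fixations = []
    cluster = []
    for t, x, y in samples:
        if cluster:
            cx = sum(s[1] for s in cluster) / len(cluster)
            cy = sum(s[2] for s in cluster) / len(cluster)
            if math.hypot(x - cx, y - cy) > max_radius:
                # gaze moved away: close out the current candidate cluster
                _flush(cluster, fixations, min_duration)
                cluster = []
        cluster.append((t, x, y))
    _flush(cluster, fixations, min_duration)
    return fixations

def _flush(cluster, fixations, min_duration):
    """Record the cluster as a fixation if it met the dwell threshold."""
    if cluster and cluster[-1][0] - cluster[0][0] >= min_duration:
        cx = sum(s[1] for s in cluster) / len(cluster)
        cy = sum(s[2] for s in cluster) / len(cluster)
        fixations.append((cx, cy, cluster[-1][0] - cluster[0][0]))
```

Points or areas whose dwell falls below the threshold are simply not reported, matching the distinction drawn between fixation points and other gaze points.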
- a spotlight may be an aggregation of fixation points visualized through an aggregated transparency on a black mask (or other type of mask) layered above the stimulus.
- the spotlight feature may be used to indicate one or more fixation points.
- Aggregated fixation points may also be used with temporal ordering to create attention points. Attention points may be visualized through numbering to indicate the temporal ordering of the aggregation of fixation points (e.g., spotlight).
- Other points or areas (e.g., ones that do not meet the threshold or other parameters) may be selectively distinguished from the fixation points.
- the gaze plot with spotlight feature can graphically depict the portions of a stimulus on which a subject fixated (and, if desired, obscure areas on which the subject did not fixate).
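The spotlight feature described above (an aggregated transparency on a mask layered over the stimulus) might be sketched as follows. The grid resolution, spotlight radius, and linear falloff are illustrative assumptions, not values from the application.

```python
def spotlight_mask(width, height, fixations, radius=40, cell=8):
    """Build a coarse opacity grid for a spotlight overlay.

    Returns a (height//cell) x (width//cell) grid of opacities in
    [0.0, 1.0]: 1.0 is fully black mask, 0.0 is a fully transparent
    "spotlight" hole over aggregated fixation points.  Cell size,
    radius, and the linear falloff are illustrative assumptions.
    """
    rows, cols = height // cell, width // cell
    grid = [[1.0] * cols for _ in range(rows)]
    for fx, fy, _dwell in fixations:
        for r in range(rows):
            for c in range(cols):
                cx, cy = c * cell + cell / 2, r * cell + cell / 2
                d = ((cx - fx) ** 2 + (cy - fy) ** 2) ** 0.5
                if d < radius:
                    # aggregate: keep the most transparent value seen,
                    # so overlapping fixations merge into one spotlight
                    grid[r][c] = min(grid[r][c], d / radius)
    return grid
```

Rendering this grid over the stimulus obscures unattended areas while leaving fixated areas visible, as the gaze plot description suggests.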
- While this information alone is useful to an interested party (e.g., a marketing consultant or other entity), by itself it does not indicate whether the subject had an emotional response to the stimuli as a whole, much less an emotional response associated with one or more given fixation points.
- a subject's emotional response, including the type of emotion (e.g., a positive emotion or a negative emotion), can be determined and displayed for a given stimulus and/or for fixation points of a stimulus.
- a fixation point that is determined to correspond to an emotional response may be referred to as an interest point.
- Emotional response information (e.g., type and/or strength of emotion) may be displayed together with visual attention information (e.g., displaying emotional response information simultaneously with a gaze plot or other display of visual attention information).
- Interest points may also be displayed alone or simultaneously with visual attention and/or emotional response information.
- the displayed emotional response information may include display of one or more of emotional valence and/or emotional arousal. This information can indicate the type of emotion (e.g. a positive one or a negative one) and/or the strength of the emotion. For interest points, the type and strength of the emotional response (among other things) can be determined and/or displayed.
- the display may use different display characteristics to distinguish between different fixation points, attention points, temporal ordering, interest points and/or emotion types and strengths.
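As one hypothetical example of such display characteristics, valence could be encoded as color and arousal (strength) as marker size. The specific colors, thresholds, and scaling below are invented for illustration and do not come from the application.

```python
def display_style(valence, arousal):
    """Map an emotional response to display characteristics.

    valence in [-1, 1] selects a color (positive vs. negative emotion);
    arousal in [0, 1] scales the marker size.  Color names, the 0.1
    neutrality band, and the size scaling are illustrative assumptions.
    Returns a (color, marker_size) pair.
    """
    if valence > 0.1:
        color = "green"    # positive emotion
    elif valence < -0.1:
        color = "red"      # negative emotion
    else:
        color = "gray"     # neutral / indeterminate
    marker_size = 10 + 40 * max(0.0, min(1.0, arousal))
    return color, marker_size
```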
- Emotional response information can be determined in any of a number of ways.
- Various emotion detection techniques are known (e.g., reading facial movement, galvanic skin response and various other techniques).
- the emotional response information can be detected based, at least in part, on the subject's eye properties (e.g., eye movement, blink rate, pupil dilation and/or other eye properties).
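As a hedged illustration of detecting emotional response from eye properties, relative pupil dilation over a calibration baseline can serve as a crude arousal proxy. Real systems must control for luminance and other confounds, which this sketch ignores; the 5% threshold is an assumption.

```python
def arousal_from_pupil(baseline_mm, samples_mm, threshold=0.05):
    """Crude emotional-arousal proxy from pupil diameter.

    Compares mean pupil diameter (mm) during stimulus presentation to a
    calibration baseline; relative dilation above `threshold` is flagged
    as arousal.  The threshold and the neglect of luminance effects are
    simplifying assumptions for illustration only.
    Returns (relative_change, aroused_flag).
    """
    mean = sum(samples_mm) / len(samples_mm)
    relative_change = (mean - baseline_mm) / baseline_mm
    return relative_change, relative_change > threshold
```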
- a system may include, among other things, a set-up module (e.g., for enabling set-up of one or more of test parameters, subject profile, stimuli parameters, calibrations and/or other set-up parameters), a stimuli presentation module (e.g., for managing the storage and presentation of stimuli), a data collection module, an analysis module (e.g., for analyzing the collected data to determine visual attention and/or emotional response) and an output module for selectively outputting information, including information relating to the determined visual attention and/or emotional response information, among other things.
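The module arrangement described above might be wired together as in the following sketch. The class and method names are invented for illustration and do not appear in the application.

```python
class TestSession:
    """Minimal orchestration of the described modules: set-up,
    stimuli presentation, data collection, analysis, and output.
    Method names and data flow are illustrative assumptions."""

    def __init__(self, setup, presenter, collector, analyzer, output):
        self.setup = setup
        self.presenter = presenter
        self.collector = collector
        self.analyzer = analyzer
        self.output = output

    def run(self, subject, stimuli):
        # calibrate the subject, then present each stimulus and
        # collect response data for it
        params = self.setup.calibrate(subject)
        raw = []
        for stimulus in stimuli:
            self.presenter.show(stimulus)
            raw.append(self.collector.collect(params))
        # analyze collected data and hand results to the output module
        results = self.analyzer.analyze(raw)
        return self.output.render(results)
```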
- the output may be in any of a number of different forms, can include various types of information and can include various levels of detail.
- the output module enables the output of visual attention information, such as a gaze plot with a spotlight feature and/or attention points (e.g., as explained above).
- the output module enables output of a subject's emotional response information and/or interest point in motion. Other types and combinations of outputs may be selected.
- Any of the outputs can be for: a single stimulus presented to a single subject (e.g., a person); an aggregate output for a number of stimuli presented to the same subject; an aggregate output of a single stimulus presented to a group of subjects; and/or a number of stimuli presented to a group of subjects.
- Any of the outputs can include a “snapshot” view (e.g., a single result for information determined by sampling over a specific period of time) and/or a time series display (e.g. a series of snapshots over time), animation, and/or a video (e.g., a relatively continuous, motion display showing the subject's eye movement and/or other information over a period of time).
- the visual attention information and/or emotional response information may be recorded and played back to demonstrate the subject's visual attention and/or emotional response in a video replay mode. Playback controls may be provided.
- The system and method of the invention may be configured to determine the visual attention of a subject regarding one or more specified stimuli and/or various portions (e.g., selected areas) of the stimuli.
- the visual attention information that is determined may include fixation points (gaze) and saccades (e.g., the path between fixation points) and/or other information.
- this enables a subject's eye movements, which may have previously been calibrated to display device coordinates, to be correlated to a visual stimulus or portions thereof.
- the visual attention information relates to what portion(s) of the stimulus the subject is looking at, at one or more points in time. All or some points/areas of the stimulus at which the subject looked may be identified and displayed, or only points/areas meeting certain criteria may be displayed. For example, threshold values may be set to display only points/areas on which a subject fixated for at least a predetermined minimum period of time, or points/areas to which the subject returned a number of times. Other criteria may include temporal ordering of the points/areas of the stimulus that are identified as fixations.
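The display criteria described above (minimum dwell time, repeated returns to the same area) can be sketched as a filter over detected fixation points. The (x, y, dwell) tuple format, the 50-pixel grid used to count revisits, and the thresholds are assumptions for illustration.

```python
def filter_display_points(fixations, min_dwell=0.1, min_revisits=1, grid=50):
    """Select fixation points meeting display criteria.

    Keeps (x, y, dwell) fixations whose dwell time meets min_dwell, or
    whose coarse grid cell (grid-pixel squares) was revisited at least
    min_revisits additional times.  Thresholds are illustrative.
    """
    # count how often each coarse area of the stimulus was fixated
    visits = {}
    for x, y, _d in fixations:
        cell = (int(x) // grid, int(y) // grid)
        visits[cell] = visits.get(cell, 0) + 1
    kept = []
    for x, y, d in fixations:
        cell = (int(x) // grid, int(y) // grid)
        if d >= min_dwell or visits[cell] - 1 >= min_revisits:
            kept.append((x, y, d))
    return kept
```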
- a service provider may use the software/system to run test centers that subjects visit.
- one or more test leaders (and/or administrative users) may assist/guide the subjects in conjunction with the testing.
- Self-operated and/or semi-automated test centers (e.g., kiosks, PCs, etc.) may also be used.
- Remotely supervised testing may also be implemented.
- the service provider may collect fees on a variety of bases including, but not limited to, a per-test fee, a per-stimulus fee, a per-subject fee, a per-population-segment fee, and/or other bases. Additionally, the amount of the fee may vary depending on the type and/or detail of the output. For example, a simple output (e.g., gaze plot only) may be provided for a first fee. A gaze plot with the spotlight feature may be a second fee. A simultaneous display of a gaze plot with basic emotional response information may be a third fee. Adding more detailed emotional response information may be a fourth fee. Other business models for such service providers may be implemented.
- a service provider may operate a remotely accessible (via the Internet or other network) test facility with which subjects can interact remotely.
- the subject can access the remotely accessible test facility in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise. Fees may be charged as indicated above or otherwise.
- the software may be licensed.
- the licensing may be on a modular basis.
- the visual attention module and/or emotional response module may respectively include a core visual response engine and a core emotional response engine.
- the core engines may each be licensed for a base fee. Separate plug-ins (or other modules) to provide enhanced functionality and/or greater level of detail may be provided for separate fees.
- Yet another business model may require a predetermined type of device to be licensed with the software. For example, a serial number of the eye tracking device may be determined to be an acceptable device before it is allowed access to software functions.
- Other licensing models can be used.
- An invoice module may monitor system activities to facilitate in any invoicing that may be necessary or desired.
- any of the set-up/calibration and/or running of tests may be done manually, automatically and/or semi-automatically. If desired, real-time monitoring of the results may be made available locally or remotely.
- FIG. 1 is an example of a high level representation of a method according to one embodiment of the invention.
- FIG. 2 schematically illustrates a functional block diagram of an example of portions of a system for determining visual attention and emotional response information relating to a stimuli presented to a subject according to an embodiment of the invention.
- FIG. 3 is an illustration of an exemplary functional block diagram of portions of a system according to one embodiment of the invention.
- FIG. 4 is a high-level exemplary flow diagram of methods for setting up and running tests and analyzing test results according to various embodiments of the invention.
- FIG. 5 is an illustration of an exemplary visual stimulus, according to an embodiment of the invention.
- FIG. 6 is an illustration of one example of an output generated by the system, according to an embodiment of the invention.
- FIG. 7 depicts examples of some components of outputs according to some aspects of the invention.
- FIG. 8 is an illustration of an output generated by the system, according to an embodiment of the invention.
- one scenario relates to situations where a subject (e.g., an individual) is tested by presenting stimuli and/or survey questions to the subject (e.g., to determine the subject's reaction to advertisements, a new product, a new feature of a product and/or packaging for a product, among other things).
- the invention will be discussed primarily in the context of such testing. This is not intended to limit the invention thereto.
- the invention can be used in a wide variety of other scenarios and applications as well.
- testing and/or “study/survey” may broadly refer to a wide variety of activities (e.g., advertising or marketing studies or surveys for new products, new features, new packaging or other testing or studies).
- a “subject” may, for example, include a person, animal or other test subject being tested.
- Stimuli may include any type of sensory stimuli corresponding to any one or more of the five senses (sight, sound, smell, touch, taste) and/or other stimuli.
- Visual stimuli may be presented on a display (e.g., as a single image, two or more images sequentially or simultaneously, as a video, or otherwise).
- Examples of visual stimuli may include pictures, artwork, charts, graphs, text, movies, multimedia presentations, interactive content (e.g., video games), or other visual stimuli.
- Stimuli may be recorded (on any type of media) and/or include live scenarios (e.g., driving or riding in a vehicle, etc.).
- Various stimuli and/or stimuli types may be combined.
- stimuli may be selected based on the purpose and need. For example, in an advertising context, stimuli may correspond to a product advertisement to determine the overall reaction to the stimuli (the ad) and more detailed information (e.g., where on the ad the subject's attention is drawn, and what emotions are felt while perceiving the stimuli or portions thereof).
- an “administrator,” or “administrative user” may refer to the person that performs at least some of the setup operations related to a test (and/or other functions).
- an administrator may interact with the system to input critical test setup parameters including, for example, stimuli parameters, subject participants, background variables (e.g., age, gender, location, etc.) and/or other parameters.
- a study/survey leader may assist in running the actual test.
- the administrator and the leader may be the same person or different people.
- FIG. 1 illustrates an example of a high level diagram of a method according to one embodiment of the invention.
- Various set-up/calibration steps may be performed (Step 2 ).
- Set-up and calibration techniques in general, are known. Examples of these steps may include, among other things, test set-up, subject setup, stimuli setup, various calibration steps and/or other steps.
- segmentation setup may include collecting both independent and dependent background variables. Stimuli may be presented to a subject (Step 4 ). If desired, survey questions may also be presented to the subject. Survey presentation and survey results collection, in general, are known. However, according to one novel aspect of the invention, survey responses, visual attention information and emotional response information may be correlated.
- Data relating to the subject's reactions to the stimuli (including visual attention data and/or emotional response data) are collected (Step 6).
- the collected data (and/or other desired information) may be analyzed (Step 8 ).
- the analysis may include determining visual attention information (Step 10 ), emotional response information (Step 12 ), interest point(s) (Step 14 ) and/or other information (e.g., physiological information associated with a subject with respect to one or more presented stimuli).
- Analysis data may then be stored and/or selectively output (Step 16 ).
- the output can be in any of a variety of forms, including a computer displayed report or other type of output.
- One aspect of the invention relates to specific types of output as detailed below.
- FIG. 2 illustrates one example of parts of a simplified view of a system that can be used to implement some aspects of the invention.
- the system may include one or more of an eye tracking device 120, a display device 130, and a computer device 110.
- Computer 110 may be programmed (or access a computer/server that is programmed) with one or more of a stimuli presentation module 203, a visual attention engine 205 a, and/or an emotional response engine 205 b.
- An output module 206 may be used to generate output 118 .
- One or more storage devices may store stimuli, data, analysis results and/or other information.
- a subject 50 may be positioned in proximity to display device 130 .
- Stimuli presentation module 203 may cause selected stimuli to be displayed on the display device 130 to expose subject 50 to one or more visual (or other) stimuli (e.g. stimuli displayed on display device 130 and/or other device).
- One or more data collection devices (e.g., eye tracking device 120 and/or other data collection devices) may be used to collect data relating to the subject's response to the stimuli.
- the collected data may include a desired number of discrete samples (e.g., 50-60 samples per second or any other desired frequency) over a predetermined period or variable period of time (e.g., 1-3 seconds or any other period).
- the collected data may include a continuous sampling (e.g. a video) for a fixed or variable period of time.
- the collected data may include eye movement and other eye properties, physiological data, environmental data and/or other data relating to the subject's response to various stimuli. Manual input from the user may also be received.
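The discrete-sampling description above (e.g., 50-60 samples per second over a 1-3 second period) can be illustrated with a simple uniform sampling schedule. Real eye trackers deliver timestamped samples with jitter, which this sketch ignores; the callback-based collection shape is an assumption.

```python
def sample_schedule(rate_hz=60, duration_s=1.0):
    """Timestamps (seconds) for discrete sampling at rate_hz over
    duration_s, e.g. the 50-60 samples/second over 1-3 seconds
    described above.  Uniform spacing is an idealizing assumption."""
    n = int(rate_hz * duration_s)
    return [i / rate_hz for i in range(n)]

def collect(read_sample, rate_hz=60, duration_s=1.0):
    """Pull one sample per scheduled tick from a device callback.

    `read_sample` stands in for whatever the eye-tracking device (or
    other data collection device) exposes; its signature is assumed.
    Returns a list of (timestamp, sample) pairs.
    """
    return [(t, read_sample()) for t in sample_schedule(rate_hz, duration_s)]
```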
- the eye tracking device 120 may be integrated with and/or mounted on or in the display device 130 .
- these devices may also be implemented as separate units based on various detection environments and scenarios.
- a display device 130 may include a monitor, touch screen, LCD screen, and/or other display devices.
- a simple USB type video camera may be used as the eye-tracking device 120 .
- This (or other eye-tracking devices) may be integrated with or mounted to any usable display.
- One example of an eye-tracking device 120 is the Tobii 1750 Eye-tracker, commercially available from Tobii Technology AB.
- the eye-tracking device may include or interact with a software program to control the eye-tracker and collection of data thereby.
- the eye-tracking device may include Clearview™ software (provided by Tobii).
- Other eye-tracking software can be used. This software may be a standalone application or may be bundled with or part of one or more of the other software modules described herein.
- the eye-tracking software may incorporate one or more of the other software modules.
- Other eye-tracking devices, displays and/or technology may be used in place of, or with, the various components described herein.
- FIG. 3 illustrates a more detailed functional block diagram of a system (and other features), according to one embodiment of the invention.
- FIG. 3 illustrates a computer 110 having one or more interfaces 114 for interfacing with one or more input devices 100 , one or more presentation devices 101 and/or one or more output devices 102 .
- Computer 110 may further be in communication with one or more storage devices, such as stimuli database 240 , data collection database 241 , subject profiles database 242 , analysis results database 243 and/or other storage devices.
- One or more of databases 240 , 241 , 242 and 243 may be provided to store stimuli information, collected data, subject profile information, analysis results and/or other data. These databases may be separate databases, as shown for clarity, or one or more may be combined into a single database for storing application system data.
- the input devices 100 may be used for receiving input (e.g., from a subject 50 or other input).
- the input may include but is not limited to, information regarding a subject's visual attention, emotional response and/or other responses to stimuli.
- Other input may include user information received during a set-up/calibration procedure, survey responses and/or other user input, and other desired input.
- Sensors such as scent sensors, tactile sensors, sound sensors and/or other sensors may also be used as input devices.
- the presentation devices may include, for example, one or more of display device 130 , speaker(s) 180 , and other presentation devices.
- Display device 130 may be used for visually displaying and presenting visual stimuli to a subject.
- the output devices 102 may include, for example, one or more of a display device 130 (or other display), speakers 180, printer 190, and/or other output devices.
- the display device 130 may include a video display for displaying a video playback of the collected data or a processed version of the collected data.
- Computer 110 is programmed with, or is in communication with a computer (e.g., a remote server) that is programmed with, a software application (e.g., application 200 illustrated in FIG. 3 ) to perform the functions described herein.
- Computer 110 may be a single computer or multiple computers.
- One or more computers 110 may be located locally (in proximity to the test subject 50 ) and one or more may be located remotely from the test subject 50 (e.g.
- One or more computers 110 can be standalone computers running an application 200 .
- One or more computers 110 can be networked (e.g., via network interface 209) to one another and/or any third party device 260, to enable networked communication therebetween. This may enable, among other things, browser-based access from one computer to a central computer 110 running application 200.
- the computer 110 may access the application 200 over a network 250 (e.g., the Internet, an intranet, WAN, LAN, etc.) via any wired and/or wireless communications links.
- Application 200 may include one or more computer software programs and/or modules that, among other things, perform functions set forth herein.
- application 200 may perform functions including one or more of setup/calibration, testing, stimuli presentation, data collection, analysis, output generation and/or formatting, invoicing, and data mining, among others.
- For convenience, various ones of the functions may be carried out by various modules 201-209, as shown for example in FIG. 3.
- One or more modules may be combined and any module shown as a single module may include two or more modules.
- the modules may include one or more of an interface controller module 201, a setup module 202, a stimuli presentation module 203, a data collection module 204, an analysis module 205, an output module 206, an invoice module 207, a data mining module 208 and/or other modules. Not all modules need to be used in all situations.
- One or more interface controller modules 201 may be associated with and/or in communication with one or more input devices 100, presentation devices 101, and output devices 102 in any known manner.
- One or more controllers 201 may be implemented as a hardware (and/or software) component of the computer 110 and used to enable communication with the devices attached to the computer 110 .
- the communication can be conducted over any type of wired or wireless communication link. Secure communication protocols can be used where desired.
- Setup module 202 includes sub-modules for one or more of subject setup 202 a , stimuli setup 202 b , calibration 202 c and/or other setup/calibration procedures. These procedures may include those referred to in connection with Step 2 of FIG. 1 , among others.
- Data received by the system during setup/calibration (e.g., background variables, test parameters, stimuli parameters, subject parameters, etc.) may be stored for later use.
- Stimuli presentation module 203 may be provided to facilitate the presentation of stimuli according to stimuli setup information, stored stimuli and/or other stimuli presentation properties.
- the stimuli presentation module 203 may include or interact with a graphical user interface (not shown) to enable stimuli to be managed (e.g., stored, deleted, modified, uploaded/downloaded, or otherwise managed) by an administrative user or otherwise. Additionally, a user interface can enable one or more stimuli and stimuli presentation properties to be selected for use with a particular test or other application.
- the data collection module 204 may collect data (e.g., from one or more of input devices 100 or other input devices) during stimuli presentation (and at other times). The data collection module 204 may cause the collected data to be stored in data collection database 241 or other database for later (or real-time) analysis.
- Analysis may be done by the analysis module 205 and/or other processor.
- Analysis module 205 may include sub-modules for visual attention processing 205 a , emotional response processing 205 b , and/or other sub-modules. If desired, various plug-ins 205 c , may be used to enhance the functionality of a core emotional response engine and/or visual attention engine.
- Analysis results may be stored in analysis database 243 or other database.
- the analysis module 205 may process the collected data using one or more error detection and correction (data cleansing) techniques. As such, the collected data may be refined and filtered to decrease signaling noise and other errors. The clean data may be more easily and/or accurately analyzed.
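As an illustrative sketch of such data cleansing, the following drops invalid gaze samples (blinks commonly surface as zero coordinates) and interpolates isolated outlier jumps. These specific rules and thresholds are assumptions, not the techniques disclosed in the application.

```python
def cleanse(samples, max_jump=200.0):
    """Simple error detection/correction for (t, x, y) gaze samples.

    Drops invalid samples (non-positive coordinates, as blinks often
    appear) and replaces single-sample jumps larger than max_jump
    pixels with the midpoint of their neighbors.  The rules and the
    max_jump value are illustrative assumptions.
    """
    # error detection: discard clearly invalid samples
    valid = [(t, x, y) for t, x, y in samples if x > 0 and y > 0]
    cleaned = list(valid)
    # error correction: smooth isolated outlier jumps
    for i in range(1, len(valid) - 1):
        _, px, py = valid[i - 1]
        t, x, y = valid[i]
        _, nx, ny = valid[i + 1]
        if abs(x - px) > max_jump and abs(x - nx) > max_jump:
            cleaned[i] = (t, (px + nx) / 2, (py + ny) / 2)
    return cleaned
```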
- Various plug-ins 205 c may be used to offer greater level of detail to and/or additional functions regarding the visual attention and/or emotional response processing.
- interest points may be determined from the collected data. Additional levels of detail may include detailed interest points, emotional valence determination, emotional arousal determination, and emotion name and type.
- An output module 206 may selectively enable various types of outputs to output from the application 200 to one or more output devices 102 .
- the output module 206 may be used to produce reports based on analysis results. For example, visual attention information and emotional response information may be output and presented with respect to the actual stimuli in the report output 118 .
- Various electronic and/or printed output types may include, but are not limited to, representations in the form of graphs, text, illustrations, gaze plots, emotion meters, audio, and/or video playback, to name a few. Further details and examples of output are set forth in connection with FIGS. 6-8. Other output types and formats may be used.
- FIG. 4 illustrates examples of methods for carrying out various aspects of one embodiment of the invention.
- FIG. 4 illustrates a Study/Survey setup phase, a Study/Survey Run phase and a Study/Survey Analysis phase.
- These phases and/or other phases may be carried out at a test facility (or outside the test facility) in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise.
- Testing may be supervised, semi-supervised or unsupervised.
- the testing may be run by a study/survey leader on each subject manually. Outside of the testing facility, the subject 50 may run the study/survey with or without a study/survey leader.
- the subject's emotional state may remain unaltered and unaffected by the presence of a study/survey leader.
- a combination of aspects from the testing facility and from outside the testing facility may be used during the phases illustrated in FIG. 4 .
- Other testing environments may also be included within the scope of the invention.
- In some or all testing environments, there may be a Study/Survey setup phase. In this phase, the administrator (or other individual) enters or selects the stimuli and/or survey data and other setup parameters (e.g., background variables). This information may be stored in stimuli database 240 and/or other database(s) (Step 501).
- the stimuli to be presented during the study/survey may be selected using the study/survey setup sub-module 202 b of the setup module 202 .
- the selected stimuli may be loaded on the computer 110 and/or stored in stimuli database 240 or other database.
- Various stimuli sources may be used.
- Remote stimuli (not shown) may be accessed via network interface 209 over the network 250 (e.g., internet, intranet, etc.) to download stimuli from a remote source, such as an advertisement database.
- Another stimuli source may be a stimuli creation application which may allow the creation and/or customization of stimuli.
- the creation application may enable multimedia stimuli creation.
- stimuli presentation properties may also be selected. For example, for a given test/study, one or more of the stimuli duration for one or more stimuli, the order of presentation of stimuli (e.g., random presentation of stimuli), whether any stimuli should be simultaneously presented, and/or other stimuli properties may be selected.
- the parameters for identifying a fixation point may be provided during a set up of stimuli properties (or at other times). For example, this may be based at least on threshold values for dwell time or other parameters.
- the visual display of spotlight(s) may be set-up to be based on a number of aggregated fixation points or other factors.
- the attention points may be set-up to visually indicate the temporal ordering (e.g., semitransparent number indicator) of aggregated fixation points with respect to identified spotlights.
- Interest points may be identified based on fixation point (e.g., as determined by selected criteria) and emotional response (as defined by selected criteria) at the fixation point. For example, it may be specified that if a particular type and/or strength of emotional response is associated with one or more fixation point(s), this may identify an interest point(s).
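The interest-point criterion above (a fixation point paired with a qualifying emotional response) can be sketched directly. The data shapes and the 0.3 strength threshold are illustrative assumptions.

```python
def interest_points(fixations, emotions, min_strength=0.3):
    """Identify interest points: fixation points associated with a
    qualifying emotional response.

    `fixations` is a list of (x, y, dwell) tuples; `emotions` maps a
    fixation index to an (emotion_type, strength) pair.  A fixation
    becomes an interest point when its response strength meets
    min_strength.  Data shapes and threshold are assumptions.
    """
    points = []
    for i, (x, y, dwell) in enumerate(fixations):
        emotion = emotions.get(i)
        if emotion and emotion[1] >= min_strength:
            points.append({"x": x, "y": y, "dwell": dwell,
                           "emotion": emotion[0], "strength": emotion[1]})
    return points
```

A display layer could then render these points with type- and strength-dependent characteristics, per the selected criteria.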
- Output presentation properties may also be specified using the setup module 202 .
- the output presentation properties may identify what analysis will be done, output type and/or format, who should receive the output and/or how the output will be received, among other things.
- the level of information to be included in an output report may be specified using, for example, a presentation format including predetermined templates.
- the parties to receive the output information and the associated transmission means may also be specified as part of the output presentation properties.
- the output(s) may be sent to a specified user/device using a predetermined transmission means (e.g., email, phone, FTP, etc.).
- the output presentation properties may be entered by one or more of the administrator, leader, and subject and/or other individual.
- the method of FIG. 4 may also include receiving profile (and/or other) information regarding the subject (e.g., background variables including age, gender, location, etc.)
- the leader may enter or guide the subject(s) to enter details of the participating subject (Step 502 ). This may include using subject set-up sub-module 202 a of the setup module 202 .
- the information may be stored in subject profile database 242 or other database.
- Calibration of the subject may also be performed, either manually, automatically and/or semi-automatically (step 504 ).
- stimuli and/or survey questions may be presented for display to the subject (Step 506 ).
- the subject may answer survey questions manually or otherwise (Step 508 ).
- Visual attention data and emotional response data may be collected as described elsewhere herein. In other testing environments, various ones of these steps may be performed without a leader (steps 512 - 514 , 516 and 518 ).
- After the stimuli presentation of the study/survey is completed, it is determined whether another participating subject is available (Steps 510, 520). If so, the process may be repeated with another subject. If not, the study session may be concluded and/or analysis may be performed (Step 550 ).
- Analysis may be performed at the conclusion of a test/study and/or in real-time as data is collected.
- the analysis may include processing the collected data to determine visual attention information and/or emotional response information, among other things.
- Eye-tracking, emotional response (and other) calibration techniques in general are known. Examples of some aspects of the calibration routines that may be used with the invention are provided. Other calibration techniques may also be used.
- Calibration sub-module 202 c performs calibration activity, including subject/device calibration. Eye tracking device 120 and other input devices may be calibrated based on environmental settings and scenarios. Also during calibration, the calibration sub-module 202 c may present the subject with a number of calibration points located in predetermined locations of the display device or the subject's field of vision for subject specific calibration.
- the calibration points may correspond to coordinates of the display device on which the subject may be prompted to focus and move between until the eye tracking device has calibrated the movement of the subject's eyes in relation to the display device coordinates (e.g., x, y, z coordinates).
- the point calibration information is recorded and stored with the subject profile data for future testing sessions.
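The subject-specific calibration described above can be sketched as a simple per-axis least-squares fit from raw eye-tracker readings to display coordinates at the known calibration points. Real eye trackers typically use richer (often polynomial or 3-D) models; the names and values below are illustrative assumptions only:

```python
# Minimal sketch: fit screen = a * raw + b for one axis, using the
# known calibration points, then map a new raw reading to the screen.

def fit_axis(raw, screen):
    """Least-squares fit of screen = a * raw + b for one axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_s = sum(screen) / n
    cov = sum((r - mean_r) * (s - mean_s) for r, s in zip(raw, screen))
    var = sum((r - mean_r) ** 2 for r in raw)
    a = cov / var
    b = mean_s - a * mean_r
    return a, b

# Calibration data: raw gaze readings vs. known screen x coordinates.
raw_x = [0.1, 0.5, 0.9]
screen_x = [0, 512, 1024]
a, b = fit_axis(raw_x, screen_x)
mapped = a * 0.5 + b  # map a new raw reading to a screen coordinate
```

The fitted coefficients (a, b per axis) are what would be stored with the subject profile for reuse in future sessions.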
- Emotional calibration may also be recorded and stored.
- the subject may be presented with predetermined stimuli used to evoke a certain emotion in order to observe the subject's emotional reaction in relation to their eye properties.
- a subject may be presented with a stimulus known to elicit a positive (e.g., pleasant), neutral, or negative (e.g., unpleasant) response.
- a subject may be presented with an emotionally neutral stimulus in order to record blink rate pattern, pupil response, saccadic movements, and/or other properties to characterize the subject's response to neutral stimuli.
- the subject may be presented with stimuli known to evoke a certain emotion based on the subject's demographic and other personal data.
- the emotional reaction may be used to set an emotional baseline for various emotions. Thus, study/survey stimuli may be compared with a subject's baseline to understand the magnitude of emotional valence.
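The baseline idea above can be sketched as follows: record an eye metric (e.g., pupil diameter) during neutral stimuli, then express later responses relative to that baseline to estimate emotional magnitude. The metric choice and the z-score formulation are illustrative assumptions, not the patent's method:

```python
# Sketch: compute a neutral baseline (mean and standard deviation) for
# an eye metric, then score a test-stimulus response as its deviation
# from that baseline. All names and sample values are assumptions.

def baseline_stats(samples):
    """Mean and population standard deviation of baseline samples."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, var ** 0.5

def arousal_score(value, mean, std):
    """How far a measurement deviates from the neutral baseline."""
    return (value - mean) / std if std else 0.0

neutral_pupil = [3.0, 3.2, 2.9, 3.1, 2.8]  # mm, during neutral stimuli
mean, std = baseline_stats(neutral_pupil)
score = arousal_score(3.8, mean, std)       # response to a test stimulus
```

A large score (in either direction) would suggest a strong response relative to the subject's own neutral state, which is the sense in which study stimuli are compared against the baseline.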
- various data may be collected.
- the collected data may be processed in real-time or subsequently. Collected and processed data may be presented as output information to present visual attention, emotional response and/or other information in a variety of formats as discussed in reference to output presentation properties.
- One type of output may be a visual output to a display, a visual printout or other visual output. Non-visual output may also be provided.
- the output may include a graphical representation including visual attention information (e.g., one or more gaze plot) and/or emotional response information (e.g. one or more emotion meter) for one or more stimuli.
- the gaze plot (e.g., with spotlight(s), attention points, and/or interest points) may be superimposed on the relevant stimulus (or stimuli if two or more are simultaneously displayed).
- the gaze plot may include a spotlight feature to highlight aggregated fixation points, attention points to highlight temporal ordering of aggregated fixation points and interest points to highlight emotional response.
- FIG. 5 is an illustration of exemplary visual stimuli, according to an embodiment of the invention.
- FIG. 6 is an example of an output, according to one aspect of the invention, relating to the stimuli of FIG. 5 .
- the output includes a simultaneous display of visual attention information 800 and emotional response information 810 (e.g., an emotion meter) relating to a subject's response to a visual stimulus (e.g., the stimulus 700 of FIG. 5 ).
- the visual attention information 800 includes a gaze plot with a spotlight feature and attention points.
- the spotlight feature highlights (or otherwise illustrates) one or more fixation points and/or interest points of the visual stimulus 700 .
- a virtual mask may be superimposed over all or some of the stimulus (e.g., visual image 700 ) and portions of the mask, corresponding to one or more fixation points (e.g., based on minimum time of fixation), may be effectively removed or made more transparent to reveal the underlying portion of the stimulus.
- Another approach is to start with the entire stimulus revealed and selectively mask non-fixation points.
- the mask may have a first set of optical characteristics and the removed portions may have a second set of optical characteristics (e.g., to distinguish the one or more fixation points from the rest of the stimulus).
- the mask may be at least relatively opaque (to fully or partially obscure the underlying portion of the stimulus) and the removed portions corresponding to the fixation points may be made at least relatively more transparent to highlight (or spotlight) the fixation points (as shown for example by pointers 801 - 804 ). Areas illustrated by 801 , 802 , 803 , and 804 may also include attention points for numbering according to the temporal ordering of fixation. If desired, the actual stimulus may be displayed in proximity to the gaze plot to easily see the masked portions of the stimulus.
- the fixation points may be displayed more brightly than the other points.
- Other techniques for visually displaying distinctions between fixation points and non-fixation points may be used.
- a relative difference in optical characteristics may also be used to indicate the magnitude of fixation points. For example, if a subject dwells at a first fixation point for a longer time than a second fixation point, the first fixation point may be relatively more transparent than the second fixation point, yet each may be more transparent than non-fixation points. Other optical characteristics can be used to distinguish among fixation points and to distinguish fixation points from non-fixation points.
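The graded-transparency spotlight described above can be sketched as an opacity grid laid over the stimulus, where cells containing fixation points become more transparent in proportion to dwell time. The grid representation, the linear dwell-to-alpha mapping, and all values below are assumptions for illustration:

```python
# Sketch: build a width x height grid of alpha values (1.0 = opaque,
# 0.0 = fully transparent). Longer dwell at a fixation point yields a
# more transparent cell, so heavily-fixated areas "shine through".

def build_mask(width, height, fixations, max_dwell_ms=2000, base_alpha=0.9):
    """fixations: list of (x, y, dwell_ms) tuples in grid coordinates."""
    mask = [[base_alpha] * width for _ in range(height)]
    for x, y, dwell in fixations:
        # Longer dwell -> lower alpha (more transparent), capped at max.
        alpha = base_alpha * (1.0 - min(dwell, max_dwell_ms) / max_dwell_ms)
        mask[y][x] = min(mask[y][x], alpha)
    return mask

mask = build_mask(4, 3, [(1, 1, 2000), (3, 0, 500)])
# Cell (1, 1) had the longest dwell and is the most transparent cell.
```

A renderer would then composite this alpha grid over the stimulus image, which corresponds to the first-fixation-more-transparent-than-second behavior described above.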
- the order of the fixation points may be visually indicated using attention points, either statically or dynamically. If static, the fixation points may be marked with numbers (or other indicators) to match the temporal ordering of one or more fixation points. If dynamic, a first fixation point may be highlighted as compared with other fixation points (e.g., displayed more transparently or more brightly). Then a second and other fixation points may be highlighted in a sequential fashion.
- a fixation point that is determined to correspond to an emotional response may be referred to as an interest point.
- One or more interest points may be displayed differently than fixation points that are not associated with an emotional response.
- one interest point may be displayed differently than another interest point based on the determined emotional valence and/or arousal associated with the point or other differences.
- a spotlight feature may be used to highlight one or more portions/areas of the visual stimulus that correspond to interest points. Characteristics of the interest point spotlights may vary to indicate the type and/or strength of a subject's emotional response associated with the fixation point.
- Emotional response information 810 may be displayed simultaneously with visual attention information 800 .
- the emotional response information 810 may include an overall emotional response based on the subject's response to the stimulus (or stimuli) and/or area related emotional response information corresponding to portions of one or more stimuli. For example, a more detailed level of emotional response may be provided by separately displaying emotional response information for one or more fixation points. As shown in FIG. 6 , by way of example only, an emotional response meter may show the emotional valence and/or arousal for one or more fixation points. Emotional valence may also be displayed for interest points, spotlights, and/or attention points.
- Textual information may be included at various locations on the report, if desired.
- FIG. 7 illustrates some options for display of visual attention information and emotional response information. Various permutations of these features may be used together. Not all features need be used in all cases.
- the visual attention information may include a gaze plot (with or without the spotlight feature, attention points, interest points).
- a gaze plot if used, may illustrate a scan path corresponding to the subject's eye movements, fixation points, and/or interest points.
- the visual attention information may be for one or more stimuli at a time.
- the visual attention information may be static or dynamic.
- a dynamic display may include a sequence of individual displays (e.g., a slide show mode), animated playback, one or more videos and/or other dynamic displays.
- Some output may be automatically generated according to one or more templates.
- Various templates and/or template parameters may be pre-stored in the system. Pre-stored templates can be selected and/or modified (e.g., by an administrative user, test-study leader or other entity). New templates may also be created and stored.
- Reports and other output 118 may be automatically sent to one or more recipients and/or recipient devices.
- Recipients may include subject 50, a third party device 250, a study/survey leader, an administrator, and/or another recipient.
- Output 118 may be stored for later retrieval, transmission, and/or data warehousing.
- Output and reports can be in any of a number of formats, including without limitation, JPEG, Word, PDF, XML and any other convenient output format.
- emotion maps may be displayed simultaneously and in synchronization with the stimuli that provoked them.
- a first gaze plot with spotlight feature for a first stimulus 900 a may be displayed in proximity to corresponding emotion map 900 b which depicts the emotional response of a subject to stimulus 900 a .
- a second gaze plot with spotlight feature for a second stimulus 904 a may be displayed in proximity to corresponding emotion map 904 b which depicts the emotional response of a subject to stimulus 904 a , and so on.
- Different display formats may be utilized.
- Report information along with data from databases may be further analyzed for data mining purposes.
- Data within these databases may be used to uncover patterns and relationships contained within the collected data, subject data, and/or analysis results.
- Background variables (e.g., collected during set-up or at other times) may be included in the mining.
- data mining can be done manually or automatically via data mining module 208 over all or portions of the data.
- Survey questions may be presented one at a time, or a number of survey questions may be shown at one time on a single screen.
- the order, timing, and display attributes of stimuli may be determined by the administrator and/or subject/survey leader at setup, based on what the administrator may want to analyze.
- the administrator may want to study the subject's response to two or more competing market brands. A simultaneous, side by side presentation of stimuli may elicit different visual attention information and emotional reaction with respect to the two or more brands than a sequential display. Other comparative studies may be conducted.
- Collected data may comprise eye property data, other physiological data, environmental data, and/or other data.
- Collected eye property data may include data relating to a subject's pupil size, blink properties, eye position (or gaze) properties, or other eye properties.
- Collected pupil data may comprise pupil size, velocity of change (contraction or dilation), acceleration (which may be derived from velocity), or other pupil data.
- Collected blink data may include, for example, blink frequency, blink duration, blink potention, blink magnitude, or other blink data.
- Collected gaze data may comprise, for example, saccades, express saccades, nystagmus, or other gaze data.
- Data relating to the movement of facial muscles (or facial expressions in general) may also be collected. If a subject is presented with stimuli, collected data may be synchronized with the presented stimuli.
- Visual attention information components may be decoded from the visual cues (e.g., collected eye property data). This may be done, for example, by applying one or more rules from a visual attention analysis sub-module 205 a . Determination and analysis of visual attention may involve various aspects including interest points and interest tracking. Interest points may be based on fixation (gaze) rate and the type of saccades on a portion or portions of the visual stimuli coupled with emotional response as determined by the eye properties. Processing gaze (or eye movement data) may comprise, for example, analyzing saccades, express saccades (e.g., saccades with a velocity greater than approximately 100 degrees per second), and nystagmus (rapid involuntary movements of the eye), or other data.
- Features of interest may include the velocity (deg/s) and direction of eye movements, fixation time (e.g., how long does the eye focus on one point), the location of the fixation in space (e.g., area as defined by x, y, z or other coordinates), or other features including return to fixation areas, relevance, vergence for depth evaluation, and scan activity.
- Visual attention may be determined by setting an adjustable fixation/gazing threshold.
- a sliding window measured in milliseconds (or other unit of time) can be set as a threshold, for example, 400 ms, in order to determine which points or areas on the visual stimuli the subject gazed at for at least 400 ms. If the subject remains fixated on the area for at least the window of time, the area of the visual stimuli may be identified as a fixation point.
- the emotional response (e.g., arousal, valence, if any) corresponding to the fixation point may determine the level of interest in that fixation point. For example, if a determined fixation point also elicited an emotional response that exceeds a predetermined emotional threshold value, then the fixation point may be identified as an interest point.
- interest points/areas may be identified by the area(s) of a visual stimulus which the subject gazes or fixates upon for more than a predetermined period of time (the selectable threshold value) and elicits a measurable emotional response (emotional threshold value).
- If the sliding window threshold is made smaller, for example 100 ms, the subject's entire scan path on the visual stimuli may be revealed. This may allow an administrator or analyzer to see if a specific feature of a visual stimulus was even looked at and for how long.
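The sliding-window fixation test described above can be sketched as follows: consecutive gaze samples that stay within a small spatial radius for at least the window duration are grouped into a fixation point. The sample format, radius, and timing granularity are assumptions for illustration:

```python
# Simplified sketch of sliding-window fixation detection. Lowering
# window_ms (e.g., to 100) surfaces briefer dwells along the scan path.

def detect_fixations(samples, window_ms=400, radius=20):
    """samples: list of (t_ms, x, y) gaze samples in time order.
    Returns a list of (x, y, dwell_ms) fixation points."""
    fixations, start = [], 0
    for i in range(1, len(samples) + 1):
        if i < len(samples):
            _, x0, y0 = samples[start]
            _, x, y = samples[i]
            inside = (x - x0) ** 2 + (y - y0) ** 2 <= radius ** 2
        else:
            inside = False  # end of data closes the current group
        if not inside:
            dwell = samples[i - 1][0] - samples[start][0]
            if dwell >= window_ms:
                _, x0, y0 = samples[start]
                fixations.append((x0, y0, dwell))
            start = i
    return fixations

samples = [(t, 100, 100) for t in range(0, 500, 50)]      # 450 ms dwell
samples += [(t, 400, 300) for t in range(500, 650, 50)]   # brief glance
fix = detect_fixations(samples)
```

With the 400 ms window only the long dwell qualifies; with a 100 ms window the brief glance would also be reported, which is the scan-path effect described above.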
- Graphical representation of the subject's visual attention may be put in the form of a gaze plot.
- Emotional response components may include, for example, emotional valence, emotional arousal, emotional category, and/or emotional type. Other components may be determined.
- Emotional valence may indicate whether a subject's emotional response to a given stimulus is a positive emotional response (e.g., pleasant or “like”), negative emotional response (e.g., unpleasant or “dislike”), or neutral emotional response.
- Emotional arousal may comprise an indication of the intensity or strength of a subject's emotional response on a predetermined scale based on the calibrated emotional baseline.
- Known relationships exist between a subject's emotional valence and arousal and physical properties such as pupil size, blink properties, facial expressions, and eye movement.
- Processing pupil data may further comprise determining the velocity of change or how fast a dilation or contraction occurs in response to a stimulus, as well as acceleration which can be derived from velocity.
- Other pupil-related data including pupil base level and base distance may be determined as well as, for instance, minimum and maximum pupil sizes.
- Processing blink data may comprise, for example, determining blink frequency, blink duration, blink potention, blink magnitude, or other blink data.
- Blink frequency measurement may include determining the timeframe between sudden blink activity.
- Blink duration (in, for example, milliseconds) may also be processed to differentiate attentional blinks from physiological blinks.
- Blink patterns may be differentiated based on their duration.
- Neutral blinks may be classified as those which correspond to the blinks measured during calibration.
- Long blink intervals may indicate increased attention, while short blinks may indicate that the subject may be searching for information.
- Very short blink intervals may indicate confusion, while half-blinks may serve as an indication of a heightened sense of alert.
- Blink velocity refers to how fast the amount of eyeball visibility is changing while the magnitude of a blink refers to how much of the eyeball is visible while blinking.
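The duration-based blink differentiation described above can be sketched as a simple classifier. This simplification uses duration only (the text also discusses blink intervals), and the duration boundaries are illustrative assumptions rather than values from the disclosure; a real system would derive them from the calibration baseline:

```python
# Sketch: map a blink duration to a coarse category, anchored on the
# subject's neutral blink duration from calibration. All boundary
# values here are illustrative assumptions.

def classify_blink(duration_ms, neutral_ms=150, tolerance_ms=50):
    """Classify one blink by its duration in milliseconds."""
    if duration_ms < 50:
        return "half-blink"    # possible heightened alertness
    if abs(duration_ms - neutral_ms) <= tolerance_ms:
        return "neutral"       # matches the calibration baseline
    if duration_ms < neutral_ms:
        return "short"         # possible information searching
    return "long"              # possible increased attention

labels = [classify_blink(d) for d in (30, 140, 80, 400)]
```

Counts of each category over a session could then feed the emotional reaction analysis sub-module alongside pupil and gaze features.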
- analysis module 205 may decode emotional cues from extracted feature data by applying one or more rules from an emotional reaction analysis sub-module 205 b to the collected data to determine one or more emotional components.
- a service provider may use the software/system to run test centers that subjects physically visit. Tests/studies may be performed on behalf of a third party (e.g. a consumer products company). In this scenario, one or more test leaders may be used to assist/guide the subjects in conjunction with the testing. Self-operated test centers (e.g., kiosks) may also be used with or without a leader.
- the service provider may collect fees from the third party on a variety of bases.
- the fees may include, but are not limited to, a per test fee per subject, a per test fee for a number of subjects, a per stimuli fee, per segment of subjects and/or other bases.
- the amount of fee may vary depending on the type/detail of output.
- A simple visual attention output (e.g., gaze plot only) may be provided for a first fee.
- More detailed information (e.g., a gaze plot with the spotlight feature) may be a second fee.
- A simultaneous display of visual attention information (e.g., a gaze plot with or without a spotlight feature) and basic emotional response information may be a third fee.
- Adding more detailed emotional response information may be a fourth fee.
- Other types of outputs, video, animated, etc. may command other fees.
- Other business models for such service providers may be implemented.
- a service provider may operate a remotely accessible (via the Internet or other network) test facility with which subjects can interact remotely.
- the subject can access the remotely accessible test facility in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise. Fees may be charged as indicated above or otherwise.
- an invoice module (e.g., invoice module 207 ) may be used to at least partially automate the process of billing.
- the invoice module 207 may monitor system information and automatically determine fees and generate invoices. Fee information may be input during a setup phase or otherwise.
- the monitored information may include test run, subject tested, stimuli presented, type and/or level of detail of output and/or other information upon which fees may be based.
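The automated fee determination described for the invoice module can be sketched as a lookup over monitored usage. The rate table, tier names, and amounts below are illustrative assumptions only:

```python
# Sketch: compute a total fee from monitored test-run information.
# The rates and output tiers are hypothetical, not from the disclosure.

RATES = {
    "per_subject": 25.0,        # fee per subject tested
    "per_stimulus": 2.0,        # fee per stimulus presented
    "output_tier": {            # fee by level of output detail
        "gaze_plot": 10.0,
        "gaze_plot_spotlight": 20.0,
        "gaze_plot_emotion": 35.0,
    },
}

def compute_invoice(subjects, stimuli, output_tier, rates=RATES):
    """Total fee from monitored test-run information."""
    return (subjects * rates["per_subject"]
            + stimuli * rates["per_stimulus"]
            + rates["output_tier"][output_tier])

total = compute_invoice(subjects=4, stimuli=10,
                        output_tier="gaze_plot_emotion")
```

Such a rate table would be entered during the setup phase, and the invoice module would apply it to the usage it monitors.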
Abstract
Description
- This application claims the benefit of U.S. Provisional Application No. 60/781,321 filed on Mar. 13, 2006. The entire teachings of the above application are incorporated herein by reference.
- The invention relates to computer-implemented systems and methods for determining and displaying visual attention and other physiological signal measurements (e.g., emotional response information of a person in response to presented stimuli) by collecting and analyzing eye movement, other eye properties and/or other data.
- Eye tracking systems in general are known. Emotional response detection systems in general are known. However, various limitations and drawbacks exist with these known systems.
- The display of visual attention data in general is known. However, various limitations and drawbacks exist with known systems. Additionally, the simultaneous display of visual attention data and corresponding emotional response has not traditionally been used.
- Other drawbacks and limitations exist with known systems.
- One aspect of the invention relates to a system and method of determining and displaying visual attention information and emotional response information related to stimuli presented to a subject (e.g. a person being tested). According to one aspect of the invention, visual attention information (for example, fixation points and saccades) is determined and then displayed, for example, using a gaze plot with a spotlight feature. A fixation point may be a point or area of a stimulus (e.g., visual image) on which a subject focused for at least a minimum amount of time. As used herein, a fixation point may also refer to a fixation area identified by multiple fixation points and saccades. A spotlight may be an aggregation of fixation points visualized through an aggregated transparency on a black mask (or other type of mask) layered above the stimulus. For example, based on selectable thresholds (e.g., administrative user selected thresholds) and/or other parameters, the spotlight feature may be used to indicate one or more fixation points. Aggregated fixation points may also be used with temporal ordering to create attention points. Attention points may be visualized through numbering to indicate the temporal ordering of the aggregation of fixation points (e.g., spotlight). If desired, other points or areas (e.g., ones that do not meet the threshold or other parameters) may be selectively distinguished from the fixation points.
- One advantage of this is that the gaze plot with spotlight feature can graphically depict what portions of a stimulus that a subject fixated upon (and, if desired, obscure areas that the subject did not fixate on). This enables an interested party (e.g., a marketing consultant or other entity) to easily see what portions of the stimuli the subject fixated on and/or which portions were not fixated on for a given stimuli. While this information alone is useful, by itself it does not indicate whether the subject had an emotional response to the stimuli as a whole, much less an emotional response associated with one or more given fixation points. Nor does it indicate, if there was an emotional response, the type of emotion (e.g., a positive emotion or a negative emotion) or how strong the emotion was.
- According to another aspect of the invention, a subject's emotional response can be determined and displayed for a given stimuli and/or for fixation points of a stimuli. A fixation point that is determined to correspond to an emotional response may be referred to as an interest point. Emotional response information (e.g., type and/or strength of emotion) may be displayed simultaneously with visual attention information (e.g., displaying emotional response information simultaneously with a gaze plot or other display of visual attention information). Interest points may also be displayed alone or simultaneously with visual attention and/or emotional response information.
- The displayed emotional response information may include display of one or more of emotional valence and/or emotional arousal. This information can indicate the type of emotion (e.g. a positive one or a negative one) and/or the strength of the emotion. For interest points, the type and strength of the emotional response (among other things) can be determined and/or displayed. The display may use different display characteristics to distinguish between different fixation points, attention points, temporal ordering, interest points and/or emotion types and strengths.
- Emotional response information can be determined in any of a number of ways. Various emotion detection techniques are known (e.g., reading facial movement, galvanic skin response and various other techniques). According to one synergistic embodiment of the invention, the emotional response information can be detected based, at least in part, on the subject's eye properties (e.g., eye movement, blink rate, pupil dilation and/or other eye properties). Advantageously, this enables (if desired) the same eye tracking device that is used to collect visual attention data to collect emotional response data. Various other emotion detection techniques may be used with, or instead of, eye property detection.
- Various configurations, features and functions may be used in various combinations within the scope of the invention. By way of example only, and without limitation, only some examples are described herein. According to one embodiment, a system may include, among other things, a set-up module (e.g., for enabling set-up of one or more of test parameters, subject profile, stimuli parameters, calibrations and/or other set-up parameters), a stimuli presentation module (e.g., for managing the storage and presentation of stimuli), a data collection module, an analysis module (e.g., for analyzing the collected data to determine visual attention and/or emotional response) and an output module for selectively outputting information, including information relating to the determined visual attention and/or emotional response information, among other things. The output may be in any of a number of different forms, can include various types of information and can include various levels of detail.
- According to one aspect of the invention, the output module enables the output of visual attention information, such as a gaze plot with a spotlight feature and/or attention points (e.g., as explained above). According to another aspect of the invention, the output module enables output of a subject's emotional response information and/or interest point in motion. Other types and combinations of outputs may be selected.
- Any of the outputs can be for: a single stimulus presented to a single subject (e.g., a person); an aggregate output for a number of stimuli presented to the same subject; an aggregate output of a single stimulus presented to a group of subjects; and/or a number of stimuli presented to a group of subjects. Any of the outputs can include a “snapshot” view (e.g., a single result for information determined by sampling over a specific period of time) and/or a time series display (e.g. a series of snapshots over time), animation, and/or a video (e.g., a relatively continuous, motion display showing the subject's eye movement and/or other information over a period of time). According to this aspect of the invention, the visual attention information and/or emotional response information may be recorded and played back to demonstrate the subject's visual attention and/or emotional response in a video replay mode. Playback controls may be provided.
- In one embodiment, the system and method of the invention may be configured to determine the visual attention of a subject regarding one or more specified stimulus and/or various portions (e.g., selected areas) of the stimuli.
- After any necessary and/or desired set-up (e.g., collection of background variables including subject's age, address, gender, and/or other demographic information) and/or calibration steps are performed, visual attention information (e.g., fixation and/or saccades with respect to a visual stimulus presented on a computer display) may be determined, at least in part, by tracking eye properties (including, for example, collecting data relating to eye position, eye movement, rate of eye movement, and/or other eye properties). The visual attention information that is determined may include fixation points (gaze) and saccades (e.g., the path between fixation points) and/or other information. According to one aspect of the invention, this enables a subject's eye movements, which may have previously been calibrated to display device coordinates, to be correlated to a visual stimulus or portions thereof. In general, the visual attention information relates to what portion(s) of the stimulus the subject is looking at, at one or more points in time. All or some points/areas of the stimulus at which the subject looked may be identified and displayed, or only points/areas meeting certain criteria may be displayed. For example, threshold values may be set to display only points/areas at which a subject fixated for at least a predetermined minimum period of time or points/areas to which the subject came back a number of times. Other criteria may include temporal ordering of the points/areas of the stimulus that are identified as fixations. From a business perspective, a service provider may use the software/system to run test centers that subjects visit. In this scenario, one or more test leaders (and/or administrative users) may be used to assist/guide the subjects in conjunction with the testing. Self-operated and/or semi-automated test centers (e.g., kiosks, PC, etc.) may also be used with or without a test leader. Remotely supervised testing may also be implemented.
- The service provider may collect fees on a variety of bases including, but not limited to, a per test fee, a per stimuli fee, per subject, per segment of population, and/or other bases. Additionally, the amount of fee may vary depending on the type and/or detail of the output. For example, a simple output (e.g., gaze plot only) may be provided for a first fee. A gaze plot with the spotlight feature may be a second fee. A simultaneous display of a gaze plot with basic emotional response information may be a third fee. Adding more detailed emotional response information may be a fourth fee. Other business models for such service providers may be implemented.
- According to another business method, a service provider may operate a remotely accessible (via the Internet or other network) test facility with which subjects can interact remotely. The subject can access the remotely accessible test facility in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise. Fees may be charged as indicated above or otherwise.
- According to another business model, the software may be licensed. As detailed below, the licensing may be on a modular basis. For example, the visual attention module and/or emotional response module may respectively include a core visual response engine and a core emotional response engine. The core engines may each be licensed for a base fee. Separate plug-ins (or other modules) to provide enhanced functionality and/or a greater level of detail may be provided for separate fees. Yet another business model may require a predetermined type of device to be licensed with the software. For example, the serial number of the eye tracking device may be checked to determine that the device is an acceptable device before it is allowed access to software functions. Other licensing models can be used. An invoice module may monitor system activities to facilitate any invoicing that may be necessary or desired.
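The device-based licensing check described above might be gated on the eye tracker's serial number. A minimal sketch, with the serial-number format and allowlist invented for illustration:

```python
# Hypothetical sketch of the device-license gate described above: the eye
# tracker's serial number is checked against licensed devices before the
# software's functions are unlocked. The serial format and the allowlist
# contents are invented for illustration.

LICENSED_DEVICE_SERIALS = {"TBI-1750-0042", "TBI-1750-0199"}

def software_access_allowed(device_serial: str) -> bool:
    """Allow access to software functions only for licensed devices."""
    return device_serial.strip().upper() in LICENSED_DEVICE_SERIALS
```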
- To accommodate these and other business methods, any of the set-up/calibration and/or running of tests may be done manually, automatically and/or semi-automatically. If desired, real-time monitoring of the results may be made available locally or remotely.
- Various other features and functions may be used with one or more aspects of the invention. Not all of the features and functions described herein need to be used in all cases. Any combination of features and/or functions may be used as desired. The examples provided below are for ease of understanding. The invention is not limited to any specific implementation.
-
FIG. 1 is an example of a high level representation of a method according to one embodiment of the invention. -
FIG. 2 schematically illustrates a functional block diagram of an example of portions of a system for determining visual attention and emotional response information relating to a stimuli presented to a subject according to an embodiment of the invention. -
FIG. 3 is an illustration of an exemplary functional block diagram of portions of a system according to one embodiment of the invention. -
FIG. 4 is a high-level exemplary flow diagram of methods for setting up and running tests and analyzing test results according to various embodiments of the invention. -
FIG. 5 is an illustration of an exemplary visual stimulus, according to an embodiment of the invention. -
FIG. 6 is an illustration of one example of an output generated by the system, according to an embodiment of the invention. -
FIG. 7 depicts examples of some components of outputs according to some aspects of the invention. -
FIG. 8 is an illustration of an output generated by the system, according to an embodiment of the invention. - The systems and methods of the invention have a broad range of applicability. For purposes of clarity, one scenario in which these features are beneficial will be described. By way of example, one scenario relates to situations where a subject (e.g., an individual) is tested by presenting stimuli and/or survey questions to the subject (e.g., to determine the subject's reaction to advertisements, a new product, a new feature of a product and/or packaging for a product, among other things). For convenience, the invention will be discussed primarily in the context of such testing. This is not intended to limit the invention thereto. The invention can be used in a wide variety of other scenarios and applications as well.
- As used herein, “testing” and/or “study/survey” may broadly refer to a wide variety of activities (e.g., advertising or marketing studies or surveys for new products, new features, new packaging or other testing or studies). A “subject” may, for example, include a person, animal or other test subject being tested. Stimuli may include any type of sensory stimuli corresponding to any one or more of the five senses (sight, sound, smell, touch, taste) and/or other stimuli. Visual stimuli may be presented on a display (e.g., as a single image, two or more images sequentially or simultaneously, as a video, or otherwise).
- Examples of visual stimuli, for instance, may include pictures, artwork, charts, graphs, text, movies, multimedia presentations, interactive content (e.g., video games), or other visual stimuli. Stimuli may be recorded (on any type of media) and/or include live scenarios (e.g., driving or riding in a vehicle, etc.). Various stimuli and/or stimuli types may be combined. For any test or other scenario, stimuli may be selected based on the purpose and need. For example, in an advertising context, stimuli may correspond to a product advertisement to determine the overall reaction to the stimuli (the ad) and more detailed information (e.g., where on the ad the subject's attention is drawn, and what emotions are felt while perceiving the stimuli or portion thereof).
- As used herein, an “administrator” or “administrative user” (if one is used) may refer to the person that performs at least some of the setup operations related to a test (and/or other functions). For example, an administrator may interact with the system to input critical test setup parameters including, for example, stimuli parameters, subject participants, background variables (e.g., age, gender, location, etc.) and/or other parameters.
- A study/survey leader (if one is used) may assist in running the actual test. The administrator and the leader may be the same person or different people.
-
FIG. 1 illustrates an example of a high level diagram of a method according to one embodiment of the invention. Various set-up/calibration steps may be performed (Step 2). Set-up and calibration techniques, in general, are known. Examples of these steps may include, among other things, test set-up, subject setup, stimuli setup, various calibration steps and/or other steps. According to one novel aspect of the invention, segmentation setup may include collecting both independent and dependent background variables. Stimuli may be presented to a subject (Step 4). If desired, survey questions may also be presented to the subject. Survey presentation and survey results collection, in general, are known. However, according to one novel aspect of the invention, survey responses, visual attention information and emotional response information may be correlated. - Data relating to the subject's reactions to the stimuli (including visual attention data and/or emotional response data) are collected (Step 6). During and/or after stimuli presentation, the collected data (and/or other desired information) may be analyzed (Step 8). The analysis may include determining visual attention information (Step 10), emotional response information (Step 12), interest point(s) (Step 14) and/or other information (e.g., physiological information associated with a subject with respect to one or more presented stimuli). Analysis data may then be stored and/or selectively output (Step 16). The output can be in any of a variety of forms, including a computer displayed report or other type of output. One aspect of the invention relates to specific types of output as detailed below.
-
FIG. 2 illustrates one example of parts of a simplified view of a system that can be used to implement some aspects of the invention. As illustrated, the system may include at least one or more of an eye tracking device 120, a display device 130, and a computer device 110. The computer 110 may be programmed (or access a computer/server that is programmed) with at least one or more of a stimuli presentation module 203, a visual attention engine 205 a, and/or an emotional response engine 205 b. An output module 206 may be used to generate output 118. One or more storage devices (not shown in FIG. 2 for simplicity) may store stimuli, data, analysis results and/or other information. - In operation, a subject 50 may be positioned in proximity to display
device 130. Stimuli presentation module 203 may cause selected stimuli to be displayed on the display device 130 to expose subject 50 to one or more visual (or other) stimuli (e.g., stimuli displayed on display device 130 and/or other device). One or more data collection devices (e.g., eye tracking device 120 and/or other data collection devices) may collect data and/or record information regarding the subject's responses. The collected data may include a desired number of discrete samples (e.g., 50-60 samples per second or any other desired frequency) over a predetermined period or variable period of time (e.g., 1-3 seconds or any other period). Alternatively or in addition, the collected data may include a continuous sampling (e.g., a video) for a fixed or variable period of time. The collected data may include eye movement and other eye properties, physiological data, environmental data and/or other data relating to the subject's response to various stimuli. Manual input from the user may also be received. - According to one advantageous aspect of the invention, the
eye tracking device 120 may be integrated with and/or mounted on or in the display device 130. However, these devices may also be implemented as separate units based on various detection environments and scenarios. A display device 130 may include a monitor, touch screen, LCD screen, and/or other display devices. If desired, a simple USB type video camera may be used as the eye-tracking device 120. This (or other eye-tracking devices) may be integrated with or mounted to any usable display. One example of an integrated eye-tracking and display device is the Tobii 1750 Eye-tracker, commercially available from Tobii Technology AB. - The eye-tracking device may include or interact with a software program to control the eye-tracker and the collection of data thereby. For example, the eye-tracking device may include Clearview™ software (provided by Tobii). Other eye-tracking software can be used. This software may be a standalone application or may be bundled with or part of one or more of the other software modules described herein. The eye-tracking software may incorporate one or more of the other software modules. Other eye-tracking devices, displays and/or technology may be used in place of, or with, the various components described herein.
-
FIG. 3 illustrates a more detailed functional block diagram of a system (and other features), according to one embodiment of the invention. FIG. 3 illustrates a computer 110 having one or more interfaces 114 for interfacing with one or more input devices 100, one or more presentation devices 101 and/or one or more output devices 102. Computer 110 may further be in communication with one or more storage devices, such as stimuli database 240, data collection database 241, subject profiles database 242, analysis results database 243 and/or other storage devices. One or more of databases - The input devices 100 (e.g., one or more of an
eye tracking device 120, touch screen 135, keyboard 140, mouse 150, microphone 160, sensors 170, and/or other input devices) may be used for receiving input (e.g., from a subject 50 or other input). The input may include but is not limited to, information regarding a subject's visual attention, emotional response and/or other responses to stimuli. Other input may include user information received during a set-up/calibration procedure, survey responses and/or other user input, and other desired input. Sensors, such as scent sensors, tactile sensors, sound sensors and/or other sensors may also be used as input devices. - The presentation devices may include, for example, one or more of
display device 130, speaker(s) 180, and other presentation devices. Display device 130 may be used for visually displaying and presenting visual stimuli to a subject. - The
output devices 102 may include, for example, one or more of a display device 130 (or other display), speakers 180, printer 190, and/or other output devices. The display device 130 may include a video display for displaying a video playback of the collected data or a processed version of the collected data. Computer 110 is programmed with, or is in communication with a computer (e.g., a remote server) that is programmed with, a software application (e.g., application 200 illustrated in FIG. 3) to perform the functions described herein. Computer 110 may be a single computer or multiple computers. One or more computers 110 may be located locally (in proximity to the test subject 50) and one or more may be located remotely from the test subject 50 (e.g., at a central test facility) to enable remote testing of subjects and/or remote monitoring of tests. One or more computers 110 can be standalone computers running an application 200. One or more computers 110 can be networked (e.g., via network interface 209) to one another and/or any third party device 260, to enable networked communication therebetween. This may enable, among other things, browser-based access from one computer to a central computer 110 running application 200. The computer 110 may access the application 200 over a network 250 (e.g., the Internet, an intranet, WAN, LAN, etc.) via any wired and/or wireless communications links. -
Application 200 may include one or more computer software programs and/or modules that, among other things, perform functions set forth herein. For example, application 200 may perform functions including one or more of setup/calibration, testing, stimuli presentation, data collection, analysis, output generation and/or formatting, invoicing, and data mining, among others. - For convenience, various ones of the functions may be carried out by various modules 201-209, as shown for example in
FIG. 3. One or more modules may be combined, and any module shown as a single module may include two or more modules. By way of example, the modules may include at least one or more of an interface controller module 201, a setup module 202, a stimuli presentation module 203, a data collection module 204, an analysis module 205, an output module 206, an invoice module 207, a data mining module 208 and/or other modules. Not all modules need to be used in all situations. - One or more
interface controller modules 201 may be associated with and/or in communication with one or more input devices 100, presentation devices 101, and output devices 102 in any known manner. One or more controllers 201 may be implemented as a hardware (and/or software) component of the computer 110 and used to enable communication with the devices attached to the computer 110. The communication can be conducted over any type of wired or wireless communication link. Secure communication protocols can be used where desired. -
Setup module 202 includes sub-modules for one or more of subject setup 202 a, stimuli setup 202 b, calibration 202 c and/or other setup/calibration procedures. These procedures may include those referred to in connection with Step 2 of FIG. 1, among others. Data received by the system during setup/calibration (e.g., background variables, test parameters, stimuli parameters, subject parameters, etc.) may be stored in one of stimuli database 240, subject profile database 242, and/or other databases. -
Stimuli presentation module 203 may be provided to facilitate the presentation of stimuli according to stimuli setup information, stored stimuli and/or other stimuli presentation properties. The stimuli presentation module 203 may include or interact with a graphical user interface (not shown) to enable stimuli to be managed (e.g., stored, deleted, modified, uploaded/downloaded, or otherwise managed) by an administrative user or otherwise. Additionally, a user interface can enable one or more stimuli and stimuli presentation properties to be selected for use with a particular test or other application. - The
data collection module 204 may collect data (e.g., from one or more of input devices 100 or other input devices) during stimuli presentation (and at other times). The data collection module 204 may cause the collected data to be stored in data collection database 241 or other database for later (or real-time) analysis. - Analysis may be done by the
analysis module 205 and/or other processor. Analysis module 205 may include sub-modules for visual attention processing 205 a, emotional response processing 205 b, and/or other sub-modules. If desired, various plug-ins 205 c may be used to enhance the functionality of a core emotional response engine and/or visual attention engine. - Analysis results may be stored in
analysis database 243 or other database. The analysis module 205 may process the collected data using one or more error detection and correction (data cleansing) techniques. As such, the collected data may be refined and filtered to decrease signaling noise and other errors. The cleaned data may be more easily and/or accurately analyzed. - Various plug-
ins 205 c may be used to offer a greater level of detail to and/or additional functions regarding the visual attention and/or emotional response processing. For example, interest points may be determined from the collected data. Such detail may include detailed interest points, emotional valence determination, emotional arousal determination, and emotion name and type. - An
output module 206 may selectively enable various types of outputs to be output from the application 200 to one or more output devices 102. For example, the output module 206 may be used to produce reports based on analysis results. For example, visual attention information and emotional response information may be output and presented with respect to the actual stimuli in the report output 118. Various electronic and/or printed output types may include, but are not limited to, representation in the form of graphs, text, illustrations, gaze plots, emotion meters, audio, and/or video playback, to name a few. Further details and examples of output are set forth in connection with FIGS. 6-8. Other output types and formats may be used. -
FIG. 4 illustrates examples of methods for carrying out various aspects of one embodiment of the invention. FIG. 4 illustrates a Study/Survey setup phase, a Study/Survey Run phase and a Study/Survey Analysis phase. These phases and/or other phases may be carried out at a test facility (or outside the test facility) in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise. Testing may be supervised, semi-supervised or unsupervised. At a test facility, the testing may be run by a study/survey leader on each subject manually. Outside of the testing facility, the subject 50 may run the study/survey with or without a study/survey leader. Without a study/survey leader, the subject's emotional state may remain unaltered and unaffected by the presence of a study/survey leader. Alternatively, a combination of aspects from the testing facility and from outside the testing facility may be used during the phases illustrated in FIG. 4. Other testing environments may also be included within the scope of the invention. - In some or all testing environments, there may be a Study/Survey setup phase. In this phase, the administrator (or other individual) enters or selects the stimuli and/or survey data and other setup parameters (e.g., background variables). This information may be stored in
stimuli database 240 and/or other database(s) (step 501). - The stimuli to be presented during the study/survey may be selected using the study/survey setup sub-module 202 b of the
setup module 202. The selected stimuli may be loaded on thecomputer 110 and/or stored instimuli database 240 or other database. Various stimuli sources may be used. Remote stimuli (not shown) may be accessed vianetwork interface 209 over the network 250 (e.g., internet, intranet, etc.) to download stimuli from the remote source such as an advertisement database. Another stimuli source may be a stimuli creation application which may allow the creation and/or customization of stimuli. The creation application may enable multimedia stimuli creation. - Other stimuli presentation properties may also be selected. For example for a given test/study, one or more of the stimuli duration for one or more stimuli, the order of presentation of stimuli (e.g., random presentation of stimuli), whether any stimuli should be simultaneously presented, an/or other stimuli properties may be selected. The parameters for identifying a fixation point may be provided during a set up of stimuli properties (or at other times). For example, this may be based at least on threshold values for dwell time or other parameters. The visual display of spotlight(s) may be set-up to be based on a number of aggregated fixation points or other factors. The attention points may be set-up to visually indicate the temporal ordering (e.g., semitransparent number indicator) of aggregated fixation points with respect to identified spotlights. Interest points may be identified based on fixation point (e.g., as determined by selected criteria) and emotional response (as defined by selected criteria) at the fixation point. For example, it may be specified that if a particular type and/or strength of emotional response is associated with one or more fixation point(s), this may identify an interest point(s). These aspects are discussed in more detail below.
- Output presentation properties may also be specified using the
setup module 202. The output presentation properties may identify what analysis will be done, output type and/or format, who should receive the output and/or how the output will be received, among other things. For example, the level of information to be included in an output report may be specified using, for example, a presentation format including predetermined templates. The parties to receive the output information and the associated transmission means may also be specified as part of the output presentation properties. For example, the output(s) may be sent to a specified user/device using a predetermined transmission means (e.g., email, phone, FTP, etc.). The output presentation properties may be entered by one or more of the administrator, leader, and subject and/or other individual. - The method of
FIG. 4 may also include receiving profile (and/or other) information regarding the subject (e.g., background variables including age, gender, location, etc.) At a testing facility, the leader may enter or guide the subject(s) to enter details of the participating subject (Step 502). This may include using subject set-upsub-module 202 a of thesetup module 202. The information may be stored insubject profile database 242 or other database. Calibration of the subject may also be performed, either manually, automatically and/or semi-automatically (step 504). During the run phase, stimuli and/or survey questions may be presented for display to the subject (Step 506). The subject may answer survey questions manually or otherwise (Step 508). Visual attention data and emotional response data may be collected as described elsewhere herein. In other testing environments, various ones of these steps may be performed without a leader (steps 512-514, 516 and 518). - After the stimuli presentation of the study/survey is completed, it is determined whether another participating subject is available (
Step 510, 520). If so, the process may be repeated with another subject. If not, the study session may be concluded and/or analysis may be performed (Step 550). - Analysis may be performed at the conclusion of a test/study and/or in real-time as data is collected. The analysis may include processing the collected data to determine visual attention information and/or emotional response information, among other things. Some aspects of visual attention processing and/or emotional response processing, in general are known. Other aspects are described elsewhere herein.
- Eye-tracking, emotional response (and other) calibration techniques, in general are known. Examples of some aspects of the calibration routines that may be used with the invention are provided. Other calibration techniques may also be used.
Calibration sub-module 202 c performs calibration activity, including subject/device calibration.Eye tracking device 120 and other input devices may be calibrated based on environmental settings and scenarios. Also during calibration, thecalibration sub-module 202 c may present the subject with a number of calibrations points located in predetermined locations of the display device or the subject's field of vision for subject specific calibration. The calibration points may correspond to coordinates of the display device on which the subject may be prompted to focus and move between until the eye tracking device has calibrated the movement of the subject's eyes in relation to the display device coordinates (e.g., x, y, z coordinates) Optionally, the point calibration information is recorded and stored with the subject profile data for future testing sessions. - Emotional calibration may also be recorded and stored. The subject may be presented with predetermined stimuli used to evoke a certain emotion in order to observe the subjects emotional reaction in relation to their eye properties. A subject may be presented with a stimuli known to elicit a positive (e.g., pleasant), neutral, or negative (e.g., unpleasant) response. By way of example, a subject may be presented with an emotionally neutral stimulus in order to record blink rate pattern, pupil response, saccadic movements, and/or other properties to characterize the subject's response to neutral stimuli. Alternatively, the subject may be presented with stimuli known to evoke a certain emotion based on the subject's demographic and other personal data. The emotional reaction may be used to set an emotional baseline for various emotions. Thus, study/survey stimuli may be compared with a subject's baseline to understand the magnitude of emotional valence.
- In connection with running a test, various data may be collected. The collected data may be processed in real-time or subsequently. Collected and processed data may be presented as output information to present visual attention, emotional response and/or other information in a variety of formats as discussed in reference to output presentation properties. One type of output may be a visual output to a display, a visual printout or other visual output. Non-visual output may also be provided.
- The output may include a graphical representation including visual attention information (e.g., one or more gaze plot) and/or emotional response information (e.g. one or more emotion meter) for one or more stimuli. The gaze plot(s) (e.g., spotlight(s), attention points, interest points) may be superimposed on the relevant stimulus (or stimuli if two or more are simultaneously displayed). The gaze plot may include a spotlight feature to highlight aggregated fixation points, attention points to highlight temporal ordering of aggregated fixation points and interest points to highlight emotional response.
-
FIG. 5 is an illustration of exemplary visual stimuli, according to an embodiment of the invention. FIG. 6 is an example of an output, according to one aspect of the invention, relating to the stimuli of FIG. 5. The output, as shown, includes a simultaneous display of visual attention information 800 and emotional response information 810 (e.g., an emotion meter) relating to a subject's response to a visual stimulus (e.g., the stimulus 700 of FIG. 5). As shown, the visual attention information 800 includes a gaze plot with a spotlight feature and attention points. The spotlight feature highlights (or otherwise illustrates) one or more fixation points and/or interest points of the visual stimulus 700. In one implementation of the spotlight feature, a virtual mask may be superimposed over all or some of the stimulus (e.g., visual image 700) and portions of the mask, corresponding to one or more fixation points (e.g., based on minimum time of fixation), may be effectively removed or made more transparent to reveal the underlying portion of the stimulus. Another approach is to start with the entire stimulus revealed and selectively mask non-fixation points. - In general, the mask (if used) may have a first set of optical characteristics and the removed portions may have a second set of optical characteristics (e.g., to distinguish the one or more fixation points from the rest of the stimulus). According to one embodiment, the mask may be at least relatively opaque (to fully or partially obscure the underlying portion of the stimulus) and the removed portions corresponding to the fixation points may be made at least relatively more transparent to highlight (or spotlight) the fixation points (as shown for example by pointers 801-804). Areas illustrated by 801, 802, 803, and 804 may also include attention points for numbering according to the temporal ordering of fixation.
If desired, the actual stimulus may be displayed in proximity to the gaze plot to easily see the masked portions of the stimulus.
- According to another embodiment, the fixation points may be displayed more brightly than the other points. Other techniques for visually displaying distinctions between fixation points and non-fixation points may be used.
- A relative difference in optical characteristics may also be used to indicate the magnitude of fixation points. For example, if a subject dwells at a first fixation point for a longer time than a second fixation point, the first fixation point may be relatively more transparent than the second fixation point, yet each may be more transparent than non-fixation points. Other optical characteristics can be used to distinguish among fixation points and to distinguish fixation points from non-fixation points.
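The spotlight mask with dwell-proportional transparency described above can be sketched as an alpha (opacity) mask computed over the stimulus image. Circular spotlights and the specific opacity values are assumptions for illustration:

```python
# Illustrative spotlight mask: the mask starts semi-opaque and each
# fixation punches a circular hole whose transparency scales with dwell
# time, so the longest fixation reveals its area most clearly. Circle
# shape, base opacity, and the linear dwell scaling are assumptions.

def spotlight_alpha_mask(width, height, fixations, radius=40,
                         base_alpha=200):
    """fixations: list of (x, y, dwell_seconds). Returns a 2D list of
    alpha values (0 = fully transparent, 255 = opaque) to composite
    over the stimulus image."""
    mask = [[base_alpha] * width for _ in range(height)]
    max_dwell = max((f[2] for f in fixations), default=1.0)
    for fx, fy, dwell in fixations:
        # Longer dwell -> lower alpha -> more of the stimulus revealed.
        hole_alpha = int(base_alpha * (1.0 - dwell / max_dwell))
        for y in range(max(0, int(fy - radius)), min(height, int(fy + radius) + 1)):
            for x in range(max(0, int(fx - radius)), min(width, int(fx + radius) + 1)):
                if (x - fx) ** 2 + (y - fy) ** 2 <= radius ** 2:
                    mask[y][x] = min(mask[y][x], hole_alpha)
    return mask
```

The complementary approach mentioned above (starting fully revealed and masking non-fixation points) would simply invert which regions receive the opaque value.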
- To the extent a user fixates on different points or areas in a particular temporal order, the order of the fixation points may be visually indicated using attention points, either statically or dynamically. If static, the fixation points may be marked with numbers (or other indicators) to match the temporal ordering of one or more fixation points. If dynamic, a first fixation point may be highlighted as compared with other fixation points (e.g., displayed more transparently or more brightly). Then a second and other fixation points may be highlighted in a sequential fashion.
- According to another aspect of the invention, a fixation point that is determined to correspond to an emotional response may be referred to as an interest point. One or more interest points may be displayed differently than fixation points that are not associated with an emotional response. Additionally, one interest point may be displayed differently than another interest point based on the determined emotional valence and/or arousal associated with the point or other differences. For example, a spotlight feature may be used to highlight one or more portions/areas of the visual stimulus that correspond to interest points. Characteristics of the interest point spotlights may vary to indicate the type and/or strength of a subject's emotional response associated with the fixation point.
-
Emotional response information 810 may be displayed simultaneously with visual attention information 800. The emotional response information 810 may include an overall emotional response based on the subject's response to the stimulus (or stimuli) and/or area related emotional response information corresponding to portions of one or more stimuli. For example, a more detailed level of emotional response may be provided by separately displaying emotional response information for one or more fixation points. As shown in FIG. 6, by way of example only, an emotional response meter may show the emotional valence and/or arousal for one or more fixation points. Emotional valence may also be displayed for interest points, spotlights, and/or attention points.
- FIG. 7 illustrates some options for display of visual attention information and emotional response information. Various permutations of these features may be used together; not all features need be used in all cases.
- For example, the visual attention information may include a gaze plot (with or without the spotlight feature, attention points, and/or interest points). A gaze plot, if used, may illustrate a scan path corresponding to the subject's eye movements, fixation points, and/or interest points. The visual attention information may be for one or more stimuli at a time. The visual attention information may be static or dynamic. A dynamic display may include a sequence of individual displays (e.g., a slide show mode), animated playback, one or more videos, and/or other dynamic displays.
- Some output (e.g., reports) may be automatically generated according to one or more templates. Various templates and/or template parameters may be pre-stored in the system. Pre-stored templates can be selected and/or modified (e.g., by an administrative user, test-study leader or other entity). New templates may also be created and stored.
- Reports and other output 118 may be automatically sent to one or more recipients and/or recipient devices, for example, subject 50, third party device 250, a study/survey leader, an administrator, and/or other recipients. Output 118 may be stored for later retrieval, transmission, and/or data warehousing. Output and reports can be in any of a number of formats, including without limitation, JPEG, Word, PDF, XML, and any other convenient output format.
- According to an aspect of the invention, emotion maps may be displayed simultaneously and in synchronization with the stimuli that provoked them. For example, as illustrated in FIG. 8, a first gaze plot with spotlight feature for a first stimulus 900a may be displayed in proximity to corresponding emotion map 900b, which depicts the emotional response of a subject to stimulus 900a. Similarly, a second gaze plot with spotlight feature for a second stimulus 904a may be displayed in proximity to corresponding emotion map 904b, which depicts the emotional response of a subject to stimulus 904a, and so on. Different display formats may be utilized.
- Report information, along with data from databases (240-243), may be further analyzed for data mining purposes. Data within these databases, among others, may be used to uncover patterns and relationships contained within the collected data, subject data, and/or analysis results. Background variables (e.g., collected during set-up or at another time), including age, gender, and location, among others, may be used for data mining. In one or more databases, data mining can be done manually or automatically via data mining module 208 over all or portions of the data.
- By way of further explanation, additional information and examples regarding various aspects of the invention are now presented. Survey questions, if used, may be presented one at a time, or a number of survey questions may be shown at one time on a single screen. The order, timing, and display attributes of stimuli may be determined by the administrator and/or study/survey leader at setup, based on what the administrator may want to analyze. By further example, the administrator may want to study the subject's response to two or more competing market brands. A simultaneous, side-by-side presentation of stimuli may elicit different visual attention information and emotional reactions with respect to the two or more brands than a sequential display. Other comparative studies may be conducted.
- As the study/survey is run, the subject's eye properties and other properties observed by the eye tracking device and/or other sensors may be collected, stored, and/or analyzed. The collected data may be synchronized to a timer for later analysis and/or playback. Collected data may comprise eye property data, other physiological data, environmental data, and/or other data. Collected eye property data may include data relating to a subject's pupil size, blink properties, eye position (or gaze) properties, or other eye properties. Collected pupil data may comprise pupil size, velocity of change (contraction or dilation), acceleration (which may be derived from velocity), or other pupil data. Collected blink data may include, for example, blink frequency, blink duration, blink potentiation, blink magnitude, or other blink data. Collected gaze data may comprise, for example, saccades, express saccades, nystagmus, or other gaze data. Data relating to the movement of facial muscles (or facial expressions in general) may also be collected. If a subject is presented with stimuli, collected data may be synchronized with the presented stimuli.
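By way of illustration, a timestamped collection record of the kind described above, synchronizing eye-property data with the stimulus on screen, might be sketched as follows. Field names and types are illustrative assumptions, not part of the described system:

```python
from dataclasses import dataclass, field

@dataclass
class GazeSample:
    """One timestamped observation, synchronized to the study timer."""
    timestamp_ms: int          # shared timer value, used for later playback sync
    stimulus_id: str           # stimulus being presented when the sample was taken
    pupil_mm: float            # pupil diameter
    gaze_x: float              # gaze position on the stimulus
    gaze_y: float
    blink: bool = False        # whether the eye was closed at this sample
    extra: dict = field(default_factory=dict)  # other physiological/environmental data
```

A stream of such records can then be replayed against the stimuli or fed into the analysis steps discussed below.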
- Visual attention information components may be decoded from the visual cues (e.g., collected eye property data). This may be done, for example, by applying one or more rules from a visual attention analysis sub-module 205a. Determination and analysis of visual attention may involve various aspects including interest points and interest tracking. Interest points may be based on fixation (gaze) rate and the type of saccades on a portion or portions of the visual stimuli, coupled with emotional response as determined by the eye properties. Processing gaze (or eye movement) data may comprise, for example, analyzing saccades, express saccades (e.g., saccades with a velocity greater than approximately 100 degrees per second), and nystagmus (rapid involuntary movements of the eye), or other data. Features of interest may include the velocity (deg/s) and direction of eye movements, fixation time (e.g., how long the eye focuses on one point), the location of the fixation in space (e.g., area as defined by x, y, z or other coordinates), or other features including return to fixation areas, relevance, vergence for depth evaluation, and scan activity.
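As a minimal sketch of velocity-based labeling, a gaze sample could be classified using the approximately 100 deg/s express-saccade figure quoted above. The 30 deg/s saccade-onset threshold is an assumption for this sketch, not a value taken from the system description:

```python
def classify_eye_movement(velocity_deg_per_s):
    """Label a gaze sample from its angular velocity (degrees/second)."""
    if velocity_deg_per_s > 100.0:
        # Above ~100 deg/s, the text treats the movement as an express saccade.
        return "express saccade"
    if velocity_deg_per_s > 30.0:
        # Assumed onset threshold separating ordinary saccades from fixations.
        return "saccade"
    return "fixation"
```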
- Visual attention may be determined by setting an adjustable fixation/gazing threshold. A sliding window measured in milliseconds (or other unit of time) can be set as a threshold, for example, 400 ms, in order to determine which points or areas on the visual stimuli the subject gazed at for at least 400 ms. If the user remains fixated on the area for at least the window of time, the area of the visual stimuli may be identified as a fixation point.
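The sliding-window rule above can be sketched as follows. Gaze samples are assumed to be (timestamp_ms, x, y) tuples; the dispersion radius is an illustrative assumption, while the 400 ms window matches the example in the text:

```python
def _dispersion(points):
    """Spread of a set of (t, x, y) samples, as max x-range plus max y-range."""
    if not points:
        return 0.0
    xs = [p[1] for p in points]
    ys = [p[2] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def find_fixation_points(samples, window_ms=400, max_dispersion=30.0):
    """Return (x, y, dwell_ms) for each area gazed at for at least window_ms."""
    fixations = []
    start = 0
    for end in range(len(samples)):
        # Shrink the window from the left while the samples are too spread out.
        while _dispersion(samples[start:end + 1]) > max_dispersion:
            start += 1
        dwell = samples[end][0] - samples[start][0]
        if dwell >= window_ms:
            xs = [s[1] for s in samples[start:end + 1]]
            ys = [s[2] for s in samples[start:end + 1]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), dwell))
            start = end + 1  # restart the window after recording the fixation
    return fixations
```

Lowering `window_ms` toward 100 ms, as discussed below, yields many more (shorter) fixations and so traces out more of the scan path.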
- The emotional response (e.g., arousal, valence, if any) corresponding to the fixation point may determine the level of interest in that fixation point. For example, if a determined fixation point also elicited an emotional response that exceeds a predetermined emotional threshold value, then the fixation point may be identified as an interest point. Thus, interest points/areas may be identified as the area(s) of a visual stimulus upon which the subject gazes or fixates for more than a predetermined period of time (the selectable threshold value) and which elicit a measurable emotional response (exceeding the emotional threshold value).
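The two-threshold rule above can be sketched directly: a fixation qualifies as an interest point only if its dwell time meets the selectable gaze threshold and its measured arousal exceeds the emotional threshold. Field names and threshold values are illustrative assumptions:

```python
def find_interest_points(fixations, gaze_threshold_ms=400, emotion_threshold=0.5):
    """fixations: iterable of dicts with 'dwell_ms' and 'arousal' keys."""
    return [
        f for f in fixations
        # Both conditions must hold: sufficient dwell AND measurable emotion.
        if f["dwell_ms"] >= gaze_threshold_ms and f["arousal"] > emotion_threshold
    ]
```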
- If the sliding window threshold is made smaller, for example 100 ms, the subject's entire scan path on the visual stimuli may be revealed. This may allow an administrator or analyzer to see if a specific feature of a visual stimulus was even looked at and for how long.
- Graphical representation of the subject's visual attention may take the form of a gaze plot.
- Emotional response components may include, for example, emotional valence, emotional arousal, emotional category, and/or emotional type. Other components may be determined. Emotional valence may indicate whether a subject's emotional response to a given stimulus is a positive emotional response (e.g., pleasant or “like”), negative emotional response (e.g., unpleasant or “dislike”), or neutral emotional response. Emotional arousal may comprise an indication of the intensity or emotional strength of a subject's response on a predetermined scale based on the calibrated emotional baseline. Known relationships exist between a subject's emotional valence and arousal and physical properties such as pupil size, blink properties, facial expressions, and eye movement.
- Pupil size can range from approximately 1.5 mm to more than 9 mm. Processing pupil data may further comprise determining the velocity of change or how fast a dilation or contraction occurs in response to a stimulus, as well as acceleration which can be derived from velocity. Other pupil-related data including pupil base level and base distance may be determined as well as, for instance, minimum and maximum pupil sizes.
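The velocity and acceleration of pupil-size change described above can be derived by finite differences over (timestamp_ms, diameter_mm) samples. This is a minimal sketch without smoothing, which a real system would likely apply to noisy pupil data:

```python
def pupil_dynamics(samples):
    """Return per-sample velocity (mm/s) and acceleration (mm/s^2) series."""
    velocities = []  # (timestamp_ms, mm/s)
    for (t0, d0), (t1, d1) in zip(samples, samples[1:]):
        # First difference of diameter over time gives dilation/contraction speed.
        velocities.append((t1, (d1 - d0) / ((t1 - t0) / 1000.0)))
    accelerations = []  # (timestamp_ms, mm/s^2)
    for (t0, v0), (t1, v1) in zip(velocities, velocities[1:]):
        # Second difference (difference of velocities) gives acceleration.
        accelerations.append((t1, (v1 - v0) / ((t1 - t0) / 1000.0)))
    return velocities, accelerations
```

Base level, base distance, and minimum/maximum pupil sizes mentioned above would be simple aggregates over the same diameter series.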
- Processing blink data may comprise, for example, determining blink frequency, blink duration, blink potentiation, blink magnitude, or other blink data. Blink frequency measurement may include determining the timeframe between sudden blink activity.
- Blink duration (in, for example, milliseconds) may also be processed to differentiate attentional blinks from physiological blinks. Blink patterns may be differentiated based on their duration. Neutral blinks may be classified as those which correspond to the blinks measured during calibration. Long blink intervals may indicate increased attention, while short blinks may indicate that the subject may be searching for information. Very short blink intervals may indicate confusion, while half-blinks may serve as an indication of a heightened state of alertness. Blink velocity refers to how fast the amount of eyeball visibility is changing, while the magnitude of a blink refers to how much of the eyeball is visible while blinking.
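An illustrative mapping from blink duration to the categories discussed above might look like the following. The duration boundaries are assumptions chosen for this sketch; a real system would derive the "neutral" band from each subject's calibration blinks:

```python
def classify_blink(duration_ms, neutral_low_ms=100, neutral_high_ms=400):
    """Map a blink duration onto the interpretive categories described above."""
    if duration_ms < neutral_low_ms / 2:
        # Well below the neutral band: possible confusion.
        return "very short (possible confusion)"
    if duration_ms < neutral_low_ms:
        # Slightly below neutral: subject may be searching for information.
        return "short (information search)"
    if duration_ms <= neutral_high_ms:
        # Within the calibrated band: an ordinary physiological blink.
        return "neutral"
    # Above the band: may indicate increased attention.
    return "long (increased attention)"
```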
- According to another aspect of the invention, analysis module 205 may decode emotional cues from extracted feature data by applying one or more rules from an emotional reaction analysis sub-module 205b to the collected data to determine one or more emotional components.
- Business Models
- A variety of different business models may be used to exploit the features and advantages of the invention. For example, a service provider may use the software/system to run test centers that subjects physically visit. Tests/studies may be performed on behalf of a third party (e.g., a consumer products company). In this scenario, one or more test leaders may be used to assist/guide the subjects in conjunction with the testing. Self-operated test centers (e.g., kiosks) may also be used with or without a leader. The service provider may collect fees from the third party on a variety of bases. By way of example, the fees may include, but are not limited to, a per-test fee per subject, a per-test fee for a number of subjects, a per-stimulus fee, a fee per segment of subjects, and/or other bases. Additionally, the amount of the fee may vary depending on the type/detail of output. For example, a simple visual attention output (e.g., gaze plot only) may be provided for a first fee. More detailed information (e.g., a gaze plot with the spotlight feature) may be a second fee. A simultaneous display of visual attention information (e.g., a gaze plot with or without a spotlight feature) along with basic emotional response information may be a third fee. Adding more detailed emotional response information (e.g., emotional response for one or more fixation points) may be a fourth fee. Other types of outputs (video, animated, etc.) may command other fees. Other business models for such service providers may be implemented.
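A fee schedule of the kind described above could be sketched as a simple lookup keyed by output tier. All fee amounts, tier names, and the billing formula are hypothetical illustrations, not values stated in the text:

```python
# Hypothetical per-study fees by output detail, mirroring the four tiers above.
OUTPUT_TIER_FEES = {
    "gaze_plot": 100.0,               # first fee: visual attention output only
    "gaze_plot_spotlight": 150.0,     # second fee: adds the spotlight feature
    "attention_plus_emotion": 250.0,  # third fee: adds basic emotional response
    "per_fixation_emotion": 400.0,    # fourth fee: per-fixation emotional detail
}

def compute_study_fee(subjects_tested, per_subject_fee, output_tier):
    """Combine a per-subject charge with the flat fee for the chosen output tier."""
    return subjects_tested * per_subject_fee + OUTPUT_TIER_FEES[output_tier]
```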
- According to another business method, a service provider may operate a remotely accessible (via the Internet or other network) test facility with which subjects can interact remotely. The subject can access the remotely accessible test facility in any of a number of ways, including but not limited to, via a test center, a kiosk, a home or work computer, a mobile wireless device or otherwise. Fees may be charged as indicated above or otherwise.
- According to another aspect of the invention, an invoice module (e.g., invoice module 207) may be used to at least partially automate the billing process. The invoice module 207 may monitor system information and automatically determine fees and generate invoices. Fee information may be input during a setup phase or otherwise. The monitored information may include tests run, subjects tested, stimuli presented, the type and/or level of detail of output, and/or other information upon which fees may be based.
- In the foregoing specification, the invention has been described with reference to specific embodiments thereof. Various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims (48)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/685,552 US20070265507A1 (en) | 2006-03-13 | 2007-03-13 | Visual attention and emotional response detection and display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US78132106P | 2006-03-13 | 2006-03-13 | |
US11/685,552 US20070265507A1 (en) | 2006-03-13 | 2007-03-13 | Visual attention and emotional response detection and display system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070265507A1 true US20070265507A1 (en) | 2007-11-15 |
Family
ID=39876016
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/685,552 Abandoned US20070265507A1 (en) | 2006-03-13 | 2007-03-13 | Visual attention and emotional response detection and display system |
Country Status (5)
Country | Link |
---|---|
US (1) | US20070265507A1 (en) |
EP (1) | EP2007271A2 (en) |
JP (1) | JP2009530071A (en) |
CA (1) | CA2639125A1 (en) |
WO (1) | WO2008129356A2 (en) |
Cited By (210)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060189885A1 (en) * | 2003-01-07 | 2006-08-24 | Monash University | Assessment of cognitive impairment |
US20070066916A1 (en) * | 2005-09-16 | 2007-03-22 | Imotions Emotion Technology Aps | System and method for determining human emotion by analyzing eye properties |
US20080215975A1 (en) * | 2007-03-01 | 2008-09-04 | Phil Harrison | Virtual world user opinion & response monitoring |
US20080222670A1 (en) * | 2007-03-07 | 2008-09-11 | Lee Hans C | Method and system for using coherence of biological responses as a measure of performance of a media |
US20080243005A1 (en) * | 2007-03-30 | 2008-10-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Computational user-health testing |
US20090025023A1 (en) * | 2007-06-06 | 2009-01-22 | Neurofocus Inc. | Multi-market program and commercial response monitoring system using neuro-response measurements |
US20090070798A1 (en) * | 2007-03-02 | 2009-03-12 | Lee Hans C | System and Method for Detecting Viewer Attention to Media Delivery Devices |
US20090094627A1 (en) * | 2007-10-02 | 2009-04-09 | Lee Hans C | Providing Remote Access to Media, and Reaction and Survey Data From Viewers of the Media |
US20090113297A1 (en) * | 2007-10-24 | 2009-04-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Requesting a second content based on a user's reaction to a first content |
US20090133047A1 (en) * | 2007-10-31 | 2009-05-21 | Lee Hans C | Systems and Methods Providing Distributed Collection and Centralized Processing of Physiological Responses from Viewers |
US20090150919A1 (en) * | 2007-11-30 | 2009-06-11 | Lee Michael J | Correlating Media Instance Information With Physiological Responses From Participating Subjects |
US20090157625A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for identifying an avatar-linked population cohort |
US20090157660A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems employing a cohort-linked avatar |
US20090157482A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for indicating behavior in a population cohort |
US20090157481A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a cohort-linked avatar attribute |
US20090158308A1 (en) * | 2007-12-18 | 2009-06-18 | Daniel Weitzenfeld | Identifying key media events and modeling causal relationships between key events and reported feelings |
US20090156907A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090157751A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090164503A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US20090164549A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for determining interest in a cohort-linked avatar |
US20090164131A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US20090172540A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Population cohort-linked avatar |
US20090171164A1 (en) * | 2007-12-17 | 2009-07-02 | Jung Edward K Y | Methods and systems for identifying an avatar-linked population cohort |
WO2010000986A1 (en) * | 2008-07-03 | 2010-01-07 | Mettler Toledo Sas | Transaction terminal and transaction system comprising such terminals linked to a server |
WO2010004426A1 (en) * | 2008-07-09 | 2010-01-14 | Imotions - Emotion Technology A/S | System and method for calibrating and normalizing eye data in emotional testing |
US20100036709A1 (en) * | 2008-08-05 | 2010-02-11 | Ford Motor Company | Method and system of measuring customer satisfaction with purchased vehicle |
US20100039618A1 (en) * | 2008-08-15 | 2010-02-18 | Imotions - Emotion Technology A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
US20100149093A1 (en) * | 2006-12-30 | 2010-06-17 | Red Dot Square Solutions Limited | Virtual reality system including viewer responsiveness to smart objects |
US20100205541A1 (en) * | 2009-02-11 | 2010-08-12 | Jeffrey A. Rapaport | social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
WO2010100567A2 (en) * | 2009-03-06 | 2010-09-10 | Imotions- Emotion Technology A/S | System and method for determining emotional response to olfactory stimuli |
US20100268108A1 (en) * | 2009-03-10 | 2010-10-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Computational systems and methods for health services planning and matching |
US20110077996A1 (en) * | 2009-09-25 | 2011-03-31 | Hyungil Ahn | Multimodal Affective-Cognitive Product Evaluation |
US20110109879A1 (en) * | 2009-11-09 | 2011-05-12 | Daphna Palti-Wasserman | Multivariate dynamic profiling system and methods |
US20110141011A1 (en) * | 2008-09-03 | 2011-06-16 | Koninklijke Philips Electronics N.V. | Method of performing a gaze-based interaction between a user and an interactive display system |
US20110263946A1 (en) * | 2010-04-22 | 2011-10-27 | Mit Media Lab | Method and system for real-time and offline analysis, inference, tagging of and responding to person(s) experiences |
US20120035428A1 (en) * | 2010-06-17 | 2012-02-09 | Kenneth George Roberts | Measurement of emotional response to sensory stimuli |
WO2012073016A1 (en) * | 2010-11-30 | 2012-06-07 | University Of Lincoln | A response detection system and associated methods |
US20120143693A1 (en) * | 2010-12-02 | 2012-06-07 | Microsoft Corporation | Targeting Advertisements Based on Emotion |
US8209224B2 (en) | 2009-10-29 | 2012-06-26 | The Nielsen Company (Us), Llc | Intracluster content management using neuro-response priming data |
US8270814B2 (en) | 2009-01-21 | 2012-09-18 | The Nielsen Company (Us), Llc | Methods and apparatus for providing video with embedded media |
US20120259240A1 (en) * | 2011-04-08 | 2012-10-11 | Nviso Sarl | Method and System for Assessing and Measuring Emotional Intensity to a Stimulus |
US8335716B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Multimedia advertisement exchange |
US8335715B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Advertisement exchange using neuro-response data |
US8376952B2 (en) | 2007-09-07 | 2013-02-19 | The Nielsen Company (Us), Llc. | Method and apparatus for sensing blood oxygen |
US8386312B2 (en) | 2007-05-01 | 2013-02-26 | The Nielsen Company (Us), Llc | Neuro-informatics repository system |
US8386313B2 (en) | 2007-08-28 | 2013-02-26 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US20130054090A1 (en) * | 2011-08-29 | 2013-02-28 | Electronics And Telecommunications Research Institute | Emotion-based vehicle service system, emotion cognition processing apparatus, safe driving service apparatus, and emotion-based safe driving service method |
US8392250B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-response evaluated stimulus in virtual reality environments |
US8392251B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Location aware presentation of stimulus material |
US8392253B2 (en) | 2007-05-16 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
US8392255B2 (en) | 2007-08-29 | 2013-03-05 | The Nielsen Company (Us), Llc | Content based selection and meta tagging of advertisement breaks |
US8392254B2 (en) | 2007-08-28 | 2013-03-05 | The Nielsen Company (Us), Llc | Consumer experience assessment system |
US8396744B2 (en) | 2010-08-25 | 2013-03-12 | The Nielsen Company (Us), Llc | Effective virtual reality environments for presentation of marketing materials |
US8401248B1 (en) | 2008-12-30 | 2013-03-19 | Videomining Corporation | Method and system for measuring emotional and attentional response to dynamic digital media content |
US20130137076A1 (en) * | 2011-11-30 | 2013-05-30 | Kathryn Stone Perez | Head-mounted display based education and instruction |
US8464288B2 (en) | 2009-01-21 | 2013-06-11 | The Nielsen Company (Us), Llc | Methods and apparatus for providing personalized media in video |
US8473044B2 (en) | 2007-03-07 | 2013-06-25 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals |
US8473345B2 (en) | 2007-03-29 | 2013-06-25 | The Nielsen Company (Us), Llc | Protocol generator and presenter device for analysis of marketing and entertainment effectiveness |
WO2012174381A3 (en) * | 2011-06-17 | 2013-07-11 | Microsoft Corporation | Video highlight identification based on environmental sensing |
US8494905B2 (en) | 2007-06-06 | 2013-07-23 | The Nielsen Company (Us), Llc | Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) |
US8494610B2 (en) | 2007-09-20 | 2013-07-23 | The Nielsen Company (Us), Llc | Analysis of marketing and entertainment effectiveness using magnetoencephalography |
US8533042B2 (en) | 2007-07-30 | 2013-09-10 | The Nielsen Company (Us), Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US20130243270A1 (en) * | 2012-03-16 | 2013-09-19 | Gila Kamhi | System and method for dynamic adaption of media based on implicit user input and behavior |
US20140016860A1 (en) * | 2010-06-07 | 2014-01-16 | Affectiva, Inc. | Facial analysis to detect asymmetric expressions |
US8635105B2 (en) | 2007-08-28 | 2014-01-21 | The Nielsen Company (Us), Llc | Consumer experience portrayal effectiveness assessment system |
US8655428B2 (en) | 2010-05-12 | 2014-02-18 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
US8655437B2 (en) | 2009-08-21 | 2014-02-18 | The Nielsen Company (Us), Llc | Analysis of the mirror neuron system for evaluation of stimulus |
US8676937B2 (en) | 2011-05-12 | 2014-03-18 | Jeffrey Alan Rapaport | Social-topical adaptive networking (STAN) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging |
US20140127662A1 (en) * | 2006-07-12 | 2014-05-08 | Frederick W. Kron | Computerized medical training system |
US20140149177A1 (en) * | 2012-11-23 | 2014-05-29 | Ari M. Frank | Responding to uncertainty of a user regarding an experience by presenting a prior experience |
US8764652B2 (en) | 2007-03-08 | 2014-07-01 | The Nielson Company (US), LLC. | Method and system for measuring and ranking an “engagement” response to audiovisual or interactive media, products, or activities using physiological signals |
US20140192325A1 (en) * | 2012-12-11 | 2014-07-10 | Ami Klin | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US8782681B2 (en) | 2007-03-08 | 2014-07-15 | The Nielsen Company (Us), Llc | Method and system for rating media and events in media based on physiological data |
US20140200417A1 (en) * | 2010-06-07 | 2014-07-17 | Affectiva, Inc. | Mental state analysis using blink rate |
WO2014116826A1 (en) * | 2013-01-24 | 2014-07-31 | The Trustees Of Columbia University In The City Of New York | Mobile, neurally-assisted personal assistant |
WO2014131860A1 (en) * | 2013-02-28 | 2014-09-04 | Carl Zeiss Meditec Ag | Systems and methods for improved ease and accuracy of gaze tracking |
US20140253303A1 (en) * | 2013-03-11 | 2014-09-11 | Immersion Corporation | Automatic haptic effect adjustment system |
US8869115B2 (en) | 2011-11-23 | 2014-10-21 | General Electric Company | Systems and methods for emotive software usability |
US20140323817A1 (en) * | 2010-06-07 | 2014-10-30 | Affectiva, Inc. | Personal emotional profile generation |
US20140336526A1 (en) * | 2011-11-22 | 2014-11-13 | Jorge Otero-Millan | System and method for using microsaccade dynamics to measure attentional response to a stimulus |
US8989835B2 (en) | 2012-08-17 | 2015-03-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US20150193688A1 (en) * | 2011-10-20 | 2015-07-09 | Gil Thieberger | Estimating affective response to a token instance utilizing a predicted affective response to its background |
US20150332166A1 (en) * | 2013-09-20 | 2015-11-19 | Intel Corporation | Machine learning-based user behavior characterization |
US9215996B2 (en) | 2007-03-02 | 2015-12-22 | The Nielsen Company (Us), Llc | Apparatus and method for objectively determining human response to media |
WO2015193806A1 (en) * | 2014-06-17 | 2015-12-23 | Koninklijke Philips N.V. | Evaluating clinician attention |
US9235968B2 (en) | 2013-03-14 | 2016-01-12 | Otoy, Inc. | Tactile elements for a wearable eye piece |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9292858B2 (en) | 2012-02-27 | 2016-03-22 | The Nielsen Company (Us), Llc | Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments |
US20160109945A1 (en) * | 2013-05-30 | 2016-04-21 | Umoove Services Ltd. | Smooth pursuit gaze tracking |
US9320450B2 (en) | 2013-03-14 | 2016-04-26 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US9357240B2 (en) | 2009-01-21 | 2016-05-31 | The Nielsen Company (Us), Llc | Methods and apparatus for providing alternate media for video decoders |
US9351658B2 (en) | 2005-09-02 | 2016-05-31 | The Nielsen Company (Us), Llc | Device and method for sensing electrical activity in tissue |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US9400993B2 (en) | 2006-12-30 | 2016-07-26 | Red Dot Square Solutions Limited | Virtual reality system including smart objects |
US9451303B2 (en) | 2012-02-27 | 2016-09-20 | The Nielsen Company (Us), Llc | Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing |
US9454646B2 (en) | 2010-04-19 | 2016-09-27 | The Nielsen Company (Us), Llc | Short imagery task (SIT) research method |
US9513699B2 (en) | 2007-10-24 | 2016-12-06 | Invention Science Fund I, LL | Method of selecting a second content based on a user's reaction to a first content |
US9560984B2 (en) | 2009-10-29 | 2017-02-07 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US9569986B2 (en) | 2012-02-27 | 2017-02-14 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications |
US9582805B2 (en) | 2007-10-24 | 2017-02-28 | Invention Science Fund I, Llc | Returning a personalized advertisement |
US20170103424A1 (en) * | 2015-10-13 | 2017-04-13 | Mastercard International Incorporated | Systems and methods for generating mood-based advertisements based on consumer diagnostic measurements |
US9622703B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US20170119296A1 (en) * | 2014-06-11 | 2017-05-04 | Dignity Health | Systems and methods for non-intrusive deception detection |
US9773332B2 (en) | 2013-03-14 | 2017-09-26 | Otoy, Inc. | Visual cortex thought detector interface |
US9858540B2 (en) | 2009-03-10 | 2018-01-02 | Gearbox, Llc | Computational systems and methods for health services planning and matching |
US9867546B2 (en) | 2015-06-14 | 2018-01-16 | Facense Ltd. | Wearable device for taking symmetric thermal measurements |
US9886729B2 (en) | 2009-03-10 | 2018-02-06 | Gearbox, Llc | Computational systems and methods for health services planning and matching |
US9886981B2 (en) | 2007-05-01 | 2018-02-06 | The Nielsen Company (Us), Llc | Neuro-feedback based stimulus compression device |
US9892435B2 (en) | 2009-03-10 | 2018-02-13 | Gearbox Llc | Computational systems and methods for health services planning and matching |
US9936250B2 (en) | 2015-05-19 | 2018-04-03 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual |
US9934425B2 (en) | 2010-06-07 | 2018-04-03 | Affectiva, Inc. | Collection of affect data from multiple mobile devices |
US9959549B2 (en) | 2010-06-07 | 2018-05-01 | Affectiva, Inc. | Mental state analysis for norm generation |
US9968264B2 (en) | 2015-06-14 | 2018-05-15 | Facense Ltd. | Detecting physiological responses based on thermal asymmetry of the face |
US10045726B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Selecting a stressor based on thermal measurements of the face |
US10045699B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Determining a state of a user based on thermal measurements of the forehead |
US10048748B2 (en) | 2013-11-12 | 2018-08-14 | Excalibur Ip, Llc | Audio-visual interaction with user devices |
US10045737B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Clip-on device with inward-facing cameras |
US10064559B2 (en) | 2015-06-14 | 2018-09-04 | Facense Ltd. | Identification of the dominant nostril using thermal measurements |
US10074024B2 (en) | 2010-06-07 | 2018-09-11 | Affectiva, Inc. | Mental state analysis using blink rate for vehicles |
US10076270B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses while accounting for touching the face |
US10076250B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses based on multispectral data from head-mounted cameras |
US10080861B2 (en) | 2015-06-14 | 2018-09-25 | Facense Ltd. | Breathing biofeedback eyeglasses |
US10085685B2 (en) | 2015-06-14 | 2018-10-02 | Facense Ltd. | Selecting triggers of an allergic reaction based on nasal temperatures |
US10092232B2 (en) | 2015-06-14 | 2018-10-09 | Facense Ltd. | User state selection based on the shape of the exhale stream |
US10113913B2 (en) | 2015-10-03 | 2018-10-30 | Facense Ltd. | Systems for collecting thermal measurements of the face |
US10130308B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Calculating respiratory parameters from thermal measurements |
US10130299B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Neurofeedback eyeglasses |
US10130261B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Detecting physiological responses while taking into account consumption of confounding substances |
US10136856B2 (en) | 2016-06-27 | 2018-11-27 | Facense Ltd. | Wearable respiration measurements system |
US10136852B2 (en) | 2015-06-14 | 2018-11-27 | Facense Ltd. | Detecting an allergic reaction from nasal temperatures |
US10143414B2 (en) | 2010-06-07 | 2018-12-04 | Affectiva, Inc. | Sporadic collection with mobile affect data |
US10151636B2 (en) | 2015-06-14 | 2018-12-11 | Facense Ltd. | Eyeglasses having inward-facing and outward-facing thermal cameras |
US10154810B2 (en) | 2015-06-14 | 2018-12-18 | Facense Ltd. | Security system that detects atypical behavior |
US10159411B2 (en) | 2015-06-14 | 2018-12-25 | Facense Ltd. | Detecting irregular physiological responses during exposure to sensitive data |
US10171877B1 (en) | 2017-10-30 | 2019-01-01 | Dish Network L.L.C. | System and method for dynamically selecting supplemental content based on viewer emotions |
US10204625B2 (en) | 2010-06-07 | 2019-02-12 | Affectiva, Inc. | Audio analysis learning using video data |
US10216981B2 (en) | 2015-06-14 | 2019-02-26 | Facense Ltd. | Eyeglasses that measure facial skin color changes |
US10225621B1 (en) | 2017-12-20 | 2019-03-05 | Dish Network L.L.C. | Eyes free entertainment |
US20190102706A1 (en) * | 2011-10-20 | 2019-04-04 | Affectomatics Ltd. | Affective response based recommendations |
US10289898B2 (en) | 2010-06-07 | 2019-05-14 | Affectiva, Inc. | Video recommendation via affect |
US10299717B2 (en) | 2015-06-14 | 2019-05-28 | Facense Ltd. | Detecting stress based on thermal measurements of the face |
US10319471B2 (en) | 2009-03-10 | 2019-06-11 | Gearbox Llc | Computational systems and methods for health services planning and matching |
US10354261B2 (en) * | 2014-04-16 | 2019-07-16 | 2020 Ip Llc | Systems and methods for virtual environment construction for behavioral research |
US20190244617A1 (en) * | 2017-01-25 | 2019-08-08 | International Business Machines Corporation | Conflict Resolution Enhancement System |
US10376183B2 (en) | 2014-04-29 | 2019-08-13 | Dignity Health | Systems and methods for non-intrusive drug impairment detection |
US10401860B2 (en) | 2010-06-07 | 2019-09-03 | Affectiva, Inc. | Image analysis for two-sided data hub |
US10474875B2 (en) | 2010-06-07 | 2019-11-12 | Affectiva, Inc. | Image analysis using a semiconductor processor for facial evaluation |
CN110432915A (en) * | 2019-08-02 | 2019-11-12 | 秒针信息技术有限公司 | Method and device for assessing information flow intention |
CN110464365A (en) * | 2018-05-10 | 2019-11-19 | 深圳先进技术研究院 | Method, apparatus, device and storage medium for determining attention level |
US10482333B1 (en) | 2017-01-04 | 2019-11-19 | Affectiva, Inc. | Mental state analysis using blink rate within vehicles |
US10514553B2 (en) | 2015-06-30 | 2019-12-24 | 3M Innovative Properties Company | Polarizing beam splitting system |
US10517521B2 (en) | 2010-06-07 | 2019-12-31 | Affectiva, Inc. | Mental state mood analysis using heart rate collection based on video imagery |
US10523852B2 (en) | 2015-06-14 | 2019-12-31 | Facense Ltd. | Wearable inward-facing camera utilizing the Scheimpflug principle |
US10546310B2 (en) * | 2013-11-18 | 2020-01-28 | Sentient Decision Science, Inc. | Systems and methods for assessing implicit associations |
US10592757B2 (en) | 2010-06-07 | 2020-03-17 | Affectiva, Inc. | Vehicular cognitive data collection using multiple devices |
US10602214B2 (en) | 2017-01-19 | 2020-03-24 | International Business Machines Corporation | Cognitive television remote control |
US10614289B2 (en) | 2010-06-07 | 2020-04-07 | Affectiva, Inc. | Facial tracking with classifiers |
US10617295B2 (en) | 2013-10-17 | 2020-04-14 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for assessing infant and child development via eye tracking |
TWI691941B (en) * | 2018-02-13 | 2020-04-21 | 林俊毅 | Detection system and method for logical thinking ability |
US10628741B2 (en) | 2010-06-07 | 2020-04-21 | Affectiva, Inc. | Multimodal machine learning for emotion metrics |
US10627817B2 (en) | 2010-06-07 | 2020-04-21 | Affectiva, Inc. | Vehicle manipulation using occupant image analysis |
US10628985B2 (en) | 2017-12-01 | 2020-04-21 | Affectiva, Inc. | Avatar image animation using translation vectors |
US10748439B1 (en) * | 2014-10-13 | 2020-08-18 | The Cognitive Healthcare Company | Automated delivery of unique, equivalent task versions for computer delivered testing environments |
US10779761B2 (en) | 2010-06-07 | 2020-09-22 | Affectiva, Inc. | Sporadic collection of affect data within a vehicle |
US10796176B2 (en) | 2010-06-07 | 2020-10-06 | Affectiva, Inc. | Personal emotional profile generation for vehicle manipulation |
US10799168B2 (en) | 2010-06-07 | 2020-10-13 | Affectiva, Inc. | Individual data sharing across a social network |
US10843078B2 (en) | 2010-06-07 | 2020-11-24 | Affectiva, Inc. | Affect usage within a gaming context |
US20200379560A1 (en) * | 2016-01-21 | 2020-12-03 | Microsoft Technology Licensing, Llc | Implicitly adaptive eye-tracking user interface |
US10869626B2 (en) | 2010-06-07 | 2020-12-22 | Affectiva, Inc. | Image analysis for emotional metric evaluation |
US10897650B2 (en) | 2010-06-07 | 2021-01-19 | Affectiva, Inc. | Vehicle content recommendation using cognitive states |
US10911829B2 (en) | 2010-06-07 | 2021-02-02 | Affectiva, Inc. | Vehicle video recommendation via affect |
US10922566B2 (en) | 2017-05-09 | 2021-02-16 | Affectiva, Inc. | Cognitive state evaluation for vehicle navigation |
US10922567B2 (en) | 2010-06-07 | 2021-02-16 | Affectiva, Inc. | Cognitive state based vehicle manipulation using near-infrared image processing |
CN112535479A (en) * | 2020-12-04 | 2021-03-23 | 中国科学院深圳先进技术研究院 | Method for determining emotional processing tendency and related product |
US10963895B2 (en) | 2007-09-20 | 2021-03-30 | Nielsen Consumer Llc | Personalized content delivery using neuro-response priming data |
US10987015B2 (en) | 2009-08-24 | 2021-04-27 | Nielsen Consumer Llc | Dry electrodes for electroencephalography |
US11017250B2 (en) | 2010-06-07 | 2021-05-25 | Affectiva, Inc. | Vehicle manipulation using convolutional image processing |
US11030662B2 (en) * | 2007-04-16 | 2021-06-08 | Ebay Inc. | Visualization of reputation ratings |
US11048921B2 (en) * | 2018-05-09 | 2021-06-29 | Nviso Sa | Image processing system for extracting a behavioral profile from images of an individual specific to an event |
US11056225B2 (en) | 2010-06-07 | 2021-07-06 | Affectiva, Inc. | Analytics for livestreaming based on image analysis within a shared digital environment |
US11067405B2 (en) | 2010-06-07 | 2021-07-20 | Affectiva, Inc. | Cognitive state vehicle navigation based on image processing |
US11073899B2 (en) | 2010-06-07 | 2021-07-27 | Affectiva, Inc. | Multidevice multimodal emotion services monitoring |
US11151610B2 (en) | 2010-06-07 | 2021-10-19 | Affectiva, Inc. | Autonomous vehicle control using heart rate collection based on video imagery |
WO2021217188A1 (en) * | 2020-04-23 | 2021-10-28 | Ahmad Hassan Abu Elreich | Methods, systems, apparatuses, and devices for facilitating a driver to advertise products to passengers |
US11232290B2 (en) | 2010-06-07 | 2022-01-25 | Affectiva, Inc. | Image analysis using sub-sectional component evaluation to augment classifier usage |
US11292477B2 (en) | 2010-06-07 | 2022-04-05 | Affectiva, Inc. | Vehicle manipulation using cognitive state engineering |
WO2022079305A1 (en) * | 2020-10-15 | 2022-04-21 | Bioserenity | System and method for remotely controlled therapy |
US11318949B2 (en) | 2010-06-07 | 2022-05-03 | Affectiva, Inc. | In-vehicle drowsiness analysis using blink rate |
US11343596B2 (en) * | 2017-09-29 | 2022-05-24 | Warner Bros. Entertainment Inc. | Digitally representing user engagement with directed content based on biometric sensor data |
US11393133B2 (en) | 2010-06-07 | 2022-07-19 | Affectiva, Inc. | Emoji manipulation using machine learning |
US11410438B2 (en) | 2010-06-07 | 2022-08-09 | Affectiva, Inc. | Image analysis using a semiconductor processor for facial evaluation in vehicles |
US11430561B2 (en) | 2010-06-07 | 2022-08-30 | Affectiva, Inc. | Remote computing analysis for cognitive state data metrics |
US11430260B2 (en) | 2010-06-07 | 2022-08-30 | Affectiva, Inc. | Electronic display viewing verification |
US11465640B2 (en) | 2010-06-07 | 2022-10-11 | Affectiva, Inc. | Directed control transfer for autonomous vehicles |
US11481788B2 (en) | 2009-10-29 | 2022-10-25 | Nielsen Consumer Llc | Generating ratings predictions using neuro-response data |
US11484685B2 (en) | 2010-06-07 | 2022-11-01 | Affectiva, Inc. | Robotic control using profiles |
US11511757B2 (en) | 2010-06-07 | 2022-11-29 | Affectiva, Inc. | Vehicle manipulation with crowdsourcing |
US11587357B2 (en) | 2010-06-07 | 2023-02-21 | Affectiva, Inc. | Vehicular cognitive data collection with multiple devices |
US11601715B2 (en) | 2017-07-06 | 2023-03-07 | DISH Technologies L.L.C. | System and method for dynamically adjusting content playback based on viewer emotions |
WO2023037348A1 (en) * | 2021-09-13 | 2023-03-16 | Benjamin Simon Thompson | System and method for monitoring human-device interactions |
US11657288B2 (en) | 2010-06-07 | 2023-05-23 | Affectiva, Inc. | Convolutional computing using multilayered analysis engine |
US11700420B2 (en) | 2010-06-07 | 2023-07-11 | Affectiva, Inc. | Media manipulation using cognitive state metric analysis |
US11704681B2 (en) | 2009-03-24 | 2023-07-18 | Nielsen Consumer Llc | Neurological profiles for market matching and stimulus presentation |
US11704574B2 (en) | 2010-06-07 | 2023-07-18 | Affectiva, Inc. | Multimodal machine learning for vehicle manipulation |
US11769056B2 (en) | 2019-12-30 | 2023-09-26 | Affectiva, Inc. | Synthetic data for neural network training using vectors |
US11816743B1 (en) | 2010-08-10 | 2023-11-14 | Jeffrey Alan Rapaport | Information enhancing method using software agents in a social networking system |
US11823055B2 (en) | 2019-03-31 | 2023-11-21 | Affectiva, Inc. | Vehicular in-cabin sensing using machine learning |
US11887383B2 (en) | 2019-03-31 | 2024-01-30 | Affectiva, Inc. | Vehicle interior object management |
US11887352B2 (en) | 2010-06-07 | 2024-01-30 | Affectiva, Inc. | Live streaming analytics within a shared digital environment |
US11935281B2 (en) | 2010-06-07 | 2024-03-19 | Affectiva, Inc. | Vehicular in-cabin facial tracking using machine learning |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9558499B2 (en) | 2009-02-27 | 2017-01-31 | The Forbes Consulting Group, Llc | Methods and systems for assessing psychological characteristics |
PT2515760E (en) | 2009-12-21 | 2014-05-23 | Fundación Tecnalia Res & Innovation | Affective well-being supervision system and method |
EP2637563A4 (en) * | 2010-11-08 | 2014-04-30 | Optalert Australia Pty Ltd | Fitness for work test |
JP5441071B2 (en) * | 2011-09-15 | 2014-03-12 | 国立大学法人 大阪教育大学 | Face analysis device, face analysis method, and program |
PL2844146T3 (en) * | 2012-04-24 | 2016-12-30 | | System and method of measuring attention |
JP2016521411A (en) * | 2013-04-10 | 2016-07-21 | オークランド ユニサービシズ リミテッド | Head and eye tracking |
JP6201520B2 (en) * | 2013-08-21 | 2017-09-27 | 大日本印刷株式会社 | Gaze analysis system and method using physiological indices |
DE102013017820A1 (en) * | 2013-10-23 | 2015-04-23 | Humboldt-Universität Zu Berlin | Method and system for visualizing the emotional impact of visual stimulation |
DE102014104415A1 (en) * | 2014-03-28 | 2015-10-01 | Herbstwerbung Gmbh | attention acquisition |
US10230805B2 (en) | 2015-09-24 | 2019-03-12 | International Business Machines Corporation | Determining and displaying user awareness of information |
KR101734845B1 (en) * | 2015-11-13 | 2017-05-15 | 가톨릭대학교 산학협력단 | Emotion classification apparatus using visual analysis and method thereof |
US11163359B2 (en) | 2016-11-10 | 2021-11-02 | Neurotrack Technologies, Inc. | Method and system for correlating an image capturing device to a human user for analyzing gaze information associated with cognitive performance |
US10517520B2 (en) * | 2016-11-10 | 2019-12-31 | Neurotrack Technologies, Inc. | Method and system for correlating an image capturing device to a human user for analysis of cognitive performance |
US11249548B2 (en) | 2016-11-10 | 2022-02-15 | Neurotrack Technologies, Inc. | Method and system for correlating an image capturing device to a human user for analyzing gaze information associated with cognitive performance |
WO2019220428A1 (en) * | 2018-05-16 | 2019-11-21 | Moodify Ltd. | Emotional state monitoring and modification system |
KR20210093281A (en) * | 2018-11-09 | 2021-07-27 | 아킬리 인터랙티브 랩스 인크. | Facial expression detection for the examination and treatment of emotional disorders |
JP6755529B1 (en) * | 2019-11-21 | 2020-09-16 | 株式会社スワローインキュベート | Information processing method, information processing device, and control program |
JP6802549B1 (en) * | 2020-08-17 | 2020-12-16 | 株式会社スワローインキュベート | Information processing method, information processing device, and control program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2622365A1 (en) * | 2005-09-16 | 2007-09-13 | Imotions-Emotion Technology A/S | System and method for determining human emotion by analyzing eye properties |
- 2007
- 2007-03-13 EP EP20070873991 patent/EP2007271A2/en not_active Withdrawn
- 2007-03-13 WO PCT/IB2007/004587 patent/WO2008129356A2/en active Application Filing
- 2007-03-13 JP JP2009510570A patent/JP2009530071A/en not_active Withdrawn
- 2007-03-13 CA CA 2639125 patent/CA2639125A1/en not_active Abandoned
- 2007-03-13 US US11/685,552 patent/US20070265507A1/en not_active Abandoned
Patent Citations (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3827789A (en) * | 1971-01-08 | 1974-08-06 | Biometrics Inc | Monitoring devices |
US5243517A (en) * | 1988-08-03 | 1993-09-07 | Westinghouse Electric Corp. | Method and apparatus for physiological evaluation of short films and entertainment materials |
US6292688B1 (en) * | 1996-02-28 | 2001-09-18 | Advanced Neurotechnologies, Inc. | Method and apparatus for analyzing neurological response to emotion-inducing stimuli |
US7120880B1 (en) * | 1999-02-25 | 2006-10-10 | International Business Machines Corporation | Method and system for real-time determination of a subject's interest level to media content |
US7593952B2 (en) * | 1999-04-09 | 2009-09-22 | Soll Andrew H | Enhanced medical treatment system |
US6120461A (en) * | 1999-08-09 | 2000-09-19 | The United States Of America As Represented By The Secretary Of The Army | Apparatus for tracking the human eye with a retinal scanning display, and method thereof |
US7191403B2 (en) * | 2000-03-17 | 2007-03-13 | Schlucktronic Llc | Methods and devices for reconstructing visual stimuli observed through browser-based interfaces over time |
US7356470B2 (en) * | 2000-11-10 | 2008-04-08 | Adam Roth | Text-to-speech and image generation of multimedia attachments to e-mail |
US6385590B1 (en) * | 2000-11-22 | 2002-05-07 | Philip Levine | Method and system for determining the effectiveness of a stimulus |
US20020133347A1 (en) * | 2000-12-29 | 2002-09-19 | Eberhard Schoneburg | Method and apparatus for natural language dialog interface |
US20040098298A1 (en) * | 2001-01-24 | 2004-05-20 | Yin Jia Hong | Monitoring responses to visual stimuli |
US7027621B1 (en) * | 2001-03-15 | 2006-04-11 | Mikos, Ltd. | Method and apparatus for operator condition monitoring and assessment |
US7113916B1 (en) * | 2001-09-07 | 2006-09-26 | Hill Daniel A | Method of facial coding monitoring for the purpose of gauging the impact and appeal of commercially-related stimuli |
US20030125610A1 (en) * | 2001-10-26 | 2003-07-03 | Sachs Gary Steven | Computer system and method for training certifying or monitoring human clinical raters |
US7110582B1 (en) * | 2001-11-09 | 2006-09-19 | Hay Sam H | Method for determining binocular balance and disorders of binocularity of an individual or clinical groups of individuals |
US20030123027A1 (en) * | 2001-12-28 | 2003-07-03 | International Business Machines Corporation | System and method for eye gaze tracking using corneal image mapping |
US20040009462A1 (en) * | 2002-05-21 | 2004-01-15 | Mcelwrath Linda Kay | Learning system |
US7306337B2 (en) * | 2003-03-06 | 2007-12-11 | Rensselaer Polytechnic Institute | Calibration-free gaze tracking under natural head movement |
US20070097319A1 (en) * | 2003-03-27 | 2007-05-03 | Mckay Stuart | Stereoscopic display |
US7401920B1 (en) * | 2003-05-20 | 2008-07-22 | Elbit Systems Ltd. | Head mounted eye tracking and display system |
US20060110008A1 (en) * | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking |
US20050175218A1 (en) * | 2003-11-14 | 2005-08-11 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections |
US20050234779A1 (en) * | 2003-11-17 | 2005-10-20 | Leo Chiu | System for dynamic AD selection and placement within a voice application accessed through an electronic information space |
US7302475B2 (en) * | 2004-02-20 | 2007-11-27 | Harris Interactive, Inc. | System and method for measuring reactions to product packaging, advertising, or product features over a computer-based network |
US20050225723A1 (en) * | 2004-03-25 | 2005-10-13 | Maurizio Pilu | Self-calibration for an eye tracker |
US7657062B2 (en) * | 2004-03-25 | 2010-02-02 | Hewlett-Packard Development Company, L.P. | Self-calibration for an eye tracker |
US7740631B2 (en) * | 2004-10-15 | 2010-06-22 | Baxano, Inc. | Devices and methods for tissue modification |
US7689499B1 (en) * | 2005-02-24 | 2010-03-30 | Trading Technologies International, Inc. | System and method for displaying market data in an electronic trading environment |
US20070150916A1 (en) * | 2005-12-28 | 2007-06-28 | James Begole | Using sensors to provide feedback on the access of digital content |
US7747068B1 (en) * | 2006-01-20 | 2010-06-29 | Andrew Paul Smyth | Systems and methods for tracking the eye |
Cited By (344)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060189885A1 (en) * | 2003-01-07 | 2006-08-24 | Monash University | Assessment of cognitive impairment |
US10506941B2 (en) | 2005-08-09 | 2019-12-17 | The Nielsen Company (Us), Llc | Device and method for sensing electrical activity in tissue |
US11638547B2 (en) | 2005-08-09 | 2023-05-02 | Nielsen Consumer Llc | Device and method for sensing electrical activity in tissue |
US9351658B2 (en) | 2005-09-02 | 2016-05-31 | The Nielsen Company (Us), Llc | Device and method for sensing electrical activity in tissue |
US20070066916A1 (en) * | 2005-09-16 | 2007-03-22 | Imotions Emotion Technology Aps | System and method for determining human emotion by analyzing eye properties |
US20140127662A1 (en) * | 2006-07-12 | 2014-05-08 | Frederick W. Kron | Computerized medical training system |
US10074129B2 (en) | 2006-12-30 | 2018-09-11 | Red Dot Square Solutions Limited | Virtual reality system including smart objects |
US9940589B2 (en) * | 2006-12-30 | 2018-04-10 | Red Dot Square Solutions Limited | Virtual reality system including viewer responsiveness to smart objects |
US20100149093A1 (en) * | 2006-12-30 | 2010-06-17 | Red Dot Square Solutions Limited | Virtual reality system including viewer responsiveness to smart objects |
US9400993B2 (en) | 2006-12-30 | 2016-07-26 | Red Dot Square Solutions Limited | Virtual reality system including smart objects |
US20080215975A1 (en) * | 2007-03-01 | 2008-09-04 | Phil Harrison | Virtual world user opinion & response monitoring |
US9215996B2 (en) | 2007-03-02 | 2015-12-22 | The Nielsen Company (Us), Llc | Apparatus and method for objectively determining human response to media |
US20090070798A1 (en) * | 2007-03-02 | 2009-03-12 | Lee Hans C | System and Method for Detecting Viewer Attention to Media Delivery Devices |
US8473044B2 (en) | 2007-03-07 | 2013-06-25 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals |
US20080222670A1 (en) * | 2007-03-07 | 2008-09-11 | Lee Hans C | Method and system for using coherence of biological responses as a measure of performance of a media |
US8973022B2 (en) | 2007-03-07 | 2015-03-03 | The Nielsen Company (Us), Llc | Method and system for using coherence of biological responses as a measure of performance of a media |
US8230457B2 (en) | 2007-03-07 | 2012-07-24 | The Nielsen Company (Us), Llc. | Method and system for using coherence of biological responses as a measure of performance of a media |
US8782681B2 (en) | 2007-03-08 | 2014-07-15 | The Nielsen Company (Us), Llc | Method and system for rating media and events in media based on physiological data |
US8764652B2 (en) | 2007-03-08 | 2014-07-01 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking an “engagement” response to audiovisual or interactive media, products, or activities using physiological signals |
US11790393B2 (en) | 2007-03-29 | 2023-10-17 | Nielsen Consumer Llc | Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data |
US11250465B2 (en) | 2007-03-29 | 2022-02-15 | Nielsen Consumer Llc | Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data |
US10679241B2 (en) | 2007-03-29 | 2020-06-09 | The Nielsen Company (Us), Llc | Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data |
US8484081B2 (en) | 2007-03-29 | 2013-07-09 | The Nielsen Company (Us), Llc | Analysis of marketing and entertainment effectiveness using central nervous system, autonomic nervous system, and effector data |
US8473345B2 (en) | 2007-03-29 | 2013-06-25 | The Nielsen Company (Us), Llc | Protocol generator and presenter device for analysis of marketing and entertainment effectiveness |
US20080243005A1 (en) * | 2007-03-30 | 2008-10-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Computational user-health testing |
US11030662B2 (en) * | 2007-04-16 | 2021-06-08 | Ebay Inc. | Visualization of reputation ratings |
US11763356B2 (en) | 2007-04-16 | 2023-09-19 | Ebay Inc. | Visualization of reputation ratings |
US8386312B2 (en) | 2007-05-01 | 2013-02-26 | The Nielsen Company (Us), Llc | Neuro-informatics repository system |
US9886981B2 (en) | 2007-05-01 | 2018-02-06 | The Nielsen Company (Us), Llc | Neuro-feedback based stimulus compression device |
US8392253B2 (en) | 2007-05-16 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
US11049134B2 (en) | 2007-05-16 | 2021-06-29 | Nielsen Consumer Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
US10580031B2 (en) | 2007-05-16 | 2020-03-03 | The Nielsen Company (Us), Llc | Neuro-physiology and neuro-behavioral based stimulus targeting system |
US20090025023A1 (en) * | 2007-06-06 | 2009-01-22 | Neurofocus Inc. | Multi-market program and commercial response monitoring system using neuro-response measurements |
US8494905B2 (en) | 2007-06-06 | 2013-07-23 | The Nielsen Company (Us), Llc | Audience response analysis using simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) |
US11763340B2 (en) | 2007-07-30 | 2023-09-19 | Nielsen Consumer Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US8533042B2 (en) | 2007-07-30 | 2013-09-10 | The Nielsen Company (Us), Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US10733625B2 (en) | 2007-07-30 | 2020-08-04 | The Nielsen Company (Us), Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US11244345B2 (en) | 2007-07-30 | 2022-02-08 | Nielsen Consumer Llc | Neuro-response stimulus and stimulus attribute resonance estimator |
US8386313B2 (en) | 2007-08-28 | 2013-02-26 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US10127572B2 (en) | 2007-08-28 | 2018-11-13 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US8392254B2 (en) | 2007-08-28 | 2013-03-05 | The Nielsen Company (Us), Llc | Consumer experience assessment system |
US8635105B2 (en) | 2007-08-28 | 2014-01-21 | The Nielsen Company (Us), Llc | Consumer experience portrayal effectiveness assessment system |
US11488198B2 (en) | 2007-08-28 | 2022-11-01 | Nielsen Consumer Llc | Stimulus placement system using subject neuro-response measurements |
US10937051B2 (en) | 2007-08-28 | 2021-03-02 | The Nielsen Company (Us), Llc | Stimulus placement system using subject neuro-response measurements |
US10140628B2 (en) | 2007-08-29 | 2018-11-27 | The Nielsen Company (Us), Llc | Content based selection and meta tagging of advertisement breaks |
US11610223B2 (en) | 2007-08-29 | 2023-03-21 | Nielsen Consumer Llc | Content based selection and meta tagging of advertisement breaks |
US8392255B2 (en) | 2007-08-29 | 2013-03-05 | The Nielsen Company (Us), Llc | Content based selection and meta tagging of advertisement breaks |
US11023920B2 (en) | 2007-08-29 | 2021-06-01 | Nielsen Consumer Llc | Content based selection and meta tagging of advertisement breaks |
US8376952B2 (en) | 2007-09-07 | 2013-02-19 | The Nielsen Company (Us), Llc. | Method and apparatus for sensing blood oxygen |
US8494610B2 (en) | 2007-09-20 | 2013-07-23 | The Nielsen Company (Us), Llc | Analysis of marketing and entertainment effectiveness using magnetoencephalography |
US10963895B2 (en) | 2007-09-20 | 2021-03-30 | Nielsen Consumer Llc | Personalized content delivery using neuro-response priming data |
US9021515B2 (en) | 2007-10-02 | 2015-04-28 | The Nielsen Company (Us), Llc | Systems and methods to determine media effectiveness |
US8332883B2 (en) | 2007-10-02 | 2012-12-11 | The Nielsen Company (Us), Llc | Providing actionable insights based on physiological responses from viewers of media |
US9571877B2 (en) | 2007-10-02 | 2017-02-14 | The Nielsen Company (Us), Llc | Systems and methods to determine media effectiveness |
US8327395B2 (en) | 2007-10-02 | 2012-12-04 | The Nielsen Company (Us), Llc | System providing actionable insights based on physiological responses from viewers of media |
US8151292B2 (en) | 2007-10-02 | 2012-04-03 | Emsense Corporation | System for remote access to media, and reaction and survey data from viewers of the media |
US9894399B2 (en) | 2007-10-02 | 2018-02-13 | The Nielsen Company (Us), Llc | Systems and methods to determine media effectiveness |
US20090094627A1 (en) * | 2007-10-02 | 2009-04-09 | Lee Hans C | Providing Remote Access to Media, and Reaction and Survey Data From Viewers of the Media |
US9582805B2 (en) | 2007-10-24 | 2017-02-28 | Invention Science Fund I, Llc | Returning a personalized advertisement |
US20090113297A1 (en) * | 2007-10-24 | 2009-04-30 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Requesting a second content based on a user's reaction to a first content |
US9513699B2 (en) | 2007-10-24 | 2016-12-06 | Invention Science Fund I, LLC | Method of selecting a second content based on a user's reaction to a first content |
US11250447B2 (en) | 2007-10-31 | 2022-02-15 | Nielsen Consumer Llc | Systems and methods providing en mass collection and centralized processing of physiological responses from viewers |
US9521960B2 (en) | 2007-10-31 | 2016-12-20 | The Nielsen Company (Us), Llc | Systems and methods providing en mass collection and centralized processing of physiological responses from viewers |
US10580018B2 (en) | 2007-10-31 | 2020-03-03 | The Nielsen Company (Us), Llc | Systems and methods providing en mass collection and centralized processing of physiological responses from viewers |
US20090133047A1 (en) * | 2007-10-31 | 2009-05-21 | Lee Hans C | Systems and Methods Providing Distributed Collection and Centralized Processing of Physiological Responses from Viewers |
US20090150919A1 (en) * | 2007-11-30 | 2009-06-11 | Lee Michael J | Correlating Media Instance Information With Physiological Responses From Participating Subjects |
US20090157625A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for identifying an avatar-linked population cohort |
US20090157660A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems employing a cohort-linked avatar |
US20090157481A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a cohort-linked avatar attribute |
US8615479B2 (en) | 2007-12-13 | 2013-12-24 | The Invention Science Fund I, Llc | Methods and systems for indicating behavior in a population cohort |
US9495684B2 (en) | 2007-12-13 | 2016-11-15 | The Invention Science Fund I, Llc | Methods and systems for indicating behavior in a population cohort |
US20090157482A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for indicating behavior in a population cohort |
US9211077B2 (en) * | 2007-12-13 | 2015-12-15 | The Invention Science Fund I, Llc | Methods and systems for specifying an avatar |
US20090157751A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090156907A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090171164A1 (en) * | 2007-12-17 | 2009-07-02 | Jung Edward K Y | Methods and systems for identifying an avatar-linked population cohort |
US8347326B2 (en) * | 2007-12-18 | 2013-01-01 | The Nielsen Company (US) | Identifying key media events and modeling causal relationships between key events and reported feelings |
US8793715B1 (en) | 2007-12-18 | 2014-07-29 | The Nielsen Company (Us), Llc | Identifying key media events and modeling causal relationships between key events and reported feelings |
WO2009079240A1 (en) * | 2007-12-18 | 2009-06-25 | Emsense Corporation | Identifying key media events and modeling causal relationships between key events and reported feelings |
US20090158308A1 (en) * | 2007-12-18 | 2009-06-18 | Daniel Weitzenfeld | Identifying key media events and modeling causal relationships between key events and reported feelings |
US20090164503A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US9418368B2 (en) | 2007-12-20 | 2016-08-16 | Invention Science Fund I, Llc | Methods and systems for determining interest in a cohort-linked avatar |
US20090164549A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for determining interest in a cohort-linked avatar |
US20090164131A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US9775554B2 (en) | 2007-12-31 | 2017-10-03 | Invention Science Fund I, Llc | Population cohort-linked avatar |
US20090172540A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Population cohort-linked avatar |
FR2933518A1 (en) * | 2008-07-03 | 2010-01-08 | Mettler Toledo Sas | TRANSACTION TERMINAL AND TRANSACTION SYSTEM COMPRISING SUCH TERMINALS CONNECTED TO A SERVER |
WO2010000986A1 (en) * | 2008-07-03 | 2010-01-07 | Mettler Toledo Sas | Transaction terminal and transaction system comprising such terminals linked to a server |
US8986218B2 (en) * | 2008-07-09 | 2015-03-24 | Imotions A/S | System and method for calibrating and normalizing eye data in emotional testing |
US20130331729A1 (en) * | 2008-07-09 | 2013-12-12 | Imotions - Eye Tracking Aps | System and method for calibrating and normalizing eye data in emotional testing |
WO2010004426A1 (en) * | 2008-07-09 | 2010-01-14 | Imotions - Emotion Technology A/S | System and method for calibrating and normalizing eye data in emotional testing |
US20100036709A1 (en) * | 2008-08-05 | 2010-02-11 | Ford Motor Company | Method and system of measuring customer satisfaction with purchased vehicle |
US9710816B2 (en) * | 2008-08-05 | 2017-07-18 | Ford Motor Company | Method and system of measuring customer satisfaction with purchased vehicle |
WO2010018459A2 (en) * | 2008-08-15 | 2010-02-18 | Imotions - Emotion Technology A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
US8136944B2 (en) * | 2008-08-15 | 2012-03-20 | iMotions - Eye Tracking A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
US8814357B2 (en) | 2008-08-15 | 2014-08-26 | Imotions A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
WO2010018459A3 (en) * | 2008-08-15 | 2010-04-08 | Imotions - Emotion Technology A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
US20100039618A1 (en) * | 2008-08-15 | 2010-02-18 | Imotions - Emotion Technology A/S | System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text |
US20110141011A1 (en) * | 2008-09-03 | 2011-06-16 | Koninklijke Philips Electronics N.V. | Method of performing a gaze-based interaction between a user and an interactive display system |
US8401248B1 (en) | 2008-12-30 | 2013-03-19 | Videomining Corporation | Method and system for measuring emotional and attentional response to dynamic digital media content |
US9826284B2 (en) | 2009-01-21 | 2017-11-21 | The Nielsen Company (Us), Llc | Methods and apparatus for providing alternate media for video decoders |
US8464288B2 (en) | 2009-01-21 | 2013-06-11 | The Nielsen Company (Us), Llc | Methods and apparatus for providing personalized media in video |
US8270814B2 (en) | 2009-01-21 | 2012-09-18 | The Nielsen Company (Us), Llc | Methods and apparatus for providing video with embedded media |
US9357240B2 (en) | 2009-01-21 | 2016-05-31 | The Nielsen Company (Us), Llc | Methods and apparatus for providing alternate media for video decoders |
US8955010B2 (en) | 2009-01-21 | 2015-02-10 | The Nielsen Company (Us), Llc | Methods and apparatus for providing personalized media in video |
US8977110B2 (en) | 2009-01-21 | 2015-03-10 | The Nielsen Company (Us), Llc | Methods and apparatus for providing video with embedded media |
US20100205541A1 (en) * | 2009-02-11 | 2010-08-12 | Jeffrey A. Rapaport | social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
WO2010093678A1 (en) * | 2009-02-11 | 2010-08-19 | Rapaport Jeffrey A | Instantly clustering people with concurrent focus on same topic into chat rooms |
US8539359B2 (en) | 2009-02-11 | 2013-09-17 | Jeffrey A. Rapaport | Social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic |
US20140236953A1 (en) * | 2009-02-11 | 2014-08-21 | Jeffrey A. Rapaport | Methods using social topical adaptive networking system |
US10691726B2 (en) * | 2009-02-11 | 2020-06-23 | Jeffrey A. Rapaport | Methods using social topical adaptive networking system |
US9295806B2 (en) | 2009-03-06 | 2016-03-29 | Imotions A/S | System and method for determining emotional response to olfactory stimuli |
WO2010100567A2 (en) * | 2009-03-06 | 2010-09-10 | Imotions- Emotion Technology A/S | System and method for determining emotional response to olfactory stimuli |
WO2010100567A3 (en) * | 2009-03-06 | 2010-10-28 | Imotions- Emotion Technology A/S | System and method for determining emotional response to olfactory stimuli |
US9886729B2 (en) | 2009-03-10 | 2018-02-06 | Gearbox, Llc | Computational systems and methods for health services planning and matching |
US9911165B2 (en) * | 2009-03-10 | 2018-03-06 | Gearbox, Llc | Computational systems and methods for health services planning and matching |
US9858540B2 (en) | 2009-03-10 | 2018-01-02 | Gearbox, Llc | Computational systems and methods for health services planning and matching |
US9892435B2 (en) | 2009-03-10 | 2018-02-13 | Gearbox Llc | Computational systems and methods for health services planning and matching |
US20100268108A1 (en) * | 2009-03-10 | 2010-10-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Computational systems and methods for health services planning and matching |
US10319471B2 (en) | 2009-03-10 | 2019-06-11 | Gearbox Llc | Computational systems and methods for health services planning and matching |
US11704681B2 (en) | 2009-03-24 | 2023-07-18 | Nielsen Consumer Llc | Neurological profiles for market matching and stimulus presentation |
US8655437B2 (en) | 2009-08-21 | 2014-02-18 | The Nielsen Company (Us), Llc | Analysis of the mirror neuron system for evaluation of stimulus |
US10987015B2 (en) | 2009-08-24 | 2021-04-27 | Nielsen Consumer Llc | Dry electrodes for electroencephalography |
US20110077996A1 (en) * | 2009-09-25 | 2011-03-31 | Hyungil Ahn | Multimodal Affective-Cognitive Product Evaluation |
US11170400B2 (en) | 2009-10-29 | 2021-11-09 | Nielsen Consumer Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US8762202B2 (en) | 2009-10-29 | 2014-06-24 | The Nielsen Company (Us), Llc | Intracluster content management using neuro-response priming data |
US9560984B2 (en) | 2009-10-29 | 2017-02-07 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US10068248B2 (en) | 2009-10-29 | 2018-09-04 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US11481788B2 (en) | 2009-10-29 | 2022-10-25 | Nielsen Consumer Llc | Generating ratings predictions using neuro-response data |
US8209224B2 (en) | 2009-10-29 | 2012-06-26 | The Nielsen Company (Us), Llc | Intracluster content management using neuro-response priming data |
US10269036B2 (en) | 2009-10-29 | 2019-04-23 | The Nielsen Company (Us), Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US11669858B2 (en) | 2009-10-29 | 2023-06-06 | Nielsen Consumer Llc | Analysis of controlled and automatic attention for introduction of stimulus material |
US20110109879A1 (en) * | 2009-11-09 | 2011-05-12 | Daphna Palti-Wasserman | Multivariate dynamic profiling system and methods |
US8335715B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Advertisement exchange using neuro-response data |
US8335716B2 (en) | 2009-11-19 | 2012-12-18 | The Nielsen Company (Us), Llc. | Multimedia advertisement exchange |
US11200964B2 (en) | 2010-04-19 | 2021-12-14 | Nielsen Consumer Llc | Short imagery task (SIT) research method |
US9454646B2 (en) | 2010-04-19 | 2016-09-27 | The Nielsen Company (Us), Llc | Short imagery task (SIT) research method |
US10248195B2 (en) | 2010-04-19 | 2019-04-02 | The Nielsen Company (Us), Llc. | Short imagery task (SIT) research method |
US20110263946A1 (en) * | 2010-04-22 | 2011-10-27 | Mit Media Lab | Method and system for real-time and offline analysis, inference, tagging of and responding to person(s) experiences |
US8655428B2 (en) | 2010-05-12 | 2014-02-18 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
US9336535B2 (en) | 2010-05-12 | 2016-05-10 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
US11935281B2 (en) | 2010-06-07 | 2024-03-19 | Affectiva, Inc. | Vehicular in-cabin facial tracking using machine learning |
US10843078B2 (en) | 2010-06-07 | 2020-11-24 | Affectiva, Inc. | Affect usage within a gaming context |
US11465640B2 (en) | 2010-06-07 | 2022-10-11 | Affectiva, Inc. | Directed control transfer for autonomous vehicles |
US10628741B2 (en) | 2010-06-07 | 2020-04-21 | Affectiva, Inc. | Multimodal machine learning for emotion metrics |
US11887352B2 (en) | 2010-06-07 | 2024-01-30 | Affectiva, Inc. | Live streaming analytics within a shared digital environment |
US10627817B2 (en) | 2010-06-07 | 2020-04-21 | Affectiva, Inc. | Vehicle manipulation using occupant image analysis |
US11700420B2 (en) | 2010-06-07 | 2023-07-11 | Affectiva, Inc. | Media manipulation using cognitive state metric analysis |
US10614289B2 (en) | 2010-06-07 | 2020-04-07 | Affectiva, Inc. | Facial tracking with classifiers |
US10204625B2 (en) | 2010-06-07 | 2019-02-12 | Affectiva, Inc. | Audio analysis learning using video data |
US11430260B2 (en) | 2010-06-07 | 2022-08-30 | Affectiva, Inc. | Electronic display viewing verification |
US11430561B2 (en) | 2010-06-07 | 2022-08-30 | Affectiva, Inc. | Remote computing analysis for cognitive state data metrics |
US11410438B2 (en) | 2010-06-07 | 2022-08-09 | Affectiva, Inc. | Image analysis using a semiconductor processor for facial evaluation in vehicles |
US11393133B2 (en) | 2010-06-07 | 2022-07-19 | Affectiva, Inc. | Emoji manipulation using machine learning |
US11318949B2 (en) | 2010-06-07 | 2022-05-03 | Affectiva, Inc. | In-vehicle drowsiness analysis using blink rate |
US10869626B2 (en) | 2010-06-07 | 2020-12-22 | Affectiva, Inc. | Image analysis for emotional metric evaluation |
US11292477B2 (en) | 2010-06-07 | 2022-04-05 | Affectiva, Inc. | Vehicle manipulation using cognitive state engineering |
US10867197B2 (en) | 2010-06-07 | 2020-12-15 | Affectiva, Inc. | Drowsiness mental state analysis using blink rate |
US9723992B2 (en) * | 2010-06-07 | 2017-08-08 | Affectiva, Inc. | Mental state analysis using blink rate |
US10897650B2 (en) | 2010-06-07 | 2021-01-19 | Affectiva, Inc. | Vehicle content recommendation using cognitive states |
US10289898B2 (en) | 2010-06-07 | 2019-05-14 | Affectiva, Inc. | Video recommendation via affect |
US11484685B2 (en) | 2010-06-07 | 2022-11-01 | Affectiva, Inc. | Robotic control using profiles |
US20140323817A1 (en) * | 2010-06-07 | 2014-10-30 | Affectiva, Inc. | Personal emotional profile generation |
US10779761B2 (en) | 2010-06-07 | 2020-09-22 | Affectiva, Inc. | Sporadic collection of affect data within a vehicle |
US20140016860A1 (en) * | 2010-06-07 | 2014-01-16 | Affectiva, Inc. | Facial analysis to detect asymmetric expressions |
US11657288B2 (en) | 2010-06-07 | 2023-05-23 | Affectiva, Inc. | Convolutional computing using multilayered analysis engine |
US11232290B2 (en) | 2010-06-07 | 2022-01-25 | Affectiva, Inc. | Image analysis using sub-sectional component evaluation to augment classifier usage |
US10401860B2 (en) | 2010-06-07 | 2019-09-03 | Affectiva, Inc. | Image analysis for two-sided data hub |
US10796176B2 (en) | 2010-06-07 | 2020-10-06 | Affectiva, Inc. | Personal emotional profile generation for vehicle manipulation |
US10592757B2 (en) | 2010-06-07 | 2020-03-17 | Affectiva, Inc. | Vehicular cognitive data collection using multiple devices |
US10799168B2 (en) | 2010-06-07 | 2020-10-13 | Affectiva, Inc. | Individual data sharing across a social network |
US10143414B2 (en) | 2010-06-07 | 2018-12-04 | Affectiva, Inc. | Sporadic collection with mobile affect data |
US11511757B2 (en) | 2010-06-07 | 2022-11-29 | Affectiva, Inc. | Vehicle manipulation with crowdsourcing |
US10111611B2 (en) * | 2010-06-07 | 2018-10-30 | Affectiva, Inc. | Personal emotional profile generation |
US9934425B2 (en) | 2010-06-07 | 2018-04-03 | Affectiva, Inc. | Collection of affect data from multiple mobile devices |
US10911829B2 (en) | 2010-06-07 | 2021-02-02 | Affectiva, Inc. | Vehicle video recommendation via affect |
US9959549B2 (en) | 2010-06-07 | 2018-05-01 | Affectiva, Inc. | Mental state analysis for norm generation |
US11587357B2 (en) | 2010-06-07 | 2023-02-21 | Affectiva, Inc. | Vehicular cognitive data collection with multiple devices |
US11151610B2 (en) | 2010-06-07 | 2021-10-19 | Affectiva, Inc. | Autonomous vehicle control using heart rate collection based on video imagery |
US11073899B2 (en) | 2010-06-07 | 2021-07-27 | Affectiva, Inc. | Multidevice multimodal emotion services monitoring |
US11067405B2 (en) | 2010-06-07 | 2021-07-20 | Affectiva, Inc. | Cognitive state vehicle navigation based on image processing |
US11056225B2 (en) | 2010-06-07 | 2021-07-06 | Affectiva, Inc. | Analytics for livestreaming based on image analysis within a shared digital environment |
US10108852B2 (en) * | 2010-06-07 | 2018-10-23 | Affectiva, Inc. | Facial analysis to detect asymmetric expressions |
US10922567B2 (en) | 2010-06-07 | 2021-02-16 | Affectiva, Inc. | Cognitive state based vehicle manipulation using near-infrared image processing |
US10573313B2 (en) | 2010-06-07 | 2020-02-25 | Affectiva, Inc. | Audio analysis learning with video data |
US20140200417A1 (en) * | 2010-06-07 | 2014-07-17 | Affectiva, Inc. | Mental state analysis using blink rate |
US10074024B2 (en) | 2010-06-07 | 2018-09-11 | Affectiva, Inc. | Mental state analysis using blink rate for vehicles |
US10517521B2 (en) | 2010-06-07 | 2019-12-31 | Affectiva, Inc. | Mental state mood analysis using heart rate collection based on video imagery |
US11017250B2 (en) | 2010-06-07 | 2021-05-25 | Affectiva, Inc. | Vehicle manipulation using convolutional image processing |
US10474875B2 (en) | 2010-06-07 | 2019-11-12 | Affectiva, Inc. | Image analysis using a semiconductor processor for facial evaluation |
US11704574B2 (en) | 2010-06-07 | 2023-07-18 | Affectiva, Inc. | Multimodal machine learning for vehicle manipulation |
US8939903B2 (en) * | 2010-06-17 | 2015-01-27 | Forethought Pty Ltd | Measurement of emotional response to sensory stimuli |
US20120035428A1 (en) * | 2010-06-17 | 2012-02-09 | Kenneth George Roberts | Measurement of emotional response to sensory stimuli |
US8392250B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Neuro-response evaluated stimulus in virtual reality environments |
US8392251B2 (en) | 2010-08-09 | 2013-03-05 | The Nielsen Company (Us), Llc | Location aware presentation of stimulus material |
US11816743B1 (en) | 2010-08-10 | 2023-11-14 | Jeffrey Alan Rapaport | Information enhancing method using software agents in a social networking system |
US8396744B2 (en) | 2010-08-25 | 2013-03-12 | The Nielsen Company (Us), Llc | Effective virtual reality environments for presentation of marketing materials |
US8548852B2 (en) | 2010-08-25 | 2013-10-01 | The Nielsen Company (Us), Llc | Effective virtual reality environments for presentation of marketing materials |
WO2012073016A1 (en) * | 2010-11-30 | 2012-06-07 | University Of Lincoln | A response detection system and associated methods |
US20120143693A1 (en) * | 2010-12-02 | 2012-06-07 | Microsoft Corporation | Targeting Advertisements Based on Emotion |
US20120259240A1 (en) * | 2011-04-08 | 2012-10-11 | Nviso Sarl | Method and System for Assessing and Measuring Emotional Intensity to a Stimulus |
US10142276B2 (en) | 2011-05-12 | 2018-11-27 | Jeffrey Alan Rapaport | Contextually-based automatic service offerings to users of machine system |
US11805091B1 (en) | 2011-05-12 | 2023-10-31 | Jeffrey Alan Rapaport | Social topical context adaptive network hosted system |
US11539657B2 (en) | 2011-05-12 | 2022-12-27 | Jeffrey Alan Rapaport | Contextually-based automatic grouped content recommendations to users of a social networking system |
US8676937B2 (en) | 2011-05-12 | 2014-03-18 | Jeffrey Alan Rapaport | Social-topical adaptive networking (STAN) system allowing for group based contextual transaction offers and acceptances and hot topic watchdogging |
WO2012174381A3 (en) * | 2011-06-17 | 2013-07-11 | Microsoft Corporation | Video highlight identification based on environmental sensing |
US20130054090A1 (en) * | 2011-08-29 | 2013-02-28 | Electronics And Telecommunications Research Institute | Emotion-based vehicle service system, emotion cognition processing apparatus, safe driving service apparatus, and emotion-based safe driving service method |
US8862317B2 (en) * | 2011-08-29 | 2014-10-14 | Electronics And Telecommunications Research Institute | Emotion-based vehicle service system, emotion cognition processing apparatus, safe driving apparatus, and emotion-based safe driving service method |
US20190102706A1 (en) * | 2011-10-20 | 2019-04-04 | Affectomatics Ltd. | Affective response based recommendations |
US9665832B2 (en) * | 2011-10-20 | 2017-05-30 | Affectomatics Ltd. | Estimating affective response to a token instance utilizing a predicted affective response to its background |
US9582769B2 (en) * | 2011-10-20 | 2017-02-28 | Affectomatics Ltd. | Estimating affective response to a token instance utilizing a window from which the token instance was removed |
US20150220843A1 (en) * | 2011-10-20 | 2015-08-06 | Gil Thieberger | Estimating affective response to a token instance utilizing a window from which the token instance was removed |
US9569734B2 (en) * | 2011-10-20 | 2017-02-14 | Affectomatics Ltd. | Utilizing eye-tracking to estimate affective response to a token instance of interest |
US9514419B2 (en) * | 2011-10-20 | 2016-12-06 | Affectomatics Ltd. | Estimating affective response to a token instance of interest utilizing a model for predicting interest in token instances |
US20150220842A1 (en) * | 2011-10-20 | 2015-08-06 | Gil Thieberger | Estimating affective response to a token instance of interest utilizing a model for predicting interest in token instances |
US20150193688A1 (en) * | 2011-10-20 | 2015-07-09 | Gil Thieberger | Estimating affective response to a token instance utilizing a predicted affective response to its background |
US9854966B2 (en) * | 2011-11-22 | 2018-01-02 | Dignity Health | System and method for using microsaccade dynamics to measure attentional response to a stimulus |
US20140336526A1 (en) * | 2011-11-22 | 2014-11-13 | Jorge Otero-Millan | System and method for using microsaccade dynamics to measure attentional response to a stimulus |
US11602273B2 (en) | 2011-11-22 | 2023-03-14 | Dignity Health | System and method for using microsaccade dynamics to measure attentional response to a stimulus |
US8869115B2 (en) | 2011-11-23 | 2014-10-21 | General Electric Company | Systems and methods for emotive software usability |
US20130137076A1 (en) * | 2011-11-30 | 2013-05-30 | Kathryn Stone Perez | Head-mounted display based education and instruction |
US9451303B2 (en) | 2012-02-27 | 2016-09-20 | The Nielsen Company (Us), Llc | Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing |
US9292858B2 (en) | 2012-02-27 | 2016-03-22 | The Nielsen Company (Us), Llc | Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments |
US10881348B2 (en) | 2012-02-27 | 2021-01-05 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications |
US9569986B2 (en) | 2012-02-27 | 2017-02-14 | The Nielsen Company (Us), Llc | System and method for gathering and analyzing biometric user feedback for use in social media and advertising applications |
CN104246660A (en) * | 2012-03-16 | 2014-12-24 | 英特尔公司 | System and method for dynamic adaption of media based on implicit user input and behavior |
US20130243270A1 (en) * | 2012-03-16 | 2013-09-19 | Gila Kamhi | System and method for dynamic adaption of media based on implicit user input and behavior |
US9907482B2 (en) | 2012-08-17 | 2018-03-06 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US10842403B2 (en) | 2012-08-17 | 2020-11-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US9215978B2 (en) | 2012-08-17 | 2015-12-22 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US10779745B2 (en) | 2012-08-17 | 2020-09-22 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US9060671B2 (en) | 2012-08-17 | 2015-06-23 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US8989835B2 (en) | 2012-08-17 | 2015-03-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
US20140149177A1 (en) * | 2012-11-23 | 2014-05-29 | Ari M. Frank | Responding to uncertainty of a user regarding an experience by presenting a prior experience |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9510752B2 (en) * | 2012-12-11 | 2016-12-06 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
CN104968270A (en) * | 2012-12-11 | 2015-10-07 | Ami Klin | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US20140192325A1 (en) * | 2012-12-11 | 2014-07-10 | Ami Klin | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US10016156B2 (en) * | 2012-12-11 | 2018-07-10 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US9861307B2 (en) | 2012-12-11 | 2018-01-09 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US20220061725A1 (en) * | 2012-12-11 | 2022-03-03 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US20150297075A1 (en) * | 2012-12-11 | 2015-10-22 | Ami Klin | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US10987043B2 (en) | 2012-12-11 | 2021-04-27 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US11759135B2 (en) * | 2012-12-11 | 2023-09-19 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
KR102181927B1 (en) | 2012-12-11 | 2020-11-24 | 아미 클린 | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
KR20150103051A (en) * | 2012-12-11 | 2015-09-09 | 아미 클린 | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US10052057B2 (en) * | 2012-12-11 | 2018-08-21 | Children's Healthcare of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US20170014050A1 (en) * | 2012-12-11 | 2017-01-19 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
WO2014116826A1 (en) * | 2013-01-24 | 2014-07-31 | The Trustees Of Columbia University In The City Of New York | Mobile, neurally-assisted personal assistant |
US10376139B2 (en) | 2013-02-28 | 2019-08-13 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
US9179833B2 (en) | 2013-02-28 | 2015-11-10 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
US9872615B2 (en) | 2013-02-28 | 2018-01-23 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
WO2014131860A1 (en) * | 2013-02-28 | 2014-09-04 | Carl Zeiss Meditec Ag | Systems and methods for improved ease and accuracy of gaze tracking |
US10228764B2 (en) | 2013-03-11 | 2019-03-12 | Immersion Corporation | Automatic haptic effect adjustment system |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US9202352B2 (en) * | 2013-03-11 | 2015-12-01 | Immersion Corporation | Automatic haptic effect adjustment system |
US20140253303A1 (en) * | 2013-03-11 | 2014-09-11 | Immersion Corporation | Automatic haptic effect adjustment system |
US9235968B2 (en) | 2013-03-14 | 2016-01-12 | Otoy, Inc. | Tactile elements for a wearable eye piece |
US9668694B2 (en) | 2013-03-14 | 2017-06-06 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US9773332B2 (en) | 2013-03-14 | 2017-09-26 | Otoy, Inc. | Visual cortex thought detector interface |
US9320450B2 (en) | 2013-03-14 | 2016-04-26 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US11076807B2 (en) | 2013-03-14 | 2021-08-03 | Nielsen Consumer Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US10635167B2 (en) * | 2013-05-30 | 2020-04-28 | Umoove Services Ltd. | Smooth pursuit gaze tracking |
US20160109945A1 (en) * | 2013-05-30 | 2016-04-21 | Umoove Services Ltd. | Smooth pursuit gaze tracking |
US20150332166A1 (en) * | 2013-09-20 | 2015-11-19 | Intel Corporation | Machine learning-based user behavior characterization |
US10617295B2 (en) | 2013-10-17 | 2020-04-14 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for assessing infant and child development via eye tracking |
US11864832B2 (en) | 2013-10-17 | 2024-01-09 | Children's Healthcare Of Atlanta, Inc. | Systems and methods for assessing infant and child development via eye tracking |
US10048748B2 (en) | 2013-11-12 | 2018-08-14 | Excalibur Ip, Llc | Audio-visual interaction with user devices |
US10275022B2 (en) | 2013-11-12 | 2019-04-30 | Excalibur Ip, Llc | Audio-visual interaction with user devices |
US11030633B2 (en) | 2013-11-18 | 2021-06-08 | Sentient Decision Science, Inc. | Systems and methods for assessing implicit associations |
US11810136B2 (en) | 2013-11-18 | 2023-11-07 | Sentient Decision Science, Inc. | Systems and methods for assessing implicit associations |
US10546310B2 (en) * | 2013-11-18 | 2020-01-28 | Sentient Decision Science, Inc. | Systems and methods for assessing implicit associations |
US9622703B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US9622702B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US11141108B2 (en) | 2014-04-03 | 2021-10-12 | Nielsen Consumer Llc | Methods and apparatus to gather and analyze electroencephalographic data |
US10354261B2 (en) * | 2014-04-16 | 2019-07-16 | 2020 Ip Llc | Systems and methods for virtual environment construction for behavioral research |
US10600066B2 (en) * | 2014-04-16 | 2020-03-24 | 20/20 Ip, Llc | Systems and methods for virtual environment construction for behavioral research |
US11344226B2 (en) | 2014-04-29 | 2022-05-31 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems and methods for non-intrusive drug impairment detection |
US10376183B2 (en) | 2014-04-29 | 2019-08-13 | Dignity Health | Systems and methods for non-intrusive drug impairment detection |
US20170119296A1 (en) * | 2014-06-11 | 2017-05-04 | Dignity Health | Systems and methods for non-intrusive deception detection |
US10743806B2 (en) * | 2014-06-11 | 2020-08-18 | Dignity Health | Systems and methods for non-intrusive deception detection |
US11759134B2 (en) | 2014-06-11 | 2023-09-19 | Arizona Board Of Regents On Behalf Of Arizona State University Dignity Health | Systems and methods for non-intrusive deception detection |
WO2015193806A1 (en) * | 2014-06-17 | 2015-12-23 | Koninklijke Philips N.V. | Evaluating clinician attention |
US10353461B2 (en) | 2014-06-17 | 2019-07-16 | Koninklijke Philips N.V. | Evaluating clinician attention |
CN106659441A (en) * | 2014-06-17 | 2017-05-10 | 皇家飞利浦有限公司 | Evaluating clinician attention |
US10748439B1 (en) * | 2014-10-13 | 2020-08-18 | The Cognitive Healthcare Company | Automated delivery of unique, equivalent task versions for computer delivered testing environments |
US10771844B2 (en) | 2015-05-19 | 2020-09-08 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual |
US9936250B2 (en) | 2015-05-19 | 2018-04-03 | The Nielsen Company (Us), Llc | Methods and apparatus to adjust content presented to an individual |
US11290779B2 (en) | 2015-05-19 | 2022-03-29 | Nielsen Consumer Llc | Methods and apparatus to adjust content presented to an individual |
US10299717B2 (en) | 2015-06-14 | 2019-05-28 | Facense Ltd. | Detecting stress based on thermal measurements of the face |
US10064559B2 (en) | 2015-06-14 | 2018-09-04 | Facense Ltd. | Identification of the dominant nostril using thermal measurements |
US10130308B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Calculating respiratory parameters from thermal measurements |
US9968264B2 (en) | 2015-06-14 | 2018-05-15 | Facense Ltd. | Detecting physiological responses based on thermal asymmetry of the face |
US10045726B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Selecting a stressor based on thermal measurements of the face |
US10045737B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Clip-on device with inward-facing cameras |
US9867546B2 (en) | 2015-06-14 | 2018-01-16 | Facense Ltd. | Wearable device for taking symmetric thermal measurements |
US10216981B2 (en) | 2015-06-14 | 2019-02-26 | Facense Ltd. | Eyeglasses that measure facial skin color changes |
US10076270B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses while accounting for touching the face |
US10076250B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses based on multispectral data from head-mounted cameras |
US10045699B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Determining a state of a user based on thermal measurements of the forehead |
US10080861B2 (en) | 2015-06-14 | 2018-09-25 | Facense Ltd. | Breathing biofeedback eyeglasses |
US10165949B2 (en) | 2015-06-14 | 2019-01-01 | Facense Ltd. | Estimating posture using head-mounted cameras |
US10376153B2 (en) | 2015-06-14 | 2019-08-13 | Facense Ltd. | Head mounted system to collect facial expressions |
US10159411B2 (en) | 2015-06-14 | 2018-12-25 | Facense Ltd. | Detecting irregular physiological responses during exposure to sensitive data |
US10154810B2 (en) | 2015-06-14 | 2018-12-18 | Facense Ltd. | Security system that detects atypical behavior |
US10085685B2 (en) | 2015-06-14 | 2018-10-02 | Facense Ltd. | Selecting triggers of an allergic reaction based on nasal temperatures |
US10092232B2 (en) | 2015-06-14 | 2018-10-09 | Facense Ltd. | User state selection based on the shape of the exhale stream |
US10151636B2 (en) | 2015-06-14 | 2018-12-11 | Facense Ltd. | Eyeglasses having inward-facing and outward-facing thermal cameras |
US10136852B2 (en) | 2015-06-14 | 2018-11-27 | Facense Ltd. | Detecting an allergic reaction from nasal temperatures |
US10523852B2 (en) | 2015-06-14 | 2019-12-31 | Facense Ltd. | Wearable inward-facing camera utilizing the Scheimpflug principle |
US10130261B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Detecting physiological responses while taking into account consumption of confounding substances |
US10130299B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Neurofeedback eyeglasses |
US11061233B2 (en) | 2015-06-30 | 2021-07-13 | 3M Innovative Properties Company | Polarizing beam splitter and illuminator including same |
US11693243B2 (en) | 2015-06-30 | 2023-07-04 | 3M Innovative Properties Company | Polarizing beam splitting system |
US10514553B2 (en) | 2015-06-30 | 2019-12-24 | 3M Innovative Properties Company | Polarizing beam splitting system |
US10113913B2 (en) | 2015-10-03 | 2018-10-30 | Facense Ltd. | Systems for collecting thermal measurements of the face |
US20170103424A1 (en) * | 2015-10-13 | 2017-04-13 | Mastercard International Incorporated | Systems and methods for generating mood-based advertisements based on consumer diagnostic measurements |
US20200379560A1 (en) * | 2016-01-21 | 2020-12-03 | Microsoft Technology Licensing, Llc | Implicitly adaptive eye-tracking user interface |
US10136856B2 (en) | 2016-06-27 | 2018-11-27 | Facense Ltd. | Wearable respiration measurements system |
US10482333B1 (en) | 2017-01-04 | 2019-11-19 | Affectiva, Inc. | Mental state analysis using blink rate within vehicles |
US11412287B2 (en) | 2017-01-19 | 2022-08-09 | International Business Machines Corporation | Cognitive display control |
US10602214B2 (en) | 2017-01-19 | 2020-03-24 | International Business Machines Corporation | Cognitive television remote control |
US11640821B2 (en) * | 2017-01-25 | 2023-05-02 | International Business Machines Corporation | Conflict resolution enhancement system |
US20190244617A1 (en) * | 2017-01-25 | 2019-08-08 | International Business Machines Corporation | Conflict Resolution Enhancement System |
US10922566B2 (en) | 2017-05-09 | 2021-02-16 | Affectiva, Inc. | Cognitive state evaluation for vehicle navigation |
US11601715B2 (en) | 2017-07-06 | 2023-03-07 | DISH Technologies L.L.C. | System and method for dynamically adjusting content playback based on viewer emotions |
US11343596B2 (en) * | 2017-09-29 | 2022-05-24 | Warner Bros. Entertainment Inc. | Digitally representing user engagement with directed content based on biometric sensor data |
US10171877B1 (en) | 2017-10-30 | 2019-01-01 | Dish Network L.L.C. | System and method for dynamically selecting supplemental content based on viewer emotions |
US10616650B2 (en) | 2017-10-30 | 2020-04-07 | Dish Network L.L.C. | System and method for dynamically selecting supplemental content based on viewer environment |
US11350168B2 (en) | 2017-10-30 | 2022-05-31 | Dish Network L.L.C. | System and method for dynamically selecting supplemental content based on viewer environment |
US10628985B2 (en) | 2017-12-01 | 2020-04-21 | Affectiva, Inc. | Avatar image animation using translation vectors |
US10645464B2 (en) | 2017-12-20 | 2020-05-05 | Dish Network L.L.C. | Eyes free entertainment |
US10225621B1 (en) | 2017-12-20 | 2019-03-05 | Dish Network L.L.C. | Eyes free entertainment |
TWI691941B (en) * | 2018-02-13 | 2020-04-21 | 林俊毅 | Detection system and method for logical thinking ability |
US11048921B2 (en) * | 2018-05-09 | 2021-06-29 | Nviso Sa | Image processing system for extracting a behavioral profile from images of an individual specific to an event |
CN110464365A (en) * | 2018-05-10 | 2019-11-19 | 深圳先进技术研究院 | A kind of attention rate determines method, apparatus, equipment and storage medium |
US11823055B2 (en) | 2019-03-31 | 2023-11-21 | Affectiva, Inc. | Vehicular in-cabin sensing using machine learning |
US11887383B2 (en) | 2019-03-31 | 2024-01-30 | Affectiva, Inc. | Vehicle interior object management |
CN110432915A (en) * | 2019-08-02 | 2019-11-12 | 秒针信息技术有限公司 | A kind of method and device for assessing information flow intention |
US11769056B2 (en) | 2019-12-30 | 2023-09-26 | Affectiva, Inc. | Synthetic data for neural network training using vectors |
WO2021217188A1 (en) * | 2020-04-23 | 2021-10-28 | Ahmad Hassan Abu Elreich | Methods, systems, apparatuses, and devices for facilitating a driver to advertise products to passengers |
WO2022079305A1 (en) * | 2020-10-15 | 2022-04-21 | Bioserenity | System and method for remotely controlled therapy |
CN112535479A (en) * | 2020-12-04 | 2021-03-23 | 中国科学院深圳先进技术研究院 | Method for determining emotional processing tendency and related product |
WO2023037348A1 (en) * | 2021-09-13 | 2023-03-16 | Benjamin Simon Thompson | System and method for monitoring human-device interactions |
Also Published As
Publication number | Publication date |
---|---|
EP2007271A2 (en) | 2008-12-31 |
JP2009530071A (en) | 2009-08-27 |
WO2008129356A3 (en) | 2009-02-05 |
WO2008129356A2 (en) | 2008-10-30 |
CA2639125A1 (en) | 2007-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20070265507A1 (en) | Visual attention and emotional response detection and display system | |
US20230058925A1 (en) | System and method for providing and aggregating biosignals and action data | |
CN108078573B (en) | Interest orientation value testing method based on physiological response information and stimulation information | |
KR101930566B1 (en) | Systems and methods to assess cognitive function | |
US9454646B2 (en) | Short imagery task (SIT) research method | |
KR102181927B1 (en) | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience | |
US9560984B2 (en) | Analysis of controlled and automatic attention for introduction of stimulus material | |
US20120284332A1 (en) | Systems and methods for formatting a presentation in webpage based on neuro-response data | |
US20070066916A1 (en) | System and method for determining human emotion by analyzing eye properties | |
JP2012511397A (en) | Brain pattern analyzer using neural response data | |
CA2758272A1 (en) | Method and system for measuring user experience for interactive activities | |
WO2010004429A1 (en) | Self-contained data collection system for emotional response testing | |
Yang et al. | Affective image classification based on user eye movement and EEG experience information | |
CN114209324A (en) | Psychological assessment data acquisition method based on image visual cognition and VR system | |
Falkowska et al. | Eye tracking usability testing enhanced with EEG analysis | |
Ward | An analysis of facial movement tracking in ordinary human–computer interaction | |
Chiu et al. | Redesigning the user interface of a healthcare management system for the elderly with a systematic usability testing method | |
JP6739805B2 (en) | Information processing device, program | |
Shipley | Setting our sights on vision: A rationale and research agenda for integrating eye-tracking into leisure research | |
Syed et al. | An Exploratory Study to Understand the Phenomena of Eye-Tracking Technology: A Case of the Education Environment | |
CN114052736B (en) | System and method for evaluating cognitive function | |
De Bruin | Automated usability analysis and visualisation of eye tracking data | |
Cavalcanti et al. | Incorporating Eye Tracking into an EEG-Based Brainwave Visualization System | |
Mateja | Usability research of an online store using eye tracking: a comparison of product specification formats | |
Lyu et al. | Emotiv Epoc+ live metrics quality validation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: IMOTIONS EMOTION TECHNOLOGY A/S, DENMARK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DE LEMOS, JAKOB;REEL/FRAME:019516/0483 Effective date: 20070626 |
|
AS | Assignment |
Owner name: IMOTIONS EMOTION TECHNOLOGY A/S, DENMARK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEMOS, JAKOB DE;REEL/FRAME:019652/0259 Effective date: 20070626 |
|
AS | Assignment |
Owner name: IMOTIONS EMOTION TECHNOLOGY A/S, DENMARK Free format text: CORRECTIVE ASSIGNMENT TO CORRECT ASSIGNEE'S ZIP CODE. DOCUMENT PREVIOUSLY RECORDED AT REEL 019516, FRAME 0483;ASSIGNOR:DE LEMOS, JAKOB;REEL/FRAME:019836/0728 Effective date: 20070626 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |