US20150324568A1 - Systems and methods for using eye signals with secure mobile communications - Google Patents
- Publication number: US20150324568A1 (application US 14/708,229)
- Authority: US (United States)
- Prior art keywords: eye, data, user, state, processors
- Legal status: Abandoned (the legal status listed is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06F3/013—Eye tracking input arrangements
- G06F3/012—Head tracking input arrangements
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/04817—Interaction techniques based on graphical user interfaces (GUI) using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F21/316—User authentication by observing the pattern of computer usage, e.g. typical user behaviour
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06F21/64—Protecting data integrity, e.g. using checksums, certificates or signatures
- G02B27/0093—Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B27/017—Head-up displays, head mounted; G02B27/0172—characterised by optical features
- G06T19/006—Mixed reality
- G06V40/18—Eye characteristics, e.g. of the iris; G06V40/19—Sensors therefor; G06V40/193—Preprocessing, feature extraction; G06V40/197—Matching, classification
- H04L63/0861—Network security authentication of entities using biometrical features, e.g. fingerprint, retina-scan
- H04N23/80—Camera processing pipelines; H04N23/90—Arrangement of cameras or camera modules
- H04N5/44504—Circuit details of the additional information generator, e.g. overlay mixing circuits
- H04W12/06—Authentication; H04W12/065—Continuous authentication
- H04W12/33—Security of mobile devices using wearable devices, e.g. a smartwatch or smart-glasses
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera; G02B2027/014—comprising information/image processing systems; G02B2027/0178—Eyeglass type
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters
Definitions
- the present invention relates to portable or wearable biometric-based user identification and authentication for secure distributed and interactive systems and services.
- non-repudiation involves associating actions or changes with a unique individual.
- for example, it may be desirable to implement a key-card access system.
- non-repudiation would be violated if it were not also a strictly enforced policy to prohibit sharing of the key cards and to immediately report lost or stolen cards; otherwise, who performed the action of opening the door cannot be trivially determined.
- similarly, the individual owner of an account must not allow others to use that account, for instance by giving away the account's password, and a policy should be implemented to enforce this. This prevents the owner of the account from denying actions performed by the account.
- the ecosystem is dynamic and rapidly changing, where wireless capability is growing exponentially.
- Cloud based architectures are becoming more appealing and attainable at manageable costs.
- the place to start re-thinking is with transitional and end-state architectures in mind, and with the placement of what is called a "data abstraction layer."
- this abstraction layer distinguishes between data on the move and data at rest, and includes considerations for data generation, data storage, data processing, and the role of the server and browser in the cloud.
- a first transitional step on the way to the Internet of Things is the emergence of fog computing or fog networking.
- this is an architecture that uses one smart device, a collaborative multitude of smart devices, or near-user edge devices to carry out a substantial amount of processing and storage (rather than storing data primarily in cloud data centers), communication (rather than routing it over the Internet backbone), and control, configuration, measurement, and management (rather than relying primarily on network gateways such as those in 4G Long Term Evolution (LTE) networks).
- Fog Networking consists of a control plane and a data plane.
- fog computing enables computing services to reside at the edge of the network as opposed to servers in a data center.
- fog computing emphasizes proximity to end-users and client objectives, resulting in superior user-experience and redundancy in case of failure.
- Fog Networking supports the IoT, in which most of the devices that are used on a daily basis will be connected to each other. Examples include phones, wearable health monitoring devices, connected vehicles, and augmented reality using devices such as Google Glass.
- the ultimate goal of the IoT is to realize connections between objects, objects and persons, all things, and networks for the secure identification, management, and control of data.
- wearable display devices will challenge traditional human-machine interaction with computers.
- Today, computer mice, joysticks, and other manual tracking devices are ubiquitous tools for specifying positional information during human-machine interactions (HMIs).
- in wearable computing, such bulky and obtrusive devices, which generally require stationary surfaces for proper operation, are incompatible with the portable nature of apparatuses designed to be worn on the body.
- Wearable display devices include virtual reality (“VR”) displays such as those manufactured by Sony, Samsung, Oculus, Carl Zeiss; head mounted displays (“HMDs”) such as those produced by Google (e.g., Glass®) and Vuzix; augmented reality (“AR”) displays such as those manufactured by Microsoft, Vuzix, and DigiLens; and similar devices. Eye tracking can be used to view such displays and to specify positional information. However, the eyes are also used extensively during normal human activities.
- data collected from the face, eye(s), or voice constitute unique biometric data of a user or, if desired, of user groups. These collected data can be used to generate a unique private key in a system of public-key and private-key cryptography.
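The idea of turning biometric data into key material can be sketched as follows. This is a minimal illustration, not the patent's method: it assumes the noisy biometric reading has already been stabilized into a repeatable template (real systems typically use a fuzzy extractor or error-correcting code for that step), and the template and salt values below are hypothetical.

```python
import hashlib

def derive_private_key_seed(biometric_template: bytes, salt: bytes) -> bytes:
    """Derive a 256-bit key seed from a stabilized biometric template.

    PBKDF2 stretches the template so the resulting seed is not trivially
    invertible back to the biometric data.
    """
    return hashlib.pbkdf2_hmac("sha256", biometric_template, salt, 100_000)

# Hypothetical, already-stabilized iris code for illustration only.
template = bytes.fromhex("a3f1" * 16)
salt = b"device-serial-0001"  # hypothetical per-device salt

seed = derive_private_key_seed(template, salt)
assert len(seed) == 32
# The same user (same template) always yields the same seed...
assert seed == derive_private_key_seed(template, salt)
# ...while a different template yields a different seed.
assert seed != derive_private_key_seed(bytes.fromhex("b4e2" * 16), salt)
```

The seed could then feed a standard key-generation routine; the essential property is that only a matching biometric reading reproduces it.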
- Cryptographic systems have been widely used for information protection, authentication, and access control for many years and as such are well known in the art of information security.
- An additional component for the continuous exchange of secure information with a biometrically identified individual is the encryption of all transmitted (i.e., sent or received) data.
- Data encryption has a long history that pre-dates the electronic computer. A number of well-established methods have been developed to protect the confidentiality, integrity, and authenticity of data. Most encryption techniques make use of one or more secret keys or security codes that can be used to encrypt and/or decipher data streams. Keys used to encode or decipher data streams can originate from a number of sources including previously transmitted data sequences, identification codes embedded during the manufacture of a device, and usage counts.
- Encryption and deciphering methods that make use of transposition, substitution, repositioning, masking, translation tables, and/or pre-defined numeric sequences are well-known in the art. More sophisticated techniques utilize multiple methods applied to larger blocks (i.e., more than a single character or byte) of information.
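The classical substitution and transposition techniques mentioned above can be illustrated with a deliberately toy cipher. This sketch is for illustration only (it is in no way secure) and all names in it are hypothetical:

```python
def substitute(data: bytes, key: int) -> bytes:
    # Byte-wise substitution: XOR each byte with a fixed key byte.
    return bytes(b ^ key for b in data)

def transpose(data: bytes, cols: int) -> bytes:
    # Columnar transposition: write row-wise, read column-wise.
    padded = data + b"\x00" * (-len(data) % cols)
    rows = [padded[i:i + cols] for i in range(0, len(padded), cols)]
    return bytes(rows[r][c] for c in range(cols) for r in range(len(rows)))

def untranspose(data: bytes, cols: int) -> bytes:
    # Inverse of transpose(): reassemble rows from the stored columns.
    rows_n = len(data) // cols
    columns = [data[c * rows_n:(c + 1) * rows_n] for c in range(cols)]
    return bytes(columns[c][r] for r in range(rows_n) for c in range(cols))

msg = b"EYESIGNAL"
ciphertext = transpose(substitute(msg, 0x5A), 3)
plaintext = substitute(untranspose(ciphertext, 3), 0x5A)
assert ciphertext != msg
assert plaintext == msg
```

More sophisticated techniques, as noted above, apply multiple such methods to larger blocks of data rather than single bytes.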
- encryption and deciphering methods that include a processing step within a protected hardware component, such as a field-programmable gate array (FPGA) or application-specific integrated circuit (ASIC), are generally better protected against decoding attempts than those implemented using software stored on some form of memory device.
- Streicher et al. (U.S. Pat. No. 8,363,833) and others describe processes whereby even the bit stream used to program an FPGA for encryption is itself encrypted. Concealing both security keys and the methods used to decipher secure information greatly reduces the risk of anyone other than the intended recipient gaining meaningful access to an encrypted data stream.
- Bluetooth had its origins in 1998 with the release of the 1.0 specification, with a subsequent release in 2000 of what was called 1.0b. These early releases were designed to remove wires from the desktop of a user; these included considerations for serial, headset, cordless phone, and LAN connections. However, these early versions had many problems and manufacturers had difficulty making their products interoperable. Subsequent releases of Bluetooth 1.1, 1.2, and 2.0 included expanded bandwidth, profile capability, and finally, in release 2.1, new levels of security, including what is now called Secure Simple Pairing (SSP).
- SSP allows two devices to establish a link key based on a Diffie-Hellman key agreement and supports four methods to authenticate the key agreement.
- One of these methods, the Passkey Entry method, uses a PIN (i.e., personal identification number) entered on one or both devices.
- The Passkey Entry method has been shown to leak this PIN to any attacker eavesdropping on the first part of the pairing process. If the attacker can prevent the pairing process from completing successfully and the user uses the same PIN twice (or a fixed PIN is used), the attacker can mount a man-in-the-middle attack on a new run of the pairing process.
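- The Diffie-Hellman agreement underlying SSP can be sketched as follows. Real SSP uses elliptic-curve Diffie-Hellman (P-192, or P-256 in later Secure Connections); the modular-arithmetic form and toy prime below are illustrative only. Without an authentication step (e.g., the Passkey Entry PIN), nothing binds the exchanged public values to the intended devices, which is why a man-in-the-middle attack becomes possible:

```python
import secrets

# Public parameters (toy size; real SSP uses standardized elliptic curves).
p = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a prime
g = 5                    # public generator

a = secrets.randbelow(p - 2) + 1   # device A's private value
b = secrets.randbelow(p - 2) + 1   # device B's private value

A = pow(g, a, p)   # A's public value, sent in the clear
B = pow(g, b, p)   # B's public value, sent in the clear

shared_A = pow(B, a, p)   # computed by device A
shared_B = pow(A, b, p)   # computed by device B

# Both sides derive the same secret, from which the link key is built.
assert shared_A == shared_B
```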
- Nonrepudiation generally means an assurance that someone cannot deny something. In the context of device use and communications from a device, it ensures that a qualified user cannot later deny the authenticity of his or her signature on any use, communications, or messages that the user originates.
- Voluntary eye movements that are intended to interact with a computing device are referred to herein as “eye signals.”
- Eye signal controls are described in Systems and Methods for Biomechanically-based Eye Signals for Interacting with Real and Virtual Objects [Attorney Docket No. EYE-023], application Ser. No. 14/______, filed May 8, 2015, the entire disclosure of which is expressly incorporated by reference herein.
- Apparatus, systems, and methods are provided for a head-mounted device (HMD) that includes at least one processor connected to at least one imager, where at least one of the imagers is oriented toward the eye(s) of a user.
- The processor is configured to at least one of substantially continuously, simultaneously, and/or periodically determine eye signal estimation, where the eye signal estimation is determined by the processor using the imager to detect at least one glint from the surface of the eye that has been generated from a light source attached to the HMD or where the imager detects one or more distinctive features of the eye; and determine biometric data of a user including facial features, voice, or iris data of a user, where the biometric data is used for the identification and authentication of the user for access and control of at least one of the HMD, a connected device, a wireless device, and a remote server.
- Apparatus, systems, and methods may substantially continuously, periodically, and/or on demand perform iris recognition utilizing a head-mounted device.
- Biometric identification during the formation of eye-signal controls may be used within a wide range of applications in which user identification and/or authentication are required in real time.
- Systems and methods are disclosed in which eye-signal control sequences are used for authentication of a user for at least one of withdrawing money from an automated teller machine (ATM) and making online purchases.
- Another embodiment discloses systems and methods to authenticate a user for online activities including at least one of: private, group, or other testing; complying with performance requirements coupled with identity for various forms of employment such as professional driving, piloting, or other transportation; logging hours; confirming acknowledgement of informed consent provided orally or read by a user, with continuous confirmation of identity during saccadic and other eye-based movements while reading; and confirming acknowledgement of any legally binding agreement.
- Another embodiment discloses systems and methods for combining identifying characteristics with other security tokens including at least one of information tokens (passwords), physical tokens (keys), produced tokens (speech, gestures, writing), and other biometric tokens such as fingerprint and voiceprint.
- Another embodiment discloses systems and methods that describe a plurality of system configurations, including:
- Another embodiment discloses systems and methods for capturing an image of the iris, or a locally generated irisCode (e.g., as described in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015) from the iris image, and transmitting the iris information to the cloud for authentication of any HMD.
- Another embodiment discloses systems and methods that replace or augment common password-based access to computing devices.
- Another embodiment discloses systems and methods to use a specific implementation of the continuous biometric identification (CBID) approach (e.g., as described in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015) to “buy at the aisle” by using eye-signal methods or processes referred to as “look to buy.”
- Another embodiment discloses systems and methods for displaying (on an HMD or remote display device) information, including cost, about the item.
- Another embodiment discloses systems and methods for object recognition used to identify items for purchase that are simply viewed within the environment of the user.
- Another embodiment discloses systems and methods that establish a true identity of a user wearing the HMD.
- Another embodiment discloses systems and methods that prevent user identity fraud and identity theft.
- Another embodiment discloses systems and methods that use the HMD to authenticate users for at least one of educational or legal purposes.
- Another embodiment discloses systems and methods that use the HMD for the purpose of authenticating a purchase, where the authenticated purchase is for online purchase security and offline purchase security, where offline includes at a retail establishment.
- Another embodiment discloses systems and methods that use an HMD including a second imager connected to the processor and oriented outward from the HMD, where the second imager detects a code that can be decoded by the processor, where the code is one of a bar code and a QR (i.e., quick response) code, and where the decoded data represents information about a product.
- Another embodiment discloses systems and methods that use the information related to a product to allow an authenticated user to securely purchase the product.
- Another embodiment discloses systems and methods that allow an entity to initiate a secure communication channel with another entity by mutual gaze where the security of the communication channel may be established prior to communication and may be revalidated continuously or at intervals during communication.
- Another embodiment discloses systems and methods that enable a secure protocol for coordination among parties to cause an action to occur whereupon each party performs some action during which time their identities are substantially continuously verified with CBID.
- Another embodiment discloses systems and methods that increase security when using a HMD device by limiting access to functional blocks in a silicon chip that supports eye-tracking for the HMD device.
- Another embodiment discloses systems and methods that manage, coordinate, filter, and/or sequence the stimulus provided by one or more wearable devices associated with the identity of a user.
- Systems and methods include a dynamically evolving cognitive architecture for a system based on interpreting the gaze-based intent of a user. Natural eye movement is interpreted by the system, and used for real-time image services.
- An illumination device comprising a processor and a camera is worn by the user, with the camera aimed toward an eye of the user.
- The system includes memory with stored instructions. When the instructions are executed, the system receives eye measurement data from the camera aimed at the user's eye. The data is used to determine a first state of the eye and compare it to data captured from a second state of the eye. When the system determines that the first and second states of the eye are the same, further instructions are sent to at least one processor in the system.
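- The stored-instruction flow described above (receive eye measurement data, determine a first eye state, compare it with a second) can be sketched as follows; the choice of feature vector and the tolerance threshold are hypothetical, not values from the disclosure:

```python
import math

def eye_state_distance(state_a, state_b):
    """Euclidean distance between two eye-measurement vectors
    (e.g., gaze x/y and pupil diameter). Feature choice is illustrative."""
    return math.dist(state_a, state_b)

def states_match(first, second, tolerance=0.05):
    """Deem the first and second eye states 'the same' when their
    measurements agree to within a tolerance (hypothetical threshold)."""
    return eye_state_distance(first, second) <= tolerance

# Made-up measurement data: (gaze_x, gaze_y, pupil_diameter_mm)
first_state = (0.21, 0.47, 3.9)
second_state = (0.22, 0.46, 3.9)

if states_match(first_state, second_state):
    pass  # here, forward further instructions to a processor in the system
```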
- Applicant(s) herein expressly incorporate(s) by reference all of the following materials identified in each numbered paragraph below.
- The incorporated materials are not necessarily “prior art,” and Applicant(s) expressly reserve(s) the right to swear behind any of the incorporated materials.
- If a noun, term, or phrase is intended to be further characterized, specified, or narrowed in some way, then such noun, term, or phrase will expressly include additional adjectives, descriptive terms, or other modifiers in accordance with the normal precepts of English grammar. Absent the use of such adjectives, descriptive terms, or modifiers, it is the intent that such nouns, terms, or phrases be given their plain and ordinary English meaning to those skilled in the applicable arts as set forth above.
- FIG. 1 is a front view of a human eye.
- FIG. 2 is a section view of a human eye from the side.
- FIG. 3A depicts a top down view of an eye showing the regions of vision.
- FIG. 3B shows an example approximation of the sizes of the regions of vision.
- FIG. 4 depicts overall system architecture.
- FIG. 5 depicts architecture of the eye signal object.
- FIG. 6 depicts abstracted hardware.
- FIG. 7 depicts typical Bluetooth architecture.
- FIG. 8 depicts hardware interface and hardware components.
- FIG. 9 depicts imaging architecture.
- FIG. 10 depicts biometric data generation.
- FIG. 11 depicts a breakdown of a cognitive load manager.
- FIG. 12 depicts system components of the cognitive load manager.
- FIG. 13 depicts a HMD (head mounted device) connecting to a mobile device.
- FIG. 14 depicts a HMD connecting to the cloud.
- FIG. 15 depicts a HMD connecting to home and vehicle controls.
- FIG. 16 depicts communication between a HMD and a NEST™ thermostat system.
- FIG. 17 shows system architecture on a HMD communicating with the cloud.
- FIG. 18 depicts a breakdown of the data manager.
- FIG. 19 shows the system architecture of a HMD with the processing capabilities moved to the cloud.
- FIG. 20 shows further evolution of a HMD towards the Internet of Things.
- FIG. 21 depicts the system architecture from a HMD moved to a remote server.
- FIG. 22 depicts a HMD with all processing pulled off to the cloud.
- FIG. 23 depicts a HMD and the remote server communicating.
- FIG. 24 depicts a HMD communicating with home control systems.
- FIG. 25 depicts a HMD communicating with social media.
- FIG. 26 depicts a HMD communicating with home entertainment systems.
- FIG. 27 depicts a HMD communicating with vehicle entertainment systems.
- FIG. 28 depicts a HMD communicating with vehicle control systems.
- FIG. 29 is a flow chart of steps taken to perform an online, secure purchase.
- FIGS. 1 and 2 generally depict the anatomy of the human eye 100.
- FIG. 1 is a front view of the eye 100 showing the pupil 145, iris 115, sclera 150, limbus 245, pupil/iris boundary 250, upper eyelid 105, lower eyelid 110, and eyelashes 235.
- FIG. 2 is a section view of the eye 100 showing the pupil 145, iris 115, retina 52, sclera 150, fovea 160, lens 165, and cornea 170.
- The pupil 145 is the approximately round dark portion at the center of the eye that expands and contracts to regulate the light the retina 52 receives.
- The iris 115 is the colored portion of the eye 100 that surrounds and controls the expansion and contraction of the pupil 145.
- The sclera 150 is the white region of the eye 100 that surrounds the iris 115.
- The sclera 150 contains blood vessels and other identifiable markers.
- The limbus 245 is the outer edge of the iris 115 next to the sclera 150.
- The pupil/iris boundary 250 is where the pupil 145 and the iris 115 meet.
- The eyelids 105, 110 and the eyelashes 235 surround and occasionally partially cover or obscure portions of the eye 100 during blinks, eye closures, or different angles of viewing.
- The retina 52 is the sensory membrane that lines the eye 100, receiving images from the lens 165 and converting them into signals for the brain.
- The fovea 160 is an indentation in the retina 52 that contains only cones (no rods) and provides particularly acute vision.
- The lens 165 is the nearly spherical body of the eye 100 behind the cornea 170 that focuses light onto the retina 52.
- The cornea 170 is the clear part of the eye covering the iris 115, pupil 145, and the lens 165.
- FIGS. 3A and 3B depict the foveal, parafoveal, and peripheral ranges of vision.
- The foveal region 190 extends about two degrees outward from a user's gaze point. An approximation of this region is a US penny held at an adult's arm's length.
- The parafoveal region 195 is the viewable area outside the foveal region 190, generally from two to ten degrees from a user's gaze point. An approximation of the ten-degree parafoveal visual field is a circle with a four-inch diameter held at an adult's arm's length.
- The peripheral region 197 is outside of the parafoveal region 195 and extends generally from ten to thirty degrees out.
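- The angular extents above translate to physical sizes by simple trigonometry; the arm's length of roughly 0.7 m assumed below is illustrative:

```python
import math

def extent_at_distance(angle_deg: float, distance_m: float) -> float:
    """Linear size subtending a given visual angle at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(angle_deg) / 2.0)

ARM_LENGTH_M = 0.7  # assumed adult arm's length (illustrative)

foveal = extent_at_distance(2.0, ARM_LENGTH_M)        # ~2.4 cm, near penny-sized
parafoveal = extent_at_distance(10.0, ARM_LENGTH_M)   # ~12 cm, a circle a few inches across
```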
- FIG. 4 depicts an overall system architecture, including: a processor 1035 with non-volatile flash memory 1050 and D-RAM memory 1040; a hardware abstraction layer (HAL) 1030 with physical connections 1235 to external hardware; an operating system (OS) 1025; and software and/or firmware 1000 that handles the middleware services for the HMD. This middleware operates as a Visual Disambiguation Service (VDS) interface, termed IRIS (Interface for Real-time Image Services), and is operable as a software control object.
- Above the middleware services is a software layer 1015 containing software to facilitate the integration of the IRIS object with third-party applications; also above the middleware services is a set of software tools 1020 used for third-party hardware integration and debug, including operations like single stepping and break-pointing through the Joint Test Action Group (JTAG) interface, supported by the IEEE 1149.1 Standard Test Access Port and Boundary-Scan Architecture.
- Above the software tools and integration layer is an application programming interface (API) 1010, followed by applications 1005.
- The system includes public 1635 and private key generators 1630 for added security.
- FIG. 5 depicts the overall system architecture including software blocks identified as a power manager 1140 and device manager 1120 .
- Power management schemes are derived from one or more open standards such as the Advanced Configuration and Power Interface (ACPI).
- The ACPI has three main components: the ACPI tables, the ACPI BIOS, and the ACPI registers. Unlike its predecessors, such as the APM or PnP BIOS, the ACPI implements little of its functionality in the ACPI BIOS code, whose main role is to load the ACPI tables into system memory. Instead, most of the firmware ACPI functionality is provided in ACPI Machine Language (AML) bytecode stored in the ACPI tables. To make use of these tables, the operating system must have an interpreter for the AML bytecode. A reference AML interpreter implementation is provided by the ACPI Component Architecture (ACPICA). At BIOS development time, AML code is compiled from ASL (ACPI Source Language) code. To date, the most recent release of the ACPI standard was in 2011.
- Systems of the future may be implemented without an operating system and its robust support.
- In such systems, the power management scheme and ACPI elements discussed above will need to be pulled up to the application control level, giving the application and user dynamic control of the power scheme.
- Alternatively, the ACPI might be implemented in a split or distributed fashion. The current standard does not fully anticipate the challenges of wearable computing devices like the HMD disclosed in this specification; therefore, additional considerations for operating HMD systems in multiple modes are disclosed.
- An HMD may include a low-power mode of operation that may be deployed during times when no eyes are detected. This typically occurs when the user removes the headwear or when the headwear has shifted out of place on a user's head. This functionality could be implemented in silicon as a system on a chip (SOC).
- The device can be re-mounted by the original user or worn by a new user.
- For purposes of device calibration (e.g., to account for anatomical variations among individuals) and/or user authentication, it is desirable for the device to be capable of determining the identity of registered users when re-mounted or re-deployed. This can include loading a new set of configuration/calibration parameters and differentiating identities between the previous and new user, including halting, pausing, and/or concealing the outputs of any ongoing programs launched by the previous user.
- Operating System Power Management (OSPM) interacts with hardware drivers 1197, each of which in turn impacts the system, device, and processor states; these are managed globally as Power States, which include Global States, Device States, Processor States, and Performance States.
- Power consumption is an omnipresent concern, particularly if the device is not worn for an extended period.
- A commonly deployed solution to this issue is an “off” switch that completely powers down an electronic device.
- However, the time and inconvenience of “powering up” a headset device is restrictive, particularly if, for example, the device has only been removed from the head momentarily.
- Low-power HMD and eye-signal control designs anticipate these issues by using at least one technique comprising:
- This specific dedicated hardware can utilize modern methods of “hybrid” chip manufacturing that can segment a portion of circuitry to operate in an extremely low power mode.
- This hybrid circuitry effectively builds a “firewall,” preventing an unauthorized user from fully powering up or utilizing a device.
- Low-power HMD modes are also entered when a low-battery state is sensed. Instead of running the device until all global functions cease, a “graceful degradation” model is implemented as part of the new class of Power States for HMDs. “Graceful degradation” can include algorithmic approaches that limit the use of more power-hungry (i.e., generally more sophisticated) image processing and other routines, as well as any number of the hybrid and hardware approaches, discussed above, that reduce power while maintaining at least partial functionality. Low-power modes for the processor and critical operations continue until the battery finally runs out of power, the unit is plugged into a central power source, or the device is placed sufficiently close to an inductive charging station.
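- A graceful-degradation policy of this kind can be sketched as a battery-thresholded mode table; the thresholds, mode names, and retained capabilities below are hypothetical, not values from the disclosure:

```python
# Shed the most power-hungry processing first rather than ceasing all
# functions at once. Thresholds are fractions of remaining battery.
POLICY = [
    (0.30, "full", {"iris_id", "gaze_tracking", "display", "radio"}),
    (0.15, "reduced", {"gaze_tracking", "display"}),  # drop heavy image routines
    (0.05, "minimal", {"gaze_tracking"}),             # low frame rate, dim LEDs
    (0.00, "critical", set()),                        # processor housekeeping only
]

def select_mode(battery_fraction: float):
    """Return the (mode, capabilities) pair for the current battery level."""
    for threshold, mode, caps in POLICY:
        if battery_fraction >= threshold:
            return mode, caps
    return "critical", set()

mode, caps = select_mode(0.12)  # on this hypothetical policy: "minimal"
```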
- Embedded or distributed processing can facilitate existing CPU-based approaches by off-loading computationally intensive routines.
- Hardware dedicated to performing these routines can be faster (often requiring only one, or just a few, clock cycles) and utilize less power (often by greater than an order of magnitude).
- Distributed processing can also facilitate new algorithmic approaches that are generally not feasible (within time and power-consumption constraints) using CPUs. Distributed processing is particularly valuable within algorithms that require repeated and/or simultaneous application of calculations to be performed on large data sets such as video images. These are further discussed below in the sub-section Distributed Processing.
- Another embodiment utilizes low-power distributed processing to detect whether the device has been removed from the head.
- In order to implement an “instant on” capability, the device must “sense” whether it is mounted on the wearer's head or has been removed.
- A method to perform this function is to determine whether an eye can be viewed within eye-tracking camera images.
- Power consumption can be reduced when the device is not in use (i.e., removed from the head) through a reduced frame rate, low-resolution imaging, a lower CPU clock rate, etc.
- Illumination can be eliminated or reduced by reducing the power of the illuminating LEDs, reducing the number of LEDs turned on, and/or only turning on illuminator(s) when actually sampling camera images (at reduced frame rates).
- A substantial reduction in power can also be attained by embedding relatively simple eye-geometry detection routines in distributed processing hardware.
- An example of this is a focus filter: a form of convolution filter used to determine whether an in-focus image (i.e., of an eye) is present.
- Such a filter would be classified as a high-pass spatial filter that detects the presence of high spatial-contrast edges. The absence of such edges, i.e., a generally defocused image, indicates that the device has been removed from the head.
- Another approach is to detect a dark (i.e., pupil) region adjacent to a white (i.e., sclera) region.
- When an in-focus eye is detected, the device “powers up” (recognizing that it was not completely powered off) for higher-resolution eye tracking.
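- A minimal sketch of the two detection approaches above (a high-pass focus filter and a dark-pupil-adjacent-to-bright-sclera check), assuming 8-bit grayscale images; the thresholds are illustrative:

```python
import numpy as np

def focus_score(image: np.ndarray) -> float:
    """Variance of a Laplacian (high-pass) convolution response: in-focus
    eye images show strong high-spatial-frequency edges, defocused ones do not."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):                       # apply the 3x3 kernel via shifts
        for dx in range(3):
            out += k[dy, dx] * image[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def dark_next_to_bright(image: np.ndarray, dark=50, bright=200) -> bool:
    """Crude check for a dark (pupil) region adjacent to a bright (sclera)
    region: any horizontally neighboring pixel pair spanning both thresholds."""
    left, right = image[:, :-1], image[:, 1:]
    return bool(np.any((left < dark) & (right > bright)) or
                np.any((right < dark) & (left > bright)))

def eye_present(image: np.ndarray, focus_threshold=100.0) -> bool:
    return focus_score(image) > focus_threshold and dark_next_to_bright(image)
```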
- The device may include a micro-electro-mechanical system (MEMS) such as an accelerometer or rate sensor for determining motion.
- the device When the device is not being worn it may operate at an ultra-low power mode in which it is not intermittently searching for the presence of an eye. In the ultra-low power mode, the device may only search for the presence of an eye when movement of the device is detected, for instance when a user picks up the device. When movement is detected, the device may initiate a scan in search of an eye or eyes at predetermined intervals (for instance every two seconds) or substantially continuously for a period of time (for instance one minute) as set by user preferences.
- If the device fails to detect an eye in the pre-set time interval, it may resume ultra-low power mode, or it may cycle through a low power mode prior to resuming ultra-low power mode. Should an eye or eyes be detected, the device will switch into full power mode or into a power-settings scheme as set by the preferences of the detected user.
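- The wear-detection cycle above can be sketched as a small state machine; the class and callback names are hypothetical, while the two-second interval and one-minute window mirror the examples in the text:

```python
class WearMonitor:
    SCAN_INTERVAL_S = 2.0   # scan for an eye every two seconds...
    SCAN_WINDOW_S = 60.0    # ...for up to one minute after motion

    def __init__(self, detect_eye, detect_motion):
        self.detect_eye = detect_eye        # returns True if an eye is imaged
        self.detect_motion = detect_motion  # MEMS accelerometer/rate sensor
        self.mode = "ultra_low_power"
        self.window_end = 0.0

    def step(self, now: float) -> str:
        """Advance the power state machine; 'now' is a timestamp in seconds."""
        if self.mode == "ultra_low_power":
            if self.detect_motion():
                self.mode = "scanning"      # begin periodic eye scans
                self.window_end = now + self.SCAN_WINDOW_S
        elif self.mode == "scanning":
            if self.detect_eye():
                self.mode = "full_power"    # or a per-user power scheme
            elif now >= self.window_end:
                self.mode = "ultra_low_power"
        return self.mode
```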
- The primary device owner (administrator) may set the overall system power schemes that will govern the power mode settings for the device when it is not in use. Additionally, the device owner may lock down changes to the power schemes such that other users are unable to edit them.
- FIG. 5 depicts a further breakdown of the IRIS Object 1000, including: eye tracking 1100, a module for tracking the eye gaze of a user; eye data 1105 for user identification using biometric data of the user, such as facial, speech, and iris identification; and eye control 1100 for relating the user's eye gaze to a display through the iUi™ interface (an interface comprising eye-signal controls) 1116. Eye signals 1115 gleaned from eye movements are used to interact with the user interface iUi 1116 and with display screen(s) and images on the display.
- The IRIS object 1000 includes a number of software modules operative as managers of certain functions, for example and without limitation:
- audio/video (A/V) control 1150; speech control 1155; or something more complex, e.g., cognitive load control 1160 (FIGS. 11 and 12).
- The HAL 1030 (FIG. 6) includes the “hosted” aspects of external hardware systems; this generally includes software specific to the IRIS platform developed specifically for integration.
- The Hardware IF 1180 interfaces with external software and/or hardware drivers through physical links 1235; these physical links can be I²C, USB, serial, or proprietary.
- The Bluetooth system has been selected as a non-limiting example of an embodiment because it is so pervasive in the rapidly growing market of mobile devices.
- Infotainment systems are a combination of entertainment, such as music and videos, and information, where the information may come from within the vehicle as data from a sensor, from control of a system like a heater or lights, or from sources available through the internet.
- Most of these systems use wired and wireless technologies to connect to the vehicle and/or the internet.
- The wireless connections to the vehicle are generally Bluetooth, established through a set of standard interfaces; these are referred to as Profiles 1300 (FIG. 9) and are hosted in a processor above the Bluetooth radio, further shown in FIG. 5 as Hands Free Profile (HFP) 1187, Advanced Audio Distribution Profile (A2DP) 1193, Audio Video Resource Control Profile (AVRCP) 1190, etc.
- FIG. 6 depicts a Bluetooth systems architecture, including connections to the profiles, network and transport 1230 , and the data link 1250 and modem 1255 .
- FIG. 7 depicts a breakdown of the Bluetooth architecture 1205 into its subcomponents. Underlying all of these protocols is a key piece of Bluetooth termed the Service Discovery Protocol (SDP) 1310, which includes what is called Secure Simple Pairing (SSP). SSP today is required by all Bluetooth standards above v2.1. Secure Simple Pairing uses a form of public-key cryptography, which can help protect against so-called “man-in-the-middle” (MITM) attacks. Generally, the Bluetooth HID 1185 specification requires security mode 4 for pairing and bonding two devices together, citing that it should not be possible to perform pairing or bonding to any Bluetooth HID Host or Device without physical access to both the Bluetooth HID Host and the Bluetooth HID device.
- Bluetooth HID Hosts and Bluetooth HID devices that support bonding use some form of non-volatile memory to store the 128-bit link keys and the corresponding BD_ADDRs, as well as the type of each link-key (authenticated, unauthenticated, or combination).
- In an HMD, complex access is limited, as there is no mouse or keyboard in the conventional sense.
- a Bluetooth HID Host that accepts sensitive information from Bluetooth HID devices may be implemented to only accept sensitive information from reports that are contained in a top-level application collection of “Generic Desktop Keyboard” or “Generic Desktop Keypad.” Furthermore, such a Bluetooth HID Host may require MITM protection when pairing with any Bluetooth HID device with a Bluetooth HID report descriptor that contains a top-level application collection of “Generic Desktop Keyboard” or “Generic Desktop Keypad,” which in turn contains any of the following sets of usage codes and their descriptions:
- CBID can equivalently involve comparisons and/or the exchange of information involving images of irises, ICs, EICs or other derived parameters.
- Databases used for biometric comparisons could equivalently (for the purposes of identification) contain ICs, EICs, images of eyes, images of faces (including eyes), images of irises, so-called “unfolded” (i.e., expressed in polar coordinates) iris images, or other derived parameters. Therefore, references to exchanges or comparisons of EICs also refer to the exchange or comparison of any other derived data sets for the purpose of biometric identification.
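- For illustration, the “unfolding” of an iris image into polar coordinates mentioned above can be sketched as follows; the sampling resolutions and nearest-neighbor scheme are simplifying assumptions, and a production system would interpolate and mask eyelids and lashes:

```python
import numpy as np

def unfold_iris(image, cx, cy, r_pupil, r_limbus, n_radii=32, n_angles=256):
    """'Unfold' the iris annulus into polar coordinates: rows sweep from the
    pupil boundary out to the limbus, columns sweep the angle."""
    out = np.zeros((n_radii, n_angles), dtype=image.dtype)
    for i in range(n_radii):
        # Radius interpolated between pupil and limbus boundaries.
        r = r_pupil + (r_limbus - r_pupil) * i / (n_radii - 1)
        for j in range(n_angles):
            theta = 2.0 * np.pi * j / n_angles
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if 0 <= y < image.shape[0] and 0 <= x < image.shape[1]:
                out[i, j] = image[y, x]   # nearest-neighbor sample
    return out
```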
- Programmable attributes include, but are not limited to:
- Spatial Phase Imaging generally relies on the polarization state of light as it emanates from surfaces to capture information about the shape of objects.
- Triangulation employs the location of two or more displaced features, detectors, and/or illuminants to compute object geometry.
- Two important triangulation subcategories are stereo correspondence (STC) and stereoscopy (STO).
- Stereo correspondence cameras determine the location of features in a scene by identifying corresponding features in two or more offset intensity images using 3D geometry to compute feature locations.
- Stereoscopic cameras rely on human biological systems (eyes, brain) to create a notion of a 3D scene from two images taken from different vantage points and projected into the eyes of a viewer.
- Coherent methods rely on a high degree of spatial and/or temporal coherence in the electromagnetic energy illuminating and/or emanating from the surfaces in order to determine 3D surface geometry.
- FIGS. 8 and 9 depict imager object code 1415 for either a 2D or 3D implementation.
- Any system implemented must consider two key factors: human-fidelic visualization (completely realistic display) and visual intelligence (automated vision).
- Human-fidelic visualization can create a visual notion of a scene in the mind of a human that is as realistic, or almost as realistic, as viewing the scene directly; such a visualization system is human-fidelic.
- An imaging system has to be 3D to be human-fidelic, since human sight is 3D.
- The second factor is visual intelligence, which means sensing and analyzing light to understand the state of the physical world. Automatic recognition of human emotions, gestures, and activities are examples of visual intelligence.
- 2D video cameras struggle to provide a high level of visual intelligence because they discard depth information when video is captured. As a consequence of neglecting depth, 2D images of 3D scenes are inferior to 3D images; 3D images have better contrast (the ability to distinguish between different objects). Real video of real scenes typically contains dozens of instances where contrast and depth ambiguity make it difficult for automated systems to understand the state of the scene.
- 3D video cameras do everything that 2D cameras do, but add the benefits just discussed. It is inevitable that single-lens native 3D video will eventually replace the 2D video offered today by providing two key benefits: human-fidelic visualization and improved visual intelligence. It is reasonable to assume that global production of most cameras will shift to 3D as 3D cameras become cost-effective, simple to operate, and compact, and produce visual fidelity. With this in mind, the technology emerging today as the most likely to reach mass markets in terms of cost, complexity, and fidelity is Spatial Phase Imaging, within the broad 3D imaging categories discussed.
- This technology relies on commercially available imagers implementing a micro-polarizing lens over four sub-pixels, resulting in an ability to rapidly determine small changes in reflected light, computing a vector as a direction cosine for each pixel and generating a three-dimensional value in terms of X, Y, and Z-depth; truly single-lens native 3D video.
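- A hedged sketch of the per-pixel computation from the four micro-polarizer sub-pixel intensities, using standard Stokes-parameter relations; the function name is hypothetical, and the final mapping from polarization angle and degree to surface-normal direction cosines is omitted here:

```python
import numpy as np

def polarization_from_subpixels(i0, i45, i90, i135):
    """Per-pixel Stokes parameters from the four micro-polarizer sub-pixel
    intensities (0, 45, 90, 135 degrees), yielding the angle and degree of
    linear polarization used in spatial phase imaging."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total intensity
    s1 = i0 - i90                        # 0/90-degree preference
    s2 = i45 - i135                      # 45/135-degree preference
    aolp = 0.5 * np.arctan2(s2, s1)      # angle of linear polarization (rad)
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree, 0..1
    return aolp, dolp
```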
- The accuracy of both CBID and eye-signal control processes can be improved via the use of more than a single camera to view an eye.
- Images substantially simultaneously or sequentially acquired from multiple cameras can be used to: 1) create on-axis (i.e., perpendicular to the surface) views of different regions of the eye; 2) view surfaces with specular reflections (particularly glints) located at different positions within images of the eye; 3) allow for viewing of fine structures while maintaining the ability to view over a wide spatial range; 4) increase eye-tracking accuracy by making multiple measurements based on multiple views of glints and eye structures; and 5) view “around” obscuring objects such as eyelids and lashes.
- Another area where distributed/embedded processing is particularly valuable is in the “off-loading” of operations that are computationally intensive for a CPU.
- Examples of such a “hybrid” approach (i.e., mixing CPU and embedded processing) within eye tracking and iris identification algorithms include subroutines that perform Fast Fourier Transforms (FFT), random sample consensus (RANSAC), so-called StarBurst feature extraction, and trigonometric functions.
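A hedged Python sketch of one such computationally intensive subroutine, a RANSAC consensus fit of a circular pupil boundary to noisy edge points (the thresholds, iteration count, and test data are illustrative assumptions, not values from this disclosure):

```python
import math, random

def circle_from_3pts(p1, p2, p3):
    """Circumscribed circle through three points; returns (cx, cy, r),
    or None if the points are collinear."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy, math.hypot(ax - ux, ay - uy)

def ransac_pupil(points, iterations=200, tol=1.5, seed=0):
    """Fit a circular pupil boundary to noisy edge points, rejecting
    outliers (e.g. eyelash edges) by consensus voting."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iterations):
        model = circle_from_3pts(*rng.sample(points, 3))
        if model is None:
            continue
        cx, cy, r = model
        inliers = sum(1 for (x, y) in points
                      if abs(math.hypot(x - cx, y - cy) - r) < tol)
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best
```

Loops like the inlier count above are exactly the kind of per-point arithmetic that benefits from being off-loaded to embedded logic rather than run on the CPU.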
- In programs like cockpit workload management, cognitive load control 1160 generally deals with the human mind interacting with some external stimulus.
- The definition of cognitive load differs slightly across fields. In an academic sense, cognitive load refers to the total amount of mental activity imposed on working memory at any instant in time, while the ergonomics literature describes it as the portion of an operator's information-processing capacity, or resources, required to meet cognitive task demands. Each field provides different methods for measuring cognitive load.
- Cognitive load is considered herein as the mental effort or demand required for a particular user to comprehend or learn some material or complete some task. Cognitive load is relative to both the user (i.e., their ability to process novel information) and the task being completed (i.e., complexity), at any single point in time. It is attributable to the limited capacity of a person's working memory and their ability to process novel information.
- Conventional methods for measuring cognitive load include:
- FIG. 12 depicts system components for a cognitive load manager 1160 that addresses many of these issues.
- mobile, wearable, implanted, consumed, and other physiologically integrated computers employ increasingly sophisticated and varied sensors, data input methods, data access methods, and processing capabilities that capture, access, and interpret more and more data that can be used as sensory input to the brain and impact cognitive activity.
- the data comprises physiological data 1815 and environmental data 1810 .
- the data are used to better establish a user's preferences for the integration, management, and delivery of information to the head mounted unit.
- FIGS. 13-15 depict three different system architectures for connecting the HMD to another device or to the Internet.
- FIG. 13 depicts the HMD 600 connecting through a local link, such as Bluetooth, to a mobile device 710 carried by the user. The mobile device 710 is connected via link 155 through 700 to a packet-switched network, typically provided by a wireless carrier and generally referred to today as the world wide web, with subsequent connection to either a web-based service, a database, or an external application 160 .
- FIG. 14 depicts the HMD 600 including a wireless transmitter 750 that is either embedded or attached to the HMD for connection directly to the internet 700 and a service provider 160 .
- FIG. 15 depicts the HMD 600 including a wireless transceiver 750 connected via a link 725 directly to the Internet, where the local link is generally a packet link, but could be other proprietary wireless protocols.
- the HMD is independent from other smart devices; essentially the HMD is connected directly to the Internet all of the time.
- the user would simply implement a local connection through a Bluetooth profile.
- the user would need to use the Audio/Video Distribution Transport Protocol (AVDTP), the Audio/Video Remote Control Profile (AVRCP), or the Advanced Audio Distribution Profile (A2DP).
- If a user wanted to connect to a vehicle, he or she would need to implement the Hands-Free Profile. Simpler, less complex systems are needed, along with methods to connect to them, especially when the user is beyond the range of a local connection to the system they want to control.
- FIG. 16 depicts another embodiment where an HMD is implemented in an “abstracted” real-time server-browser cloud based architecture; known today as the “Internet of Things” or IoT.
- the key to any abstraction layer is the ability to hide a device's or software's operational or strategic complexity; this can include proprietary aspects such as trade secrets and intellectual property.
- the abstraction can support extended or new business models for a technology supplier.
- a good example of this architecture is the NEST™ Labs business model, which could loosely be called a “razor/razor blade” model: the NEST™ thermostat is the razor and the NEST™ Services are the razor blades; simply stated, the business model includes the sale of the thermostat plus a monthly recurring service. In addition to the sale of hardware and services, this business model supports data harvesting from a user in his home. In this system, the thermostat serves data to a centralized server for the purposes of “learning.”
- FIG. 16 depicts an HMD 600 connected via a packet network 155 to the Internet 700 .
- the user needs to access their page on the NEST™ Services server 965 .
- the traditional roles of web server and browser have been expanded under the new HTML5 standard. There has been what looks like a role reversal of server and browser: the web server is now the smart thermostat, serving small amounts of data to a fixed URL in the cloud running a browser. This browser in the cloud can be accessed by the user from a smart device or computer virtually anywhere to read or interact with their thermostat.
- Using the web server in this role is now a key and underlying concept of the IoT, one where complexity and cost are greatly reduced.
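The thing-as-web-server pattern can be sketched, under the assumption of a plain HTTP/JSON exchange (the state fields and port handling are illustrative, not part of any NEST™ interface), as:

```python
import json, threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative device state; a real device would update this from sensors.
DEVICE_STATE = {"device": "thermostat", "setpoint_c": 21.0, "ambient_c": 19.5}

class DeviceHandler(BaseHTTPRequestHandler):
    """The 'thing' acts as a tiny web server, serving its state as a
    small JSON document for a cloud-hosted browser page to poll."""
    def do_GET(self):
        body = json.dumps(DEVICE_STATE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve_once(port=0):
    """Start the device server on an ephemeral port; returns (server, port)."""
    server = HTTPServer(("127.0.0.1", port), DeviceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

The device only ever pushes a small state document to a fixed endpoint; all heavier logic lives in the cloud, which is the cost and complexity reduction the text describes.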
- the head-mounted device 600 could be connected to any consumer, industrial, or commercial device located anywhere in the world on the cloud. A user could control that device via eye interaction with the included display, using eye signals defined as a standardized command set that maps the eye signals to communication, diagnostics, control, and interaction with the device(s).
- FIGS. 17-23 depict an abstraction transition model from a smart head mounted system, to a much simpler model as depicted in FIG. 23 .
- FIG. 23 depicts a cloud based implementation within an IoT architecture of an HMD 600 connected by a very high speed packet based link, a wireless link that would rival or potentially outperform the typical communication bus in a local processor.
- processor busses operate as subsystems of the processor to facilitate transfer of data between computer components or between computers.
- Typical bus types include front-side bus (FSB), which carries data between the CPU and memory controller hub; direct media interface (DMI), which is a point-to-point interconnection between an integrated memory controller and an I/O controller hub in the processor; and Quick Path Interconnect (QPI), which is a point-to-point interconnect between the CPU and the integrated memory controller.
- the HMD is connected to a centralized server-browser 800 that operates the Visual Disambiguation Service (VDS) interface termed IRIS (Interface for Real-time Image Services); think of this operating much like SIRI (Speech Interpretation and Recognition Interface) does for audio.
- the IRIS service is for the complex disambiguation of eye movement for the real-time interpretation, determination, and prediction of a user's intent.
- IRIS, like SIRI, operates in the cloud. 1126 and 1127 represent the IRIS abstraction layers discussed above.
- the HMD now operates with a minimal amount of software and a feature-rich processor configured with a limited operating system, or possibly none, using a publish/subscribe messaging scheme.
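A minimal in-process sketch of such a publish/subscribe messaging scheme (topic names and payloads are illustrative assumptions, not a defined IRIS protocol):

```python
from collections import defaultdict

class PubSubBroker:
    """Minimal publish/subscribe broker of the kind a thin HMD client
    could use: the device publishes eye-data topics and subscribes only
    to the commands it needs, with no request/response coupling to any
    particular server."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self._subscribers[topic]:
            callback(message)

broker = PubSubBroker()
received = []
broker.subscribe("hmd/eye/gaze", received.append)
broker.publish("hmd/eye/gaze", {"x": 0.42, "y": 0.17})
```

In a deployed system the broker would live in the cloud (or fog) and the callbacks would be network deliveries; the decoupling shown here is what lets the headset carry almost no software.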
- the embedded IRIS (e-IRIS) 1111 includes a number of tools or utilities operating in the FOG as a combined real-time service. These include a data manager 1125 , device manager 1120 , communication manager 1130 , power manager 1140 , and security manager 1135 a . In the e-IRIS abstraction 1127 , there are counterpart managers, with a slight exception in the security manager 1135 b ; this will be discussed below in more detail.
- FIG. 23 also depicts the eye management tools centralized in a cloud-based version in support of a user. These include an eye tracker 1100 , eye data 1105 in support of security, eye control 1110 , eye signals 1115 , and iUi 1116 for an eye user interface.
- other real-time services are available and associated with IRIS, including an audio-video manager 1150 , speech manager 1155 , cognitive load manager 1160 , and a context manager 1165 . The combination of these services and architecture constitutes IRIS.
- the HMD 600 is wirelessly connected to a smart device (such as a smart phone, a tablet, home or office PC) or simply to the Internet through an 802.11 link. All of the services operate in the HMD 600 processor or are stored in a memory associated with the HMD 600 .
- This embodiment would operate as a stand-alone computer, with an operating system, and micro-processor(s) and/or other logic elements.
- some of the non-real-time applications are off-loaded to applications run on the local smart phone 710 , local PC, or other smart devices.
- this first transition embodiment would still be highly dependent on the locally available resources in the HMD 600 to operate as intended.
- FIG. 18 depicts a second transition step wherein the data manager 1125 takes on a new role.
- the data manager is configured to manage some of the data, either on or off board the HMD 600 , using a markup or data-interchange language such as JSON (JavaScript Object Notation), HTML 4.01, or HTML 5.0.
- the object of this transition step is to implement a web server-browser relationship in the HMD 600 .
- some of the data acquired by the imagers, audio input, or any other sensors available to the HMD 600 are served to the cloud and directed by a fixed URL to a cloud based IRIS, where a user's browser page resides and his/her data are aggregated.
- This second transition supports non-real-time data applications; for example, the HMD 600 is used for the transmission of data that have been collected and stored by a user.
- the user may capture a photograph, an audio clip, a video clip, or other user physiological data related to the eye or a user's health; these data are then transferred to IRIS for storage, aggregation, or possible subsequent dissemination (discussed in more detail below).
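Such a non-real-time transfer might bundle captured samples as a JSON document addressed to the user's cloud page. The following sketch assumes a hypothetical field layout, since no IRIS schema is defined here:

```python
import json, time

def package_eye_data(user_id, samples, captured_at=None):
    """Bundle locally captured eye-data samples into a JSON document for
    later (non-real-time) upload to the user's cloud page. All field
    names are illustrative, not a defined IRIS schema."""
    return json.dumps({
        "user": user_id,
        "captured_at": captured_at if captured_at is not None else time.time(),
        "sample_count": len(samples),
        "samples": samples,
    })

# Hypothetical pupil-diameter samples captured at 50 Hz.
payload = package_eye_data(
    "user-123",
    [{"t": 0.00, "pupil_mm": 3.1}, {"t": 0.02, "pupil_mm": 3.2}],
    captured_at=1431043200,
)
```

Because the payload is self-describing JSON, the cloud side can aggregate it onto the user's page without the device and server sharing any binary format.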
- FIGS. 19 , 20 , and 21 depict a third step in the transition, where the wireless link now supports near real-time operation.
- a web server and browser relationship exists operationally in parallel with a now more mature e-IRIS 1111 in the HMD 600 and IRIS 800 in the cloud. They operate and interact with each other in near real-time across the abstraction layer 1126 and 1127 .
- This new configuration now allows an evolution of the security manager with respect to security and implementation of the private key-public key.
- the security manager 1135 resident in the HMD 600 takes on the role of generating a private key and public key based on certain biometrics, as disclosed in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015. Data collected from the face, eye, or voice constitute unique biometric data of the user, or of user groups if desired. These collected data can be used to generate a unique private key in a system of public key and private key cryptography.
- cryptographic systems have been widely used for information protection, authentication, and access control for many years. These cryptosystems are generally categorized as symmetric key cryptosystems and public key cryptosystems. Symmetric key cryptosystems use the same key for encrypting and decrypting secret information; however, using the same key can be problematic: 1) if the key is compromised, security cannot be assured; and 2) if there are multiple users, multiple keys are needed, which may increase system costs and weaken data security.
- Public key cryptosystems can overcome these limitations by using a pair of cryptographic keys (i.e., a private key and a public key). The private key used for decryption is kept secret, whereas the public key used for encryption may be distributed to multiple users. Therefore, secrecy of the private key is a major challenge when it comes to achieving high levels of security in practical crypto systems.
- the irisCode of the user, possibly combined with other biometric data, is used to establish a unique key that subsequently generates the private key-public key pair.
- the public key generated from the user's unique biometric aspects is sent to IRIS 800 for storage in the security manager portion of the user's browser, FIG. 22 1135 b .
- the private key is never stored, but is generated in the HMD 600 every time a user instantiates a session.
- the private key is generated, FIG. 21 1129 , and authenticated in IRIS 800 . This ensures levels of non-repudiation and security currently not available in web applications, especially in e-commerce.
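A heavily hedged sketch of the idea that the private key is regenerated from biometrics at each session rather than stored: a real system would require fuzzy extraction and error correction so that noisy iris captures yield the same key, and the hash-based derivation below stands in for actual key-pair generation.

```python
import hashlib, hmac

def derive_private_seed(iris_code: bytes, device_salt: bytes) -> bytes:
    """Derive a deterministic 32-byte private-key seed from a stabilized
    iris code. This assumes the iris code has already been stabilized by
    fuzzy extraction / error correction, which this sketch omits."""
    return hashlib.pbkdf2_hmac("sha256", iris_code, device_salt, 100_000)

def public_identifier(seed: bytes) -> str:
    """A shareable value derived one-way from the seed, standing in for
    the public key sent to the cloud. The seed itself is regenerated at
    each session and never stored, per the scheme described."""
    return hmac.new(seed, b"public-id", hashlib.sha256).hexdigest()

seed = derive_private_seed(b"example-iris-code-bits", b"device-salt")
pub = public_identifier(seed)
```

The property this illustrates is the one the text relies on: the same wearer regenerates the same seed on any device, while the stored public value reveals nothing about the seed.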
- FIG. 23 depicts the final step in the transition to a real-time HMD 600 .
- the Internet is now prolific and operates at speeds in excess of those of processor buses.
- IRIS 800 is cloud-based and real-time for all intents and purposes. Data are collected and aggregated in IRIS 800 .
- IRIS 800 now implements advanced algorithms based on learning about the physiology of the human eye, as well as about the user generally; disambiguation in IRIS 800 is enhanced to the point that IRIS 800 can predict what a user wants to do and where a user wants to look.
- the user's HMD 600 is a commodity: low cost, low power, and immediately replaceable.
- the final step abstracts all of the intelligence for the device to the cloud 700 .
- CBID, now cloud 700 based, is substantially continuous and real-time. Since the generation of the private key is unique to a user, any user can pick up any HMD 600 and use it at any time: simply slip it on, and they are looking at their browser page where all of their personal information resides. If their HMD 600 is stolen, the information remains secure. If a user loses their HMD 600 , they can simply borrow one or buy a new one.
- the CBID and cloud 700 aspects of IRIS 800 abstract the device at a new level: they abstract the user, much as HMIs and displays do today.
- the thermostat is only accessible through the NEST™ Services portal and page.
- the HMD 600 is securely connected to IRIS 800 and a user's page. If the user wants to access their thermostat, IRIS connects them directly and securely to the NEST™ Services portal 965 .
- This model extends to XFINITY: if a user wants access to his/her account to set a recording, or to use an XFINITY service, IRIS will connect them directly to the XFINITY portal 970 . Further, if the user wants access to their COZYHOME application, the link is again securely made to the appropriate server, in this case 975 .
- IRIS 800 may be linked to a user's social media account, giving the user real-time access.
- FIG. 25 depicts how IRIS 800 would securely connect a user to their Google+ account to see postings, or to post information they want to share in near real-time.
- Social Media 920 comprises social media services available to a user.
- IRIS 800 includes a context manager 1165 , in both e-IRIS 1111 in FIG. 17 and IRIS 800 in FIG. 23 , whose role is to generate Contextualized Eye Data (CED).
- CED begins with eye data extracted from episodic and/or substantially continuous monitoring of one or both eyes. These eye data include eye movements such as: saccades, fixations, dwells, pursuits, drift, tremors, and micro-saccades.
- Eye data also include blinks and winks, squints, pupil dilation, blood vessel patterns, iris and pupil size, feature locations, internal eye-structure size, shape, and location.
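As one illustration of how raw gaze samples can be segmented into the movement classes listed above, a velocity-threshold (I-VT style) classifier can be sketched; the sampling interval and the 30 deg/s threshold are common defaults in the eye-tracking literature, not values from this disclosure:

```python
def classify_eye_samples(samples, dt, velocity_threshold=30.0):
    """Label consecutive gaze samples (in degrees of visual angle) as
    'fixation' or 'saccade' by thresholding angular velocity, in the
    style of the I-VT algorithm."""
    labels = []
    for prev, cur in zip(samples, samples[1:]):
        dist = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        velocity = dist / dt  # degrees per second
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

# 0.02 s sampling: small drift, then a large jump between samples.
gaze = [(10.0, 10.0), (10.1, 10.0), (10.1, 10.1), (18.0, 14.0)]
labels = classify_eye_samples(gaze, dt=0.02)
```

Segmentations like this are the first step toward the CED correlations described below: each labeled interval, not each raw sample, is what gets correlated with other data classes.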
- a key aspect of CED is using these data to detect behavior changes over time.
- CED is the correlation of eye-data with other classes of data over time to extract relationships for meaningful prediction, measurement, analysis, interpretation, and impact on the user.
- three classes of data IRIS 800 will have aggregated are raw data, semantic data, and evoked data.
- Raw data comprises data captured by any sensors, whether in the HMD 600 or present on or in a person.
- IRIS 800 can take this raw data from an individual and correlate it with eye data.
- Examples include, but are not limited to, sensors that capture: movement, GSR (galvanic skin response), temperature, heart rate and heart rate variability (HRV), EOG (electro-oculogram), EEG (electro-encephalogram), EKG (electro-cardiogram), facial muscle and skin movement, internal organ or biological system status and performance, scent, audio, scenes and images across a range of electromagnetic radiation (visible light, IR, UV, and other electromagnetic frequencies), location (GPS and other beacon sensors), time monitoring/tracking, and more.
- Semantic data comprises the interpretation or meaning of “what, when, where, and how” a user is “doing” something, as well as with whom the user is doing something. “Doing” can be working, playing, eating, exercising, reading, and myriad other activities. These data are constructed by interpreting sensor data in the context of a user's activities.
- Evoked data are extracted from conscious or subconscious individual response to visual, tactile, olfactory, taste, audio, brain, or other sensory, organ, or biological responses to intentional stimuli.
- Eye-tracking data have primarily been captured indoors due to the technology's inability to function well in high-infrared (outdoor) environments without substantial filtering or shielding of ambient IR light, further reducing the practicality, breadth, and quantity of eye-data capture.
- high quality, environmentally diverse, high-volume data across diverse “natural” use cases have been limited due to the expense, limited portability, constrained form-factor, high-power requirements, high-computing requirements, limited environmental robustness, and dedicated “data capture” utility of eye-tracking technology and devices. While early research on the data captured has shown promise for extraordinary insights into human health, cognition, and behavior, the general capture of such data has been highly constrained to specific tests and environments for short durations.
- the first generation of IRIS integrated HMDs may be worn by millions of people in a broad range of life activities.
- these data may be collected by IRIS first as historical data, then in near real-time, and ultimately in real-time. Should this transition occur, it could increase by orders of magnitude the quantity, quality, and contextualization of the eye data that are captured.
- IRIS could then have the ability to correlate eye data with a broad range of other personal and aggregated data, such as individual and group health, cognition, and behavior.
- IRIS may then use the aggregated data to provide insights into eye data correlated with personal health, cognition, and behavior as a starting point regarding self-quantification, self-improvement, and self-actualization.
- IRIS will support applications for extracting patterns from large datasets that will expose and predict future behavior, such as our likelihood to adopt a new habit, our interest in acquiring a product, or our likelihood of voting for a new politician.
- A number of applications build upon IRIS' contextualized eye-data; these include, but are not limited to, the following:
- IRIS applications and tools positively impact the user of the HMD by contextualizing the eye data that are aggregated.
- IRIS technology will advance the user's performance in many dimensions and will enhance their human-to-human interactions as well as their human-machine interactions.
- FIGS. 26-28 depict other portals for secure access to a user's information where again, the common element is IRIS 800 .
- the public key stored in IRIS can be related to a password for the user, which greatly simplifies the user's interaction on the web, including secure transactions.
- FIG. 29 depicts a user performing a setup process, which needs to occur only once, in which the user links their public key with account information.
- a bank or other financial institution that is responsible for the account might verify other forms of target (i.e., intended) user identity and offer the linkage process as a service.
- real time knowledge of a device-wearer's identity allows financial particulars to be exchanged electronically with each item as it is selected and purchased. This eliminates the need to repeatedly enter passwords, security questions, or account information for each transaction or group of transactions. As a consequence, such an instantaneous purchasing system eliminates the processes involved with so-called online shopping “carts,” since there is no longer a need to cluster items for the purpose of entering account information. Solely for customer convenience, groups of items purchased during an online shopping session can be treated as a cluster or summarized for the purchaser.
- systems and methods are provided to enhance security and streamline shopping at so-called “bricks and mortar” retail outlets.
- a camera mounted on the headwear device that views the environment of the device wearer can be used to identify objects that may be of interest for purchase. Identification can be based on bar codes or quick-response (i.e., QR) codes that are commonly attached to purchasable items.
- Such object identification uses image processing methods that are well known in the art.
- Information about the item including a proposed purchase price can be generated by a processing unit associated with the retail outlet. This information can then be displayed on nearby monitors or on a head-mounted display associated with the device wearer. If the customer wishes to purchase a given item, a CBID-based transaction can be initiated by the customer. Such transactions can occur repeatedly throughout a store. A match between transported items and the transaction record would then allow items to be verifiably removed from the store by the customer. CBID-based retail purchases eliminate the need for check stands or tills. In many situations, the automated, real time display of information during the purchasing process also reduces the need for store clerks to assist potential customers.
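The check-stand-free purchase flow described above can be sketched as follows; the session and catalog structures, field names, and prices are purely illustrative:

```python
def authorize_purchase(session, catalog, scanned_code):
    """Sketch of a CBID-gated point-of-scan purchase: an item identified
    by its bar/QR code is charged immediately when the wearer's identity
    is verified, with no cart or check stand. All names are illustrative."""
    if not session.get("identity_verified"):
        raise PermissionError("CBID identity not verified")
    item = catalog[scanned_code]
    record = {"item": item["name"], "price": item["price"],
              "buyer": session["user_id"]}
    # The running transaction record is what the store matches against
    # items transported out, replacing a till receipt.
    session.setdefault("transactions", []).append(record)
    return record

catalog = {"0490001": {"name": "coffee", "price": 4.50}}
session = {"user_id": "user-123", "identity_verified": True}
receipt = authorize_purchase(session, catalog, "0490001")
```

The essential point is the gate at the top: the transaction never forms without a substantially continuous identity check, which is what removes the password and account-entry steps.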
- These devices are also integrating increasingly sophisticated and varied data output methods that stimulate visual, auditory, tactile, olfactory, gustatory (sense of taste), equilibrioception (sense of balance), direct neurological, indirect (wireless) neurological (neural and synaptic brainwave stimulation), chemical, biological activity, and multi-modal input sensation.
- the increased stimulation of the body and associated enhanced delivery of information to the brain can affect brain activity in subtle and profound ways.
- Cognitive stimulation resulting from more, varied, and faster delivery of multiple forms of input to the brain can positively impact human performance.
- cognitive overload or inappropriate stimulation can negatively impact performance, damage health, create safety hazards, and even kill.
- a recent form of cognitive load management associated with electronic stimulation includes applications that temporarily disable email, text, and other online forms of interruption. These applications are very simple in form, however.
- This approach allows a user's customization and prioritization to improve over time as historical context, performance, biometric, and other data are accumulated and analyzed, generally forming a user profile of activities and preferences.
- These embodiments also provide a variety of methods and techniques for dynamically managing stimuli (deferral, termination, sequencing, reprioritization, pacing, and more) and support stimuli aggregation and management across multiple individuals for risk-controlled or performance-enhanced group activity.
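One hedged sketch of such dynamic stimulus management, deferring low-priority interruptions when measured cognitive load is high (the 0-to-1 load scale, threshold, and priority levels are illustrative assumptions):

```python
def schedule_stimuli(stimuli, cognitive_load, load_limit=0.7):
    """When measured cognitive load exceeds a limit, deliver only
    critical stimuli (priority 0) now and defer the rest, returning
    the deferred items in priority order. Scales are illustrative."""
    deliver, deferred = [], []
    for priority, name in stimuli:
        if cognitive_load <= load_limit or priority == 0:
            deliver.append(name)
        else:
            deferred.append((priority, name))
    return deliver, [name for _, name in sorted(deferred)]

# A critical warning always passes; email and texts wait for lower load.
stimuli = [(0, "collision-warning"), (2, "email"), (1, "text-message")]
now, later = schedule_stimuli(stimuli, cognitive_load=0.9)
```

In practice the `cognitive_load` input would come from the cognitive load manager 1160 (e.g., derived from pupil, gaze, and other physiological measurements), with the deferred queue re-evaluated as load drops.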
- Another embodiment is context-aware computing, shown as 1165 .
- In a mobile computing paradigm, it is advantageous for applications to discover and take advantage of contextual information, such as user location, time of day, neighboring users and devices, and user activity, specifically to support collecting and disseminating context and applications that adapt to changing context.
Abstract
Description
- The present application claims benefit of co-pending provisional application Ser. No. 61/991,435, filed May 9, 2014, 62/023,940, filed Jul. 13, 2014, 62/027,774, filed Jul. 22, 2014, 62/027,777, filed Jul. 22, 2014, 62/038,984, filed Aug. 19, 2014, 62/039,001, filed Aug. 19, 2014, 62/046,072, filed Sep. 4, 2014, 62/074,920, filed Nov. 4, 2014, and 62/074,927, filed Nov. 4, 2014, the entire disclosures of which are expressly incorporated by reference herein.
- Contained herein is material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent file or records, but otherwise reserves all rights to the copyright whatsoever. The following notice applies to the software, screenshots, and data as described below and in the drawings hereto: All Rights Reserved.
- The present invention relates to portable or wearable biometric-based user identification and authentication for secure distributed and interactive systems and services.
- The widespread use of the internet and computing/communications devices has led to an explosive growth in the electronic dissemination of information. However, verifiable control over the recipient(s) of secure information remains an important issue in the field of cyber security. Moreover, recipients of information can also become sources of sensitive information where real time knowledge of the identity of such a source can be an important security issue. An example of this situation is knowledge of the identity of an individual entering credit card (or other account) information during the process of making an online purchase. Present-day techniques commonly used to remotely identify the recipients or sources of secure information are readily susceptible to deception. In the United States, identity theft affects approximately fifteen million individuals each year with an estimated financial impact of $50 billion.
- To solve these problems, there is a need today to re-think system architectures and roles with a specific view on data security and non-repudiation of a user's electronic signature (e.g., password), where the authenticity of the signature is being challenged. In a general sense, non-repudiation involves associating actions or changes with a unique individual. For a secure area, for example, it may be desirable to implement a key card access system. Non-repudiation would be violated if it were not also a strictly enforced policy to prohibit sharing of the key cards and to immediately report lost or stolen cards. Otherwise, who performed the action of opening the door cannot be trivially determined.
- Similarly, for computer accounts, the individual owner of the account must not allow others to use that account, especially, for instance, by giving away their account's password, and a policy should be implemented to enforce this. This prevents the owner of the account from denying actions performed by the account.
- The ecosystem is dynamic and rapidly changing, and wireless capability is growing exponentially. Cloud-based architectures are becoming more appealing and attainable at manageable costs. The place to start re-thinking is with transitional and end architectures in mind and the placement of what is called a “data abstraction layer.” This abstraction layer distinguishes data movement as data on the move versus data at rest, and includes considerations for data generation, data storage, data processing, and the roles of the server and browser in the cloud.
- A first transitional step on the way to the Internet of Things (IoT) is the emergence of fog computing or fog networking. This is an architecture that uses one smart device, or a collaborative multitude of smart devices or near-user edge devices, to carry out a substantial amount of processing and storage (rather than storing primarily in cloud data centers), communication (rather than routing over the internet backbone), and control, configuration, measurement, and management (rather than being controlled primarily by network gateways such as those in Long Term Evolution (LTE) networks, e.g., 4G LTE).
- Fog Networking consists of a control plane and a data plane. For example, on the data plane, fog computing enables computing services to reside at the edge of the network as opposed to servers in a data center. Compared to cloud computing, fog computing emphasizes proximity to end-users and client objectives, resulting in superior user experience and redundancy in case of failure. Fog Networking supports the IoT, in which most of the devices that are used on a daily basis will be connected to each other. Examples include phones, wearable health monitoring devices, connected vehicles, and augmented reality using devices such as Google Glass. The ultimate goal of the IoT is to realize connections between objects, between objects and persons, and among all things and networks for the secure identification, management, and control of data.
- With the above in mind, wearable display devices will challenge traditional computer human machine interaction. Today, computer mice, joysticks, and other manual tracking devices are ubiquitous tools for specifying positional information during human-machine interactions (HMIs). With the advent of wearable computing, such bulky and obtrusive devices that, for example, generally require stationary surfaces for proper operation are incompatible with the portable nature of apparatuses that are designed to be worn on the body.
- Wearable display devices include virtual reality (“VR”) displays such as those manufactured by Sony, Samsung, Oculus, Carl Zeiss; head mounted displays (“HMDs”) such as those produced by Google (e.g., Glass®) and Vuzix; augmented reality (“AR”) displays such as those manufactured by Microsoft, Vuzix, and DigiLens; and similar devices. Eye tracking can be used to view such displays and to specify positional information. However, the eyes are also used extensively during normal human activities.
- In a further discussion of an embodiment with respect to security, data collected from the face, eye(s), or voice constitute unique biometric data of the user or user groups, if desired. These collected data can be used to generate a unique private key in a system of public key and private key cryptography. Cryptographic systems have been widely used for information protection, authentication, and access control for many years and as such are well known in the art of information security.
- An additional component for the continuous exchange of secure information with a biometrically identified individual is the encryption of all transmitted (i.e., sent or received) data. Data encryption has a long history that pre-dates the electronic computer. A number of well-established methods have been developed to protect the confidentiality, integrity, and authenticity of data. Most encryption techniques make use of one or more secret keys or security codes that can be used to encrypt and/or decipher data streams. Keys used to encode or decipher data streams can originate from a number of sources including previously transmitted data sequences, identification codes embedded during the manufacture of a device, and usage counts.
- Encryption and deciphering methods that make use of transposition, substitution, repositioning, masking, translation tables, and/or pre-defined numeric sequences are well-known in the art. More sophisticated techniques utilize multiple methods applied to larger blocks (i.e., more than a single character or byte) of information. In addition, encryption and deciphering methods that include a processing step within a protected hardware component are generally more protected from attempts at decoding compared to those implemented using software stored on some form of memory device.
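Two of the classical techniques named above, substitution through a translation table and transposition, can be sketched in a few lines. This is a toy illustration only; real systems combine stronger primitives over larger blocks, and the XOR-based translation table and five-column key here are arbitrary:

```python
def substitute(data: bytes, table: bytes) -> bytes:
    """Byte-wise substitution through a 256-entry translation table."""
    return bytes(table[b] for b in data)

def transpose(data: bytes, key: list) -> bytes:
    """Columnar transposition: permute each key-sized block, padding
    the final block with zero bytes if needed."""
    n = len(key)
    data += bytes((-len(data)) % n)
    out = bytearray()
    for i in range(0, len(data), n):
        block = data[i:i + n]
        out.extend(block[k] for k in key)
    return bytes(out)

# Toy translation table: XOR each byte with 0x5A (its own inverse).
TABLE = bytes(b ^ 0x5A for b in range(256))
KEY = [2, 0, 3, 1, 4]
INVERSE = [0] * len(KEY)
for j, k in enumerate(KEY):
    INVERSE[k] = j  # inverse permutation for deciphering

cipher = transpose(substitute(b"EYE SIGNAL", TABLE), KEY)
plain = substitute(transpose(cipher, INVERSE), TABLE)
assert plain == b"EYE SIGNAL"
```

Deciphering simply applies the inverse permutation and the (self-inverse) translation table in reverse order, which is why concealing the table and key is essential to such schemes.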
- Field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs) are particularly useful as encrypting and deciphering components. In fact, Streicher et al. (U.S. Pat. No. 8,363,833) and others describe processes whereby even the bit stream used to program an FPGA for encryption is itself encrypted. Concealing both security keys and the methods used to decipher secure information greatly reduces the risk of anyone other than the intended recipient gaining meaningful access to an encrypted data stream.
- As further background, Bluetooth had its origins in 1998 with the release of the 1.0 specification, with a subsequent release in 2000 of what was called 1.0b. These early releases were designed to remove wires from the desktop of a user; these included considerations for serial, headset, cordless phone, and LAN connections. However, these early versions had many problems and manufacturers had difficulty making their products interoperable. Subsequent releases of Bluetooth 1.1, 1.2, and 2.0 included expanded bandwidth, profile capability, and finally, in release 2.1, new levels of security, including what is now called Secure Simple Pairing (SSP).
- SSP allows two devices to establish a link key based on a Diffie-Hellman key agreement and supports four methods to authenticate the key agreement. One of these methods, called the Passkey Entry method, uses a PIN (i.e., personal identification number) entered on one or both devices. However, the Passkey Entry method has been shown to leak this PIN to any attacker eavesdropping on the first part of the pairing process. If the attacker can prevent the pairing process to successfully complete and the user uses the same PIN twice (or a fixed PIN is used), the attacker can mount a man-in-the-middle attack on a new run of the pairing process.
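For illustration, the Diffie-Hellman exchange underlying SSP's link-key establishment can be sketched with a toy finite-field group. Note that SSP actually specifies elliptic-curve Diffie-Hellman (P-192 in the Bluetooth 2.1 specification); the small Mersenne-prime group and generator below are purely illustrative and offer no real security:

```python
import secrets

# Toy finite-field Diffie-Hellman; illustrative parameters only.
P = 2**127 - 1   # a Mersenne prime; far too small for real use
G = 3            # illustrative generator choice

a = secrets.randbelow(P - 2) + 1   # device A's secret exponent
b = secrets.randbelow(P - 2) + 1   # device B's secret exponent
A = pow(G, a, P)                   # sent in the clear to device B
B = pow(G, b, P)                   # sent in the clear to device A

shared_a = pow(B, a, P)            # computed privately by device A
shared_b = pow(A, b, P)            # computed privately by device B
assert shared_a == shared_b        # both derive the same link-key material
```

An eavesdropper sees P, G, A, and B but not the secret exponents; the Passkey Entry weakness described above lies not in the key agreement itself but in how the PIN is used to authenticate the exchanged public values.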
- Today there are numerous papers on the security risks that challenge even the most sophisticated protocols. New systems and methods are needed to ensure what is termed nonrepudiation; "nonrepudiation" generally means an assurance that someone cannot deny something. In this case, it means that a qualified user cannot deny the authenticity of his or her use of a device, or of any communications or messages that originate from that device.
- Although the best understanding of the present invention will be had from a thorough reading of the specification and claims presented below, this summary is provided in order to acquaint the reader with some of the new and useful features of the present invention. Of course, this summary is not intended to be a complete litany of all of the features of the present invention, nor is it intended in any way to limit the breadth of the claims, which are presented at the end of the detailed description of this application.
- In this disclosure, voluntary eye movements that are intended to interact with a computing device are referred to as “eye signals.” Eye signal controls are described in Systems and Methods for Biomechanically-based Eye Signals for Interacting with Real and Virtual Objects [Attorney Docket No. EYE-023], application Ser. No. 14/______, filed May 8, 2015, the entire disclosure of which is expressly incorporated by reference herein.
- Processes for identification of a device user are described in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification [Attorney Docket No. EYE-024], application Ser. No. 14/______, filed May 8, 2015, the entire disclosure of which is expressly incorporated by reference herein.
- In accordance with one embodiment, apparatus, systems, and methods are provided for a head-mounted device (HMD) that includes at least one processor connected to at least one imager, where at least one of the imagers is oriented toward the eye(s) of a user. The processor is configured to at least one of substantially continuously, simultaneously, and/or periodically determine eye signal estimation, where the eye signal estimation is determined by the processor using the imager to detect at least one glint from the surface of the eye that has been generated from a light source attached to the HMD or where the imager detects one or more distinctive features of the eye; and determine biometric data of a user including facial features, voice, or iris data of a user, where the biometric data is used for the identification and authentication of the user for access and control of at least one of the HMD, a connected device, a wireless device, and a remote server.
- In accordance with another embodiment, apparatus, systems, and methods are provided that may substantially continuously, periodically, and/or on demand perform iris recognition utilizing a head-mounted device. Biometric identification during the formation of eye-signal controls may be used within a wide range of applications in which user identification and/or authentication are required in real time.
- In another embodiment, systems and methods are disclosed in which eye-signal control sequences are used for authentication of a user for at least one of withdrawing money from an automated teller machine (ATM) and making online purchases.
- Another embodiment discloses systems and methods to authenticate a user for online activities, including at least one of: private, group, or other testing; complying with performance requirements coupled with identity for various forms of employment, such as professional driving, piloting, or other transportation; logging hours; confirming acknowledgement of informed consent provided orally or read by a user, with continuous confirmation of identity during saccadic and other eye-based movements while reading; and confirming acknowledgement of any legally binding agreement.
- Another embodiment discloses systems and methods for combining identifying characteristics with other security tokens including at least one of information tokens (passwords), physical tokens (keys), produced tokens (speech, gestures, writing), and other biometric tokens such as fingerprint and voiceprint.
- Another embodiment discloses systems and methods that describe a plurality of system configurations, including:
-
- Storing multiple user codes where searching and matching is performed entirely on an HMD.
- Sending user code(s) to a specific processor for identification and matching.
- Sending user code(s) to the cloud.
- Augmenting or replacing common password-based access to computing devices.
- Substantially continuously re-verifying the identity of the device wearer.
- Another embodiment discloses systems and methods for capturing an image of the iris, or locally generating an irisCode (e.g., as described in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015) from the iris image, and transmitting the iris information to the cloud for authentication of any HMD.
- Another embodiment discloses systems and methods that replace or augment common password-based access to computing devices.
- Another embodiment discloses systems and methods to use a specific implementation of the continuous biometric identification (CBID) approach (e.g., as described in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015) to “buy at the aisle” by using eye-signal methods or processes referred to as “look to buy.”
- Another embodiment discloses systems and methods for displaying (on an HMD or remote display device) information, including cost, about the item.
- Another embodiment discloses systems and methods for object recognition used to identify items for purchase that are simply viewed within the environment of the user.
- Another embodiment discloses systems and methods that establish a true identity of a user wearing the HMD.
- Another embodiment discloses systems and methods that prevent user identity fraud and identity theft.
- Another embodiment discloses systems and methods that use the HMD to authenticate users for at least one of educational or legal purposes.
- Another embodiment discloses systems and methods that use the HMD for the purpose of authenticating a purchase, where the authenticated purchase is for online purchase security and offline purchase security, where offline includes at a retail establishment.
- Another embodiment discloses systems and methods that use an HMD that includes a second imager connected to the processor and oriented outward from the HMD, where the second imager detects a code that can be decoded by the processor, where the code is one of a bar code and a QR (i.e., quick response) code, and where the decoded data represents information about a product.
- Another embodiment discloses systems and methods that use the information related to a product to allow an authenticated user to securely purchase the product.
- Another embodiment discloses systems and methods that allow an entity to initiate a secure communication channel with another entity by mutual gaze where the security of the communication channel may be established prior to communication and may be revalidated continuously or at intervals during communication.
- Another embodiment discloses systems and methods that enable a secure protocol for coordination among parties to cause an action to occur whereupon each party performs some action during which time their identities are substantially continuously verified with CBID.
- Another embodiment discloses systems and methods that increase security when using a HMD device by limiting access to functional blocks in a silicon chip that supports eye-tracking for the HMD device.
- Another embodiment discloses systems and methods that manage, coordinate, filter, and/or sequence the stimulus provided by one or more wearable devices associated with the identity of a user.
- In another embodiment, systems and methods are provided that include a dynamically evolving cognitive architecture for a system based on interpreting the gaze-based intent of a user. Natural eye movement is interpreted by the system, and used for real-time image services. An illumination device comprising a processor and a camera is worn by the user, with the camera aimed toward an eye of the user. The system includes memory with stored instructions. When the instructions are executed the system receives eye measurement data from the camera aimed at the user's eye. The data is used to determine a first state of the eye, and compare it to data captured from a second state of the eye. When the system determines that the first and second states of the eye are the same, further instructions are sent to at least one processor in the system.
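The receive-determine-compare-dispatch sequence described in this embodiment can be sketched as follows. The measurement fields, tolerances, and instruction string are hypothetical, standing in for whatever eye measurement data the camera actually supplies:

```python
from dataclasses import dataclass

@dataclass
class EyeState:
    gaze_x: float          # horizontal gaze angle, degrees (hypothetical)
    gaze_y: float          # vertical gaze angle, degrees
    pupil_diameter: float  # millimeters

def same_state(first: EyeState, second: EyeState,
               gaze_tol: float = 1.0, pupil_tol: float = 0.5) -> bool:
    """Return True when two measured eye states match within tolerance."""
    return (abs(first.gaze_x - second.gaze_x) <= gaze_tol
            and abs(first.gaze_y - second.gaze_y) <= gaze_tol
            and abs(first.pupil_diameter - second.pupil_diameter) <= pupil_tol)

def on_frame(first: EyeState, second: EyeState, dispatch) -> None:
    """When the first and second states match, send further
    instructions to a processor via the dispatch callable."""
    if same_state(first, second):
        dispatch("execute-eye-signal")
```

In practice the comparison would operate on richer gaze and feature data, but the control flow — measure, compare states, then forward instructions only on a match — is the pattern the summary describes.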
- So as to reduce the complexity and length of the Detailed Specification, and to fully establish the state of the art in certain areas of technology, Applicant(s) herein expressly incorporate(s) by reference all of the following materials identified in each numbered paragraph below. The incorporated materials are not necessarily “prior art” and Applicant(s) expressly reserve(s) the right to swear behind any of the incorporated materials.
- Applicant(s) believe(s) that the material incorporated by reference herein is “non-essential” in accordance with 37 CFR 1.57, because it is referred to for purposes of indicating the background of the systems and methods herein. However, if the Examiner believes that any of the above-incorporated material constitutes “essential material” within the meaning of 37 CFR 1.57(c)(1)-(3), applicant(s) will amend the specification to expressly recite the essential material that is incorporated by reference as allowed by the applicable rules.
- The inventors are also aware of the normal precepts of English grammar. Thus, if a noun, term, or phrase is intended to be further characterized, specified, or narrowed in some way, then such noun, term, or phrase will expressly include additional adjectives, descriptive terms, or other modifiers in accordance with the normal precepts of English grammar. Absent the use of such adjectives, descriptive terms, or modifiers, it is the intent that such nouns, terms, or phrases be given their plain, and ordinary English meaning to those skilled in the applicable arts as set forth above.
- Further, the inventors are fully informed of the standards and application of the special provisions of 35 U.S.C. §112, ¶6. Thus, the use of the words “function,” “means” or “step” in the Detailed Description or Description of the Drawings or claims is not intended to somehow indicate a desire to invoke the special provisions of 35 U.S.C. §112, ¶6, to define terms or features herein. To the contrary, if the provisions of 35 U.S.C. §112, ¶6 are sought to be invoked to define features of the claims, the claims will specifically and expressly state the exact phrases “means for” or “step for”, and will also recite the word “function” (i.e., will state “means for performing the function of [insert function]”), without also reciting in such phrases any structure, material or act in support of the function. Thus, even when the claims recite a “means for performing the function of . . . ” or “step for performing the function of . . . ”, if the claims also recite any structure, material or acts in support of that means or step, or that perform the recited function, then it is the clear intention of the inventors not to invoke the provisions of 35 U.S.C. §112, ¶6. Moreover, even if the provisions of 35 U.S.C. §112, ¶6 are invoked to define the claimed features, it is intended that the features not be limited only to the specific structure, material, or acts that are described in the embodiments, but in addition, include any and all structures, materials or acts that perform the claimed function as described in alternative embodiments or forms, or that are well known present or later-developed, equivalent structures, material or acts for performing the claimed function.
- A more complete understanding of the present invention may be derived by referring to the detailed description when considered in connection with the following illustrative figures. In the figures, like reference numbers refer to like elements or acts throughout the figures. The present embodiments are illustrated in the accompanying drawings, in which:
-
FIG. 1 is a front view of a human eye. -
FIG. 2 is a section view of a human eye from the side. -
FIG. 3A depicts a top down view of an eye showing the regions of vision. -
FIG. 3B shows an example approximation of the sizes of the regions of vision. -
FIG. 4 depicts overall system architecture. -
FIG. 5 depicts architecture of the eye signal object. -
FIG. 6 depicts abstracted hardware. -
FIG. 7 depicts typical Bluetooth architecture. -
FIG. 8 depicts hardware interface and hardware components. -
FIG. 9 depicts imaging architecture. -
FIG. 10 depicts biometric data generation. -
FIG. 11 depicts a breakdown of a cognitive load manager. -
FIG. 12 depicts system components of the cognitive load manager. -
FIG. 13 depicts a HMD (head mounted device) connecting to a mobile device. -
FIG. 14 depicts a HMD connecting to the cloud. -
FIG. 15 depicts a HMD connecting to home and vehicle controls. -
FIG. 16 depicts communication between a HMD and a NEST™ thermostat system. -
FIG. 17 shows system architecture on a HMD communicating with the cloud. -
FIG. 18 depicts a breakdown of the data manager. -
FIG. 19 shows the system architecture of a HMD with the processing capabilities moved to the cloud. -
FIG. 20 shows further evolution of a HMD towards the Internet of Things. -
FIG. 21 depicts the system architecture from a HMD moved to a remote server. -
FIG. 22 depicts a HMD with all processing pulled off to the cloud. -
FIG. 23 depicts a HMD and the remote server communicating. -
FIG. 24 depicts a HMD communicating with home control systems. -
FIG. 25 depicts a HMD communicating with social media. -
FIG. 26 depicts a HMD communicating with home entertainment systems. -
FIG. 27 depicts a HMD communicating with vehicle entertainment systems. -
FIG. 28 depicts a HMD communicating with vehicle control systems. -
FIG. 29 is a flow chart of steps taken to perform an online, secure purchase. - In the following description, and for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various aspects of the embodiments. It will be understood, however, by those skilled in the relevant arts, that the apparatus, systems, and methods herein may be practiced without these specific details. It is to be understood that other embodiments may be utilized and structural and functional changes may be made without departing from the scope of the apparatus, systems, and methods herein. In other instances, known structures and devices are shown or discussed more generally in order to avoid obscuring the embodiments. In many cases, a description of the operation is sufficient to enable one to implement the various forms, particularly when the operation is to be implemented in software. It should be noted that there are many different and alternative configurations, devices, and technologies to which the disclosed embodiments may be applied. The full scope of the embodiments is not limited to the examples that are described below.
- The anatomy of the eye is well known in the art. For the purposes of this disclosure, relevant anatomy is depicted and described.
FIGS. 1 and 2 generally depict the anatomy of the human eye 100. FIG. 1 is a front view of the eye 100 showing the pupil 145, iris 115, sclera 150, limbus 245, pupil/iris boundary 250, upper eyelid 105, lower eyelid 110, and eyelashes 235. FIG. 2 is a section view of the eye 100 showing the pupil 145, iris 115, retina 52, sclera 150, fovea 160, lens 165, and cornea 170. The pupil 145 is the approximately round dark portion at the center of the eye that expands and contracts to regulate the light the retina 52 receives. The iris 115 is the colored portion of the eye 100 that surrounds and controls the expansion and contraction of the pupil 145. The sclera 150 is the white region of the eye 100 that surrounds the iris 115. The sclera 150 contains blood vessels and other identifiable markers. The limbus 245 is the outer edge of the iris 115 next to the sclera 150. The pupil/iris boundary 250 is where the pupil 145 and the iris 115 meet. The eyelids 105, 110 and eyelashes 235 surround and occasionally partially cover or obscure portions of the eye 100 during blinks, eye closures, or different angles of viewing. The retina 52 is the sensory membrane that lines the eye 100, receives images from the lens 165, and converts them into signals for the brain. The fovea 160 is an indentation in the retina 52 that contains only cones (no rods) and provides particularly acute vision. The lens 165 is the nearly spherical body of the eye 100 behind the cornea 170 that focuses light onto the retina 52. The cornea 170 is the clear part of the eye covering the iris 115, pupil 145, and the lens 165. -
FIGS. 3A and 3B depict the foveal, parafoveal, and peripheral ranges of vision. The foveal region 190 extends about two degrees outward from a user's gaze point. An approximation of this region is a US penny held at an adult's arm length. The parafoveal region 195 is the viewable area outside the foveal region 190, generally from two to ten degrees from a user's gaze point. An approximation of the ten-degree parafoveal visual field is a circle with a four-inch diameter held at an adult's arm length. The peripheral region 197 is outside of the parafoveal region 195 and generally extends from ten to thirty degrees out. -
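The approximations above follow from simple trigonometry: the linear extent subtending a visual angle θ at viewing distance d is 2·d·tan(θ/2). A sketch, assuming a hypothetical 700 mm adult arm length:

```python
import math

def visual_extent_mm(angle_deg: float, distance_mm: float) -> float:
    """Linear diameter subtending a given visual angle at a distance."""
    return 2 * distance_mm * math.tan(math.radians(angle_deg) / 2)

ARM_MM = 700.0  # assumed adult arm length

foveal = visual_extent_mm(2, ARM_MM)       # ~24 mm: on the order of a US penny
parafoveal = visual_extent_mm(10, ARM_MM)  # ~122 mm: roughly a four-inch circle
```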
FIG. 4 depicts an overall system architecture, including a processor 1035 with non-volatile flash memory 1050, D-RAM memory 1040, a hardware abstraction layer (HAL) 1030, and physical connections 1235 to external hardware; an operating system (OS) 1025; and software and/or firmware 1000 that handles the middleware services for the HMD, operates as a Visual Disambiguation Service (VDS) interface termed IRIS (Interface for Real-time Image Services) for the HMD, and is operable as a software control object. Above the middleware services is a software layer 1015 containing software to facilitate the integration of the IRIS object with a third-party application; also above the middleware services is a set of software tools 1020 used for third-party hardware integration and debug, including operations like single-stepping and break-pointing through the Joint Test Action Group (JTAG), supported by the IEEE 1149.1 Standard Test Access Port and Boundary-Scan Architecture. Above the software tools and integration layer is an application programming interface (API) 1010, followed by applications 1005. Additionally, the system includes public 1635 and private key generators 1630 for added security. -
FIG. 5 depicts the overall system architecture including software blocks identified as a power manager 1140 and device manager 1120. Power management schemes are derived from one or more open standards such as the Advanced Configuration and Power Interface (ACPI). - The ACPI has three main components: the ACPI tables, the ACPI BIOS, and the ACPI registers. Unlike its predecessors, such as the APM or PnP BIOS, the ACPI implements little of its functionality in the ACPI BIOS code, whose main role is to load the ACPI tables into system memory. Instead, most of the firmware ACPI functionality is provided in ACPI Machine Language (AML) bytecode stored in the ACPI tables. To make use of these tables, the operating system must have an interpreter for the AML bytecode. A reference AML interpreter implementation is provided by the ACPI Component Architecture (ACPICA). At BIOS development time, AML code is compiled from ASL (ACPI Source Language) code. To date, the most recent release of the ACPI standard was in 2011.
- For wearable computing, future systems may be implemented without a full operating system and its robust support. The power management scheme and ACPI elements discussed above will need to be pulled up to an application control level, giving the application and the user dynamic control of the power scheme. As discussed below, in highly abstracted embodiments, the ACPI might be implemented in a split or distributed fashion. The current standard does not fully anticipate the challenges of wearable computing devices like the HMD disclosed in this specification; therefore, additional considerations for operating HMD systems in multiple modes are disclosed.
- An HMD may include a low-power mode of operation that may be deployed during times when no eyes are detected. This typically occurs when the user removes the headwear or when the headwear has shifted out of place on a user's head. This functionality could be implemented in silicon as a system on a chip (SOC).
- At any time, the device can be re-mounted by the original user or worn by a new user. For purposes of device calibration (e.g., to account for anatomical variations among individuals) and/or user authentication, it is desirable for the device to be capable of determining the identity of registered users when re-mounted or re-deployed. This can include loading a new set of configuration/calibration parameters and differentiating identities between the previous and new user; including halting, pausing and/or concealing the outputs of any ongoing programs launched by the previous user.
- Typically under the old standard, once the Operating System Power Management (OSPM) activates ACPI, it takes over exclusive control of all aspects of power management and device configuration. The OSPM implementation also exposes an ACPI-compatible environment to
hardware drivers 1197, each of which in turn impacts the system, device, and processor states; these are managed globally as Power States, which include Global States, Device States, Processor States, and Performance States. - Power consumption is an omnipresent concern, particularly if the device is not worn for an extended period. A commonly deployed solution to this issue is an "off" switch that completely powers down an electronic device. However, the time and inconvenience of "powering up" a headset device are restrictive, particularly, for example, if the device has only been removed from the head momentarily.
- Low-power HMD and eye-signal control anticipates these issues by using at least one technique comprising:
-
- modifying the Processor States by reducing clock rates to processor(s),
- modifying Performance States by confining processing to a low power processor or portion of a processor,
- modifying Device States by imaging at a reduced frame rate,
- modifying Global States by turning the camera off or into a low-power mode between images,
- reducing illumination,
- collecting and/or processing images with reduced spatial resolution,
- limiting algorithms (particularly those associated with searching for iris boundaries) to low-spatial resolution modes,
- relaxing stringency measures during irisCode comparisons, and
- fabricating specific dedicated hardware (a chip or SOC) that operates in a low-power mode and does not "power up" the full device until a low-level authentication has occurred. Further, such a chip or SOC could prohibit access to other embedded functionality, or to connected or wirelessly connected devices, until authentication is performed, possibly following power-up and a determination that a user's eye is viewing a display or other target object.
- This specific dedicated hardware can utilize modern methods of “hybrid” chip manufacturing that can segment a portion of circuitry to operate in an extremely low power mode. This hybrid circuitry effectively builds a “firewall,” preventing an unauthorized user from fully powering up or utilizing a device.
- Another application of low-power HMD modes is when a low-battery state is sensed. Instead of running a device until all global functions cease, a “graceful degradation” model is implemented as part of the new class of Power State for HMDs. “Graceful degradation” can include algorithmic approaches by limiting the use of more power-hungry (i.e., generally more sophisticated) image processing and other routines; as well as any number of the hybrid and hardware approaches to reduce power while maintaining at least partial functionality, discussed above. Low-power modes for the processor and critical operations continue until the battery finally runs out of power, the unit is plugged into a central power source, or the device is placed sufficiently close to an inductive charging station.
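The techniques above can be organized as a small set of discrete power modes selected by eye presence and authentication status. The mode names, settings, and values in this sketch are hypothetical, not drawn from the disclosure:

```python
from enum import Enum

class PowerMode(Enum):
    ULTRA_LOW = "ultra_low"   # no eye detected; camera mostly off
    LOW = "low"               # eye present but not yet authenticated
    FULL = "full"             # authenticated user

# Hypothetical per-mode settings reflecting the listed techniques:
# reduced frame rate, reduced illumination, reduced resolution,
# and reduced processor clock in the lower modes.
SETTINGS = {
    PowerMode.ULTRA_LOW: {"frame_rate_hz": 0.5, "illumination": 0.0,
                          "resolution": "low", "cpu_clock": "min"},
    PowerMode.LOW:       {"frame_rate_hz": 10,  "illumination": 0.3,
                          "resolution": "low", "cpu_clock": "reduced"},
    PowerMode.FULL:      {"frame_rate_hz": 60,  "illumination": 1.0,
                          "resolution": "full", "cpu_clock": "max"},
}

def select_mode(eye_present: bool, authenticated: bool) -> PowerMode:
    """Pick a power mode from eye presence and authentication state."""
    if not eye_present:
        return PowerMode.ULTRA_LOW
    return PowerMode.FULL if authenticated else PowerMode.LOW
```

A graceful-degradation policy would add a battery-level input that caps the selectable mode, stepping frame rate and illumination down before any function ceases entirely.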
- Another power management concern for all forms of wearable computing is that more sophisticated algorithmic eye tracking and user interface techniques can draw upon faster or parallel central processing units (CPUs), but generally these approaches require more power. Greater power consumption results in larger and/or heavier batteries, and/or shorter device use times between recharging or replacing batteries.
- An alternative or adjunct to the deployment of more/faster CPUs is the use of embedded or distributed processing approaches. These can be implemented within a variety of hardware components including field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), complex programmable logic devices (CPLDs) and hybrid devices that can include system-on-chip (SoC) configurations.
- Embedded or distributed processing can facilitate existing, CPU-based approaches by off-loading computationally intensive routines. Hardware dedicated to performing these routines can be faster (often requiring only one, or just a few, clock cycles) and utilize less power (often by greater than an order of magnitude). Distributed processing can also facilitate new algorithmic approaches that are generally not feasible (within time and power-consumption constraints) using CPUs. Distributed processing is particularly valuable within algorithms that require repeated and/or simultaneous application of calculations to be performed on large data sets such as video images. These are further discussed below in the sub-section Distributed Processing.
- Another embodiment utilizes low-power distributed processing to detect whether the device has been removed from the head. In order to implement an “instant on” capability, the device must “sense” whether it is mounted on the wearer's head or has been removed. A method to perform this function (without adding additional hardware) is to determine if an eye can be viewed within eye-tracking camera images. As described elsewhere, power consumption can be reduced when the device is not in use (i.e. removed from the head) by a reduced frame rate, low resolution imaging, lower CPU clock rate, etc.
- For low-power eye-presence measurements, illumination can be eliminated or reduced by reducing the power of the illuminating LEDs, reducing the number of LEDs turned on, and/or only turning on illuminator(s) when actually sampling camera images (at reduced frame rates). A substantial reduction in power can also be attained by embedding relatively simple eye-geometry detection routines in distributed processing hardware. One example is a focus filter: a convolution filter that determines whether an in-focus image (i.e., of an eye) is present. Such a filter would be classified as a high-pass spatial filter that detects the presence of high-spatial-contrast edges. The absence of such edges indicates that the device has been removed from the head, since a defocused image is generally present in that case (i.e., absent a high-contrast object located at the approximately 25 mm focal distance of the camera). Another approach is to detect a dark (i.e., pupil) region adjacent to a white (i.e., sclera) region. When an in-focus eye is detected, the device "powers up" (recognizing that it was not completely powered off) for higher-resolution eye tracking.
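The focus-filter idea can be sketched as a gradient-energy (high-pass) measure over a grayscale image: a sharp pupil/sclera edge produces large local gradients, while a defocused scene does not. The threshold, image sizes, and pixel values here are hypothetical:

```python
def focus_measure(image):
    """Mean squared horizontal+vertical pixel gradient: a simple
    high-pass focus metric. Low values suggest a defocused scene,
    i.e., no eye at the camera's ~25 mm focal distance."""
    h, w = len(image), len(image[0])
    total = 0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = image[y][x + 1] - image[y][x]
            gy = image[y + 1][x] - image[y][x]
            total += gx * gx + gy * gy
    return total / ((h - 1) * (w - 1))

def eye_present(image, threshold=100.0):
    """Decide eye presence from edge energy (threshold is illustrative)."""
    return focus_measure(image) >= threshold

# A sharp dark/bright edge (dark pupil next to white sclera) versus
# a uniform, defocused field.
sharp = [[0] * 4 + [255] * 4 for _ in range(8)]
blurred = [[128] * 8 for _ in range(8)]
```

In dedicated hardware this reduces to a handful of adders and comparators per pixel, which is what makes it suitable for an always-on, ultra-low-power presence check.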
- In another embodiment, the device may include a micro electro-mechanical system (MEMS) such as an accelerometer or rate sensor for determining motion. When the device is not being worn it may operate at an ultra-low power mode in which it is not intermittently searching for the presence of an eye. In the ultra-low power mode, the device may only search for the presence of an eye when movement of the device is detected, for instance when a user picks up the device. When movement is detected, the device may initiate a scan in search of an eye or eyes at predetermined intervals (for instance every two seconds) or substantially continuously for a period of time (for instance one minute) as set by user preferences. If the device fails to detect an eye in the pre-set time interval it may resume ultra-low power mode or it may cycle through a low power mode prior to resuming ultra-low power mode. Should an eye or eyes be detected, the device will switch into full power mode or into a power settings scheme as set by the preferences of the detected user. The primary device owner (administrator) may set the overall system power schemes that will govern the power mode settings for the device when it is not in use. Additionally, the device owner may lock down changes to the power schemes such that other users are unable to edit them.
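The motion-triggered scanning logic described above can be sketched as a small state machine, using the example two-second scan interval and one-minute window; the state names and timestamp convention are hypothetical:

```python
class WakeScanner:
    """Ultra-low-power wake logic: search for an eye only after the
    MEMS accelerometer reports motion, at a preference-set interval,
    within a preference-set window. Timestamps are in seconds."""

    def __init__(self, scan_interval_s=2.0, scan_window_s=60.0):
        self.scan_interval_s = scan_interval_s
        self.scan_window_s = scan_window_s
        self.scan_until = 0.0  # end of the current scan window; 0 = idle

    def on_motion(self, now):
        """Accelerometer event: open (or extend) the scan window."""
        self.scan_until = now + self.scan_window_s

    def should_scan(self, now, last_scan):
        """True when inside the window and the interval has elapsed."""
        return (now < self.scan_until
                and now - last_scan >= self.scan_interval_s)

    def on_scan_result(self, eye_found, now):
        """Advance the power state after one eye-search attempt."""
        if eye_found:
            self.scan_until = 0.0
            return "full_power"        # or the detected user's power scheme
        if now >= self.scan_until:
            return "ultra_low_power"   # window expired with no eye found
        return "scanning"
```

The administrator-set preferences mentioned above would simply parameterize `scan_interval_s` and `scan_window_s`, with edits locked for non-owner users.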
-
FIG. 5 depicts a further breakdown of the IRIS Object 1000, including eye tracking 1100, a module for tracking the eye gaze of a user; eye data 1105 for user identification using biometric data of the user such as facial, speech, and iris identification; eye control 1110 for relating the user's eye gaze to a display; and the iUi™ interface (an interface comprising eye-signal controls) 1116. Eye signals 1115 gleaned from eye movements are used to interact with the user interface iUi 1116 and with display screen(s) and images on the display. - Included in the IRIS Object 1000 are a number of software modules operative as managers of certain functions, for example, but not limited to: -
-
Device Manager 1120 allows a user to control hardware attached to the system, whether it be the imager in the HMD, an attached cell phone, or a vehicle Infotainment system. - Data Manager/Personal Data Management 1125 enables secure storage of and access to user data such as e-mail and messages. The Data Manager 1125 also may include one or more of password-management software, web-browser favorites, and cryptographic software. Advanced data management can include, as examples, setting up and establishing a Virtual Private Network (VPN) and terminal services with an external processor, whether local or accessible through the Internet. -
Communications Manager 1130 is designed to pass information from one system to another, provides remote access to systems, and transmits files in a multitude of formats between computers. The Communications Manager 1130 also may include link management and data routing. -
Security Manager 1135 refers to software steps or measures used to protect the HMD and user from threats, viruses, worms, malware, or remote hacker intrusions, including preventive-control techniques that safeguard the system and its data from being stolen or compromised. -
Power Manager 1140 manages device power schemes to optimize and maximize a user's experience and device battery life. -
Display Manager 1145 has two basic functions: how the eye is tracked to objects on the screen in the mode of the iUi 1116; and what is displayed to the user in the HMD. The Display Manager 1145 also has the ability to transfer all or portions of the screen in the HMD to an external display, such as a computer screen, the dashboard of a vehicle, or a home entertainment monitor such as a TV.
-
- Turning now to a further discussion of the Security Manager 1135, a number of additional controls may be included, for example audio/video (A/V) control 1150; speech control 1155; or something more complex, e.g., cognitive load control 1160 (FIGS. 11 and 12). - The HAL 1030 (
FIG. 6) includes the "hosted" aspects of external hardware systems; this generally includes software developed specifically for integration with the IRIS platform. The hardware anticipated, but not limited to, includes a Bluetooth interface 1170 (discussed separately below as one embodiment) and a TCP/IP interface 1175 to any form or type of TCP communications, including 802.3 (wired interface), 802.11, 802.15 (WPAN, Wireless Personal Area Networks other than Bluetooth), and 802.16 (WiMAX); this includes stacks that support the network and transport software 1195 and the physical links to wired or wireless systems. In addition, there are considerations for other systems through the Hardware IF 1180, which interfaces with external software and/or hardware drivers through physical links 1235; these physical links can be I2C, USB, serial, or proprietary. - In a discussion of one embodiment, as an example, the Bluetooth system has been selected in a non-limiting example of an embodiment because it is so pervasive in the rapidly growing market of mobile devices. As an example, today almost all vehicles have what are called Infotainment systems; these are a combination of entertainment, such as music and videos, and information, where the information could come from within the vehicle as data from a sensor, from control of a system like a heater or lights, or from the Internet. Most of these systems use wired and wireless technologies to connect to the vehicle and/or the Internet. Today, the wireless connections to the vehicle are generally Bluetooth, established through a set of standard interfaces; these are referred to as Profiles 1300 (
FIG. 9) and are hosted in a processor above the Bluetooth radio, further shown in FIG. 5 as the Hands Free Profile (HFP) 1187, Advanced Audio Distribution Profile (A2DP) 1193, Audio Video Resource Control Profile (AVRCP) 1190, etc. - To date, it has not been anticipated that an
HMD 600 would be used in a vehicle to control vehicle operations 915, including the Infotainment system 910; therefore, incorporation of one of the newest Profiles, the Human Interface Device (HID) Profile, is inevitable. FIG. 6 depicts a Bluetooth systems architecture, including connections to the profiles, network and transport 1230, and the data link 1250 and modem 1255. -
FIG. 7 depicts the Bluetooth architecture 1205 broken down into its subcomponents. Underlying all of these protocols is a key piece of Bluetooth termed the Service Discovery Protocol (SDP) 1310, which includes what is called Secure Simple Pairing (SSP). SSP is required by all Bluetooth standards above v2.1. Secure Simple Pairing uses a form of public key cryptography, which can help protect against so-called "man in the middle" (MITM) attacks. Generally, the Bluetooth HID 1185 specification requires Security Mode 4 for pairing and bonding two devices together, citing that it should not be possible to perform pairing or bonding to any Bluetooth HID Host or Device without physical access to both the Bluetooth HID Host and the Bluetooth HID device. Bluetooth HID Hosts and Bluetooth HID devices that support bonding use some form of non-volatile memory to store the 128-bit link keys and the corresponding BD_ADDRs, as well as the type of each link key (authenticated, unauthenticated, or combination). In the case of an HMD, such physical access is limited, as there is no mouse or keyboard in the conventional sense. However, there are other ways to establish a secure link that have not been anticipated by Bluetooth, even though Bluetooth acknowledges the precepts of public key cryptography. - In another example, a Bluetooth HID Host that accepts sensitive information from Bluetooth HID devices may be implemented to only accept sensitive information from reports that are contained in a top-level application collection of "Generic Desktop Keyboard" or "Generic Desktop Keypad." Furthermore, such a Bluetooth HID Host may require MITM protection when pairing with any Bluetooth HID device with a Bluetooth HID report descriptor that contains a top-level application collection of "Generic Desktop Keyboard" or "Generic Desktop Keypad," which in turn contains any of the following sets of usage codes and their descriptions:
-
- IC—irisCode: the result of applying pattern-recognition techniques to images of an eye to quantify the epigenetic patterns within an iris into comparable bit-patterns for the purpose of biometric identification.
- EIC—Encrypted IC: an irisCode that has been encrypted so that it cannot be reverse engineered to an original image of the iris or any other iris-based, derived parameter.
- TEIC—Target EIC: an identified EIC in which a match with an IC computed from an image of an eye indicates association and thus, a positive biometric identification.
- CBID—Continuous Biometric Identification: the repeated process of biometric identification that can be performed either on a headset device or remotely by transmitting EICs, or images of one or both eyes to a remote processor. CBID can occur at a fixed rate (e.g. 30 times per second) or an asynchronous rate (e.g. each time the device is moved or re-mounted).
- The following table refers to the transmission and comparison of EICs; however, it is algorithmically possible to convert images of eyes into ICs and subsequently into EICs. Thus, CBID can equivalently involve comparisons and/or the exchange of information involving images of irises, ICs, EICs or other derived parameters. Similarly, databases used for biometric comparisons could equivalently (for the purposes of identification) contain ICs, EICs, images of eyes, images of faces (including eyes), images of irises, so-called “unfolded” (i.e. expressed in polar coordinates) iris images, or other derived parameters. Therefore, references to exchanges or comparisons of EICs also refer to the exchange or comparison of any other derived data sets for the purpose of biometric identification.
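As a sketch of the underlying IC comparison (the disclosure's EIC exchange would operate on protected representations; the 0.32 threshold below is a commonly cited iris-code decision figure assumed here for illustration, not a value from this disclosure):

```python
def hamming_fraction(ic_a: int, ic_b: int, bits: int = 2048) -> float:
    """Fraction of disagreeing bits between two iris codes held as integers."""
    mask = (1 << bits) - 1
    return bin((ic_a ^ ic_b) & mask).count("1") / bits

def cbid_match(ic: int, target_ic: int, threshold: float = 0.32) -> bool:
    # Two captures of the same iris disagree on relatively few bits, so a
    # fractional Hamming distance below the threshold indicates a positive
    # biometric identification; deployments tune the threshold to their
    # own false-match budget.
    return hamming_fraction(ic, target_ic) < threshold
```

For CBID, such a comparison would simply be repeated at a fixed rate or on each re-mount event, as described above.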
- Table 1 below illustrates a set of programmable attributes that can be assigned to a new Bluetooth profile:
-
TABLE 1
Example Headset Configuration for a Bluetooth SDP Transaction
Attribute: ExtendedProtocolDescriptorList
Attribute ID: 0x36 0x0C (data element sequence, 12 octets)
Value: 0x020F (next BT HID revision); 0x36 0x01 (Boolean8, 1 of 1)
Description: following BT HID 1.1, recognizes the single TEIC stored within the headset, with no off-headset communication required for CBID
Example: stand-alone; sole owner/user of a device that is otherwise inoperative - Other examples of programmable attributes include, but are not limited to:
-
- recognize an individual member of a family, all of whom are permitted to use a device (e.g. dynamically loads calibration factors associated with each user)
- pay-per-view rental of a public HMD
- enabling multiple, general-use headsets available to employees within a business
- online purchase from an online “store” in which a user has been registered with no restrictions on the device used to make a purchase
- online purchase from an online “store” in which both a user and specific headset have been registered
- determine if user is on a “no-fly” list
- confidential list of traditional passwords
- taking an examination within a massively online course
- medical records made available to primary care doctor, specialist, and patient
- structured advertising based on the demographic of an identified viewer
- a device license agreement sent to a user
- confirmation that all components of a legal document have been viewed
- confirmation that a notice of changes in terms and conditions has been sent to a user
- confirmation of informed consent related to legal documents
- pre-flight inspection by a pilot
- identification of a vehicle driver and possibly identifying pre-accident driver distractions
- retrieve e-mail based on user identification where another user would be offered separate e-mail access
- outgoing text and e-mail can (optionally) be tagged to indicate the CBID user is the author
- electronically “sign” legal documents
- administration of an examination that must take place in the presence of both student and an instructor
- exchange of personal information between/among people who have just met
- purchases made in a bricks-and-mortar store that requires no check stand
- remote control of door opening for authorized personnel only
- gaining or restricting access to a building based on user identification
- tracking an individual under house arrest
- tracking an individual restricted from entering a casino or interacting with another individual
- ensure the legal sale of alcohol or other age-restricted materials to an individual
- automatic 911 call with user identification (that can be linked to medical history), “vitals” and geographic location
- based on CBID, interact with an automated teller machine
- gaining access to highly secure military sites
- proving the historical activities of an individual under investigation
- restricting access to the audio/video of a private conversation
- restricting access to the audio/video of a conference to participants
- restricting access of a data set of a private conversation to CBID participants
- when/where is the last time I saw my car keys?
- list hockey (versus basketball or some other sport) scores first
- control household thermostat by an adult (not a child)
- remotely turn on household entry lights
- In a further discussion of the image process and use of the imager or video: in the embodiment described above, a system is anticipated where the HMD is implemented with a single conventional 2D imager system oriented either toward the eye and face or outward facing toward the scene. However, in an alternate embodiment, consideration is given to an HMD implemented with multiple imagers oriented toward the face and eyes as well as the scene, where the multiple imagers generate a stereoscopic 3D video image. In addition to stereoscopic 3D images, other forms of 3D image generation have been anticipated by the applicant. Today, non-contact three-dimensional cameras, or digitizers, generally fall into four categories: stereoscopic digitizers (as mentioned above), silhouette digitizers, timing digitizers, and projected-pattern digitizers. The underlying 3D surface-imaging technologies can further be summarized in terms of four broad categories: Spatial Phase Imaging (SPI), Triangulation, Time of Flight (TOF), and Coherent approaches.
- Spatial Phase Imaging generally relies on the polarization state of light as it emanates from surfaces to capture information about the shape of objects. Triangulation employs the location of two or more displaced features, detectors, and/or illuminants to compute object geometry. Two important triangulation subcategories are stereo correspondence (STC) and stereoscopy (STO). Stereo correspondence cameras determine the location of features in a scene by identifying corresponding features in two or more offset intensity images using 3D geometry to compute feature locations. Stereoscopic cameras rely on human biological systems (eyes, brain) to create a notion of a 3D scene from two images taken from different vantage points and projected into the eyes of a viewer. Finally, coherent methods rely on a high degree of spatial and/or temporal coherence in the electromagnetic energy illuminating and/or emanating from the surfaces in order to determine 3D surface geometry.
-
FIGS. 8 and 9 depict imager object code 1415 for either a 2D or 3D implementation. Regardless of the technology employed, any system implemented must consider two key factors: human-fidelic visualization (completely realistic display) and visual intelligence (automated vision). Human-fidelic visualization creates a visual notion of a scene in the mind of a human that is as realistic, or almost as realistic, as viewing the scene directly. An imaging system has to be 3D to be human-fidelic, since human sight is 3D. The second factor, visual intelligence, means sensing and analyzing light to understand the state of the physical world. Automatic recognition of human emotions, gestures, and activities are examples of visual intelligence. 2D video cameras struggle to provide a high level of visual intelligence because they throw away depth information when a video is captured. As a consequence of neglecting depth, 2D images of 3D scenes are inferior to 3D images: 3D images have better contrast (the ability to distinguish between different objects). Real video of real scenes typically contains dozens of instances where contrast and depth ambiguity make it difficult for automated systems to understand the state of the scene. - 3D video cameras do everything that 2D cameras do, but add the benefits just discussed. It is inevitable that
single-lens native 3D video will eventually replace the 2D video offered today by providing two interesting benefits: human-fidelic visualization and improved visual intelligence. It is reasonable to assume that global production of most cameras will shift to 3D as 3D cameras become cost effective, simple to operate, and compact, and produce visual fidelity. With this in mind, the technology emerging today as the most likely to reach mass markets in terms of cost, complexity, and fidelity is Spatial Phase Imaging among the broad 3D imaging categories discussed. This technology relies on commercially available imagers implementing a micro-polarizing lens over four sub-pixels, resulting in an ability to rapidly determine small changes in reflected light, computing a vector as a direction cosine for each pixel and generating a three-dimensional value in terms of X, Y, and Z-depth; truly a single-lens native 3D video. - In another embodiment, the accuracy of both CBID and eye-signal control processes can be improved via the use of more than a single camera to view an eye. Images acquired substantially simultaneously or sequentially from multiple cameras can be used to 1) create on-axis (i.e., perpendicular to the surface) views of different regions of the eye; 2) view surfaces with specular reflections (particularly glints) located at different positions within images of the eye; 3) allow for viewing of fine structures while maintaining the ability to view over a wide spatial range; 4) increase eye-tracking accuracy by making multiple measurements based on multiple views of glints and eye structures; and 5) view "around" obscuring objects such as eyelids and lashes.
- Another area where distributed/embedded processing is particularly valuable is in the "off-loading" of operations that are computationally intensive for a CPU. Examples of such a "hybrid" approach (i.e., mixing CPU and embedded processing) within eye-tracking and iris-identification algorithms include subroutines that perform Fast Fourier Transforms (FFT), random sample consensus (RANSAC), so-called StarBurst feature extraction, and trigonometric functions.
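As a sketch of one such off-loadable routine, the following is a minimal RANSAC fit, here recovering a line from points contaminated with gross outliers, much as an eye-tracking pipeline must fit a pupil boundary despite glints and eyelash occlusions (the point set and tolerances are illustrative):

```python
import random

def ransac_line(points, iters=500, tol=0.1, seed=0):
    """Fit y = m*x + b by random sample consensus: repeatedly fit a line
    to two random points and keep the model with the most inliers."""
    rng = random.Random(seed)
    best = (0.0, 0.0, -1)  # (m, b, inlier_count)
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical sample; skip
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = sum(1 for x, y in points if abs(y - (m * x + b)) < tol)
        if inliers > best[2]:
            best = (m, b, inliers)
    return best

# 20 exact samples of y = 2x + 1 plus 5 gross outliers (stand-ins for glints)
pts = ([(x, 2 * x + 1) for x in range(20)]
       + [(3, 40), (7, -15), (11, 80), (15, 0), (18, 99)])
```

The inner inlier-counting loop is the part a distributed processor would execute repeatedly, which is why it is a natural candidate for off-loading.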
- Included within the system architecture are methods for managing cognitive load for safety, optimized performance, and general well-being for individuals and groups. Generally, the concept of cognitive load extends from tactical fighter programs and activities that generally relate to situation awareness. These in turn drive
cognitive load control 1160 in programs like cockpit workload management; cognitive load control 1160 generally deals with the human mind interacting with some external stimulus. - The definition of cognitive load is slightly different in different fields; for example, in an academic sense, cognitive load refers to the total amount of mental activity imposed on working memory at any instant in time, while in the ergonomics literature it is described as the portion of operator information-processing capacity, or resources, required to meet cognitive task demands. Each field provides different methods to measure cognitive load.
- Cognitive load is considered herein as the mental effort or demand required for a particular user to comprehend or learn some material or complete some task. Cognitive load is relative to both the user (i.e., their ability to process novel information) and the task being completed (i.e., complexity), at any single point in time. It is attributable to the limited capacity of a person's working memory and their ability to process novel information.
- Conventional methods for measuring cognitive load include:
-
- 1. subjective measures, such as self-rating scales;
- 2. physiological techniques, such as pupil dilatation, heart rate and galvanic skin responses;
- 3. task or performance based measures, such as critical error rates and task completion times; and
- 4. behavioral measures, such as speech pathology (e.g., impairment, self-talk, etc.)
- There are a number of problems with these methods for measuring cognitive load, including:
-
- 1. some of the methods are intrusive and disrupt the normal flow of performing the task;
- 2. some of the methods are physically uncomfortable for the user;
- 3. some methods cannot be conducted in real-time as they are too intensive;
- 4. the data quality is potentially unreliable outside laboratory conditions; and
- 5. the data quality can be affected by outside factors, such as a user's stress level.
-
FIG. 12 depicts system components for a cognitive load manager 1160 that addresses many of these issues. In one embodiment, mobile, wearable, implanted, consumed, and other physiologically integrated computers employ increasingly sophisticated and varied sensors, data-input methods, data-access methods, and processing capabilities that capture, access, and interpret more and more data that can be used as sensory input to the brain and impact cognitive activity. The data comprise physiological data 1815 and environmental data 1810. The data are used to better establish a user's preferences for the integration, management, and delivery of information to the head-mounted unit.
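An illustrative, greatly simplified fusion of physiological data 1815 (pupil diameter) with environmental data 1810 (ambient light) might look like the following; the weighting scheme and light-reflex correction are assumptions for the sketch, not the disclosed algorithm:

```python
from statistics import mean, pstdev

def pupil_load_index(baseline_mm, current_mm):
    """Task-evoked pupillary response expressed as a z-score against a
    resting baseline of pupil-diameter samples (in millimeters)."""
    mu, sigma = mean(baseline_mm), pstdev(baseline_mm)
    return 0.0 if sigma == 0.0 else (current_mm - mu) / sigma

def fused_load(pupil_z, ambient_lux_change, w_pupil=1.0, w_env=0.5):
    """Discount pupil dilation that coincides with a drop in ambient light,
    since the pupillary light reflex dilates the pupil for non-cognitive
    reasons; weights are illustrative."""
    return w_pupil * pupil_z - w_env * max(0.0, -ambient_lux_change)
```

This addresses problem 5 in the list above (outside factors corrupting the physiological signal) by using the environmental channel to correct the physiological one in real time.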
FIGS. 13-15 depict three different system architectures for connecting the HMD to another device or to the Internet. FIG. 13 depicts the HMD 600 connecting through a local link, such as Bluetooth, to a mobile device 710 carried by the user; the mobile device 710 is connected via link 155 to a packet-switched network 700 typically provided by a wireless carrier (what today is generally referred to as a packet network, also known as the World Wide Web), with subsequent connection to either a web-based service, a database, or an external application 160. -
FIG. 14 depicts the HMD 600 including a wireless transmitter 750 that is either embedded in or attached to the HMD for connection directly to the Internet 700 and a service provider 160. -
FIG. 15 depicts the HMD 600 including a wireless transceiver 750 connected via a link 725 directly to the Internet, where the local link is generally a packet link but could use other proprietary wireless protocols. In this configuration, the HMD is independent of other smart devices; essentially, the HMD is connected directly to the Internet all of the time. Today, if a user wants to connect a smart device, or now an HMD, to another system for the control and operation of that system, the user would simply implement a local connection through a Bluetooth profile. In the case of home audio, the user would need to use the Audio Video Transport Profile, Audio Video Resource Control Profile, or Advanced Audio Distribution Profile. If a user wanted to connect to a vehicle, he or she would need to implement the Hands Free Profile. Simpler and less complex systems are needed, along with methods to connect to these systems, especially if the user is beyond the range of a local connection to the system he or she wants to control. - To solve this new challenge,
FIG. 16 depicts another embodiment where an HMD is implemented in an "abstracted" real-time server-browser, cloud-based architecture, known today as the "Internet of Things" or IoT. The key to any abstraction layer is the ability to abstract away some device or software operational or strategic complexity; this could include proprietary aspects, including trade secrets and intellectual property. The abstraction can support extended or new business models for a technology supplier. A good example of this architecture is the NEST™ Labs business model. This model could be loosely referred to as a "razor/razor blade" model; in this case, the NEST™ thermostat is the razor and the NEST™ Services are the razor blades; simply stated, the business model includes the sale of the thermostat and a monthly recurring service. In addition to the sale of hardware and services, this business model supports data harvesting about a user in his home. In this system, the thermostat serves data off to a centralized server for the purposes of "learning." - Even though NEST™ products can be accessed via the Internet, they cannot be directly connected to by a smart device for the control and operation of a home heating system.
FIG. 16 depicts an HMD 600 connected via a packet network 155 to the Internet 700. In order to access his or her home thermostat, the user needs to access his or her page on the NEST™ Services server 965. However, the traditional roles of web server and browser have been expanded under the new HTML5 standard. There has been what looks like a role reversal of the server and browser: the web server is now the smart thermostat, and this server simply serves small amounts of data to a fixed URL in the cloud running a browser. This browser in the cloud can be accessed by a user using a smart device or computer from virtually anywhere to read or interact with the thermostat. Using the web server in this role is now a key, underlying concept of the IoT, one where complexity and cost are greatly reduced. - Now re-thinking
FIG. 15, in view of the IoT: access to home entertainment, home security systems, or for that matter any home appliance (washers, dryers, refrigerators, etc.) for monitoring, control, and operation will be implemented differently. Further, considering the IoT architecture, the head-mounted device 600 could be connected to any consumer, industrial, or commercial device located anywhere in the world on the cloud; a user could control that device via eye interaction with the included display, using eye signals defined as a standardized command set mapping the eye signals to communication, diagnostics, control, and interaction with the device(s). - This new model abstracts away complexity and cost; as an example, a model where the HMD may not require Bluetooth or, for that matter, distributed intelligence. It is inevitable that two things will happen in the near future: first, wireless bandwidth will continue to grow exponentially, with gigabit service on the horizon; and second, the IoT architecture will continue to deploy as it does today, very rapidly. What are needed are methods and systems for how a standalone head-mounted system will strategically evolve within this rapidly evolving ecosystem.
FIGS. 17-23 depict an abstraction transition model from a smart head-mounted system to the much simpler model depicted in FIG. 23. - Starting with the end of the transition first,
FIG. 23 depicts a cloud-based implementation within an IoT architecture of an HMD 600 connected by a very high-speed packet-based link, a wireless link that would rival or potentially outperform the typical communication bus in a local processor. These processor busses operate as subsystems of the processor to facilitate transfer of data between computer components or between computers. Typical bus types include the front-side bus (FSB), which carries data between the CPU and the memory controller hub; the direct media interface (DMI), which is a point-to-point interconnection between an integrated memory controller and an I/O controller hub in the processor; and the QuickPath Interconnect (QPI), which is a point-to-point interconnect between the CPU and the integrated memory controller. Other high-speed busses have been used in the embedded computing industry, including SPI for inter-processor communication. What is not currently anticipated is that under cloud-based architectures and distributed computing, much of the intelligence will reside outside of the connected devices. HTML5 and JSON are good examples of markup and data-interchange formats optimized for distributed computing. To include audio, video, and scalable vector graphics, operating systems will evolve to meet these new distributed architectures, likely using much simpler "publish subscribe" access. - With the above in view, and with respect now to the HMD operating on the cloud, the HMD is connected to a centralized server-
browser 800 that operates the Visual Disambiguation Service (VDS) interface termed IRIS (Interface for Real-time Image Services); think of this operating much as SIRI (Speech Interpretation and Recognition Interface) does for audio. The IRIS service performs the complex disambiguation of eye movement for the real-time interpretation, determination, and prediction of a user's intent. IRIS, like SIRI, operates in the cloud. 1126 and 1127 represent the IRIS abstraction layer discussed above. The HMD now operates with a minimum amount of software: a processor richer in features, configured with a limited or possibly no operating system, using a publish/subscribe messaging scheme. - At the beginning of the transition, the embedded IRIS (e-IRIS) 1111 includes a number of tools or utilities operating in the FOG as a combined real-time service. These include a
data manager 1125, device manager 1120, communication manager 1130, power manager 1140, and security manager 1135a. In the e-IRIS abstraction 1127, there are counterpart managers, with a slight exception in the security manager 1135b; this will be discussed below in more detail. -
FIG. 23 also depicts the eye management tools centralized in a cloud-based version in support of a user. These include an eye tracker 1100, eye data 1105 in support of security, eye control 1110, eye signals 1115, and iUi 1116 for an eye user interface. In addition to these elements, other real-time services are available and associated with IRIS, including an audio-video manager 1150, speech manager 1155, cognitive load manager 1160, and a context manager 1165. The combination of these services and architecture constitutes IRIS. - Back now to
FIG. 17 and an initial embodiment, the HMD 600 is wirelessly connected to a smart device (such as a smart phone, a tablet, or a home or office PC) or simply to the Internet through an 802.11 link. All of the services operate in the HMD 600 processor or are stored in a memory associated with the HMD 600. This embodiment would operate as a stand-alone computer, with an operating system and micro-processor(s) and/or other logic elements. In a first transition step of the first embodiment, some of the non-real-time applications are off-loaded to applications run on the local smart phone 710, local PC, or other smart devices. However, this first transition embodiment would still be highly dependent on the locally available resources in the HMD 600 to operate as intended. -
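The "publish subscribe" access pattern mentioned above can be sketched with a minimal in-process broker; a deployed IoT system would instead use a wire protocol such as MQTT, and the topic names here are hypothetical:

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based publish/subscribe hub: publishers and
    subscribers never reference each other directly, only topics."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver to every subscriber of the topic; unknown topics are no-ops.
        for cb in self._subs[topic]:
            cb(payload)

# Hypothetical usage: the HMD publishes gaze events, and a vehicle
# Infotainment service consumes them without knowing about the HMD.
broker = Broker()
received = []
broker.subscribe("hmd/gaze", received.append)
broker.publish("hmd/gaze", {"x": 0.42, "y": 0.17})
```

This decoupling is what lets much of the intelligence move off the device: subscribers can live in the cloud while the HMD remains a thin publisher.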
FIG. 18 depicts a second transition step wherein the data manager 1125 takes on a new role. In addition to managing data on and off the HMD 600, the data manager is configured to manage some of the data either on or off board the HMD 600 using a markup or data-interchange language, such as JSON (JavaScript Object Notation), HTML 4.01, or HTML 5. The object of this transition step is to implement a web server-browser relationship in the HMD 600. In this case, some of the data acquired by the imagers, audio input, or any other sensors available to the HMD 600 are served to the cloud and directed by a fixed URL to a cloud-based IRIS, where a user's browser page resides and his/her data are aggregated. This second transition supports non-real-time data applications; as an example, the HMD 600 is used for the transmission of data that have been collected and stored by a user. The user may capture a photograph, an audio clip, a video clip, or other user physiological data related to the eye or the user's health; these data are then transferred to IRIS for storage, aggregation, or possible subsequent dissemination (discussed in more detail below). -
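A sketch of how such captured data might be packaged as JSON before being served to the user's fixed URL; all field names here are illustrative assumptions, not part of the disclosed interface:

```python
import json
import time

def make_report(user_id, kind, data, timestamp=None):
    """Assemble one sensor report as a JSON string for upload to the
    user's cloud-based IRIS page (hypothetical schema)."""
    return json.dumps({
        "user": user_id,
        "kind": kind,   # e.g. "photo", "audio", "pupil_diameter"
        "ts": time.time() if timestamp is None else timestamp,
        "data": data,
    })
```

In the non-real-time transition step described above, reports like this could be queued on the headset and uploaded opportunistically whenever a link is available.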
FIGS. 19, 20, and 21 depict a third step in the transition, where the wireless bandwidth is now near real-time. A web server and browser relationship exists operationally in parallel with a now more mature e-IRIS 1111 in the HMD 600 and IRIS 800 in the cloud. They operate and interact with each other in near real-time across the abstraction layer. The security manager 1135 resident in the HMD 600 takes on the role of generating a private key and public key based on certain biometrics, as disclosed in Systems and Methods for Discerning Eye Signals and Continuous Biometric Identification, filed May 8, 2015. Data collected from the face, eye, or voice constitute unique biometric data of the user, or of user groups if desired. These data can be used to generate a unique private key in a system of public-key and private-key cryptography. - As background, cryptographic systems have been widely used for information protection, authentication, and access control for many years. These cryptosystems are generally categorized as symmetric key cryptosystems and public key cryptosystems. Symmetric key cryptosystems use the same key for encrypting and decrypting secret information; however, using the same key can be problematic: 1) if the key is compromised, security cannot be assured; and 2) if there are multiple users, multiple keys are needed, which may increase system costs and jeopardize data security. Public key cryptosystems can overcome these limitations by using a pair of cryptographic keys (i.e., a private key and a public key). The private key used for decryption is kept secret, whereas the public key used for encryption may be distributed to multiple users. Therefore, secrecy of the private key is a major challenge when it comes to achieving high levels of security in practical cryptosystems.
- As one example, the irisCode of the user, possibly combined with other biometric data, is used to establish a unique key that subsequently generates the private key-public key pair. The public key generated from the user's unique biometric aspects is sent to IRIS 800 for storage in the security manager portion of the user's browser, FIG. 22 1135 b. The private key is never stored, but is generated in the HMD 600 every time a user instantiates a session. When the user dons the HMD 600, the private key is generated, FIG. 21 1129, and authenticated in IRIS 800. This ensures levels of non-repudiation and security currently not available in web applications, especially in e-commerce. -
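Regenerating the private key each time the device is donned, rather than storing it, can be sketched as deterministic key derivation from a stable biometric template. This is a simplification: raw iris readings vary between captures, so a practical system would first apply a fuzzy extractor to obtain a stable template; the function and parameter names here are illustrative assumptions.

```python
import hashlib

def derive_private_scalar(iris_code: bytes, context: bytes) -> int:
    """Regenerate a private scalar from a stable biometric template
    each session, so the key itself need never be stored. Assumes
    `iris_code` has already been stabilized (e.g. by a fuzzy
    extractor); a raw iris capture would not hash reproducibly."""
    digest = hashlib.sha256(context + iris_code).digest()
    return int.from_bytes(digest, "big")

# Same user, same context: the same key appears every session.
k_monday = derive_private_scalar(b"stable-iris-template", b"IRIS-session")
k_friday = derive_private_scalar(b"stable-iris-template", b"IRIS-session")
# A different user's template yields an unrelated key.
k_other = derive_private_scalar(b"different-user-template", b"IRIS-session")
```

Because the derivation is deterministic, losing or replacing the HMD loses nothing: the key reappears wherever the user's eyes do.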
FIG. 23 depicts the final step in the transition to a real-time HMD 600. The Internet is now prolific and operates at speeds in excess of processor buses. IRIS 800 is cloud-based and, for all intents and purposes, real-time. Data are collected and aggregated in IRIS 800. IRIS 800 now implements advanced algorithms based on learning about the physiology of the human eye, as well as the user generally; disambiguation in IRIS 800 is enhanced to the point that IRIS 800 can predict what and where a user wants to see or do. The user's HMD 600 is a commodity: low cost, low power, and immediately replaceable. - The final step abstracts all of the intelligence for the device to the cloud 700. CBID, now cloud 700 based, is substantially continuous and real-time. Since the generation of the private key is unique to a user, any user can pick up any HMD 600 and use it at any time; they simply slip it on and are looking at their browser page, where all of their personal information now resides. If their HMD 600 is stolen, the information is secure. If a user loses their HMD 600, there is no worry; they can simply borrow one or buy a new one. The CBID and cloud 700 aspects of IRIS 800 abstract the device at a new level: they abstract the user, much as HMI and displays abstract the device today. - As discussed above in the NEST™ home thermostat model, the thermostat is only accessible through the NEST™ Services portal and page. In this implementation, the
HMD 600 is securely connected to IRIS 800 and a user's page. If the user wants to access their thermostat, IRIS connects them directly and securely to the NEST™ Services portal 965. This model extends to XFINITY: if a user wants access to his/her account to set a recording, or access to an XFINITY service, IRIS will connect them directly to the XFINITY portal 970. Further, if the user wants access to their COZYHOME application, again, the link is securely made to the appropriate server, in this case 975. - As discussed above, IRIS 800 may be linked to a user's social media account, giving the user real-time access. FIG. 25 depicts how IRIS 800 would securely connect a user to their Google+ account to see postings, or to post in near real-time information they want to share. Social Media 920 comprises the social media services available to a user. - Shifting now to real-time cloud-based
IRIS 800 and its extended capabilities: eye signals will be substantially continually aggregated and analyzed for its users. This makes IRIS 800 a unique service and development platform for applications and services associated with contextualized eye data (CED). IRIS 800 includes a context manager 1165 in both e-IRIS 1111 in FIG. 17 and IRIS 800 in FIG. 23; its role is to generate Contextualized Eye Data (CED). CED begins with eye data extracted from episodic and/or substantially continuous monitoring of one or both eyes. These eye data include eye movements such as saccades, fixations, dwells, pursuits, drift, tremors, and micro-saccades. Eye data also include blinks and winks, squints, pupil dilation, blood vessel patterns, iris and pupil size, feature locations, and internal eye-structure size, shape, and location. A key aspect of CED is using these data to detect behavior changes over time. - CED is the correlation of eye data with other classes of data over time to extract relationships for meaningful prediction, measurement, analysis, interpretation, and impact on the user. As an example, three classes of
data IRIS 800 will have aggregated are raw data, semantic data, and evoked data. - Raw data comprises data captured by any sensors, whether in the
HMD 600 or present on or in a person. Today, there are many new wearable sensors used in sports or health, and these new systems all have wireless capability. IRIS 800 can take this raw data from an individual and correlate it with eye data. Examples include, but are not limited to, sensors that capture: movement, GSR (galvanic skin response), temperature, heart rate and heart rate variability (HRV), EOG (electro-oculogram), EEG (electroencephalogram), EKG (electrocardiogram), facial muscle movement and skin movement, internal organ or biological system status and performance, scent, audio, scene and images for a range of electromagnetic radiation (visible light, IR, UV, and other electromagnetic frequencies), location (GPS and other beacon sensors), time monitoring/tracking, and more.
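Correlating a raw sensor stream with eye data, as described above, can be as simple as computing a correlation coefficient over synchronized samples. The following sketch uses Pearson's r on illustrative (invented) heart-rate and pupil-diameter samples; the sensor values are not from any real dataset.

```python
def pearson(xs, ys):
    """Pearson correlation between two synchronized sensor streams,
    e.g. heart rate against pupil diameter sampled at the same
    instants."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

heart_rate = [62, 70, 85, 90]      # illustrative wearable samples (bpm)
pupil_mm   = [3.1, 3.4, 4.0, 4.3]  # illustrative pupil diameters (mm)
r = pearson(heart_rate, pupil_mm)  # strong positive correlation here
```

A CED platform would run such correlations continuously and across many sensor pairs, looking for relationships that hold up over time.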
- Evoked data are extracted from conscious or subconscious individual response to visual, tactile, olfactory, taste, audio, brain, or other sensory, organ, or biological responses to intentional stimuli.
- To date, the capture of data associated with “eye-tracking” has been primarily enabled with expensive, stationary, “remote” eye-tracking devices, situated in front of displays oriented towards users eyes, for limited durations (measured in minutes) for specific tests; or expensive, dedicated purpose, wearable eye-tracking devices, sometimes packaged as glasses, placed on users for limited durations, in limited contrived environments, for specific tests.
- Eye-tracking data have primarily been captured indoors due to the technology's inability to function well in high-infrared (outdoor) environments without substantial filtering or shielding of ambient IR light, further reducing the practicality, breadth, and quantity of eye-data capture. As such, high quality, environmentally diverse, high-volume data across diverse “natural” use cases have been limited due to the expense, limited portability, constrained form-factor, high-power requirements, high-computing requirements, limited environmental robustness, and dedicated “data capture” utility of eye-tracking technology and devices. While early research on the data captured has shown promise for extraordinary insights into human health, cognition, and behavior, the general capture of such data has been highly constrained to specific tests and environments for short durations.
- The first generation of IRIS-integrated HMDs may be worn by millions of people in a broad range of life activities. In the disclosed transition plan, these data may be collected by IRIS first as historical data, then in near real-time, and ultimately in real-time. Should this transition occur, it could increase by orders of magnitude the quantity, quality, and contextualization of the eye data that are captured. IRIS could then correlate these data with a broad range of other personal and aggregated data, such as individual and group health, cognition, and behavior. IRIS may then use the aggregated data to provide insights into eye data correlated with personal health, cognition, and behavior as a starting point for self-quantification, self-improvement, and self-actualization.
- IRIS will support applications for extracting patterns from large datasets that will expose and predict future behavior, such as our likelihood of adopting a new habit, our interest in acquiring a product, or our likelihood of voting for a new politician. The list below gives examples of the types of measurements and predictions that will be afforded by IRIS' contextualized eye data; these include, but are not limited to:
-
- MEASUREMENTS
- Measuring drowsiness and fatigue
- Measuring medical conditions and trends
- Measuring reaction to drugs, food, or other comestibles
- Measuring short, medium, and long term health trends
- Measuring reading speed, focus, interest, fluency, vocabulary, areas of confusion
- Measuring knowledge, understanding, and skills
- Measuring emotional state and reactions to stimuli
- Measuring interest and emotional reaction to people, places, and things
- Measuring recognition and familiarity with people, places, and things
- Measuring focus and cognitive load
- Measuring improvement in performance of specific and general tasks
- Measuring effectiveness and satisfaction with IRIS
- PREDICTION
- Predicting the onset of a medical condition or disease
- Predicting the incidence of a specific health event such as a seizure or panic attack
- Predicting weight gain or loss
- Predicting the general improvement of health
- Predicting the likelihood of adopting a new behavior or a bad habit
- Predicting the likelihood of succeeding at a task or endeavor
- Predicting an automobile accident
- Predicting the rate of improvement of an athletic skill
- Predicting the market success of a new product
- Predicting the rise or fall of a specific stock or the stock market
- Predicting a political outcome, political stability, and political unrest
- Impacting learning, work, play, understanding, socialization, creativity, energy, focus, attitude, motivation, and all things that make us human today, and that will drive the enhancement and evolution of humanity and our species.
- The IRIS application and tools positively impact the user of the HMD by contextualizing the eye data that are aggregated. IRIS technology will advance the user's performance in many dimensions and will enhance their human-to-human interactions as well as their human-machine interactions.
- The key aspect common to all of these is IRIS's role as a real-time secure abstraction.
FIGS. 26-28 depict other portals for secure access to a user's information where, again, the common element is IRIS 800. Further, the private key stored in IRIS can be related to a password for the user, which greatly simplifies the user's interaction on the web, including secure transactions. - In accordance with other embodiments, systems and methods are provided to enhance security and convenience during online shopping.
FIG. 29 depicts a user performing a setup process, which needs to occur only once, in which the user links their public key with account information. For increased security, a bank or other financial institution responsible for the account might verify other forms of target (i.e., intended) user identity and offer the linkage process as a service. Once linked, online purchase selections and transactions can be performed by a user with their HMD in a seemingly instantaneous fashion. - In another embodiment of secure shopping, real-time knowledge of a device wearer's identity allows financial particulars to be exchanged electronically with each item as it is selected and purchased. This eliminates the need to repeatedly enter passwords, security questions, or account information for each transaction or group of transactions. As a consequence, such an instantaneous purchasing system eliminates the processes involved with so-called online shopping "carts," since there is no longer a need to cluster items for the purpose of entering account information. Solely for customer convenience, groups of items purchased during an online shopping session can be treated as a cluster or summarized for the purchaser.
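A per-item purchase under continuous identity verification can be sketched as each selection producing an immediately authenticated transaction record, with no cart. The HMAC tag scheme, function name, and field names below are illustrative assumptions, not the patent's actual wire format.

```python
import hashlib
import hmac
import json
import time

def authorize_purchase(session_key: bytes, item_id: str,
                       price_cents: int) -> dict:
    """Sketch of a per-item CBID purchase record: because the wearer's
    identity is continuously verified, each selection can settle
    immediately. The tag binds the record to the session key derived
    from the wearer's biometrics."""
    body = {"item": item_id, "price_cents": price_cents, "ts": time.time()}
    msg = json.dumps(body, sort_keys=True).encode()
    body["tag"] = hmac.new(session_key, msg, hashlib.sha256).hexdigest()
    return body

receipt = authorize_purchase(b"key-derived-from-iris", "SKU-123", 1999)
```

Each such record stands alone, which is why no cart is needed: items never have to be clustered just to amortize the cost of entering account credentials.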
- In accordance with another embodiment, systems and methods are provided to enhance security and streamline shopping at so-called "bricks and mortar" retail outlets. In this case, a camera mounted on the headwear device that views the environment of the device wearer can be used to identify objects that may be of interest for purchase. Identification can be based on bar codes or quick-response (i.e., QR) codes that are commonly attached to purchasable items. Such object identification uses image processing methods that are well known in the art.
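Once a bar code has been read from the scene camera, validating it is straightforward. As one concrete step of the identification pipeline, the standard EAN-13 check-digit rule (odd positions weighted 1, even positions weighted 3) can reject misreads before any price lookup; this is a generic sketch, not the patent's specific method.

```python
def ean13_is_valid(code: str) -> bool:
    """Validate the check digit of a 13-digit EAN bar code, the kind
    of identification an HMD scene camera could perform on a
    purchasable item before initiating a CBID transaction."""
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    # Positions 1..12 (0-indexed: even -> weight 1, odd -> weight 3).
    checksum = sum(d * (3 if i % 2 else 1)
                   for i, d in enumerate(digits[:12]))
    return (10 - checksum % 10) % 10 == digits[12]
```

A valid scan would then key a lookup in the retailer's item database; an invalid one would trigger a re-read rather than a bad transaction.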
- Information about the item including a proposed purchase price can be generated by a processing unit associated with the retail outlet. This information can then be displayed on nearby monitors or on a head-mounted display associated with the device wearer. If the customer wishes to purchase a given item, a CBID-based transaction can be initiated by the customer. Such transactions can occur repeatedly throughout a store. A match between transported items and the transaction record would then allow items to be verifiably removed from the store by the customer. CBID-based retail purchases eliminate the need for check stands or tills. In many situations, the automated, real time display of information during the purchasing process also reduces the need for store clerks to assist potential customers.
- These devices are also integrating increasingly sophisticated and varied data output methods that stimulate visual, auditory, tactile, olfactory, gustatory (sense of taste), equilibrioception (sense of balance), direct neurological, indirect (wireless) neurological (neural and synaptic brainwave stimulation), chemical, biological activity, and multi-modal input sensation.
- The increased stimulation of the body and associated enhanced delivery of information to the brain can affect brain activity in subtle and profound ways. Cognitive stimulation resulting from more, varied, and faster delivery of multiple forms of input to the brain can positively impact human performance. However, cognitive overload or inappropriate stimulation can negatively impact performance, damage health, create safety hazards, and even kill.
- As mobile, wearable, implanted, consumed, and other physiologically integrated computers proliferate, a solution is needed to manage stimulation and the flow of data to the body and brain. Individuals are already applying various forms of cognitive management in technologically stimulated situations. Some methods are purely manual, while methods for intelligent, software-based management are beginning to emerge. For example, reducing audio stimulation during periods of increased, high-impact cognitive activity is commonplace. Consider a driver of an automobile turning down the radio when driving stress increases in challenging traffic, or when a driver is lost and is trying to navigate. The attention directed, consciously or subconsciously, to listening to the audio stimulus of the radio reduces input to other areas of the brain, such as visual processing. Simultaneous multi-modal stimuli, such as talking on a cell phone, likewise impact the visual task of driving.
- Reducing physical exertion during periods of higher cognitive load is another form of self-management that is commonplace. Research on “walking while talking” (WWT) shows a correlation between gait pace and rate as walkers talk. In general, walkers that become engaged in conversations requiring higher cognition typically slow their walking pace.
- A recent form of cognitive load management associated with electronic stimulation includes applications that temporarily disable email, text, and other online forms of interruption. These applications are very simple in form, however.
- This approach allows a user's customization and prioritization to improve over time as historical context, performance, biometric, and other data are accumulated and analyzed, generally forming a user profile of activities and preferences. These methods also provide a variety of techniques for dynamically managing stimuli (deferral, termination, sequencing, reprioritization, pacing, and more), and support stimuli aggregation and management across multiple individuals for risk-controlled or performance-enhanced group activity.
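The dynamic stimulus-management techniques just listed (deferral, reprioritization, and so on) can be sketched as a simple policy over pending stimuli given an estimated cognitive load. The load scale, threshold, and priority levels below are illustrative assumptions, not values from the disclosure.

```python
def manage_stimuli(cognitive_load: float, pending: list) -> list:
    """Defer low-priority interruptions while estimated cognitive
    load is high; deliver everything else. `cognitive_load` is
    assumed to be normalized to [0, 1]."""
    actions = []
    for stim in pending:
        if cognitive_load > 0.7 and stim["priority"] < 2:
            actions.append((stim["id"], "defer"))
        else:
            actions.append((stim["id"], "deliver"))
    return actions

plan = manage_stimuli(0.85, [
    {"id": "email", "priority": 1},             # deferred under load
    {"id": "collision-warning", "priority": 3},  # always delivered
])
```

A fuller implementation would draw the load estimate from the eye data itself (pupil dilation, fixation patterns) and apply the richer actions named above, such as pacing and sequencing.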
- Another embodiment is context-aware computing, 1165. In a mobile computing paradigm, it will be advantageous for applications to discover and take advantage of contextual information, such as user location, time of day, neighboring users and devices, and user activity, to specifically support collecting and disseminating context, and for applications to adapt to changing context.
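Context discovery of the kind described can be sketched as a small rule-based classifier over the listed signals (location, time of day, neighboring users). The categories and rules below are illustrative assumptions of what a context manager 1165 might compute, not the disclosed implementation.

```python
def infer_context(location: str, hour: int, nearby_users: list) -> str:
    """Minimal context inference over location, time of day, and
    neighboring users; applications would adapt their behavior to
    the returned category."""
    if location == "office" and 9 <= hour < 18:
        return "working"
    if location == "home" and hour >= 22:
        return "winding_down"
    if nearby_users:
        return "social"
    return "idle"

ctx = infer_context("office", 10, [])
```

In practice such rules would be learned from the accumulated user profile rather than hand-written, and the inferred context would feed both stimulus management and CED aggregation.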
- For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or described features can be implemented by themselves, or in combination with other operations in either hardware or software.
- Having described and illustrated the principles of the present invention in embodiments thereof, it should be apparent that the present invention may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variations coming within the scope of the following claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/708,229 US20150324568A1 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for using eye signals with secure mobile communications |
US15/237,581 US10564714B2 (en) | 2014-05-09 | 2016-08-15 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
Applications Claiming Priority (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461991435P | 2014-05-09 | 2014-05-09 | |
US201462023940P | 2014-07-13 | 2014-07-13 | |
US201462027774P | 2014-07-22 | 2014-07-22 | |
US201462027777P | 2014-07-22 | 2014-07-22 | |
US201462038984P | 2014-08-19 | 2014-08-19 | |
US201462039001P | 2014-08-19 | 2014-08-19 | |
US201462046072P | 2014-09-04 | 2014-09-04 | |
US201462074920P | 2014-11-04 | 2014-11-04 | |
US201462074927P | 2014-11-04 | 2014-11-04 | |
US14/708,229 US20150324568A1 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for using eye signals with secure mobile communications |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/708,241 Continuation-In-Part US9600069B2 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for discerning eye signals and continuous biometric identification |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150324568A1 true US20150324568A1 (en) | 2015-11-12 |
Family
ID=54368077
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/708,229 Abandoned US20150324568A1 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for using eye signals with secure mobile communications |
US14/708,234 Active 2035-06-09 US10620700B2 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US14/708,241 Active 2035-06-17 US9600069B2 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for discerning eye signals and continuous biometric identification |
US14/930,617 Active US9823744B2 (en) | 2014-05-09 | 2015-11-02 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US14/937,782 Abandoned US20160062459A1 (en) | 2014-05-09 | 2015-11-10 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US15/131,273 Pending US20160274660A1 (en) | 2014-05-09 | 2016-04-18 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US15/418,034 Active US10156900B2 (en) | 2014-05-09 | 2017-01-27 | Systems and methods for discerning eye signals and continuous biometric identification |
Family Applications After (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/708,234 Active 2035-06-09 US10620700B2 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US14/708,241 Active 2035-06-17 US9600069B2 (en) | 2014-05-09 | 2015-05-09 | Systems and methods for discerning eye signals and continuous biometric identification |
US14/930,617 Active US9823744B2 (en) | 2014-05-09 | 2015-11-02 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US14/937,782 Abandoned US20160062459A1 (en) | 2014-05-09 | 2015-11-10 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US15/131,273 Pending US20160274660A1 (en) | 2014-05-09 | 2016-04-18 | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US15/418,034 Active US10156900B2 (en) | 2014-05-09 | 2017-01-27 | Systems and methods for discerning eye signals and continuous biometric identification |
Country Status (7)
Country | Link |
---|---|
US (7) | US20150324568A1 (en) |
EP (3) | EP3140719B1 (en) |
JP (3) | JP6550460B2 (en) |
KR (4) | KR102173699B1 (en) |
CN (3) | CN106537290B (en) |
AU (3) | AU2015297035B2 (en) |
WO (3) | WO2015172124A1 (en) |
Cited By (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160246054A1 (en) * | 2015-02-23 | 2016-08-25 | International Business Machines Corporation | Interfacing via heads-up display using eye contact |
US20160308859A1 (en) * | 2015-04-14 | 2016-10-20 | Blub0X Technology Holdings, Inc. | Multi-factor and multi-mode biometric physical access control device |
US20170069159A1 (en) * | 2015-09-04 | 2017-03-09 | Musigma Business Solutions Pvt. Ltd. | Analytics system and method |
US20170090588A1 (en) * | 2015-09-29 | 2017-03-30 | Kabushiki Kaisha Toshiba | Electronic device and method |
US20170123489A1 (en) * | 2015-10-28 | 2017-05-04 | Microsoft Technology Licensing, Llc | Adjusting image frames based on tracking motion of eyes |
US9823744B2 (en) | 2014-05-09 | 2017-11-21 | Google Inc. | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
WO2018046347A1 (en) * | 2016-09-07 | 2018-03-15 | Bundesdruckerei Gmbh | Data glasses for cryptographically signing image data |
EP3376367A1 (en) * | 2017-03-13 | 2018-09-19 | Siemens Aktiengesellschaft | Acknowledgement of the transfer of a good |
KR20190022376A (en) * | 2017-08-23 | 2019-03-06 | 한국전자통신연구원 | Apparatus for self-quantification service |
US20190125264A1 (en) * | 2017-10-29 | 2019-05-02 | Orlando Efrain Abreu Oramas | Method and system of facilitating monitoring of an individual based on at least one wearable device |
US20190155896A1 (en) * | 2015-08-31 | 2019-05-23 | Ayla Networks, Inc. | Compact schedules for resource-constrained devices |
US10325083B2 (en) * | 2014-06-27 | 2019-06-18 | Intel Corporation | Wearable electronic devices |
US10353465B2 (en) * | 2016-06-08 | 2019-07-16 | South China University Of Technology | Iris and pupil-based gaze estimation method for head-mounted device |
US10397594B2 (en) | 2017-04-28 | 2019-08-27 | Hewlett Packard Enterprise Development Lp | Real-time processing of IoT data |
US20190266427A1 (en) * | 2018-02-23 | 2019-08-29 | Samsung Electronics Co., Ltd | Method of biometric authenticating using plurality of camera with different field of view and electronic apparatus thereof |
US10466778B2 (en) | 2016-01-19 | 2019-11-05 | Magic Leap, Inc. | Eye image selection |
US10554758B2 (en) | 2015-06-15 | 2020-02-04 | Blub0X Security, Inc. | Web-cloud hosted unified physical security system |
US10564714B2 (en) | 2014-05-09 | 2020-02-18 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
WO2020048778A1 (en) * | 2018-09-04 | 2020-03-12 | Robert Bosch Gmbh | Method for controlling a multimedia device, and computer program and device therefor |
CN111091595A (en) * | 2019-12-23 | 2020-05-01 | 吉林省广播电视研究所(吉林省广播电视局科技信息中心) | Strabismus three-dimensional mapping method and mapping system |
EP3648069A1 (en) * | 2018-10-29 | 2020-05-06 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for selling commodity, vending machine and storage medium |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US10733275B1 (en) * | 2016-04-01 | 2020-08-04 | Massachusetts Mutual Life Insurance Company | Access control through head imaging and biometric authentication |
US10748340B1 (en) * | 2017-07-31 | 2020-08-18 | Apple Inc. | Electronic device with coordinated camera and display operation |
US10750560B2 (en) | 2016-09-27 | 2020-08-18 | Extreme Networks, Inc. | IoT device management using multi-protocol infrastructure network devices |
US20200293744A1 (en) * | 2015-08-21 | 2020-09-17 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
EP3433707B1 (en) | 2016-03-22 | 2020-10-28 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
US10866633B2 (en) | 2017-02-28 | 2020-12-15 | Microsoft Technology Licensing, Llc | Signing with your eyes |
CN112262373A (en) * | 2018-06-26 | 2021-01-22 | 苹果公司 | View-based breakpoints |
US10996477B2 (en) | 2017-02-27 | 2021-05-04 | Advanced New Technologies Co., Ltd. | Virtual reality head-mounted apparatus |
US11099645B2 (en) | 2015-09-04 | 2021-08-24 | Sony Interactive Entertainment Inc. | Apparatus and method for dynamic graphics rendering based on saccade detection |
US11170087B2 (en) | 2017-02-23 | 2021-11-09 | Advanced New Technologies Co., Ltd. | Virtual reality scene-based business verification method and device |
WO2022015812A1 (en) * | 2020-07-14 | 2022-01-20 | Surgical Theater, Inc. | System and method for four-dimensional angiography |
US11271745B2 (en) | 2019-03-19 | 2022-03-08 | Advanced New Technologies Co., Ltd. | Method and system for operating internet of things device |
CN114223194A (en) * | 2019-08-06 | 2022-03-22 | 爱尔康公司 | Scene camera system and method for vitreoretinal surgery |
WO2022182916A1 (en) * | 2021-02-24 | 2022-09-01 | Lifebrand, Llc | System and method for determining the impact of a social media post across multiple social media platforms |
US11461444B2 (en) | 2017-03-31 | 2022-10-04 | Advanced New Technologies Co., Ltd. | Information processing method and device based on internet of things |
US11533272B1 (en) * | 2018-02-06 | 2022-12-20 | Amesite Inc. | Computer based education methods and apparatus |
US11698535B2 (en) | 2020-08-14 | 2023-07-11 | Hes Ip Holdings, Llc | Systems and methods for superimposing virtual image on real-time image |
US11706656B2 (en) | 2020-06-29 | 2023-07-18 | Qualcomm Incorporated | Downlink data prioritization for time-sensitive applications |
US11749025B2 (en) | 2015-10-16 | 2023-09-05 | Magic Leap, Inc. | Eye pose identification using eye features |
US11774759B2 (en) | 2020-09-03 | 2023-10-03 | Hes Ip Holdings, Llc | Systems and methods for improving binocular vision |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11811513B2 (en) * | 2020-12-04 | 2023-11-07 | Capital One Services, Llc | Methods and systems for managing multiple content delivery networks |
US11838419B2 (en) | 2021-01-15 | 2023-12-05 | Delta Electronics, Inc. | Method and system for monitoring industrial devices |
US11861145B2 (en) | 2018-07-17 | 2024-01-02 | Methodical Mind, Llc | Graphical user interface system |
US11953689B2 (en) | 2020-09-30 | 2024-04-09 | Hes Ip Holdings, Llc | Virtual image display system for virtual reality and augmented reality devices |
Families Citing this family (811)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9823737B2 (en) * | 2008-04-07 | 2017-11-21 | Mohammad A Mazed | Augmented reality personal assistant apparatus |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US9229233B2 (en) | 2014-02-11 | 2016-01-05 | Osterhout Group, Inc. | Micro Doppler presentations in head worn computing |
TWI439960B (en) | 2010-04-07 | 2014-06-01 | Apple Inc | Avatar editing environment |
US10463248B2 (en) * | 2011-03-02 | 2019-11-05 | Brien Holden Vision Institute Limited | Systems, methods, and devices for measuring eye movement and pupil response |
WO2013148557A1 (en) | 2012-03-26 | 2013-10-03 | New York University | Methods and kits for assessing central nervous system integrity |
US10716469B2 (en) | 2013-01-25 | 2020-07-21 | Wesley W. O. Krueger | Ocular-performance-based head impact measurement applied to rotationally-centered impact mitigation systems and methods |
US10231614B2 (en) | 2014-07-08 | 2019-03-19 | Wesley W. O. Krueger | Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance |
US11490809B2 (en) | 2013-01-25 | 2022-11-08 | Wesley W. O. Krueger | Ocular parameter-based head impact measurement using a face shield |
US10602927B2 (en) | 2013-01-25 | 2020-03-31 | Wesley W. O. Krueger | Ocular-performance-based head impact measurement using a faceguard |
US11504051B2 (en) | 2013-01-25 | 2022-11-22 | Wesley W. O. Krueger | Systems and methods for observing eye and head information to measure ocular parameters and determine human health status |
US9788714B2 (en) | 2014-07-08 | 2017-10-17 | Iarmourholdings, Inc. | Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance |
US11389059B2 (en) | 2013-01-25 | 2022-07-19 | Wesley W. O. Krueger | Ocular-performance-based head impact measurement using a faceguard |
KR102516577B1 (en) | 2013-02-07 | 2023-04-03 | 애플 인크. | Voice trigger for a digital assistant |
US10895908B2 (en) | 2013-03-04 | 2021-01-19 | Tobii Ab | Targeting saccade landing prediction using visual history |
US9665171B1 (en) | 2013-03-04 | 2017-05-30 | Tobii Ab | Gaze and saccade based graphical manipulation |
US11714487B2 (en) | 2013-03-04 | 2023-08-01 | Tobii Ab | Gaze and smooth pursuit based continuous foveal adjustment |
US9898081B2 (en) | 2013-03-04 | 2018-02-20 | Tobii Ab | Gaze and saccade based graphical manipulation |
US10082870B2 (en) * | 2013-03-04 | 2018-09-25 | Tobii Ab | Gaze and saccade based graphical manipulation |
AU2014281725B2 (en) | 2013-06-17 | 2019-10-10 | New York University | Methods and kits for assessing neurological and ophthalmic function and localizing neurological lesions |
US10884493B2 (en) | 2013-06-20 | 2021-01-05 | Uday Parshionikar | Gesture based user interfaces, apparatuses and systems using eye tracking, head tracking, hand tracking, facial expressions and other user actions |
KR101882594B1 (en) | 2013-09-03 | 2018-07-26 | 토비 에이비 | Portable eye tracking device |
US10686972B2 (en) | 2013-09-03 | 2020-06-16 | Tobii Ab | Gaze assisted field of view control |
US10310597B2 (en) * | 2013-09-03 | 2019-06-04 | Tobii Ab | Portable eye tracking device |
US9958939B2 (en) * | 2013-10-31 | 2018-05-01 | Sync-Think, Inc. | System and method for dynamic content delivery based on gaze analytics |
JP2015114865A (en) * | 2013-12-12 | 2015-06-22 | ソニー株式会社 | Information processor, relay computer, information processing system, and information processing program |
EP3090322A4 (en) * | 2013-12-31 | 2017-07-19 | Eyefluence, Inc. | Systems and methods for gaze-based media selection and editing |
US20150228119A1 (en) | 2014-02-11 | 2015-08-13 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US20160019715A1 (en) * | 2014-07-15 | 2016-01-21 | Osterhout Group, Inc. | Content presentation in head worn computing |
US9753288B2 (en) | 2014-01-21 | 2017-09-05 | Osterhout Group, Inc. | See-through computer display systems |
US9766463B2 (en) | 2014-01-21 | 2017-09-19 | Osterhout Group, Inc. | See-through computer display systems |
US9852545B2 (en) | 2014-02-11 | 2017-12-26 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US10430985B2 (en) | 2014-03-14 | 2019-10-01 | Magic Leap, Inc. | Augmented reality systems and methods utilizing reflections |
US11138793B2 (en) | 2014-03-14 | 2021-10-05 | Magic Leap, Inc. | Multi-depth plane display system with reduced switching between depth planes |
US20160187651A1 (en) | 2014-03-28 | 2016-06-30 | Osterhout Group, Inc. | Safety for a vehicle operator with an hmd |
US10424103B2 (en) * | 2014-04-29 | 2019-09-24 | Microsoft Technology Licensing, Llc | Display device viewer gaze attraction |
US9706910B1 (en) * | 2014-05-29 | 2017-07-18 | Vivid Vision, Inc. | Interactive system for vision assessment and correction |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9880401B2 (en) * | 2014-06-13 | 2018-01-30 | Verily Life Sciences Llc | Method, device and system for accessing an eye-mountable device with a user interface |
WO2015194135A1 (en) * | 2014-06-19 | 2015-12-23 | 日本電気株式会社 | Authentication device, authentication system, authentication method, and program storage medium |
DE102014211823A1 (en) * | 2014-06-20 | 2015-12-24 | Robert Bosch Gmbh | Procedure for personal identification |
KR102266195B1 (en) * | 2014-06-20 | 2021-06-17 | 삼성전자주식회사 | Apparatus and method for providing information associated with object |
US9269328B2 (en) * | 2014-06-24 | 2016-02-23 | Google Inc. | Efficient frame rendering |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9961307B1 (en) * | 2014-06-30 | 2018-05-01 | Lee S. Weinblatt | Eyeglass recorder with multiple scene cameras and saccadic motion detection |
ES2964604T3 (en) * | 2014-07-07 | 2024-04-08 | Attenti Electronic Monitoring Ltd | Tamper-proof self-administered drug screening |
KR101645087B1 (en) * | 2014-07-10 | 2016-08-02 | 아이리텍 잉크 | High security set using hand attached-type wearable device for iris recognition with wearing detection sensor and control method of the same set |
US10540907B2 (en) * | 2014-07-31 | 2020-01-21 | Intelligent Technologies International, Inc. | Biometric identification headpiece system for test taking |
US11013441B2 (en) | 2014-08-04 | 2021-05-25 | New York University | Methods and kits for diagnosing, assessing or quantitating drug use, drug abuse and narcosis, internuclear ophthalmoplegia, attention deficit hyperactivity disorder (ADHD), chronic traumatic encephalopathy, schizophrenia spectrum disorders and alcohol consumption |
US9829708B1 (en) * | 2014-08-19 | 2017-11-28 | Boston Incubator Center, LLC | Method and apparatus of wearable eye pointing system |
HK1203120A2 (en) * | 2014-08-26 | 2015-10-16 | 高平 | A gait monitor and a method of monitoring the gait of a person |
US10425814B2 (en) | 2014-09-24 | 2019-09-24 | Princeton Identity, Inc. | Control of wireless communication device capability in a mobile device with a biometric key |
WO2016054092A1 (en) | 2014-09-29 | 2016-04-07 | Magic Leap, Inc. | Architectures and methods for outputting different wavelength light out of waveguides |
US9898868B2 (en) * | 2014-11-06 | 2018-02-20 | Seiko Epson Corporation | Display device, method of controlling the same, and program |
GB2532438B (en) * | 2014-11-18 | 2019-05-08 | Eshare Ltd | Apparatus, method and system for determining a viewed status of a document |
WO2016089592A1 (en) | 2014-12-03 | 2016-06-09 | SRI International | System and method for mobile device biometric add-on
JPWO2016088415A1 (en) * | 2014-12-05 | 2017-09-14 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
KR102139795B1 (en) * | 2014-12-15 | 2020-07-31 | 삼성전자주식회사 | Method for updating biometric feature pattern and the electronic device therefor |
US10013620B1 (en) * | 2015-01-13 | 2018-07-03 | State Farm Mutual Automobile Insurance Company | Apparatuses, systems and methods for compressing image data that is representative of a series of digital images |
US10878775B2 (en) | 2015-02-17 | 2020-12-29 | Mentor Acquisition One, Llc | See-through computer display systems |
US20160239985A1 (en) | 2015-02-17 | 2016-08-18 | Osterhout Group, Inc. | See-through computer display systems |
EP3062142B1 (en) | 2015-02-26 | 2018-10-03 | Nokia Technologies OY | Apparatus for a near-eye display |
USD779556S1 (en) * | 2015-02-27 | 2017-02-21 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with an icon |
US9836663B2 (en) * | 2015-03-05 | 2017-12-05 | Samsung Electronics Co., Ltd. | User authenticating method and head mounted device supporting the same |
KR102634148B1 (en) | 2015-03-16 | 2024-02-05 | 매직 립, 인코포레이티드 | Methods and system for diagnosing and treating health ailments |
US10921896B2 (en) * | 2015-03-16 | 2021-02-16 | Facebook Technologies, Llc | Device interaction in augmented reality |
IN2015CH01313A (en) * | 2015-03-17 | 2015-04-10 | Wipro Ltd | |
KR101648017B1 (en) * | 2015-03-23 | 2016-08-12 | 현대자동차주식회사 | Display apparatus, vehicle and display method |
CN106155288B (en) * | 2015-04-10 | 2019-02-12 | 北京智谷睿拓技术服务有限公司 | Information acquisition method, information acquisition device and user equipment |
WO2016183020A1 (en) | 2015-05-11 | 2016-11-17 | Magic Leap, Inc. | Devices, methods and systems for biometric user recognition utilizing neural networks |
US10254544B1 (en) * | 2015-05-13 | 2019-04-09 | Rockwell Collins, Inc. | Head tracking accuracy and reducing latency in dynamic environments |
US9860452B2 (en) * | 2015-05-13 | 2018-01-02 | Lenovo (Singapore) Pte. Ltd. | Usage of first camera to determine parameter for action associated with second camera |
US20160358181A1 (en) * | 2015-05-14 | 2016-12-08 | Magic Leap, Inc. | Augmented reality systems and methods for tracking biometric data |
US20200321107A1 (en) * | 2015-05-19 | 2020-10-08 | Iryou Jyouhou Gijyutu Kenkyusyo Corporation | Integrated multi-facility electronic medical record system |
AU2016264503B2 (en) * | 2015-05-20 | 2021-10-28 | Magic Leap, Inc. | Tilt shift iris imaging |
US9716834B2 (en) * | 2015-05-20 | 2017-07-25 | Panasonic Intellectual Property Management Co., Ltd. | Image display device and image processing device |
US10937064B2 (en) * | 2015-06-08 | 2021-03-02 | Samsung Electronics Co., Ltd. | Method and apparatus for providing content
IN2015CH02866A (en) * | 2015-06-09 | 2015-07-17 | Wipro Ltd | |
RU2601169C1 (en) * | 2015-06-11 | 2016-10-27 | Виталий Витальевич Аверьянов | Method and device for interaction with virtual objects |
JP6553418B2 (en) * | 2015-06-12 | 2019-07-31 | Panasonic Intellectual Property Corporation of America | Display control method, display control device and control program
US20160366317A1 (en) * | 2015-06-12 | 2016-12-15 | Delta ID Inc. | Apparatuses and methods for image based biometric recognition |
US10076250B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses based on multispectral data from head-mounted cameras |
US10076270B2 (en) | 2015-06-14 | 2018-09-18 | Facense Ltd. | Detecting physiological responses while accounting for touching the face |
US10085685B2 (en) | 2015-06-14 | 2018-10-02 | Facense Ltd. | Selecting triggers of an allergic reaction based on nasal temperatures |
US10523852B2 (en) | 2015-06-14 | 2019-12-31 | Facense Ltd. | Wearable inward-facing camera utilizing the Scheimpflug principle |
US10080861B2 (en) | 2015-06-14 | 2018-09-25 | Facense Ltd. | Breathing biofeedback eyeglasses |
US10045726B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Selecting a stressor based on thermal measurements of the face |
US10136852B2 (en) | 2015-06-14 | 2018-11-27 | Facense Ltd. | Detecting an allergic reaction from nasal temperatures |
US10159411B2 (en) | 2015-06-14 | 2018-12-25 | Facense Ltd. | Detecting irregular physiological responses during exposure to sensitive data |
US10045737B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Clip-on device with inward-facing cameras |
US9968264B2 (en) | 2015-06-14 | 2018-05-15 | Facense Ltd. | Detecting physiological responses based on thermal asymmetry of the face |
US10136856B2 (en) | 2016-06-27 | 2018-11-27 | Facense Ltd. | Wearable respiration measurements system |
US10154810B2 (en) | 2015-06-14 | 2018-12-18 | Facense Ltd. | Security system that detects atypical behavior |
US10130308B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Calculating respiratory parameters from thermal measurements |
US10045699B2 (en) | 2015-06-14 | 2018-08-14 | Facense Ltd. | Determining a state of a user based on thermal measurements of the forehead |
US10130299B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Neurofeedback eyeglasses |
US10216981B2 (en) | 2015-06-14 | 2019-02-26 | Facense Ltd. | Eyeglasses that measure facial skin color changes |
US10064559B2 (en) | 2015-06-14 | 2018-09-04 | Facense Ltd. | Identification of the dominant nostril using thermal measurements |
US10151636B2 (en) | 2015-06-14 | 2018-12-11 | Facense Ltd. | Eyeglasses having inward-facing and outward-facing thermal cameras |
US10092232B2 (en) | 2015-06-14 | 2018-10-09 | Facense Ltd. | User state selection based on the shape of the exhale stream |
US10130261B2 (en) | 2015-06-14 | 2018-11-20 | Facense Ltd. | Detecting physiological responses while taking into account consumption of confounding substances |
US10113913B2 (en) | 2015-10-03 | 2018-10-30 | Facense Ltd. | Systems for collecting thermal measurements of the face |
DE102016110902A1 (en) * | 2015-06-14 | 2016-12-15 | Facense Ltd. | Head-mounted devices for recording thermal readings |
US10299717B2 (en) | 2015-06-14 | 2019-05-28 | Facense Ltd. | Detecting stress based on thermal measurements of the face |
EP4249965A3 (en) | 2015-06-15 | 2023-12-27 | Magic Leap, Inc. | Display system with optical elements for in-coupling multiplexed light streams |
US10043487B2 (en) * | 2015-06-24 | 2018-08-07 | Samsung Electronics Co., Ltd. | Apparatus and method for split screen display on mobile device |
US10685488B1 (en) * | 2015-07-17 | 2020-06-16 | Naveen Kumar | Systems and methods for computer assisted operation |
CN104966359B (en) * | 2015-07-20 | 2018-01-30 | 京东方科技集团股份有限公司 | Anti-theft alarm system and method
TWI570638B (en) * | 2015-07-29 | 2017-02-11 | 財團法人資訊工業策進會 | Gaze analysis method and apparatus |
CN107787472A (en) * | 2015-08-04 | 2018-03-09 | 谷歌有限责任公司 | For staring interactive hovering behavior in virtual reality |
US10178150B2 (en) * | 2015-08-07 | 2019-01-08 | International Business Machines Corporation | Eye contact-based information transfer |
CN108140259B (en) | 2015-08-18 | 2022-06-14 | 奇跃公司 | Virtual and augmented reality systems and methods |
CN105184246B (en) * | 2015-08-28 | 2020-05-19 | 北京旷视科技有限公司 | Living body detection method and living body detection system |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US20170200316A1 (en) * | 2015-09-10 | 2017-07-13 | Sphere Optics Company, Llc | Advertising system for virtual reality environments |
US10681489B2 (en) | 2015-09-16 | 2020-06-09 | Magic Leap, Inc. | Head pose mixing of audio files |
IN2015DE02924A (en) * | 2015-09-16 | 2015-10-23 | HCL Technologies Ltd | |
JP6684559B2 (en) * | 2015-09-16 | 2020-04-22 | 株式会社バンダイナムコエンターテインメント | Program and image generation device |
US9858706B2 (en) * | 2015-09-22 | 2018-01-02 | Facebook, Inc. | Systems and methods for content streaming |
US10096130B2 (en) | 2015-09-22 | 2018-10-09 | Facebook, Inc. | Systems and methods for content streaming |
CA2999261C (en) | 2015-09-23 | 2022-10-18 | Magic Leap, Inc. | Eye imaging with an off-axis imager |
CN108762496B (en) * | 2015-09-24 | 2020-12-18 | 联想(北京)有限公司 | Information processing method and electronic equipment |
EP3274878A1 (en) | 2015-09-28 | 2018-01-31 | Google LLC | Sharing images and image albums over a communication network |
USD791787S1 (en) * | 2015-09-28 | 2017-07-11 | Google Inc. | Display screen with a transitional graphical user interface for a photo album |
US9635167B2 (en) * | 2015-09-29 | 2017-04-25 | Paypal, Inc. | Conversation assistance system |
EP3349424B1 (en) * | 2015-10-08 | 2021-03-03 | Huawei Technologies Co., Ltd. | Method for protecting privacy information and terminal device |
EP3156880A1 (en) * | 2015-10-14 | 2017-04-19 | Ecole Nationale de l'Aviation Civile | Zoom effect in gaze tracking interface |
EP3365724B1 (en) | 2015-10-20 | 2021-05-05 | Magic Leap, Inc. | Selecting virtual objects in a three-dimensional space |
US10466780B1 (en) * | 2015-10-26 | 2019-11-05 | Pillantas | Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor |
US10831922B1 (en) * | 2015-10-30 | 2020-11-10 | United Services Automobile Association (Usaa) | System and method for access control |
JP7210280B2 (en) | 2015-11-04 | 2023-01-23 | マジック リープ, インコーポレイテッド | Light field display measurement |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11231544B2 (en) | 2015-11-06 | 2022-01-25 | Magic Leap, Inc. | Metasurfaces for redirecting light and methods for fabricating |
CN113769369A (en) * | 2015-11-19 | 2021-12-10 | 天使集团股份有限公司 | Management system for table game and game chip |
US10061552B2 (en) * | 2015-11-25 | 2018-08-28 | International Business Machines Corporation | Identifying the positioning in a multiple display grid |
CN105528577B (en) * | 2015-12-04 | 2019-02-12 | 深圳大学 | Recognition methods based on intelligent glasses |
US10097443B2 (en) * | 2015-12-16 | 2018-10-09 | Fluke Corporation | System and method for secure communications between a computer test tool and a cloud-based server |
US9703374B1 (en) * | 2015-12-16 | 2017-07-11 | Google, Inc. | In-cell gaze tracking for near-eye display |
CN108369451B (en) * | 2015-12-18 | 2021-10-29 | 索尼公司 | Information processing apparatus, information processing method, and computer-readable storage medium |
US10102358B2 (en) * | 2015-12-29 | 2018-10-16 | Sensory, Incorporated | Face-controlled liveness verification |
CN105892642A (en) * | 2015-12-31 | 2016-08-24 | 乐视移动智能信息技术(北京)有限公司 | Method and device for controlling terminal according to eye movement |
WO2017113757A1 (en) * | 2015-12-31 | 2017-07-06 | 北京小鸟看看科技有限公司 | Method of laying out surrounding interface, methods of switching content and switching list in three-dimensional immersive environment |
CN106940766A (en) * | 2016-01-04 | 2017-07-11 | 由田新技股份有限公司 | Sight line track authentication system and method |
JP6231585B2 (en) * | 2016-01-05 | 2017-11-15 | 株式会社Qdレーザ | Image projection device |
KR102466996B1 (en) * | 2016-01-06 | 2022-11-14 | 삼성전자주식회사 | Method and apparatus for predicting eye position |
WO2017120372A1 (en) | 2016-01-07 | 2017-07-13 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
JP2017123050A (en) * | 2016-01-07 | 2017-07-13 | ソニー株式会社 | Information processor, information processing method, program, and server |
JP2019506694A (en) | 2016-01-12 | 2019-03-07 | プリンストン・アイデンティティー・インコーポレーテッド | Biometric analysis system and method |
CN113156650A (en) | 2016-01-19 | 2021-07-23 | 奇跃公司 | Augmented reality system and method using images |
CN108885352B (en) | 2016-01-29 | 2021-11-23 | 奇跃公司 | Display of three-dimensional images |
US10169560B2 (en) * | 2016-02-04 | 2019-01-01 | Lenovo Enterprise Solutions (Singapore) Pte. Ltd. | Stimuli-based authentication |
KR101729434B1 (en) * | 2016-02-16 | 2017-04-24 | 주식회사 시큐브 | Space division segment block and its dynamic movement tracking based manual signature authentication system and method thereof |
US10591728B2 (en) | 2016-03-02 | 2020-03-17 | Mentor Acquisition One, Llc | Optical systems for head-worn computers |
US10129510B2 (en) | 2016-02-18 | 2018-11-13 | Samsung Electronics Co., Ltd. | Initiating human-machine interaction based on visual attention |
CN108700275B (en) | 2016-02-24 | 2022-05-31 | 奇跃公司 | Low profile interconnect for light emitter |
KR20180114162A (en) | 2016-02-24 | 2018-10-17 | 매직 립, 인코포레이티드 | Polarizing beam splitter with low light leakage |
WO2017147534A1 (en) | 2016-02-26 | 2017-08-31 | Magic Leap, Inc. | Display system having a plurality of light pipes for a plurality of light emitters |
EP3420601B1 (en) | 2016-02-26 | 2023-08-02 | Magic Leap, Inc. | Optical system |
US10667981B2 (en) | 2016-02-29 | 2020-06-02 | Mentor Acquisition One, Llc | Reading assistance system for visually impaired |
KR20180117181A (en) | 2016-03-01 | 2018-10-26 | 매직 립, 인코포레이티드 | A reflective switching device for inputting light of different wavelengths into waveguides |
AU2017225977C1 (en) | 2016-03-04 | 2023-08-03 | Magic Leap, Inc. | Current drain reduction in AR/VR display systems |
US10089453B2 (en) | 2016-03-07 | 2018-10-02 | Magic Leap, Inc. | Blue light adjustment for biometric identification |
CA3015658A1 (en) | 2016-03-11 | 2017-09-14 | Magic Leap, Inc. | Structure learning in convolutional neural networks |
WO2017156486A1 (en) * | 2016-03-11 | 2017-09-14 | Oculus Vr, Llc | Corneal sphere tracking for generating an eye model |
US10115205B2 (en) | 2016-03-11 | 2018-10-30 | Facebook Technologies, Llc | Eye tracking system with single point calibration |
US10579708B1 (en) * | 2016-03-22 | 2020-03-03 | Massachusetts Mutual Life Insurance Company | Systems and methods for improving workflow efficiency and for electronic record population utilizing intelligent input systems |
US10306311B1 (en) | 2016-03-24 | 2019-05-28 | Massachusetts Mutual Life Insurance Company | Intelligent and context aware reading systems |
US10360254B1 (en) | 2016-03-24 | 2019-07-23 | Massachusetts Mutual Life Insurance Company | Intelligent and context aware reading systems |
AU2017238847A1 (en) | 2016-03-25 | 2018-10-04 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
JP6728863B2 (en) * | 2016-03-25 | 2020-07-22 | 富士ゼロックス株式会社 | Information processing system |
US10372205B2 (en) | 2016-03-31 | 2019-08-06 | Sony Interactive Entertainment Inc. | Reducing rendering computation and power consumption by detecting saccades and blinks |
US10192528B2 (en) | 2016-03-31 | 2019-01-29 | Sony Interactive Entertainment Inc. | Real-time user adaptive foveated rendering |
WO2017172695A1 (en) | 2016-03-31 | 2017-10-05 | Princeton Identity, Inc. | Systems and methods of biometric analysis with adaptive trigger
US10088898B2 (en) | 2016-03-31 | 2018-10-02 | Verizon Patent And Licensing Inc. | Methods and systems for determining an effectiveness of content in an immersive virtual reality world |
CN114995594A (en) | 2016-03-31 | 2022-09-02 | 奇跃公司 | Interaction with 3D virtual objects using gestures and multi-DOF controllers |
US10169846B2 (en) | 2016-03-31 | 2019-01-01 | Sony Interactive Entertainment Inc. | Selective peripheral vision filtering in a foveated rendering system |
US10401952B2 (en) | 2016-03-31 | 2019-09-03 | Sony Interactive Entertainment Inc. | Reducing rendering computation and power consumption by detecting saccades and blinks |
US10366296B2 (en) | 2016-03-31 | 2019-07-30 | Princeton Identity, Inc. | Biometric enrollment systems and methods |
JP6923552B2 (en) | 2016-04-08 | 2021-08-18 | Magic Leap, Inc. | Augmented reality systems and methods with varifocal lens elements
US20170291723A1 (en) * | 2016-04-11 | 2017-10-12 | Honeywell International Inc. | System and method for validating flight checklist items for maintenance and inspection applications |
KR20230098916A (en) | 2016-04-21 | 2023-07-04 | 매직 립, 인코포레이티드 | Visual aura around field of view |
KR101904889B1 (en) * | 2016-04-21 | 2018-10-05 | 주식회사 비주얼캠프 | Display apparatus and method and system for input processing therof |
AU2017257549B2 (en) | 2016-04-26 | 2021-09-09 | Magic Leap, Inc. | Electromagnetic tracking with augmented reality systems |
CN105897428B (en) * | 2016-04-28 | 2019-06-25 | 武汉大学 | A kind of real-time video safety communication system and method based on iris recognition |
NZ747834A (en) | 2016-05-06 | 2023-06-30 | Magic Leap Inc | Metasurfaces with asymmetric gratings for redirecting light and methods for fabricating |
JP7021110B2 (en) | 2016-05-09 | 2022-02-16 | マジック リープ, インコーポレイテッド | Augmented reality systems and methods for user health analysis |
EP3455766A4 (en) | 2016-05-10 | 2019-11-27 | National ICT Australia Limited | Authenticating a user |
US9904058B2 (en) | 2016-05-12 | 2018-02-27 | Magic Leap, Inc. | Distributed light manipulation over imaging waveguide |
EP3459071B1 (en) | 2016-05-20 | 2022-05-11 | Magic Leap, Inc. | Contextual awareness of user interface menus |
US10065658B2 (en) * | 2016-05-23 | 2018-09-04 | International Business Machines Corporation | Bias of physical controllers in a system |
US20180249941A1 (en) * | 2016-05-24 | 2018-09-06 | neuroFit, Inc. | Oculometric Neurological Examination (ONE) Appliance |
JP6563596B2 (en) * | 2016-05-25 | 2019-08-21 | 株式会社ソニー・インタラクティブエンタテインメント | Image processing apparatus, image processing method, and program |
US10037080B2 (en) * | 2016-05-31 | 2018-07-31 | Paypal, Inc. | User physical attribute based device and content management system |
USD796551S1 (en) * | 2016-06-03 | 2017-09-05 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
AU2017273737B2 (en) | 2016-06-03 | 2022-05-05 | Magic Leap, Inc. | Augmented reality identity verification |
US11108708B2 (en) | 2016-06-06 | 2021-08-31 | Global Tel*Link Corporation | Personalized chatbots for inmates |
WO2017213753A1 (en) | 2016-06-10 | 2017-12-14 | Magic Leap, Inc. | Integrating point source for texture projecting bulb |
US10009536B2 (en) | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
US10339659B2 (en) * | 2016-06-13 | 2019-07-02 | International Business Machines Corporation | System, method, and recording medium for workforce performance management |
EP3751396A1 (en) * | 2016-06-16 | 2020-12-16 | Apple Inc. | Method and system for providing eye tracking based information about a user behavior, client device, server and computer program product |
US10565287B2 (en) * | 2016-06-17 | 2020-02-18 | International Business Machines Corporation | Web content layout engine instance sharing across mobile devices |
EP3472828B1 (en) | 2016-06-20 | 2022-08-10 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
CN106200905B (en) * | 2016-06-27 | 2019-03-29 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN109643373B (en) | 2016-06-30 | 2023-06-27 | 奇跃公司 | Estimating pose in 3D space |
US11354863B2 (en) * | 2016-06-30 | 2022-06-07 | Honeywell International Inc. | Systems and methods for immersive and collaborative video surveillance |
US10200262B1 (en) * | 2016-07-08 | 2019-02-05 | Splunk Inc. | Continuous anomaly detection service |
US10146609B1 (en) | 2016-07-08 | 2018-12-04 | Splunk Inc. | Configuration of continuous anomaly detection service |
US10769854B2 (en) | 2016-07-12 | 2020-09-08 | Tyco Fire & Security Gmbh | Holographic technology implemented security solution |
KR102648770B1 (en) * | 2016-07-14 | 2024-03-15 | 매직 립, 인코포레이티드 | Deep neural network for iris identification |
EP3484343B1 (en) * | 2016-07-14 | 2024-01-10 | Magic Leap, Inc. | Iris boundary estimation using cornea curvature |
CN117793564A (en) | 2016-07-22 | 2024-03-29 | 索尼公司 | Image processing system, image sensor, and image processing method |
KR102412525B1 (en) | 2016-07-25 | 2022-06-23 | 매직 립, 인코포레이티드 | Optical Field Processor System |
KR20230133940A (en) | 2016-07-25 | 2023-09-19 | 매직 립, 인코포레이티드 | Imaging modification, display and visualization using augmented and virtual reality eyewear |
EP4138339A1 (en) | 2016-07-29 | 2023-02-22 | Magic Leap, Inc. | Secure exchange of cryptographically signed records |
US11642071B2 (en) | 2016-08-02 | 2023-05-09 | New York University | Methods and kits for assessing neurological function and localizing neurological lesions |
US9844321B1 (en) * | 2016-08-04 | 2017-12-19 | Novartis Ag | Enhanced ophthalmic surgical experience using a virtual reality head-mounted display |
US10417495B1 (en) * | 2016-08-08 | 2019-09-17 | Google Llc | Systems and methods for determining biometric information |
JP6795683B2 (en) | 2016-08-11 | 2020-12-02 | Magic Leap, Inc. | Automatic placement of virtual objects in 3D space
IL292025B2 (en) | 2016-08-12 | 2023-12-01 | Magic Leap Inc | Word flow annotation |
NZ750551A (en) | 2016-08-22 | 2023-05-26 | Magic Leap Inc | Multi-layer diffractive eyepiece |
EP3500911B1 (en) * | 2016-08-22 | 2023-09-27 | Magic Leap, Inc. | Augmented reality display device with deep learning sensors |
US10835120B2 (en) * | 2016-08-23 | 2020-11-17 | Welch Allyn, Inc. | Extended medical test system |
CN106899567B (en) | 2016-08-24 | 2019-12-13 | 阿里巴巴集团控股有限公司 | User body checking method, device and system |
JP6559359B2 (en) * | 2016-09-01 | 2019-08-14 | 三菱電機株式会社 | Gesture determination device, gesture operation device, and gesture determination method |
KR20210060676A (en) | 2016-09-13 | 2021-05-26 | 매직 립, 인코포레이티드 | Sensory eyewear |
US10230719B2 (en) * | 2016-09-19 | 2019-03-12 | Intel Corporation | Head mounted secure display system |
US10210320B2 (en) * | 2016-09-21 | 2019-02-19 | Lextron Systems, Inc. | System and method for secure 5-D user identification |
AU2017332030B2 (en) | 2016-09-21 | 2022-02-03 | Magic Leap, Inc. | Systems and methods for optical systems with exit pupil expander |
IL307292A (en) | 2016-09-22 | 2023-11-01 | Magic Leap Inc | Augmented reality spectroscopy |
EP3504610A1 (en) | 2016-09-22 | 2019-07-03 | Apple Inc. | Postponing the state change of an information affecting the graphical user interface until during the condition of inattentiveness
WO2018058063A1 (en) | 2016-09-26 | 2018-03-29 | Magic Leap, Inc. | Calibration of magnetic and optical sensors in a virtual reality or augmented reality display system |
CA3037047A1 (en) | 2016-09-28 | 2018-04-05 | Magic Leap, Inc. | Face model capture by a wearable device |
RU2016138608A (en) | 2016-09-29 | 2018-03-30 | Magic Leap, Inc. | Neural network for segmenting the eye image and assessing the quality of the image
US10430042B2 (en) * | 2016-09-30 | 2019-10-01 | Sony Interactive Entertainment Inc. | Interaction context-based virtual reality |
US10863902B2 (en) | 2016-10-03 | 2020-12-15 | Oculogica Inc. | Method for detecting glaucoma |
CA3038967A1 (en) | 2016-10-04 | 2018-04-12 | Magic Leap, Inc. | Efficient data layouts for convolutional neural networks |
JP7090601B2 (en) | 2016-10-05 | 2022-06-24 | マジック リープ, インコーポレイテッド | Peripheral test for mixed reality calibration |
CN107018121B (en) * | 2016-10-13 | 2021-07-20 | 创新先进技术有限公司 | User identity authentication method and device |
US10925479B2 (en) * | 2016-10-13 | 2021-02-23 | Ronald Michael Kurtz | Networked system of mobile communication platforms for nonpharmacologic constriction of a pupil |
CN106997239A (en) | 2016-10-13 | 2017-08-01 | 阿里巴巴集团控股有限公司 | Service implementation method and device based on virtual reality scenario |
US9769166B1 (en) * | 2016-10-19 | 2017-09-19 | International Business Machines Corporation | Wearable sensor based system for person identification |
US10201274B2 (en) * | 2016-10-20 | 2019-02-12 | Oculogica Inc | Eye tracking system with biometric identification |
JP7128179B2 (en) | 2016-10-21 | 2022-08-30 | マジック リープ, インコーポレイテッド | Systems and methods for presenting image content on multiple depth planes by providing multiple intra-pupillary suggestive views |
US10660517B2 (en) | 2016-11-08 | 2020-05-26 | International Business Machines Corporation | Age estimation using feature of eye movement |
US20180125405A1 (en) * | 2016-11-08 | 2018-05-10 | International Business Machines Corporation | Mental state estimation using feature of eye movement |
US11074325B1 (en) * | 2016-11-09 | 2021-07-27 | Wells Fargo Bank, N.A. | Systems and methods for dynamic bio-behavioral authentication |
EP4202840A1 (en) | 2016-11-11 | 2023-06-28 | Magic Leap, Inc. | Periocular and audio synthesis of a full face image |
US11011140B2 (en) * | 2016-11-14 | 2021-05-18 | Huawei Technologies Co., Ltd. | Image rendering method and apparatus, and VR device |
ES2936390T3 (en) * | 2016-11-14 | 2023-03-16 | Mastercard International Inc | Biometric-based document signature method |
CA3043352A1 (en) | 2016-11-15 | 2018-05-24 | Magic Leap, Inc. | Deep learning system for cuboid detection |
IL266595B (en) | 2016-11-16 | 2022-08-01 | Magic Leap Inc | Thermal management systems for wearable components |
US10867445B1 (en) * | 2016-11-16 | 2020-12-15 | Amazon Technologies, Inc. | Content segmentation and navigation |
CA3042553C (en) * | 2016-11-16 | 2024-01-02 | Magic Leap, Inc. | Mixed reality system with reduced power rendering |
CN115639642A (en) | 2016-11-18 | 2023-01-24 | 奇跃公司 | Waveguide optical multiplexer using crossed gratings |
IL303676B1 (en) | 2016-11-18 | 2024-02-01 | Magic Leap Inc | Spatially variable liquid crystal diffraction gratings |
US11067860B2 (en) | 2016-11-18 | 2021-07-20 | Magic Leap, Inc. | Liquid crystal diffractive devices with nano-scale pattern and methods of manufacturing the same |
AU2017363081B2 (en) | 2016-11-18 | 2022-01-13 | Magic Leap, Inc. | Multilayer liquid crystal diffractive gratings for redirecting light of wide incident angle ranges |
US20180144554A1 (en) | 2016-11-18 | 2018-05-24 | Eyedaptic, LLC | Systems for augmented reality visual aids and tools |
WO2018094285A1 (en) * | 2016-11-18 | 2018-05-24 | Eyedaptic, LLC | Improved systems for augmented reality visual aids and tools |
CN206301289U (en) * | 2016-11-29 | 2017-07-04 | 阿里巴巴集团控股有限公司 | VR terminal devices |
CN107066079A (en) | 2016-11-29 | 2017-08-18 | 阿里巴巴集团控股有限公司 | Service implementation method and device based on virtual reality scenario |
KR20180061956A (en) * | 2016-11-30 | 2018-06-08 | 삼성전자주식회사 | Method and apparatus for estimating eye location |
US10531220B2 (en) | 2016-12-05 | 2020-01-07 | Magic Leap, Inc. | Distributed audio capturing techniques for virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems |
KR20230070318A (en) | 2016-12-05 | 2023-05-22 | 매직 립, 인코포레이티드 | Virtual user input controls in a mixed reality environment
WO2018106963A1 (en) | 2016-12-08 | 2018-06-14 | Magic Leap, Inc. | Diffractive devices based on cholesteric liquid crystal |
US10796147B1 (en) * | 2016-12-12 | 2020-10-06 | Keith Hanna | Method and apparatus for improving the match performance and user convenience of biometric systems that use images of the human eye |
KR102491442B1 (en) | 2016-12-13 | 2023-01-20 | 매직 립, 인코포레이티드 | Augmented and Virtual Reality Eyewear, Systems, and Methods for Delivering Polarized Light and Determining Glucose Levels |
CN110291565B (en) | 2016-12-13 | 2023-06-30 | 奇跃公司 | Augmented reality display system |
KR102550742B1 (en) | 2016-12-14 | 2023-06-30 | 매직 립, 인코포레이티드 | Patterning of liquid crystals using soft-imprint replication of surface alignment patterns |
IL308598A (en) | 2016-12-22 | 2024-01-01 | Magic Leap Inc | Systems and methods for manipulating light from ambient light sources |
US10885676B2 (en) | 2016-12-27 | 2021-01-05 | Samsung Electronics Co., Ltd. | Method and apparatus for modifying display settings in virtual/augmented reality |
US10746999B2 (en) | 2016-12-28 | 2020-08-18 | Magic Leap, Inc. | Dual depth exit pupil expander |
US10650552B2 (en) | 2016-12-29 | 2020-05-12 | Magic Leap, Inc. | Systems and methods for augmented reality |
CN117251053A (en) | 2016-12-29 | 2023-12-19 | 奇跃公司 | Automatic control of wearable display device based on external conditions |
US10853775B1 (en) * | 2016-12-29 | 2020-12-01 | Wells Fargo Bank, N.A. | Computing systems for proximity-based fees |
EP4300160A2 (en) | 2016-12-30 | 2024-01-03 | Magic Leap, Inc. | Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light |
EP4122897A1 (en) | 2017-01-05 | 2023-01-25 | Magic Leap, Inc. | Patterning of high refractive index glasses by plasma etching |
CA3051239C (en) | 2017-01-23 | 2023-12-19 | Magic Leap, Inc. | Eyepiece for virtual, augmented, or mixed reality systems |
IL268115B2 (en) | 2017-01-27 | 2024-01-01 | Magic Leap Inc | Antireflection coatings for metasurfaces |
CA3051414A1 (en) | 2017-01-27 | 2018-08-02 | Magic Leap, Inc. | Diffraction gratings formed by metasurfaces having differently oriented nanobeams |
US10404804B2 (en) * | 2017-01-30 | 2019-09-03 | Global Tel*Link Corporation | System and method for personalized virtual reality experience in a controlled environment |
US10140773B2 (en) * | 2017-02-01 | 2018-11-27 | Accenture Global Solutions Limited | Rendering virtual objects in 3D environments |
US10824703B1 (en) * | 2017-02-01 | 2020-11-03 | United Services Automobile Association (Usaa) | Authentication based on motion and biometric data |
US10416769B2 (en) * | 2017-02-14 | 2019-09-17 | Microsoft Technology Licensing, Llc | Physical haptic feedback system with spatial warping |
US11347054B2 (en) | 2017-02-16 | 2022-05-31 | Magic Leap, Inc. | Systems and methods for augmented reality |
US10485420B2 (en) * | 2017-02-17 | 2019-11-26 | Analog Devices Global Unlimited Company | Eye gaze tracking |
US11141095B2 (en) | 2017-02-17 | 2021-10-12 | Oculogica Inc. | Method and system for detecting concussion |
US20180239422A1 (en) * | 2017-02-17 | 2018-08-23 | International Business Machines Corporation | Tracking eye movements with a smart device |
KR102601052B1 (en) | 2017-02-23 | 2023-11-09 | 매직 립, 인코포레이티드 | Display system with variable power reflector |
CN106873159A (en) | 2017-02-27 | 2017-06-20 | 阿里巴巴集团控股有限公司 | Virtual reality helmet |
CN106932905A (en) | 2017-02-27 | 2017-07-07 | 阿里巴巴集团控股有限公司 | Virtual reality helmet |
CN106873158A (en) * | 2017-02-27 | 2017-06-20 | 阿里巴巴集团控股有限公司 | Virtual reality helmet |
US10683100B2 (en) | 2017-03-06 | 2020-06-16 | Bell Helicopter Textron Inc. | Pilot and passenger seat |
JP7057893B2 (en) * | 2017-03-07 | 2022-04-21 | マツダ株式会社 | Visual condition judgment device |
US10568573B2 (en) * | 2017-03-07 | 2020-02-25 | Sony Interactive Entertainment LLC | Mitigation of head-mounted-display impact via biometric sensors and language processing |
US10628994B2 (en) * | 2017-03-07 | 2020-04-21 | Google Llc | Reducing visually induced motion sickness in head mounted display systems |
US10169973B2 (en) | 2017-03-08 | 2019-01-01 | International Business Machines Corporation | Discontinuing display of virtual content and providing alerts based on hazardous physical obstructions |
IL268586B2 (en) | 2017-03-14 | 2023-09-01 | Magic Leap Inc | Waveguides with light absorbing films and processes for forming the same |
US10657838B2 (en) | 2017-03-15 | 2020-05-19 | International Business Machines Corporation | System and method to teach and evaluate image grading performance using prior learned expert knowledge base |
CN107122642A (en) * | 2017-03-15 | 2017-09-01 | 阿里巴巴集团控股有限公司 | Identity identifying method and device based on reality environment |
KR102302725B1 (en) | 2017-03-17 | 2021-09-14 | 매직 립, 인코포레이티드 | Room Layout Estimation Methods and Techniques |
CA3056900A1 (en) | 2017-03-21 | 2018-09-27 | Magic Leap, Inc. | Methods, devices, and systems for illuminating spatial light modulators |
KR20230130770A (en) | 2017-03-21 | 2023-09-12 | 매직 립, 인코포레이티드 | Eye-imaging apparatus using diffractive optical elements |
KR20190126408A (en) | 2017-03-21 | 2019-11-11 | 매직 립, 인코포레이티드 | Stacked waveguides with different diffraction gratings for the combined field of view |
EP3602176A4 (en) | 2017-03-21 | 2020-12-02 | Magic Leap, Inc. | Low-profile beam splitter |
CA3057136A1 (en) | 2017-03-21 | 2018-09-27 | Magic Leap, Inc. | Display system with spatial light modulator illumination for divided pupils |
EP3603055B1 (en) | 2017-03-21 | 2022-03-02 | Magic Leap, Inc. | Depth sensing techniques for virtual, augmented, and mixed reality systems |
AU2018239511A1 (en) | 2017-03-22 | 2019-10-17 | Magic Leap, Inc. | Depth based foveated rendering for display systems |
IL269432B2 (en) * | 2017-03-24 | 2023-10-01 | Magic Leap Inc | Accumulation and confidence assignment of iris codes |
US10410349B2 (en) * | 2017-03-27 | 2019-09-10 | Microsoft Technology Licensing, Llc | Selective application of reprojection processing on layer sub-regions for optimizing late stage reprojection power |
WO2018183390A1 (en) | 2017-03-28 | Magic Leap, Inc. | Augmented reality system with spatialized audio tied to user manipulated virtual object |
CN106990843B (en) * | 2017-04-01 | 2021-01-08 | 维沃移动通信有限公司 | Parameter calibration method of eye tracking system and electronic equipment |
US10607096B2 (en) | 2017-04-04 | 2020-03-31 | Princeton Identity, Inc. | Z-dimension user feedback biometric system |
US10609025B2 (en) * | 2017-04-06 | 2020-03-31 | Htc Corporation | System and method for providing simulated environment |
EP3610359B1 (en) * | 2017-04-14 | 2023-09-20 | Magic Leap, Inc. | Multimodal eye tracking |
US10401954B2 (en) | 2017-04-17 | 2019-09-03 | Intel Corporation | Sensory enhanced augmented reality and virtual reality device |
US10242486B2 (en) * | 2017-04-17 | 2019-03-26 | Intel Corporation | Augmented reality and virtual reality feedback enhancement system, apparatus and method |
CN107097227B (en) * | 2017-04-17 | 北京航空航天大学 | Human-computer cooperation robot system |
KR102629577B1 (en) | 2017-04-18 | 2024-01-24 | 매직 립, 인코포레이티드 | Waveguides having reflective layers formed by reflective flowable materials |
KR102652922B1 (en) | 2017-04-19 | 2024-03-29 | 매직 립, 인코포레이티드 | Multimodal mission execution and text editing for wearable systems |
US10564733B2 (en) | 2017-04-21 | 2020-02-18 | Htc Corporation | Operating method of tracking system, controller, tracking system, and non-transitory computer readable storage medium |
US10620779B2 (en) * | 2017-04-24 | 2020-04-14 | Microsoft Technology Licensing, Llc | Navigating a holographic image |
EP3616110A1 (en) * | 2017-04-24 | 2020-03-04 | Siemens Aktiengesellschaft | Unlocking passwords in augmented reality based on look |
WO2018201067A1 (en) | 2017-04-27 | 2018-11-01 | Magic Leap, Inc. | Light-emitting user input device |
KR20180123354A (en) * | 2017-05-08 | 2018-11-16 | 엘지전자 주식회사 | User interface apparatus for vehicle and Vehicle |
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
DK201770429A1 (en) | 2017-05-12 | 2018-12-14 | Apple Inc. | Low-latency intelligent automated assistant |
TWI669657B (en) * | 2017-05-17 | 宏碁股份有限公司 | Host, head-mounted display, portable device, virtual reality system having adaptive controlling function, and controlling method thereof |
US10432728B2 (en) | 2017-05-17 | 2019-10-01 | Google Llc | Automatic image sharing with designated users over a communication network |
CN110832441B (en) | 2017-05-19 | 2023-12-26 | 奇跃公司 | Keyboard for virtual, augmented and mixed reality display systems |
EP3631734B1 (en) | 2017-05-22 | 2021-08-18 | Magic Leap, Inc. | Pairing with companion device |
GB2563004A (en) * | 2017-05-23 | 2018-12-05 | Nokia Technologies Oy | Methods and apparatuses for handling visual virtual reality content |
US10339334B2 (en) * | 2017-05-25 | 2019-07-02 | Ca, Inc. | Augmented reality captcha |
CN107224292B (en) * | 2017-05-27 | 2019-12-31 | 西南交通大学 | Method and system for testing attention span of dispatcher |
CN116666814A (en) | 2017-05-30 | 2023-08-29 | 奇跃公司 | Power supply assembly with fan assembly for electronic device |
DE102017111933A1 (en) * | 2017-05-31 | 2018-12-06 | Krohne Messtechnik Gmbh | Method for secure communication with a process measuring field measuring device and corresponding field measuring device |
EP4123425A1 (en) | 2017-05-31 | 2023-01-25 | Magic Leap, Inc. | Eye tracking calibration techniques |
US10242476B2 (en) * | 2017-05-31 | Verizon Patent and Licensing Inc. | Methods and systems for dynamically representing, within a virtual reality data stream being presented to a user, a proxy object that corresponds to an object in the real-world environment of the user |
WO2018222897A1 (en) * | 2017-06-01 | 2018-12-06 | University Of Washington | Smartphone-based digital pupillometer |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US10528794B2 (en) * | 2017-06-05 | 2020-01-07 | Motorola Solutions, Inc. | System and method for tailoring an electronic digital assistant inquiry response as a function of previously detected user ingestion of related video information |
US10657401B2 (en) | 2017-06-06 | 2020-05-19 | Microsoft Technology Licensing, Llc | Biometric object spoof detection based on image intensity variations |
EP3413226A1 (en) * | 2017-06-07 | 2018-12-12 | Gemalto Sa | Method for authenticating a user and corresponding device and system |
US10853918B2 (en) * | 2017-06-09 | 2020-12-01 | Sony Interactive Entertainment Inc. | Foveal adaptation of temporal anti-aliasing |
WO2018231784A1 (en) | 2017-06-12 | 2018-12-20 | Magic Leap, Inc. | Augmented reality display having multi-element adaptive lens for changing depth planes |
CN107633196A (en) * | 2017-06-14 | 电子科技大学 | An eyeball-movement projection scheme based on convolutional neural networks |
US10810773B2 (en) * | 2017-06-14 | 2020-10-20 | Dell Products, L.P. | Headset display control based upon a user's pupil state |
US10620710B2 (en) * | 2017-06-15 | 2020-04-14 | Microsoft Technology Licensing, Llc | Displacement oriented interaction in computer-mediated reality |
WO2019005622A1 (en) | 2017-06-30 | 2019-01-03 | Pcms Holdings, Inc. | Method and apparatus for generating and displaying 360-degree video based on eye tracking and physiological measurements |
US10482229B2 (en) * | 2017-06-30 | 2019-11-19 | Wipro Limited | Method of providing content access permission to a user and a device thereof |
US20190012552A1 (en) * | 2017-07-06 | 2019-01-10 | Yves Lambert | Hidden driver monitoring |
US10573071B2 (en) | 2017-07-07 | 2020-02-25 | Nvidia Corporation | Path planning for virtual reality locomotion |
US10573061B2 (en) | 2017-07-07 | 2020-02-25 | Nvidia Corporation | Saccadic redirection for virtual reality locomotion |
US10319151B2 (en) * | 2017-07-07 | 2019-06-11 | Motorola Solutions, Inc. | Device and method for hierarchical object recognition |
US20190012835A1 (en) * | 2017-07-07 | 2019-01-10 | Microsoft Technology Licensing, Llc | Driving an Image Capture System to Serve Plural Image-Consuming Processes |
US20190012841A1 (en) | 2017-07-09 | 2019-01-10 | Eyedaptic, Inc. | Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids |
US10908680B1 (en) | 2017-07-12 | 2021-02-02 | Magic Leap, Inc. | Pose estimation using electromagnetic tracking |
US10691945B2 (en) | 2017-07-14 | 2020-06-23 | International Business Machines Corporation | Altering virtual content based on the presence of hazardous physical obstructions |
KR102368661B1 (en) | 2017-07-26 | 2022-02-28 | 매직 립, 인코포레이티드 | Training a neural network using representations of user interface devices |
US10578870B2 (en) | 2017-07-26 | 2020-03-03 | Magic Leap, Inc. | Exit pupil expander |
WO2019023032A1 (en) | 2017-07-26 | 2019-01-31 | Princeton Identity, Inc. | Biometric security systems and methods |
US11073904B2 (en) * | 2017-07-26 | 2021-07-27 | Microsoft Technology Licensing, Llc | Intelligent user interface element selection using eye-gaze |
US11237691B2 (en) * | 2017-07-26 | 2022-02-01 | Microsoft Technology Licensing, Llc | Intelligent response using eye gaze |
US11355023B2 (en) * | 2017-07-27 | 2022-06-07 | Kennesaw State University Research And Service Foundation, Inc. | System and method for intervention with attention deficient disorders |
CN107360424B (en) * | 2017-07-28 | 深圳岚锋创视网络科技有限公司 | A video-encoder-based bit rate control method, device, and video server |
JP7398962B2 (en) | 2017-07-28 | 2023-12-15 | マジック リープ, インコーポレイテッド | Fan assembly for displaying images |
KR102026526B1 (en) * | 2017-08-03 | 2019-09-30 | 주식회사 에스지엠 | Authentication system using bio-information and screen golf system using the same |
US11587419B2 (en) * | 2017-08-04 | 2023-02-21 | Toyota Research Institute, Inc. | Methods and systems providing an intelligent camera system |
TWI642030B (en) * | 2017-08-09 | 2018-11-21 | 宏碁股份有限公司 | Visual utility analytic method and related eye tracking device and system |
EP3443883B1 (en) * | 2017-08-14 | 2020-07-29 | Carl Zeiss Vision International GmbH | Method and devices for performing eye-related measurements |
WO2019036533A1 (en) * | 2017-08-16 | 2019-02-21 | Veritaz Inc. | Personal display headset for mitigating user access to disallowed resources |
US20190057694A1 (en) * | 2017-08-17 | 2019-02-21 | Dolby International Ab | Speech/Dialog Enhancement Controlled by Pupillometry |
CN107610235B (en) * | 2017-08-21 | 2020-11-10 | 北京精密机电控制设备研究所 | Mobile platform navigation method and device based on deep learning |
CN109426710A (en) * | 2017-08-22 | 上海荆虹电子科技有限公司 | An electronic iris seal implementation method, system, and electronic signature device |
JP2020532031A (en) | 2017-08-23 | 2020-11-05 | ニューラブル インコーポレイテッド | Brain-computer interface with high-speed optotype tracking |
US10313315B2 (en) * | 2017-08-25 | 2019-06-04 | Bank Of America Corporation | Ensuring information security in data transfers by utilizing proximity keys |
US11145124B2 (en) | 2017-08-30 | 2021-10-12 | Ronald H. Winston | System and method for rendering virtual reality interactions |
US10521661B2 (en) | 2017-09-01 | 2019-12-31 | Magic Leap, Inc. | Detailed eye shape model for robust biometric applications |
CA3071819A1 (en) | 2017-09-01 | 2019-03-07 | Magic Leap, Inc. | Detailed eye shape model for robust biometric applications |
JP6953247B2 (en) * | 2017-09-08 | 2021-10-27 | ラピスセミコンダクタ株式会社 | Goggles type display device, line-of-sight detection method and line-of-sight detection system |
CN114403802A (en) | 2017-09-13 | 2022-04-29 | 奥库洛吉卡公司 | Eye tracking system |
WO2019060298A1 (en) | 2017-09-19 | 2019-03-28 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement |
WO2019060283A1 (en) | 2017-09-20 | 2019-03-28 | Magic Leap, Inc. | Personalized neural network for eye tracking |
EP4296753A2 (en) | 2017-09-21 | 2023-12-27 | Magic Leap, Inc. | Augmented reality display with waveguide configured to capture images of eye and/or environment |
KR102481884B1 (en) | 2017-09-22 | 2022-12-28 | 삼성전자주식회사 | Method and apparatus for displaying a virtual image |
EP3688513A1 (en) * | 2017-09-25 | 2020-08-05 | Continental Automotive GmbH | Head-up display |
IL307639A (en) | 2017-09-27 | 2023-12-01 | Magic Leap Inc | Near eye 3d display with separate phase and amplitude modulators |
US10929993B2 (en) * | 2017-09-29 | 2021-02-23 | L'oreal | Automated imaging system for evaluating the curl of a keratinous substrate |
EP3466338A1 (en) * | 2017-10-03 | 2019-04-10 | Tata Consultancy Services Limited | Cognitive load estimation based on pupil dilation |
WO2019069171A1 (en) * | 2017-10-06 | 2019-04-11 | Novartis Ag | Tracking movement of an eye within a tracking range |
AU2018348229A1 (en) | 2017-10-11 | 2020-04-23 | Magic Leap, Inc. | Augmented reality display comprising eyepiece having a transparent emissive display |
US10712899B2 (en) * | 2017-10-17 | 2020-07-14 | Microsoft Technology Licensing, Llc | Human-machine interface tethered to a user position in a three-dimensional VR or AR environment |
CN111373419A (en) | 2017-10-26 | 2020-07-03 | 奇跃公司 | Gradient normalization system and method for adaptive loss balancing in deep multitask networks |
JP7260538B2 (en) | 2017-10-26 | 2023-04-18 | マジック リープ, インコーポレイテッド | Broadband Adaptive Lens Assembly for Augmented Reality Displays |
AU2018354330A1 (en) | 2017-10-26 | 2020-05-14 | Magic Leap, Inc. | Augmented reality display having liquid crystal variable focus element and roll-to-roll method and apparatus for forming the same |
JP7116166B2 (en) | 2017-10-27 | 2022-08-09 | マジック リープ, インコーポレイテッド | Virtual reticles for augmented reality systems |
US10984508B2 (en) | 2017-10-31 | 2021-04-20 | Eyedaptic, Inc. | Demonstration devices and methods for enhancement for low vision users and systems improvements |
US20190129174A1 (en) * | 2017-10-31 | 2019-05-02 | Google Llc | Multi-perspective eye-tracking for vr/ar systems |
CN108038884B (en) | 2017-11-01 | 2020-12-11 | 北京七鑫易维信息技术有限公司 | Calibration method, calibration device, storage medium and processor |
US11410564B2 (en) | 2017-11-07 | 2022-08-09 | The Board Of Trustees Of The University Of Illinois | System and method for creating immersive interactive application |
US11175736B2 (en) | 2017-11-10 | 2021-11-16 | South Dakota Board Of Regents | Apparatus, systems and methods for using pupillometry parameters for assisted communication |
JP7213241B2 (en) | 2017-11-14 | 2023-01-26 | マジック リープ, インコーポレイテッド | Meta-learning for Multitask Learning on Neural Networks |
CN109799899B (en) * | 2017-11-17 | 2021-10-22 | 腾讯科技(深圳)有限公司 | Interaction control method and device, storage medium and computer equipment |
US10395624B2 (en) | 2017-11-21 | 2019-08-27 | Nvidia Corporation | Adjusting an angular sampling rate during rendering utilizing gaze information |
US10586360B2 (en) | 2017-11-21 | 2020-03-10 | International Business Machines Corporation | Changing view order of augmented reality objects based on user gaze |
US20190156447A1 (en) * | 2017-11-21 | 2019-05-23 | Lorenzo Curci | Flight Logging and Resource Management System |
US11282133B2 (en) | 2017-11-21 | 2022-03-22 | International Business Machines Corporation | Augmented reality product comparison |
CN107968937B (en) * | 2017-11-30 | 泰州腾翔信息科技有限公司 | A system for alleviating eye fatigue |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
CN109871674A (en) * | 2017-12-04 | 上海聚虹光电科技有限公司 | Sub-region operation permission management method for VR or AR devices |
US11717686B2 (en) | 2017-12-04 | 2023-08-08 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to facilitate learning and performance |
CN109870873B (en) * | 2017-12-05 | 2022-04-22 | 青岛海信激光显示股份有限公司 | Wavelength conversion device, light source device and projection system |
KR20190067433A (en) * | 2017-12-07 | 2019-06-17 | 주식회사 비주얼캠프 | Method for providing text-reading based reward advertisement service and user terminal for executing the same |
CN111448497B (en) | 2017-12-10 | 2023-08-04 | 奇跃公司 | Antireflective coating on optical waveguides |
KR102045743B1 (en) * | 2017-12-11 | 2019-11-18 | 상명대학교산학협력단 | Device and method for periocular biometric authentication in wearable display device |
AU2018383595A1 (en) | 2017-12-11 | 2020-06-11 | Magic Leap, Inc. | Waveguide illuminator |
US11333902B2 (en) * | 2017-12-12 | 2022-05-17 | RaayonNova LLC | Smart contact lens with embedded display and image focusing system |
CN107992896A (en) * | 2017-12-13 | 东南大学 | A scientific concept assessment method based on eye-tracking technology |
CN111656406A (en) | 2017-12-14 | 2020-09-11 | 奇跃公司 | Context-based rendering of virtual avatars |
JP7431157B2 (en) | 2017-12-15 | 2024-02-14 | マジック リープ, インコーポレイテッド | Improved pose determination for display devices |
US10852547B2 (en) | 2017-12-15 | 2020-12-01 | Magic Leap, Inc. | Eyepieces for augmented reality display system |
CN108108019B (en) * | 2017-12-15 | 2021-03-19 | 歌尔光学科技有限公司 | Virtual reality equipment and display method thereof |
CN115826240A (en) | 2017-12-20 | 2023-03-21 | 奇跃公司 | Insert for augmented reality viewing apparatus |
JP7171727B2 (en) | 2017-12-20 | 2022-11-15 | ビュージックス コーポレーション | Augmented reality display system |
WO2019133997A1 (en) | 2017-12-31 | 2019-07-04 | Neuroenhancement Lab, LLC | System and method for neuroenhancement to enhance emotional response |
US10916060B2 (en) | 2018-01-04 | 2021-02-09 | Magic Leap, Inc. | Optical elements based on polymeric structures incorporating inorganic materials |
CN110022454B (en) * | 2018-01-10 | 2021-02-23 | 华为技术有限公司 | Method for identifying identity in video conference and related equipment |
US10360419B1 (en) * | 2018-01-15 | 2019-07-23 | Universal City Studios Llc | Interactive systems and methods with tracking devices |
WO2019143844A1 (en) | 2018-01-17 | 2019-07-25 | Magic Leap, Inc. | Eye center of rotation determination, depth plane selection, and render camera positioning in display systems |
EP4339692A2 (en) | 2018-01-17 | 2024-03-20 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
KR20200108888A (en) * | 2018-01-18 | 2020-09-21 | 뉴레이블 인크. | Brain-computer interface with adaptations for fast, accurate, and intuitive user interactions |
US10853674B2 (en) | 2018-01-23 | 2020-12-01 | Toyota Research Institute, Inc. | Vehicle systems and methods for determining a gaze target based on a virtual eye position |
US10817068B2 (en) * | 2018-01-23 | 2020-10-27 | Toyota Research Institute, Inc. | Vehicle systems and methods for determining target based on selecting a virtual eye position or a pointing direction |
US10706300B2 (en) * | 2018-01-23 | 2020-07-07 | Toyota Research Institute, Inc. | Vehicle systems and methods for determining a target based on a virtual eye position and a pointing direction |
US11567627B2 (en) * | 2018-01-30 | 2023-01-31 | Magic Leap, Inc. | Eclipse cursor for virtual content in mixed reality displays |
US10540941B2 (en) | 2018-01-30 | 2020-01-21 | Magic Leap, Inc. | Eclipse cursor for mixed reality displays |
CN110120229A (en) * | 2018-02-05 | 北京三星通信技术研究有限公司 | Virtual reality audio signal processing method and related device |
CN108337430A (en) * | 2018-02-07 | 北京联合大学 | 360-degree blind-spot-free smart glasses |
US11556741B2 (en) | 2018-02-09 | 2023-01-17 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters using a neural network |
EP3749172B1 (en) | 2018-02-09 | 2022-03-30 | Pupil Labs GmbH | Devices, systems and methods for predicting gaze-related parameters |
EP3750028B1 (en) | 2018-02-09 | 2022-10-19 | Pupil Labs GmbH | Devices, systems and methods for predicting gaze-related parameters |
CN108764007A (en) * | 2018-02-10 | 集智学园(北京)科技有限公司 | An attention measurement method based on OCR and text analysis techniques |
US11048785B2 (en) * | 2018-02-14 | 2021-06-29 | Samsung Electronics Co., Ltd | Method and apparatus of performing authentication |
US10726765B2 (en) | 2018-02-15 | 2020-07-28 | Valve Corporation | Using tracking of display device to control image display |
US20190253700A1 (en) | 2018-02-15 | 2019-08-15 | Tobii Ab | Systems and methods for calibrating image sensors in wearable apparatuses |
US10735649B2 (en) | 2018-02-22 | 2020-08-04 | Magic Leap, Inc. | Virtual and augmented reality systems and methods using display system control information embedded in image data |
CN108537111A (en) | 2018-02-26 | 阿里巴巴集团控股有限公司 | A liveness detection method, apparatus, and device |
CN111771231A (en) | 2018-02-27 | 2020-10-13 | 奇跃公司 | Matching mesh for avatars |
WO2019168723A1 (en) | 2018-02-28 | 2019-09-06 | Magic Leap, Inc. | Head scan alignment using ocular registration |
WO2019172503A1 (en) * | 2018-03-05 | 2019-09-12 | 고려대학교 산학협력단 | Device for visual field defect evaluation via eye tracking, method for visual field defect evaluation using same, and computer-readable storage medium |
CN108491072B (en) * | 2018-03-05 | 2020-01-21 | 京东方科技集团股份有限公司 | Virtual reality interaction method and device |
EP3762765A4 (en) | 2018-03-05 | 2021-12-08 | Magic Leap, Inc. | Display system with low-latency pupil tracker |
US11563885B2 (en) | 2018-03-06 | 2023-01-24 | Eyedaptic, Inc. | Adaptive system for autonomous machine learning and control in wearable augmented reality and virtual reality visual aids |
CA3139648A1 (en) | 2018-03-07 | 2019-09-12 | Magic Leap, Inc. | Visual tracking of peripheral devices |
AU2019232746A1 (en) | 2018-03-07 | 2020-08-20 | Magic Leap, Inc. | Adaptive lens assemblies including polarization-selective lens stacks for augmented reality display |
CN111886533A (en) | 2018-03-12 | 2020-11-03 | 奇跃公司 | Inclined array based display |
US10528133B2 (en) * | 2018-03-13 | 2020-01-07 | Facebook Technologies, Llc | Bracelet in a distributed artificial reality system |
US10878620B2 (en) | 2018-03-14 | 2020-12-29 | Magic Leap, Inc. | Display systems and methods for clipping content to increase viewing comfort |
US10747312B2 (en) * | 2018-03-14 | 2020-08-18 | Apple Inc. | Image enhancement devices with gaze tracking |
US10755676B2 (en) | 2018-03-15 | 2020-08-25 | Magic Leap, Inc. | Image correction due to deformation of components of a viewing device |
WO2019177870A1 (en) | 2018-03-15 | 2019-09-19 | Magic Leap, Inc. | Animating virtual avatar facial movements |
WO2019178566A1 (en) | 2018-03-16 | 2019-09-19 | Magic Leap, Inc. | Depth based foveated rendering for display systems |
WO2019177869A1 (en) | 2018-03-16 | 2019-09-19 | Magic Leap, Inc. | Facial expressions from eye-tracking cameras |
US11480467B2 (en) | 2018-03-21 | 2022-10-25 | Magic Leap, Inc. | Augmented reality system and method for spectroscopic analysis |
JP6583460B2 (en) * | 2018-03-23 | 2019-10-02 | 株式会社セガゲームス | Authentication system |
CN108416322B (en) * | 2018-03-27 | 吉林大学 | A visual action recognition method for seated virtual assembly operations |
JP7118697B2 (en) | 2018-03-30 | 2022-08-16 | 株式会社Preferred Networks | Point-of-regard estimation processing device, point-of-regard estimation model generation device, point-of-regard estimation processing system, point-of-regard estimation processing method, program, and point-of-regard estimation model |
EP3776027A4 (en) | 2018-04-02 | 2021-12-29 | Magic Leap, Inc. | Waveguides with integrated optical elements and methods of making the same |
WO2019195193A1 (en) | 2018-04-02 | 2019-10-10 | Magic Leap, Inc. | Waveguides having integrated spacers, waveguides having edge absorbers, and methods for making the same |
CN112041716A (en) | 2018-04-02 | 2020-12-04 | 奇跃公司 | Hybrid polymer waveguide and method for manufacturing hybrid polymer waveguide |
WO2019204164A1 (en) | 2018-04-16 | 2019-10-24 | Magic Leap, Inc. | Systems and methods for cross-application authoring, transfer, and evaluation of rigging control systems for virtual characters |
CN108459720B (en) * | 2018-04-19 | 2023-11-21 | 京东方科技集团股份有限公司 | Video control device and method for controlling terminal by using video control device |
US11067805B2 (en) | 2018-04-19 | 2021-07-20 | Magic Leap, Inc. | Systems and methods for operating a display system based on user perceptibility |
US11364361B2 (en) | 2018-04-20 | 2022-06-21 | Neuroenhancement Lab, LLC | System and method for inducing sleep by transplanting mental states |
WO2019209431A1 (en) | 2018-04-23 | 2019-10-31 | Magic Leap, Inc. | Avatar facial expression representation in multidimensional space |
US11257268B2 (en) | 2018-05-01 | 2022-02-22 | Magic Leap, Inc. | Avatar animation using Markov decision process policies |
WO2019213220A1 (en) | 2018-05-03 | 2019-11-07 | Magic Leap, Inc. | Using 3d scans of a physical subject to determine positions and orientations of joints for a virtual character |
EP4343499A2 (en) * | 2018-05-04 | 2024-03-27 | Google LLC | Adapting automated assistant based on detected mouth movement and/or gaze |
JP7277569B2 (en) | 2018-05-04 | 2023-05-19 | グーグル エルエルシー | Invoke automation assistant functions based on detected gestures and gazes |
KR102512446B1 (en) | 2018-05-04 | 2023-03-22 | 구글 엘엘씨 | Hot-word free adaptation of automated assistant function(s) |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
DK201870374A1 (en) | 2018-05-07 | 2019-12-04 | Apple Inc. | Avatar creation user interface |
JP7099036B2 (en) * | 2018-05-07 | 2022-07-12 | オムロン株式会社 | Data processing equipment, monitoring system, awakening system, data processing method, and data processing program |
KR102637122B1 (en) * | 2018-05-07 | 2024-02-16 | 애플 인크. | Creative camera |
WO2019217903A1 (en) * | 2018-05-11 | 2019-11-14 | Visionairy Health, Inc. | Automated screening of medical data |
WO2019221325A1 (en) * | 2018-05-14 | 2019-11-21 | 한국과학기술원 | System for continuous authentication by using pupillary response |
US11262839B2 (en) | 2018-05-17 | 2022-03-01 | Sony Interactive Entertainment Inc. | Eye tracking with prediction and late update to GPU for fast foveated rendering in an HMD environment |
US10942564B2 (en) * | 2018-05-17 | 2021-03-09 | Sony Interactive Entertainment Inc. | Dynamic graphics rendering based on predicted saccade landing point |
US11282255B2 (en) | 2018-05-21 | 2022-03-22 | Magic Leap, Inc. | Generating textured polygon strip hair from strand-based hair for a virtual character |
CN108763394B (en) * | 2018-05-21 | 2021-11-23 | 浙江工业大学 | Multi-user eye movement tracking data visualization method and system for collaborative interaction |
US11210835B2 (en) | 2018-05-22 | 2021-12-28 | Magic Leap, Inc. | Computer generated hair groom transfer tool |
EP3797404A4 (en) | 2018-05-22 | 2022-02-16 | Magic Leap, Inc. | Skeletal systems for animating virtual avatars |
EP3797345A4 (en) | 2018-05-22 | 2022-03-09 | Magic Leap, Inc. | Transmodal input fusion for a wearable system |
WO2019226865A1 (en) | 2018-05-25 | 2019-11-28 | Magic Leap, Inc. | Compression of dynamic unstructured point clouds |
CN108854064B (en) * | 2018-05-25 | 2023-03-28 | 深圳市腾讯网络信息技术有限公司 | Interaction control method and device, computer readable medium and electronic equipment |
CN108416341B (en) * | 2018-05-25 | 重庆青腾致汇科技有限公司 | Novel biometric recognition system |
CN117631307A (en) | 2018-05-29 | 2024-03-01 | 爱达扩视眼镜公司 | Hybrid perspective augmented reality system and method for low vision users |
WO2019232282A1 (en) | 2018-05-30 | 2019-12-05 | Magic Leap, Inc. | Compact variable focus configurations |
EP3803450A4 (en) | 2018-05-31 | 2021-08-18 | Magic Leap, Inc. | Radar head pose localization |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
US10509467B1 (en) * | 2018-06-01 | 2019-12-17 | Facebook Technologies, Llc | Determining fixation of a user's eyes from images of portions of the user's face enclosed by a head mounted display |
US10360304B1 (en) * | 2018-06-04 | 2019-07-23 | Imageous, Inc. | Natural language processing interface-enabled building conditions control system |
EP3804306B1 (en) | 2018-06-05 | 2023-12-27 | Magic Leap, Inc. | Homography transformation matrices based temperature calibration of a viewing system |
US11238143B2 (en) * | 2018-06-05 | 2022-02-01 | Google Llc | Method and system for authenticating a user on a wearable heads-up display |
US11157159B2 (en) | 2018-06-07 | 2021-10-26 | Magic Leap, Inc. | Augmented reality scrollbar |
JP7421505B2 (en) | 2018-06-08 | 2024-01-24 | マジック リープ, インコーポレイテッド | Augmented reality viewer with automated surface selection and content orientation placement |
WO2019238230A1 (en) * | 2018-06-14 | 2019-12-19 | Brainlab Ag | Registration of an anatomical body part by detecting a finger pose |
WO2019237175A1 (en) * | 2018-06-14 | 2019-12-19 | Integrity Advocate Inc. | Method and system for assessing participants |
WO2019241575A1 (en) | 2018-06-15 | 2019-12-19 | Magic Leap, Inc. | Wide field-of-view polarization switches with liquid crystal optical elements with pretilt |
CN108962182A (en) * | 2018-06-15 | 2018-12-07 | 广东康云多维视觉智能科技有限公司 | 3-D image display device and its implementation based on eyeball tracking |
WO2019246058A1 (en) | 2018-06-18 | 2019-12-26 | Magic Leap, Inc. | Systems and methods for temporarily disabling user control interfaces during attachment of an electronic device |
JP7378431B2 (en) | 2018-06-18 | 2023-11-13 | マジック リープ, インコーポレイテッド | Augmented reality display with frame modulation functionality |
WO2019246044A1 (en) | 2018-06-18 | 2019-12-26 | Magic Leap, Inc. | Head-mounted display systems with power saving functionality |
EP3806710A4 (en) * | 2018-06-18 | 2022-03-30 | New Jersey Institute of Technology | Method, system and apparatus for diagnostic assessment and screening of binocular dysfunctions |
JP7214986B2 (en) * | 2018-06-25 | 2023-01-31 | 日本電信電話株式会社 | Reflectivity determination device, reflectivity determination method, and program |
US11151793B2 (en) | 2018-06-26 | 2021-10-19 | Magic Leap, Inc. | Waypoint creation in map detection |
US11669726B2 (en) | 2018-07-02 | 2023-06-06 | Magic Leap, Inc. | Methods and systems for interpolation of disparate inputs |
WO2020010097A1 (en) | 2018-07-02 | 2020-01-09 | Magic Leap, Inc. | Pixel intensity modulation using modifying gain values |
US11510027B2 (en) | 2018-07-03 | 2022-11-22 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
US11856479B2 (en) | 2018-07-03 | 2023-12-26 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality along a route with markers |
KR20200004666A (en) | 2018-07-04 | 2020-01-14 | 주식회사 링크트리 | Biometric information authentication system using machine learning and block chain and its method |
WO2020009670A1 (en) * | 2018-07-04 | 2020-01-09 | Solmaz Gumruk Musavirligi A.S. | A method using artificial neural networks to find a unique harmonized system code from given texts and system for implementing the same |
JP7407748B2 (en) | 2018-07-05 | 2024-01-04 | マジック リープ, インコーポレイテッド | Waveguide-based illumination for head-mounted display systems |
WO2020014311A1 (en) * | 2018-07-10 | 2020-01-16 | Carrier Corporation | Applying image analytics and machine learning to lock systems in hotels |
US10863812B2 (en) | 2018-07-18 | 2020-12-15 | L'oreal | Makeup compact with eye tracking for guidance of makeup application |
US10795435B2 (en) | 2018-07-19 | 2020-10-06 | Samsung Electronics Co., Ltd. | System and method for hybrid eye tracker |
US10884492B2 (en) | 2018-07-20 | 2021-01-05 | Avegant Corp. | Relative position based eye-tracking system |
US11320899B2 (en) | 2018-07-23 | 2022-05-03 | Magic Leap, Inc. | Deep predictor recurrent neural network for head pose prediction |
US11627587B2 (en) | 2018-07-23 | 2023-04-11 | Magic Leap, Inc. | Coexistence interference avoidance between two different radios operating in the same band |
USD918176S1 (en) | 2018-07-24 | 2021-05-04 | Magic Leap, Inc. | Totem controller having an illumination region |
WO2020023404A1 (en) | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Flicker mitigation when toggling eyepiece display illumination in augmented reality systems |
WO2020023672A1 (en) | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Display systems and methods for determining vertical alignment between left and right displays and a user's eyes |
JP7426982B2 (en) | 2018-07-24 | 2024-02-02 | マジック リープ, インコーポレイテッド | Temperature-dependent calibration of movement sensing devices |
EP3827294A4 (en) | 2018-07-24 | 2022-04-20 | Magic Leap, Inc. | Diffractive optical elements with mitigation of rebounce-induced light loss and related systems and methods |
USD930614S1 (en) | 2018-07-24 | 2021-09-14 | Magic Leap, Inc. | Totem controller having an illumination region |
US11624929B2 (en) | 2018-07-24 | 2023-04-11 | Magic Leap, Inc. | Viewing device with dust seal integration |
EP3827426A4 (en) | 2018-07-24 | 2022-07-27 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and eyes of a user |
USD924204S1 (en) | 2018-07-24 | 2021-07-06 | Magic Leap, Inc. | Totem controller having an illumination region |
JP7459050B2 (en) | 2018-07-27 | 2024-04-01 | マジック リープ, インコーポレイテッド | Pose space dimension reduction for pose space deformation of virtual characters |
US11112862B2 (en) | 2018-08-02 | 2021-09-07 | Magic Leap, Inc. | Viewing system with interpupillary distance compensation based on head motion |
JP7438188B2 (en) | 2018-08-03 | 2024-02-26 | マジック リープ, インコーポレイテッド | Unfused pose-based drift correction of fused poses of totems in user interaction systems |
EP3830674A4 (en) | 2018-08-03 | 2022-04-20 | Magic Leap, Inc. | Depth plane selection for multi-depth plane display systems by user categorization |
US11012659B2 (en) * | 2018-08-07 | 2021-05-18 | International Business Machines Corporation | Intelligent illumination and sound control in an internet of things (IoT) computing environment |
CN109165939A (en) * | 2018-08-23 | 2019-01-08 | 唐剑虹 | Block chain VR hardware wallet based on biological identification technology |
US11880441B2 (en) | 2018-08-26 | 2024-01-23 | The Research Foundation For The State University Of New York | System and method for inter-individual discrimination based on oculomotor kinematics |
GB2576910B (en) * | 2018-09-06 | 2021-10-20 | Sony Interactive Entertainment Inc | User profile generating system and method |
CN109145566A (en) | 2018-09-08 | 2019-01-04 | 太若科技(北京)有限公司 | Method, apparatus and AR glasses based on blinkpunkt information unlock AR glasses |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | User interfaces for simulated depth effects |
EP3849410A4 (en) | 2018-09-14 | 2022-11-02 | Neuroenhancement Lab, LLC | System and method of improving sleep |
CN112639687A (en) * | 2018-09-17 | 2021-04-09 | 脸谱科技有限责任公司 | Eye tracking using reverse biased light emitting diode devices |
USD950567S1 (en) | 2018-09-18 | 2022-05-03 | Magic Leap, Inc. | Mobile computing support system having an illumination region |
USD934873S1 (en) | 2018-09-18 | 2021-11-02 | Magic Leap, Inc. | Mobile computing support system having an illumination region |
USD955396S1 (en) | 2018-09-18 | 2022-06-21 | Magic Leap, Inc. | Mobile computing support system having an illumination region |
USD934872S1 (en) | 2018-09-18 | 2021-11-02 | Magic Leap, Inc. | Mobile computing support system having an illumination region |
US20200089855A1 (en) * | 2018-09-19 | 2020-03-19 | XRSpace CO., LTD. | Method of Password Authentication by Eye Tracking in Virtual Reality System |
US10664050B2 (en) | 2018-09-21 | 2020-05-26 | Neurable Inc. | Human-computer interface using high-speed and accurate tracking of user interactions |
WO2020068819A1 (en) | 2018-09-24 | 2020-04-02 | Eyedaptic, Inc. | Enhanced autonomous hands-free control in electronic visual aids |
US11733523B2 (en) | 2018-09-26 | 2023-08-22 | Magic Leap, Inc. | Diffractive optical elements with optical power |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11119573B2 (en) * | 2018-09-28 | 2021-09-14 | Apple Inc. | Pupil modulation as a cognitive control signal |
US11055388B2 (en) * | 2018-10-08 | 2021-07-06 | Advanced New Technologies Co., Ltd. | Passive affective and knowledge-based authentication through eye movement tracking |
CN111083299A (en) * | 2018-10-18 | 2020-04-28 | 富士施乐株式会社 | Information processing apparatus and storage medium |
EP3866690A4 (en) * | 2018-10-19 | 2022-07-27 | Emory University | Systems and methods for automated passive assessment of visuospatial memory and/or salience |
CN109040604B (en) * | 2018-10-23 | 2020-09-15 | Oppo广东移动通信有限公司 | Shot image processing method and device, storage medium and mobile terminal |
WO2020086356A2 (en) | 2018-10-26 | 2020-04-30 | Magic Leap, Inc. | Ambient electromagnetic distortion correction for electromagnetic tracking |
CN111127537A (en) * | 2018-10-29 | 2020-05-08 | 托比股份公司 | Method and apparatus for detecting shadows in a head mounted device |
SE542887C2 (en) * | 2018-10-31 | 2020-08-11 | Tobii Ab | Gaze tracking using mapping of pupil center position |
WO2020089724A1 (en) | 2018-11-01 | 2020-05-07 | 3M Innovative Properties Company | Device, user, or server registration and verification |
US10417497B1 (en) | 2018-11-09 | 2019-09-17 | Qwake Technologies | Cognitive load reducing platform for first responders |
US10896492B2 (en) | 2018-11-09 | 2021-01-19 | Qwake Technologies, Llc | Cognitive load reducing platform having image edge enhancement |
US11890494B2 (en) | 2018-11-09 | 2024-02-06 | Qwake Technologies, Inc. | Retrofittable mask mount system for cognitive load reducing platform |
US10833945B2 (en) * | 2018-11-13 | 2020-11-10 | International Business Machines Corporation | Managing downloading of content |
CN113272821A (en) | 2018-11-15 | 2021-08-17 | 奇跃公司 | Deep neural network attitude estimation system |
EP3881279A4 (en) | 2018-11-16 | 2022-08-17 | Magic Leap, Inc. | Image size triggered clarification to maintain image sharpness |
EP3884337A4 (en) | 2018-11-20 | 2022-08-17 | Magic Leap, Inc. | Eyepieces for augmented reality display system |
WO2020107022A1 (en) * | 2018-11-23 | 2020-05-28 | Slingshot Aerospace, Inc. | Signal processing workflow engine incorporating graphical user interface for space situational awareness |
CN109683704B (en) * | 2018-11-29 | 2022-01-28 | 武汉中地地科传媒文化有限责任公司 | AR interface interaction method and AR display equipment |
WO2020112561A1 (en) | 2018-11-30 | 2020-06-04 | Magic Leap, Inc. | Multi-modal hand location and orientation for avatar movement |
CN111277857B (en) * | 2018-12-04 | 2021-04-13 | 清华大学 | Streaming media scheduling method and device |
CN109799838B (en) * | 2018-12-21 | 2022-04-15 | 金季春 | Training method and system |
US11443515B2 (en) * | 2018-12-21 | 2022-09-13 | Ambient AI, Inc. | Systems and methods for machine learning enhanced intelligent building access endpoint security monitoring and management |
WO2020132941A1 (en) * | 2018-12-26 | 2020-07-02 | 中国科学院深圳先进技术研究院 | Identification method and related device |
JP2022516256A (en) | 2018-12-28 | 2022-02-25 | マジック リープ, インコーポレイテッド | Extended and virtual reality display system with shared display for left and right eyes |
EP3903143A4 (en) | 2018-12-28 | 2022-10-12 | Magic Leap, Inc. | Variable pixel density display system with mechanically-actuated image projector |
US11139071B2 (en) * | 2018-12-31 | 2021-10-05 | Cerner Innovation, Inc. | Virtual augmentation of clinical care environments |
EP3912013A1 (en) | 2019-01-16 | 2021-11-24 | Pupil Labs GmbH | Methods for generating calibration data for head-wearable devices and eye tracking system |
US11036043B2 (en) * | 2019-01-17 | 2021-06-15 | Advanced New Technologies Co., Ltd. | Identity authentication using lens features |
US11107261B2 (en) | 2019-01-18 | 2021-08-31 | Apple Inc. | Virtual avatar animation based on facial feature movement |
US11458040B2 (en) | 2019-01-23 | 2022-10-04 | Meta Platforms Technologies, Llc | Corneal topography mapping with dense illumination |
WO2020154524A1 (en) | 2019-01-25 | 2020-07-30 | Magic Leap, Inc. | Eye-tracking using images having different exposure times |
CN109828734A (en) * | 2019-01-29 | 2019-05-31 | 深圳市海派通讯科技有限公司 | Intelligent terminal shows screen control method, system and storage medium |
EP3921720A4 (en) | 2019-02-06 | 2022-06-29 | Magic Leap, Inc. | Target intent-based clock speed determination and adjustment to limit total heat generated by multiple processors |
KR102246408B1 (en) * | 2019-02-14 | 2021-05-18 | 엔에이치엔 주식회사 | Method for providing similar products based on deep-learning |
CN109919065A (en) * | 2019-02-26 | 2019-06-21 | 浪潮金融信息技术有限公司 | A method of focus is obtained on the screen using eyeball tracking technology |
US11138302B2 (en) * | 2019-02-27 | 2021-10-05 | International Business Machines Corporation | Access control using multi-authentication factors |
KR102190527B1 (en) * | 2019-02-28 | 2020-12-14 | 현대모비스 주식회사 | Apparatus and method for automatic synthesizing images |
US11287657B2 (en) | 2019-02-28 | 2022-03-29 | Magic Leap, Inc. | Display system and method for providing variable accommodation cues using multiple intra-pupil parallax views formed by light emitter arrays |
RU2715300C1 (en) * | 2019-03-12 | 2020-02-26 | Алексей Федорович Хорошев | Method of creating object conformity information and information about it |
JP2022523852A (en) | 2019-03-12 | 2022-04-26 | マジック リープ, インコーポレイテッド | Aligning local content between first and second augmented reality viewers |
CN110059232B (en) * | 2019-03-15 | 2021-05-07 | 杭州电子科技大学 | Data visualization method based on user experience measurement |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
CN113841006A (en) | 2019-03-20 | 2021-12-24 | 奇跃公司 | System for providing illumination to an eye |
US11099384B2 (en) * | 2019-03-27 | 2021-08-24 | Lenovo (Singapore) Pte. Ltd. | Adjusting display settings of a head-mounted display |
US11644897B2 (en) | 2019-04-01 | 2023-05-09 | Evolution Optiks Limited | User tracking system using user feature location and method, and digital display device and digital image rendering system and method using same |
WO2020201999A2 (en) * | 2019-04-01 | 2020-10-08 | Evolution Optiks Limited | Pupil tracking system and method, and digital display device and digital image rendering system and method using same |
USD916892S1 (en) * | 2019-04-09 | 2021-04-20 | Google Llc | Display screen or portion thereof with graphical user interface with icon |
TWI754806B (en) * | 2019-04-09 | 2022-02-11 | 栗永徽 | System and method for locating iris using deep learning |
CN114008514A (en) | 2019-04-15 | 2022-02-01 | 奇跃公司 | Sensor fusion for electromagnetic tracking |
CN110060678B (en) * | 2019-04-16 | 2021-09-14 | 深圳欧博思智能科技有限公司 | Virtual role control method based on intelligent device and intelligent device |
JP7060544B6 (en) * | 2019-04-26 | 2022-05-23 | 塁 佐藤 | Exercise equipment |
JP2022530900A (en) | 2019-05-01 | 2022-07-04 | マジック リープ, インコーポレイテッド | Content provisioning system and method |
CN111897411A (en) * | 2019-05-05 | 2020-11-06 | Oppo广东移动通信有限公司 | Interaction method and device based on atmospheric optical communication and wearable device |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
SE543144C2 (en) * | 2019-05-15 | 2020-10-13 | Tobii Ab | Method and system for dwell-less, hands-free interaction with a selectable object |
EP3973347A4 (en) | 2019-05-20 | 2023-05-31 | Magic Leap, Inc. | Systems and techniques for estimating eye pose |
WO2020236993A1 (en) | 2019-05-21 | 2020-11-26 | Magic Leap, Inc. | Hand pose estimation |
EP3977183A4 (en) | 2019-05-24 | 2022-08-31 | Magic Leap, Inc. | Variable focus assemblies |
JP7357081B2 (en) | 2019-05-28 | 2023-10-05 | マジック リープ, インコーポレイテッド | Thermal management system for portable electronic devices |
USD962981S1 (en) | 2019-05-29 | 2022-09-06 | Magic Leap, Inc. | Display screen or portion thereof with animated scrollbar graphical user interface |
JP6830981B2 (en) * | 2019-05-29 | 2021-02-17 | 株式会社東芝 | Wearable device and display method |
US11468890B2 (en) | 2019-06-01 | 2022-10-11 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US10885173B2 (en) | 2019-06-04 | 2021-01-05 | Nant Holdings Ip, Llc | Content authentication and validation via multi-factor digital tokens, systems, and methods |
EP3979896A1 (en) | 2019-06-05 | 2022-04-13 | Pupil Labs GmbH | Devices, systems and methods for predicting gaze-related parameters |
CN110338748B (en) * | 2019-06-13 | 2022-03-08 | 宁波明星科技发展有限公司 | Method for quickly positioning vision value, storage medium, terminal and vision detector |
CN114286962A (en) | 2019-06-20 | 2022-04-05 | 奇跃公司 | Eyepiece for augmented reality display system |
CN114270312A (en) | 2019-06-21 | 2022-04-01 | 奇跃公司 | Secure authorization via modal windows |
EP3987329A4 (en) | 2019-06-24 | 2023-10-11 | Magic Leap, Inc. | Waveguides having integral spacers and related systems and methods |
US10976816B2 (en) * | 2019-06-25 | 2021-04-13 | Microsoft Technology Licensing, Llc | Using eye tracking to hide virtual reality scene changes in plain sight |
US11307650B1 (en) * | 2019-06-25 | 2022-04-19 | Apple Inc. | Modifying virtual content to invoke a target user state |
US10901502B2 (en) * | 2019-06-27 | 2021-01-26 | Facebook, Inc. | Reducing head mounted display power consumption and heat generation through predictive rendering of content |
US11379610B2 (en) * | 2019-07-10 | 2022-07-05 | Blackberry Limited | Methods and devices for automatically encrypting files |
US11029805B2 (en) | 2019-07-10 | 2021-06-08 | Magic Leap, Inc. | Real-time preview of connectable objects in a physically-modeled virtual space |
TW202103646A (en) * | 2019-07-15 | 2021-02-01 | 美商外科劇院股份有限公司 | Augmented reality system and method for tele-proctoring a surgical procedure |
EP3999940A4 (en) | 2019-07-16 | 2023-07-26 | Magic Leap, Inc. | Eye center of rotation determination with one or more eye tracking cameras |
US11460616B2 (en) | 2019-07-19 | 2022-10-04 | Magic Leap, Inc. | Method of fabricating diffraction gratings |
CN114502991A (en) | 2019-07-19 | 2022-05-13 | 奇跃公司 | Display device with diffraction grating having reduced polarization sensitivity |
US11907417B2 (en) | 2019-07-25 | 2024-02-20 | Tectus Corporation | Glance and reveal within a virtual environment |
JP2022542363A (en) | 2019-07-26 | 2022-10-03 | マジック リープ, インコーポレイテッド | Systems and methods for augmented reality |
US11354805B2 (en) | 2019-07-30 | 2022-06-07 | Apple Inc. | Utilization of luminance changes to determine user characteristics |
CN110572632A (en) * | 2019-08-15 | 2019-12-13 | 中国人民解放军军事科学院国防科技创新研究院 | Augmented reality display system, helmet and method based on sight tracking |
US11263634B2 (en) | 2019-08-16 | 2022-03-01 | Advanced New Technologies Co., Ltd. | Payment method and device |
US11380065B2 (en) * | 2019-08-20 | 2022-07-05 | Red Pacs, Llc | Advanced head display unit for fire fighters |
WO2021041990A1 (en) | 2019-08-28 | 2021-03-04 | Qwake Technologies, Llc | Wearable assisted perception module for navigation and communication in hazardous environments |
US11282297B2 (en) * | 2019-09-10 | 2022-03-22 | Blue Planet Training, Inc. | System and method for visual analysis of emotional coherence in videos |
JP7420926B2 (en) | 2019-09-11 | 2024-01-23 | マジック リープ, インコーポレイテッド | Display device with a diffraction grating with reduced polarization sensitivity |
US11295309B2 (en) * | 2019-09-13 | 2022-04-05 | International Business Machines Corporation | Eye contact based financial transaction |
US11010980B2 (en) | 2019-09-25 | 2021-05-18 | International Business Machines Corporation | Augmented interface distraction reduction |
US11354910B2 (en) * | 2019-09-27 | 2022-06-07 | Ncr Corporation | Frictionless authentication and monitoring |
US11601693B2 (en) | 2019-09-30 | 2023-03-07 | Kyndryl, Inc. | Automatic adaptation of digital content |
CN110751064B (en) * | 2019-09-30 | 2022-06-24 | 四川大学 | Blink frequency analysis method and system based on image processing |
US20210097436A1 (en) * | 2019-10-01 | 2021-04-01 | Kelley S. Weiland | Automated system for generating properly tagged training data for and verifying the efficacy of artificial intelligence algorithms |
US11436655B2 (en) * | 2019-10-01 | 2022-09-06 | Ebay Inc. | Different action user-interface components in a comparison view |
US11276246B2 (en) | 2019-10-02 | 2022-03-15 | Magic Leap, Inc. | Color space mapping for intuitive surface normal visualization |
US11176757B2 (en) | 2019-10-02 | 2021-11-16 | Magic Leap, Inc. | Mission driven virtual character for user interaction |
KR102128894B1 (en) * | 2019-10-10 | 2020-07-01 | 주식회사 메디씽큐 | A method and system for eyesight sensing of medical smart goggles |
CN110837294B (en) * | 2019-10-14 | 2023-12-12 | 成都西山居世游科技有限公司 | Facial expression control method and system based on eyeball tracking |
JP7423248B2 (en) * | 2019-10-23 | 2024-01-29 | キヤノン株式会社 | Electronic devices, control methods for electronic devices, programs and storage media |
US11662807B2 (en) * | 2020-01-06 | 2023-05-30 | Tectus Corporation | Eye-tracking user interface for virtual tool control |
US10901505B1 (en) | 2019-10-24 | 2021-01-26 | Tectus Corporation | Eye-based activation and tool selection systems and methods |
US10607077B1 (en) * | 2019-10-28 | 2020-03-31 | EyeVerify Inc. | Identity authentication using an inlier neural network |
CN110745000B (en) * | 2019-10-29 | 2021-09-28 | 上海天马有机发光显示技术有限公司 | Vehicle instrument and display method thereof, and vehicle speed monitoring display system |
CN110727352A (en) * | 2019-10-31 | 2020-01-24 | 哈雷医用(广州)智能技术有限公司 | Electronic product with depression improving effect and control method thereof |
US11830318B2 (en) | 2019-10-31 | 2023-11-28 | 8 Bit Development Inc. | Method of authenticating a consumer or user in virtual reality, thereby enabling access to controlled environments |
TWI731461B (en) * | 2019-11-01 | 2021-06-21 | 宏碁股份有限公司 | Identification method of real face and identification device using the same |
US10795984B1 (en) | 2019-11-01 | 2020-10-06 | Capital One Services, Llc | Active locking mechanism using machine learning |
WO2021092211A1 (en) * | 2019-11-05 | 2021-05-14 | The Regents Of The University Of Colorado, A Body Corporate | Systems and methods to probe ocular structures |
US11493989B2 (en) | 2019-11-08 | 2022-11-08 | Magic Leap, Inc. | Modes of user interaction |
EP4055423A4 (en) | 2019-11-08 | 2024-01-10 | Magic Leap Inc | Metasurfaces with light-redirecting structures including multiple materials and methods for fabricating |
USD982593S1 (en) | 2019-11-08 | 2023-04-04 | Magic Leap, Inc. | Portion of a display screen with animated ray |
WO2021097323A1 (en) | 2019-11-15 | 2021-05-20 | Magic Leap, Inc. | A viewing system for use in a surgical environment |
CN114945947A (en) | 2019-11-18 | 2022-08-26 | 奇跃公司 | Universal world mapping and positioning |
EP4062229A4 (en) | 2019-11-22 | 2024-01-03 | Magic Leap Inc | Method and system for patterning a liquid crystal layer |
US11665379B2 (en) * | 2019-11-26 | 2023-05-30 | Photo Sensitive Cinema (PSC) | Rendering image content as time-spaced frames |
CN114761859A (en) | 2019-11-26 | 2022-07-15 | 奇跃公司 | Augmented eye tracking for augmented or virtual reality display systems |
US11273341B2 (en) * | 2019-11-27 | 2022-03-15 | Ready 2 Perform Technology LLC | Interactive visualization system for biomechanical assessment |
US10871825B1 (en) * | 2019-12-04 | 2020-12-22 | Facebook Technologies, Llc | Predictive eye tracking systems and methods for variable focus electronic displays |
CN112904997B (en) * | 2019-12-04 | 2023-05-26 | Oppo广东移动通信有限公司 | Equipment control method and related product |
CN114746796A (en) | 2019-12-06 | 2022-07-12 | 奇跃公司 | Dynamic browser stage |
WO2021113309A1 (en) | 2019-12-06 | 2021-06-10 | Magic Leap, Inc. | Encoding stereo splash screen in static image |
USD941307S1 (en) | 2019-12-09 | 2022-01-18 | Magic Leap, Inc. | Portion of a display screen with graphical user interface for guiding graphics |
USD952673S1 (en) | 2019-12-09 | 2022-05-24 | Magic Leap, Inc. | Portion of a display screen with transitional graphical user interface for guiding graphics |
USD941353S1 (en) | 2019-12-09 | 2022-01-18 | Magic Leap, Inc. | Portion of a display screen with transitional graphical user interface for guiding graphics |
USD940748S1 (en) | 2019-12-09 | 2022-01-11 | Magic Leap, Inc. | Portion of a display screen with transitional graphical user interface for guiding graphics |
USD940189S1 (en) | 2019-12-09 | 2022-01-04 | Magic Leap, Inc. | Portion of a display screen with transitional graphical user interface for guiding graphics |
USD940749S1 (en) | 2019-12-09 | 2022-01-11 | Magic Leap, Inc. | Portion of a display screen with transitional graphical user interface for guiding graphics |
US11288876B2 (en) | 2019-12-13 | 2022-03-29 | Magic Leap, Inc. | Enhanced techniques for volumetric stage mapping based on calibration object |
US11928632B2 (en) * | 2019-12-19 | 2024-03-12 | Senseye, Inc. | Ocular system for deception detection |
CN113010066B (en) * | 2019-12-20 | 2022-11-11 | 华为技术有限公司 | Display parameter determination method and device |
US20210192853A1 (en) * | 2019-12-20 | 2021-06-24 | Abdul Zalil | Method and system for wireless transmission of audio/video media content to a display device |
CN111159678B (en) * | 2019-12-26 | 2023-08-18 | 联想(北京)有限公司 | Identity recognition method, device and storage medium |
CN111292850A (en) * | 2020-01-22 | 2020-06-16 | 福建中医药大学 | ADHD children attention intelligent rehabilitation system |
US11294461B2 (en) | 2020-01-24 | 2022-04-05 | Magic Leap, Inc. | Content movement and interaction using a single controller |
US11340695B2 (en) | 2020-01-24 | 2022-05-24 | Magic Leap, Inc. | Converting a 2D positional input into a 3D point in space |
USD948562S1 (en) | 2020-01-27 | 2022-04-12 | Magic Leap, Inc. | Portion of a display screen with avatar |
CN115004128A (en) | 2020-01-27 | 2022-09-02 | 奇跃公司 | Functional enhancement of user input device based on gaze timer |
WO2021154558A1 (en) | 2020-01-27 | 2021-08-05 | Magic Leap, Inc. | Augmented reality map curation |
USD948574S1 (en) | 2020-01-27 | 2022-04-12 | Magic Leap, Inc. | Portion of a display screen with a set of avatars |
EP4097684A4 (en) | 2020-01-27 | 2024-02-14 | Magic Leap Inc | Enhanced state control for anchor-based cross reality applications |
USD949200S1 (en) | 2020-01-27 | 2022-04-19 | Magic Leap, Inc. | Portion of a display screen with a set of avatars |
WO2021154646A1 (en) | 2020-01-27 | 2021-08-05 | Magic Leap, Inc. | Neutral avatars |
USD936704S1 (en) | 2020-01-27 | 2021-11-23 | Magic Leap, Inc. | Portion of a display screen with avatar |
US11487356B2 (en) | 2020-01-31 | 2022-11-01 | Magic Leap, Inc. | Augmented and virtual reality display systems for oculometric assessments |
CN111402100A (en) * | 2020-02-03 | 2020-07-10 | 重庆特斯联智慧科技股份有限公司 | Population registration method and system realized through target tracking |
CN111880662A (en) * | 2020-02-06 | 2020-11-03 | 北京师范大学 | Eye movement control system applied to interactive map |
CN111880663A (en) * | 2020-02-06 | 2020-11-03 | 北京师范大学 | Eye movement control method and device applied to interactive map |
US11538199B2 (en) * | 2020-02-07 | 2022-12-27 | Lenovo (Singapore) Pte. Ltd. | Displaying a window in an augmented reality view |
JP7455985B2 (en) | 2020-02-10 | 2024-03-26 | マジック リープ, インコーポレイテッド | Body-centered content positioning for 3D containers in mixed reality environments |
US11709363B1 (en) | 2020-02-10 | 2023-07-25 | Avegant Corp. | Waveguide illumination of a spatial light modulator |
WO2021163354A1 (en) | 2020-02-14 | 2021-08-19 | Magic Leap, Inc. | Virtual object movement speed curve for virtual and augmented reality display systems |
US11016656B1 (en) * | 2020-02-14 | 2021-05-25 | International Business Machines Corporation | Fault recognition self-learning graphical user interface |
US11300784B2 (en) * | 2020-02-21 | 2022-04-12 | Fotonation Limited | Multi-perspective eye acquisition |
WO2021173566A1 (en) | 2020-02-26 | 2021-09-02 | Magic Leap, Inc. | Procedural electron beam lithography |
WO2021174062A1 (en) | 2020-02-28 | 2021-09-02 | Magic Leap, Inc. | Method of fabricating molds for forming eyepieces with integrated spacers |
KR102379350B1 (en) * | 2020-03-02 | 2022-03-28 | 주식회사 비주얼캠프 | Method for page turn and computing device for executing the method |
US11262588B2 (en) | 2020-03-10 | 2022-03-01 | Magic Leap, Inc. | Spectator view of virtual and physical objects |
KR102359602B1 (en) * | 2020-03-10 | 2022-02-08 | 한국과학기술원 | Method for inputting the gaze for display and apparatuses performing the same |
EP3883235A1 (en) | 2020-03-17 | 2021-09-22 | Aptiv Technologies Limited | Camera control modules and methods |
EP4121813A4 (en) | 2020-03-20 | 2024-01-17 | Magic Leap Inc | Systems and methods for retinal imaging and tracking |
EP4127793A1 (en) | 2020-03-25 | 2023-02-08 | Magic Leap, Inc. | Optical device with one-way mirror |
CN111383313B (en) * | 2020-03-31 | 2023-05-12 | 歌尔股份有限公司 | Virtual model rendering method, device, equipment and readable storage medium |
US11537701B2 (en) | 2020-04-01 | 2022-12-27 | Toyota Motor North America, Inc. | Transport related n-factor authentication |
WO2021202783A1 (en) | 2020-04-03 | 2021-10-07 | Magic Leap, Inc. | Avatar customization for optimal gaze discrimination |
US11604354B2 (en) | 2020-04-03 | 2023-03-14 | Magic Leap, Inc. | Wearable display systems with nanowire LED micro-displays |
US11536970B1 (en) * | 2020-04-07 | 2022-12-27 | Google Llc | Tracking of item of interest using wearable heads up display |
US11587388B2 (en) | 2020-04-22 | 2023-02-21 | Igt | Determining a player's emotional state using player gaze movement at gaming devices |
US11950022B1 (en) | 2020-04-24 | 2024-04-02 | Apple Inc. | Head-mounted devices with forward facing cameras |
CN111399659B (en) * | 2020-04-24 | 2022-03-08 | Oppo广东移动通信有限公司 | Interface display method and related device |
DK202070625A1 (en) | 2020-05-11 | 2022-01-04 | Apple Inc | User interfaces related to time |
US11921998B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Editing features of an avatar |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
CN116194821A (en) | 2020-05-22 | 2023-05-30 | 奇跃公司 | Augmented and virtual reality display system with associated in-and out-coupling optical zones |
US11615205B2 (en) * | 2020-05-28 | 2023-03-28 | Bank Of America Corporation | Intelligent dynamic data masking on display screens based on viewer proximity |
US11195490B1 (en) * | 2020-05-29 | 2021-12-07 | International Business Machines Corporation | Smart contact lens with adjustable light transmittance |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
WO2021247435A1 (en) | 2020-06-05 | 2021-12-09 | Magic Leap, Inc. | Enhanced eye tracking techniques based on neural network analysis of images |
JP7218978B2 (en) * | 2020-06-15 | 2023-02-07 | 株式会社mediVR | Rehabilitation support system, rehabilitation support method and rehabilitation support program |
GB202010326D0 (en) * | 2020-07-06 | 2020-08-19 | Palakollu Vamsee Krishna | A virtual reality headset |
US11690435B2 (en) | 2020-07-07 | 2023-07-04 | Perfect Mobile Corp. | System and method for navigating user interfaces using a hybrid touchless control mechanism |
CN115885237A (en) * | 2020-07-17 | 2023-03-31 | 惠普发展公司,有限责任合伙企业 | Head mounted display image and foveal luminance calculation |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
WO2022025921A1 (en) * | 2020-07-31 | 2022-02-03 | Hewlett-Packard Development Company, L.P. | Change blindness detection via bio-analytics |
EP4193215A1 (en) | 2020-08-07 | 2023-06-14 | Magic Leap, Inc. | Tunable cylindrical lenses and head-mounted display including the same |
JP7154259B2 (en) | 2020-08-11 | 2022-10-17 | Topcon Corporation | Ophthalmic equipment
JP7154260B2 (en) * | 2020-08-11 | 2022-10-17 | Topcon Corporation | Ophthalmic equipment
US11321797B2 (en) * | 2020-08-25 | 2022-05-03 | Kyndryl, Inc. | Wearable watermarks |
US20220061659A1 (en) * | 2020-08-27 | 2022-03-03 | Revieve Oy | System and method for finding an area of an eye from a facial image |
WO2022046120A1 (en) * | 2020-08-31 | 2022-03-03 | Hewlett-Packard Development Company, L.P. | User authentication using event cameras |
US11620855B2 (en) | 2020-09-03 | 2023-04-04 | International Business Machines Corporation | Iterative memory mapping operations in smart lens/augmented glasses |
CN112084990A (en) * | 2020-09-16 | 2020-12-15 | Chongqing University of Science and Technology | Classroom head-raising rate statistical system based on convolutional neural network and backtracking
KR102230797B1 (en) * | 2020-09-23 | 2021-03-22 | Agency for Defense Development | Deep learning training method and system using infrared image
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
JP2023545653A (en) | 2020-09-29 | 2023-10-31 | Avegant Corp. | Architecture for illuminating display panels
US20230350197A1 (en) * | 2020-10-05 | 2023-11-02 | Sony Group Corporation | Line-of-sight detection device and display device |
GB2599900B (en) * | 2020-10-09 | 2023-01-11 | Sony Interactive Entertainment Inc | Data processing system and method for image enhancement |
US11747896B2 (en) | 2020-10-20 | 2023-09-05 | Rovi Guides, Inc. | Methods and systems of extended reality environment interaction based on eye motions |
US11392198B2 (en) | 2020-10-20 | 2022-07-19 | Rovi Guides, Inc. | Methods and systems of extended reality environment interaction based on eye motions
US11281291B1 (en) | 2020-10-20 | 2022-03-22 | Rovi Guides, Inc. | Methods and systems of extended reality environment interaction based on eye motions |
US11609629B2 (en) * | 2020-10-20 | 2023-03-21 | Rovi Guides, Inc. | Methods and systems of extended reality environment interaction based on eye motions |
US11320903B1 (en) | 2020-10-20 | 2022-05-03 | Rovi Guides, Inc. | Methods and systems of extended reality environment interaction based on eye motions |
US11659266B2 (en) | 2020-10-21 | 2023-05-23 | Qualcomm Incorporated | Power control based at least in part on user eye movement |
US11803237B2 (en) | 2020-11-14 | 2023-10-31 | Facense Ltd. | Controlling an eye tracking camera according to eye movement velocity |
JP7252296B2 (en) * | 2020-12-09 | 2023-04-04 | Topcon Corporation | Ophthalmic device and ophthalmic measurement method
CN116711302A (en) * | 2020-12-16 | 2023-09-05 | Samsung Electronics Co., Ltd. | Method and device for transmitting multiple application data with low latency
EP4278366A1 (en) | 2021-01-12 | 2023-11-22 | Emed Labs, LLC | Health testing and diagnostics platform |
WO2022159628A1 (en) * | 2021-01-22 | 2022-07-28 | Zinn Labs, Inc. | Headset integrated into healthcare platform |
WO2022159630A1 (en) | 2021-01-22 | 2022-07-28 | Zinn Labs, Inc. | Gaze sensors and display elements for detection of gaze vectors and user control at headset |
JP7119145B2 (en) | 2021-01-25 | 2022-08-16 | Toshiba Corporation | Wearable device and display method
KR20220113633A (en) | 2021-02-05 | 2022-08-16 | Hoseo University Industry-Academic Cooperation Foundation | Apparatus and method for controlling an e-book for the disabled
US20220293241A1 (en) * | 2021-03-12 | 2022-09-15 | Facebook Technologies, Llc | Systems and methods for signaling cognitive-state transitions |
US11929168B2 (en) | 2021-05-24 | 2024-03-12 | Emed Labs, Llc | Systems, devices, and methods for diagnostic aid kit apparatus |
US11615888B2 (en) | 2021-03-23 | 2023-03-28 | Emed Labs, Llc | Remote diagnostic testing and treatment |
US11273074B1 (en) * | 2021-03-24 | 2022-03-15 | Stroma Medical Corporation | Systems and methods for physical and electronic security of medical devices
WO2022203620A1 (en) * | 2021-03-25 | 2022-09-29 | DM Dayanıklı Tüketim Malları Sanayi Ve Ticaret Limited Şirketi | Digital cinema system
CN113077795B (en) * | 2021-04-06 | 2022-07-15 | Chongqing University of Posts and Telecommunications | Voiceprint recognition method under channel attention spreading and aggregation
US11823148B2 (en) | 2021-04-15 | 2023-11-21 | Bank Of America Corporation | Augmented reality-enabled ATM for secure augmented reality check realization |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11369454B1 (en) | 2021-05-24 | 2022-06-28 | Emed Labs, Llc | Systems, devices, and methods for diagnostic aid kit apparatus |
CN113297994B (en) * | 2021-05-31 | 2023-08-18 | Second Academy of China Aerospace Science and Industry Corporation | Pilot behavior analysis method and system
US11776190B2 (en) | 2021-06-04 | 2023-10-03 | Apple Inc. | Techniques for managing an avatar on a lock screen |
US20220398302A1 (en) * | 2021-06-10 | 2022-12-15 | Trivver, Inc. | Secure wearable lens apparatus |
GB2623461A (en) | 2021-06-22 | 2024-04-17 | Emed Labs Llc | Systems, methods, and devices for non-human readable diagnostic tests |
US20230008868A1 (en) * | 2021-07-08 | 2023-01-12 | Nippon Telegraph And Telephone Corporation | User authentication device, user authentication method, and user authentication computer program |
CN113434511B (en) * | 2021-07-12 | 2023-08-29 | Beijing Forestry University | Clustering index method based on Hilbert curve
KR102644877B1 (en) * | 2021-08-20 | 2024-03-08 | Kyungshin Co., Ltd. | Apparatus and method for controlling vehicle
CN113655927A (en) * | 2021-08-24 | 2021-11-16 | HiScene (Shanghai) Information Technology Co., Ltd. | Interface interaction method and device
US11651404B2 (en) | 2021-08-31 | 2023-05-16 | Kyndryl, Inc. | Virtual shopping assistant |
US20230097716A1 (en) * | 2021-09-25 | 2023-03-30 | FiveGen, LLC | Authenticating Individuals Based on Game Decisions and Behaviors |
US20230102506A1 (en) * | 2021-09-25 | 2023-03-30 | FiveGen, LLC | Selective Recommendation by Mapping Game Decisions and Behaviors to Predefined Attributes |
WO2023047572A1 (en) | 2021-09-27 | 2023-03-30 | NEC Corporation | Authentication system, authentication device, authentication method, and recording medium
CN113923252B (en) * | 2021-09-30 | 2023-11-21 | 北京蜂巢世纪科技有限公司 | Image display device, method and system |
US11592899B1 (en) * | 2021-10-28 | 2023-02-28 | Tectus Corporation | Button activation within an eye-controlled user interface |
WO2023075771A1 (en) * | 2021-10-28 | 2023-05-04 | Hewlett-Packard Development Company, L.P. | Avatar training images for training machine learning model |
US20230289535A1 (en) * | 2021-11-03 | 2023-09-14 | Virginia Tech Intellectual Properties, Inc. | Visual language processing modeling framework via an attention-on-attention mechanism |
WO2023091403A2 (en) * | 2021-11-17 | 2023-05-25 | Meta Platforms Technologies, Llc | Gaze-based user interface with assistant features for smart glasses in immersive reality applications |
US11789530B2 (en) | 2021-11-17 | 2023-10-17 | Meta Platforms Technologies, Llc | Gaze-based user interface with assistant features for smart glasses in immersive reality applications |
US11619994B1 (en) | 2022-01-14 | 2023-04-04 | Tectus Corporation | Control of an electronic contact lens using pitch-based eye gestures |
US11852825B1 (en) * | 2022-03-08 | 2023-12-26 | Meta Platforms Technologies, Llc | Selective notifications from eye measurements |
US11874961B2 (en) | 2022-05-09 | 2024-01-16 | Tectus Corporation | Managing display of an icon in an eye tracking augmented reality device |
CN115658933B (en) * | 2022-12-28 | 2023-04-07 | West China Hospital, Sichuan University | Psychological state knowledge base construction method and device, computer equipment and storage medium
CN116436919B (en) * | 2023-06-13 | 2023-10-10 | Shenzhen Mingyuan Cloud Technology Co., Ltd. | Cloud resource consumption optimization method and device, electronic equipment and readable storage medium
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140096077A1 (en) * | 2012-09-28 | 2014-04-03 | Michal Jacob | System and method for inferring user intent based on eye movement during observation of a display screen |
US20140289834A1 (en) * | 2013-03-22 | 2014-09-25 | Rolf Lindemann | System and method for eye tracking during authentication |
US20140289833A1 (en) * | 2013-03-22 | 2014-09-25 | Marc Briceno | Advanced authentication techniques and applications |
Family Cites Families (182)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3863243A (en) | 1972-01-19 | 1975-01-28 | Max Skolnick | Sleep inhibiting alarm |
US3798599A (en) | 1972-02-24 | 1974-03-19 | H Kafafian | Single input controller for a communication system |
US4359724A (en) | 1980-04-28 | 1982-11-16 | Ronald R. Zimmerman | Eyelid movement detector |
US4737040A (en) | 1985-02-15 | 1988-04-12 | Moon Tag Y | Keyboard device and method for entering Japanese language text utilizing Romaji character notation |
EP0280124A1 (en) | 1987-02-12 | 1988-08-31 | Omron Tateisi Electronics Co. | Doze detector |
US4850691A (en) | 1987-03-18 | 1989-07-25 | University Of Illinois | Method and apparatus for determining pupillary response parameters |
US4815839A (en) | 1987-08-03 | 1989-03-28 | Waldorf Ronald A | Infrared/video electronystagmographic apparatus |
US5291560A (en) * | 1991-07-15 | 1994-03-01 | Iri Scan Incorporated | Biometric personal identification system based on iris analysis |
US5214456A (en) | 1991-10-09 | 1993-05-25 | Computed Anatomy Incorporated | Mapping of corneal topography with display of pupil perimeter |
US5345281A (en) | 1992-12-17 | 1994-09-06 | John Taboada | Eye tracking system and method |
US5517021A (en) | 1993-01-19 | 1996-05-14 | The Research Foundation State University Of New York | Apparatus and method for eye tracking interface |
US5402109A (en) | 1993-04-29 | 1995-03-28 | Mannik; Kallis H. | Sleep prevention device for automobile drivers |
US5481622A (en) | 1994-03-01 | 1996-01-02 | Rensselaer Polytechnic Institute | Eye tracking apparatus and method employing grayscale threshold values |
JPH086708A (en) | 1994-04-22 | 1996-01-12 | Canon Inc | Display device |
CA2126142A1 (en) | 1994-06-17 | 1995-12-18 | David Alexander Kahn | Visual communications apparatus |
US5469143A (en) | 1995-01-10 | 1995-11-21 | Cooper; David E. | Sleep awakening device for drivers of motor vehicles |
US5566067A (en) | 1995-03-23 | 1996-10-15 | The President And Fellows Of Harvard College | Eyelid vigilance detector system |
US5689241A (en) | 1995-04-24 | 1997-11-18 | Clarke, Sr.; James Russell | Sleep detection and driver alert apparatus |
US5570698A (en) | 1995-06-02 | 1996-11-05 | Siemens Corporate Research, Inc. | System for monitoring eyes for detecting sleep behavior |
US5682144A (en) | 1995-11-20 | 1997-10-28 | Mannik; Kallis Hans | Eye actuated sleep prevention devices and other eye controlled devices |
US6003991A (en) | 1996-02-17 | 1999-12-21 | Erik Scott Viirre | Eye examination apparatus and method for remote examination of a patient by a health professional |
US5912721A (en) | 1996-03-13 | 1999-06-15 | Kabushiki Kaisha Toshiba | Gaze detection apparatus and its method as well as information display apparatus |
US5886683A (en) | 1996-06-25 | 1999-03-23 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven information retrieval |
US6163281A (en) | 1996-08-19 | 2000-12-19 | Torch; William C. | System and method for communication using eye movement |
US6542081B2 (en) | 1996-08-19 | 2003-04-01 | William C. Torch | System and method for monitoring eye movement |
US6246344B1 (en) | 1996-08-19 | 2001-06-12 | William C. Torch | Method and apparatus for voluntary communication |
US5748113A (en) | 1996-08-19 | 1998-05-05 | Torch; William C. | Method and apparatus for communication |
US5867587A (en) | 1997-05-19 | 1999-02-02 | Northrop Grumman Corporation | Impaired operator detection and warning system employing eyeblink analysis |
AU1091099A (en) | 1997-10-16 | 1999-05-03 | Board Of Trustees Of The Leland Stanford Junior University | Method for inferring mental states from eye movements |
US6007202A (en) | 1997-10-23 | 1999-12-28 | Lasersight Technologies, Inc. | Eye illumination system and method |
DE19803158C1 (en) | 1998-01-28 | 1999-05-06 | Daimler Chrysler Ag | Arrangement for determining the state of vigilance, esp. for machinery operator or vehicle driver |
US6204828B1 (en) | 1998-03-31 | 2001-03-20 | International Business Machines Corporation | Integrated gaze/manual cursor positioning system |
US6867752B1 (en) | 1998-08-31 | 2005-03-15 | Semiconductor Energy Laboratory Co., Ltd. | Portable information processing system |
US6087941A (en) | 1998-09-01 | 2000-07-11 | Ferraz; Mark | Warning device for alerting a person falling asleep |
US6243076B1 (en) | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data |
AUPP612998A0 (en) | 1998-09-23 | 1998-10-15 | Canon Kabushiki Kaisha | Multiview multimedia generation system |
US6526159B1 (en) | 1998-12-31 | 2003-02-25 | Intel Corporation | Eye tracking for resource and power management |
US6433760B1 (en) * | 1999-01-14 | 2002-08-13 | University Of Central Florida | Head mounted display with eyetracking capability |
US6577329B1 (en) | 1999-02-25 | 2003-06-10 | International Business Machines Corporation | Method and system for relevance feedback through gaze tracking and ticker interfaces |
GB2348520B (en) | 1999-03-31 | 2003-11-12 | Ibm | Assisting user selection of graphical user interface elements |
US6116736A (en) | 1999-04-23 | 2000-09-12 | Neuroptics, Inc. | Pupilometer with pupil irregularity detection capability |
JP3636927B2 (en) * | 1999-05-18 | 2005-04-06 | Mitsubishi Electric Corporation | Face image processing device
JP2001197400A (en) * | 2000-01-12 | 2001-07-19 | Mixed Reality Systems Laboratory Inc | Display device, head installation-type display device, control method of head installation-type display device, picture generation method for the same, computer and program storage medium |
US6456262B1 (en) | 2000-05-09 | 2002-09-24 | Intel Corporation | Microdisplay with eye gaze detection |
DK1285409T3 (en) | 2000-05-16 | 2005-08-22 | Swisscom Mobile Ag | Process of biometric identification and authentication |
US6608615B1 (en) | 2000-09-19 | 2003-08-19 | Intel Corporation | Passive gaze-driven browsing |
US8113657B2 (en) | 2000-10-07 | 2012-02-14 | Metaio Gmbh | Device and method for determining the orientation of an eye |
DE10103922A1 (en) | 2001-01-30 | 2002-08-01 | Physoptics Opto Electronic Gmb | Interactive data viewing and operating system |
JP3586431B2 (en) * | 2001-02-28 | 2004-11-10 | Matsushita Electric Industrial Co., Ltd. | Personal authentication method and device
US20030038754A1 (en) | 2001-08-22 | 2003-02-27 | Mikael Goldstein | Method and apparatus for gaze responsive text presentation in RSVP display |
AUPR872301A0 (en) | 2001-11-08 | 2001-11-29 | Sleep Diagnostics Pty Ltd | Alertness monitor |
US6712468B1 (en) | 2001-12-12 | 2004-03-30 | Gregory T. Edwards | Techniques for facilitating use of eye tracking data |
US7715595B2 (en) * | 2002-01-16 | 2010-05-11 | Iritech, Inc. | System and method for iris identification using stereoscopic face recognition |
KR100954640B1 (en) * | 2002-02-05 | 2010-04-27 | Panasonic Corporation | Personal authentication method and device
US6873714B2 (en) | 2002-02-19 | 2005-03-29 | Delphi Technologies, Inc. | Auto calibration and personalization of eye tracking system using larger field of view imager with higher resolution |
US6919907B2 (en) | 2002-06-20 | 2005-07-19 | International Business Machines Corporation | Anticipatory image capture for stereoscopic remote viewing with foveal priority |
US20040061680A1 (en) | 2002-07-10 | 2004-04-01 | John Taboada | Method and apparatus for computer control |
US7400782B2 (en) * | 2002-08-28 | 2008-07-15 | Arcsoft, Inc. | Image warping correction in forming 360 degree panoramic images |
JP3574653B2 (en) * | 2002-09-13 | 2004-10-06 | Matsushita Electric Industrial Co., Ltd. | Iris coding method, personal authentication method, iris code registration device, iris authentication device, and iris authentication program
US7486806B2 (en) * | 2002-09-13 | 2009-02-03 | Panasonic Corporation | Iris encoding method, individual authentication method, iris code registration device, iris authentication device, and iris authentication program |
BR0315384A (en) | 2002-10-15 | 2005-09-06 | Volvo Technology Corp | Method and arrangement for interpreting the head and eye activity of individuals
US6932090B1 (en) | 2003-02-06 | 2005-08-23 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Motion sickness treatment apparatus and method |
US7347551B2 (en) | 2003-02-13 | 2008-03-25 | Fergason Patent Properties, Llc | Optical system for monitoring eye movement |
US7881493B1 (en) | 2003-04-11 | 2011-02-01 | Eyetools, Inc. | Methods and apparatuses for use of eye interpretation information |
US7401920B1 (en) * | 2003-05-20 | 2008-07-22 | Elbit Systems Ltd. | Head mounted eye tracking and display system |
US9274598B2 (en) | 2003-08-25 | 2016-03-01 | International Business Machines Corporation | System and method for selecting and activating a target object using a combination of eye gaze and key presses |
US20050047629A1 (en) | 2003-08-25 | 2005-03-03 | International Business Machines Corporation | System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking |
EP1671258A4 (en) * | 2003-09-04 | 2008-03-19 | Sarnoff Corp | Method and apparatus for performing iris recognition from an image |
US8098901B2 (en) * | 2005-01-26 | 2012-01-17 | Honeywell International Inc. | Standoff iris recognition system |
US7365738B2 (en) | 2003-12-02 | 2008-04-29 | International Business Machines Corporation | Guides and indicators for eye movement monitoring systems |
US7561143B1 (en) | 2004-03-19 | 2009-07-14 | The University of the Arts | Using gaze actions to interact with a display |
CN102670163B (en) | 2004-04-01 | 2016-04-13 | William C. Torch | System and method for controlling a computing device
US7195355B2 (en) * | 2004-04-28 | 2007-03-27 | Neurocom International, Inc. | Isolating and quantifying functional impairments of the gaze stabilization system |
ES2535364T3 (en) | 2004-06-18 | 2015-05-08 | Tobii Ab | Eye control of computer equipment |
US7797040B2 (en) * | 2004-12-16 | 2010-09-14 | California Institute Of Technology | Prosthetic devices and methods and systems related thereto |
MX2007010513A (en) | 2005-03-04 | 2008-01-16 | Sleep Diagnostics Pty Ltd | Measuring alertness. |
WO2006132686A2 (en) * | 2005-06-03 | 2006-12-14 | Sarnoff Corporation | Method and apparatus for designing iris biometric systems for use in minimally constrained settings
JP2007006393A (en) * | 2005-06-27 | 2007-01-11 | Institute Of Physical & Chemical Research | Information presentation system |
US7438414B2 (en) | 2005-07-28 | 2008-10-21 | Outland Research, Llc | Gaze discriminating electronic control apparatus, system, method and computer program product |
US7751598B2 (en) * | 2005-08-25 | 2010-07-06 | Sarnoff Corporation | Methods and systems for biometric identification |
CA2627068C (en) * | 2005-10-24 | 2015-02-03 | Itesoft S.A. | Device and method for interaction with a user |
WO2007050029A2 (en) * | 2005-10-28 | 2007-05-03 | Tobii Technology Ab | Eye tracker with visual feedback |
US7429108B2 (en) | 2005-11-05 | 2008-09-30 | Outland Research, Llc | Gaze-responsive interface to enhance on-screen user reading tasks |
US8260008B2 (en) * | 2005-11-11 | 2012-09-04 | Eyelock, Inc. | Methods for performing biometric recognition of a human eye and corroboration of same |
JP2007159610A (en) * | 2005-12-09 | 2007-06-28 | Matsushita Electric Ind Co Ltd | Registration device, authentication device, registration authentication device, registration method, authentication method, registration program, and authentication program |
US7760910B2 (en) | 2005-12-12 | 2010-07-20 | Eyetools, Inc. | Evaluation of visual stimuli using existing viewing data |
JP4367424B2 (en) * | 2006-02-21 | 2009-11-18 | Oki Electric Industry Co., Ltd. | Personal identification device and personal identification method
US8793620B2 (en) | 2011-04-21 | 2014-07-29 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
JP2007319174A (en) * | 2006-05-30 | 2007-12-13 | Matsushita Electric Ind Co Ltd | Photographic equipment and authentication apparatus using the same |
US20070297653A1 (en) * | 2006-06-22 | 2007-12-27 | Rudolf Maarten Bolle | Fingerprint representation using localized texture features |
US7574021B2 (en) * | 2006-09-18 | 2009-08-11 | Sarnoff Corporation | Iris recognition for a secure facility |
US7970179B2 (en) * | 2006-09-25 | 2011-06-28 | Identix Incorporated | Iris data extraction |
JP2008206143A (en) * | 2007-01-23 | 2008-09-04 | Kanazawa Univ | Imaging device having image processing function |
JP2008198028A (en) * | 2007-02-14 | 2008-08-28 | Sony Corp | Wearable device, authentication method and program |
JP2008288767A (en) | 2007-05-16 | 2008-11-27 | Sony Corp | Information processor, method, and program |
IL184399A0 (en) * | 2007-07-03 | 2007-10-31 | Yossi Tsuria | Content delivery system |
WO2009029757A1 (en) * | 2007-09-01 | 2009-03-05 | Global Rainmakers, Inc. | System and method for iris data acquisition for biometric identification |
US8462949B2 (en) | 2007-11-29 | 2013-06-11 | Oculis Labs, Inc. | Method and apparatus for secure display of visual content |
WO2009093435A1 (en) | 2008-01-25 | 2009-07-30 | Panasonic Corporation | Brain wave interface system, brain wave interface device, method and computer program |
CN101677762B (en) * | 2008-02-28 | 2012-08-22 | Panasonic Corporation | Sight line detector and method for detecting sight line
US20100045596A1 (en) * | 2008-08-21 | 2010-02-25 | Sony Ericsson Mobile Communications Ab | Discreet feature highlighting |
US7850306B2 (en) | 2008-08-28 | 2010-12-14 | Nokia Corporation | Visual cognition aware display and visual data transmission architecture |
WO2010042557A2 (en) | 2008-10-06 | 2010-04-15 | Neuro Kinetics, Inc. | Method and apparatus for corrective secondary saccades analysis with video oculography system |
ATE527934T1 (en) | 2009-04-01 | 2011-10-15 | Tobii Technology Ab | ADAPTIVE CAMERA AND ILLUMINATOR EYE TRACKER |
US20120105486A1 (en) | 2009-04-09 | 2012-05-03 | Dynavox Systems Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods
US8472681B2 (en) * | 2009-06-15 | 2013-06-25 | Honeywell International Inc. | Iris and ocular recognition system using trace transforms |
CN101943982B (en) * | 2009-07-10 | 2012-12-12 | Peking University | Method for manipulating image based on tracked eye movements
WO2011008793A1 (en) * | 2009-07-13 | 2011-01-20 | Emsense Corporation | Systems and methods for generating bio-sensory metrics |
EP3338621B1 (en) | 2009-07-16 | 2019-08-07 | Tobii AB | Eye detection unit using parallel data flow |
US20110047377A1 (en) * | 2009-08-19 | 2011-02-24 | Harris Corporation | Secure digital communications via biometric key generation |
JP5613025B2 (en) * | 2009-11-18 | 2014-10-22 | Panasonic Corporation | Gaze detection apparatus, gaze detection method, electrooculogram measurement apparatus, wearable camera, head mounted display, electronic glasses, and ophthalmologic diagnosis apparatus
US9507418B2 (en) * | 2010-01-21 | 2016-11-29 | Tobii Ab | Eye tracker based contextual action |
US8922342B1 (en) * | 2010-02-15 | 2014-12-30 | Noblis, Inc. | Systems, apparatus, and methods for continuous authentication |
US8467133B2 (en) * | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
US9759917B2 (en) * | 2010-02-28 | 2017-09-12 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered AR eyepiece interface to external devices |
EP2539759A1 (en) * | 2010-02-28 | 2013-01-02 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
US9091851B2 (en) * | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US8890946B2 (en) | 2010-03-01 | 2014-11-18 | Eyefluence, Inc. | Systems and methods for spatially controlled scene illumination |
KR20110125460A (en) * | 2010-05-13 | 2011-11-21 | 김석수 | A product information provider system using eye tracing and a method thereof |
US8593375B2 (en) | 2010-07-23 | 2013-11-26 | Gregory A Maltz | Eye gaze user interface and method |
US9916006B2 (en) | 2010-07-23 | 2018-03-13 | Telepatheye Inc. | Eye-wearable device user interface and method |
US9213405B2 (en) | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
US9690099B2 (en) * | 2010-12-17 | 2017-06-27 | Microsoft Technology Licensing, Llc | Optimized focal area for augmented reality displays |
EP2923638B1 (en) * | 2011-03-18 | 2019-02-20 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Optical measuring device and system |
US8643680B2 (en) | 2011-04-08 | 2014-02-04 | Amazon Technologies, Inc. | Gaze-based content display |
US8682073B2 (en) | 2011-04-28 | 2014-03-25 | Sri International | Method of pupil segmentation |
US9256720B2 (en) * | 2011-05-18 | 2016-02-09 | Nextgenid, Inc. | Enrollment kiosk including biometric enrollment and verification, face recognition and fingerprint matching systems |
WO2012159070A2 (en) * | 2011-05-18 | 2012-11-22 | Nextgenid, Inc. | Multi-biometric enrollment kiosk including biometric enrollment and verification, face recognition and fingerprint matching systems |
US8911087B2 (en) * | 2011-05-20 | 2014-12-16 | Eyefluence, Inc. | Systems and methods for measuring reactions of head, eyes, eyelids and pupils |
US8885877B2 (en) | 2011-05-20 | 2014-11-11 | Eyefluence, Inc. | Systems and methods for identifying gaze tracking scene reference locations |
EP2726937B1 (en) * | 2011-06-30 | 2019-01-23 | Nokia Technologies Oy | Method, apparatus and computer program product for generating panorama images |
US20130021374A1 (en) * | 2011-07-20 | 2013-01-24 | Google Inc. | Manipulating And Displaying An Image On A Wearable Computing System |
US8965064B2 (en) * | 2011-08-22 | 2015-02-24 | Eyelock, Inc. | Systems and methods for capturing artifact free images |
US9342610B2 (en) * | 2011-08-25 | 2016-05-17 | Microsoft Technology Licensing, Llc | Portals: registered objects as virtualized, personalized displays |
WO2013033195A2 (en) * | 2011-08-30 | 2013-03-07 | Microsoft Corporation | Head mounted display with iris scan profiling |
US8223024B1 (en) * | 2011-09-21 | 2012-07-17 | Google Inc. | Locking mechanism based on unnatural movement of head-mounted display |
ES2620762T3 (en) * | 2011-10-27 | 2017-06-29 | Tobii Ab | Power management in an eye tracking system |
US20130258089A1 (en) * | 2011-11-03 | 2013-10-03 | Intel Corporation | Eye Gaze Based Image Capture |
US8929589B2 (en) | 2011-11-07 | 2015-01-06 | Eyefluence, Inc. | Systems and methods for high-resolution gaze tracking |
KR101891786B1 (en) | 2011-11-29 | 2018-08-27 | Samsung Electronics Co., Ltd. | Operation method for user function based on eye tracking, and portable device supporting the same
US8235529B1 (en) * | 2011-11-30 | 2012-08-07 | Google Inc. | Unlocking a screen using eye tracking information |
US8955973B2 (en) * | 2012-01-06 | 2015-02-17 | Google Inc. | Method and system for input detection using structured light projection |
WO2013111140A2 (en) * | 2012-01-26 | 2013-08-01 | Umoove Services Ltd. | Eye tracking |
US9001030B2 (en) | 2012-02-15 | 2015-04-07 | Google Inc. | Heads up display |
KR101158502B1 (en) * | 2012-02-16 | 2012-06-20 | 김유정 | User recognition device for access control |
CN104159497B (en) * | 2012-03-09 | 2018-01-12 | 奥斯派克特公司 | For the method and its device of the function of assessing vision system |
CN104246682B (en) | 2012-03-26 | 2017-08-25 | 苹果公司 | Enhanced virtual touchpad and touch-screen |
US9082011B2 (en) | 2012-03-28 | 2015-07-14 | Texas State University—San Marcos | Person identification using ocular biometrics with liveness detection |
US8864310B2 (en) | 2012-05-01 | 2014-10-21 | RightEye, LLC | Systems and methods for evaluating human eye tracking |
EP2847648A4 (en) | 2012-05-09 | 2016-03-02 | Intel Corp | Eye tracking based selective accentuation of portions of a display |
DE102012105664A1 (en) | 2012-06-28 | 2014-04-10 | Oliver Hein | Method and device for coding eye and eye tracking data |
JP2014044654A (en) * | 2012-08-28 | 2014-03-13 | Nikon Corp | Information input and output device |
US9189064B2 (en) | 2012-09-05 | 2015-11-17 | Apple Inc. | Delay of display event based on user gaze |
US20140092006A1 (en) | 2012-09-28 | 2014-04-03 | Joshua Boelter | Device and method for modifying rendering based on viewer focus area from eye tracking |
WO2014057618A1 (en) | 2012-10-09 | 2014-04-17 | Panasonic Corporation | Three-dimensional display device, three-dimensional image processing device and three-dimensional display method
JP2014092941A (en) * | 2012-11-02 | 2014-05-19 | Sony Corp | Information processor and information processing method and computer program |
JP2014092940A (en) * | 2012-11-02 | 2014-05-19 | Sony Corp | Image display device and image display method and computer program |
US9626072B2 (en) * | 2012-11-07 | 2017-04-18 | Honda Motor Co., Ltd. | Eye gaze control system |
US9674510B2 (en) | 2012-11-21 | 2017-06-06 | Elwha Llc | Pulsed projection system for 3D video |
US9083757B2 (en) | 2012-11-21 | 2015-07-14 | Telefonaktiebolaget L M Ericsson (publ) | Multi-objective server placement determination
JP5652886B2 (en) * | 2012-11-28 | 2015-01-14 | NEC Casio Mobile Communications, Ltd. | Face authentication device, authentication method and program, information device
US20140218281A1 (en) | 2012-12-06 | 2014-08-07 | Eyefluence, Inc. | Systems and methods for eye gaze determination |
WO2014093227A1 (en) * | 2012-12-10 | 2014-06-19 | Sri International | Iris biometric matching system |
US20140173407A1 (en) | 2012-12-17 | 2014-06-19 | Empire Technology Development Llc | Progressively triggered auto-fill |
WO2014111924A1 (en) | 2013-01-15 | 2014-07-24 | Poow Innovation Ltd. | Dynamic icons |
US9829971B2 (en) | 2013-01-21 | 2017-11-28 | Facebook, Inc. | Systems and methods of eye tracking control |
US9070015B2 (en) * | 2013-02-07 | 2015-06-30 | Ittiam Systems (P) Ltd. | System and method for iris detection in digital images |
US9791921B2 (en) | 2013-02-19 | 2017-10-17 | Microsoft Technology Licensing, Llc | Context-aware augmented reality object commands |
KR102093198B1 (en) | 2013-02-21 | 2020-03-25 | Samsung Electronics Co., Ltd. | Method and apparatus for user interface using gaze interaction
KR102175853B1 (en) | 2013-02-22 | 2020-11-06 | Samsung Electronics Co., Ltd. | Method for controlling operation and an electronic device thereof
KR20160005013A (en) * | 2013-03-01 | 2016-01-13 | Tobii AB | Delay warp gaze interaction
US10268276B2 (en) * | 2013-03-15 | 2019-04-23 | Eyecam, LLC | Autonomous computing and telecommunications head-up displays glasses |
KR101627290B1 (en) * | 2013-04-16 | 2016-06-21 | 구태언 | Head-mounted display apparatus with enhanced secuirity and method for accessing encrypted information by the apparatus |
US9979547B2 (en) * | 2013-05-08 | 2018-05-22 | Google Llc | Password management |
US10025378B2 (en) * | 2013-06-25 | 2018-07-17 | Microsoft Technology Licensing, Llc | Selecting user interface elements via position signal |
KR101882594B1 (en) * | 2013-09-03 | 2018-07-26 | Tobii AB | Portable eye tracking device
US9582716B2 (en) * | 2013-09-09 | 2017-02-28 | Delta ID Inc. | Apparatuses and methods for iris based biometric recognition |
EP3090322A4 (en) | 2013-12-31 | 2017-07-19 | Eyefluence, Inc. | Systems and methods for gaze-based media selection and editing |
US9552060B2 (en) * | 2014-01-28 | 2017-01-24 | Microsoft Technology Licensing, Llc | Radial selection by vestibulo-ocular reflex fixation |
US10564714B2 (en) | 2014-05-09 | 2020-02-18 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
KR102173699B1 (en) | 2014-05-09 | 2020-11-03 | 아이플루언스, 인크. | Systems and methods for discerning eye signals and continuous biometric identification |
US20160364609A1 (en) * | 2015-06-12 | 2016-12-15 | Delta ID Inc. | Apparatuses and methods for iris based biometric recognition |
2015
- 2015-05-09 KR KR1020167034652A patent/KR102173699B1/en active IP Right Grant
- 2015-05-09 CN CN201580035714.9A patent/CN106537290B/en active Active
- 2015-05-09 EP EP15826370.7A patent/EP3140719B1/en active Active
- 2015-05-09 AU AU2015297035A patent/AU2015297035B2/en active Active
- 2015-05-09 WO PCT/US2015/030047 patent/WO2015172124A1/en active Application Filing
- 2015-05-09 JP JP2017511569A patent/JP6550460B2/en active Active
- 2015-05-09 JP JP2017511567A patent/JP2017527036A/en active Pending
- 2015-05-09 KR KR1020207030953A patent/KR20200127267A/en not_active Application Discontinuation
- 2015-05-09 KR KR1020167034649A patent/KR102230172B1/en active IP Right Grant
- 2015-05-09 CN CN201580034682.0A patent/CN107087431B/en active Active
- 2015-05-09 US US14/708,229 patent/US20150324568A1/en not_active Abandoned
- 2015-05-09 US US14/708,234 patent/US10620700B2/en active Active
- 2015-05-09 EP EP15827954.7A patent/EP3140780B1/en active Active
- 2015-05-09 CN CN201580031094.1A patent/CN106462743A/en active Pending
- 2015-05-09 AU AU2015255652A patent/AU2015255652B2/en active Active
- 2015-05-09 WO PCT/US2015/030050 patent/WO2016018487A2/en active Application Filing
- 2015-05-09 JP JP2017511568A patent/JP2017526078A/en active Pending
- 2015-05-09 AU AU2015297036A patent/AU2015297036B2/en active Active
- 2015-05-09 KR KR1020167034651A patent/KR20170046108A/en not_active IP Right Cessation
- 2015-05-09 WO PCT/US2015/030052 patent/WO2016018488A2/en active Application Filing
- 2015-05-09 US US14/708,241 patent/US9600069B2/en active Active
- 2015-05-09 EP EP15789095.5A patent/EP3140779A4/en not_active Withdrawn
- 2015-11-02 US US14/930,617 patent/US9823744B2/en active Active
- 2015-11-10 US US14/937,782 patent/US20160062459A1/en not_active Abandoned
2016
- 2016-04-18 US US15/131,273 patent/US20160274660A1/en active Pending
2017
- 2017-01-27 US US15/418,034 patent/US10156900B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140096077A1 (en) * | 2012-09-28 | 2014-04-03 | Michal Jacob | System and method for inferring user intent based on eye movement during observation of a display screen |
US20140289834A1 (en) * | 2013-03-22 | 2014-09-25 | Rolf Lindemann | System and method for eye tracking during authentication |
US20140289833A1 (en) * | 2013-03-22 | 2014-09-25 | Marc Briceno | Advanced authentication techniques and applications |
Non-Patent Citations (1)
Title |
---|
Dario D. Salvucci, Inferring Intent in Eye-Based Interfaces: Tracing Eye Movements with Process Models, Human Factors in Computing Systems: CHI 99 Conference Proceedings (pp. 254-261), New York, ACM Press. *
Cited By (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9823744B2 (en) | 2014-05-09 | 2017-11-21 | Google Inc. | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US10620700B2 (en) | 2014-05-09 | 2020-04-14 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US10564714B2 (en) | 2014-05-09 | 2020-02-18 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US10325083B2 (en) * | 2014-06-27 | 2019-06-18 | Intel Corporation | Wearable electronic devices |
US20160246365A1 (en) * | 2015-02-23 | 2016-08-25 | International Business Machines Corporation | Interfacing via heads-up display using eye contact |
US20160246054A1 (en) * | 2015-02-23 | 2016-08-25 | International Business Machines Corporation | Interfacing via heads-up display using eye contact |
US9652035B2 (en) * | 2015-02-23 | 2017-05-16 | International Business Machines Corporation | Interfacing via heads-up display using eye contact |
US9658689B2 (en) * | 2015-02-23 | 2017-05-23 | International Business Machines Corporation | Interfacing via heads-up display using eye contact |
US10305895B2 (en) * | 2015-04-14 | 2019-05-28 | Blubox Security, Inc. | Multi-factor and multi-mode biometric physical access control device |
US20160308859A1 (en) * | 2015-04-14 | 2016-10-20 | Blub0X Technology Holdings, Inc. | Multi-factor and multi-mode biometric physical access control device |
US11595479B2 (en) | 2015-06-15 | 2023-02-28 | Blubøx Security, Inc. | Web-cloud hosted unified physical security system |
US10757194B2 (en) | 2015-06-15 | 2020-08-25 | Blubøx Security, Inc. | Web-cloud hosted unified physical security system |
US10554758B2 (en) | 2015-06-15 | 2020-02-04 | Blub0X Security, Inc. | Web-cloud hosted unified physical security system |
US11538280B2 (en) * | 2015-08-21 | 2022-12-27 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
US20200293744A1 (en) * | 2015-08-21 | 2020-09-17 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
US10949255B2 (en) * | 2015-08-31 | 2021-03-16 | Ayla Networks, Inc. | Compact schedules for resource-constrained devices |
US20190155896A1 (en) * | 2015-08-31 | 2019-05-23 | Ayla Networks, Inc. | Compact schedules for resource-constrained devices |
US11416073B2 (en) | 2015-09-04 | 2022-08-16 | Sony Interactive Entertainment Inc. | Apparatus and method for dynamic graphics rendering based on saccade detection |
US20170069159A1 (en) * | 2015-09-04 | 2017-03-09 | Musigma Business Solutions Pvt. Ltd. | Analytics system and method |
US11099645B2 (en) | 2015-09-04 | 2021-08-24 | Sony Interactive Entertainment Inc. | Apparatus and method for dynamic graphics rendering based on saccade detection |
US11703947B2 (en) | 2015-09-04 | 2023-07-18 | Sony Interactive Entertainment Inc. | Apparatus and method for dynamic graphics rendering based on saccade detection |
US20170090588A1 (en) * | 2015-09-29 | 2017-03-30 | Kabushiki Kaisha Toshiba | Electronic device and method |
US11749025B2 (en) | 2015-10-16 | 2023-09-05 | Magic Leap, Inc. | Eye pose identification using eye features |
US10338677B2 (en) * | 2015-10-28 | 2019-07-02 | Microsoft Technology Licensing, Llc | Adjusting image frames based on tracking motion of eyes |
US20170123489A1 (en) * | 2015-10-28 | 2017-05-04 | Microsoft Technology Licensing, Llc | Adjusting image frames based on tracking motion of eyes |
US10466778B2 (en) | 2016-01-19 | 2019-11-05 | Magic Leap, Inc. | Eye image selection |
US11231775B2 (en) | 2016-01-19 | 2022-01-25 | Magic Leap, Inc. | Eye image selection |
US11579694B2 (en) | 2016-01-19 | 2023-02-14 | Magic Leap, Inc. | Eye image selection |
US10831264B2 (en) | 2016-01-19 | 2020-11-10 | Magic Leap, Inc. | Eye image combination |
US11436625B2 (en) | 2016-03-22 | 2022-09-06 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
EP3433707B1 (en) | 2016-03-22 | 2020-10-28 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
EP3779740B1 (en) | 2016-03-22 | 2021-12-08 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
US10733275B1 (en) * | 2016-04-01 | 2020-08-04 | Massachusetts Mutual Life Insurance Company | Access control through head imaging and biometric authentication |
US10353465B2 (en) * | 2016-06-08 | 2019-07-16 | South China University Of Technology | Iris and pupil-based gaze estimation method for head-mounted device |
WO2018046347A1 (en) * | 2016-09-07 | 2018-03-15 | Bundesdruckerei Gmbh | Data glasses for cryptographically signing image data |
EP3940559A1 (en) * | 2016-09-07 | 2022-01-19 | Bundesdruckerei GmbH | Data goggles for cryptographic signing of image data |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US11232655B2 (en) | 2016-09-13 | 2022-01-25 | Iocurrents, Inc. | System and method for interfacing with a vehicular controller area network |
US10750560B2 (en) | 2016-09-27 | 2020-08-18 | Extreme Networks, Inc. | IoT device management using multi-protocol infrastructure network devices |
US11170087B2 (en) | 2017-02-23 | 2021-11-09 | Advanced New Technologies Co., Ltd. | Virtual reality scene-based business verification method and device |
US10996477B2 (en) | 2017-02-27 | 2021-05-04 | Advanced New Technologies Co., Ltd. | Virtual reality head-mounted apparatus |
US10866633B2 (en) | 2017-02-28 | 2020-12-15 | Microsoft Technology Licensing, Llc | Signing with your eyes |
EP3376367A1 (en) * | 2017-03-13 | 2018-09-19 | Siemens Aktiengesellschaft | Acknowledgement of the transfer of a good |
US11461444B2 (en) | 2017-03-31 | 2022-10-04 | Advanced New Technologies Co., Ltd. | Information processing method and device based on internet of things |
US10397594B2 (en) | 2017-04-28 | 2019-08-27 | Hewlett Packard Enterprise Development Lp | Real-time processing of IoT data |
US10748340B1 (en) * | 2017-07-31 | 2020-08-18 | Apple Inc. | Electronic device with coordinated camera and display operation |
KR20190022376A (en) * | 2017-08-23 | 2019-03-06 | 한국전자통신연구원 | Apparatus for self-quantification service |
KR102577681B1 (en) | 2017-08-23 | 2023-09-14 | 한국전자통신연구원 | Apparatus for self-quantification service |
US20190125264A1 (en) * | 2017-10-29 | 2019-05-02 | Orlando Efrain Abreu Oramas | Method and system of facilitating monitoring of an individual based on at least one wearable device |
US10492725B2 (en) * | 2017-10-29 | 2019-12-03 | Orlando Efrain Abreu Oramas | Method and system of facilitating monitoring of an individual based on at least one wearable device |
US11533272B1 (en) * | 2018-02-06 | 2022-12-20 | Amesite Inc. | Computer based education methods and apparatus |
US20190266427A1 (en) * | 2018-02-23 | 2019-08-29 | Samsung Electronics Co., Ltd | Method of biometric authenticating using plurality of camera with different field of view and electronic apparatus thereof |
US10867202B2 (en) * | 2018-02-23 | 2020-12-15 | Samsung Electronics Co., Ltd. | Method of biometric authenticating using plurality of camera with different field of view and electronic apparatus thereof |
CN112262373A (en) * | 2018-06-26 | 2021-01-22 | 苹果公司 | View-based breakpoints |
US11861145B2 (en) | 2018-07-17 | 2024-01-02 | Methodical Mind, Llc | Graphical user interface system |
WO2020048778A1 (en) * | 2018-09-04 | 2020-03-12 | Robert Bosch Gmbh | Method for controlling a multimedia device, and computer program and device therefor |
EP3648069A1 (en) * | 2018-10-29 | 2020-05-06 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for selling commodity, vending machine and storage medium |
US11501299B2 (en) | 2018-10-29 | 2022-11-15 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method for selling commodity, vending machine and storage medium |
US11271745B2 (en) | 2019-03-19 | 2022-03-08 | Advanced New Technologies Co., Ltd. | Method and system for operating internet of things device |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
CN114223194A (en) * | 2019-08-06 | 2022-03-22 | 爱尔康公司 | Scene camera system and method for vitreoretinal surgery |
CN111091595A (en) * | 2019-12-23 | 2020-05-01 | 吉林省广播电视研究所(吉林省广播电视局科技信息中心) | Strabismus three-dimensional mapping method and mapping system |
US11706656B2 (en) | 2020-06-29 | 2023-07-18 | Qualcomm Incorporated | Downlink data prioritization for time-sensitive applications |
WO2022015812A1 (en) * | 2020-07-14 | 2022-01-20 | Surgical Theater, Inc. | System and method for four-dimensional angiography |
US11698535B2 (en) | 2020-08-14 | 2023-07-11 | Hes Ip Holdings, Llc | Systems and methods for superimposing virtual image on real-time image |
US11822089B2 (en) | 2020-08-14 | 2023-11-21 | Hes Ip Holdings, Llc | Head wearable virtual image module for superimposing virtual image on real-time image |
US11774759B2 (en) | 2020-09-03 | 2023-10-03 | Hes Ip Holdings, Llc | Systems and methods for improving binocular vision |
US11953689B2 (en) | 2020-09-30 | 2024-04-09 | Hes Ip Holdings, Llc | Virtual image display system for virtual reality and augmented reality devices |
US11811513B2 (en) * | 2020-12-04 | 2023-11-07 | Capital One Services, Llc | Methods and systems for managing multiple content delivery networks |
US11838419B2 (en) | 2021-01-15 | 2023-12-05 | Delta Electronics, Inc. | Method and system for monitoring industrial devices |
WO2022182916A1 (en) * | 2021-02-24 | 2022-09-01 | Lifebrand, Llc | System and method for determining the impact of a social media post across multiple social media platforms |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2015255652B2 (en) | Systems and methods for using eye signals with secure mobile communications | |
US11704939B2 (en) | Liveness detection | |
Zhang et al. | Continuous authentication using eye movement response of implicit visual stimuli | |
EP3284016B1 (en) | Authentication of a user of a device | |
Shrestha et al. | An offensive and defensive exposition of wearable computing | |
John et al. | The security-utility trade-off for iris authentication and eye animation for social virtual avatars | |
CN110866230B (en) | Authenticated device assisted user authentication | |
CN115427919A (en) | Physical companion device for use with augmented reality system | |
KR102132613B1 (en) | User authentication method using biometrics technology and authentication device | |
WO2023164268A1 (en) | Devices, methods, and graphical user interfaces for authorizing a secure operation | |
Sellahewa et al. | Biometric Authentication for Wearables | |
Sireesha et al. | A survey on gaze estimation techniques | |
US20230161411A1 (en) | Control device and control method | |
Sluganovic | Security of mixed reality systems: authenticating users, devices, and data | |
Alt et al. | Human-centered Behavioral and Physiological Security | |
US20230273985A1 (en) | Devices, methods, and graphical user interfaces for authorizing a secure operation | |
US20230418372A1 (en) | Gaze behavior detection | |
Nguyen | Security from Implicit Information | |
Li | Empowering Security and Privacy-Preserving Interactions for Smart Device Users | |
Alharbi | An Authentication Framework for Wearable Devices | |
KR20200093392A (en) | An apparatus related to user identification, authentication, encryption using biometrics technology and method for operation the same | |
NZ736861A (en) | Augmented reality systems and methods for tracking biometric data | |
NZ736861B2 (en) | Augmented reality systems and methods for tracking biometric data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EYEFLUENCE, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MARGGRAFF, LEWIS JAMES;DRAKE, ELIOT FRANCIS;PUBLICOVER, NELSON GEORGE;SIGNING DATES FROM 20160819 TO 20160826;REEL/FRAME:039569/0330 |
|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EYEFLUENCE, INC.;REEL/FRAME:041160/0867 Effective date: 20170127 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044129/0001 Effective date: 20170929 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |