US20140218281A1 - Systems and methods for eye gaze determination - Google Patents
- Publication number
- US20140218281A1 (application US14/099,900)
- Authority
- US
- United States
- Prior art keywords
- user
- eye
- wearable device
- camera
- endo
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/38—Releasing-devices separate from shutter
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2213/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B2213/02—Viewfinders
- G03B2213/025—Sightline detection
Abstract
Devices and methods are provided for eye tracking and gaze determination. In one embodiment, a method for compensating for movement of a wearable eye tracking device relative to a user's eye is provided that includes wearing a wearable device on a user's head such that one or more endo-cameras are positioned to acquire images of one or both of the user's eyes, and an exo-camera is positioned to acquire images of the user's surroundings; calculating the location of features in a user's eye that cannot be directly observed from images of the eye acquired by an endo-camera; and spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment.
Description
- This application claims benefit of co-pending provisional applications Ser. Nos. 61/734,354, 61/734,294, and 61/734,342, all filed Dec. 6, 2012. This application is also related to applications Ser. Nos. 12/715,177, filed Mar. 1, 2010, 13/290,948, filed Nov. 7, 2011, and U.S. Pat. No. 6,541,081. The entire disclosures of these references are expressly incorporated by reference herein.
- The U.S. Government may have a paid-up license in this invention and the right in limited circumstances to require the patent owner to license others on reasonable terms as provided for by the terms of Department of Defense (US Army) Contract No. W81XWH-05-C-0045, U.S. Department of Defense Congressional Research Initiatives No. W81XWH-06-2-0037, W81XWH-09-2-0141, and W81XWH-11-2-0156; and U.S. Department of Transportation Congressional Research Initiative Agreement Award No. DTNH 22-05-H-01424.
- The present invention relates generally to systems and methods for eye tracking that are implemented for gaze determination, e.g., determining locations in space or object(s) being viewed by one or both eyes. In particular, the gaze-determination systems and methods herein may enable point-of-gaze determination in a wearable device without the need for head-tracking after calibration.
- The systems and methods herein relate to gaze tracking using a wearable eye-tracking device that utilizes head-pose estimation to improve gaze accuracy. Head-tracking allows the system to know the user's head position in relation to the monitor, which enables the user to accurately interact with an electronic display or other monitor (e.g., control a pointer) using his/her gaze.
- Many wearable eye-tracking devices do not include head pose estimation. However, minor shifts in head pose can introduce ambiguity in eye trackers that use only the eye's visual axis when determining the gaze vector. Knowledge of the head pose can extend the range of accuracy of a gaze-tracking system.
- The present invention is directed to systems and methods for eye tracking that are implemented for gaze determination, e.g., determining locations in space or object(s) being viewed by one or both eyes. In particular, the gaze-determination systems and methods herein may enable point-of-gaze determination in a wearable device without the need for head-tracking after calibration.
- In accordance with an exemplary embodiment, a method is provided for eye tracking that includes one or more steps, such as calibrating a wearable device before the wearable device is worn by a user; placing the wearable device on a user's head adjacent one or both of the user's eyes; calibrating the wearable device after placing the wearable device on the user's head; detecting at least one eye feature of a first eye of the user's eyes; performing a compensation algorithm; and calculating a gaze direction of the user.
- In accordance with another embodiment, a system is provided for eye tracking that includes a wearable device configured to be worn on a user's head; an exo-camera on the wearable device configured to provide images of a user's surroundings when the wearable device is worn by the user; an endo-camera on the wearable device configured to provide images of a first eye of the user when the wearable device is worn by the user; and one or more processors configured for one or more of calibrating the wearable device before the wearable device is worn by a user; calibrating the wearable device after placing the wearable device on the user's head; detecting at least one eye feature of a first eye of the user's eyes; performing a compensation algorithm; and calculating a gaze direction of the user.
- In accordance with still another embodiment, a method is provided for compensating for movement of a wearable eye tracking device relative to a user's eye that includes wearing a wearable device on a user's head such that one or more endo-cameras are positioned to acquire images of one or both of the user's eyes, and an exo-camera is positioned to acquire images of the user's surroundings; calculating the location of features in a user's eye that cannot be directly observed from images of the eye acquired by an endo-camera; and spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment.
- Other aspects and features of the present invention will become apparent from consideration of the following description taken in conjunction with the accompanying drawings.
- The invention is best understood from the following detailed description when read in conjunction with the accompanying drawings. It will be appreciated that the exemplary apparatus shown in the drawings are not necessarily drawn to scale, with emphasis instead being placed on illustrating the various aspects and features of the illustrated embodiments.
- FIGS. 1A and 1B are perspective and back views, respectively, of an exemplary embodiment of a wearable gaze tracking device.
- FIG. 2 is a flowchart showing an exemplary method for gaze tracking using a wearable device, such as that shown in FIGS. 1A and 1B.
- FIG. 3 is a flowchart showing an exemplary method for gaze mapping that may be included in the method shown in FIG. 2.
- FIG. 4 is a flowchart showing an exemplary method for pupil detection that may be included in the method shown in FIG. 2.
- FIGS. 5 and 6 are schematic representations showing a projected pupil point on a virtual plane after normalization and denormalization using a method, such as that shown in FIG. 2.
- The present invention may provide apparatus, systems, and methods for head tracking and gaze tracking that include one or more of the following features:
- gaze tracking in a system that allows unrestricted movement of the head;
- gaze tracking in a system that is robust to small shifts in frame position relative to the face for a given user;
- gaze point registration with the scene image, with or without head tracking; and
- storage of the user's calibration data for use with a single headset at a later time.
- One of the hurdles to accurate gaze-mapping in a mobile wearable eye-tracking device is finding a user-friendly method to determine head pose information. In many cases, a user is comfortable with a short user-specific calibration. The main advantage of the gaze determination method disclosed herein is that point-of-regard may be maintained with or without head tracking after calibration. This is accomplished by estimating the point in space where the user is looking and projecting it onto the scene image. This allows for gaze determination in a plethora of environments not restricted to a computer desk.
- Turning to the drawings,
FIGS. 1A and 1B show an exemplary embodiment of a wearable gaze-tracking device 10 that includes a wearable device 12, e.g., a frame for glasses (as shown), or a mask, a headset, a helmet, and the like, that is configured to be worn on a user's head (not shown); an exo-camera 20 (mounted on the device to image the user's surroundings); and one or more endo-cameras 30 (mounted on the device to image one or both of the user's eyes). In addition, the device 10 may include one or more light sources, processors, memory, and the like (not shown) coupled to other components for operating the device 10 and/or performing the various functions described herein. Exemplary components, e.g., wearable devices, cameras, light sources, processors, communication interfaces, and the like, that may be included in the device 10 are disclosed in U.S. Pat. Nos. 6,541,081 and 7,488,294, and U.S. Publication Nos. 2011/0211056 and 2013/0114850, the entire disclosures of which are expressly incorporated by reference herein. - Turning to
FIG. 2, an exemplary method for gaze mapping and determination is shown. Although the steps are shown in an exemplary sequence, the steps may optionally be performed in a different order than that shown. Generally, the method includes a calibration step 110 in which the wearable device (e.g., device 10) is calibrated, a marker detection step 112, a pupil detection step 114, a glint detection step 116, a normalization step 118, a user calibration step 120, a gaze mapping step 122, and a three-dimensional (3D) point-of-regard (POR) step 124. In step 112, head pose estimation typically operates substantially continuously. Once the user has placed the device upon their head or face, gaze determination (steps 114-124), including user calibration step 120, generally begins with i) pupil detection 114, and ii) glint location (identifying glints reflected off of one or both eyes in images acquired by the endo-camera(s) 30), where i) and ii) may also be performed in reverse order (glint detection before pupil detection). The camera-to-camera calibration step 110 (calibrating the endo-camera(s) 30 and exo-camera 20) is generally performed before the user places the wearable device on their face, e.g., as described below. - Illumination Source Calibration Step: The first step in calibrating the glint locations in endo-camera images with the light source locations on the wearable device is to acquire a set of perspective images with a secondary reflective surface and light source(s). For example, images of a mirror placed near the working distance of the camera may be acquired, where the mirror's edges are surrounded by LEDs and the mirror is placed in front of the camera such that the glint-LEDs may be seen in the image. The second step is to use a software program to mark and extract the positions of the light sources surrounding the mirror and the reflections in the mirror of the glint-generating light sources on the wearable device.
The next step is to determine the homography between the image and the plane of the reflective surface. The aforementioned homography is subsequently applied to the glint light source in the image plane to get the three-dimensional (3D) point corresponding to the light source on the reflective surface. With the 3D locations in space, the ray originating at the light source that generated the glint on the reflective surface may be determined. These steps are repeated for each of the perspective images. The intersection of the calculated ray vectors is determined for each glint source for each perspective image acquired.
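The homography step above can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation; the function names and the four-point setup are assumptions. A direct linear transform (DLT) estimates the 3x3 homography from marked point correspondences, and applying it to a glint's image location yields the corresponding point on the reflective-surface plane.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src from >= 4 point pairs (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null space of A (smallest right singular vector) holds the 9 entries of H.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pt):
    """Map an image point into the plane and normalize the homogeneous result."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])
```

With the marked light-source positions as `src` and their known plane coordinates as `dst`, `apply_homography` then lifts each glint light source from the image plane onto the reflective surface.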
- Camera-Camera Calibration Step: Two standard checkerboards, and/or other known geometric patterns, are positioned such that one pattern substantially fills the field of view of each of the exo-camera 20 and the endo-camera(s) 30, e.g., positioned at a near-optimal working distance of the respective camera, i.e., such that the pattern is near best focus. The position of the checkerboards remains substantially fixed during camera-to-camera calibration. The wearable device is moved between the patterns while several sets of images with the patterns in full view are acquired from both the endo-camera(s) 30 and exo-camera 20 (eye and scene cameras, respectively). Each set of images yields a set of three equations; multiple sets of images yield an overdetermined matrix equation of the form Ax=B, which may be solved with singular value decomposition (SVD) to obtain the camera-to-camera transformation. - In addition, the
calibration step 110 may then include a User-Specific Calibration. In this step, codes displayed on the monitor in the exo-camera images are registered with an established monitor plane. This provides an estimate of head pose at each calibration and test point in the user's calibration session. The codes may come in the form of a variety of patterns comprising contrasting geometric phenomena. The patterns may be displayed on the monitor, constructed of other materials and attached to the monitor, provided as a series of light sources arranged in a pattern around the monitor, and the like. Additionally, head pose may be estimated using an accelerometer, MEMS device, or other orientation sensor. In the past, accelerometers were bulky, but their overall footprint has been significantly reduced with the incorporation of MEMS technology. - In an exemplary embodiment, user-specific calibration may be performed with mapping techniques, wherein mapping refers to a mathematical function. The function takes raw data as its input and evaluates to calibrated points. For example, a polynomial fit is applied to an entire space, and an output value for any point within that space is determined by the function.
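A polynomial mapping of the kind described above can be sketched with an ordinary least-squares fit (which NumPy solves via SVD, the same machinery mentioned for the camera-to-camera solve). The second-order feature set and the function names below are illustrative assumptions, not the patent's actual calibration function:

```python
import numpy as np

def poly_features(pts):
    """Second-order bivariate terms [1, x, y, x*y, x^2, y^2] for each raw point."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_gaze_mapping(raw_pts, screen_pts):
    """Least-squares polynomial mapping from raw gaze features to screen points."""
    coef, *_ = np.linalg.lstsq(poly_features(raw_pts), screen_pts, rcond=None)
    return coef  # shape (6, 2): one coefficient column per screen axis

def apply_gaze_mapping(coef, raw_pts):
    """Evaluate the fitted mapping at new raw points."""
    return poly_features(raw_pts) @ coef
```

Once fitted over the whole calibration space, the same coefficient matrix maps any raw point in that space to a calibrated screen point, exactly as the mapping paragraph describes.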
- In another exemplary embodiment, user-specific calibration may be performed with interpolation. While mapping covers an entire space of interest, interpolation is performed in a piecewise fashion on specific subregions and localized data. For example, the entire space may be subdivided into four subregions, and linear fits may be applied to each of those subregions by using a weighted average of the corner points of each region. If the number of subregions is increased, the interpolation approaches the polynomial fit of the prior exemplary embodiment.
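The piecewise scheme above, a weighted average of a subregion's corner points, is ordinary bilinear interpolation. A minimal sketch follows; the unit-square parameterization of each subregion is an assumption for illustration:

```python
import numpy as np

def bilinear_interpolate(corners, u, v):
    """Weighted average of a subregion's four corner calibration values.

    corners: values at (0,0), (1,0), (0,1), (1,1) of the unit subregion
             (scalars or calibrated 2D points);
    u, v:    normalized position inside the subregion, each in [0, 1].
    """
    c00, c10, c01, c11 = [np.asarray(c, float) for c in corners]
    return ((1 - u) * (1 - v) * c00 + u * (1 - v) * c10
            + (1 - u) * v * c01 + u * v * c11)
```

At a corner the weights collapse to that corner's value, and in the interior the result is the linear blend the paragraph describes; subdividing the space more finely makes this piecewise fit approach a single smooth mapping.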
- In another exemplary embodiment, user-specific calibration may be performed with machine learning. While machine learning may appear to behave like a mathematical function, as in the mapping method, machine learning techniques may internally represent highly irregular mappings that would otherwise require extremely complex mathematical equations, such as discontinuous functions and high-order polynomials. Machine learning techniques also make no assumptions about the types of equations they will model, meaning that the training procedure is identical regardless of the type of mapping it will ultimately represent. This eliminates, among other things, the need for the designer to understand the relationship between inputs and outputs. Such techniques may also execute very quickly, making them useful in high-performance applications.
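As an illustration of the point above, the sketch below trains a tiny one-hidden-layer network by plain gradient descent on a kinked mapping (y = |x|) that a single low-order polynomial handles poorly. The architecture, data, and hyperparameters are illustrative choices, not the patent's:

```python
import numpy as np

# Toy training data: an irregular (kinked) mapping that is awkward for a
# single polynomial but easy for a small learned model.
rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
Y = np.abs(X)

H = 8                                   # hidden units
W1 = 0.5 * rng.standard_normal((1, H))
b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal((H, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, Y0 = forward(X)
loss_initial = float(np.mean((Y0 - Y) ** 2))

lr = 0.05
for _ in range(5000):                   # plain full-batch gradient descent
    h = np.tanh(X @ W1 + b1)
    Yhat = h @ W2 + b2
    dYhat = 2.0 * (Yhat - Y) / len(X)   # dLoss/dYhat for mean squared error
    dW2 = h.T @ dYhat
    db2 = dYhat.sum(axis=0)
    dh = dYhat @ W2.T
    dz1 = dh * (1.0 - h ** 2)           # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, Y1 = forward(X)
loss_final = float(np.mean((Y1 - Y) ** 2))
```

The same training loop would apply unchanged to a smooth mapping, which is the procedure-independence the paragraph highlights.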
- Head-Pose Estimation Step: In the head-pose estimation step 112 in FIG. 2, each eye image in the video sequence is first pre-processed and has a threshold applied to acquire marker candidates as contours. The candidates are evaluated for contour size, roundness, and corner count. The final candidates are extracted and matched to marker codes stored within the system directories. The user's head pose and orientation are calculated relative to the marker corners. - The following steps may occur during and/or after the user calibration step:
- Pupil Detection Step: An exemplary embodiment of the pupil detection step 114 is shown in FIG. 4. One potential method for pupil detection is to first apply a blob detector, e.g., MSER, to a downsized and thresholded image to identify regions similar in features to a pupil in endo-camera images. The blob detector may, for example, be constrained to find circular (e.g., based on eccentricity, low-order moments, and the like) and stable regions that resemble a pupil. After a suitable region of interest is identified, an algorithm such as Dense Stage I Starburst may be applied to find pupil edges while ignoring glints. Finally, an ellipse is fitted to the pupil edge, for example using methods such as RANSAC or Hough transforms. Exemplary methods are disclosed in Chinese Publication No. CN102831610 and U.S. Pat. No. 7,110,568, the entire disclosures of which are expressly incorporated by reference herein. - Glint Detection Step: For the
glint detection step 116 of FIG. 2, in an exemplary embodiment, an adaptive threshold is first applied to a subwindow of the full-resolution image, where the threshold value is based on the mean and median intensity of the iris. The image contrast is enhanced. Then, the glints are segmented out of the image through a combination of thresholding and edge detection. Dilation and erosion filters are applied to the segmented glints to remove noise. The contours of the glint candidates are determined. The glint candidates are screened against predetermined parameters of actual glints, e.g., size constraints, odd shapes, eccentricity, and the like. The actual glints are selected from a final pool of candidates based on geometric constraints. - Cornea Center Calculation Step: Next, the
normalization step 118 of FIG. 2 may be performed. The locations of the light sources on the device 10 that produce the glints reflected at the anterior corneal surface, and the eye tracking camera's intrinsic parameters, are known. The cornea is assumed to be substantially spherical. Each glint establishes a trajectory of possible cornea center positions in three-dimensional (3D) space. Each trajectory pair generates a 3D location on which the cornea center resides. The corneal center coordinates are calculated using the aforementioned information together with a default corneal radius of curvature that matches the population average. - Once the gaze vector in the endo-camera coordinate system is obtained, it may be mapped to either a point-of-regard (POR) overlay, or the monitor plane if mouse or pointer control is required. In the case of two-dimensional (2D) POR and pointer control, head pose continues to be calculated. In either scenario, accurate gaze determination with unrestricted head movement may be accomplished through proper normalization and denormalization of the endo-camera (toward the eye) and exo-camera (outward-looking) spaces relative to a virtual plane. The image pupil point is projected onto the virtual plane (the mapping between the endo-, exo-, and virtual coordinate spaces is determined during calibration). Then the gaze point is found by intersecting the line formed by the cornea and the virtual plane point with the monitor.
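The final intersection described above, a line through the cornea and the virtual-plane point cut against the monitor plane, is a standard line-plane intersection. A minimal sketch, with coordinate conventions chosen only for illustration:

```python
import numpy as np

def intersect_line_plane(p0, p1, plane_pt, plane_n):
    """Intersect the line through p0 and p1 with a plane (point, normal)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    denom = np.dot(plane_n, d)
    if abs(denom) < 1e-12:
        raise ValueError("line is parallel to the plane")
    t = np.dot(plane_n, np.asarray(plane_pt, float) - p0) / denom
    return p0 + t * d
```

Here `p0` would be the cornea center, `p1` the projected virtual-plane point, and the plane the monitor established during calibration.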
- Because the frames are not fixed to the person, the user could move the frames while still looking at the same spot on the virtual plane. A processor analyzing the endo-camera images would detect that the center of the eye moved and project it to a different spot on the virtual plane. To rectify this problem, the center of the pupil is normalized. The cornea center is used as a reference point, and in every frame it is transformed to a specific, predetermined position. The normalized pupil position is then determined on the shifted cornea image.
- Essentially, the normalization puts the cornea in the same position in every frame of the endo-camera images, e.g., within an x-y-z reference frame. First, the normalization includes a rotation that will rotate the cornea about the origin and put it on the z-axis. This rotation is determined by restricting the rotation to a combination of rotation around the x-axis followed by rotation around the y-axis. The translation is determined by calculating the required translation to move the rotated cornea to a predetermined value on the z-axis. Because of the rotation done before translation, the translation only contains a z value.
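The rotation-then-translation just described can be sketched as follows; the target depth `z_target` and the function name are illustrative assumptions:

```python
import numpy as np

def normalization_transform(cornea, z_target=30.0):
    """Rotation (about x, then y) that puts the cornea center on the z-axis,
    plus the z-only translation that moves it to a predetermined depth."""
    x, y, z = cornea
    # Rotate about the x-axis to zero the y-component.
    r = np.hypot(y, z)
    ca, sa = z / r, y / r
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    # Rotate about the y-axis to zero the x-component of the rotated point.
    z1 = Rx @ np.asarray(cornea, float)
    s = np.hypot(z1[0], z1[2])
    cb, sb = z1[2] / s, -z1[0] / s
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    R = Ry @ Rx
    # Because rotation precedes translation, the translation has only a z value.
    t = np.array([0.0, 0.0, z_target - np.linalg.norm(cornea)])
    return R, t
```

Applying `R @ cornea + t` places the cornea at (0, 0, z_target) in every frame, which is exactly the "same position in every frame" property the paragraph describes.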
FIG. 5 shows how the pupil position may be found on the cornea. - To determine the pupil position on the cornea, the pupil position on the image plane in 3D is retrieved, and then the intersection of the line formed by the image pupil point and the origin with the non-normalized cornea is found. That point on the cornea is then normalized along with the cornea.
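Finding where the line through the origin (camera center) and the image pupil point meets the corneal sphere is a ray-sphere intersection. A minimal sketch under the document's spherical-cornea assumption; the camera-at-origin convention is assumed for illustration:

```python
import numpy as np

def ray_sphere_intersection(direction, center, radius):
    """First intersection of the ray t*direction (t > 0, origin at the camera)
    with a sphere; returns the nearer point, or None if the ray misses."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    c = np.asarray(center, float)
    # Solve |t*d - c|^2 = r^2  ->  t^2 - 2 t (d.c) + |c|^2 - r^2 = 0
    b = np.dot(d, c)
    disc = b * b - (np.dot(c, c) - radius ** 2)
    if disc < 0:
        return None
    t = b - np.sqrt(disc)   # nearer of the two roots: the front corneal surface
    return t * d
```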
- Once the cornea and pupil are normalized on the cornea, the next step is to determine the normalized pupil on the image plane, e.g., at the
normalization step 118 shown in FIG. 2. This is the intersection of the line formed by the normalized pupil on the cornea and the origin with the image plane. FIG. 5 demonstrates this. - Normalization puts the cornea in a specific position in the endo-camera coordinate system. Since the cornea does not move relative to the screen, the screen moves as well; both are fixed in space for the instance of this frame. The cameras and virtual plane are all fixed together, as are the frames. So when normalization moves the cornea into the specific position in the endo-camera coordinate system, it is functionally the same as the cornea remaining still and the coordinate system moving. The new normalized pupil center is projected onto the virtual plane, but because the virtual plane moved with the endo coordinate system, the gaze point at this stage would be wrong. The virtual plane must now be denormalized to return it to the proper position for the gaze point, e.g., as shown in
FIG. 6. - Normalization Step:
FIG. 3 shows an exemplary method for performing the normalization step 118 shown in FIG. 2. The cornea center is rotated about the origin to lie on the z-axis in the endo-camera coordinate system (eye camera coordinate system). Rotation is performed about the x-axis first, then the y-axis. The rotated cornea position is translated to a constant, predefined position along the z-axis. The next step is to transform the pupil center data from image pixels to the image plane in units of millimeters. The point where the line intersecting the endo-camera center and the pupil center on the image plane intersects the cornea may then be determined. The cornea is assumed to be a sphere of known radius centered at the normalized cornea center position. The intersection point is endo-normalized and scaled such that it lies on the image plane, and transformed back into pixels. The normalized pupil is then projected onto a virtual plane, where the polynomial projection function is user-dependent and generated during user calibration. The display origin and normal vector are transformed to the exo-camera coordinate system (scene camera coordinate system). The next step is to transform the cornea center to exo-camera coordinates, followed by transforming the endo-normalization into the exo-camera coordinate system to obtain the exo-normalization transformation. The inverse of the exo-normalization transformation is applied to the projected normalized pupil point in the exo-camera coordinate system, e.g., as shown in FIG. 6. The intersection of the line (through the exo-cornea and the de-normalized projected normalized pupil) with the exo-screen plane is determined. The final step is to transform the result of that intersection to the screen coordinate system of the monitor, and then to pixels to obtain the gaze point on the monitor.
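The pixel-to-millimeter conversion used in the steps above (and its inverse, "transformed back into pixels") can be sketched as follows, assuming a simple pinhole model; the principal point, pixel pitch, and focal length arguments are illustrative parameters, not values from the patent:

```python
import numpy as np

def pixels_to_image_plane(px, principal_point, pixel_pitch_mm, focal_mm):
    """Pixel coordinates -> 3D point on the image plane (z = focal length, in mm)."""
    u, v = px
    cx, cy = principal_point
    return np.array([(u - cx) * pixel_pitch_mm, (v - cy) * pixel_pitch_mm, focal_mm])

def image_plane_to_pixels(p, principal_point, pixel_pitch_mm):
    """Inverse conversion: 3D image-plane point (mm) -> pixel coordinates."""
    cx, cy = principal_point
    return np.array([p[0] / pixel_pitch_mm + cx, p[1] / pixel_pitch_mm + cy])
```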
- For practical implementation, a mobile gaze-determination system must be robust to small shifts in frame position relative to the face for a given user, in addition to accommodating unrestricted head movement. Both conditions may be accomplished through proper normalization of the endo- (toward the eye) and exo- (outward-looking) spaces relative to the viewing plane.
- For 3D POR, the gaze point is determined by the convergence of the left- and right-eye gaze vectors. The information may then be relayed to the user through the mobile device as an overlay on the exo-camera (scene) video images.
- Point-of-Regard Step: Next, at
step 124 of FIG. 2, a 3D POR overlay may be performed. The left gaze line is defined by the de-normalized projected normalized pupil and the cornea in the exo-camera coordinate system for the left eye; the same procedure is applied to the right eye. The intersection (or closest point of approach) between the two lines is determined and then projected onto the exo-camera images. - When point-of-gaze data is integrated into a more elaborate user interface with cursor control, eye movements may be used interchangeably with other input devices, e.g., those that utilize hands, feet, and/or other body movements to direct computer and other control applications.
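The convergence computation above, the intersection or closest point of approach between the left and right gaze lines, can be sketched as follows (the function name and conventions are illustrative):

```python
import numpy as np

def gaze_convergence(p1, d1, p2, d2):
    """Midpoint of the closest approach between two 3D gaze lines p + t*d."""
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("gaze lines are parallel; no unique convergence point")
    # Parameters of the closest points on each line (least-squares solution).
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))
```

When the two gaze lines actually intersect, the midpoint coincides with the intersection; when they are skew, it is the 3D POR estimate between them.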
- It will be appreciated that elements or components shown with any embodiment herein are exemplary for the specific embodiment and may be used on or in combination with other embodiments disclosed herein.
- While the invention is susceptible to various modifications, and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the invention is not to be limited to the particular forms or methods disclosed, but to the contrary, the invention is to cover all modifications, equivalents and alternatives falling within the scope of the appended claims.
Claims (9)
1. A method for eye tracking, comprising:
a) calibrating a wearable device before the wearable device is worn by a user;
b) placing the wearable device on a user's head adjacent one or both of the user's eyes;
c) calibrating the wearable device after placing the wearable device on the user's head;
d) detecting at least one eye feature of a first eye of the user's eyes;
e) performing a compensation algorithm; and
f) calculating a gaze direction of the user.
2. The method of claim 1, wherein step c) includes at least one of:
i) identifying one or more glints reflected off one or both eyes of the user; and
ii) calibrating between an endo-camera configured to acquire images of one eye of the user and an exo-camera configured to acquire images of the user's surroundings.
3. The method of claim 1, wherein step a) comprises using computer vision methods.
4. The method of claim 2, wherein step a) is completed after manufacturing the wearable device and before first use of the wearable device.
5. The method of claim 1, wherein step c) comprises estimating a head pose of the user wearing the wearable device.
6. The method of claim 1, wherein step e) comprises at least one of normalization, denormalization, and a spatial transform to correct for movement between the eye and the eye tracking camera.
7. The method of claim 1, wherein step f) comprises calculating a target region within a real or virtual surface or volume, which includes at least one of construction of a vector in space, mapping, and interpolation.
8. A system for eye tracking, comprising:
a wearable device configured to be worn on a user's head;
an exo-camera on the wearable device configured to provide images of a user's surroundings when the wearable device is worn by the user;
an endo-camera on the wearable device configured to provide images of a first eye of the user when the wearable device is worn by the user; and
one or more processors configured for:
a) calibrating a wearable device before the wearable device is worn by a user;
b) calibrating the wearable device after placing the wearable device on the user's head;
c) detecting at least one eye feature of a first eye of the user's eyes;
d) performing a compensation algorithm; and
e) calculating a gaze direction of the user.
9. A method for compensating for movement of a wearable eye tracking device relative to a user's eye, comprising:
wearing a wearable device on a user's head such that one or more endo-cameras are positioned to acquire images of one or both of the user's eyes, and an exo-camera is positioned to acquire images of the user's surroundings;
calculating the location of features in a user's eye that cannot be directly observed from images of the eye acquired by an endo-camera; and
spatially transforming camera coordinate systems of the exo- and endo-cameras to place calculated eye features in a known location and alignment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/099,900 US20140218281A1 (en) | 2012-12-06 | 2013-12-06 | Systems and methods for eye gaze determination |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261734342P | 2012-12-06 | 2012-12-06 | |
US201261734294P | 2012-12-06 | 2012-12-06 | |
US201261734354P | 2012-12-06 | 2012-12-06 | |
US14/099,900 US20140218281A1 (en) | 2012-12-06 | 2013-12-06 | Systems and methods for eye gaze determination |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140218281A1 true US20140218281A1 (en) | 2014-08-07 |
Family
ID=50884065
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/099,908 Active 2035-11-02 US10025379B2 (en) | 2012-12-06 | 2013-12-06 | Eye tracking wearable devices and methods for use |
US14/099,900 Abandoned US20140218281A1 (en) | 2012-12-06 | 2013-12-06 | Systems and methods for eye gaze determination |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/099,908 Active 2035-11-02 US10025379B2 (en) | 2012-12-06 | 2013-12-06 | Eye tracking wearable devices and methods for use |
Country Status (6)
Country | Link |
---|---|
US (2) | US10025379B2 (en) |
EP (1) | EP2929413B1 (en) |
JP (1) | JP6498606B2 (en) |
KR (1) | KR102205374B1 (en) |
CN (1) | CN104903818B (en) |
WO (1) | WO2014089542A1 (en) |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140232638A1 (en) * | 2013-02-21 | 2014-08-21 | Samsung Electronics Co., Ltd. | Method and apparatus for user interface using gaze interaction |
WO2016037120A1 (en) * | 2014-09-05 | 2016-03-10 | Vision Service Plan | Computerized replacement temple for standard eyewear |
WO2016187457A3 (en) * | 2015-05-20 | 2017-03-23 | Magic Leap, Inc. | Tilt shift iris imaging |
US20170172408A1 (en) * | 2015-11-13 | 2017-06-22 | Hennepin Healthcare System, Inc. | Method for predicting convergence disorders caused by concussion or other neuropathology |
US9704038B2 (en) | 2015-01-07 | 2017-07-11 | Microsoft Technology Licensing, Llc | Eye tracking |
US9898865B2 (en) | 2015-06-22 | 2018-02-20 | Microsoft Technology Licensing, Llc | System and method for spawning drawing surfaces |
US9910298B1 (en) | 2017-04-17 | 2018-03-06 | Vision Service Plan | Systems and methods for a computerized temple for use with eyewear |
US10082866B2 (en) | 2016-04-12 | 2018-09-25 | International Business Machines Corporation | Gaze point detection using dynamic facial reference points under varying lighting conditions |
US20180348861A1 (en) * | 2017-05-31 | 2018-12-06 | Magic Leap, Inc. | Eye tracking calibration techniques |
US10215568B2 (en) | 2015-01-30 | 2019-02-26 | Vision Service Plan | Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete |
US20190155380A1 (en) * | 2017-11-17 | 2019-05-23 | Dolby Laboratories Licensing Corporation | Slippage Compensation in Eye Tracking |
US10310269B2 (en) | 2016-07-29 | 2019-06-04 | Essilor International | Method for virtual testing of at least one lens having a predetermined optical feature and associated device |
US10327673B2 (en) * | 2015-12-21 | 2019-06-25 | Amer Sports Digital Services Oy | Activity intensity level determination |
EP3547216A1 (en) * | 2018-03-30 | 2019-10-02 | Tobii AB | Deep learning for three dimensional (3d) gaze prediction |
WO2019185150A1 (en) * | 2018-03-29 | 2019-10-03 | Tobii Ab | Determining a gaze direction using depth information |
WO2019190561A1 (en) * | 2018-03-30 | 2019-10-03 | Tobii Ab | Deep learning for three dimensional (3d) gaze prediction |
US10433768B2 (en) | 2015-12-21 | 2019-10-08 | Amer Sports Digital Services Oy | Activity intensity level determination |
US20200076998A1 (en) * | 2013-09-03 | 2020-03-05 | Tobii Ab | Portable eye tracking device |
US10617342B2 (en) | 2014-09-05 | 2020-04-14 | Vision Service Plan | Systems, apparatus, and methods for using a wearable device to monitor operator alertness |
US20200128902A1 (en) * | 2018-10-29 | 2020-04-30 | Holosports Corporation | Racing helmet with visual and audible information exchange |
US10722128B2 (en) | 2018-08-01 | 2020-07-28 | Vision Service Plan | Heart rate detection system and method |
US10856776B2 (en) | 2015-12-21 | 2020-12-08 | Amer Sports Digital Services Oy | Activity intensity level determination |
WO2021164867A1 (en) * | 2020-02-19 | 2021-08-26 | Pupil Labs Gmbh | Eye tracking module and head-wearable device |
US11137820B2 (en) | 2015-12-01 | 2021-10-05 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11144107B2 (en) | 2015-12-01 | 2021-10-12 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11145272B2 (en) | 2016-10-17 | 2021-10-12 | Amer Sports Digital Services Oy | Embedded computing device |
US11159782B2 (en) * | 2016-08-03 | 2021-10-26 | Samsung Electronics Co., Ltd. | Electronic device and gaze tracking method of electronic device |
US11194161B2 (en) | 2018-02-09 | 2021-12-07 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11210299B2 (en) | 2015-12-01 | 2021-12-28 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11215457B2 (en) | 2015-12-01 | 2022-01-04 | Amer Sports Digital Services Oy | Thematic map based route optimization |
US11284807B2 (en) | 2015-12-21 | 2022-03-29 | Amer Sports Digital Services Oy | Engaging exercising devices with a mobile device |
US11301677B2 (en) * | 2019-06-14 | 2022-04-12 | Tobii AB | Deep learning for three dimensional (3D) gaze prediction |
US11393251B2 (en) | 2018-02-09 | 2022-07-19 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US20220253135A1 (en) * | 2019-07-16 | 2022-08-11 | Magic Leap, Inc. | Eye center of rotation determination with one or more eye tracking cameras |
US20220269341A1 (en) * | 2021-02-19 | 2022-08-25 | Beijing Boe Optoelectronics Technology Co., Ltd. | Sight positioning method, head-mounted display device, computer device and computer-readable storage medium |
US11537202B2 (en) | 2019-01-16 | 2022-12-27 | Pupil Labs Gmbh | Methods for generating calibration data for head-wearable devices and eye tracking system |
US11541280B2 (en) | 2015-12-21 | 2023-01-03 | Suunto Oy | Apparatus and exercising device |
US11556741B2 (en) | 2018-02-09 | 2023-01-17 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters using a neural network |
US11587484B2 (en) | 2015-12-21 | 2023-02-21 | Suunto Oy | Method for controlling a display |
US11607144B2 (en) | 2015-12-21 | 2023-03-21 | Suunto Oy | Sensor based context management |
US11676422B2 (en) | 2019-06-05 | 2023-06-13 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11703938B2 (en) | 2016-10-17 | 2023-07-18 | Suunto Oy | Embedded computing device |
US11838990B2 (en) | 2015-12-21 | 2023-12-05 | Suunto Oy | Communicating sensor data in wireless communication systems |
US11918375B2 (en) | 2014-09-05 | 2024-03-05 | Beijing Zitiao Network Technology Co., Ltd. | Wearable environmental pollution monitor computer apparatus, systems, and related methods |
Families Citing this family (136)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9158116B1 (en) | 2014-04-25 | 2015-10-13 | Osterhout Group, Inc. | Temple and ear horn assembly for headworn computer |
US9366867B2 (en) | 2014-07-08 | 2016-06-14 | Osterhout Group, Inc. | Optical systems for see-through displays |
US9400390B2 (en) | 2014-01-24 | 2016-07-26 | Osterhout Group, Inc. | Peripheral lighting for head worn computing |
US9715112B2 (en) | 2014-01-21 | 2017-07-25 | Osterhout Group, Inc. | Suppression of stray light in head worn computing |
WO2014089542A1 (en) | 2012-12-06 | 2014-06-12 | Eyefluence, Inc. | Eye tracking wearable devices and methods for use |
CN105164576B (en) * | 2013-04-25 | 2019-07-05 | 依视路国际公司 | The method that the wear-type electro-optic device for adapting to wearer is controlled |
KR102094965B1 (en) * | 2013-12-30 | 2020-03-31 | 삼성디스플레이 주식회사 | Awareness glasses, car mirror unit and display apparatus |
US10024679B2 (en) | 2014-01-14 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9578307B2 (en) * | 2014-01-14 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10248856B2 (en) | 2014-01-14 | 2019-04-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US9915545B2 (en) * | 2014-01-14 | 2018-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10360907B2 (en) | 2014-01-14 | 2019-07-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Smart necklace with stereo vision and onboard processing |
US10254856B2 (en) | 2014-01-17 | 2019-04-09 | Osterhout Group, Inc. | External user interface for head worn computing |
US9594246B2 (en) | 2014-01-21 | 2017-03-14 | Osterhout Group, Inc. | See-through computer display systems |
US10649220B2 (en) | 2014-06-09 | 2020-05-12 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US20150228119A1 (en) | 2014-02-11 | 2015-08-13 | Osterhout Group, Inc. | Spatial location presentation in head worn computing |
US9841599B2 (en) | 2014-06-05 | 2017-12-12 | Osterhout Group, Inc. | Optical configurations for head-worn see-through displays |
US9829707B2 (en) | 2014-08-12 | 2017-11-28 | Osterhout Group, Inc. | Measuring content brightness in head worn computing |
US10684687B2 (en) | 2014-12-03 | 2020-06-16 | Mentor Acquisition One, Llc | See-through computer display systems |
US9753288B2 (en) | 2014-01-21 | 2017-09-05 | Osterhout Group, Inc. | See-through computer display systems |
US11892644B2 (en) | 2014-01-21 | 2024-02-06 | Mentor Acquisition One, Llc | See-through computer display systems |
US11669163B2 (en) | 2014-01-21 | 2023-06-06 | Mentor Acquisition One, Llc | Eye glint imaging in see-through computer display systems |
US11737666B2 (en) | 2014-01-21 | 2023-08-29 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US11487110B2 (en) | 2014-01-21 | 2022-11-01 | Mentor Acquisition One, Llc | Eye imaging in head worn computing |
US20160018653A1 (en) | 2014-01-24 | 2016-01-21 | Osterhout Group, Inc. | See-through computer display systems |
US9846308B2 (en) | 2014-01-24 | 2017-12-19 | Osterhout Group, Inc. | Haptic systems for head-worn computers |
GB2526515A (en) * | 2014-03-25 | 2015-12-02 | Jaguar Land Rover Ltd | Image capture system |
US10853589B2 (en) | 2014-04-25 | 2020-12-01 | Mentor Acquisition One, Llc | Language translation with head-worn computing |
US9651787B2 (en) | 2014-04-25 | 2017-05-16 | Osterhout Group, Inc. | Speaker assembly for headworn computer |
US20160137312A1 (en) | 2014-05-06 | 2016-05-19 | Osterhout Group, Inc. | Unmanned aerial vehicle launch system |
KR102173699B1 (en) | 2014-05-09 | 2020-11-03 | 아이플루언스, 인크. | Systems and methods for discerning eye signals and continuous biometric identification |
US10564714B2 (en) | 2014-05-09 | 2020-02-18 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
US10663740B2 (en) | 2014-06-09 | 2020-05-26 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
US9818114B2 (en) | 2014-08-11 | 2017-11-14 | Mastercard International Incorporated | Systems and methods for performing payment card transactions using a wearable computing device |
US10024678B2 (en) | 2014-09-17 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable clip for providing social and environmental awareness |
US9922236B2 (en) | 2014-09-17 | 2018-03-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable eyeglasses for providing social and environmental awareness |
US9568603B2 (en) * | 2014-11-14 | 2017-02-14 | Microsoft Technology Licensing, Llc | Eyewear-mountable eye tracking device |
US10490102B2 (en) | 2015-02-10 | 2019-11-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for braille assistance |
US10878775B2 (en) | 2015-02-17 | 2020-12-29 | Mentor Acquisition One, Llc | See-through computer display systems |
US9972216B2 (en) | 2015-03-20 | 2018-05-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for storing and playback of information for blind users |
CN108139234B (en) | 2015-05-19 | 2021-02-05 | 奇跃公司 | Double composite light field device |
US9939644B2 (en) | 2015-06-25 | 2018-04-10 | Intel Corporation | Technologies for controlling vision correction of a wearable computing device |
US10139966B2 (en) | 2015-07-22 | 2018-11-27 | Osterhout Group, Inc. | External user interface for head worn computing |
US11003246B2 (en) | 2015-07-22 | 2021-05-11 | Mentor Acquisition One, Llc | External user interface for head worn computing |
JP2017060078A (en) * | 2015-09-18 | 2017-03-23 | カシオ計算機株式会社 | Image recording system, user attachment device, imaging apparatus, image processing system, image recording method, and program |
US10618521B2 (en) * | 2015-09-21 | 2020-04-14 | Ford Global Technologies, Llc | Wearable in-vehicle eye gaze detection |
WO2017116662A1 (en) * | 2015-12-28 | 2017-07-06 | Artilux Corporation | Eye gesture tracking |
WO2017127571A1 (en) * | 2016-01-19 | 2017-07-27 | Magic Leap, Inc. | Augmented reality systems and methods utilizing reflections |
US10850116B2 (en) | 2016-12-30 | 2020-12-01 | Mentor Acquisition One, Llc | Head-worn therapy device |
US10591728B2 (en) | 2016-03-02 | 2020-03-17 | Mentor Acquisition One, Llc | Optical systems for head-worn computers |
US10667981B2 (en) | 2016-02-29 | 2020-06-02 | Mentor Acquisition One, Llc | Reading assistance system for visually impaired |
US9826299B1 (en) | 2016-08-22 | 2017-11-21 | Osterhout Group, Inc. | Speaker systems for head-worn computer systems |
US9880441B1 (en) | 2016-09-08 | 2018-01-30 | Osterhout Group, Inc. | Electrochromic systems for head-worn computer systems |
CN108701227B (en) | 2016-03-07 | 2022-01-14 | 奇跃公司 | Blue light modulation for biosafety |
US10024680B2 (en) | 2016-03-11 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Step based guidance system |
US10466491B2 (en) | 2016-06-01 | 2019-11-05 | Mentor Acquisition One, Llc | Modular systems for head-worn computers |
US9958275B2 (en) | 2016-05-31 | 2018-05-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for wearable smart device communications |
US10561519B2 (en) | 2016-07-20 | 2020-02-18 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device having a curved back to reduce pressure on vertebrae |
WO2018023245A1 (en) * | 2016-07-31 | 2018-02-08 | 杨洁 | Method for automatic photographing and transmission and eyeglasses |
WO2018023246A1 (en) * | 2016-07-31 | 2018-02-08 | 杨洁 | Method for pushing information while photographing and eyeglasses |
WO2018023242A1 (en) * | 2016-07-31 | 2018-02-08 | 杨洁 | Automatic photographing method and glasses |
WO2018023247A1 (en) * | 2016-07-31 | 2018-02-08 | 杨洁 | Data acquisition method for automatic photographing and technical transmission and glasses |
WO2018023243A1 (en) * | 2016-07-31 | 2018-02-08 | 杨洁 | Automatic-photographing patent-information push method and eyeglasses |
US10268268B1 (en) | 2016-09-02 | 2019-04-23 | Facebook Technologies, Llc | Waveguide integrated eye tracking |
JP2018061622A (en) * | 2016-10-11 | 2018-04-19 | オプトス ピーエルシー | Fundus observation apparatus |
US10432851B2 (en) | 2016-10-28 | 2019-10-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable computing device for detecting photography |
USD827143S1 (en) | 2016-11-07 | 2018-08-28 | Toyota Motor Engineering & Manufacturing North America, Inc. | Blind aid device |
US10032053B2 (en) * | 2016-11-07 | 2018-07-24 | Rockwell Automation Technologies, Inc. | Tag based location |
US10012505B2 (en) | 2016-11-11 | 2018-07-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable system for providing walking directions |
US10521669B2 (en) | 2016-11-14 | 2019-12-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for providing guidance or feedback to a user |
US10168531B1 (en) | 2017-01-04 | 2019-01-01 | Facebook Technologies, Llc | Lightfield waveguide integrated eye tracking |
US10485420B2 (en) * | 2017-02-17 | 2019-11-26 | Analog Devices Global Unlimited Company | Eye gaze tracking |
CN106842625B (en) * | 2017-03-03 | 2020-03-17 | 西南交通大学 | Target tracking method based on feature consensus |
US20180255250A1 (en) * | 2017-03-03 | 2018-09-06 | Microsoft Technology Licensing, Llc | Pulsed, gated infrared illuminated camera systems and processes for eye tracking in high ambient light environments |
US10977858B2 (en) | 2017-03-30 | 2021-04-13 | Magic Leap, Inc. | Centralized rendering |
CN114125661A (en) | 2017-03-30 | 2022-03-01 | 奇跃公司 | Sound reproduction system and head-mounted device |
CN117389420A (en) | 2017-04-14 | 2024-01-12 | 奇跃公司 | Multimode eye tracking |
US20180336772A1 (en) * | 2017-05-19 | 2018-11-22 | Hcl Technologies Limited | System and method for alerting a user within a warehouse |
US11079522B1 (en) | 2017-05-31 | 2021-08-03 | Magic Leap, Inc. | Fiducial design |
US10810773B2 (en) * | 2017-06-14 | 2020-10-20 | Dell Products, L.P. | Headset display control based upon a user's pupil state |
WO2019011436A1 (en) * | 2017-07-13 | 2019-01-17 | Huawei Technologies Co., Ltd. | Dual mode headset |
US10422995B2 (en) | 2017-07-24 | 2019-09-24 | Mentor Acquisition One, Llc | See-through computer display systems with stray light management |
US10578869B2 (en) | 2017-07-24 | 2020-03-03 | Mentor Acquisition One, Llc | See-through computer display systems with adjustable zoom cameras |
US11409105B2 (en) | 2017-07-24 | 2022-08-09 | Mentor Acquisition One, Llc | See-through computer display systems |
US10969584B2 (en) | 2017-08-04 | 2021-04-06 | Mentor Acquisition One, Llc | Image expansion optic for head-worn computer |
IL307592A (en) | 2017-10-17 | 2023-12-01 | Magic Leap Inc | Mixed reality spatial audio |
FI20175960A1 (en) * | 2017-10-30 | 2019-05-01 | Univ Of Eastern Finland | Method and apparatus for gaze detection |
USD849822S1 (en) * | 2017-12-29 | 2019-05-28 | Aira Tech Corp. | Smart glasses for interactive use cases |
US11212636B2 (en) | 2018-02-15 | 2021-12-28 | Magic Leap, Inc. | Dual listener positions for mixed reality |
JP7313361B2 (en) | 2018-02-15 | 2023-07-24 | マジック リープ, インコーポレイテッド | mixed reality instrument |
IL305799A (en) | 2018-02-15 | 2023-11-01 | Magic Leap Inc | Mixed reality virtual reverberation |
US10281085B1 (en) * | 2018-03-30 | 2019-05-07 | Faspro Systems Co., Ltd. | Head-mounted wireless photographic apparatus |
CN110557552A (en) * | 2018-05-31 | 2019-12-10 | 联想企业解决方案(新加坡)有限公司 | Portable image acquisition equipment |
US10667072B2 (en) | 2018-06-12 | 2020-05-26 | Magic Leap, Inc. | Efficient rendering of virtual soundfields |
CN110596889A (en) * | 2018-06-13 | 2019-12-20 | 托比股份公司 | Eye tracking device and method of manufacturing an eye tracking device |
EP3807872B1 (en) | 2018-06-14 | 2024-04-10 | Magic Leap, Inc. | Reverberation gain normalization |
US10602292B2 (en) | 2018-06-14 | 2020-03-24 | Magic Leap, Inc. | Methods and systems for audio signal filtering |
EP3808108A4 (en) | 2018-06-18 | 2022-04-13 | Magic Leap, Inc. | Spatial audio for interactive audio environments |
JP7419270B2 (en) | 2018-06-21 | 2024-01-22 | マジック リープ, インコーポレイテッド | Wearable system speech processing |
CN112639731A (en) | 2018-07-24 | 2021-04-09 | 奇跃公司 | Application sharing |
WO2020023721A1 (en) * | 2018-07-25 | 2020-01-30 | Natus Medical Incorporated | Real-time removal of ir led reflections from an image |
KR102574995B1 (en) * | 2018-09-21 | 2023-09-06 | 돌비 레버러토리즈 라이쎈싱 코오포레이션 | Integrating components into the optical stack of head-mounted devices |
EP3857291A4 (en) | 2018-09-25 | 2021-11-24 | Magic Leap, Inc. | Systems and methods for augmented reality |
CN116320907A (en) | 2018-10-05 | 2023-06-23 | 奇跃公司 | Near field audio rendering |
US10887720B2 (en) | 2018-10-05 | 2021-01-05 | Magic Leap, Inc. | Emphasis for audio spatialization |
JP7448530B2 (en) | 2018-10-09 | 2024-03-12 | マジック リープ, インコーポレイテッド | Systems and methods for virtual and augmented reality |
JP2022505662A (en) | 2018-10-24 | 2022-01-14 | マジック リープ, インコーポレイテッド | Asynchronous ASIC |
TWI699671B (en) * | 2018-12-12 | 2020-07-21 | 國立臺灣大學 | Method for reducing operation on eye-tracking and eye-tracking device thereof |
US11221814B2 (en) | 2018-12-27 | 2022-01-11 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
CN113748462A (en) | 2019-03-01 | 2021-12-03 | 奇跃公司 | Determining input for a speech processing engine |
US10877268B2 (en) * | 2019-04-16 | 2020-12-29 | Facebook Technologies, Llc | Active control of in-field light sources of a head mounted display |
EP3966624A4 (en) | 2019-05-10 | 2023-01-11 | Twenty Twenty Therapeutics LLC | Natural physio-optical user interface for intraocular microdisplay |
EP3980880A4 (en) | 2019-06-06 | 2022-11-23 | Magic Leap, Inc. | Photoreal character configurations for spatial computing |
US11704874B2 (en) | 2019-08-07 | 2023-07-18 | Magic Leap, Inc. | Spatial instructions and guides in mixed reality |
US11328740B2 (en) | 2019-08-07 | 2022-05-10 | Magic Leap, Inc. | Voice onset detection |
CN114830182A (en) | 2019-10-18 | 2022-07-29 | 奇跃公司 | Gravity estimation and bundle adjustment of visual inertial odometer |
EP4049117A4 (en) | 2019-10-25 | 2022-12-14 | Magic Leap, Inc. | Non-uniform stereo rendering |
EP4049466A4 (en) | 2019-10-25 | 2022-12-28 | Magic Leap, Inc. | Reverberation fingerprint estimation |
US11959997B2 (en) | 2019-11-22 | 2024-04-16 | Magic Leap, Inc. | System and method for tracking a wearable device |
WO2021113782A1 (en) | 2019-12-04 | 2021-06-10 | Magic Leap, Inc. | Variable-pitch color emitting display |
US11627430B2 (en) | 2019-12-06 | 2023-04-11 | Magic Leap, Inc. | Environment acoustics persistence |
EP4073689A4 (en) | 2019-12-09 | 2022-12-14 | Magic Leap, Inc. | Systems and methods for operating a head-mounted display system based on user identity |
US11337023B2 (en) | 2019-12-20 | 2022-05-17 | Magic Leap, Inc. | Physics-based audio and haptic synthesis |
JP2023514573A (en) | 2020-02-14 | 2023-04-06 | マジック リープ, インコーポレイテッド | tool bridge |
JP2023514572A (en) | 2020-02-14 | 2023-04-06 | マジック リープ, インコーポレイテッド | session manager |
US11778410B2 (en) | 2020-02-14 | 2023-10-03 | Magic Leap, Inc. | Delayed audio following |
EP4104002A4 (en) | 2020-02-14 | 2023-08-09 | Magic Leap, Inc. | 3d object annotation |
JP2023513746A (en) | 2020-02-14 | 2023-04-03 | マジック リープ, インコーポレイテッド | Multi-application audio rendering |
WO2021178454A1 (en) | 2020-03-02 | 2021-09-10 | Magic Leap, Inc. | Immersive audio platform |
US11917384B2 (en) | 2020-03-27 | 2024-02-27 | Magic Leap, Inc. | Method of waking a device using spoken voice commands |
US11561613B2 (en) | 2020-05-29 | 2023-01-24 | Magic Leap, Inc. | Determining angular acceleration |
EP4158908A4 (en) | 2020-05-29 | 2023-11-29 | Magic Leap, Inc. | Surface appropriate collisions |
US20230122300A1 (en) * | 2021-10-14 | 2023-04-20 | Microsoft Technology Licensing, Llc | Eye-tracking waveguides |
US11592899B1 (en) | 2021-10-28 | 2023-02-28 | Tectus Corporation | Button activation within an eye-controlled user interface |
US11619994B1 (en) | 2022-01-14 | 2023-04-04 | Tectus Corporation | Control of an electronic contact lens using pitch-based eye gestures |
US11874961B2 (en) | 2022-05-09 | 2024-01-16 | Tectus Corporation | Managing display of an icon in an eye tracking augmented reality device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050073136A1 (en) * | 2002-10-15 | 2005-04-07 | Volvo Technology Corporation | Method and arrangement for interpreting a subjects head and eye activity |
US20130201291A1 (en) * | 2012-02-08 | 2013-08-08 | Microsoft Corporation | Head pose tracking using a depth camera |
US20130242056A1 (en) * | 2012-03-14 | 2013-09-19 | Rod G. Fleck | Imaging structure emitter calibration |
US20130304479A1 (en) * | 2012-05-08 | 2013-11-14 | Google Inc. | Sustained Eye Gaze for Determining Intent to Interact |
US8942419B1 (en) * | 2012-01-06 | 2015-01-27 | Google Inc. | Position estimation using predetermined patterns of light sources |
Family Cites Families (110)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3863243A (en) | 1972-01-19 | 1975-01-28 | Max Skolnick | Sleep inhibiting alarm |
US3798599A (en) | 1972-02-24 | 1974-03-19 | H Kafafian | Single input controller for a communication system |
US4359724A (en) | 1980-04-28 | 1982-11-16 | Ronald R. Zimmerman | Eyelid movement detector |
DE3777461D1 (en) | 1986-06-20 | 1992-04-23 | Matsushita Electric Ind Co Ltd | OPTICAL RECORDING AND PLAYBACK DEVICE. |
US4953111A (en) | 1987-02-12 | 1990-08-28 | Omron Tateisi Electronics Co. | Doze detector |
US4850691A (en) | 1987-03-18 | 1989-07-25 | University Of Illinois | Method and apparatus for determining pupillary response parameters |
US4815839A (en) | 1987-08-03 | 1989-03-28 | Waldorf Ronald A | Infrared/video electronystagmographic apparatus |
US5214456A (en) | 1991-10-09 | 1993-05-25 | Computed Anatomy Incorporated | Mapping of corneal topography with display of pupil perimeter |
JPH05191683A (en) * | 1992-01-14 | 1993-07-30 | Canon Inc | Photographic recorder |
US5345281A (en) | 1992-12-17 | 1994-09-06 | John Taboada | Eye tracking system and method |
US5517021A (en) | 1993-01-19 | 1996-05-14 | The Research Foundation State University Of New York | Apparatus and method for eye tracking interface |
JPH07146431A (en) * | 1993-11-25 | 1995-06-06 | Canon Inc | Camera |
TW247985B (en) * | 1993-04-22 | 1995-05-21 | Canon Kk | Image-taking apparatus |
US5402109A (en) | 1993-04-29 | 1995-03-28 | Mannik; Kallis H. | Sleep prevention device for automobile drivers |
US5481622A (en) | 1994-03-01 | 1996-01-02 | Rensselaer Polytechnic Institute | Eye tracking apparatus and method employing grayscale threshold values |
JPH07283974A (en) * | 1994-04-12 | 1995-10-27 | Canon Inc | Video camera with line of sight detector |
JPH086708A (en) | 1994-04-22 | 1996-01-12 | Canon Inc | Display device |
CA2126142A1 (en) | 1994-06-17 | 1995-12-18 | David Alexander Kahn | Visual communications apparatus |
US5469143A (en) | 1995-01-10 | 1995-11-21 | Cooper; David E. | Sleep awakening device for drivers of motor vehicles |
US5566067A (en) | 1995-03-23 | 1996-10-15 | The President And Fellows Of Harvard College | Eyelid vigilance detector system |
US5689241A (en) | 1995-04-24 | 1997-11-18 | Clarke, Sr.; James Russell | Sleep detection and driver alert apparatus |
US5570698A (en) | 1995-06-02 | 1996-11-05 | Siemens Corporate Research, Inc. | System for monitoring eyes for detecting sleep behavior |
US5682144A (en) | 1995-11-20 | 1997-10-28 | Mannik; Kallis Hans | Eye actuated sleep prevention devices and other eye controlled devices |
US6003991A (en) | 1996-02-17 | 1999-12-21 | Erik Scott Viirre | Eye examination apparatus and method for remote examination of a patient by a health professional |
US5912721A (en) | 1996-03-13 | 1999-06-15 | Kabushiki Kaisha Toshiba | Gaze detection apparatus and its method as well as information display apparatus |
US5886683A (en) | 1996-06-25 | 1999-03-23 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven information retrieval |
US6163281A (en) | 1996-08-19 | 2000-12-19 | Torch; William C. | System and method for communication using eye movement |
US5748113A (en) | 1996-08-19 | 1998-05-05 | Torch; William C. | Method and apparatus for communication |
US6246344B1 (en) | 1996-08-19 | 2001-06-12 | William C. Torch | Method and apparatus for voluntary communication |
US6542081B2 (en) | 1996-08-19 | 2003-04-01 | William C. Torch | System and method for monitoring eye movement |
US5867587A (en) | 1997-05-19 | 1999-02-02 | Northrop Grumman Corporation | Impaired operator detection and warning system employing eyeblink analysis |
AU1091099A (en) | 1997-10-16 | 1999-05-03 | Board Of Trustees Of The Leland Stanford Junior University | Method for inferring mental states from eye movements |
US6007202A (en) | 1997-10-23 | 1999-12-28 | Lasersight Technologies, Inc. | Eye illumination system and method |
DE19803158C1 (en) | 1998-01-28 | 1999-05-06 | Daimler Chrysler Ag | Arrangement for determining the state of vigilance, esp. for machinery operator or vehicle driver |
US6204828B1 (en) | 1998-03-31 | 2001-03-20 | International Business Machines Corporation | Integrated gaze/manual cursor positioning system |
US6867752B1 (en) | 1998-08-31 | 2005-03-15 | Semiconductor Energy Laboratory Co., Ltd. | Portable information processing system |
US6087941A (en) | 1998-09-01 | 2000-07-11 | Ferraz; Mark | Warning device for alerting a person falling asleep |
US6243076B1 (en) | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data |
AUPP612998A0 (en) | 1998-09-23 | 1998-10-15 | Canon Kabushiki Kaisha | Multiview multimedia generation system |
US6526159B1 (en) | 1998-12-31 | 2003-02-25 | Intel Corporation | Eye tracking for resource and power management |
US6577329B1 (en) | 1999-02-25 | 2003-06-10 | International Business Machines Corporation | Method and system for relevance feedback through gaze tracking and ticker interfaces |
GB2348520B (en) | 1999-03-31 | 2003-11-12 | Ibm | Assisting user selection of graphical user interface elements |
US6116736A (en) | 1999-04-23 | 2000-09-12 | Neuroptics, Inc. | Pupilometer with pupil irregularity detection capability |
JP2001183735A (en) * | 1999-12-27 | 2001-07-06 | Fuji Photo Film Co Ltd | Method and device for image pickup |
JP2001281520A (en) * | 2000-03-30 | 2001-10-10 | Minolta Co Ltd | Optical device |
US6456262B1 (en) | 2000-05-09 | 2002-09-24 | Intel Corporation | Microdisplay with eye gaze detection |
US6608615B1 (en) | 2000-09-19 | 2003-08-19 | Intel Corporation | Passive gaze-driven browsing |
WO2002031581A1 (en) | 2000-10-07 | 2002-04-18 | Physoptics Opto-Electronic Gmbh | Device and method for determining the orientation of an eye |
DE10103922A1 (en) | 2001-01-30 | 2002-08-01 | Physoptics Opto Electronic Gmb | Interactive data viewing and operating system |
US20030038754A1 (en) | 2001-08-22 | 2003-02-27 | Mikael Goldstein | Method and apparatus for gaze responsive text presentation in RSVP display |
AUPR872301A0 (en) | 2001-11-08 | 2001-11-29 | Sleep Diagnostics Pty Ltd | Alertness monitor |
US6712468B1 (en) | 2001-12-12 | 2004-03-30 | Gregory T. Edwards | Techniques for facilitating use of eye tracking data |
US6919907B2 (en) | 2002-06-20 | 2005-07-19 | International Business Machines Corporation | Anticipatory image capture for stereoscopic remote viewing with foveal priority |
US20040061680A1 (en) | 2002-07-10 | 2004-04-01 | John Taboada | Method and apparatus for computer control |
US7347551B2 (en) | 2003-02-13 | 2008-03-25 | Fergason Patent Properties, Llc | Optical system for monitoring eye movement |
US7881493B1 (en) | 2003-04-11 | 2011-02-01 | Eyetools, Inc. | Methods and apparatuses for use of eye interpretation information |
US9274598B2 (en) | 2003-08-25 | 2016-03-01 | International Business Machines Corporation | System and method for selecting and activating a target object using a combination of eye gaze and key presses |
US20050047629A1 (en) | 2003-08-25 | 2005-03-03 | International Business Machines Corporation | System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking |
US7365738B2 (en) | 2003-12-02 | 2008-04-29 | International Business Machines Corporation | Guides and indicators for eye movement monitoring systems |
JP2005252732A (en) * | 2004-03-04 | 2005-09-15 | Olympus Corp | Imaging device |
US7561143B1 (en) | 2004-03-19 | 2009-07-14 | The University of the Arts | Using gaze actions to interact with a display |
GB2412431B (en) * | 2004-03-25 | 2007-11-07 | Hewlett Packard Development Co | Self-calibration for an eye tracker |
EP1755441B1 (en) | 2004-04-01 | 2015-11-04 | Eyefluence, Inc. | Biosensors, communicators, and controllers monitoring eye movement and methods for using them |
PT1607840E (en) | 2004-06-18 | 2015-05-20 | Tobii Ab | Eye control of computer apparatus |
BRPI0609394A2 (en) | 2005-03-04 | 2010-03-30 | Sleep Diagnostics Pty Ltd | readiness measurement (alert) |
JP2006345276A (en) * | 2005-06-09 | 2006-12-21 | Fujifilm Holdings Corp | Imaging apparatus |
EP1943583B1 (en) | 2005-10-28 | 2019-04-10 | Tobii AB | Eye tracker with visual feedback |
US7429108B2 (en) | 2005-11-05 | 2008-09-30 | Outland Research, Llc | Gaze-responsive interface to enhance on-screen user reading tasks |
US7760910B2 (en) | 2005-12-12 | 2010-07-20 | Eyetools, Inc. | Evaluation of visual stimuli using existing viewing data |
US8793620B2 (en) | 2011-04-21 | 2014-07-29 | Sony Computer Entertainment Inc. | Gaze-assisted computer interface |
JP2008288767A (en) | 2007-05-16 | 2008-11-27 | Sony Corp | Information processor, method, and program |
US8462949B2 (en) | 2007-11-29 | 2013-06-11 | Oculis Labs, Inc. | Method and apparatus for secure display of visual content |
US20100045596A1 (en) | 2008-08-21 | 2010-02-25 | Sony Ericsson Mobile Communications Ab | Discreet feature highlighting |
US7850306B2 (en) | 2008-08-28 | 2010-12-14 | Nokia Corporation | Visual cognition aware display and visual data transmission architecture |
US20100245765A1 (en) * | 2008-10-28 | 2010-09-30 | Dyer Holdings, Llc | Video infrared ophthalmoscope |
US8398239B2 (en) * | 2009-03-02 | 2013-03-19 | Honeywell International Inc. | Wearable eye tracking system |
JP2010213214A (en) * | 2009-03-12 | 2010-09-24 | Brother Ind Ltd | Head-mounted display |
EP2238889B1 (en) | 2009-04-01 | 2011-10-12 | Tobii Technology AB | Adaptive camera and illuminator eyetracker |
WO2010118292A1 (en) | 2009-04-09 | 2010-10-14 | Dynavox Systems, Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods |
CN101943982B (en) | 2009-07-10 | 2012-12-12 | 北京大学 | Method for manipulating image based on tracked eye movements |
ES2746378T3 (en) | 2009-07-16 | 2020-03-05 | Tobii Ab | Eye detection unit using parallel data stream |
JP5613025B2 (en) * | 2009-11-18 | 2014-10-22 | パナソニック株式会社 | Gaze detection apparatus, gaze detection method, electrooculogram measurement apparatus, wearable camera, head mounted display, electronic glasses, and ophthalmologic diagnosis apparatus |
JP5679655B2 (en) * | 2009-12-24 | 2015-03-04 | レノボ・イノベーションズ・リミテッド(香港) | Portable terminal device and display control method thereof |
US9507418B2 (en) | 2010-01-21 | 2016-11-29 | Tobii Ab | Eye tracker based contextual action |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US20130314303A1 (en) * | 2010-02-28 | 2013-11-28 | Osterhout Group, Inc. | Ar glasses with user action control of and between internal and external applications with feedback |
US8890946B2 (en) | 2010-03-01 | 2014-11-18 | Eyefluence, Inc. | Systems and methods for spatially controlled scene illumination |
US8593375B2 (en) | 2010-07-23 | 2013-11-26 | Gregory A Maltz | Eye gaze user interface and method |
US8531394B2 (en) * | 2010-07-23 | 2013-09-10 | Gregory A. Maltz | Unitized, vision-controlled, wireless eyeglasses transceiver |
US9213405B2 (en) | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
EP2499960B1 (en) * | 2011-03-18 | 2015-04-22 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method for determining at least one parameter of two eyes by setting data rates and optical measuring device |
US8643680B2 (en) | 2011-04-08 | 2014-02-04 | Amazon Technologies, Inc. | Gaze-based content display |
US8885877B2 (en) | 2011-05-20 | 2014-11-11 | Eyefluence, Inc. | Systems and methods for identifying gaze tracking scene reference locations |
US8911087B2 (en) | 2011-05-20 | 2014-12-16 | Eyefluence, Inc. | Systems and methods for measuring reactions of head, eyes, eyelids and pupils |
EP3200046A1 (en) | 2011-10-27 | 2017-08-02 | Tobii Technology AB | Power management in an eye-tracking system |
US8929589B2 (en) | 2011-11-07 | 2015-01-06 | Eyefluence, Inc. | Systems and methods for high-resolution gaze tracking |
KR101891786B1 (en) | 2011-11-29 | 2018-08-27 | 삼성전자주식회사 | Operation Method For User Function based on a Eye-Tracking and Portable Device supporting the same |
US8955973B2 (en) | 2012-01-06 | 2015-02-17 | Google Inc. | Method and system for input detection using structured light projection |
US9171198B1 (en) * | 2012-04-02 | 2015-10-27 | Google Inc. | Image capture technique |
WO2013169237A1 (en) | 2012-05-09 | 2013-11-14 | Intel Corporation | Eye tracking based selective accentuation of portions of a display |
DE102012105664A1 (en) | 2012-06-28 | 2014-04-10 | Oliver Hein | Method and device for coding eye and eye tracking data |
US9189064B2 (en) | 2012-09-05 | 2015-11-17 | Apple Inc. | Delay of display event based on user gaze |
US20140092006A1 (en) | 2012-09-28 | 2014-04-03 | Joshua Boelter | Device and method for modifying rendering based on viewer focus area from eye tracking |
WO2014089542A1 (en) | 2012-12-06 | 2014-06-12 | Eyefluence, Inc. | Eye tracking wearable devices and methods for use |
WO2014111924A1 (en) | 2013-01-15 | 2014-07-24 | Poow Innovation Ltd. | Dynamic icons |
US9829971B2 (en) | 2013-01-21 | 2017-11-28 | Facebook, Inc. | Systems and methods of eye tracking control |
KR102093198B1 (en) | 2013-02-21 | 2020-03-25 | 삼성전자주식회사 | Method and apparatus for user interface using gaze interaction |
KR102175853B1 (en) | 2013-02-22 | 2020-11-06 | 삼성전자주식회사 | Method for controlling operation and an electronic device thereof |
KR20160005013A (en) | 2013-03-01 | 2016-01-13 | 토비 에이비 | Delay warp gaze interaction |
WO2015103444A1 (en) | 2013-12-31 | 2015-07-09 | Eyefluence, Inc. | Systems and methods for gaze-based media selection and editing |
- 2013-12-06 WO PCT/US2013/073753 patent/WO2014089542A1/en active Application Filing
- 2013-12-06 US US14/099,908 patent/US10025379B2/en active Active
- 2013-12-06 EP EP13860881.5A patent/EP2929413B1/en active Active
- 2013-12-06 US US14/099,900 patent/US20140218281A1/en not_active Abandoned
- 2013-12-06 CN CN201380068249.XA patent/CN104903818B/en active Active
- 2013-12-06 KR KR1020157017231A patent/KR102205374B1/en active IP Right Grant
- 2013-12-06 JP JP2015545901A patent/JP6498606B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050073136A1 (en) * | 2002-10-15 | 2005-04-07 | Volvo Technology Corporation | Method and arrangement for interpreting a subject's head and eye activity |
US7460940B2 (en) * | 2002-10-15 | 2008-12-02 | Volvo Technology Corporation | Method and arrangement for interpreting a subject's head and eye activity |
US8942419B1 (en) * | 2012-01-06 | 2015-01-27 | Google Inc. | Position estimation using predetermined patterns of light sources |
US20150098620A1 (en) * | 2012-01-06 | 2015-04-09 | Google Inc. | Position Estimation |
US20130201291A1 (en) * | 2012-02-08 | 2013-08-08 | Microsoft Corporation | Head pose tracking using a depth camera |
US20130242056A1 (en) * | 2012-03-14 | 2013-09-19 | Rod G. Fleck | Imaging structure emitter calibration |
US20130304479A1 (en) * | 2012-05-08 | 2013-11-14 | Google Inc. | Sustained Eye Gaze for Determining Intent to Interact |
Cited By (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140232638A1 (en) * | 2013-02-21 | 2014-08-21 | Samsung Electronics Co., Ltd. | Method and apparatus for user interface using gaze interaction |
US10324524B2 (en) * | 2013-02-21 | 2019-06-18 | Samsung Electronics Co., Ltd. | Method and apparatus for user interface using gaze interaction |
US20200076998A1 (en) * | 2013-09-03 | 2020-03-05 | Tobii Ab | Portable eye tracking device |
US10188323B2 (en) | 2014-09-05 | 2019-01-29 | Vision Service Plan | Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual |
WO2016037120A1 (en) * | 2014-09-05 | 2016-03-10 | Vision Service Plan | Computerized replacement temple for standard eyewear |
US9795324B2 (en) | 2014-09-05 | 2017-10-24 | Vision Service Plan | System for monitoring individuals as they age in place |
US10617342B2 (en) | 2014-09-05 | 2020-04-14 | Vision Service Plan | Systems, apparatus, and methods for using a wearable device to monitor operator alertness |
US10694981B2 (en) | 2014-09-05 | 2020-06-30 | Vision Service Plan | Wearable physiology monitor computer apparatus, systems, and related methods |
US10448867B2 (en) | 2014-09-05 | 2019-10-22 | Vision Service Plan | Wearable gait monitoring apparatus, systems, and related methods |
US11918375B2 (en) | 2014-09-05 | 2024-03-05 | Beijing Zitiao Network Technology Co., Ltd. | Wearable environmental pollution monitor computer apparatus, systems, and related methods |
US9649052B2 (en) | 2014-09-05 | 2017-05-16 | Vision Service Plan | Systems, apparatus, and methods for using eyewear, or other wearable item, to confirm the identity of an individual |
US10542915B2 (en) | 2014-09-05 | 2020-01-28 | Vision Service Plan | Systems, apparatus, and methods for using a wearable device to confirm the identity of an individual |
US10307085B2 (en) | 2014-09-05 | 2019-06-04 | Vision Service Plan | Wearable physiology monitor computer apparatus, systems, and related methods |
US9704038B2 (en) | 2015-01-07 | 2017-07-11 | Microsoft Technology Licensing, Llc | Eye tracking |
US10215568B2 (en) | 2015-01-30 | 2019-02-26 | Vision Service Plan | Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete |
US10533855B2 (en) | 2015-01-30 | 2020-01-14 | Vision Service Plan | Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete |
IL255734B1 (en) * | 2015-05-20 | 2023-06-01 | Magic Leap Inc | Tilt shift iris imaging |
WO2016187457A3 (en) * | 2015-05-20 | 2017-03-23 | Magic Leap, Inc. | Tilt shift iris imaging |
US9898865B2 (en) | 2015-06-22 | 2018-02-20 | Microsoft Technology Licensing, Llc | System and method for spawning drawing surfaces |
US11064881B2 (en) * | 2015-11-13 | 2021-07-20 | Hennepin Healthcare System, Inc | Method for predicting convergence disorders caused by concussion or other neuropathology |
US20170172408A1 (en) * | 2015-11-13 | 2017-06-22 | Hennepin Healthcare System, Inc. | Method for predicting convergence disorders caused by concussion or other neuropathology |
US11215457B2 (en) | 2015-12-01 | 2022-01-04 | Amer Sports Digital Services Oy | Thematic map based route optimization |
US11144107B2 (en) | 2015-12-01 | 2021-10-12 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11210299B2 (en) | 2015-12-01 | 2021-12-28 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11137820B2 (en) | 2015-12-01 | 2021-10-05 | Amer Sports Digital Services Oy | Apparatus and method for presenting thematic maps |
US11541280B2 (en) | 2015-12-21 | 2023-01-03 | Suunto Oy | Apparatus and exercising device |
US11838990B2 (en) | 2015-12-21 | 2023-12-05 | Suunto Oy | Communicating sensor data in wireless communication systems |
US11587484B2 (en) | 2015-12-21 | 2023-02-21 | Suunto Oy | Method for controlling a display |
US11607144B2 (en) | 2015-12-21 | 2023-03-21 | Suunto Oy | Sensor based context management |
US11284807B2 (en) | 2015-12-21 | 2022-03-29 | Amer Sports Digital Services Oy | Engaging exercising devices with a mobile device |
US10327673B2 (en) * | 2015-12-21 | 2019-06-25 | Amer Sports Digital Services Oy | Activity intensity level determination |
US10433768B2 (en) | 2015-12-21 | 2019-10-08 | Amer Sports Digital Services Oy | Activity intensity level determination |
US10856776B2 (en) | 2015-12-21 | 2020-12-08 | Amer Sports Digital Services Oy | Activity intensity level determination |
US10082866B2 (en) | 2016-04-12 | 2018-09-25 | International Business Machines Corporation | Gaze point detection using dynamic facial reference points under varying lighting conditions |
US10310269B2 (en) | 2016-07-29 | 2019-06-04 | Essilor International | Method for virtual testing of at least one lens having a predetermined optical feature and associated device |
US11159782B2 (en) * | 2016-08-03 | 2021-10-26 | Samsung Electronics Co., Ltd. | Electronic device and gaze tracking method of electronic device |
US11703938B2 (en) | 2016-10-17 | 2023-07-18 | Suunto Oy | Embedded computing device |
US11145272B2 (en) | 2016-10-17 | 2021-10-12 | Amer Sports Digital Services Oy | Embedded computing device |
US9910298B1 (en) | 2017-04-17 | 2018-03-06 | Vision Service Plan | Systems and methods for a computerized temple for use with eyewear |
US11068055B2 (en) | 2017-05-31 | 2021-07-20 | Magic Leap, Inc. | Eye tracking calibration techniques |
US11379036B2 (en) | 2017-05-31 | 2022-07-05 | Magic Leap, Inc. | Eye tracking calibration techniques |
CN110945405A (en) * | 2017-05-31 | 2020-03-31 | 奇跃公司 | Eye tracking calibration techniques |
WO2018222753A1 (en) * | 2017-05-31 | 2018-12-06 | Magic Leap, Inc. | Eye tracking calibration techniques |
US20180348861A1 (en) * | 2017-05-31 | 2018-12-06 | Magic Leap, Inc. | Eye tracking calibration techniques |
US10671160B2 (en) | 2017-05-31 | 2020-06-02 | Magic Leap, Inc. | Eye tracking calibration techniques |
US11181977B2 (en) * | 2017-11-17 | 2021-11-23 | Dolby Laboratories Licensing Corporation | Slippage compensation in eye tracking |
US20190155380A1 (en) * | 2017-11-17 | 2019-05-23 | Dolby Laboratories Licensing Corporation | Slippage Compensation in Eye Tracking |
US11393251B2 (en) | 2018-02-09 | 2022-07-19 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11340461B2 (en) | 2018-02-09 | 2022-05-24 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11194161B2 (en) | 2018-02-09 | 2021-12-07 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11556741B2 (en) | 2018-02-09 | 2023-01-17 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters using a neural network |
US20220129067A1 (en) * | 2018-03-29 | 2022-04-28 | Tobii Ab | Determining a gaze direction using depth information |
WO2019185150A1 (en) * | 2018-03-29 | 2019-10-03 | Tobii Ab | Determining a gaze direction using depth information |
US11675428B2 (en) * | 2018-03-29 | 2023-06-13 | Tobii Ab | Determining a gaze direction using depth information |
EP3547216A1 (en) * | 2018-03-30 | 2019-10-02 | Tobii AB | Deep learning for three dimensional (3d) gaze prediction |
WO2019190561A1 (en) * | 2018-03-30 | 2019-10-03 | Tobii Ab | Deep learning for three dimensional (3d) gaze prediction |
US10722128B2 (en) | 2018-08-01 | 2020-07-28 | Vision Service Plan | Heart rate detection system and method |
US11730226B2 (en) | 2018-10-29 | 2023-08-22 | Robotarmy Corp. | Augmented reality assisted communication |
US20200128902A1 (en) * | 2018-10-29 | 2020-04-30 | Holosports Corporation | Racing helmet with visual and audible information exchange |
US10786033B2 (en) * | 2018-10-29 | 2020-09-29 | Robotarmy Corp. | Racing helmet with visual and audible information exchange |
US11537202B2 (en) | 2019-01-16 | 2022-12-27 | Pupil Labs Gmbh | Methods for generating calibration data for head-wearable devices and eye tracking system |
US11676422B2 (en) | 2019-06-05 | 2023-06-13 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11301677B2 (en) * | 2019-06-14 | 2022-04-12 | Tobii AB | Deep learning for three dimensional (3D) gaze prediction |
US20220253135A1 (en) * | 2019-07-16 | 2022-08-11 | Magic Leap, Inc. | Eye center of rotation determination with one or more eye tracking cameras |
US11868525B2 (en) * | 2019-07-16 | 2024-01-09 | Magic Leap, Inc. | Eye center of rotation determination with one or more eye tracking cameras |
WO2021164867A1 (en) * | 2020-02-19 | 2021-08-26 | Pupil Labs Gmbh | Eye tracking module and head-wearable device |
US11662814B2 (en) * | 2021-02-19 | 2023-05-30 | Beijing Boe Optoelectronics Technology Co., Ltd. | Sight positioning method, head-mounted display device, computer device and computer-readable storage medium |
US20220269341A1 (en) * | 2021-02-19 | 2022-08-25 | Beijing Boe Optoelectronics Technology Co., Ltd. | Sight positioning method, head-mounted display device, computer device and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
KR102205374B1 (en) | 2021-01-21 |
EP2929413A4 (en) | 2016-07-13 |
KR20150116814A (en) | 2015-10-16 |
CN104903818A (en) | 2015-09-09 |
JP6498606B2 (en) | 2019-04-10 |
CN104903818B (en) | 2018-12-14 |
JP2016510517A (en) | 2016-04-07 |
EP2929413B1 (en) | 2020-06-03 |
EP2929413A1 (en) | 2015-10-14 |
US20140184775A1 (en) | 2014-07-03 |
WO2014089542A1 (en) | 2014-06-12 |
US10025379B2 (en) | 2018-07-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140218281A1 (en) | Systems and methods for eye gaze determination | |
JP6902075B2 (en) | Line-of-sight tracking using structured light | |
US20210110560A1 (en) | Method and system for determining spatial coordinates of a 3D reconstruction of at least part of a real object at absolute spatial scale | |
US9779512B2 (en) | Automatic generation of virtual materials from real-world materials | |
US11861062B2 (en) | Blink-based calibration of an optical see-through head-mounted display | |
CN109801379B (en) | Universal augmented reality glasses and calibration method thereof | |
US6659611B2 (en) | System and method for eye gaze tracking using corneal image mapping | |
Lai et al. | Hybrid method for 3-D gaze tracking using glint and contour features | |
Hennessey et al. | Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions | |
JP7030317B2 (en) | Pupil detection device and pupil detection method | |
CN108369744B (en) | 3D gaze point detection through binocular homography mapping | |
KR101255219B1 (en) | Method of eye-gaze tracking and system adopting the method | |
US20220100268A1 (en) | Eye tracking device and a method thereof | |
JP2016173313A (en) | Visual line direction estimation system, visual line direction estimation method and visual line direction estimation program | |
US10620454B2 (en) | System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images | |
JP7168953B2 (en) | Gaze measurement device for automatic calibration, Gaze measurement method and Gaze measurement program | |
Lee et al. | A robust eye gaze tracking method based on a virtual eyeball model | |
CN108537103B (en) | Living body face detection method and device based on pupil axis measurement | |
Chi et al. | A novel multi-camera global calibration method for gaze tracking system | |
JPH0351407B2 (en) | ||
JP2007029126A (en) | Line-of-sight detecting device | |
CN110895433B (en) | Method and apparatus for user interaction in augmented reality | |
Kang et al. | A robust extrinsic calibration method for non-contact gaze tracking in the 3-D space | |
Nitschke et al. | I see what you see: point of gaze estimation from corneal images | |
Plopski et al. | Hybrid eye tracking: Combining iris contour and corneal imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |