US20120306850A1 - Distributed asynchronous localization and mapping for augmented reality - Google Patents

Distributed asynchronous localization and mapping for augmented reality

Info

Publication number
US20120306850A1
Authority
US
United States
Prior art keywords
map
mobile device
rendering
environment
localization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/152,220
Inventor
Alexandru Balan
Jason Flaks
Steve Hodges
Michael Isard
Oliver Williams
Paul Barham
Shahram Izadi
Otmar Hilliges
David Molyneaux
David Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US13/152,220
Assigned to MICROSOFT CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FLAKS, JASON; BARHAM, PAUL; HILLIGES, OTMAR; WILLIAMS, OLIVER; ISARD, MICHAEL; HODGES, STEVE; IZADI, SHAHRAM; KIM, DAVID; MOLYNEAUX, DAVID; BALAN, ALEXANDRU
Priority to US13/688,093 (US8933931B2)
Publication of US20120306850A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G2340/125 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/14 Solving problems related to the presentation of information to be displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user

Definitions

  • Augmented reality relates to providing an augmented real-world environment where the perception of a real-world environment (or data representing a real-world environment) is augmented or modified with computer-generated virtual data.
  • data representing a real-world environment may be captured in real-time using sensory input devices such as a camera or microphone and augmented with computer-generated virtual data including virtual images and virtual sounds.
  • the virtual data may also include information related to the real-world environment such as a text description associated with a real-world object in the real-world environment.
  • An AR environment may be used to enhance numerous applications including video game, mapping, navigation, and mobile device applications.
  • Some AR environments enable the perception of real-time interaction between real objects (i.e., objects existing in a particular real-world environment) and virtual objects (i.e., objects that do not exist in the particular real-world environment).
  • an AR system typically performs several steps including mapping and localization.
  • Mapping relates to the process of generating a map of the real-world environment.
  • Localization relates to the process of locating a particular point of view or pose relative to the map.
  • a fundamental requirement of many AR systems is the ability to localize the pose of a mobile device moving within a real-world environment in order to determine the particular view associated with the mobile device that needs to be augmented.
  • Two techniques commonly used to perform these steps are simultaneous localization and mapping (SLAM) and parallel tracking and mapping (PTAM).
  • Both SLAM and PTAM techniques produce sparse point clouds as maps. Sparse point clouds may be sufficient for enabling camera localization, but may not be sufficient for enabling complex augmented reality applications such as those that must handle collisions and occlusions due to the interaction of real objects and virtual objects.
  • Both SLAM and PTAM techniques utilize a common sensing source for both the mapping and localization steps.
  • an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices.
  • Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device.
  • One embodiment includes determining whether a first map is required, acquiring the first map, storing the first map on a mobile device, determining a first pose associated with the mobile device, determining whether a second map is required including determining whether a virtual object is located within a field of view associated with the first pose, acquiring the second map, storing the second map on the mobile device, registering a virtual object in relation to the second map, rendering the virtual object, and displaying on the mobile device a virtual image associated with the virtual object corresponding with a view of the virtual object such that the virtual object is perceived to exist within the field of view associated with the first pose.
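  • The sequence in this embodiment can be pictured as a small client-side loop. The Python sketch below is purely illustrative: the collaborator objects and method names (the mapping-client fetch calls, estimate_pose, frustum_contains, and so on) are assumptions made for exposition, not an API described in this application.

```python
# Illustrative per-frame loop for the embodiment above (all names are hypothetical).

def render_frame(device, mapping_client, virtual_objects):
    # Determine whether a first (localization) map is required; acquire and store it.
    if device.localization_map is None or device.localization_map.is_stale():
        device.localization_map = mapping_client.fetch_sparse_map(device.rough_location())

    # Determine a first pose of the mobile device relative to the stored map.
    pose = device.estimate_pose(device.localization_map)

    # A second (e.g., denser) map is only needed if a virtual object falls
    # within the field of view associated with that pose.
    visible = [obj for obj in virtual_objects if pose.frustum_contains(obj.position)]
    if visible and device.dense_map is None:
        device.dense_map = mapping_client.fetch_dense_map(pose)
        device.store_map(device.dense_map)

    # Register each visible virtual object against the second map, render it,
    # and display the resulting virtual image for the current pose.
    for obj in visible:
        obj.register(device.dense_map)
    device.display(device.render(visible, pose))
```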
  • FIG. 1 is a block diagram of one embodiment of a networked computing environment in which the disclosed technology may be practiced.
  • FIG. 2 depicts one embodiment of a portion of an HMD.
  • FIG. 3A depicts one embodiment of a field of view as seen by a user wearing an HMD.
  • FIG. 3B depicts one embodiment of an AR environment.
  • FIG. 3C depicts one embodiment of an AR environment.
  • FIG. 4 illustrates one embodiment of a mapping system.
  • FIG. 5 depicts one embodiment of an AR system.
  • FIG. 6A is a flowchart describing one embodiment of a process for generating a 3-D map of a real-world environment and locating virtual objects within the 3-D map.
  • FIG. 6B is a flowchart describing one embodiment of a process for updating a 3-D map of a real-world environment.
  • FIG. 7 is a flowchart describing one embodiment of a process for rendering and displaying virtual objects.
  • FIG. 8 is a block diagram of an embodiment of a gaming and media system.
  • FIG. 9 is a block diagram of one embodiment of a mobile device.
  • FIG. 10 is a block diagram of an embodiment of a computing system environment.
  • an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices.
  • Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device.
  • An AR system may allow a user of a mobile device to view augmented images in real-time as the user moves about within a particular real-world environment.
  • In order for the AR system to enable the perceived interaction between the particular real-world environment and virtual objects within the particular real-world environment, several problems must be solved.
  • the AR system must generate a map of the particular real-world environment in which the virtual objects are to appear and interact (i.e., perform a mapping step).
  • the AR system must determine a particular pose associated with the mobile device (i.e., perform a localization step).
  • the virtual objects that are to appear within the field of view of the particular pose must be registered or aligned with a coordinate system associated with the particular real-world environment. Solving these problems takes considerable processing power and providing the computer hardware necessary to accomplish real-time AR is especially difficult because of form factor and power limitations imposed on mobile devices.
  • In general, solving the mapping problem is computationally much more difficult than the localization problem and may require more computational power as well as more sophisticated sensors for increased robustness and accuracy.
  • the mapping problem need not be solved in real-time if landmarks or other real objects within the real-world environment are static (i.e., do not move) or semi-static (i.e., do not move often).
  • mobile device localization does need to be performed in real-time in order to enable the perception of real-time interaction between real objects and virtual objects.
  • FIG. 1 is a block diagram of one embodiment of a networked computing environment 100 in which the disclosed technology may be practiced.
  • Networked computing environment 100 includes a plurality of computing devices interconnected through one or more networks 180 .
  • the one or more networks 180 allow a particular computing device to connect to and communicate with another computing device.
  • the depicted computing devices include mobile device 140 , mobile devices 110 and 120 , desktop computer 130 , and mapping server 150 .
  • the plurality of computing devices may include other computing devices not shown.
  • the plurality of computing devices may include more than or less than the number of computing devices shown in FIG. 1 .
  • the one or more networks 180 may include a secure network such as an enterprise private network, an unsecure network such as a wireless open network, a local area network (LAN), a wide area network (WAN), and the Internet.
  • Each network of the one or more networks 180 may include hubs, bridges, routers, switches, and wired transmission media such as a wired network or direct-wired connection.
  • a server such as mapping server 150 may allow a client to download information (e.g., text, audio, image, and video files) from the server or to perform a search query related to particular information stored on the server.
  • a “server” may include a hardware device that acts as the host in a client-server relationship or a software process that shares a resource with or performs work for one or more clients. Communication between computing devices in a client-server relationship may be initiated by a client sending a request to the server asking for access to a particular resource or for particular work to be performed. The server may subsequently perform the actions requested and send a response back to the client.
  • mobile device 140 includes a camera 148 , microphone 149 , network interface 145 , processor 146 , and memory 147 , all in communication with each other.
  • Camera 148 may capture digital images and/or videos.
  • Microphone 149 may capture sounds.
  • Network interface 145 allows mobile device 140 to connect to one or more networks 180 .
  • Network interface 145 may include a wireless network interface, a modem, and/or a wired network interface.
  • Processor 146 allows mobile device 140 to execute computer readable instructions stored in memory 147 in order to perform processes discussed herein.
  • Networked computing environment 100 may provide a cloud computing environment for one or more computing devices.
  • Cloud computing refers to Internet-based computing, wherein shared resources, software, and/or information are provided to one or more computing devices on-demand via the Internet (or other global network).
  • the term “cloud” is used as a metaphor for the Internet, based on the cloud drawings used in computer network diagrams to depict the Internet as an abstraction of the underlying infrastructure it represents.
  • one or more users may move around a real-world environment (e.g., a living room) each wearing special glasses, such as mobile device 140 , that allow the one or more users to observe views of the real-world overlaid with virtual images of virtual objects that maintain coherent spatial relationship with the real-world environment (i.e., as a particular user turns their head or moves within the real-world environment, the virtual images displayed to the particular user will change such that the virtual objects appear to exist within the real-world environment as perceived by the particular user).
  • environmental mapping of the real-world environment is performed by mapping server 150 (i.e., on the server side) while camera localization is performed on mobile device 140 (i.e., on the client side).
  • the one or more users may also perceive 3-D localized virtual surround sounds that match the general acoustics of the real-world environment and that seem fixed in space with respect to the real-world environment.
  • live video images captured using a video camera on a mobile device may be augmented with computer-generated images of a virtual monster.
  • the resulting augmented video images may then be displayed on a display of the mobile device in real-time such that an end user of the mobile device sees the virtual monster interacting with the real-world environment captured by the mobile device.
  • the perception of a real-world environment may be augmented via a head-mounted display device (HMD), which may comprise a video see-through and/or an optical see-through system.
  • Mobile device 140 is one example of an optical see-through HMD.
  • An optical see-through HMD worn by an end user may allow actual direct viewing of a real-world environment (e.g., via transparent lenses) and may, at the same time, project images of a virtual object into the visual field of the end user thereby augmenting the real-world environment perceived by the end user with the virtual object.
  • FIG. 2 depicts one embodiment of a portion of an HMD, such as mobile device 140 in FIG. 1 . Only the right side of a head-mounted device is depicted.
  • HMD 200 includes right temple 202 , nose bridge 204 , eye glass 216 , and eye glass frame 214 .
  • Built into nose bridge 204 is a microphone 210 for recording sounds and transmitting the audio recording to processing unit 236 .
  • a front facing camera 213 is embedded inside right temple 202 for recording digital images and/or videos and transmitting the visual recordings to processing unit 236 .
  • Front facing camera 213 may capture color information, IR information, and/or depth information.
  • Microphone 210 and front facing camera 213 are in communication with processing unit 236 .
  • Motion and orientation sensor 238 may include a three axis magnetometer, a three axis gyro, and/or a three axis accelerometer. In one embodiment, the motion and orientation sensor 238 may comprise an inertial measurement unit (IMU).
  • a GPS receiver, such as GPS receiver 232 , may determine a GPS location associated with HMD 200.
  • Processing unit 236 may include one or more processors and a memory for storing computer readable instructions to be executed on the one or more processors. The memory may also store other types of data to be executed on the one or more processors.
  • eye glass 216 may comprise a see-through display, whereby virtual images generated by processing unit 236 may be projected and/or displayed on the see-through display.
  • the front facing camera 213 may be calibrated such that the field of view captured by the front facing camera 213 corresponds with the field of view as seen by a user of HMD 200 .
  • the ear phones 230 may be used to output virtual sounds associated with the virtual images.
  • HMD 200 may include two or more front facing cameras (e.g., one on each temple) in order to obtain depth from stereo information associated with the field of view captured by the front facing cameras.
  • the two or more front facing cameras may also comprise 3-D, IR, and/or RGB cameras. Depth information may also be acquired from a single camera utilizing depth from motion techniques. For example, two images may be acquired from the single camera associated with two different points in space at different points in time. Parallax calculations may then be performed given position information regarding the two different points in space.
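  • As a concrete illustration of the parallax computation mentioned above (the numbers and function name below are assumptions, not values from this application), depth for a calibrated pair of views follows from the disparity between matched pixels: Z = f · B / d, where f is the focal length in pixels, B the baseline between the two viewpoints, and d the disparity.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth in metres from stereo (or motion) parallax: Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    return np.where(d > 0, focal_px * baseline_m / d, np.inf)

# Two views separated by a 6 cm baseline with a 525-pixel focal length (made-up values).
print(depth_from_disparity([30.0, 15.0, 5.0], focal_px=525.0, baseline_m=0.06))
# -> [1.05 2.1  6.3]  (larger disparity means the point is closer)
```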
  • FIG. 3A depicts one embodiment of a field of view as seen by a user wearing an HMD such as mobile device 140 in FIG. 1 .
  • the user may see within the field of view both real objects and virtual objects.
  • the real objects may include a chair 16 and a mapping system 10 (e.g., comprising a portion of an entertainment system).
  • the virtual objects may include a virtual monster 17 . As the virtual monster 17 is displayed or overlaid over the real-world environment as perceived through the see-through lenses of the HMD, the user may perceive that the virtual monster 17 exists within the real-world environment.
  • FIG. 3B depicts one embodiment of an AR environment.
  • the AR environment includes a mapping system 10 and mobile devices 18 and 19 .
  • the mobile devices 18 and 19 , depicted as special glasses in FIG. 3B , may include an HMD such as HMD 200 in FIG. 2 .
  • the mapping system 10 may include a computing environment 12 , a capture device 20 , and a display 14 , all in communication with each other.
  • Computing environment 12 may include one or more processors.
  • Capture device 20 may include a color or depth sensing camera that may be used to visually monitor one or more targets including humans and one or more other objects within a particular environment.
  • capture device 20 may comprise an RGB or depth camera and computing environment 12 may comprise a set-top box or gaming console.
  • Mapping system 10 may support multiple mobile devices or clients.
  • the mobile devices 18 and 19 may include HMDs. As shown in FIG. 3B , user 28 wears mobile device 18 and user 29 wears mobile device 19 .
  • the mobile devices 18 and 19 may receive virtual data from mapping system 10 such that a virtual object is perceived to exist within a field of view as displayed through the respective mobile device. For example, as seen by user 28 through mobile device 18 , the virtual object is displayed as the back of virtual monster 17 . As seen by user 29 through mobile device 19 , the virtual object is displayed as the front of virtual monster 17 appearing above the back of chair 16 .
  • the rendering of virtual monster 17 may be performed by mapping system 10 or by mobile devices 18 and 19 .
  • mapping system 10 renders images of virtual monster 17 associated with a field of view of a particular mobile device and transmits the rendered images to the particular mobile device.
  • FIG. 3C depicts one embodiment of an AR environment utilizing the mapping system 10 and mobile devices 18 and 19 depicted in FIG. 3B .
  • the mapping system 10 may track and analyze virtual objects within a particular environment such as virtual ball 27 .
  • the mapping system 10 may also track and analyze real objects within the particular environment such as user 28 , user 29 , and chair 16 .
  • the rendering of images associated with virtual ball 27 may be performed by mapping system 10 or by mobile devices 18 and 19 .
  • mapping system 10 tracks the position of virtual objects by taking into consideration the interaction between real and virtual objects. For example, user 28 may move their arm such that user 28 perceives hitting virtual ball 27 . The mapping system 10 may subsequently apply a virtual force to virtual ball 27 such that both users 28 and 29 perceive that the virtual ball has been hit by user 28 . In one example, mapping system 10 may register the placement of virtual ball 27 within a 3-D map of the particular environment and provide virtual data information to mobile devices 18 and 19 such that users 28 and 29 perceive the virtual ball 27 as existing within the particular environment from their respective points of view. In another embodiment, a particular mobile device may render virtual objects that are specific to the particular mobile device. For example, if the virtual ball 27 is only rendered on mobile device 18 , then the virtual ball 27 would only be perceived as existing within the particular environment by user 28 . In some embodiments, the dynamics of virtual objects may be computed on the particular mobile device and not on the mapping system.
  • FIG. 4 illustrates one embodiment of a mapping system 50 including a capture device 58 and computing environment 54 .
  • Mapping system 50 is one example of an implementation for mapping system 10 in FIGS. 3B-3C .
  • computing environment 54 may correspond with computing environment 12 in FIGS. 3B-3C and capture device 58 may correspond with capture device 20 in FIGS. 3B-3C .
  • the capture device 58 may include one or more image sensors for capturing images and videos.
  • An image sensor may comprise a CCD image sensor or a CMOS sensor.
  • capture device 58 may include an IR CMOS image sensor.
  • the capture device 58 may also include a depth camera (or depth sensing camera) configured to capture video with depth information including a depth image that may include depth values via any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like.
  • the capture device 58 may include an image camera component 32 .
  • the image camera component 32 may include a depth camera that may capture a depth image of a scene.
  • the depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may represent a depth value such as a distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the camera.
  • the image camera component 32 may include an IR light component 34 , a three-dimensional (3-D) camera 36 , and an RGB camera 38 that may be used to capture the depth image of a capture area.
  • the IR light component 34 of the capture device 58 may emit an infrared light onto the capture area and may then use sensors to detect the backscattered light from the surface of one or more objects in the capture area using, for example, the 3-D camera 36 and/or the RGB camera 38 .
  • pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 58 to a particular location on the one or more objects in the capture area. Additionally, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location associated with the one or more objects.
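  • For reference, both time-of-flight variants described above reduce to short formulas: pulsed measurement gives distance d = c·Δt/2 from the round-trip time, and phase measurement gives d = c·Δφ/(4π·f_mod) for a modulation frequency f_mod. The sketch below is a minimal illustration with made-up input values.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_pulse(round_trip_s):
    """Pulsed time-of-flight: the light travels to the surface and back."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def distance_from_phase(phase_shift_rad, modulation_hz):
    """Continuous-wave time-of-flight: distance from the phase shift between
    outgoing and incoming waves, unambiguous only up to c / (2 * f_mod)."""
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_hz)

print(distance_from_pulse(20e-9))              # ~3.0 m for a 20 ns round trip
print(distance_from_phase(math.pi / 2, 20e6))  # ~1.87 m at 20 MHz modulation
```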
  • the capture device 58 may use structured light to capture depth information.
  • patterned light (i.e., light displayed as a known pattern such as a grid pattern or a stripe pattern) may be projected onto the capture area and, upon striking the surface of one or more objects in the capture area, may become deformed in response.
  • Such a deformation of the pattern may be captured by, for example, the 3-D camera 36 and/or the RGB camera 38 and analyzed to determine a physical distance from the capture device to a particular location on the one or more objects.
  • two or more different cameras may be incorporated into an integrated capture device.
  • for example, a depth camera and a video camera (e.g., an RGB video camera) may be incorporated into the integrated capture device.
  • two or more separate capture devices of the same or differing types may be cooperatively used.
  • a depth camera and a separate video camera may be used, two video cameras may be used, two depth cameras may be used, two RGB cameras may be used or any combination and number of cameras may be used.
  • the capture device 58 may include two or more physically separated cameras that may view a capture area from different angles to obtain visual stereo data that may be resolved to generate depth information.
  • Depth may also be determined by capturing images using a plurality of detectors that may be monochromatic, infrared, RGB, or any other type of detector and performing a parallax calculation. Other types of depth image sensors can also be used to create a depth image.
  • capture device 58 may include a microphone 40 .
  • the microphone 40 may include a transducer or sensor that may receive and convert sound into an electrical signal.
  • the capture device 58 may include a processor 42 that may be in operative communication with the image camera component 32 .
  • the processor may include a standardized processor, a specialized processor, a microprocessor, or the like.
  • the processor 42 may execute instructions that may include instructions for storing filters or profiles, receiving and analyzing images, determining whether a particular situation has occurred, or any other suitable instructions. It is to be understood that at least some image analysis and/or target analysis and tracking operations may be executed by processors contained within one or more capture devices such as capture device 58 .
  • the capture device 58 may include a memory 44 that may store the instructions that may be executed by the processor 42 , images or frames of images captured by the 3-D camera or RGB camera, filters or profiles, or any other suitable information, images, or the like.
  • the memory 44 may include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component.
  • the memory 44 may be a separate component in communication with the image capture component 32 and the processor 42 .
  • the memory 44 may be integrated into the processor 42 and/or the image capture component 32 .
  • some or all of the components 32 , 34 , 36 , 38 , 40 , 42 and 44 of the capture device 58 illustrated in FIG. 4 are housed in a single housing.
  • the capture device 58 may be in communication with the computing environment 54 via a communication link 46 .
  • the communication link 46 may be a wired connection including, for example, a USB connection, a FireWire connection, an Ethernet cable connection, or the like and/or a wireless connection such as a wireless 802.11b, g, a, or n connection.
  • the computing environment 54 may provide a clock to the capture device 58 that may be used to determine when to capture, for example, a scene via the communication link 46 .
  • the capture device 58 may provide the images captured by, for example, the 3D camera 36 and/or the RGB camera 38 to the computing environment 54 via the communication link 46 .
  • computing environment 54 includes image and audio processing engine 194 in communication with operating system 196 .
  • Image and audio processing engine 194 includes virtual data engine 197 , gesture recognizer engine 190 , structure data 198 , processing unit 191 , and memory unit 192 , all in communication with each other.
  • Image and audio processing engine 194 processes video, image, and audio data received from capture device 58 .
  • image and audio processing engine 194 may utilize structure data 198 and gesture recognition engine 190 .
  • Virtual data engine 197 processes virtual objects and registers the position and orientation of virtual objects in relation to various maps of a real-world environment stored in memory unit 192 .
  • Processing unit 191 may include one or more processors for executing object, facial, and voice recognition algorithms.
  • image and audio processing engine 194 may apply object recognition and facial recognition techniques to image or video data.
  • object recognition may be used to detect particular objects (e.g., soccer balls, cars, or landmarks) and facial recognition may be used to detect the face of a particular person.
  • Image and audio processing engine 194 may apply audio and voice recognition techniques to audio data.
  • audio recognition may be used to detect a particular sound. The particular faces, voices, sounds, and objects to be detected may be stored in one or more memories contained in memory unit 192 .
  • one or more objects being tracked may be augmented with one or more markers such as an IR retroreflective marker to improve object detection and/or tracking.
  • Planar reference images, coded AR markers, QR codes, and/or bar codes may also be used to improve object detection and/or tracking.
  • image and audio processing engine 194 may report to operating system 196 an identification of each object detected and a corresponding position and/or orientation.
  • the image and audio processing engine 194 may utilize structure data 198 while performing object recognition.
  • Structure data 198 may include structural information about targets and/or objects to be tracked. For example, a skeletal model of a human may be stored to help recognize body parts.
  • structure data 198 may include structural information regarding one or more inanimate objects in order to help recognize the one or more inanimate objects.
  • the image and audio processing engine 194 may also utilize gesture recognizer engine 190 while performing object recognition.
  • gesture recognizer engine 190 may include a collection of gesture filters, each comprising information concerning a gesture that may be performed by a skeletal model.
  • the gesture recognition engine 190 may compare the data captured by capture device 58 in the form of the skeletal model and movements associated with it to the gesture filters in a gesture library to identify when a user (as represented by the skeletal model) has performed one or more gestures.
  • image and audio processing engine 194 may use the gesture recognition engine 190 to help interpret movements of a skeletal model and to detect the performance of a particular gesture.
  • FIG. 5 depicts one embodiment of an AR system 600 .
  • AR system 600 includes mapping server 620 and mobile device 630 .
  • Mapping server 620 may comprise a mapping system such as mapping system 50 in FIG. 4 .
  • Mapping server 620 may work asynchronously to build one or more maps associated with one or more real-world environments.
  • the one or more maps may include sparse 3-D maps and/or dense 3-D maps based on images captured from a variety of sources including sensors dedicated to mapping server 620 .
  • Mobile device 630 may comprise an HMD such as mobile device 140 in FIG. 1 . Although a single mobile device 630 is depicted, mapping server 620 may support numerous mobile devices.
  • the process steps 621 - 624 associated with mapping server 620 are decoupled from and are performed independently of the process steps 631 - 635 associated with the mobile device 630 .
  • images are received from one or more server sensors.
  • One example of the one or more server sensors includes the 3-D and/or RGB cameras within the image camera component 32 in FIG. 4 .
  • the one or more server sensors may be associated with one or more stationary cameras and/or one or more mobile cameras which may move about a particular environment over time.
  • the one or more mobile cameras may move in a predetermined or predictable manner about the particular environment (e.g., a dedicated mapping server camera that runs on tracks above a room or hangs from the ceiling of a room and rotates in a deterministic manner).
  • the one or more server sensors providing images to the mapping server 620 may move about an environment attached to a mobile robot or autonomous vehicle.
  • images received from the one or more server sensors are registered.
  • the mapping server 620 may register different images taken within a particular environment (e.g., images from different points of view, taken at different points in time, and/or images associated with different types of information such as color or depth information) into a single real-world coordinate system associated with the particular environment.
  • image registration into a common coordinate system may be performed using an extrinsic calibration process.
  • the registration and/or alignment of images (or objects within the images) onto a common coordinate system allows the mapping server 620 to be able to compare and integrate real-world objects, landmarks, or other features extracted from the different images into one or more maps associated with the particular environment.
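  • A minimal sketch of what registration into a common coordinate system involves, assuming each sensor's extrinsic pose (a rotation R and translation t relative to the shared world frame) is known from calibration; the poses and points below are invented purely for illustration.

```python
import numpy as np

def to_world(points_local, R, t):
    """Map 3-D points from a sensor's local frame into the shared world frame
    using that sensor's extrinsic calibration (world_point = R @ local_point + t)."""
    return points_local @ R.T + t

# Sensor A sits at the world origin; sensor B is 2 m along +X and rotated 90 degrees about +Z.
R_a, t_a = np.eye(3), np.zeros(3)
R_b = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
t_b = np.array([2.0, 0.0, 0.0])

# The same physical corner of the room, as observed in each sensor's own frame.
corner_in_a = np.array([[1.0, 2.0, 0.0]])
corner_in_b = np.array([[2.0, 1.0, 0.0]])
print(to_world(corner_in_a, R_a, t_a))  # [[1. 2. 0.]]
print(to_world(corner_in_b, R_b, t_b))  # [[1. 2. 0.]]  -- both agree in world coordinates
```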
  • the one or more maps associated with a particular environment may comprise 3-D maps.
  • one or more additional images derived from sources decoupled from mapping server 620 may be used to extend or update the one or more maps associated with the particular environment.
  • client images 640 derived from sensors associated with mobile device 630 may be used by mapping server 620 to refine the one or more maps.
  • the registration of one or more images may require knowledge of one or more poses associated with the one or more images.
  • a six degree of freedom (6DOF) pose may be provided including information associated with the position and orientation of the particular sensor from which the image was captured.
  • Mapping and pose estimation (or localization) associated with a particular sensor may be determined using traditional SLAM or PTAM techniques.
  • a sparse map representing a portion of a particular environment is generated.
  • the sparse map may comprise a 3-D point cloud and may include one or more image descriptors.
  • the mapping server 620 may identify one or more image descriptors associated with a particular map of the one or more maps by applying various image processing methods such as object recognition, feature detection, corner detection, blob detection, and edge detection methods.
  • the one or more image descriptors may be used as landmarks in determining a particular pose, position, and/or orientation in relation to the particular map.
  • An image descriptor may include color and/or depth information associated with a particular object (e.g., a red apple) or a portion of a particular object within the particular environment (e.g., the top of a red apple).
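  • One common way to realize such descriptors in practice (not mandated by this application) is a keypoint detector/descriptor pair; the sketch below uses ORB from OpenCV on a synthetic frame, with the library's availability and the parameter values assumed for illustration.

```python
import numpy as np
import cv2  # assumes OpenCV is installed; ORB stands in for an unspecified descriptor

# Synthetic grayscale frame containing a bright rectangle so the detector has corners to find.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame, (80, 60), (200, 160), 255, -1)
frame = cv2.GaussianBlur(frame, (5, 5), 0)

orb = cv2.ORB_create(nfeatures=200)
keypoints, descriptors = orb.detectAndCompute(frame, None)

# Each descriptor is a compact signature of the patch around a detected corner;
# these play the role of the image descriptors used as landmarks above.
print(len(keypoints), None if descriptors is None else descriptors.shape)
```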
  • an initial sparse map is generated first and then subsequently refined and updated over time. The initial sparse map may be limited to only covering static objects within the particular environment.
  • a dense map representing a portion of the particular environment is generated.
  • the dense map may comprise a dense 3-D surface mesh, which may be created using the 3-D point cloud generated in step 623 .
  • the dense map may provide sufficient detail of the particular environment so as to enable complex augmented reality applications such as those that must handle collisions and occlusions due to the interaction of real objects and virtual objects within the particular environment.
  • a particular dense map is not generated by mapping server 620 until a request for the particular dense map is received from mobile device 630 .
  • mobile device 630 may request a particular dense map first without requesting a related sparse map.
  • the process steps 621 - 624 associated with mapping server 620 may be performed in the cloud.
  • mobile device 630 may request from mapping server 620 mapping data 641 and/or mapping data 642 associated with a particular environment.
  • the decision to request either mapping data 641 or mapping data 642 depends on the requirements of a particular AR application running on mobile device 630 .
  • one or more client images are received from one or more client sensors.
  • One example of the one or more client sensors includes the 3-D and/or RGB cameras associated with HMD 200 in FIG. 2 .
  • the one or more client sensors detect light from a patterned or structured light source associated with and projected by mapping server 620 . This allows several mobile devices to each independently utilize the same structured light source for localization purposes.
  • one or more client image descriptors are extracted from the one or more client images.
  • the mobile device 630 may identify one or more client image descriptors within the one or more client images by applying various image processing methods such as object recognition, feature detection, corner detection, blob detection, and edge detection methods.
  • a particular map is acquired from mapping server 620 .
  • the particular map may be included within mapping data 641 .
  • the map may be a 3-D map and may include one or more image descriptors associated with one or more objects located within the particular environment described by the 3-D map.
  • An updated map may be requested by mobile device 630 every time the mobile device enters a new portion of the particular environment or at a predefined frequency (e.g., every 10 minutes).
  • a pose (e.g., a 6 DOF pose) associated with a field of view of mobile device 630 is determined via localization and/or pose estimation.
  • Pose estimation may include the matching and aligning of detected image descriptors. For example, matching may be performed between the one or more client image descriptors and the image descriptors received from mapping server 620 included within mapping data 641 . In some embodiments, matching may take into consideration depth and color information associated with the one or more image descriptors.
  • the one or more client images comprise color images (e.g., RGB images) and matching is performed for only image descriptors including color information, thereby enabling image-based localization for mobile devices with only RGB sensor input.
  • Alignment of matched image descriptors may be performed using image or frame alignment techniques, which may include determining point to point correspondences between the field of view of mobile device 630 and views derived from the particular map.
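  • One standard way (among several; this application does not prescribe a specific solver) to recover a rigid 6DOF pose from matched 3-D point correspondences is the SVD-based least-squares alignment sketched below; the landmark coordinates and ground-truth pose are fabricated purely to exercise the function.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~= src @ R.T + t,
    computed from matched 3-D correspondences (Kabsch / Horn method)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Map landmarks (world frame) and the same landmarks as seen by the device,
# related by a made-up ground-truth pose that the solver should recover.
rng = np.random.default_rng(0)
landmarks_world = rng.uniform(-2.0, 2.0, size=(8, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
landmarks_device = landmarks_world @ R_true.T + t_true

R_est, t_est = rigid_align(landmarks_world, landmarks_device)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```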
  • mobile device 630 may include a GPS receiver, such as GPS receiver 232 in FIG. 2 , and a motion and orientation sensor, such as motion and orientation sensor 238 in FIG. 2 .
  • a first-pass estimate for the pose associated with mobile device 630 may be obtained by utilizing GPS location information and orientation information generated on mobile device 630 .
  • Mobile device 630 may also obtain location and orientation information by detecting, tracking, and triangulating the position of physical tags (e.g., reflective markers) or emitters (e.g., LEDs) attached to one or more other mobile devices using marker-based motion capture technologies.
  • the location and/or orientation information associated with mobile device 630 may be used to improve the accuracy of the localization step.
  • localization of mobile device 630 includes searching for image descriptors associated with landmarks or other objects within a field of view of mobile device 630 .
  • the extracted image descriptors may subsequently be matched with image descriptors associated with the most recent map of the particular environment acquired from mapping server 620 in step 623 .
  • Landmarks may be associated with image features that are easily observed and distinguished from other features within the field of view.
  • Stationary landmarks may act as anchor points or reference points in a 3-D map of a real-world environment. Other points of interest within a real-world environment (e.g., geographical elements such as land features) may also be used for both mapping and localization purposes.
  • Landmarks and other points of interest within the real-world environment may be detected using various image processing methods such as object recognition, frame alignment, feature detection, corner detection, blob detection, and edge detection methods.
  • mobile device 630 may continuously search for and extract landmarks and associate newly discovered landmarks to those found previously.
  • an image descriptor may comprise image information related to a portion of an object or to the entire object.
  • an image descriptor may describe characteristics of the object such as its location, color, texture, shape, and/or its relationship to other objects or landmarks within the environment.
  • Using image processing techniques such as object and pattern matching, the one or more image descriptors may be used to locate an object in an image containing other objects. It is desirable that the image processing techniques for detecting and matching the one or more image descriptors be robust to changes in image scale, noise, illumination, local geometric distortion, and image orientation.
  • localization of mobile device 630 may be performed in the cloud (e.g., by the mapping server 620 ).
  • the mobile device 630 may transmit client images 640 derived from sensors associated with mobile device 630 and/or image descriptors 644 extracted in step 632 to the mapping server 620 .
  • the mapping server 620 may perform the mobile device localization based on the received client images 640 and/or image descriptors 644 utilizing process steps similar to those performed by mobile device 630 in step 634 .
  • mobile device 630 may acquire a pose associated with a field of view of mobile device 630 from the mapping server 620 without needing to acquire the particular map in step 633 or perform localization step 634 (i.e., steps 633 and 634 may be omitted if localization of the mobile device is performed by the mapping server).
  • one or more virtual objects are displayed on mobile device 630 .
  • the one or more virtual objects may be rendered on mobile device 630 using a processing element, such as processing unit 236 in FIG. 2 .
  • Rendering may involve the use of ray tracing techniques (e.g., simulating the propagation of light and sound waves through a model of the AR environment) in order to generate virtual images and/or sounds from the perspective of a mobile device.
  • Virtual data associated with the one or more virtual objects may be generated by either the mapping server 620 and/or mobile device 630 .
  • the one or more virtual objects are rendered locally on mobile device 630 .
  • the one or more virtual objects are rendered on mapping server 620 and virtual images corresponding with the determined pose in step 634 may be transmitted to mobile device 630 .
  • Virtual images associated with the one or more virtual objects may be displayed on mobile device 630 such that the one or more virtual objects are perceived to exist within a field of view associated with the previously determined pose of step 634 .
  • the virtual data associated with the one or more virtual objects may also include virtual sounds.
  • mobile device 630 may request a dense 3-D map of a particular environment in order to realistically render virtual objects and to handle collisions and occlusions due to the interaction of real objects and virtual objects within the particular environment. Virtual sounds may also be registered in relation to the dense 3-D map.
  • the dense 3-D map may be acquired from mapping server 620 and included within mapping data 642 .
  • FIG. 6A is a flowchart describing one embodiment of a process for generating a 3-D map of a real-world environment and locating virtual objects within the 3-D map.
  • the process of FIG. 6A may be performed continuously and by one or more computing devices. Each step in the process of FIG. 6A may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device.
  • the process of FIG. 6A is performed by a mapping server such as mapping server 620 in FIG. 5 .
  • a 3-D map of a real-world environment is generated.
  • one or more virtual objects may be computer generated.
  • the 3-D map and the one or more virtual objects are generated using a mapping server.
  • the one or more virtual objects may be stored in a database on the mapping server.
  • the one or more virtual objects may be registered in relation to the 3-D map. If registration is performed by the mapping server, then the mapping server may control the location of the virtual objects being displayed by one or more mobile devices.
  • In step 708 , a determination is made as to whether or not the location of one or more virtual objects in relation to the 3-D map has changed.
  • the change in location may be due to a perceived interaction between one or more virtual objects and/or one or more real objects, or due to the natural movement of a virtual object through an environment (e.g., via physics simulation). If the location of one or more virtual objects has changed, then the one or more virtual objects are reregistered in relation to the 3-D map in step 706 . If the one or more virtual objects are all characterized as static then step 708 may be omitted.
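  • For concreteness, the "natural movement" mentioned above can be as simple as an integration step plus a collision test against a mapped surface. In the sketch below, the gravity vector, time step, floor height, and restitution factor are all invented values; the loop advances a virtual ball and clamps it to an assumed floor plane taken from the 3-D map, at which point the object would be reregistered.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])   # metres per second squared
FLOOR_Y = 0.0                           # assumed floor plane taken from the 3-D map
dt = 1.0 / 30.0                         # one physics step per 30 Hz frame

position = np.array([0.0, 1.5, 0.0])    # virtual ball starts 1.5 m above the floor
velocity = np.array([0.5, 0.0, 0.0])

for _ in range(90):                     # three seconds of simulated motion
    velocity = velocity + GRAVITY * dt
    position = position + velocity * dt
    if position[1] < FLOOR_Y:           # collision with a mapped real surface
        position[1] = FLOOR_Y
        velocity[1] = -0.6 * velocity[1]  # bounce with some energy loss
    # At this point the updated position would be reregistered in relation to the 3-D map.

print(np.round(position, 2))
```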
  • In step 710 , location information associated with one or more mobile devices is received.
  • the received location information may be used by a mapping server to customize a particular map associated with the location information.
  • In step 712 , at least a portion of the 3-D map is outputted to the one or more mobile devices.
  • the 3-D map may include a customized map of an environment associated with the location information received in step 710 .
  • the 3-D map may include one or more image descriptors.
  • the one or more image descriptors may include RGB image descriptors and/or 3-D image descriptors.
  • a mapping server outputs one or more image descriptors in response to a request from a mobile device for either RGB image descriptors or 3-D image descriptors depending on the requirements of a particular application running on the mobile device.
  • virtual data associated with the one or more virtual objects is outputted to the one or more mobile devices.
  • FIG. 6B is a flowchart describing one embodiment of a process for updating a 3-D map of a real-world environment.
  • the process of FIG. 6B may be performed continuously and by one or more computing devices.
  • Each step in the process of FIG. 6B may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device.
  • the process of FIG. 6B is performed by a mapping server such as mapping server 620 in FIG. 5 .
  • In step 802 , one or more images of a first environment are acquired from one or more cameras associated with a mapping server.
  • In step 804 , the one or more images are registered.
  • In step 806 , a first map of the first environment is generated.
  • In step 808 , one or more objects existing within the first environment are detected and characterized.
  • Each of the one or more objects may be characterized as either static, semi-static, or dynamic.
  • An object characterized as static is one which does not move.
  • An object characterized as semi-static is one which does not move often.
  • Semi-static objects may be useful for mapping and/or localization purposes because they may act as static landmarks over a period of time within an environment.
  • Object recognition techniques may be used to identify objects, such as furniture or appliances, that may typically be classified as semi-static objects.
  • maps of an environment are stored and time stamped (e.g., every 2 hours) in order to detect movement of landmarks and to possibly recharacterize one or more objects.
  • mapping and localization are performed using only objects classified as static objects, thereby discarding any objects characterized as either semi-static or dynamic (i.e., not taking into consideration any moving objects).
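  • A toy illustration of how the time-stamped maps mentioned above could support this static/semi-static/dynamic characterization; the movement threshold and the semi-static ratio below are arbitrary assumptions, not values from this application.

```python
import numpy as np

def characterize(positions, move_tol_m=0.05, semi_static_ratio=0.2):
    """Label an object from its position in successive time-stamped maps:
    'static' if it never moved, 'semi-static' if it moved in only a small
    fraction of snapshots, otherwise 'dynamic'."""
    p = np.asarray(positions, dtype=float)
    moved = np.linalg.norm(np.diff(p, axis=0), axis=1) > move_tol_m
    if not moved.any():
        return "static"
    return "semi-static" if moved.mean() <= semi_static_ratio else "dynamic"

wall   = [[0.0, 0.0, 0.0]] * 6                      # never moves
couch  = [[1.0, 0.0, 2.0]] * 5 + [[1.4, 0.0, 2.0]]  # nudged once between snapshots
person = [[0.0, 0.0, 0.5 * i] for i in range(6)]    # moves in every snapshot
print(characterize(wall), characterize(couch), characterize(person))
# -> static semi-static dynamic
```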
  • one or more additional images of the first environment are received from a first mobile device. Localization information associated with the one or more additional images may also be received from the first mobile device. Obtaining pose information along with one or more additional images allows a mapping server to generate one or more maps associated with an environment beyond that capable of being sensed by dedicated mapping server sensors. The one or more additional images may also include depth information.
  • the one or more additional images are registered. In some embodiments, the one or more additional images are registered by a mapping server. A mapping server may also accept information from one or more mobile devices in the form of registered RGB data and/or depth reconstructions.
  • the first map is updated based on the one or more additional images.
  • In step 816 , one or more new objects existing within the first environment are detected and characterized.
  • In step 818 , at least a subset of the updated first map is transmitted to a second mobile device.
  • a first set of mobile devices may be used to update the first map, while a second set of mobile devices perform localization in relation to the updated first map.
  • FIG. 7 is a flowchart describing one embodiment of a process for rendering and displaying virtual objects.
  • the process of FIG. 7 may be performed continuously and by one or more computing devices. Each step in the process of FIG. 7 may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device.
  • the process of FIG. 7 is performed by a mobile device such as mobile device 630 in FIG. 5 .
  • a mobile device may determine whether a first map is required for localization purposes by determining a location of the mobile device (e.g., by acquiring a GPS location) and comparing the location with the locations associated with one or more maps stored on the mobile device.
  • a first map may be required if none of the one or more maps stored on the mobile device provide adequate coverage of a particular environment or no map is deemed to be a valid map of the environment encompassing the location.
  • a particular map stored on a mobile device may be deemed invalid because the map has not been refreshed within a certain period of time or an expiration date associated with the particular map has passed.
  • a particular map stored on a mobile device may be invalidated by a mapping server upon an update to the particular map made by the mapping server.
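  • A minimal sketch of such a map-validity check on the client, assuming maps are cached with a coverage region, a fetch timestamp, a refresh interval, and a server-set invalidation flag; all of the field names and thresholds here are assumptions for illustration.

```python
from dataclasses import dataclass
import math
import time

def haversine_m(a, b):
    """Great-circle distance in metres between two (latitude, longitude) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(h))

@dataclass
class CachedMap:
    center: tuple              # (latitude, longitude) the map is anchored to
    radius_m: float            # how far from the center the map is considered valid
    fetched_at: float          # time.time() when the map was acquired
    ttl_s: float = 600.0       # e.g. refresh at least every 10 minutes
    invalidated: bool = False  # set when the mapping server updates the map

def needs_new_map(cache, location, now=None):
    """Return True if no cached map validly covers the device's current location."""
    now = time.time() if now is None else now
    for m in cache:
        covers = haversine_m(location, m.center) <= m.radius_m
        fresh = (now - m.fetched_at) <= m.ttl_s and not m.invalidated
        if covers and fresh:
            return False
    return True

cache = [CachedMap(center=(47.64, -122.13), radius_m=50.0, fetched_at=time.time())]
print(needs_new_map(cache, (47.64, -122.13)))   # False: covered by a fresh map
print(needs_new_map(cache, (47.70, -122.13)))   # True: outside the cached coverage
```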
  • the first map is acquired.
  • the first map may be acquired from a mapping server.
  • a mobile device may transmit a request for the first map to a mapping server.
  • a mobile device may transmit a request for the first map to a second mobile device located within a particular environment.
  • the second mobile device may act as a local mapping cache for the mobile device.
  • the first map may include one or more image descriptors associated with one or more real objects within a particular environment.
  • the first map is stored.
  • the first map may be stored in a non-volatile memory within a mobile device.
  • a first pose associated with the mobile device in relation to the first map is determined.
  • the pose estimation or determination may be performed on a mobile device or on a mapping server.
  • localization is performed on a mobile device using the process of step 634 in FIG. 5 .
  • image descriptors associated with non-static objects may be discarded.
  • a mobile device may transmit images of its environment to a mapping server and the mapping server may perform the localization process on behalf of the mobile device.
  • a mobile device may determine whether a second map is required for rendering and/or display purposes by determining whether a virtual object is located within (or near) a field of view associated with the first pose determined in step 908 .
  • the second map may comprise a higher resolution map of a portion of the first map.
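  • A rough sketch of the field-of-view test implied above, treating the first pose as a position plus a rotation with +Z as the viewing direction; the field-of-view angles and clip distances are illustrative assumptions, not values from this application.

```python
import numpy as np

def in_field_of_view(point_world, cam_pos, cam_R,
                     h_fov_deg=60.0, v_fov_deg=40.0, near=0.1, far=20.0):
    """True if a world-space point lies inside the viewing frustum of a pose,
    where cam_R maps camera-frame vectors into the world frame (+Z is forward)."""
    p = cam_R.T @ (np.asarray(point_world, dtype=float) - cam_pos)  # into camera frame
    if not (near <= p[2] <= far):
        return False
    h_half = np.radians(h_fov_deg) / 2.0
    v_half = np.radians(v_fov_deg) / 2.0
    return abs(np.arctan2(p[0], p[2])) <= h_half and abs(np.arctan2(p[1], p[2])) <= v_half

pose_position = np.zeros(3)
pose_rotation = np.eye(3)                                                # looking straight down +Z
print(in_field_of_view([0.5, 0.0, 3.0], pose_position, pose_rotation))   # True: ahead of the device
print(in_field_of_view([0.0, 0.0, -2.0], pose_position, pose_rotation))  # False: behind the device
```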
  • the second map is acquired.
  • the second map is stored.
  • a virtual object is registered in relation to the second map.
  • a mobile device registers the virtual object in relation to the second map prior to rendering the virtual object.
  • the virtual object is rendered.
  • the rendering step may include receiving virtual data associated with the virtual object from a mapping server and generating one or more virtual images of the virtual object based on the virtual data.
  • the virtual object is displayed.
  • the virtual object may be displayed by displaying on a mobile device one or more virtual images associated with the virtual object.
  • the one or more virtual images may correspond with a view of the virtual object such that the virtual object is perceived to exist within the field of view associated with the first pose determined in step 908 .
  • FIGS. 8-10 provide examples of various computing systems that can be used to implement embodiments of the disclosed technology.
  • FIG. 8 is a block diagram of an embodiment of a gaming and media system 7201 .
  • Console 7203 has a central processing unit (CPU) 7200 and a memory controller 7202 that facilitates processor access to various types of memory, including a flash Read Only Memory (ROM) 7204 , a Random Access Memory (RAM) 7206 , a hard disk drive 7208 , and a portable media drive 7107 .
  • CPU 7200 includes a level 1 cache 7210 and a level 2 cache 7212 , to temporarily store data and hence reduce the number of memory access cycles made to the hard drive 7208 , thereby improving processing speed and throughput.
  • the one or more buses might include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures.
  • bus architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus.
  • CPU 7200 , memory controller 7202 , ROM 7204 , and RAM 7206 are integrated onto a common module 7214 .
  • ROM 7204 is configured as a flash ROM that is connected to memory controller 7202 via a PCI bus and a ROM bus (neither of which is shown).
  • RAM 7206 is configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by memory controller 7202 via separate buses (not shown).
  • Hard disk drive 7208 and portable media drive 7107 are shown connected to the memory controller 7202 via the PCI bus and an AT Attachment (ATA) bus 7216 .
  • dedicated data bus structures of different types may also be applied in the alternative.
  • a three-dimensional graphics processing unit 7220 and a video encoder 7222 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing.
  • Data are carried from graphics processing unit 7220 to video encoder 7222 via a digital video bus (not shown).
  • An audio processing unit 7224 and an audio codec (coder/decoder) 7226 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between audio processing unit 7224 and audio codec 7226 via a communication link (not shown).
  • the video and audio processing pipelines output data to an A/V (audio/video) port 7228 for transmission to a television or other display.
  • video and audio processing components 7220 - 7228 are mounted on module 7214 .
  • FIG. 8 shows module 7214 including a USB host controller 7230 and a network interface 7232 .
  • USB host controller 7230 is in communication with CPU 7200 and memory controller 7202 via a bus (not shown) and serves as host for peripheral controllers 7205 ( 1 )- 7205 ( 4 ).
  • Network interface 7232 provides access to a network (e.g., Internet, home network, etc.) and may be any of a wide variety of wired or wireless interface components including an Ethernet card, a modem, a wireless access card, a Bluetooth® module, a cable modem, and the like.
  • console 7203 includes a controller support subassembly 7240 for supporting four controllers 7205 ( 1 )- 7205 ( 4 ).
  • the controller support subassembly 7240 includes any hardware and software components needed to support wired and wireless operation with an external control device, such as for example, a media and game controller.
  • a front panel I/O subassembly 7242 supports the multiple functionalities of power button 7213 , the eject button 7215 , as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of console 7203 .
  • Subassemblies 7240 and 7242 are in communication with module 7214 via one or more cable assemblies 7244 .
  • console 7203 can include additional controller subassemblies.
  • the illustrated implementation also shows an optical I/O interface 7235 that is configured to send and receive signals (e.g., from remote control 7290 ) that can be communicated to module 7214 .
  • MUs 7241 ( 1 ) and 7241 ( 2 ) are illustrated as being connectable to MU ports “A” 7231 ( 1 ) and “B” 7231 ( 2 ) respectively. Additional MUs (e.g., MUs 7241 ( 3 )- 7241 ( 6 )) are illustrated as being connectable to controllers 7205 ( 1 ) and 7205 ( 3 ), i.e., two MUs for each controller. Controllers 7205 ( 2 ) and 7205 ( 4 ) can also be configured to receive MUs (not shown). Each MU 7241 offers additional storage on which games, game parameters, and other data may be stored. Additional memory devices, such as portable USB devices, can be used in place of the MUs.
  • the other data can include any of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file.
  • When inserted into console 7203 or a controller, MU 7241 can be accessed by memory controller 7202 .
  • a system power supply module 7250 provides power to the components of gaming system 7201 .
  • a fan 7252 cools the circuitry within console 7203 .
  • An application 7260 comprising machine instructions is stored on hard disk drive 7208 .
  • various portions of application 7260 are loaded into RAM 7206 , and/or caches 7210 and 7212 , for execution on CPU 7200 .
  • Other applications may also be stored on hard disk drive 7208 for execution on CPU 7200 .
  • Gaming and media system 7201 may be operated as a standalone system by simply connecting the system to a monitor, a television, a video projector, or other display device. In this standalone mode, gaming and media system 7201 enables one or more players to play games or enjoy digital media (e.g., by watching movies or listening to music). However, with the integration of broadband connectivity made available through network interface 7232 , gaming and media system 7201 may further be operated as a participant in a larger network gaming community.
  • FIG. 9 is a block diagram of one embodiment of a mobile device 8300 .
  • Mobile devices may include laptop computers, pocket computers, mobile phones, personal digital assistants, and handheld media devices that have been integrated with wireless receiver/transmitter technology.
  • Mobile device 8300 includes one or more processors 8312 and memory 8310 .
  • Memory 8310 includes applications 8330 and non-volatile storage 8340 .
  • Memory 8310 can be any variety of memory storage media types, including non-volatile and volatile memory.
  • a mobile device operating system handles the different operations of the mobile device 8300 and may contain user interfaces for operations, such as placing and receiving phone calls, text messaging, checking voicemail, and the like.
  • the applications 8330 can be any assortment of programs, such as a camera application for photos and/or videos, an address book, a calendar application, a media player, an internet browser, games, an alarm application, and other applications.
  • the non-volatile storage component 8340 in memory 8310 may contain data such as music, photos, contact data, scheduling data, and other files.
  • the one or more processors 8312 also communicate with RF transmitter/receiver 8306 , which in turn is coupled to an antenna 8302 , with infrared transmitter/receiver 8308 , with global positioning service (GPS) receiver 8365 , and with movement/orientation sensor 8314 which may include an accelerometer and/or magnetometer.
  • RF transmitter/receiver 8306 may enable wireless communication via various wireless technology standards such as Bluetooth® or the IEEE 802.11 standards. Accelerometers have been incorporated into mobile devices to enable applications such as intelligent user interface applications that let users input commands through gestures, and orientation applications which can automatically change the display from portrait to landscape when the mobile device is rotated.
  • An accelerometer can be provided, e.g., by a micro-electromechanical system (MEMS) which is a tiny mechanical device (of micrometer dimensions) built onto a semiconductor chip. Acceleration direction, as well as orientation, vibration, and shock can be sensed.
  • the one or more processors 8312 further communicate with a ringer/vibrator 8316 , a user interface keypad/screen 8318 , a speaker 8320 , a microphone 8322 , a camera 8324 , a light sensor 8326 , and a temperature sensor 8328 .
  • the user interface keypad/screen may include a touch-sensitive screen display.
  • the one or more processors 8312 control transmission and reception of wireless signals. During a transmission mode, the one or more processors 8312 provide voice signals from microphone 8322 , or other data signals, to the RF transmitter/receiver 8306 . The transmitter/receiver 8306 transmits the signals through the antenna 8302 . The ringer/vibrator 8316 is used to signal an incoming call, text message, calendar reminder, alarm clock reminder, or other notification to the user. During a receiving mode, the RF transmitter/receiver 8306 receives a voice signal or data signal from a remote station through the antenna 8302 . A received voice signal is provided to the speaker 8320 while other received data signals are processed appropriately.
  • a physical connector 8388 may be used to connect the mobile device 8300 to an external power source, such as an AC adapter or powered docking station, in order to recharge battery 8304 .
  • the physical connector 8388 may also be used as a data connection to an external computing device. The data connection allows for operations such as synchronizing mobile device data with the computing data on another device.
  • FIG. 10 is a block diagram of an embodiment of a computing system environment 2200 .
  • Computing system environment 2200 includes a general purpose computing device in the form of a computer 2210 .
  • Components of computer 2210 may include, but are not limited to, a processing unit 2220 , a system memory 2230 , and a system bus 2221 that couples various system components including the system memory 2230 to the processing unit 2220 .
  • the system bus 2221 may be any of several types of bus structures including a memory bus, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • Computer 2210 typically includes a variety of computer readable media.
  • Computer readable media can be any available media that can be accessed by computer 2210 and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer readable media may comprise computer storage media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 2210 . Combinations of any of the above should also be included within the scope of computer readable media.
  • the system memory 2230 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 2231 and random access memory (RAM) 2232 .
  • a basic input/output system 2233 (BIOS) containing the basic routines that help to transfer information between elements within computer 2210 , such as during start-up, is typically stored in ROM 2231 .
  • RAM 2232 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 2220 .
  • FIG. 10 illustrates operating system 2234 , application programs 2235 , other program modules 2236 , and program data 2237 .
  • the computer 2210 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
  • FIG. 10 illustrates a hard disk drive 2241 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 2251 that reads from or writes to a removable, nonvolatile magnetic disk 2252 , and an optical disk drive 2255 that reads from or writes to a removable, nonvolatile optical disk 2256 such as a CD ROM or other optical media.
  • removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
  • the hard disk drive 2241 is typically connected to the system bus 2221 through a non-removable memory interface such as interface 2240 .
  • magnetic disk drive 2251 and optical disk drive 2255 are typically connected to the system bus 2221 by a removable memory interface, such as interface 2250 .
  • hard disk drive 2241 is illustrated as storing operating system 2244 , application programs 2245 , other program modules 2246 , and program data 2247 . Note that these components can either be the same as or different from operating system 2234 , application programs 2235 , other program modules 2236 , and program data 2237 . Operating system 2244 , application programs 2245 , other program modules 2246 , and program data 2247 are given different numbers here to illustrate that, at a minimum, they are different copies.
  • a user may enter commands and information into computer 2210 through input devices such as a keyboard 2262 and pointing device 2261 , commonly referred to as a mouse, trackball, or touch pad.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit 2220 through a user input interface 2260 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
  • a monitor 2291 or other type of display device is also connected to the system bus 2221 via an interface, such as a video interface 2290 .
  • computers may also include other peripheral output devices such as speakers 2297 and printer 2296 , which may be connected through an output peripheral interface 2295 .
  • the computer 2210 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 2280 .
  • the remote computer 2280 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 2210 , although only a memory storage device 2281 has been illustrated in FIG. 10 .
  • the logical connections depicted in FIG. 10 include a local area network (LAN) 2271 and a wide area network (WAN) 2273 , but may also include other networks.
  • When used in a LAN networking environment, the computer 2210 is connected to the LAN 2271 through a network interface or adapter 2270 .
  • When used in a WAN networking environment, the computer 2210 typically includes a modem 2272 or other means for establishing communications over the WAN 2273 , such as the Internet.
  • the modem 2272 , which may be internal or external, may be connected to the system bus 2221 via the user input interface 2260 , or other appropriate mechanism.
  • program modules depicted relative to the computer 2210 may be stored in the remote memory storage device.
  • FIG. 10 illustrates remote application programs 2285 as residing on memory device 2281 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • the disclosed technology is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the technology include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • the disclosed technology may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • software and program modules as described herein include routines, programs, objects, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types.
  • Hardware or combinations of hardware and software may be substituted for software modules as described herein.
  • the disclosed technology may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • a connection can be a direct connection or an indirect connection (e.g., via another part).
  • for purposes of this document, the term "set" of objects refers to a "set" of one or more of the objects.

Abstract

A system and method for providing an augmented reality environment in which the environmental mapping process is decoupled from the localization processes performed by one or more mobile devices is described. In some embodiments, an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices. Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device. This distributed approach provides an efficient way for supporting mapping and localization processes for a large number of mobile devices, which are typically constrained by form factor and battery life limitations.

Description

    BACKGROUND
  • Augmented reality (AR) relates to providing an augmented real-world environment where the perception of a real-world environment (or data representing a real-world environment) is augmented or modified with computer-generated virtual data. For example, data representing a real-world environment may be captured in real-time using sensory input devices such as a camera or microphone and augmented with computer-generated virtual data including virtual images and virtual sounds. The virtual data may also include information related to the real-world environment such as a text description associated with a real-world object in the real-world environment. An AR environment may be used to enhance numerous applications including video game, mapping, navigation, and mobile device applications.
  • Some AR environments enable the perception of real-time interaction between real objects (i.e., objects existing in a particular real-world environment) and virtual objects (i.e., objects that do not exist in the particular real-world environment). In order to realistically integrate the virtual objects into an AR environment, an AR system typically performs several steps including mapping and localization. Mapping relates to the process of generating a map of the real-world environment. Localization relates to the process of locating a particular point of view or pose relative to the map. A fundamental requirement of many AR systems is the ability to localize the pose of a mobile device moving within a real-world environment in order to determine the particular view associated with the mobile device that needs to be augmented.
  • In robotics, traditional methods employing simultaneous localization and mapping (SLAM) techniques have been used by robots and autonomous vehicles in order to build a map of an unknown environment (or to update a map within a known environment) while simultaneously tracking their current location for navigation purposes. Most SLAM approaches are incremental, meaning that they iteratively update the map and then update the estimated camera pose in the same process. An extension of SLAM is parallel tracking and mapping (PTAM), which separates the mapping and localization steps into parallel computation threads. Both SLAM and PTAM techniques produce sparse point clouds as maps. Sparse point clouds may be sufficient for enabling camera localization, but may not be sufficient for enabling complex augmented reality applications such as those that must handle collisions and occlusions due to the interaction of real objects and virtual objects. Both SLAM and PTAM techniques utilize a common sensing source for both the mapping and localization steps.
    SUMMARY
  • Technology is described for providing an augmented reality environment in which the environmental mapping process is decoupled from the localization processes performed by one or more mobile devices. In some embodiments, an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices. Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device. This distributed approach provides an efficient way for supporting mapping and localization processes for a large number of mobile devices, which are typically constrained by form factor and battery life limitations.
  • One embodiment includes determining whether a first map is required, acquiring the first map, storing the first map on a mobile device, determining a first pose associated with the mobile device, determining whether a second map is required including determining whether a virtual object is located within a field of view associated with the first pose, acquiring the second map, storing the second map on the mobile device, registering a virtual object in relation to the second map, rendering the virtual object, and displaying on the mobile device a virtual image associated with the virtual object corresponding with a view of the virtual object such that the virtual object is perceived to exist within the field of view associated with the first pose.
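  • Purely as an illustrative outline of the sequence enumerated above, the following sketch strings the steps together on the client side. Every object and method name (device, mapping_server, fetch_sparse_map, and so on) is a placeholder standing in for functionality described elsewhere in this document, not an implementation of it.
```python
def run_ar_client(device, mapping_server):
    """Sketch of the client-side sequence described above; every helper is a placeholder."""
    # Step 1: acquire and store a first (localization) map if none is valid.
    if device.needs_localization_map():
        first_map = mapping_server.fetch_sparse_map(device.gps_location())
        device.store_map(first_map)
    else:
        first_map = device.current_map()

    # Step 2: determine a first pose of the device relative to the first map.
    pose = device.localize(first_map)

    # Step 3: acquire a second (denser) map only if a virtual object is in view.
    if device.virtual_object_near_view(pose):
        second_map = mapping_server.fetch_dense_map(pose)
        device.store_map(second_map)
        device.register_virtual_objects(second_map)

    # Steps 4-5: render the virtual object from the first pose and display it.
    image = device.render_virtual_objects(pose)
    device.display(image)
```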
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
    BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of one embodiment of a networked computing environment in which the disclosed technology may be practiced.
  • FIG. 2 depicts one embodiment of a portion of an HMD.
  • FIG. 3A depicts one embodiment of a field of view as seen by a user wearing a HMD.
  • FIG. 3B depicts one embodiment of an AR environment.
  • FIG. 3C depicts one embodiment of an AR environment.
  • FIG. 4 illustrates one embodiment of a mapping system.
  • FIG. 5 depicts one embodiment of an AR system.
  • FIG. 6A is a flowchart describing one embodiment of a process for generating a 3-D map of a real-world environment and locating virtual objects within the 3-D map.
  • FIG. 6B is a flowchart describing one embodiment of a process for updating a 3-D map of a real-world environment.
  • FIG. 7 is a flowchart describing one embodiment of a process for rendering and displaying virtual objects.
  • FIG. 8 is a block diagram of an embodiment of a gaming and media system.
  • FIG. 9 is a block diagram of one embodiment of a mobile device.
  • FIG. 10 is a block diagram of an embodiment of a computing system environment.
    DETAILED DESCRIPTION
  • Technology is described for providing an augmented reality environment in which the environmental mapping process is decoupled from the localization processes performed by one or more mobile devices. In some embodiments, an augmented reality system includes a mapping system with independent sensing devices for mapping a particular real-world environment and one or more mobile devices. Each of the one or more mobile devices utilizes a separate asynchronous computing pipeline for localizing the mobile device and rendering virtual objects from a point of view of the mobile device. This distributed approach provides an efficient way for supporting mapping and localization processes for a large number of mobile devices, which are typically constrained by form factor and battery life limitations.
  • An AR system may allow a user of a mobile device to view augmented images in real-time as the user moves about within a particular real-world environment. In order for the AR system to enable the perceived interaction between the particular real-world environment and virtual objects within the particular real-world environment, several problems must be solved. First, the AR system must generate a map of the particular real-world environment in which the virtual objects are to appear and interact (i.e., perform a mapping step). Second, the AR system must determine a particular pose associated with the mobile device (i.e., perform a localization step). Finally, the virtual objects that are to appear within the field of view of the particular pose must be registered or aligned with a coordinate system associated with the particular real-world environment. Solving these problems takes considerable processing power and providing the computer hardware necessary to accomplish real-time AR is especially difficult because of form factor and power limitations imposed on mobile devices.
  • In general, solving the mapping problem is computationally much more difficult than the localization problem and may require more computational power as well as more sophisticated sensors for increased robustness and accuracy. However, the mapping problem need not be solved in real-time if landmarks or other real objects within the real-world environment are static (i.e., do not move) or semi-static (i.e., do not move often). On the other hand, mobile device localization does need to be performed in real-time in order to enable the perception of real-time interaction between real objects and virtual objects.
  • FIG. 1 is a block diagram of one embodiment of a networked computing environment 100 in which the disclosed technology may be practiced. Networked computing environment 100 includes a plurality of computing devices interconnected through one or more networks 180. The one or more networks 180 allow a particular computing device to connect to and communicate with another computing device. The depicted computing devices include mobile device 140, mobile devices 110 and 120, desktop computer 130, and mapping server 150. In some embodiments, the plurality of computing devices may include other computing devices not shown. In some embodiments, the plurality of computing devices may include more than or less than the number of computing devices shown in FIG. 1. The one or more networks 180 may include a secure network such as an enterprise private network, an unsecure network such as a wireless open network, a local area network (LAN), a wide area network (WAN), and the Internet. Each network of the one or more networks 180 may include hubs, bridges, routers, switches, and wired transmission media such as a wired network or direct-wired connection.
  • A server, such as mapping server 150, may allow a client to download information (e.g., text, audio, image, and video files) from the server or to perform a search query related to particular information stored on the server. In general, a “server” may include a hardware device that acts as the host in a client-server relationship or a software process that shares a resource with or performs work for one or more clients. Communication between computing devices in a client-server relationship may be initiated by a client sending a request to the server asking for access to a particular resource or for particular work to be performed. The server may subsequently perform the actions requested and send a response back to the client.
  • One embodiment of mobile device 140 includes a camera 148, microphone 149, network interface 145, processor 146, and memory 147, all in communication with each other. Camera 148 may capture digital images and/or videos. Microphone 149 may capture sounds. Network interface 145 allows mobile device 140 to connect to one or more networks 180. Network interface 145 may include a wireless network interface, a modem, and/or a wired network interface. Processor 146 allows mobile device 140 to execute computer readable instructions stored in memory 147 in order to perform processes discussed herein.
  • Networked computing environment 100 may provide a cloud computing environment for one or more computing devices. Cloud computing refers to Internet-based computing, wherein shared resources, software, and/or information are provided to one or more computing devices on-demand via the Internet (or other global network). The term “cloud” is used as a metaphor for the Internet, based on the cloud drawings used in computer network diagrams to depict the Internet as an abstraction of the underlying infrastructure it represents.
  • In one example of an AR environment, one or more users may move around a real-world environment (e.g., a living room), each wearing special glasses, such as mobile device 140, that allow the one or more users to observe views of the real world overlaid with virtual images of virtual objects that maintain a coherent spatial relationship with the real-world environment (i.e., as a particular user turns their head or moves within the real-world environment, the virtual images displayed to the particular user will change such that the virtual objects appear to exist within the real-world environment as perceived by the particular user). In one embodiment, environmental mapping of the real-world environment is performed by mapping server 150 (i.e., on the server side) while camera localization is performed on mobile device 140 (i.e., on the client side). The one or more users may also perceive 3-D localized virtual surround sounds that match the general acoustics of the real-world environment and that seem fixed in space with respect to the real-world environment.
  • In another example, live video images captured using a video camera on a mobile device, such as mobile device 120, may be augmented with computer-generated images of a virtual monster. The resulting augmented video images may then be displayed on a display of the mobile device in real-time such that an end user of the mobile device sees the virtual monster interacting with the real-world environment captured by the mobile device. In another example, the perception of a real-world environment may be augmented via a head-mounted display device (HMD), which may comprise a video see-through and/or an optical see-through system. Mobile device 140 is one example of an optical see-through HMD. An optical see-through HMD worn by an end user may allow actual direct viewing of a real-world environment (e.g., via transparent lenses) and may, at the same time, project images of a virtual object into the visual field of the end user thereby augmenting the real-world environment perceived by the end user with the virtual object.
  • FIG. 2 depicts one embodiment of a portion of an HMD, such as mobile device 140 in FIG. 1. Only the right side of a head-mounted device is depicted. HMD 200 includes right temple 202, nose bridge 204, eye glass 216, and eye glass frame 214. Built into nose bridge 204 is a microphone 210 for recording sounds and transmitting the audio recording to processing unit 236. A front facing camera 213 is embedded inside right temple 202 for recording digital images and/or videos and transmitting the visual recordings to processing unit 236. Front facing camera 213 may capture color information, IR information, and/or depth information. Microphone 210 and front facing camera 213 are in communication with processing unit 236.
  • Also embedded inside right temple 202 are ear phones 230, motion and orientation sensor 238, GPS receiver 232, power supply 239, and wireless interface 237, all in communication with processing unit 236. Motion and orientation sensor 238 may include a three axis magnetometer, a three axis gyro, and/or a three axis accelerometer. In one embodiment, the motion and orientation sensor 238 may comprise an inertial measurement unit (IMU). The GPS receiver may determine a GPS location associated with HMD 200. Processing unit 236 may include one or more processors and a memory for storing computer readable instructions to be executed on the one or more processors. The memory may also store other types of data to be executed on the one or more processors.
  • In one embodiment, eye glass 216 may comprise a see-through display, whereby virtual images generated by processing unit 236 may be projected and/or displayed on the see-through display. The front facing camera 213 may be calibrated such that the field of view captured by the front facing camera 213 corresponds with the field of view as seen by a user of HMD 200. The ear phones 230 may be used to output virtual sounds associated with the virtual images. In some embodiments, HMD 200 may include two or more front facing cameras (e.g., one on each temple) in order to obtain depth from stereo information associated with the field of view captured by the front facing cameras. The two or more front facing cameras may also comprise 3-D, IR, and/or RGB cameras. Depth information may also be acquired from a single camera utilizing depth from motion techniques. For example, two images may be acquired from the single camera associated with two different points in space at different points in time. Parallax calculations may then be performed given position information regarding the two different points in space.
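  • For the depth-from-motion example above, the parallax calculation reduces, in the rectified two-view case, to triangulating depth from disparity. The sketch below shows that standard relationship; the focal length, baseline, and disparity values are illustrative only.
```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth from the parallax between two views.

    disparity_px: horizontal shift (in pixels) of a feature between the two images.
    focal_length_px: camera focal length expressed in pixels.
    baseline_m: distance between the two camera positions (two cameras, or one
    camera at two points in space with known motion between them).
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_px * baseline_m / disparity_px

# Example: a 10-pixel disparity with a 600-pixel focal length and a 6 cm baseline
# places the point at 600 * 0.06 / 10 = 3.6 m from the camera.
print(depth_from_disparity(10.0, 600.0, 0.06))   # 3.6
```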
  • FIG. 3A depicts one embodiment of a field of view as seen by a user wearing a HMD such as mobile device 140 in FIG. 1. The user may see within the field of view both real objects and virtual objects. The real objects may include a chair 16 and a mapping system 10 (e.g., comprising a portion of an entertainment system). The virtual objects may include a virtual monster 17. As the virtual monster 17 is displayed or overlaid over the real-world environment as perceived through the see-through lenses of the HMD, the user may perceive that the virtual monster 17 exists within the real-world environment.
  • FIG. 3B depicts one embodiment of an AR environment. The AR environment includes a mapping system 10 and mobile devices 18 and 19. The mobile devices 18 and 19, depicted as special glasses in FIG. 3B, may include a HMD such as HMD 200 in FIG. 2. The mapping system 10 may include a computing environment 12, a capture device 20, and a display 14, all in communication with each other. Computing environment 12 may include one or more processors. Capture device 20 may include a color or depth sensing camera that may be used to visually monitor one or more targets including humans and one or more other objects within a particular environment. In one example, capture device 20 may comprise an RGB or depth camera and computing environment 12 may comprise a set-top box or gaming console. Mapping system 10 may support multiple mobile devices or clients.
  • The mobile devices 18 and 19 may include HMDs. As shown in FIG. 3B, user 28 wears mobile device 18 and user 29 wears mobile device 19. The mobile devices 18 and 19 may receive virtual data from mapping system 10 such that a virtual object is perceived to exist within a field of view as displayed through the respective mobile device. For example, as seen by user 28 through mobile device 18, the virtual object is displayed as the back of virtual monster 17. As seen by user 29 through mobile device 19, the virtual object is displayed as the front of virtual monster 17 appearing above the back of chair 16. The rendering of virtual monster 17 may be performed by mapping system 10 or by mobile devices 18 and 19. In one embodiment, mapping system 10 renders images of virtual monster 17 associated with a field of view of a particular mobile device and transmits the rendered images to the particular mobile device.
  • FIG. 3C depicts one embodiment of an AR environment utilizing the mapping system 10 and mobile devices 18 and 19 depicted in FIG. 3B. The mapping system 10 may track and analyze virtual objects within a particular environment such as virtual ball 27. The mapping system 10 may also track and analyze real objects within the particular environment such as user 28, user 29, and chair 16. The rendering of images associated with virtual ball 27 may be performed by mapping system 10 or by mobile devices 18 and 19.
  • In one embodiment, mapping system 10 tracks the position of virtual objects by taking into consideration the interaction between real and virtual objects. For example, user 28 may move their arm such that user 28 perceives hitting virtual ball 27. The mapping system 10 may subsequently apply a virtual force to virtual ball 27 such that both users 28 and 29 perceive that the virtual ball has been hit by user 28. In one example, mapping system 10 may register the placement of virtual ball 27 within a 3-D map of the particular environment and provide virtual data information to mobile devices 18 and 19 such that users 28 and 29 perceive the virtual ball 27 as existing within the particular environment from their respective points of view. In another embodiment, a particular mobile device may render virtual objects that are specific to the particular mobile device. For example, if the virtual ball 27 is only rendered on mobile device 18 then the virtual ball 27 would only be perceived as existing within the particular environment by user 28. In some embodiments, the dynamics of virtual objects may be performed on the particular mobile device and not on the mapping system.
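  • A toy illustration of such virtual-object dynamics is sketched below: an impulse (a "virtual force") from a detected hit updates the ball's velocity, and a simple integration step advances its position in map coordinates. The class layout and constants are assumptions for illustration, not this disclosure's implementation.
```python
import numpy as np

class VirtualBall:
    """Toy dynamics for a shared virtual object registered in a 3-D map."""
    def __init__(self, position, velocity=(0.0, 0.0, 0.0), mass=0.05):
        self.position = np.array(position, dtype=float)   # map coordinates, meters
        self.velocity = np.array(velocity, dtype=float)
        self.mass = mass

    def apply_virtual_force(self, force, duration):
        """Impulse from a detected hit (force in newtons applied over `duration` seconds)."""
        self.velocity += np.asarray(force, dtype=float) * duration / self.mass

    def step(self, dt, gravity=(0.0, -9.81, 0.0)):
        """Integrate motion; the updated position would then be provided to each mobile device."""
        self.velocity += np.asarray(gravity) * dt
        self.position += self.velocity * dt
```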
  • FIG. 4 illustrates one embodiment of a mapping system 50 including a capture device 58 and computing environment 54. Mapping system 50 is one example of an implementation for mapping system 10 in FIGS. 3B-3C. For example, computing environment 54 may correspond with computing environment 12 in FIGS. 3B-3C and capture device 58 may correspond with capture device 20 in FIGS. 3B-3C.
  • In one embodiment, the capture device 58 may include one or more image sensors for capturing images and videos. An image sensor may comprise a CCD image sensor or a CMOS sensor. In some embodiments, capture device 58 may include an IR CMOS image sensor. The capture device 58 may also include a depth camera (or depth sensing camera) configured to capture video with depth information including a depth image that may include depth values via any suitable technique including, for example, time-of-flight, structured light, stereo image, or the like.
  • The capture device 58 may include an image camera component 32. In one embodiment, the image camera component 32 may include a depth camera that may capture a depth image of a scene. The depth image may include a two-dimensional (2-D) pixel area of the captured scene where each pixel in the 2-D pixel area may represent a depth value such as a distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the camera.
  • The image camera component 32 may include an IR light component 34, a three-dimensional (3-D) camera 36, and an RGB camera 38 that may be used to capture the depth image of a capture area. For example, in time-of-flight analysis, the IR light component 34 of the capture device 58 may emit an infrared light onto the capture area and may then use sensors to detect the backscattered light from the surface of one or more objects in the capture area using, for example, the 3-D camera 36 and/or the RGB camera 38. In some embodiments, pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 58 to a particular location on the one or more objects in the capture area. Additionally, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device to a particular location associated with the one or more objects.
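  • The time-of-flight relationships described above can be written down directly: a pulsed measurement converts a round-trip time into distance, and a continuous-wave measurement converts a phase shift at a known modulation frequency into distance. The sketch below encodes both standard formulas; the example numbers are illustrative only.
```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_pulse(round_trip_time_s):
    """Pulsed time-of-flight: the light travels out and back, so halve the path."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def distance_from_phase_shift(phase_shift_rad, modulation_freq_hz):
    """Continuous-wave time-of-flight: distance from the phase shift between the
    emitted and received modulated light (unambiguous only within half a
    modulation wavelength)."""
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

# Example: a 20 ns round trip corresponds to roughly 3 m.
print(distance_from_pulse(20e-9))   # ~2.998 m
```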
  • In another example, the capture device 58 may use structured light to capture depth information. In such an analysis, patterned light (i.e., light displayed as a known pattern such as grid pattern or a stripe pattern) may be projected onto the capture area via, for example, the IR light component 34. Upon striking the surface of one or more objects (or targets) in the capture area, the pattern may become deformed in response. Such a deformation of the pattern may be captured by, for example, the 3-D camera 36 and/or the RGB camera 38 and analyzed to determine a physical distance from the capture device to a particular location on the one or more objects.
  • In some embodiments, two or more different cameras may be incorporated into an integrated capture device. For example, a depth camera and a video camera (e.g., an RGB video camera) may be incorporated into a common capture device. In some embodiments, two or more separate capture devices of the same or differing types may be cooperatively used. For example, a depth camera and a separate video camera may be used, two video cameras may be used, two depth cameras may be used, two RGB cameras may be used or any combination and number of cameras may be used. In one embodiment, the capture device 58 may include two or more physically separated cameras that may view a capture area from different angles to obtain visual stereo data that may be resolved to generate depth information. Depth may also be determined by capturing images using a plurality of detectors that may be monochromatic, infrared, RGB, or any other type of detector and performing a parallax calculation. Other types of depth image sensors can also be used to create a depth image.
  • As shown in FIG. 4, capture device 58 may include a microphone 40. The microphone 40 may include a transducer or sensor that may receive and convert sound into an electrical signal.
  • The capture device 58 may include a processor 42 that may be in operative communication with the image camera component 32. The processor may include a standardized processor, a specialized processor, a microprocessor, or the like. The processor 42 may execute instructions that may include instructions for storing filters or profiles, receiving and analyzing images, determining whether a particular situation has occurred, or any other suitable instructions. It is to be understood that at least some image analysis and/or target analysis and tracking operations may be executed by processors contained within one or more capture devices such as capture device 58.
  • The capture device 58 may include a memory 44 that may store the instructions that may be executed by the processor 42, images or frames of images captured by the 3-D camera or RGB camera, filters or profiles, or any other suitable information, images, or the like. In one example, the memory 44 may include random access memory (RAM), read only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component. As shown in FIG. 4, the memory 44 may be a separate component in communication with the image capture component 32 and the processor 42. In another embodiment, the memory 44 may be integrated into the processor 42 and/or the image capture component 32. In other embodiments, some or all of the components 32, 34, 36, 38, 40, 42 and 44 of the capture device 58 illustrated in FIG. 4 are housed in a single housing.
  • The capture device 58 may be in communication with the computing environment 54 via a communication link 46. The communication link 46 may be a wired connection including, for example, a USB connection, a FireWire connection, an Ethernet cable connection, or the like and/or a wireless connection such as a wireless 802.11b, g, a, or n connection. The computing environment 54 may provide a clock to the capture device 58 that may be used to determine when to capture, for example, a scene via the communication link 46. In one embodiment, the capture device 58 may provide the images captured by, for example, the 3D camera 36 and/or the RGB camera 38 to the computing environment 54 via the communication link 46.
  • As shown in FIG. 4, computing environment 54 includes image and audio processing engine 194 in communication with operating system 196. Image and audio processing engine 194 includes virtual data engine 197, gesture recognizer engine 190, structure data 198, processing unit 191, and memory unit 192, all in communication with each other. Image and audio processing engine 194 processes video, image, and audio data received from capture device 58. To assist in the detection and/or tracking of objects, image and audio processing engine 194 may utilize structure data 198 and gesture recognition engine 190. Virtual data engine 197 processes virtual objects and registers the position and orientation of virtual objects in relation to various maps of a real-world environment stored in memory unit 192.
  • Processing unit 191 may include one or more processors for executing object, facial, and voice recognition algorithms. In one embodiment, image and audio processing engine 194 may apply object recognition and facial recognition techniques to image or video data. For example, object recognition may be used to detect particular objects (e.g., soccer balls, cars, or landmarks) and facial recognition may be used to detect the face of a particular person. Image and audio processing engine 194 may apply audio and voice recognition techniques to audio data. For example, audio recognition may be used to detect a particular sound. The particular faces, voices, sounds, and objects to be detected may be stored in one or more memories contained in memory unit 192.
  • In some embodiments, one or more objects being tracked may be augmented with one or more markers such as an IR retroreflective marker to improve object detection and/or tracking. Planar reference images, coded AR markers, QR codes, and/or bar codes may also be used to improve object detection and/or tracking. Upon detection of one or more objects, image and audio processing engine 194 may report to operating system 196 an identification of each object detected and a corresponding position and/or orientation.
  • The image and audio processing engine 194 may utilize structural data 198 while performing object recognition. Structure data 198 may include structural information about targets and/or objects to be tracked. For example, a skeletal model of a human may be stored to help recognize body parts. In another example, structure data 198 may include structural information regarding one or more inanimate objects in order to help recognize the one or more inanimate objects.
  • The image and audio processing engine 194 may also utilize gesture recognizer engine 190 while performing object recognition. In one example, gesture recognizer engine 190 may include a collection of gesture filters, each comprising information concerning a gesture that may be performed by a skeletal model. The gesture recognition engine 190 may compare the data captured by capture device 58 in the form of the skeletal model and movements associated with it to the gesture filters in a gesture library to identify when a user (as represented by the skeletal model) has performed one or more gestures. In one example, image and audio processing engine 194 may use the gesture recognition engine 190 to help interpret movements of a skeletal model and to detect the performance of a particular gesture.
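  • As a loose illustration of comparing captured skeletal-model movement against a gesture filter, the sketch below scores a joint trajectory against a stored template with a simple mean position error. A real gesture library would use richer filters; the scoring, threshold, and data layout here are assumptions for illustration only.
```python
import numpy as np

def matches_gesture(joint_trajectory, gesture_filter, threshold=0.15):
    """Compare a recorded skeletal-joint trajectory against one gesture filter.

    joint_trajectory and gesture_filter are (T, 3) arrays of a joint's positions
    over time, resampled to the same length; a mean position error (in meters)
    stands in for whatever scoring a real gesture library would use.
    """
    a = np.asarray(joint_trajectory, dtype=float)
    b = np.asarray(gesture_filter, dtype=float)
    a = a - a[0]          # make both trajectories start at the origin
    b = b - b[0]
    return float(np.mean(np.linalg.norm(a - b, axis=1))) < threshold
```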
  • More information about the detection and tracking of objects can be found in U.S. patent application Ser. No. 12/641,788, "Motion Detection Using Depth Images," filed on Dec. 18, 2009; and U.S. patent application Ser. No. 12/475,308, "Device for Identifying and Tracking Multiple Humans over Time," both of which are incorporated herein by reference in their entirety. More information about gesture recognizer engine 190 can be found in U.S. patent application Ser. No. 12/422,661, "Gesture Recognizer System Architecture," filed on Apr. 13, 2009, incorporated herein by reference in its entirety. More information about recognizing gestures can be found in U.S. patent application Ser. No. 12/391,150, "Standard Gestures," filed on Feb. 23, 2009; and U.S. patent application Ser. No. 12/474,655, "Gesture Tool," filed on May 29, 2009, both of which are incorporated by reference herein in their entirety.
  • FIG. 5 depicts one embodiment of an AR system 600. AR system 600 includes mapping server 620 and mobile device 630. Mapping server 620 may comprise a mapping system such as mapping system 50 in FIG. 4. Mapping server 620 may work asynchronously to build one or more maps associated with one or more real-world environments. The one or more maps may include sparse 3-D maps and/or dense 3-D maps based on images captured from a variety of sources including sensors dedicated to mapping server 620. Mobile device 630 may comprise an HMD such as mobile device 140 in FIG. 1. Although a single mobile device 630 is depicted, mapping server 620 may support numerous mobile devices.
  • The process steps 621-624 associated with mapping server 620 are decoupled from and are performed independently of the process steps 631-635 associated with the mobile device 630. In step 621, images are received from one or more server sensors. One example of the one or more server sensors includes the 3-D and/or RGB cameras within the image camera component 32 in FIG. 4. The one or more server sensors may be associated with one or more stationary cameras and/or one or more mobile cameras which may move about a particular environment over time. The one or more mobile cameras may move in a predetermined or predictable manner about the particular environment (e.g., a dedicated mapping server camera that runs on tracks above a room or hangs from the ceiling of a room and rotates in a deterministic manner). In one embodiment, the one or more server sensors providing images to the mapping server 620 may move about an environment attached to a mobile robot or autonomous vehicle.
  • In step 622, images received from the one or more server sensors are registered. During image registration, the mapping server 620 may register different images taken within a particular environment (e.g., images from different points of view, taken at different points in time, and/or images associated with different types of information such as color or depth information) into a single real-world coordinate system associated with the particular environment. In one example, image registration into a common coordinate system may be performed using an extrinsic calibration process. The registration and/or alignment of images (or objects within the images) onto a common coordinate system allows the mapping server 620 to be able to compare and integrate real-world objects, landmarks, or other features extracted from the different images into one or more maps associated with the particular environment. The one or more maps associated with a particular environment may comprise 3-D maps.
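  • The registration step above amounts to expressing measurements from each sensor in one shared coordinate system. Assuming each sensor's extrinsic calibration is available as a rigid transform, a minimal sketch of mapping camera-frame points into the common world frame is shown below; the variable names and conventions are illustrative, not taken from this disclosure.
```python
import numpy as np

def to_world_coordinates(points_cam, R_cw, t_cw):
    """Map 3-D points from one sensor's camera frame into the shared world frame.

    R_cw (3x3) and t_cw (3,) form the extrinsic calibration of that sensor, i.e.
    the rigid transform from camera coordinates to the common real-world
    coordinate system. Points registered this way from different sensors can be
    compared and integrated into a single map of the environment.
    """
    points_cam = np.asarray(points_cam, dtype=float)       # (N, 3)
    return points_cam @ np.asarray(R_cw).T + np.asarray(t_cw)

# Example: two sensors observing the same corner should map it to (nearly) the
# same world coordinate once each sensor's extrinsics are applied.
```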
  • More information about generating 3-D maps can be found in U.S. patent application Ser. No. 13/017,690, “Three-Dimensional Environment Reconstruction,” incorporated herein by reference in its entirety.
  • In one embodiment, one or more additional images derived from sources decoupled from mapping server 620 may be used to extend or update the one or more maps associated with the particular environment. For example, client images 640 derived from sensors associated with mobile device 630 may be used by mapping server 620 to refine the one or more maps.
  • In some embodiments, the registration of one or more images may require knowledge of one or more poses associated with the one or more images. For example, for each image being registered, a six degree of freedom (6DOF) pose may be provided including information associated with the position and orientation of the particular sensor from which the image was captured. Mapping and pose estimation (or localization) associated with a particular sensor may be determined using traditional SLAM or PTAM techniques.
  • In step 623, a sparse map representing a portion of a particular environment is generated. The sparse map may comprise a 3-D point cloud and may include one or more image descriptors. The mapping server 620 may identify one or more image descriptors associated with a particular map of the one or more maps by applying various image processing methods such as object recognition, feature detection, corner detection, blob detection, and edge detection methods. The one or more image descriptors may be used as landmarks in determining a particular pose, position, and/or orientation in relation to the particular map. An image descriptor may include color and/or depth information associated with a particular object (e.g., a red apple) or a portion of a particular object within the particular environment (e.g., the top of a red apple). In some embodiments, an initial sparse map is generated first and then subsequently refined and updated over time. The initial sparse map may be limited to only covering static objects within the particular environment.
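  • As one concrete (but not prescribed) way to obtain image descriptors of the kind described above, the sketch below uses OpenCV's ORB detector to extract keypoints and binary descriptors from an image. The library choice and parameters are assumptions for illustration; this disclosure does not mandate any particular feature type.
```python
import cv2

def extract_landmark_descriptors(image_bgr, max_features=1000):
    """Detect keypoints and compute descriptors that could serve as map landmarks."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    # In a sparse map, each descriptor would be associated with a 3-D point obtained
    # by back-projecting its keypoint using depth data and the registered 6DOF pose.
    return keypoints, descriptors
```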
  • In step 624, a dense map representing a portion of the particular environment is generated. The dense map may comprise a dense 3-D surface mesh, which may be created using the 3-D point cloud generated in step 623. The dense map may provide sufficient detail of the particular environment so as to enable complex augmented reality applications such as those that must handle collisions and occlusions due to the interaction of real objects and virtual objects within the particular environment. In one embodiment, a particular dense map is not generated by mapping server 620 until a request for the particular dense map is received from mobile device 630. In another embodiment, a mobile device 630 may request a particular dense map first without requesting a related sparse map. In some embodiments, the process steps 621-624 associated with mapping server 620 may be performed in the cloud.
  • As mobile device 630 moves around a particular environment it may request mapping data 641 and/or mapping data 642 associated with a particular environment. The decision to request either mapping data 641 or mapping data 642 depends on the requirements of a particular AR application running on mobile device 630. In step 631, one or more client images are received from one or more client sensors. One example of the one or more client sensors includes the 3-D and/or RGB cameras associated with HMD 200 in FIG. 2. In one embodiment, the one or more client sensors detect light from a patterned or structured light source associated with and projected by mapping server 620. This allows several mobile devices to each independently utilize the same structured light source for localization purposes.
  • In step 632, one or more client image descriptors are extracted from the one or more client images. The mobile device 630 may identify one or more client image descriptors within the one or more client images by applying various image processing methods such as object recognition, feature detection, corner detection, blob detection, and edge detection methods. In step 633, a particular map is acquired from mapping server 620. The particular map may be included within mapping data 641. The map may be a 3-D map and may include one or more image descriptors associated with one or more objects located within the particular environment described by the 3-D map. An updated map may be requested by mobile device 630 every time the mobile device enters a new portion of the particular environment or at a predefined frequency (e.g., every 10 minutes).
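A mobile device might decide when to re-request a map with a simple policy that combines the two triggers mentioned above: entering a new portion of the environment and a fixed refresh interval. The sketch below is an assumed client-side policy; the region identifier and the 10-minute default are illustrative.

```python
import time

class MapRefreshPolicy:
    """Assumed client-side policy: refresh on entering a new region or after a fixed interval."""

    def __init__(self, refresh_interval_s=600.0):     # e.g., every 10 minutes
        self.refresh_interval_s = refresh_interval_s
        self.current_region = None
        self.last_refresh = 0.0

    def needs_refresh(self, region_id, now=None):
        now = time.time() if now is None else now
        if region_id != self.current_region:
            return True                               # entered a new portion of the environment
        return (now - self.last_refresh) >= self.refresh_interval_s

    def mark_refreshed(self, region_id, now=None):
        self.current_region = region_id
        self.last_refresh = time.time() if now is None else now
```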
  • In order to augment a real-world environment as observed from the point of view of mobile device 630, the pose associated with a particular field of view of mobile device 630 in relation to the real-world environment must be determined. In step 634, a pose (e.g., a 6DOF pose) associated with a field of view of mobile device 630 is determined via localization and/or pose estimation. Pose estimation may include the matching and aligning of detected image descriptors. For example, matching may be performed between the one or more client image descriptors and the image descriptors received from mapping server 620 included within mapping data 641. In some embodiments, matching may take into consideration depth and color information associated with the one or more image descriptors. In other embodiments, the one or more client images comprise color images (e.g., RGB images) and matching is performed for only image descriptors including color information, thereby enabling image-based localization for mobile devices with only RGB sensor input. Alignment of matched image descriptors may be performed using image or frame alignment techniques, which may include determining point to point correspondences between the field of view of mobile device 630 and views derived from the particular map.
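One common way to realize the matching-and-alignment step described above is to match client descriptors against map descriptors and then solve a perspective-n-point problem over the resulting 2-D/3-D correspondences. The OpenCV-based sketch below is only one possible realization and is not taken from the patent; the camera intrinsics, binary descriptors, and map layout are assumptions.

```python
import cv2
import numpy as np

def estimate_pose(client_keypoints, client_desc, map_points_3d, map_desc, camera_matrix):
    """Estimate a 6DOF pose by matching binary descriptors and solving PnP with RANSAC (illustrative)."""
    # Match client descriptors against map descriptors (Hamming distance suits binary descriptors).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(client_desc, map_desc)
    if len(matches) < 6:
        return None                                    # not enough correspondences to localize

    # Build 2-D (client image) / 3-D (map) correspondences from the matches.
    image_pts = np.float32([client_keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Solve for rotation and translation (the 6DOF pose) with outlier rejection.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(object_pts, image_pts, camera_matrix, None)
    return (rvec, tvec) if ok else None
```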
  • In some embodiments, mobile device 630 may include a GPS receiver, such as GPS receiver 232 in FIG. 2, and a motion and orientation sensor, such as motion and orientation sensor 238 in FIG. 2. Prior to image processing, a first-pass estimate for the pose associated with mobile device 630 may be obtained by utilizing GPS location information and orientation information generated on mobile device 630. Mobile device 630 may also obtain location and orientation information by detecting, tracking, and triangulating the position of physical tags (e.g., reflective markers) or emitters (e.g., LEDs) attached to one or more other mobile devices using marker-based motion capture technologies. The location and/or orientation information associated with mobile device 630 may be used to improve the accuracy of the localization step.
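As a small illustration of how such a coarse prior could help, the snippet below uses a GPS fix to pick the nearest candidate map region before any image matching is attempted. This pre-filtering step is an assumption for illustration, not a procedure spelled out in the patent; region_centers is a hypothetical mapping from region identifiers to latitude/longitude centers.

```python
def nearest_region(gps_lat, gps_lon, region_centers):
    """Pick the map region whose center is closest to a coarse GPS fix (assumed pre-filter)."""
    def squared_dist(center):
        lat, lon = center
        return (lat - gps_lat) ** 2 + (lon - gps_lon) ** 2   # adequate over small areas
    return min(region_centers, key=lambda rid: squared_dist(region_centers[rid]))
```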
  • In one embodiment, localization of mobile device 630 includes searching for image descriptors associated with landmarks or other objects within a field of view of mobile device 630. The extracted image descriptors may subsequently be matched with image descriptors associated with the most recent map of the particular environment acquired from mapping server 620 in step 633. Landmarks may be associated with image features that are easily observed and distinguished from other features within the field of view. Stationary landmarks may act as anchor points or reference points in a 3-D map of a real-world environment. Other points of interest within a real-world environment (e.g., geographical elements such as land features) may also be used for both mapping and localization purposes. Landmarks and other points of interest within the real-world environment may be detected using various image processing methods such as object recognition, frame alignment, feature detection, corner detection, blob detection, and edge detection methods. As mobile device 630 moves around within an environment, mobile device 630 may continuously search for and extract landmarks and associate newly discovered landmarks with those found previously.
  • For each object within an environment, there may be one or more image descriptors associated with the object that can be extracted and used to identify the object. An image descriptor may comprise image information related to a portion of an object or to the entire object. In one example, an image descriptor may describe characteristics of the object such as its location, color, texture, shape, and/or its relationship to other objects or landmarks within the environment. Utilizing image processing techniques such as object and pattern matching, the one or more image descriptors may be used to locate an object in an image containing other objects. It is desirable that the image processing techniques for detecting and matching the one or more image descriptors be robust to changes in image scale, noise, illumination, local geometric distortion, and image orientation.
  • In some embodiments, localization of mobile device 630 may be performed in the cloud (e.g., by the mapping server 620). For example, the mobile device 630 may transmit client images 640 derived from sensors associated with mobile device 630 and/or image descriptors 644 extracted in step 632 to the mapping server 620. The mapping server 620 may perform the mobile device localization based on the received client images 640 and/or image descriptors 644 utilizing process steps similar to those performed by mobile device 630 in step 634. Subsequently, mobile device 630 may acquire a pose associated with a field of view of mobile device 630 from the mapping server 620 without needing to acquire the particular map in step 633 or perform localization step 634 (i.e., steps 633 and 634 may be omitted if localization of the mobile device is performed by the mapping server).
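The offloaded-localization path can be summarized as a small request/response exchange: the device uploads its client images or extracted descriptors and receives a pose back. The sketch below shows an assumed message shape; the mapping_server object, its localize method, and the field names are hypothetical.

```python
def localize_on_server(mapping_server, device_id, keypoints_2d, descriptors):
    """Assumed request/response exchange for server-side localization (names are hypothetical)."""
    request = {
        "device_id": device_id,
        "keypoints_2d": keypoints_2d,   # pixel coordinates of the client image descriptors (step 632)
        "descriptors": descriptors,     # the extracted client image descriptors
    }
    # The server performs matching and alignment (as in step 634) and returns a 6DOF pose,
    # so the device can skip acquiring the map (step 633) and localizing locally (step 634).
    response = mapping_server.localize(request)
    return response["rotation"], response["translation"]
```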
  • More information regarding performing pose estimation and/or localization for a mobile device can be found in U.S. patent application Ser. No. 13/017,474, “Mobile Camera Localization Using Depth Maps,” incorporated herein by reference in its entirety.
  • More information regarding differentiating objects within an environment can be found in U.S. patent application Ser. No. 13/017,626, “Moving Object Segmentation Using Depth Images,” incorporated herein by reference in its entirety.
  • Referring to FIG. 5, in step 635, one or more virtual objects are displayed on mobile device 630. The one or more virtual objects may be rendered on mobile device 630 using a processing element, such as processing unit 236 in FIG. 2. Rendering may involve the use of ray tracing techniques (e.g., simulating the propagation of light and sound waves through a model of the AR environment) in order to generate virtual images and/or sounds from the perspective of a mobile device. Virtual data associated with the one or more virtual objects may be generated by the mapping server 620, mobile device 630, or both.
  • In one embodiment, the one or more virtual objects are rendered locally on mobile device 630. In another embodiment, the one or more virtual objects are rendered on mapping server 620 and virtual images corresponding with the determined pose in step 634 may be transmitted to mobile device 630. Virtual images associated with the one or more virtual objects may be displayed on mobile device 630 such that the one or more virtual objects are perceived to exist within a field of view associated with the previously determined pose of step 634. The virtual data associated with the one or more virtual objects may also include virtual sounds. In some embodiments, mobile device 630 may request a dense 3-D map of a particular environment in order to realistically render virtual objects and to handle collisions and occlusions due to the interaction of real objects and virtual objects within the particular environment. Virtual sounds may also be registered in relation to the dense 3-D map. The dense 3-D map may be acquired from mapping server 620 and included within mapping data 642.
  • More information regarding AR systems and environments can be found in U.S. patent application Ser. No. 12/912,937, “Low-Latency Fusing of Virtual and Real Content,” incorporated herein by reference in its entirety.
  • More information regarding generating virtual sounds in an AR environment can be found in U.S. patent application Ser. No. 12/903,610, “System and Method for High-Precision 3-Dimensional Audio for Augmented Reality,” incorporated herein by reference in its entirety.
  • FIG. 6A is a flowchart describing one embodiment of a process for generating a 3-D map of a real-world environment and locating virtual objects within the 3-D map. The process of FIG. 6A may be performed continuously and by one or more computing devices. Each step in the process of FIG. 6A may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device. In one embodiment, the process of FIG. 6A is performed by a mapping server such as mapping server 620 in FIG. 5.
  • In step 702, a 3-D map of a real-world environment is generated. In step 704, one or more virtual objects may be computer generated. In one example, the 3-D map and the one or more virtual objects are generated using a mapping server. The one or more virtual objects may be stored in a database on the mapping server. In step 706, the one or more virtual objects may be registered in relation to the 3-D map. If registration is performed by the mapping server, then the mapping server may control the location of the virtual objects being displayed by one or more mobile devices.
  • In step 708, a determination is made as to whether or not the location of one or more virtual objects in relation to the 3-D map has changed. The change in location may be due to a perceived interaction between one or more virtual objects and/or one or more real objects, or due to the natural movement of a virtual object through an environment (e.g., via physics simulation). If the location of one or more virtual objects has changed, then the one or more virtual objects are reregistered in relation to the 3-D map in step 706. If the one or more virtual objects are all characterized as static then step 708 may be omitted.
  • In step 710, location information associated with one or more mobile devices is received. The received location information may be used by a mapping server to customize a particular map associated with the location information. In step 712, at least a portion of the 3-D map is outputted to the one or more mobile devices. The 3-D map may include a customized map of an environment associated with the location information received in step 710. The 3-D map may include one or more image descriptors. The one or more image descriptors may include RGB image descriptors and/or 3-D image descriptors. In one embodiment, a mapping server outputs one or more image descriptors in response to a request from a mobile device for either RGB image descriptors or 3-D image descriptors depending on the requirements of a particular application running on the mobile device. In step 714, virtual data associated with the one or more virtual objects is outputted to the one or more mobile devices.
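To illustrate how a server might act on step 710's location information and step 712's choice between RGB and 3-D image descriptors, the sketch below crops a map around the reported device location and filters the descriptor set by the requested type. The map representation, the radius, and the descriptor-type keys are assumptions made for illustration.

```python
def select_map_for_device(full_map, device_location, descriptor_type="rgb", radius=25.0):
    """Assumed server-side selection: crop the map around the device and filter descriptor types."""
    # Keep only landmarks within `radius` of the reported device location (a simple customization).
    nearby = [lm for lm in full_map["landmarks"]
              if euclidean(lm["position"], device_location) <= radius]
    # Return RGB descriptors or 3-D (depth-based) descriptors, depending on the application's request.
    descriptors = [lm[descriptor_type] for lm in nearby if descriptor_type in lm]
    return {"landmarks": nearby, "descriptors": descriptors}

def euclidean(p, q):
    """Euclidean distance between two points given as coordinate tuples."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
```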
  • FIG. 6B is a flowchart describing one embodiment of a process for updating a 3-D map of a real-world environment. The process of FIG. 6B may be performed continuously and by one or more computing devices. Each step in the process of FIG. 6B may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device. In one embodiment, the process of FIG. 6B is performed by a mapping server such as mapping server 620 in FIG. 5.
  • In step 802, one or more images of a first environment are acquired from one or more cameras associated with a mapping server. In step 804, the one or more images are registered. In step 806, a first map of the first environment is generated. In step 808, one or more objects existing within the first environment are detected and characterized. Each of the one or more objects may be characterized as static, semi-static, or dynamic. An object characterized as static is one which does not move. An object characterized as semi-static is one which does not move often. Semi-static objects may be useful for mapping and/or localization purposes because they may act as static landmarks over a period of time within an environment. Object recognition techniques may be used to identify objects, such as furniture or appliances, that may typically be classified as semi-static objects. In one embodiment, maps of an environment are stored and time stamped (e.g., every 2 hours) in order to detect movement of landmarks and to possibly recharacterize one or more objects. In some embodiments, mapping and localization are performed using only objects classified as static objects, thereby discarding any objects characterized as either semi-static or dynamic (i.e., not taking into consideration any moving objects).
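One simple way to recharacterize objects from time-stamped maps, consistent with the static/semi-static/dynamic categories above, is to compare how far each object has moved between snapshots. The thresholds and input format below are illustrative assumptions.

```python
def characterize_object(positions_over_time, move_tol=0.05, semi_static_ratio=0.2):
    """Classify an object as 'static', 'semi-static', or 'dynamic' from time-stamped positions (illustrative)."""
    moves = 0
    for prev, curr in zip(positions_over_time, positions_over_time[1:]):
        displacement = sum((a - b) ** 2 for a, b in zip(prev, curr)) ** 0.5
        if displacement > move_tol:                  # moved more than the tolerance between snapshots
            moves += 1
    intervals = max(len(positions_over_time) - 1, 1)
    if moves == 0:
        return "static"                              # never observed moving; usable as a landmark
    if moves / intervals <= semi_static_ratio:
        return "semi-static"                         # moves occasionally (e.g., furniture)
    return "dynamic"                                 # moves often; may be excluded from localization
```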
  • In step 810, one or more additional images of the first environment are received from a first mobile device. Localization information associated with the one or more additional images may also be received from the first mobile device. Obtaining pose information along with one or more additional images allows a mapping server to generate one or more maps associated with an environment beyond that capable of being sensed by dedicated mapping server sensors. The one or more additional images may also include depth information. In step 812, the one or more additional images are registered. In some embodiments, the one or more additional images are registered by a mapping server. A mapping server may also accept information from one or more mobile devices in the form of registered RGB data and/or depth reconstructions. In step 814, the first map is updated based on the one or more additional images. In step 816, one or more new objects existing within the first environment are detected and characterized. In step 818, at least a subset of the updated first map is transmitted to a second mobile device. In one embodiment, a first set of mobile devices may be used to update the first map, while a second set of mobile devices perform localization in relation to the updated first map.
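As a sketch of the registration-and-update path in steps 810-814, the snippet below folds pose-annotated client frames into the server's point cloud by transforming each frame's points into the map's coordinate frame. The frame dictionary layout is an assumption; the transform itself is the standard rigid-body registration implied by the reported 6DOF pose.

```python
import numpy as np

def update_map_with_client_frames(map_points, client_frames):
    """Assumed merge step: register client-contributed frames into the map's point cloud (steps 812-814)."""
    for frame in client_frames:
        R = frame["rotation"]                       # 3x3 rotation from the reported 6DOF pose (step 810)
        t = frame["translation"]                    # 3-vector translation from the same pose
        registered = frame["points"] @ R.T + t      # transform local Nx3 points into map coordinates
        map_points = np.vstack([map_points, registered])
    return map_points
```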
  • FIG. 7 is a flowchart describing one embodiment of a process for rendering and displaying virtual objects. The process of FIG. 7 may be performed continuously and by one or more computing devices. Each step in the process of FIG. 7 may be performed by the same or different computing devices as those used in other steps, and each step need not necessarily be performed by a single computing device. In one embodiment, the process of FIG. 7 is performed by a mobile device such as mobile device 630 in FIG. 5.
  • In step 902, it is determined whether a first map is required. In one example, a mobile device may determine whether a first map is required for localization purposes by determining a location of the mobile device (e.g., by acquiring a GPS location) and comparing the location with the locations associated with one or more maps stored on the mobile device. A first map may be required if none of the one or more maps stored on the mobile device provide adequate coverage of a particular environment or no map is deemed to be a valid map of the environment encompassing the location. A particular map stored on a mobile device may be deemed invalid because the map has not been refreshed within a certain period of time or an expiration date associated with the particular map has passed. In one example, a particular map stored on a mobile device may be invalidated by a mapping server upon an update to the particular map made by the mapping server.
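The decision in step 902 can be viewed as a coverage-and-freshness test over the locally cached maps. The sketch below is an assumed check; the callable coverage test and the expiration/invalidation fields are illustrative stand-ins for however a particular implementation tracks map validity.

```python
import time

def needs_new_map(cached_maps, device_location, now=None):
    """Return True if no cached map validly covers the device's current location (assumed check)."""
    now = time.time() if now is None else now
    for cached in cached_maps:
        covers = cached["covers"](device_location)                  # coverage test supplied with the map
        not_expired = cached.get("expires_at", float("inf")) > now  # map has not passed its expiration
        not_invalidated = not cached.get("invalidated", False)      # e.g., the mapping server pushed an update
        if covers and not_expired and not_invalidated:
            return False        # a valid cached map already covers this location
    return True                 # request a first (localization) map, as in step 904
```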
  • In step 904, the first map is acquired. The first map may be acquired from a mapping server. In one embodiment, a mobile device may transmit a request for the first map to a mapping server. In another embodiment, a mobile device may transmit a request for the first map to a second mobile device located within a particular environment. In this case, the second mobile device may act as a local mapping cache for the mobile device. The first map may include one or more image descriptors associated with one or more real objects within a particular environment. In step 906, the first map is stored. The first map may be stored in a non-volatile memory within a mobile device.
  • In step 908, a first pose associated with the mobile device in relation to the first map is determined. The pose estimation or determination may be performed on a mobile device or on a mapping server. In one embodiment, localization is performed on a mobile device using the process of step 634 in FIG. 5. When performing localization on the mobile device, image descriptors associated with non-static objects may be discarded. In another embodiment, a mobile device may transmit images of its environment to a mapping server and the mapping server may perform the localization process on behalf of the mobile device.
  • In step 910, it is determined whether a second map is required. In one example, a mobile device may determine whether a second map is required for rendering and/or display purposes by determining whether a virtual object is located within (or near) a field of view associated with the first pose determined in step 908. The second map may comprise a higher resolution map of a portion of the first map. In step 912, the second map is acquired. In step 914, the second map is stored. In step 916, a virtual object is registered in relation to the second map. In one example, a mobile device registers the virtual object in relation to the second map prior to rendering the virtual object.
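Step 910's test can be approximated by checking whether any virtual object lies inside (or near) the viewing volume implied by the first pose. The conical field-of-view test below is a deliberate simplification and an assumption; a real implementation might use the full camera frustum.

```python
import numpy as np

def needs_rendering_map(virtual_object_positions, cam_position, cam_forward,
                        fov_deg=60.0, margin_deg=10.0):
    """Return True if any virtual object is within (or near) a conical field of view (simplified assumption)."""
    half_angle = np.radians(fov_deg / 2.0 + margin_deg)
    cam_position = np.asarray(cam_position, dtype=float)
    forward = np.asarray(cam_forward, dtype=float)
    forward = forward / np.linalg.norm(forward)
    for pos in virtual_object_positions:
        to_obj = np.asarray(pos, dtype=float) - cam_position
        dist = np.linalg.norm(to_obj)
        if dist == 0.0:
            return True
        angle = np.arccos(np.clip(np.dot(to_obj / dist, forward), -1.0, 1.0))
        if angle <= half_angle:
            return True          # a virtual object is in or near view; fetch the higher-resolution map
    return False
```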
  • In step 918, the virtual object is rendered. The rendering step may include receiving virtual data associated with the virtual object from a mapping server and generating one or more virtual images of the virtual object based on the virtual data. In step 920, the virtual object is displayed. The virtual object may be displayed by displaying on a mobile device one or more virtual images associated with the virtual object. The one or more virtual images may correspond with a view of the virtual object such that the virtual object is perceived to exist within the field of view associated with the first pose determined in step 908.
  • The disclosed technology may be used with various computing systems. FIGS. 8-10 provide examples of various computing systems that can be used to implement embodiments of the disclosed technology.
  • FIG. 8 is a block diagram of an embodiment of a gaming and media system 7201. Console 7203 has a central processing unit (CPU) 7200, and a memory controller 7202 that facilitates processor access to various types of memory, including a flash Read Only Memory (ROM) 7204, a Random Access Memory (RAM) 7206, a hard disk drive 7208, and portable media drive 7107. In one implementation, CPU 7200 includes a level 1 cache 7210 and a level 2 cache 7212, to temporarily store data and hence reduce the number of memory access cycles made to the hard drive 7208, thereby improving processing speed and throughput.
  • CPU 7200, memory controller 7202, and various memory devices are interconnected via one or more buses (not shown). The one or more buses might include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus.
  • In one implementation, CPU 7200, memory controller 7202, ROM 7204, and RAM 7206 are integrated onto a common module 7214. In this implementation, ROM 7204 is configured as a flash ROM that is connected to memory controller 7202 via a PCI bus and a ROM bus (neither of which are shown). RAM 7206 is configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by memory controller 7202 via separate buses (not shown). Hard disk drive 7208 and portable media drive 7107 are shown connected to the memory controller 7202 via the PCI bus and an AT Attachment (ATA) bus 7216. However, in other implementations, dedicated data bus structures of different types may also be applied in the alternative.
  • A three-dimensional graphics processing unit 7220 and a video encoder 7222 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing. Data are carried from graphics processing unit 7220 to video encoder 7222 via a digital video bus (not shown). An audio processing unit 7224 and an audio codec (coder/decoder) 7226 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between audio processing unit 7224 and audio codec 7226 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 7228 for transmission to a television or other display. In the illustrated implementation, video and audio processing components 7220-7228 are mounted on module 7214.
  • FIG. 8 shows module 7214 including a USB host controller 7230 and a network interface 7232. USB host controller 7230 is in communication with CPU 7200 and memory controller 7202 via a bus (not shown) and serves as host for peripheral controllers 7205(1)-7205(4). Network interface 7232 provides access to a network (e.g., Internet, home network, etc.) and may be any of a wide variety of wired or wireless interface components including an Ethernet card, a modem, a wireless access card, a Bluetooth® module, a cable modem, and the like.
  • In the implementation depicted in FIG. 8, console 7203 includes a controller support subassembly 7240 for supporting four controllers 7205(1)-7205(4). The controller support subassembly 7240 includes any hardware and software components needed to support wired and wireless operation with an external control device, such as for example, a media and game controller. A front panel I/O subassembly 7242 supports the multiple functionalities of power button 7213, the eject button 7215, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of console 7203. Subassemblies 7240 and 7242 are in communication with module 7214 via one or more cable assemblies 7244. In other implementations, console 7203 can include additional controller subassemblies. The illustrated implementation also shows an optical I/O interface 7235 that is configured to send and receive signals (e.g., from remote control 7290) that can be communicated to module 7214.
  • MUs 7241(1) and 7241(2) are illustrated as being connectable to MU ports “A” 7231(1) and “B” 7231(2) respectively. Additional MUs (e.g., MUs 7241(3)-7241(6)) are illustrated as being connectable to controllers 7205(1) and 7205(3), i.e., two MUs for each controller. Controllers 7205(2) and 7205(4) can also be configured to receive MUs (not shown). Each MU 7241 offers additional storage on which games, game parameters, and other data may be stored. Additional memory devices, such as portable USB devices, can be used in place of the MUs. In some implementations, the other data can include any of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into console 7203 or a controller, MU 7241 can be accessed by memory controller 7202. A system power supply module 7250 provides power to the components of gaming system 7201. A fan 7252 cools the circuitry within console 7203.
  • An application 7260 comprising machine instructions is stored on hard disk drive 7208. When console 7203 is powered on, various portions of application 7260 are loaded into RAM 7206, and/or caches 7210 and 7212, for execution on CPU 7200. Other applications may also be stored on hard disk drive 7208 for execution on CPU 7200.
  • Gaming and media system 7201 may be operated as a standalone system by simply connecting the system to a monitor, a television, a video projector, or other display device. In this standalone mode, gaming and media system 7201 enables one or more players to play games or enjoy digital media (e.g., by watching movies or listening to music). However, with the integration of broadband connectivity made available through network interface 7232, gaming and media system 7201 may further be operated as a participant in a larger network gaming community.
  • FIG. 9 is a block diagram of one embodiment of a mobile device 8300. Mobile devices may include laptop computers, pocket computers, mobile phones, personal digital assistants, and handheld media devices that have been integrated with wireless receiver/transmitter technology.
  • Mobile device 8300 includes one or more processors 8312 and memory 8310. Memory 8310 includes applications 8330 and non-volatile storage 8340. Memory 8310 can be any variety of memory storage media types, including non-volatile and volatile memory. A mobile device operating system handles the different operations of the mobile device 8300 and may contain user interfaces for operations, such as placing and receiving phone calls, text messaging, checking voicemail, and the like. The applications 8330 can be any assortment of programs, such as a camera application for photos and/or videos, an address book, a calendar application, a media player, an internet browser, games, an alarm application, and other applications. The non-volatile storage component 8340 in memory 8310 may contain data such as music, photos, contact data, scheduling data, and other files.
  • The one or more processors 8312 also communicate with RF transmitter/receiver 8306 which in turn is coupled to an antenna 8302, with infrared transmitter/receiver 8308, with global positioning service (GPS) receiver 8365, and with movement/orientation sensor 8314 which may include an accelerometer and/or magnetometer. RF transmitter/receiver 8306 may enable wireless communication via various wireless technology standards such as Bluetooth® or the IEEE 802.11 standards. Accelerometers have been incorporated into mobile devices to enable applications such as intelligent user interface applications that let users input commands through gestures, and orientation applications which can automatically change the display from portrait to landscape when the mobile device is rotated. An accelerometer can be provided, e.g., by a micro-electromechanical system (MEMS) which is a tiny mechanical device (of micrometer dimensions) built onto a semiconductor chip. Acceleration direction, as well as orientation, vibration, and shock can be sensed. The one or more processors 8312 further communicate with a ringer/vibrator 8316, a user interface keypad/screen 8318, a speaker 8320, a microphone 8322, a camera 8324, a light sensor 8326, and a temperature sensor 8328. The user interface keypad/screen may include a touch-sensitive screen display.
  • The one or more processors 8312 control transmission and reception of wireless signals. During a transmission mode, the one or more processors 8312 provide voice signals from microphone 8322, or other data signals, to the RF transmitter/receiver 8306. The transmitter/receiver 8306 transmits the signals through the antenna 8302. The ringer/vibrator 8316 is used to signal an incoming call, text message, calendar reminder, alarm clock reminder, or other notification to the user. During a receiving mode, the RF transmitter/receiver 8306 receives a voice signal or data signal from a remote station through the antenna 8302. A received voice signal is provided to the speaker 8320 while other received data signals are processed appropriately.
  • Additionally, a physical connector 8388 may be used to connect the mobile device 8300 to an external power source, such as an AC adapter or powered docking station, in order to recharge battery 8304. The physical connector 8388 may also be used as a data connection to an external computing device. The data connection allows for operations such as synchronizing mobile device data with the computing data on another device.
  • FIG. 10 is a block diagram of an embodiment of a computing system environment 2200. Computing system environment 2200 includes a general purpose computing device in the form of a computer 2210. Components of computer 2210 may include, but are not limited to, a processing unit 2220, a system memory 2230, and a system bus 2221 that couples various system components including the system memory 2230 to the processing unit 2220. The system bus 2221 may be any of several types of bus structures including a memory bus, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
  • Computer 2210 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 2210 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 2210. Combinations of any of the above should also be included within the scope of computer readable media.
  • The system memory 2230 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 2231 and random access memory (RAM) 2232. A basic input/output system 2233 (BIOS), containing the basic routines that help to transfer information between elements within computer 2210, such as during start-up, is typically stored in ROM 2231. RAM 2232 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 2220. By way of example, and not limitation, FIG. 10 illustrates operating system 2234, application programs 2235, other program modules 2236, and program data 2237.
  • The computer 2210 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 10 illustrates a hard disk drive 2241 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 2251 that reads from or writes to a removable, nonvolatile magnetic disk 2252, and an optical disk drive 2255 that reads from or writes to a removable, nonvolatile optical disk 2256 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 2241 is typically connected to the system bus 2221 through a non-removable memory interface such as interface 2240, and magnetic disk drive 2251 and optical disk drive 2255 are typically connected to the system bus 2221 by a removable memory interface, such as interface 2250.
  • The drives and their associated computer storage media discussed above and illustrated in FIG. 10, provide storage of computer readable instructions, data structures, program modules and other data for the computer 2210. In FIG. 10, for example, hard disk drive 2241 is illustrated as storing operating system 2244, application programs 2245, other program modules 2246, and program data 2247. Note that these components can either be the same as or different from operating system 2234, application programs 2235, other program modules 2236, and program data 2237. Operating system 2244, application programs 2245, other program modules 2246, and program data 2247 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into computer 2210 through input devices such as a keyboard 2262 and pointing device 2261, commonly referred to as a mouse, trackball, or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 2220 through a user input interface 2260 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 2291 or other type of display device is also connected to the system bus 2221 via an interface, such as a video interface 2290. In addition to the monitor, computers may also include other peripheral output devices such as speakers 2297 and printer 2296, which may be connected through an output peripheral interface 2295.
  • The computer 2210 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 2280. The remote computer 2280 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 2210, although only a memory storage device 2281 has been illustrated in FIG. 10. The logical connections depicted in FIG. 10 include a local area network (LAN) 2271 and a wide area network (WAN) 2273, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the computer 2210 is connected to the LAN 2271 through a network interface or adapter 2270. When used in a WAN networking environment, the computer 2210 typically includes a modem 2272 or other means for establishing communications over the WAN 2273, such as the Internet. The modem 2272, which may be internal or external, may be connected to the system bus 2221 via the user input interface 2260, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 2210, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 10 illustrates remote application programs 2285 as residing on memory device 2281. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • The disclosed technology is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the technology include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • The disclosed technology may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, software and program modules as described herein include routines, programs, objects, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Hardware or combinations of hardware and software may be substituted for software modules as described herein.
  • The disclosed technology may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
  • For purposes of this document, references in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “another embodiment” are used to describe different embodiments and do not necessarily refer to the same embodiment.
  • For purposes of this document, a connection can be a direct connection or an indirect connection (e.g., via another part).
  • For purposes of this document, the term “set” of objects refers to a “set” of one or more of the objects.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (20)

1. A method for providing an augmented reality environment on a mobile device, comprising:
determining whether a localization map is required, the step of determining whether a localization map is required is performed on the mobile device and includes acquiring a first location associated with the mobile device, the localization map is associated with a first environment;
acquiring the localization map, the localization map includes one or more image descriptors, the one or more image descriptors are associated with one or more real objects within the first environment;
storing the localization map on the mobile device;
determining a first pose associated with the mobile device, the step of determining a first pose is performed on the mobile device and includes receiving one or more images, the one or more images comprise a field of view associated with the first pose, the step of determining a first pose includes detecting at least one of the one or more image descriptors within the one or more images;
determining whether a rendering map is required, the step of determining whether a rendering map is required includes determining whether a virtual object is located within the field of view associated with the first pose, the rendering map is associated with at least a portion of the first environment, the rendering map has a higher resolution than the localization map;
acquiring the rendering map;
storing the rendering map on the mobile device;
rendering the virtual object, the step of rendering includes registering the virtual object in relation to the rendering map; and
displaying on the mobile device a virtual image associated with the virtual object, the virtual image corresponds with a view of the virtual object such that the virtual object is perceived to exist within the field of view associated with the first pose.
2. The method of claim 1, wherein:
the step of rendering is performed on the mobile device, the step of rendering includes receiving virtual data associated with the virtual object from a mapping server, the step of rendering includes generating the virtual image of the virtual object based on the virtual data, the step of rendering includes generating virtual sounds associated with the virtual object.
3. The method of claim 1, wherein:
the step of determining whether a localization map is required includes determining whether another map stored on the mobile device is valid and associated with the first environment;
the step of acquiring a first location includes acquiring the first location from a GPS receiver associated with the mobile device;
each of the one or more real objects are characterized as one of static objects or semi-static objects;
the localization map includes a sparse 3-D map of the first environment;
the rendering map includes a dense 3-D map of the first environment; and
the mobile device includes a head mounted display device.
4. The method of claim 1, wherein:
the step of determining a first pose includes determining a first position and a first orientation of the mobile device in relation to the localization map, the step of determining a first pose includes discarding each of the one or more image descriptors that are associated with semi-static objects.
5. The method of claim 1, wherein:
the step of acquiring the localization map includes transmitting a request for the localization map to a mapping server.
6. The method of claim 1, wherein:
the step of acquiring the localization map includes transmitting a request for the localization map to a second mobile device, the second mobile device is located within the first environment.
7. The method of claim 1, wherein:
the step of determining whether a virtual object is within the field of view associated with the first pose is performed on the mobile device.
8. The method of claim 1, wherein:
the step of acquiring the localization map includes transmitting the first location associated with the mobile device.
9. An electronic device for providing an augmented reality environment, comprising:
one or more processors, the one or more processors determine whether a localization map is required, the one or more processors request the localization map from a mapping server;
a network interface, the network interface receives the localization map, the localization map includes one or more image descriptors, the one or more image descriptors are associated with one or more real objects within a first environment, the one or more processors determine a first pose associated with a field of view of the electronic device, the network interface receives virtual data associated with the virtual object, the one or more processors determine whether a rendering map is required based on the location of the virtual object within the field of view, the network interface receives the rendering map, the rendering map has a higher resolution than the localization map;
a memory, the memory stores the rendering map, the one or more processors render the virtual object in relation to the rendering map; and
a display, the display displays a virtual image associated with the virtual object, the virtual image corresponds with a view of the virtual object such that the virtual object is perceived to exist within the field of view.
10. The electronic device of claim 9, wherein:
the electronic device is a mobile device; and
the network interface receives virtual data from a mapping server.
11. The electronic device of claim 9, wherein:
each of the one or more real objects are characterized as one of static objects or semi-static objects;
the localization map includes a sparse 3-D map of the first environment;
the rendering map includes a dense 3-D map of the first environment; and
the electronic device includes a head mounted display device.
12. The electronic device of claim 9, wherein:
the determination of the first pose includes determining a first position and a first orientation of the electronic device in relation to the localization map.
13. The electronic device of claim 9, wherein:
the network interface receives the localization map from a mapping server.
14. The electronic device of claim 9, wherein:
the network interface receives the localization map from a second electronic device, the second electronic device is a mobile device.
15. One or more storage devices containing processor readable code for programming one or more processors to perform a method comprising the steps of:
determining whether a localization map is required, the step of determining whether a localization map is required is performed on a mobile device and includes acquiring a first location associated with the mobile device, the localization map is associated with a first environment, the step of determining whether a localization map is required includes determining whether another map stored on the mobile device is valid and associated with the first environment;
acquiring the localization map, the localization map includes one or more image descriptors, the one or more image descriptors are associated with one or more real objects within the first environment;
storing the localization map on the mobile device;
determining a first pose associated with the mobile device, the step of determining a first pose is performed on the mobile device and includes receiving one or more images, the one or more images comprise a field of view associated with the first pose, the step of determining a first pose includes detecting at least one of the one or more image descriptors within the one or more images;
determining whether a rendering map is required, the step of determining whether a rendering map is required includes determining whether a virtual object is located within the field of view associated with the first pose, the step of determining whether a virtual object is within the field of view associated with the first pose is performed on the mobile device, the rendering map is associated with at least a portion of the first environment, the rendering map has a higher resolution than the localization map;
acquiring the rendering map;
storing the rendering map on the mobile device;
rendering the virtual object, the step of rendering includes registering the virtual object in relation to the rendering map; and
playing on the mobile device a virtual sound associated with the virtual object, the virtual sound is perceived to originate from a location associated with the virtual object.
16. The one or more storage devices of claim 15, wherein:
the step of rendering is performed on the mobile device, the step of rendering includes receiving virtual data associated with the virtual object from a mapping server.
17. The one or more storage devices of claim 16, wherein:
the step of acquiring a first location includes acquiring the first location from a GPS receiver associated with the mobile device;
the localization map includes a sparse 3-D map of the first environment;
the rendering map includes a dense 3-D map of the first environment; and
the mobile device includes a head mounted display device.
18. The one or more storage devices of claim 16, wherein:
the step of determining a first pose includes determining a first position and a first orientation of the mobile device in relation to the localization map, the step of determining a first pose includes discarding each of the one or more image descriptors that are characterized as non-static objects.
19. The one or more storage devices of claim 18, wherein:
the step of acquiring the localization map includes transmitting a request for the localization map to a mapping server.
20. The one or more storage devices of claim 18, wherein:
the step of acquiring the localization map includes transmitting a request for the localization map to a second mobile device, the second mobile device is located within the first environment.
US13/152,220 2011-06-02 2011-06-02 Distributed asynchronous localization and mapping for augmented reality Abandoned US20120306850A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/152,220 US20120306850A1 (en) 2011-06-02 2011-06-02 Distributed asynchronous localization and mapping for augmented reality
US13/688,093 US8933931B2 (en) 2011-06-02 2012-11-28 Distributed asynchronous localization and mapping for augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/152,220 US20120306850A1 (en) 2011-06-02 2011-06-02 Distributed asynchronous localization and mapping for augmented reality

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/688,093 Continuation US8933931B2 (en) 2011-06-02 2012-11-28 Distributed asynchronous localization and mapping for augmented reality

Publications (1)

Publication Number Publication Date
US20120306850A1 true US20120306850A1 (en) 2012-12-06

Family

ID=47261301

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/152,220 Abandoned US20120306850A1 (en) 2011-06-02 2011-06-02 Distributed asynchronous localization and mapping for augmented reality
US13/688,093 Active 2031-07-04 US8933931B2 (en) 2011-06-02 2012-11-28 Distributed asynchronous localization and mapping for augmented reality

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/688,093 Active 2031-07-04 US8933931B2 (en) 2011-06-02 2012-11-28 Distributed asynchronous localization and mapping for augmented reality

Country Status (1)

Country Link
US (2) US20120306850A1 (en)

Cited By (195)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120320216A1 (en) * 2011-06-14 2012-12-20 Disney Enterprises, Inc. Method and System for Object Recognition, Authentication, and Tracking with Infrared Distortion Caused by Objects for Augmented Reality
US20130004100A1 (en) * 2011-06-30 2013-01-03 Nokia Corporation Method, apparatus and computer program product for generating panorama images
US20130173089A1 (en) * 2011-01-05 2013-07-04 Orbotix, Inc. Remotely controlling a self-propelled device in a virtualized environment
US20130182858A1 (en) * 2012-01-12 2013-07-18 Qualcomm Incorporated Augmented reality with sound and geometric analysis
US20130215109A1 (en) * 2012-02-22 2013-08-22 Silka Miesnieks Designating Real World Locations for Virtual World Control
US20140010407A1 (en) * 2012-07-09 2014-01-09 Microsoft Corporation Image-based localization
US20140176591A1 (en) * 2012-12-26 2014-06-26 Georg Klein Low-latency fusing of color image data
WO2014116467A1 (en) * 2013-01-22 2014-07-31 Microsoft Corporation Mixed reality experience sharing
US20140214605A1 (en) * 2013-01-31 2014-07-31 Wal-Mart Stores, Inc. Method And System For Answering A Query From A Consumer In A Retail Store
WO2014124062A1 (en) * 2013-02-07 2014-08-14 Microsoft Corporation Aligning virtual camera with real camera
US20140292807A1 (en) * 2013-03-27 2014-10-02 Giuseppe Raffa Environment actuation by one or more augmented reality elements
US20140323148A1 (en) * 2013-04-30 2014-10-30 Qualcomm Incorporated Wide area localization from slam maps
WO2014204330A1 (en) * 2013-06-17 2014-12-24 3Divi Company Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
WO2014209709A1 (en) * 2013-06-24 2014-12-31 Microsoft Corporation Tracking head movement when wearing mobile device
US8933931B2 (en) 2011-06-02 2015-01-13 Microsoft Corporation Distributed asynchronous localization and mapping for augmented reality
US8990682B1 (en) * 2011-10-05 2015-03-24 Google Inc. Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
WO2015048749A1 (en) * 2013-09-30 2015-04-02 Interdigital Patent Holdings, Inc. Methods, apparatus, systems, devices, and computer program products for providing an augmented reality display and/or user interface
US20150138231A1 (en) * 2012-07-17 2015-05-21 Zte Corporation Method, device and system for realizing augmented reality information sharing
US9081177B2 (en) 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
US20150235441A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for rendering virtual content
US9122054B2 (en) 2014-01-24 2015-09-01 Osterhout Group, Inc. Stray light suppression for head worn computing
US9150263B2 (en) 2011-01-05 2015-10-06 Sphero, Inc. Self-propelled device implementing three-dimensional control
US20150285638A1 (en) * 2012-06-12 2015-10-08 Trx Systems, Inc. System and method for localizing a trackee at a location and mapping the location using signal-based features
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
EP2933780A1 (en) * 2014-04-14 2015-10-21 Baidu Online Network Technology (Beijing) Co., Ltd Reality augmenting method, client device and server
US20150302652A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20150302646A1 (en) * 2014-02-11 2015-10-22 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20150334503A1 (en) * 2014-05-13 2015-11-19 Lenovo (Singapore) Pte. Ltd. Audio system tuning
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9298002B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US20160117860A1 (en) * 2014-10-24 2016-04-28 Usens, Inc. System and method for immersive and interactive multimedia generation
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US20160163107A1 (en) * 2014-12-09 2016-06-09 Industrial Technology Research Institute Augmented Reality Method and System, and User Mobile Device Applicable Thereto
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
WO2016118371A1 (en) * 2015-01-20 2016-07-28 Microsoft Technology Licensing, Llc Mixed reality system
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US20160327953A1 (en) * 2015-05-05 2016-11-10 Volvo Car Corporation Method and arrangement for determining safe vehicle trajectories
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9495389B2 (en) 2013-03-15 2016-11-15 Qualcomm Incorporated Client-server based dynamic search
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9547406B1 (en) 2011-10-31 2017-01-17 Google Inc. Velocity-based triggering
US9545542B2 (en) 2011-03-25 2017-01-17 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US20170045941A1 (en) * 2011-08-12 2017-02-16 Sony Interactive Entertainment Inc. Wireless Head Mounted Display with Differential Rendering and Sound Localization
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9600768B1 (en) * 2013-04-16 2017-03-21 Google Inc. Using behavior of objects to infer changes in a driving environment
US9606992B2 (en) * 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
US9612403B2 (en) 2013-06-11 2017-04-04 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9626803B2 (en) 2014-12-12 2017-04-18 Qualcomm Incorporated Method and apparatus for image processing in augmented reality systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US20170153866A1 (en) * 2014-07-03 2017-06-01 Imagine Mobile Augmented Reality Ltd. Audiovisual Surround Augmented Reality (ASAR)
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US20170178375A1 (en) * 2015-03-24 2017-06-22 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
WO2017117675A1 (en) * 2016-01-08 2017-07-13 Sulon Technologies Inc. Head mounted device for augmented reality
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US20170221265A1 (en) * 2016-01-29 2017-08-03 Rovi Guides, Inc. Methods and systems for associating media content with physical world objects
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9773313B1 (en) * 2014-01-03 2017-09-26 Google Inc. Image registration with device data
US9785249B1 (en) 2016-12-06 2017-10-10 Vuelosophy Inc. Systems and methods for tracking motion and gesture of heads and eyes
US9811153B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9811731B2 (en) 2013-10-04 2017-11-07 Qualcomm Incorporated Dynamic extension of map data for object detection and tracking
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9826299B1 (en) 2016-08-22 2017-11-21 Osterhout Group, Inc. Speaker systems for head-worn computer systems
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9827487B2 (en) 2012-05-14 2017-11-28 Sphero, Inc. Interactive augmented reality using a self-propelled device
US9829882B2 (en) 2013-12-20 2017-11-28 Sphero, Inc. Self-propelled device with center of mass drive system
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US20170351918A1 (en) * 2015-02-13 2017-12-07 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
US9886032B2 (en) 2011-01-05 2018-02-06 Sphero, Inc. Self propelled device with magnetic coupling
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
CN107918949A (en) * 2017-12-11 2018-04-17 NetEase (Hangzhou) Network Co., Ltd. Rendering method, storage medium, processor and terminal for a virtual resource object
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
WO2018085089A1 (en) * 2016-11-01 2018-05-11 Google Llc Map summarization and localization
WO2018111658A1 (en) * 2016-12-12 2018-06-21 Microsoft Technology Licensing, Llc Interacting with an environment using a parent device and at least one companion device
GB2558279A (en) * 2016-12-23 2018-07-11 Sony Interactive Entertainment Inc Head mountable display system
US10022643B2 (en) 2011-01-05 2018-07-17 Sphero, Inc. Magnetically coupled accessory for a self-propelled device
US10056791B2 (en) 2012-07-13 2018-08-21 Sphero, Inc. Self-optimizing power transfer
US20180241985A1 (en) * 2014-05-19 2018-08-23 Occipital, Inc. Methods for automatic registration of 3d image data
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
WO2018184675A1 (en) * 2017-04-05 2018-10-11 Telefonaktiebolaget Lm Ericsson (Publ) Illuminating an environment for localisation
US10120437B2 (en) 2016-01-29 2018-11-06 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US10139966B2 (en) 2015-07-22 2018-11-27 Osterhout Group, Inc. External user interface for head worn computing
US10146302B2 (en) 2016-09-30 2018-12-04 Sony Interactive Entertainment Inc. Head mounted display with multiple antennas
WO2018222115A1 (en) * 2017-05-30 2018-12-06 Crunchfish Ab Improved activation of a virtual object
WO2018231481A1 (en) 2017-06-16 2018-12-20 Microsoft Technology Licensing, Llc Object holographic augmentation
US10168701B2 (en) 2011-01-05 2019-01-01 Sphero, Inc. Multi-purposed self-propelled device
US10192310B2 (en) 2012-05-14 2019-01-29 Sphero, Inc. Operating a computing device by detecting rounded objects in an image
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
USD840395S1 (en) 2016-10-17 2019-02-12 Osterhout Group, Inc. Head-worn computer
US20190096131A1 (en) * 2017-09-27 2019-03-28 Fisher-Rosemount Systems, Inc. 3D Mapping of a Process Control Environment
US10256859B2 (en) 2014-10-24 2019-04-09 Usens, Inc. System and method for immersive and interactive multimedia generation
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
CN109727288A (en) * 2017-12-28 2019-05-07 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for monocular simultaneous localization and mapping
WO2019114986A1 (en) * 2017-12-15 2019-06-20 Telefonaktiebolaget Lm Ericsson (Publ) Determining a content pose of a piece of virtual content
CN110036359A (en) * 2017-06-23 2019-07-19 杰创科虚拟现实有限公司 Interactive augmented reality for first-person role playing
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US20190293937A1 (en) * 2018-03-20 2019-09-26 Boe Technology Group Co., Ltd. Augmented reality display device and method, and augmented reality glasses
US20190311544A1 (en) * 2018-04-10 2019-10-10 Arm Ip Limited Image processing for augmented reality
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US10444021B2 (en) 2016-08-04 2019-10-15 Reification Inc. Methods for simultaneous localization and mapping (SLAM) and related apparatus and systems
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10466492B2 (en) 2014-04-25 2019-11-05 Mentor Acquisition One, Llc Ear horn assembly for headworn computer
US10499997B2 (en) 2017-01-03 2019-12-10 Mako Surgical Corp. Systems and methods for surgical navigation
WO2020023582A1 (en) * 2018-07-24 2020-01-30 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
WO2020072905A1 (en) * 2018-10-04 2020-04-09 Google Llc Depth from motion for augmented reality for handheld user devices
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10690936B2 (en) 2016-08-29 2020-06-23 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US20200234395A1 (en) * 2019-01-23 2020-07-23 Qualcomm Incorporated Methods and apparatus for standardized apis for split rendering
US10726264B2 (en) 2018-06-25 2020-07-28 Microsoft Technology Licensing, Llc Object-based localization
US10748302B1 (en) * 2019-05-02 2020-08-18 Apple Inc. Multiple user simultaneous localization and mapping (SLAM)
US10810430B2 (en) 2018-12-27 2020-10-20 At&T Intellectual Property I, L.P. Augmented reality with markerless, context-aware object tracking
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US10850116B2 (en) 2016-12-30 2020-12-01 Mentor Acquisition One, Llc Head-worn therapy device
CN112130660A (en) * 2020-08-14 2020-12-25 Qingdao Pico Technology Co., Ltd. Interaction method and system based on an all-in-one virtual reality headset
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
WO2020142048A3 (en) * 2018-12-31 2021-01-21 Havelsan Hava Elektronik Sanayi ve Ticaret Anonim Şirketi Augmented virtual reality tactical state display system (ASGER-TDS)
US10901499B2 (en) * 2017-06-15 2021-01-26 Tencent Technology (Shenzhen) Company Limited System and method of instantly previewing immersive content
US10939977B2 (en) 2018-11-26 2021-03-09 Augmedics Ltd. Positioning marker
US10962780B2 (en) 2015-10-26 2021-03-30 Microsoft Technology Licensing, Llc Remote rendering for virtual images
US10962371B2 (en) * 2019-04-02 2021-03-30 GM Global Technology Operations LLC Method and apparatus of parallel tracking and localization via multi-mode slam fusion process
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US10990829B2 (en) * 2017-04-28 2021-04-27 Micro Focus Llc Stitching maps generated using simultaneous localization and mapping
CN112785700A (en) * 2019-11-08 2021-05-11 Huawei Technologies Co., Ltd. Virtual object display method, global map updating method and device
CN112785715A (en) * 2019-11-08 2021-05-11 Huawei Technologies Co., Ltd. Virtual object display method and electronic device
US11010921B2 (en) 2019-05-16 2021-05-18 Qualcomm Incorporated Distributed pose estimation
CN113129450A (en) * 2021-04-21 2021-07-16 Beijing Baidu Netcom Science and Technology Co., Ltd. Virtual fitting method, apparatus, electronic device and medium
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
CN113452896A (en) * 2020-03-26 2021-09-28 Huawei Technologies Co., Ltd. Image display method and electronic device
US11138692B2 (en) * 2017-09-28 2021-10-05 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
CN113614794A (en) * 2019-04-26 2021-11-05 Google LLC Managing content in augmented reality
US11196842B2 (en) 2019-09-26 2021-12-07 At&T Intellectual Property I, L.P. Collaborative and edge-enhanced augmented reality systems
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11244509B2 (en) 2018-08-20 2022-02-08 Fisher-Rosemount Systems, Inc. Drift correction for industrial augmented reality applications
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11321929B2 (en) * 2018-05-18 2022-05-03 Purdue Research Foundation System and method for spatially registering multiple augmented reality devices
US11331575B2 (en) 2017-03-30 2022-05-17 Electronic Arts Inc. Virtual environment mapping system
US11331568B2 (en) * 2018-03-16 2022-05-17 Sony Interactive Entertainment LLC Asynchronous virtual reality interactions
US11335058B2 (en) 2020-10-13 2022-05-17 Electronic Arts Inc. Spatial partitioning for graphics rendering
US20220151704A1 (en) * 2019-03-04 2022-05-19 Smith & Nephew, Inc. Co-registration for augmented reality and surgical navigation
US11389252B2 (en) 2020-06-15 2022-07-19 Augmedics Ltd. Rotating marker for image guided surgery
US11410372B2 (en) 2019-03-27 2022-08-09 Electronic Arts Inc. Artificial intelligence based virtual object aging
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11504608B2 (en) * 2017-01-18 2022-11-22 Xvisio Technology Corp. 6DoF inside-out tracking game controller
WO2022252347A1 (en) * 2021-06-04 2022-12-08 Huawei Technologies Co., Ltd. 3D map retrieval method and apparatus
US11532136B2 (en) * 2013-03-01 2022-12-20 Apple Inc. Registration between actual mobile device position and environmental model
EP4078540A4 (en) * 2019-12-20 2023-01-18 Niantic, Inc. Merging local maps from mapping devices
US11620800B2 (en) * 2019-03-27 2023-04-04 Electronic Arts Inc. Three dimensional reconstruction of objects based on geolocation and image data
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US20230237692A1 (en) * 2022-01-26 2023-07-27 Meta Platforms Technologies, Llc Methods and systems to facilitate passive relocalization using three-dimensional maps
US11727648B2 (en) * 2017-09-28 2023-08-15 Apple Inc. Method and device for synchronizing augmented reality coordinate systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
WO2023163760A1 (en) * 2022-02-25 2023-08-31 Faro Technologies, Inc. Tracking with reference to a world coordinate system
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US11801115B2 (en) 2019-12-22 2023-10-31 Augmedics Ltd. Mirroring in image guided surgery
US11816887B2 (en) 2020-08-04 2023-11-14 Fisher-Rosemount Systems, Inc. Quick activation techniques for industrial augmented reality applications
US11851177B2 (en) 2014-05-06 2023-12-26 Mentor Acquisition One, Llc Unmanned aerial vehicle launch system
US11887253B2 (en) 2019-07-24 2024-01-30 Electronic Arts Inc. Terrain generation and population system
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US11960089B2 (en) 2022-06-27 2024-04-16 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9037468B2 (en) * 2008-10-27 2015-05-19 Sony Computer Entertainment Inc. Sound localization for user in motion
US9087058B2 (en) * 2011-08-03 2015-07-21 Google Inc. Method and apparatus for enabling a searchable history of real-world user experiences
US9137308B1 (en) 2012-01-09 2015-09-15 Google Inc. Method and apparatus for enabling event-based media data capture
US9406090B1 (en) 2012-01-09 2016-08-02 Google Inc. Content sharing system
JP2014191718A (en) 2013-03-28 2014-10-06 Sony Corp Display control device, display control method, and recording medium
US9874749B2 (en) * 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US9524434B2 (en) * 2013-10-04 2016-12-20 Qualcomm Incorporated Object tracking based on dynamically built environment map data
US9407809B2 (en) 2013-11-27 2016-08-02 Qualcomm Incorporated Strategies for triggering depth sensors and transmitting RGBD images in a cloud-based object recognition system
CN107219628B (en) 2013-11-27 2020-05-01 Magic Leap, Inc. Virtual and augmented reality systems and methods
CN106062580B (en) 2013-12-27 2019-11-05 Massachusetts Institute of Technology Localization using non-simultaneous transmission and multipath transmission
US20150193971A1 (en) * 2014-01-03 2015-07-09 Motorola Mobility Llc Methods and Systems for Generating a Map including Sparse and Dense Mapping Information
US20150223005A1 (en) * 2014-01-31 2015-08-06 Raytheon Company 3-dimensional audio projection
WO2015134958A1 (en) * 2014-03-07 2015-09-11 Magic Leap, Inc. Virtual and augmented reality systems and methods
US9885774B2 (en) 2014-04-18 2018-02-06 Massachusetts Institute Of Technology Indoor localization of a multi-antenna receiver
US9579799B2 (en) * 2014-04-30 2017-02-28 Coleman P. Parker Robotic control system using virtual reality input
US9916002B2 (en) 2014-11-16 2018-03-13 Eonite Perception Inc. Social applications for augmented reality technologies
US10055892B2 (en) 2014-11-16 2018-08-21 Eonite Perception Inc. Active region determination for head mounted displays
WO2016077798A1 (en) 2014-11-16 2016-05-19 Eonite Perception Inc. Systems and methods for augmented reality preparation, processing, and application
US10915161B2 (en) * 2014-12-11 2021-02-09 Intel Corporation Facilitating dynamic non-visual markers for augmented reality on computing devices
US10180734B2 (en) 2015-03-05 2019-01-15 Magic Leap, Inc. Systems and methods for augmented reality
US10838207B2 (en) 2015-03-05 2020-11-17 Magic Leap, Inc. Systems and methods for augmented reality
EP3265866B1 (en) 2015-03-05 2022-12-28 Magic Leap, Inc. Systems and methods for augmented reality
CN107533376A (en) * 2015-05-14 2018-01-02 Magic Leap, Inc. Privacy-sensitive consumer cameras coupled to augmented reality systems
JP2018536244A (en) 2015-12-04 2018-12-06 Magic Leap, Inc. Relocation system and method
KR102482595B1 (en) * 2015-12-17 2022-12-30 Samsung Electronics Co., Ltd. Method for providing map information and electronic device supporting the same
US10217231B2 (en) 2016-05-31 2019-02-26 Microsoft Technology Licensing, Llc Systems and methods for utilizing anchor graphs in mixed reality environments
EP3494549A4 (en) 2016-08-02 2019-08-14 Magic Leap, Inc. Fixed-distance virtual and augmented reality systems and methods
US11017712B2 (en) 2016-08-12 2021-05-25 Intel Corporation Optimized display image rendering
US10074205B2 (en) 2016-08-30 2018-09-11 Intel Corporation Machine creation of program with frame analysis method and apparatus
US9928660B1 (en) 2016-09-12 2018-03-27 Intel Corporation Hybrid rendering for a wearable display attached to a tethered computer
US10134192B2 (en) 2016-10-17 2018-11-20 Microsoft Technology Licensing, Llc Generating and displaying a computer generated image on a future pose of a real world object
US9940729B1 (en) 2016-11-18 2018-04-10 Here Global B.V. Detection of invariant features for localization
US10812936B2 (en) 2017-01-23 2020-10-20 Magic Leap, Inc. Localization determination for mixed reality systems
RU2667602C2 (en) * 2017-03-15 2018-09-21 Limited Liability Company "AVIAREAL" Method of forming an augmented reality image that provides correlation of the visual characteristics of real and virtual objects
US10861237B2 (en) 2017-03-17 2020-12-08 Magic Leap, Inc. Mixed reality system with multi-source virtual content compositing and method of generating virtual content using same
CN110431599B (en) 2017-03-17 2022-04-12 Magic Leap, Inc. Mixed reality system with virtual content warping and method for generating virtual content using the same
CN110402425B (en) 2017-03-17 2024-01-09 Magic Leap, Inc. Mixed reality system with color virtual content distortion and method for generating virtual content using the same
US10531065B2 (en) 2017-03-30 2020-01-07 Microsoft Technology Licensing, Llc Coarse relocalization using signal fingerprints
US10600252B2 (en) 2017-03-30 2020-03-24 Microsoft Technology Licensing, Llc Coarse relocalization using signal fingerprints
US10251011B2 (en) * 2017-04-24 2019-04-02 Intel Corporation Augmented reality virtual reality ray tracing sensory enhancement system, apparatus and method
US11682045B2 (en) * 2017-06-28 2023-06-20 Samsung Electronics Co., Ltd. Augmented reality advertisements on objects
US10885714B2 (en) 2017-07-07 2021-01-05 Niantic, Inc. Cloud enabled augmented reality
US10685456B2 (en) 2017-10-12 2020-06-16 Microsoft Technology Licensing, Llc Peer to peer remote localization for devices
US10650553B2 (en) * 2017-12-27 2020-05-12 Intel IP Corporation Method of image processing and image processing device
WO2019172874A1 (en) 2018-03-05 2019-09-12 Reavire, Inc. Maintaining localization and orientation of an electronic headset after loss of slam tracking
CN117711284A (en) 2018-07-23 2024-03-15 Magic Leap, Inc. In-field subcode timing in a field sequential display
JP7304934B2 (en) 2018-07-23 2023-07-07 Magic Leap, Inc. Mixed reality system with virtual content warping and method of using it to generate virtual content
CN112233647A (en) * 2019-06-26 2021-01-15 Sony Corporation Information processing apparatus and method, and computer-readable storage medium
CN111242704B (en) * 2020-04-26 2020-12-08 北京外号信息技术有限公司 Method and electronic device for superimposing live images of people onto a real scene

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7831094B2 (en) 2004-04-27 2010-11-09 Honda Motor Co., Ltd. Simultaneous localization and mapping using multiple view feature descriptors
US20100045701A1 (en) 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
US20100208033A1 (en) 2009-02-13 2010-08-19 Microsoft Corporation Personal Media Landscapes in Mixed Reality
US20100257252A1 (en) 2009-04-01 2010-10-07 Microsoft Corporation Augmented Reality Cloud Computing
US20120306850A1 (en) 2011-06-02 2012-12-06 Microsoft Corporation Distributed asynchronous localization and mapping for augmented reality

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5786849A (en) * 1997-02-07 1998-07-28 Lynde; C. Macgill Marine navigation I
US20080310757A1 (en) * 2007-06-15 2008-12-18 George Wolberg System and related methods for automatically aligning 2D images of a scene to a 3D model of the scene
US20100238161A1 (en) * 2009-03-19 2010-09-23 Kenneth Varga Computer-aided system for 360° heads up display of safety/mission critical data

Cited By (537)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11506912B2 (en) 2008-01-02 2022-11-22 Mentor Acquisition One, Llc Temple and ear horn assembly for headworn computer
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9395725B2 (en) 2011-01-05 2016-07-19 Sphero, Inc. Self-propelled device implementing three-dimensional control
US11460837B2 (en) 2011-01-05 2022-10-04 Sphero, Inc. Self-propelled device with actively engaged drive system
US9766620B2 (en) 2011-01-05 2017-09-19 Sphero, Inc. Self-propelled device with actively engaged drive system
US10022643B2 (en) 2011-01-05 2018-07-17 Sphero, Inc. Magnetically coupled accessory for a self-propelled device
US9218316B2 (en) * 2011-01-05 2015-12-22 Sphero, Inc. Remotely controlling a self-propelled device in a virtualized environment
US9836046B2 (en) 2011-01-05 2017-12-05 Adam Wilson System and method for controlling a self-propelled device using a dynamically configurable instruction library
US9841758B2 (en) 2011-01-05 2017-12-12 Sphero, Inc. Orienting a user interface of a controller for operating a self-propelled device
US10281915B2 (en) 2011-01-05 2019-05-07 Sphero, Inc. Multi-purposed self-propelled device
US10248118B2 (en) 2011-01-05 2019-04-02 Sphero, Inc. Remotely controlling a self-propelled device in a virtualized environment
US9389612B2 (en) 2011-01-05 2016-07-12 Sphero, Inc. Self-propelled device implementing three-dimensional control
US10168701B2 (en) 2011-01-05 2019-01-01 Sphero, Inc. Multi-purposed self-propelled device
US9886032B2 (en) 2011-01-05 2018-02-06 Sphero, Inc. Self propelled device with magnetic coupling
US9952590B2 (en) 2011-01-05 2018-04-24 Sphero, Inc. Self-propelled device implementing three-dimensional control
US20130173089A1 (en) * 2011-01-05 2013-07-04 Orbotix, Inc. Remotely controlling a self-propelled device in a virtualized environment
US9290220B2 (en) 2011-01-05 2016-03-22 Sphero, Inc. Orienting a user interface of a controller for operating a self-propelled device
US10678235B2 (en) 2011-01-05 2020-06-09 Sphero, Inc. Self-propelled device with actively engaged drive system
US11630457B2 (en) 2011-01-05 2023-04-18 Sphero, Inc. Multi-purposed self-propelled device
US10012985B2 (en) 2011-01-05 2018-07-03 Sphero, Inc. Self-propelled device for interpreting input from a controller device
US10423155B2 (en) 2011-01-05 2019-09-24 Sphero, Inc. Self propelled device with magnetic coupling
US9150263B2 (en) 2011-01-05 2015-10-06 Sphero, Inc. Self-propelled device implementing three-dimensional control
US11605977B2 (en) 2011-03-25 2023-03-14 May Patents Ltd. Device for displaying in response to a sensed motion
US11305160B2 (en) 2011-03-25 2022-04-19 May Patents Ltd. Device for displaying in response to a sensed motion
US9592428B2 (en) 2011-03-25 2017-03-14 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US11631996B2 (en) 2011-03-25 2023-04-18 May Patents Ltd. Device for displaying in response to a sensed motion
US9630062B2 (en) 2011-03-25 2017-04-25 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US11916401B2 (en) 2011-03-25 2024-02-27 May Patents Ltd. Device for displaying in response to a sensed motion
US9555292B2 (en) 2011-03-25 2017-01-31 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US10926140B2 (en) 2011-03-25 2021-02-23 May Patents Ltd. Device for displaying in response to a sensed motion
US10953290B2 (en) 2011-03-25 2021-03-23 May Patents Ltd. Device for displaying in response to a sensed motion
US11631994B2 (en) 2011-03-25 2023-04-18 May Patents Ltd. Device for displaying in response to a sensed motion
US9545542B2 (en) 2011-03-25 2017-01-17 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US11192002B2 (en) 2011-03-25 2021-12-07 May Patents Ltd. Device for displaying in response to a sensed motion
US9878228B2 (en) 2011-03-25 2018-01-30 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US11689055B2 (en) 2011-03-25 2023-06-27 May Patents Ltd. System and method for a motion sensing device
US9878214B2 (en) 2011-03-25 2018-01-30 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US11173353B2 (en) 2011-03-25 2021-11-16 May Patents Ltd. Device for displaying in response to a sensed motion
US9868034B2 (en) 2011-03-25 2018-01-16 May Patents Ltd. System and method for a motion sensing device which provides a visual or audible indication
US9757624B2 (en) 2011-03-25 2017-09-12 May Patents Ltd. Motion sensing device which provides a visual indication with a wireless signal
US11298593B2 (en) 2011-03-25 2022-04-12 May Patents Ltd. Device for displaying in response to a sensed motion
US9808678B2 (en) 2011-03-25 2017-11-07 May Patents Ltd. Device for displaying in response to a sensed motion
US11141629B2 (en) 2011-03-25 2021-10-12 May Patents Ltd. Device for displaying in response to a sensed motion
US10525312B2 (en) 2011-03-25 2020-01-07 May Patents Ltd. Device for displaying in response to a sensed motion
US11260273B2 (en) 2011-03-25 2022-03-01 May Patents Ltd. Device for displaying in response to a sensed motion
US9782637B2 (en) 2011-03-25 2017-10-10 May Patents Ltd. Motion sensing device which provides a signal in response to the sensed motion
US11949241B2 (en) 2011-03-25 2024-04-02 May Patents Ltd. Device for displaying in response to a sensed motion
US9764201B2 (en) 2011-03-25 2017-09-19 May Patents Ltd. Motion sensing device with an accelerometer and a digital display
US8933931B2 (en) 2011-06-02 2015-01-13 Microsoft Corporation Distributed asynchronous localization and mapping for augmented reality
US20120320216A1 (en) * 2011-06-14 2012-12-20 Disney Enterprises, Inc. Method and System for Object Recognition, Authentication, and Tracking with Infrared Distortion Caused by Objects for Augmented Reality
US20130004100A1 (en) * 2011-06-30 2013-01-03 Nokia Corporation Method, apparatus and computer program product for generating panorama images
US9342866B2 (en) * 2011-06-30 2016-05-17 Nokia Technologies Oy Method, apparatus and computer program product for generating panorama images
US11269408B2 (en) 2011-08-12 2022-03-08 Sony Interactive Entertainment Inc. Wireless head mounted display with differential rendering
US10585472B2 (en) * 2011-08-12 2020-03-10 Sony Interactive Entertainment Inc. Wireless head mounted display with differential rendering and sound localization
US20170045941A1 (en) * 2011-08-12 2017-02-16 Sony Interactive Entertainment Inc. Wireless Head Mounted Display with Differential Rendering and Sound Localization
US9606992B2 (en) * 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
US10379346B2 (en) 2011-10-05 2019-08-13 Google Llc Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
US8990682B1 (en) * 2011-10-05 2015-03-24 Google Inc. Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
US9784971B2 (en) 2011-10-05 2017-10-10 Google Inc. Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
US9341849B2 (en) 2011-10-07 2016-05-17 Google Inc. Wearable computer with nearby object response
US9081177B2 (en) 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
US9552676B2 (en) 2011-10-07 2017-01-24 Google Inc. Wearable computer with nearby object response
US9547406B1 (en) 2011-10-31 2017-01-17 Google Inc. Velocity-based triggering
US20130182858A1 (en) * 2012-01-12 2013-07-18 Qualcomm Incorporated Augmented reality with sound and geometric analysis
US9563265B2 (en) * 2012-01-12 2017-02-07 Qualcomm Incorporated Augmented reality with sound and geometric analysis
US20130215109A1 (en) * 2012-02-22 2013-08-22 Silka Miesnieks Designating Real World Locations for Virtual World Control
US9827487B2 (en) 2012-05-14 2017-11-28 Sphero, Inc. Interactive augmented reality using a self-propelled device
US10192310B2 (en) 2012-05-14 2019-01-29 Sphero, Inc. Operating a computing device by detecting rounded objects in an image
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US20150285638A1 (en) * 2012-06-12 2015-10-08 Trx Systems, Inc. System and method for localizing a trackee at a location and mapping the location using signal-based features
US9664521B2 (en) * 2012-06-12 2017-05-30 Trx Systems, Inc. System and method for localizing a trackee at a location and mapping the location using signal-based features
US20140010407A1 (en) * 2012-07-09 2014-01-09 Microsoft Corporation Image-based localization
US8798357B2 (en) * 2012-07-09 2014-08-05 Microsoft Corporation Image-based localization
US10056791B2 (en) 2012-07-13 2018-08-21 Sphero, Inc. Self-optimizing power transfer
US20150138231A1 (en) * 2012-07-17 2015-05-21 Zte Corporation Method, device and system for realizing augmented reality information sharing
US20140176591A1 (en) * 2012-12-26 2014-06-26 Georg Klein Low-latency fusing of color image data
WO2014116467A1 (en) * 2013-01-22 2014-07-31 Microsoft Corporation Mixed reality experience sharing
CN105050670A (en) * 2013-01-22 2015-11-11 微软技术许可有限责任公司 Mixed reality experience sharing
US9342929B2 (en) 2013-01-22 2016-05-17 Microsoft Technology Licensing, Llc Mixed reality experience sharing
US9092818B2 (en) * 2013-01-31 2015-07-28 Wal-Mart Stores, Inc. Method and system for answering a query from a consumer in a retail store
US20140214605A1 (en) * 2013-01-31 2014-07-31 Wal-Mart Stores, Inc. Method And System For Answering A Query From A Consumer In A Retail Store
WO2014124062A1 (en) * 2013-02-07 2014-08-14 Microsoft Corporation Aligning virtual camera with real camera
US11532136B2 (en) * 2013-03-01 2022-12-20 Apple Inc. Registration between actual mobile device position and environmental model
US9495389B2 (en) 2013-03-15 2016-11-15 Qualcomm Incorporated Client-server based dynamic search
US9489772B2 (en) * 2013-03-27 2016-11-08 Intel Corporation Environment actuation by one or more augmented reality elements
US20140292807A1 (en) * 2013-03-27 2014-10-02 Giuseppe Raffa Environment actuation by one or more augmented reality elements
US9600768B1 (en) * 2013-04-16 2017-03-21 Google Inc. Using behavior of objects to infer changes in a driving environment
CN105143821A (en) * 2013-04-30 2015-12-09 高通股份有限公司 Wide area localization from SLAM maps
US20140323148A1 (en) * 2013-04-30 2014-10-30 Qualcomm Incorporated Wide area localization from slam maps
US9612403B2 (en) 2013-06-11 2017-04-04 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
WO2014204330A1 (en) * 2013-06-17 2014-12-24 3Divi Company Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
KR20160022921A (en) * 2013-06-24 2016-03-02 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Tracking head movement when wearing mobile device
US9256987B2 (en) 2013-06-24 2016-02-09 Microsoft Technology Licensing, Llc Tracking head movement when wearing mobile device
KR102213725B1 (en) 2013-06-24 2021-02-05 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Tracking head movement when wearing mobile device
WO2014209709A1 (en) * 2013-06-24 2014-12-31 Microsoft Corporation Tracking head movement when wearing mobile device
US10288419B2 (en) 2013-07-12 2019-05-14 Magic Leap, Inc. Method and system for generating a virtual user interface related to a totem
US11060858B2 (en) 2013-07-12 2021-07-13 Magic Leap, Inc. Method and system for generating a virtual user interface related to a totem
US10228242B2 (en) 2013-07-12 2019-03-12 Magic Leap, Inc. Method and system for determining user input based on gesture
US10866093B2 (en) 2013-07-12 2020-12-15 Magic Leap, Inc. Method and system for retrieving data in response to user input
US10408613B2 (en) * 2013-07-12 2019-09-10 Magic Leap, Inc. Method and system for rendering virtual content
US20150235088A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world
US10591286B2 (en) 2013-07-12 2020-03-17 Magic Leap, Inc. Method and system for generating virtual rooms
US20150235447A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for generating map data from an image
US20150235441A1 (en) * 2013-07-12 2015-08-20 Magic Leap, Inc. Method and system for rendering virtual content
US10571263B2 (en) 2013-07-12 2020-02-25 Magic Leap, Inc. User and object interaction with an augmented reality scenario
US10352693B2 (en) 2013-07-12 2019-07-16 Magic Leap, Inc. Method and system for obtaining texture data of a space
US10495453B2 (en) 2013-07-12 2019-12-03 Magic Leap, Inc. Augmented reality system totems and methods of using same
US11221213B2 (en) 2013-07-12 2022-01-11 Magic Leap, Inc. Method and system for generating a retail experience using an augmented reality system
US9857170B2 (en) 2013-07-12 2018-01-02 Magic Leap, Inc. Planar waveguide apparatus having a plurality of diffractive optical elements
US9651368B2 (en) 2013-07-12 2017-05-16 Magic Leap, Inc. Planar waveguide apparatus configured to return light therethrough
US10641603B2 (en) 2013-07-12 2020-05-05 Magic Leap, Inc. Method and system for updating a virtual world
US10767986B2 (en) 2013-07-12 2020-09-08 Magic Leap, Inc. Method and system for interacting with user interfaces
US9952042B2 (en) 2013-07-12 2018-04-24 Magic Leap, Inc. Method and system for identifying a user location
US10295338B2 (en) * 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
US11656677B2 (en) 2013-07-12 2023-05-23 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US10473459B2 (en) 2013-07-12 2019-11-12 Magic Leap, Inc. Method and system for determining user input based on totem
US11029147B2 (en) 2013-07-12 2021-06-08 Magic Leap, Inc. Method and system for facilitating surgery using an augmented reality system
US10533850B2 (en) * 2013-07-12 2020-01-14 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world
US9541383B2 (en) 2013-07-12 2017-01-10 Magic Leap, Inc. Optical system having a return planar waveguide
CN105814626A (en) * 2013-09-30 2016-07-27 Pcms控股公司 Methods, apparatus, systems, devices, and computer program products for providing an augmented reality display and/or user interface
KR20160068827A (en) * 2013-09-30 2016-06-15 피씨엠에스 홀딩스, 인크. Methods, apparatus, systems, devices, and computer program products for providing an augmented reality display and/or user interface
WO2015048749A1 (en) * 2013-09-30 2015-04-02 Interdigital Patent Holdings, Inc. Methods, apparatus, systems, devices, and computer program products for providing an augmented reality display and/or user interface
KR101873127B1 (en) 2013-09-30 2018-06-29 피씨엠에스 홀딩스, 인크. Methods, apparatus, systems, devices, and computer program products for providing an augmented reality display and/or user interface
US9811731B2 (en) 2013-10-04 2017-11-07 Qualcomm Incorporated Dynamic extension of map data for object detection and tracking
US10620622B2 (en) 2013-12-20 2020-04-14 Sphero, Inc. Self-propelled device with center of mass drive system
US9829882B2 (en) 2013-12-20 2017-11-28 Sphero, Inc. Self-propelled device with center of mass drive system
US11454963B2 (en) 2013-12-20 2022-09-27 Sphero, Inc. Self-propelled device with center of mass drive system
US10282856B2 (en) 2014-01-03 2019-05-07 Google Llc Image registration with device data
US9773313B1 (en) * 2014-01-03 2017-09-26 Google Inc. Image registration with device data
US11507208B2 (en) 2014-01-17 2022-11-22 Mentor Acquisition One, Llc External user interface for head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US11231817B2 (en) 2014-01-17 2022-01-25 Mentor Acquisition One, Llc External user interface for head worn computing
US11782529B2 (en) 2014-01-17 2023-10-10 Mentor Acquisition One, Llc External user interface for head worn computing
US11169623B2 (en) 2014-01-17 2021-11-09 Mentor Acquisition One, Llc External user interface for head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US10222618B2 (en) 2014-01-21 2019-03-05 Osterhout Group, Inc. Compact optics with reduced chromatic aberrations
US11126003B2 (en) 2014-01-21 2021-09-21 Mentor Acquisition One, Llc See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US10866420B2 (en) 2014-01-21 2020-12-15 Mentor Acquisition One, Llc See-through computer display systems
US9746676B2 (en) 2014-01-21 2017-08-29 Osterhout Group, Inc. See-through computer display systems
US10890760B2 (en) 2014-01-21 2021-01-12 Mentor Acquisition One, Llc See-through computer display systems
US10705339B2 (en) 2014-01-21 2020-07-07 Mentor Acquisition One, Llc Suppression of stray light in head worn computing
US9772492B2 (en) 2014-01-21 2017-09-26 Osterhout Group, Inc. Eye imaging in head worn computing
US9740012B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US10698223B2 (en) 2014-01-21 2020-06-30 Mentor Acquisition One, Llc See-through computer display systems
US11002961B2 (en) 2014-01-21 2021-05-11 Mentor Acquisition One, Llc See-through computer display systems
US9811153B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9720235B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US10139632B2 (en) 2014-01-21 2018-11-27 Osterhout Group, Inc. See-through computer display systems
US9538915B2 (en) 2014-01-21 2017-01-10 Osterhout Group, Inc. Eye imaging in head worn computing
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9829703B2 (en) 2014-01-21 2017-11-28 Osterhout Group, Inc. Eye imaging in head worn computing
US10579140B2 (en) 2014-01-21 2020-03-03 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11054902B2 (en) 2014-01-21 2021-07-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11947126B2 (en) 2014-01-21 2024-04-02 Mentor Acquisition One, Llc See-through computer display systems
US9720227B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US10073266B2 (en) 2014-01-21 2018-09-11 Osterhout Group, Inc. See-through computer display systems
US10191284B2 (en) 2014-01-21 2019-01-29 Osterhout Group, Inc. See-through computer display systems
US9529192B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US11619820B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US11622426B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US11099380B2 (en) 2014-01-21 2021-08-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11796805B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11719934B2 (en) 2014-01-21 2023-08-08 Mentor Acquisition One, Llc Suppression of stray light in head worn computing
US11796799B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc See-through computer display systems
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US9684171B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. See-through computer display systems
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9684165B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. Eye imaging in head worn computing
US10481393B2 (en) 2014-01-21 2019-11-19 Mentor Acquisition One, Llc See-through computer display systems
US9885868B2 (en) 2014-01-21 2018-02-06 Osterhout Group, Inc. Eye imaging in head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US11103132B2 (en) 2014-01-21 2021-08-31 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9298002B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9298001B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9927612B2 (en) 2014-01-21 2018-03-27 Osterhout Group, Inc. See-through computer display systems
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9933622B2 (en) 2014-01-21 2018-04-03 Osterhout Group, Inc. See-through computer display systems
US10379365B2 (en) 2014-01-21 2019-08-13 Mentor Acquisition One, Llc See-through computer display systems
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9377625B2 (en) 2014-01-21 2016-06-28 Osterhout Group, Inc. Optical configurations for head worn computing
US9658457B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9658458B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9958674B2 (en) 2014-01-21 2018-05-01 Osterhout Group, Inc. Eye imaging in head worn computing
US10012838B2 (en) 2014-01-21 2018-07-03 Osterhout Group, Inc. Compact optical system with improved contrast uniformity
US11650416B2 (en) 2014-01-21 2023-05-16 Mentor Acquisition One, Llc See-through computer display systems
US9971156B2 (en) 2014-01-21 2018-05-15 Osterhout Group, Inc. See-through computer display systems
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9436006B2 (en) 2014-01-21 2016-09-06 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US10001644B2 (en) 2014-01-21 2018-06-19 Osterhout Group, Inc. See-through computer display systems
US9651783B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US11353957B2 (en) 2014-01-21 2022-06-07 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US10007118B2 (en) 2014-01-21 2018-06-26 Osterhout Group, Inc. Compact optical system with improved illumination
US9651789B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-Through computer display systems
US10012840B2 (en) 2014-01-21 2018-07-03 Osterhout Group, Inc. See-through computer display systems
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9939646B2 (en) 2014-01-24 2018-04-10 Osterhout Group, Inc. Stray light suppression for head worn computing
US11822090B2 (en) 2014-01-24 2023-11-21 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9122054B2 (en) 2014-01-24 2015-09-01 Osterhout Group, Inc. Stray light suppression for head worn computing
US10578874B2 (en) 2014-01-24 2020-03-03 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US11782274B2 (en) 2014-01-24 2023-10-10 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9841602B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Location indicating avatar in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9843093B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Spatial location presentation in head worn computing
US10558420B2 (en) * 2014-02-11 2020-02-11 Mentor Acquisition One, Llc Spatial location presentation in head worn computing
US20150302646A1 (en) * 2014-02-11 2015-10-22 Osterhout Group, Inc. Spatial location presentation in head worn computing
US11599326B2 (en) 2014-02-11 2023-03-07 Mentor Acquisition One, Llc Spatial location presentation in head worn computing
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9229234B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9286728B2 (en) 2014-02-11 2016-03-15 Osterhout Group, Inc. Spatial location presentation in head worn computing
US10140079B2 (en) 2014-02-14 2018-11-27 Osterhout Group, Inc. Object shadowing in head worn computing
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9928019B2 (en) 2014-02-14 2018-03-27 Osterhout Group, Inc. Object shadowing in head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
EP2933780A1 (en) * 2014-04-14 2015-10-21 Baidu Online Network Technology (Beijing) Co., Ltd Reality augmenting method, client device and server
US9922461B2 (en) 2014-04-14 2018-03-20 Baidu Online Network Technology (Beijing) Co., Ltd. Reality augmenting method, client device and server
US9922462B2 (en) 2014-04-18 2018-03-20 Magic Leap, Inc. Interacting with totems in augmented or virtual reality systems
US10665018B2 (en) * 2014-04-18 2020-05-26 Magic Leap, Inc. Reducing stresses in the passable world model in augmented or virtual reality systems
US10186085B2 (en) * 2014-04-18 2019-01-22 Magic Leap, Inc. Generating a sound wavefront in augmented or virtual reality systems
US10846930B2 (en) * 2014-04-18 2020-11-24 Magic Leap, Inc. Using passable world model for augmented or virtual reality
US9761055B2 (en) 2014-04-18 2017-09-12 Magic Leap, Inc. Using object recognizers in an augmented or virtual reality system
US20170061688A1 (en) * 2014-04-18 2017-03-02 Magic Leap, Inc. Reducing stresses in the passable world model in augmented or virtual reality systems
US9767616B2 (en) 2014-04-18 2017-09-19 Magic Leap, Inc. Recognizing objects in a passable world model in an augmented or virtual reality system
US10198864B2 (en) 2014-04-18 2019-02-05 Magic Leap, Inc. Running object recognizers in a passable world model for augmented or virtual reality
US9766703B2 (en) 2014-04-18 2017-09-19 Magic Leap, Inc. Triangulation of points using known points in augmented or virtual reality systems
US10909760B2 (en) 2014-04-18 2021-02-02 Magic Leap, Inc. Creating a topological map for localization in augmented or virtual reality systems
US20150302652A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20150301797A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Systems and methods for rendering user interfaces for augmented or virtual reality
US10825248B2 (en) * 2014-04-18 2020-11-03 Magic Leap, Inc. Eye tracking systems and method for augmented or virtual reality
US20150301599A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Eye tracking systems and method for augmented or virtual reality
US20150302625A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Generating a sound wavefront in augmented or virtual reality systems
US10013806B2 (en) 2014-04-18 2018-07-03 Magic Leap, Inc. Ambient light compensation for augmented or virtual reality
US10008038B2 (en) 2014-04-18 2018-06-26 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
US9996977B2 (en) 2014-04-18 2018-06-12 Magic Leap, Inc. Compensating for ambient light in augmented or virtual reality systems
US10262462B2 (en) * 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20150302657A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Using passable world model for augmented or virtual reality
US20150316982A1 (en) * 2014-04-18 2015-11-05 Magic Leap, Inc. Utilizing pseudo-random patterns for eye tracking in augmented or virtual reality systems
US20230274511A1 (en) * 2014-04-18 2023-08-31 Magic Leap, Inc. Displaying virtual content in augmented reality using a map of the world
US9984506B2 (en) 2014-04-18 2018-05-29 Magic Leap, Inc. Stress reduction in geometric maps of passable world model in augmented or virtual reality systems
US20150356781A1 (en) * 2014-04-18 2015-12-10 Magic Leap, Inc. Rendering an avatar for a user in an augmented or virtual reality system
US9852548B2 (en) 2014-04-18 2017-12-26 Magic Leap, Inc. Systems and methods for generating sound wavefronts in augmented or virtual reality systems
US9881420B2 (en) 2014-04-18 2018-01-30 Magic Leap, Inc. Inferential avatar rendering techniques in augmented or virtual reality systems
US9972132B2 (en) 2014-04-18 2018-05-15 Magic Leap, Inc. Utilizing image based light solutions for augmented or virtual reality
US10043312B2 (en) 2014-04-18 2018-08-07 Magic Leap, Inc. Rendering techniques to find new map points in augmented or virtual reality systems
US9911233B2 (en) 2014-04-18 2018-03-06 Magic Leap, Inc. Systems and methods for using image based light solutions for augmented or virtual reality
US9911234B2 (en) 2014-04-18 2018-03-06 Magic Leap, Inc. User interface rendering in augmented or virtual reality systems
US10127723B2 (en) 2014-04-18 2018-11-13 Magic Leap, Inc. Room based sensors in an augmented reality system
US10115232B2 (en) 2014-04-18 2018-10-30 Magic Leap, Inc. Using a map of the world for augmented or virtual reality systems
US11205304B2 (en) * 2014-04-18 2021-12-21 Magic Leap, Inc. Systems and methods for rendering user interfaces for augmented or virtual reality
US10115233B2 (en) 2014-04-18 2018-10-30 Magic Leap, Inc. Methods and systems for mapping virtual objects in an augmented or virtual reality system
US10109108B2 (en) 2014-04-18 2018-10-23 Magic Leap, Inc. Finding new points by render rather than search in augmented or virtual reality systems
US9928654B2 (en) * 2014-04-18 2018-03-27 Magic Leap, Inc. Utilizing pseudo-random patterns for eye tracking in augmented or virtual reality systems
US11880041B2 (en) 2014-04-25 2024-01-23 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US10466492B2 (en) 2014-04-25 2019-11-05 Mentor Acquisition One, Llc Ear horn assembly for headworn computer
US10146772B2 (en) 2014-04-25 2018-12-04 Osterhout Group, Inc. Language translation with head-worn computing
US11727223B2 (en) 2014-04-25 2023-08-15 Mentor Acquisition One, Llc Language translation with head-worn computing
US10634922B2 (en) 2014-04-25 2020-04-28 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US9897822B2 (en) 2014-04-25 2018-02-20 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US11809022B2 (en) 2014-04-25 2023-11-07 Mentor Acquisition One, Llc Temple and ear horn assembly for headworn computer
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US11474360B2 (en) 2014-04-25 2022-10-18 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US10101588B2 (en) 2014-04-25 2018-10-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US10732434B2 (en) 2014-04-25 2020-08-04 Mentor Acquisition One, Llc Temple and ear horn assembly for headworn computer
US11851177B2 (en) 2014-05-06 2023-12-26 Mentor Acquisition One, Llc Unmanned aerial vehicle launch system
US20150334503A1 (en) * 2014-05-13 2015-11-19 Lenovo (Singapore) Pte. Ltd. Audio system tuning
US9918176B2 (en) * 2014-05-13 2018-03-13 Lenovo (Singapore) Pte. Ltd. Audio system tuning
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US11265526B2 (en) 2014-05-19 2022-03-01 Occipital, Inc. Methods for automatic registration of 3D image data
US20180241985A1 (en) * 2014-05-19 2018-08-23 Occipital, Inc. Methods for automatic registration of 3d image data
US10750150B2 (en) 2014-05-19 2020-08-18 Occipital, Inc. Methods for automatic registration of 3D image data
US11402639B2 (en) 2014-06-05 2022-08-02 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US10877270B2 (en) 2014-06-05 2020-12-29 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US11327323B2 (en) 2014-06-09 2022-05-10 Mentor Acquisition One, Llc Content presentation in head worn computing
US10139635B2 (en) 2014-06-09 2018-11-27 Osterhout Group, Inc. Content presentation in head worn computing
US11790617B2 (en) 2014-06-09 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US9720241B2 (en) 2014-06-09 2017-08-01 Osterhout Group, Inc. Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10976559B2 (en) 2014-06-09 2021-04-13 Mentor Acquisition One, Llc Content presentation in head worn computing
US11663794B2 (en) 2014-06-09 2023-05-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US11360318B2 (en) 2014-06-09 2022-06-14 Mentor Acquisition One, Llc Content presentation in head worn computing
US11022810B2 (en) 2014-06-09 2021-06-01 Mentor Acquisition One, Llc Content presentation in head worn computing
US11887265B2 (en) 2014-06-09 2024-01-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US11789267B2 (en) 2014-06-17 2023-10-17 Mentor Acquisition One, Llc External user interface for head worn computing
US11294180B2 (en) 2014-06-17 2022-04-05 Mentor Acquisition One, Llc External user interface for head worn computing
US11054645B2 (en) 2014-06-17 2021-07-06 Mentor Acquisition One, Llc External user interface for head worn computing
US10698212B2 (en) 2014-06-17 2020-06-30 Mentor Acquisition One, Llc External user interface for head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US20170153866A1 (en) * 2014-07-03 2017-06-01 Imagine Mobile Augmented Reality Ltd. Audiovisual Surround Augmented Reality (ASAR)
US10775630B2 (en) 2014-07-08 2020-09-15 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9798148B2 (en) 2014-07-08 2017-10-24 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US11409110B2 (en) 2014-07-08 2022-08-09 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11940629B2 (en) 2014-07-08 2024-03-26 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US10564426B2 (en) 2014-07-08 2020-02-18 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11786105B2 (en) 2014-07-15 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US11630315B2 (en) 2014-08-12 2023-04-18 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US11360314B2 (en) 2014-08-12 2022-06-14 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10908422B2 (en) 2014-08-12 2021-02-02 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US11474575B2 (en) 2014-09-18 2022-10-18 Mentor Acquisition One, Llc Thermal management for head-worn computer
US10520996B2 (en) 2014-09-18 2019-12-31 Mentor Acquisition One, Llc Thermal management for head-worn computer
US10963025B2 (en) 2014-09-18 2021-03-30 Mentor Acquisition One, Llc Thermal management for head-worn computer
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US10078224B2 (en) 2014-09-26 2018-09-18 Osterhout Group, Inc. See-through computer display systems
US10320437B2 (en) 2014-10-24 2019-06-11 Usens, Inc. System and method for immersive and interactive multimedia generation
US9858722B2 (en) * 2014-10-24 2018-01-02 Usens, Inc. System and method for immersive and interactive multimedia generation
US20160117860A1 (en) * 2014-10-24 2016-04-28 Usens, Inc. System and method for immersive and interactive multimedia generation
US10223834B2 (en) 2014-10-24 2019-03-05 Usens, Inc. System and method for immersive and interactive multimedia generation
US10256859B2 (en) 2014-10-24 2019-04-09 Usens, Inc. System and method for immersive and interactive multimedia generation
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US10036889B2 (en) 2014-12-03 2018-07-31 Osterhout Group, Inc. Head worn computer display systems
US10197801B2 (en) 2014-12-03 2019-02-05 Osterhout Group, Inc. Head worn computer display systems
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10018837B2 (en) 2014-12-03 2018-07-10 Osterhout Group, Inc. Head worn computer display systems
US11262846B2 (en) 2014-12-03 2022-03-01 Mentor Acquisition One, Llc See-through computer display systems
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US11809628B2 (en) 2014-12-03 2023-11-07 Mentor Acquisition One, Llc See-through computer display systems
US20160163107A1 (en) * 2014-12-09 2016-06-09 Industrial Technology Research Institute Augmented Reality Method and System, and User Mobile Device Applicable Thereto
US9626803B2 (en) 2014-12-12 2017-04-18 Qualcomm Incorporated Method and apparatus for image processing in augmented reality systems
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD792400S1 (en) 2014-12-31 2017-07-18 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
USD794637S1 (en) 2015-01-05 2017-08-15 Osterhout Group, Inc. Air mouse
CN107408314A (en) * 2015-01-20 2017-11-28 微软技术许可有限责任公司 Mixed reality system
WO2016118371A1 (en) * 2015-01-20 2016-07-28 Microsoft Technology Licensing, Llc Mixed reality system
US9846968B2 (en) 2015-01-20 2017-12-19 Microsoft Technology Licensing, Llc Holographic bird's eye view camera
US20170351918A1 (en) * 2015-02-13 2017-12-07 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US10146998B2 (en) * 2015-02-13 2018-12-04 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US11721303B2 (en) 2015-02-17 2023-08-08 Mentor Acquisition One, Llc See-through computer display systems
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US20170178375A1 (en) * 2015-03-24 2017-06-22 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US20190273916A1 (en) * 2015-03-24 2019-09-05 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US20230379449A1 (en) * 2015-03-24 2023-11-23 Augmedics Ltd. Systems for facilitating augmented reality-assisted medical procedures
US9928629B2 (en) * 2015-03-24 2018-03-27 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US11750794B2 (en) * 2015-03-24 2023-09-05 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US20230379448A1 (en) * 2015-03-24 2023-11-23 Augmedics Ltd. Head-mounted augmented reality near eye display device
US10037036B2 (en) * 2015-05-05 2018-07-31 Volvo Car Corporation Method and arrangement for determining safe vehicle trajectories
US20160327953A1 (en) * 2015-05-05 2016-11-10 Volvo Car Corporation Method and arrangement for determining safe vehicle trajectories
US11816296B2 (en) 2015-07-22 2023-11-14 Mentor Acquisition One, Llc External user interface for head worn computing
US11209939B2 (en) 2015-07-22 2021-12-28 Mentor Acquisition One, Llc External user interface for head worn computing
US10139966B2 (en) 2015-07-22 2018-11-27 Osterhout Group, Inc. External user interface for head worn computing
US10962780B2 (en) 2015-10-26 2021-03-30 Microsoft Technology Licensing, Llc Remote rendering for virtual images
WO2017117675A1 (en) * 2016-01-08 2017-07-13 Sulon Technologies Inc. Head mounted device for augmented reality
US20190041977A1 (en) * 2016-01-29 2019-02-07 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US10948975B2 (en) * 2016-01-29 2021-03-16 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US11868518B2 (en) 2016-01-29 2024-01-09 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US10120437B2 (en) 2016-01-29 2018-11-06 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US20170221265A1 (en) * 2016-01-29 2017-08-03 Rovi Guides, Inc. Methods and systems for associating media content with physical world objects
US11507180B2 (en) 2016-01-29 2022-11-22 Rovi Guides, Inc. Methods and systems for associating input schemes with physical world objects
US11654074B2 (en) 2016-02-29 2023-05-23 Mentor Acquisition One, Llc Providing enhanced images for navigation
US11298288B2 (en) 2016-02-29 2022-04-12 Mentor Acquisition One, Llc Providing enhanced images for navigation
US10849817B2 (en) 2016-02-29 2020-12-01 Mentor Acquisition One, Llc Providing enhanced images for navigation
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US11592669B2 (en) 2016-03-02 2023-02-28 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11156834B2 (en) 2016-03-02 2021-10-26 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11320656B2 (en) 2016-05-09 2022-05-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11500212B2 (en) 2016-05-09 2022-11-15 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10824253B2 (en) 2016-05-09 2020-11-03 Mentor Acquisition One, Llc User interface systems for head-worn computers
US10684478B2 (en) 2016-05-09 2020-06-16 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11226691B2 (en) 2016-05-09 2022-01-18 Mentor Acquisition One, Llc User interface systems for head-worn computers
US11586048B2 (en) 2016-06-01 2023-02-21 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11754845B2 (en) 2016-06-01 2023-09-12 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11022808B2 (en) 2016-06-01 2021-06-01 Mentor Acquisition One, Llc Modular systems for head-worn computers
US10466491B2 (en) 2016-06-01 2019-11-05 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11460708B2 (en) 2016-06-01 2022-10-04 Mentor Acquisition One, Llc Modular systems for head-worn computers
US11215465B2 (en) 2016-08-04 2022-01-04 Reification Inc. Methods for simultaneous localization and mapping (SLAM) and related apparatus and systems
US10444021B2 (en) 2016-08-04 2019-10-15 Reification Inc. Methods for simultaneous localization and mapping (SLAM) and related apparatus and systems
US11350196B2 (en) 2016-08-22 2022-05-31 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US9826299B1 (en) 2016-08-22 2017-11-21 Osterhout Group, Inc. Speaker systems for head-worn computer systems
US10757495B2 (en) 2016-08-22 2020-08-25 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US11825257B2 (en) 2016-08-22 2023-11-21 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US11409128B2 (en) 2016-08-29 2022-08-09 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US10690936B2 (en) 2016-08-29 2020-06-23 Mentor Acquisition One, Llc Adjustable nose bridge assembly for headworn computer
US10768500B2 (en) 2016-09-08 2020-09-08 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
US11415856B2 (en) 2016-09-08 2022-08-16 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US11604358B2 (en) 2016-09-08 2023-03-14 Mentor Acquisition One, Llc Optical systems for head-worn computers
US11768417B2 (en) 2016-09-08 2023-09-26 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
US11366320B2 (en) 2016-09-08 2022-06-21 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10534180B2 (en) 2016-09-08 2020-01-14 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10514754B2 (en) * 2016-09-30 2019-12-24 Sony Interactive Entertainment Inc. RF beamforming for head mounted display
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
US10747306B2 (en) 2016-09-30 2020-08-18 Sony Interactive Entertainment Inc. Wireless communication system for head mounted display
US20190138087A1 (en) * 2016-09-30 2019-05-09 Sony Interactive Entertainment Inc. RF Beamforming for Head Mounted Display
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US10146302B2 (en) 2016-09-30 2018-12-04 Sony Interactive Entertainment Inc. Head mounted display with multiple antennas
USD840395S1 (en) 2016-10-17 2019-02-12 Osterhout Group, Inc. Head-worn computer
WO2018085089A1 (en) * 2016-11-01 2018-05-11 Google Llc Map summarization and localization
US10339708B2 (en) 2016-11-01 2019-07-02 Google Inc. Map summarization and localization
CN109804220A (en) * 2016-12-06 2019-05-24 美国景书公司 Systems and methods for tracking motion and gesture of heads and eyes
US9785249B1 (en) 2016-12-06 2017-10-10 Vuelosophy Inc. Systems and methods for tracking motion and gesture of heads and eyes
CN112578911A (en) * 2016-12-06 2021-03-30 美国景书公司 Apparatus and method for tracking head and eye movements
WO2018106220A1 (en) * 2016-12-06 2018-06-14 Vuelosophy Inc. Systems and methods for tracking motion and gesture of heads and eyes
CN110073313A (en) * 2016-12-12 2019-07-30 微软技术许可有限责任公司 Interacting with an environment using a parent device and at least one companion device
US10452133B2 (en) 2016-12-12 2019-10-22 Microsoft Technology Licensing, Llc Interacting with an environment using a parent device and at least one companion device
WO2018111658A1 (en) * 2016-12-12 2018-06-21 Microsoft Technology Licensing, Llc Interacting with an environment using a parent device and at least one companion device
GB2558279A (en) * 2016-12-23 2018-07-11 Sony Interactive Entertainment Inc Head mountable display system
US11771915B2 (en) 2016-12-30 2023-10-03 Mentor Acquisition One, Llc Head-worn therapy device
US10850116B2 (en) 2016-12-30 2020-12-01 Mentor Acquisition One, Llc Head-worn therapy device
US10499997B2 (en) 2017-01-03 2019-12-10 Mako Surgical Corp. Systems and methods for surgical navigation
US11707330B2 (en) 2017-01-03 2023-07-25 Mako Surgical Corp. Systems and methods for surgical navigation
USD947186S1 (en) 2017-01-04 2022-03-29 Mentor Acquisition One, Llc Computer glasses
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
USD918905S1 (en) 2017-01-04 2021-05-11 Mentor Acquisition One, Llc Computer glasses
US11504608B2 (en) * 2017-01-18 2022-11-22 Xvisio Technology Corp. 6DoF inside-out tracking game controller
US11331575B2 (en) 2017-03-30 2022-05-17 Electronic Arts Inc. Virtual environment mapping system
US11163379B2 (en) * 2017-04-05 2021-11-02 Telefonaktiebolaget Lm Ericsson (Publ) Illuminating an environment for localisation
WO2018184675A1 (en) * 2017-04-05 2018-10-11 Telefonaktiebolaget Lm Ericsson (Publ) Illuminating an environment for localisation
EP4163675A1 (en) * 2017-04-05 2023-04-12 Telefonaktiebolaget LM Ericsson (publ) Illuminating an environment for localisation
US10990829B2 (en) * 2017-04-28 2021-04-27 Micro Focus Llc Stitching maps generated using simultaneous localization and mapping
US11467708B2 (en) 2017-05-30 2022-10-11 Crunchfish Gesture Interaction Ab Activation of a virtual object
WO2018222115A1 (en) * 2017-05-30 2018-12-06 Crunchfish Ab Improved activation of a virtual object
US10901499B2 (en) * 2017-06-15 2021-01-26 Tencent Technology (Shenzhen) Company Limited System and method of instantly previewing immersive content
WO2018231481A1 (en) 2017-06-16 2018-12-20 Microsoft Technology Licensing, Llc Object holographic augmentation
US10325409B2 (en) 2017-06-16 2019-06-18 Microsoft Technology Licensing, Llc Object holographic augmentation
CN110036359A (en) * 2017-06-23 2019-07-19 杰创科虚拟现实有限公司 Interactive augmented reality for first-person role playing
US11042035B2 (en) 2017-07-24 2021-06-22 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11226489B2 (en) 2017-07-24 2022-01-18 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11550157B2 (en) 2017-07-24 2023-01-10 Mentor Acquisition One, Llc See-through computer display systems
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
US11567328B2 (en) 2017-07-24 2023-01-31 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11668939B2 (en) 2017-07-24 2023-06-06 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US11789269B2 (en) 2017-07-24 2023-10-17 Mentor Acquisition One, Llc See-through computer display systems
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11500207B2 (en) 2017-08-04 2022-11-15 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US11947120B2 (en) 2017-08-04 2024-04-02 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
US11062517B2 (en) 2017-09-27 2021-07-13 Fisher-Rosemount Systems, Inc. Virtual access to a limited-access object
US11080931B2 (en) 2017-09-27 2021-08-03 Fisher-Rosemount Systems, Inc. Virtual x-ray vision in a process control environment
US20190096131A1 (en) * 2017-09-27 2019-03-28 Fisher-Rosemount Systems, Inc. 3D Mapping of a Process Control Environment
US11244515B2 (en) 2017-09-27 2022-02-08 Fisher-Rosemount Systems, Inc. 3D mapping of a process control environment
US10796487B2 (en) * 2017-09-27 2020-10-06 Fisher-Rosemount Systems, Inc. 3D mapping of a process control environment
US11727648B2 (en) * 2017-09-28 2023-08-15 Apple Inc. Method and device for synchronizing augmented reality coordinate systems
US20220092741A1 (en) * 2017-09-28 2022-03-24 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US11138692B2 (en) * 2017-09-28 2021-10-05 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
US11790490B2 (en) * 2017-09-28 2023-10-17 Intel Corporation Super-resolution apparatus and method for virtual and mixed reality
CN107918949A (en) * 2017-12-11 2018-04-17 网易(杭州)网络有限公司 Rendering method, storage medium, processor and terminal for virtual resource objects
US11244514B2 (en) 2017-12-15 2022-02-08 Telefonaktiebolaget Lm Ericsson (Publ) Determining a content pose of a piece of virtual content
WO2019114986A1 (en) * 2017-12-15 2019-06-20 Telefonaktiebolaget Lm Ericsson (Publ) Determining a content pose of a piece of virtual content
US11657581B2 (en) 2017-12-15 2023-05-23 Telefonaktiebolaget Lm Ericsson (Publ) Determining a content pose of a piece of virtual content
CN109727288B (en) * 2017-12-28 2021-10-01 北京京东尚科信息技术有限公司 System and method for monocular simultaneous localization and mapping
US10636198B2 (en) * 2017-12-28 2020-04-28 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for monocular simultaneous localization and mapping
CN109727288A (en) * 2017-12-28 2019-05-07 北京京东尚科信息技术有限公司 System and method for monocular simultaneous localization and mapping
US20220241683A1 (en) * 2018-03-16 2022-08-04 Sony Interactive Entertainment LLC Asynchronous Virtual Reality Interactions
US11806615B2 (en) * 2018-03-16 2023-11-07 Sony Interactive Entertainment LLC Asynchronous virtual reality interactions
US11331568B2 (en) * 2018-03-16 2022-05-17 Sony Interactive Entertainment LLC Asynchronous virtual reality interactions
US20190293937A1 (en) * 2018-03-20 2019-09-26 Boe Technology Group Co., Ltd. Augmented reality display device and method, and augmented reality glasses
CN110365964A (en) * 2018-04-10 2019-10-22 安谋知识产权有限公司 Image processing for augmented reality
US20190311544A1 (en) * 2018-04-10 2019-10-10 Arm Ip Limited Image processing for augmented reality
US11605204B2 (en) * 2018-04-10 2023-03-14 Arm Limited Image processing for augmented reality
US11321929B2 (en) * 2018-05-18 2022-05-03 Purdue Research Foundation System and method for spatially registering multiple augmented reality devices
US10726264B2 (en) 2018-06-25 2020-07-28 Microsoft Technology Licensing, Llc Object-based localization
US11182614B2 (en) * 2018-07-24 2021-11-23 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
US20230273676A1 (en) * 2018-07-24 2023-08-31 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
US11687151B2 (en) * 2018-07-24 2023-06-27 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
WO2020023582A1 (en) * 2018-07-24 2020-01-30 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
US20220036078A1 (en) * 2018-07-24 2022-02-03 Magic Leap, Inc. Methods and apparatuses for determining and/or evaluating localizing maps of image display devices
US11244509B2 (en) 2018-08-20 2022-02-08 Fisher-Rosemount Systems, Inc. Drift correction for industrial augmented reality applications
US11783553B2 (en) 2018-08-20 2023-10-10 Fisher-Rosemount Systems, Inc. Systems and methods for facilitating creation of a map of a real-world, process control environment
WO2020072905A1 (en) * 2018-10-04 2020-04-09 Google Llc Depth from motion for augmented reality for handheld user devices
US11145075B2 (en) * 2018-10-04 2021-10-12 Google Llc Depth from motion for augmented reality for handheld user devices
US10939977B2 (en) 2018-11-26 2021-03-09 Augmedics Ltd. Positioning marker
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US10810430B2 (en) 2018-12-27 2020-10-20 At&T Intellectual Property I, L.P. Augmented reality with markerless, context-aware object tracking
WO2020142048A3 (en) * 2018-12-31 2021-01-21 Havelsan Hava Elektroni̇k Sanayi̇ Ve Ti̇caret Anoni̇m Şi̇rketi̇ Augmented virtual reality tactical state display system (ASGER-TDS)
US20200234395A1 (en) * 2019-01-23 2020-07-23 Qualcomm Incorporated Methods and apparatus for standardized apis for split rendering
US11625806B2 (en) * 2019-01-23 2023-04-11 Qualcomm Incorporated Methods and apparatus for standardized APIs for split rendering
US20220151704A1 (en) * 2019-03-04 2022-05-19 Smith & Nephew, Inc. Co-registration for augmented reality and surgical navigation
US11937885B2 (en) * 2019-03-04 2024-03-26 Smith & Nephew, Inc. Co-registration for augmented reality and surgical navigation
US11620800B2 (en) * 2019-03-27 2023-04-04 Electronic Arts Inc. Three dimensional reconstruction of objects based on geolocation and image data
US11410372B2 (en) 2019-03-27 2022-08-09 Electronic Arts Inc. Artificial intelligence based virtual object aging
US10962371B2 (en) * 2019-04-02 2021-03-30 GM Global Technology Operations LLC Method and apparatus of parallel tracking and localization via multi-mode slam fusion process
CN113614794A (en) * 2019-04-26 2021-11-05 谷歌有限责任公司 Managing content in augmented reality
CN111880644A (en) * 2019-05-02 2020-11-03 苹果公司 Multi-user simultaneous localization and mapping (SLAM)
US10748302B1 (en) * 2019-05-02 2020-08-18 Apple Inc. Multiple user simultaneous localization and mapping (SLAM)
US11127161B2 (en) 2019-05-02 2021-09-21 Apple Inc. Multiple user simultaneous localization and mapping (SLAM)
US11010921B2 (en) 2019-05-16 2021-05-18 Qualcomm Incorporated Distributed pose estimation
US11887253B2 (en) 2019-07-24 2024-01-30 Electronic Arts Inc. Terrain generation and population system
US11196842B2 (en) 2019-09-26 2021-12-07 At&T Intellectual Property I, L.P. Collaborative and edge-enhanced augmented reality systems
CN112785715A (en) * 2019-11-08 2021-05-11 华为技术有限公司 Virtual object display method and electronic device
CN112785700A (en) * 2019-11-08 2021-05-11 华为技术有限公司 Virtual object display method, global map updating method and device
EP4078540A4 (en) * 2019-12-20 2023-01-18 Niantic, Inc. Merging local maps from mapping devices
US11801115B2 (en) 2019-12-22 2023-10-31 Augmedics Ltd. Mirroring in image guided surgery
CN113452896A (en) * 2020-03-26 2021-09-28 华为技术有限公司 Image display method and electronic equipment
US11389252B2 (en) 2020-06-15 2022-07-19 Augmedics Ltd. Rotating marker for image guided surgery
US11816887B2 (en) 2020-08-04 2023-11-14 Fisher-Rosemount Systems, Inc. Quick activation techniques for industrial augmented reality applications
CN112130660A (en) * 2020-08-14 2020-12-25 青岛小鸟看看科技有限公司 Interaction method and system based on virtual reality all-in-one machine
US11335058B2 (en) 2020-10-13 2022-05-17 Electronic Arts Inc. Spatial partitioning for graphics rendering
US11704868B2 (en) 2020-10-13 2023-07-18 Electronic Arts Inc. Spatial partitioning for graphics rendering
CN113129450A (en) * 2021-04-21 2021-07-16 北京百度网讯科技有限公司 Virtual fitting method, device, electronic equipment and medium
WO2022252347A1 (en) * 2021-06-04 2022-12-08 华为技术有限公司 3D map retrieval method and apparatus
US20230237692A1 (en) * 2022-01-26 2023-07-27 Meta Platforms Technologies, Llc Methods and systems to facilitate passive relocalization using three-dimensional maps
WO2023163760A1 (en) * 2022-02-25 2023-08-31 Faro Technologies, Inc. Tracking with reference to a world coordinate system
US11960089B2 (en) 2022-06-27 2024-04-16 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11960095B2 (en) 2023-04-19 2024-04-16 Mentor Acquisition One, Llc See-through computer display systems

Also Published As

Publication number Publication date
US8933931B2 (en) 2015-01-13
US20130169626A1 (en) 2013-07-04

Similar Documents

Publication Publication Date Title
US8933931B2 (en) Distributed asynchronous localization and mapping for augmented reality
US11127210B2 (en) Touch and social cues as inputs into a computer
US10062213B2 (en) Augmented reality spaces with adaptive rules
US10007349B2 (en) Multiple sensor gesture recognition
US11816853B2 (en) Systems and methods for simultaneous localization and mapping
US8660362B2 (en) Combined depth filtering and super resolution
US20130174213A1 (en) Implicit sharing and privacy control through physical behaviors using sensor-rich devices
EP2595402B1 (en) System for controlling light enabled devices
JP7316282B2 (en) Systems and methods for augmented reality
Deldjoo et al. A low-cost infrared-optical head tracking solution for virtual 3D audio environment using the Nintendo Wii-remote
WO2023250267A1 (en) Robotic learning of tasks using augmented reality
US20220375110A1 (en) Augmented reality guided depth estimation
US20240127006A1 (en) Sign language interpretation with collaborative agents
US11681361B2 (en) Reducing startup time of augmented reality experience
US20220374505A1 (en) Bending estimation as a biometric signal
US20230267691A1 (en) Scene change detection with novel view synthesis
WO2022245649A1 (en) Augmented reality guided depth estimation
WO2022246382A1 (en) Bending estimation as a biometric signal
KR20240008370A (en) Late warping to minimize latency for moving objects

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BALAN, ALEXANDRU;FLAKS, JASON;HODGES, STEVE;AND OTHERS;SIGNING DATES FROM 20110524 TO 20110602;REEL/FRAME:026382/0498

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0509

Effective date: 20141014