US20120231887A1 - Augmented Reality Mission Generators - Google Patents

Augmented Reality Mission Generators

Info

Publication number
US20120231887A1
US20120231887A1 (application US13/414,491)
Authority
US
United States
Prior art keywords
mission
generator
data
template
mobile device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/414,491
Inventor
Brian Elan Lee
Michael Sean Stewart
James Stewartson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nant Holdings IP LLC
Original Assignee
FOURTH WALL STUDIOS Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FOURTH WALL STUDIOS Inc filed Critical FOURTH WALL STUDIOS Inc
Priority to US13/414,491
Assigned to FOURTH WALL STUDIOS, INC. (assignment of assignors interest; see document for details). Assignors: Lee, Brian Elan; Stewart, Michael Sean; Stewartson, James
Publication of US20120231887A1
Assigned to NANT HOLDINGS IP, LLC (assignment of assignors interest; see document for details). Assignor: FOURTH WALL STUDIOS, INC.
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/65: Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F 13/79: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F 13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/216: Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • A63F 13/217: Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
    • A63F 13/332: Interconnection arrangements between game servers and game devices using wide area network [WAN] connections using wireless networks, e.g. cellular phone networks
    • A63F 13/822: Strategy games; Role-playing games
    • A63F 13/92: Video game devices specially adapted to be hand-held while playing
    • A63F 2300/209: Features of the game platform characterized by low level software layer, relating to hardware management, e.g. Operating System, Application Programming Interface
    • A63F 2300/406: Transmission via wireless network, e.g. pager or GSM
    • A63F 2300/69: Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F 2300/807: Role playing or strategy games
    • A63F 2300/8082: Virtual reality

Definitions

  • The field of the invention is mixed reality technologies.
  • U.S. pat. publ. no. 2007/0281765 to Mullen discusses systems and methods for location-based games. Although Mullen contemplates using the physical location of the user to correspond to a virtual location of a virtual character, Mullen fails to contemplate the use of ambient environmental information apart from location information when generating the game. U.S. pat. publ. no. 2011/0081973 to Hall (publ. April 2011) discusses a different location-based game, but also fails to contemplate the use of ambient environmental information apart from location information when generating the game.
  • An augmented reality platform can be constructed to generate augmented reality missions for users.
  • A mission can be generated, possibly from a template, based on a user's environment or data collected about the user's environment.
  • Mission objects can have their attributes populated based on the environment data. For example, all red cars local to the user can become mission objects. As the missions are based on a user's environment, two users could experience quite different missions even though the missions are generated from the same template.
  • The inventive subject matter provides apparatus, systems and methods in which one can provide augmented or mixed reality experiences to users.
  • One of the many aspects of the inventive subject matter includes an augmented reality (AR) gaming system capable of generating one or more AR missions.
  • An AR mission can be presented to a user via a mobile device (e.g., portable computer, media player, cell phone, vehicle, game system, sensor, etc.) where the user can interact with the mission via the mobile device, or other interactive devices.
  • AR missions can be generated via an AR mission generator that includes a mission database storing one or more AR mission templates and an AR mission engine coupled with the database.
  • The AR mission engine can obtain environmental data apart from location information (e.g., GPS coordinates) from one or more remote sensing devices, including the user's mobile device, where the environmental data comprises a digital representation of a scene.
  • The AR mission engine can combine information derived from the digital representation of the scene with an AR mission template to construct a quest (i.e., an instantiated mission) for the user.
  • The AR mission engine can select a mission template from the database based on the environmental data and the location of the user's mobile device, and then populate the mission template with AR objects (e.g., objectives, rewards, goals, etc.) to flesh out the mission.
  • The attributes of the AR objects can also be populated based on the environmental data.
  • FIG. 1 is a schematic of an augmented reality system having an augmented reality mission generator.
  • Computing devices comprise a processor configured to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.).
  • The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus.
  • The various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on SMS, MMS, HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods.
  • Data exchanges preferably are conducted over the Internet, a LAN, WAN, VPN, PAN, or other type of packet-switched network.
  • The disclosed techniques provide many advantageous technical effects, including providing an augmented reality infrastructure capable of configuring one or more mobile devices to present a mixed reality interactive environment to users.
  • The mixed reality environment, and accompanying missions, can be constructed from external data obtained from sensors that are external to the infrastructure.
  • A mission can be populated with information obtained from satellites, Google® StreetView™, third party mapping information, security cameras, kiosks, televisions or television stations, set top boxes, weather stations, radios or radio stations, web sites, cellular towers, or other data sources.
  • “Coupled to” is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms “coupled to” and “coupled with” are used synonymously.
  • Inventive subject matter is considered to include all possible combinations of the disclosed elements.
  • Inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
  • FIG. 1 presents an overview of one embodiment of an augmented or mixed reality environment 100 where a user can obtain one or more missions from an AR mission generator 110.
  • Each user can utilize a mobile device 102 to obtain sensor data from one or more sensors 104 related to a scene 120 or the user's environment that is separate from the user's location information.
  • A user's mobile device 102 can exchange the collected environmental data or a digital representation of the scene 120 with the AR mission generator 110.
  • Data exchanges preferably are conducted over a network 130, which could include, for example, cell networks, mesh networks, the Internet, LANs, WANs, VPNs, PANs, or other types of networks or combinations thereof.
  • The AR mission generator 110 can generate one or more missions for the user, at least in part based on the obtained environmental data.
  • The mobile device 102 could also transmit location information such as GPS coordinates and/or cellular triangulation information to the AR mission generator 110.
  • The mobile device 102 is presented as a smart phone, which represents one of many different types of devices that can integrate into the overall AR environment 100.
  • Mobile devices can include, for example, smart phones and other wireless telephones, laptops, netbooks, tablet PCs and other mobile computers, vehicles, sensors (e.g., a camera), media players, personal digital assistants, watches, and gaming platforms.
  • Other types of devices can include electronic picture frames, desktop computers, appliances (e.g., STBs, kitchen appliances, etc.), kiosks, non-mobile sensors, media players, game consoles, televisions, or other types of devices.
  • Preferred devices have a communication link and offer a presentation system (e.g., display, speakers, vibrators, etc.) for presenting AR data to the user.
  • a presentation system e.g., display, speakers, vibrators, etc.
  • Environmental data or a digital representation of the scene 120 can include data from multiple sources or sensors.
  • In the embodiment shown, a sensor 122 (e.g., a camera) collects data from a lamppost while the mobile device 102 also collects data via at least one sensor 104.
  • The types of data used to form a digital representation of the scene can cover a wide range of modalities including image data, audio data, haptic data, or other modalities.
  • Additional data can include weather data, location data, orientation data, movement data, biometrics data, or other types of data.
  • The AR mission generator 110 can include one or more modules or components configured to support its roles or responsibilities.
  • The AR mission generator 110 can include an AR mission template database 112 and an AR mission engine 114.
  • Although the AR mission template database 112 and AR mission engine 114 are shown as local to the AR mission generator 110, it is contemplated that one or both can be separate from, and located locally or remotely with respect to, the AR mission generator 110.
  • The AR mission template database 112 can store a plurality of AR mission template objects where each mission template object comprises attributes or metadata describing characteristics of a mission.
  • The mission template objects can be stored as an XML file or other serialized format.
  • A mission template object can include a wide spectrum of information including, for example, name/ID of the mission, a type of mission (e.g., dynamic, chain, etc.), goals, supporting objects, rewards, narratives, digital assets (e.g., video, audio, etc.), mission requirements (e.g., required weapons, achievements, user level, number of players, etc.), location requirements (e.g., indoors or outdoors), conditions, programmatic instructions, links to other missions, or other information that can be used to instantiate a mission.
  • The AR mission generator 110 is illustrated as being remote relative to the scene 120 or mobile device 102. However, it is specifically contemplated that some or all of the features of the mission generator 110, AR mission engine 114, and/or AR mission template database 112, for example, can be integrated into the mobile device 102. In such embodiments, information can be exchanged through an application program interface (API) or other suitable interface. In other embodiments, the AR mission engine 114 or other components can comprise a distal computing server, a distributed computing platform, or even an AR computing platform.
  • The AR mission engine 114 is preferably configured to obtain environmental data from the user's mobile device 102 about the scene 120 proximate to the mobile device 102. Based on the environmental data, the AR mission engine 114 can determine the characteristics of the scene 120 and generate one or more missions (i.e., instantiated missions) from an AR mission template object from the mission template database 112.
  • Scene characteristics can include user identification and capabilities of the mobile device 102 including, for example, available sensors 104, screen size, processor speed, available memory, and the presence of a camera or other imaging sensor. Scene characteristics can also include weather conditions, visual images, location information, orientation, captured audio, presence and type of real-world objects, or other types of characteristics.
  • The AR mission engine 114 can compare the characteristics to the requirements, attributes, or conditions associated with the stored AR mission template objects to select a mission template. Once selected or otherwise obtained, the AR mission engine 114 can instantiate a mission for the user from the selected mission template object. It is contemplated that the AR mission generator 110 can configure the mobile device 102 to present the generated mission.
  • A mission template object includes a defined grammar having verbs that define user actions with respect to one or more AR objects associated with a mission.
  • An AR mission template object might have several verbs that define a mission with respect to the user's actions.
  • Contemplated verbs include, for example, read, view, deliver, fire (e.g., a weapon, etc.), upgrade, collect, converse, travel, or other actions.
  • The AR objects associated with a mission template can also be stored as templates, or rather as AR object templates.
  • The selected AR mission template object can be populated based on the environmental data.
  • A user could be in a shopping mall and log in to the AR mission generator 110 via their mobile phone to obtain a mission.
  • The AR mission engine 114 recognizes from the user's location (e.g., based on GPS coordinates) that the user is in a mall, and selects a mission that requires the user to collect objects.
  • The AR mission engine 114 instantiates AR objects as mannequins, and the mission requires that the user travel around the mall photographing mannequins (e.g., collecting the AR objects) to complete the mission.
  • The mobile device 102 could be configured to identify the mannequins, or other objects of interest, by their associated features, such as by using image recognition software.
  • Populating attributes or features of a mission or associated AR objects can also be achieved through object recognition.
  • The AR mission engine 114 can recognize real-world objects in the scene 120 and use the objects' attributes to populate attributes of the one or more AR objects 124.
  • The attributes can be simply observed or looked up from a database based on object recognition algorithms (e.g., SIFT, vSLAM, Viper, etc.).
  • A user may capture a picture of a scene having a plurality of trees.
  • The trees can be recognized by the AR mission engine, and AR objects can be generated based upon the trees' attributes (e.g., size, leaf color, distance from the mobile device, etc.).
  • The AR objects associated with a mission can range across a full spectrum of objects, from completely real-world objects through completely virtual objects.
  • Exemplary AR objects can include, for example, a mission objective, a reward, an award point, a currency, a relationship, a virtual object, a real-world object, a promotion, a coupon, or other types of objects.
  • The AR objects can be integrated into the real world via the mobile device 102.
  • The AR objects associated with the mission could be superimposed (overlaid) on the captured scene 120 while also maintaining their proper location and orientation with respect to real-world objects within the scene 120.
  • Superimposing images of AR objects on a real-world image can be accomplished by many techniques.
  • A mission can be customized for a specific user based on the user's specific environment. Still, the missions can be efficiently based on just a few types of mission templates.
  • One especially interesting type of mission is a dynamic mission that can be fully customizable for the user. A dynamic mission can be a single, one-off mission constructed in real time, if desired, based on the obtained environmental data. While completion of a dynamic mission may not advance a story, users may obtain rewards for completing the mission including, for example, points, levels, currency, weapons, and experience. Examples of dynamic missions include shooting ten boars, collecting five coins, going on a night patrol, finding a treasure, and so forth.
  • Another interesting type of mission is a chain mission that can be linked with preceding or succeeding missions to form a story arc. Chain missions can be constructed with more thought to create a greater level of immersion for the user.
  • Up to this point, missions have been presented as a single-player platform. However, one should appreciate that missions can also comprise multi-player missions requiring two or more users. When multiple users are involved, new types of interactions can occur. Some multi-player missions might require cooperative objectives, while other multi-player missions might comprise counter objectives where the players oppose or compete against each other. Because of the AR nature of the missions, it is contemplated that players could be in a variety of disparate locations while interacting with one another. An exemplary mission having counter objectives could be to infiltrate an enemy's base or to defend a fort.
  • In more preferred embodiments, missions are associated with game play. Still, missions can bridge across many markets beyond game play. Other types of missions can be constructed as an exercise program, an advertising campaign, or even following an alternative navigation route home. By constructing various types of missions for a user, the user can be enticed to discover new businesses or opportunities, possibly commercial opportunities.
  • Contemplated AR systems can include an analysis engine that correlates player attributes against mission objectives. Collecting and tracking of such information can be advantageous to businesses when targeting promotions or missions to players or other individuals.
  • Ambient environmental data separate from the mobile device's location can be received.
  • An AR mission generator can select an AR mission template from a mission database coupled to the AR mission generator. It is contemplated that the AR mission template can be selected based at least in part upon the ambient environmental data.
  • A mission can be generated using the AR mission generator and the selected AR mission template, where the mission is based on at least a portion of the ambient environmental data.
  • A mobile device can be configured via the AR mission generator to present the generated mission to a user.

Abstract

Augmented reality (AR) mission generators are described that generate missions based on environmental data separate from a user's location. The environmental data can be obtained from a user's mobile device or using other sensors or third-party information. The missions can be generated from an AR mission template stored in a mission database, and presented to the user on the user's mobile device.

Description

  • This application claims the benefit of priority to U.S. provisional application having Ser. No. 61/450,052 filed on Mar. 7, 2011. This and all other extrinsic materials discussed herein are incorporated by reference in their entirety. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
  • FIELD OF THE INVENTION
  • The field of the invention is mixed reality technologies.
  • BACKGROUND
  • With the popularity of on-line virtual world games like World of Warcraft® and advances in mobile device processing capabilities, it is quite a wonder that no viable, marketable union of the two has yet been achieved. One likely reason is that the static nature of virtual worlds offers very limited pliability in the real world. Another reason might be the failure to integrate real-world aspects into a game so that the game has broader appeal. Ideally, an augmented reality environment would combine the goals of a game with the real world.
  • To some degree, U.S. pat. publ. no. 2004/0041788 to Ternullo (publ. March 2004) provides some techniques suitable for use in an augmented reality system. Simplistically, however, Ternullo only allows for a virtual walk-through of a home.
  • Others have put forth some effort in combining virtual and real-world gaming systems. For example, U.S. pat. no. 6,951,515 to Oshima et al. and U.S. pat. no. 6,972,734, also to Oshima et al., both describe integrating virtual objects with the real world. Unfortunately, the Oshima approaches require bulky support equipment and fail to appreciate that the world itself could be a platform for a mixed reality environment.
  • More recently, U.S. pat. no. 7,564,469 to Cohen and U.S. pat. publ. no. 2007/0104348, also to Cohen (publ. May 2007), both provide additional details regarding interacting with virtual objects in the real world. Still, these citations merely focus on interactions between virtual objects and the real world as opposed to game play.
  • U.S. pat. publ. no. 2006/0223635 to Rosenberg (publ. October 2006) takes simulated gaming a step further by combining simulated gaming objects and events with the real world. Simulated objects can be presented on a display. However, even Rosenberg fails to appreciate the dynamic nature of the real world and that each game player can have their own game play experience.
  • U.S. pat. publ. no. 2007/0281765 to Mullen (publ. December 2007) discusses systems and methods for location-based games. Although Mullen contemplates using the physical location of the user to correspond to a virtual location of a virtual character, Mullen fails to contemplate the use of ambient environmental information apart from location information when generating the game. U.S. pat. publ. no. 2011/0081973 to Hall (publ. April 2011) discusses a different location-based game, but also fails to contemplate the use of ambient environmental information apart from location information when generating the game.
  • U.S. pat. publ. no. 2011/0319148 to Kinnebrew et al. (publ. December 2011) advances location-based gaming by combining real-world and virtual elements to influence game play. However, Kinnebrew also fails to contemplate the use of ambient environmental information apart from location information when generating the game, which limits the influence of a player's real-world environment on the game play.
  • Unless the context dictates the contrary, all ranges set forth herein should be interpreted as being inclusive of their endpoints, and open-ended ranges should be interpreted to include commercially practical values. Similarly, all lists of values should be considered as inclusive of intermediate values unless the context indicates the contrary.
  • It has yet to be appreciated that an augmented reality platform can be constructed to generate augmented reality missions for users. Rather than being bound to a static game play, a mission can be generated, possibly from a template, based on a user's environment or data collected about the user's environment. Mission objects can have their attributes populated based on the environment data. For example, all red cars local to the user can become mission objects. As the missions are based on a user's environment, two users could experience quite different missions even though the missions are generated from the same template.
  • Thus, there is still a need for augmented reality mission generators that utilize ambient environmental data to generate a mission.
  • SUMMARY OF THE INVENTION
  • The inventive subject matter provides apparatus, systems and methods in which one can provide augmented or mixed reality experiences to users. One of the many aspects of the inventive subject matter includes an augmented reality (AR) gaming system capable of generating one or more AR missions. An AR mission can be presented to a user via a mobile device (e.g., portable computer, media player, cell phone, vehicle, game system, sensor, etc.) where the user can interact with the mission via the mobile device, or other interactive devices.
  • AR missions can be generated via an AR mission generator that includes a mission database storing one or more AR mission templates and an AR mission engine coupled with the database. The AR mission engine can obtain environmental data apart from location information (e.g., GPS coordinates) from one or more remote sensing devices, including the user's mobile device, where the environmental data comprises a digital representation of a scene. The AR mission engine can combine information derived from the digital representation of the scene with an AR mission template to construct a quest (i.e., an instantiated mission) for the user. For example, the AR mission engine can select a mission template from the database based on the environmental data and the location of the user's mobile device, and then populate the mission template with AR objects (e.g., objectives, rewards, goals, etc.) to flush out the mission. One should note the attributes of the AR objects can also be populated based on the environmental data. When the user is presented with the mission, it is contemplated that one or more of the AR objects can be superimposed on a real-world view of a scene. Using the inventive subject matter discussed herein, users can conceivably convert the entire planet into a usable game space.
  • Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
  • BRIEF DESCRIPTION OF THE DRAWING
  • FIG. 1 is a schematic of an augmented reality system having an augmented reality mission generator.
  • DETAILED DESCRIPTION
  • It should be noted that while the following description is drawn to a computer/server based augmented reality generator, various alternative configurations are also deemed suitable and may employ various computing devices including servers, interfaces, systems, databases, engines, agents, controllers, or other types of computing devices operating individually or collectively. One should appreciate that the computing devices comprise a processor configured to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.). The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus. In especially preferred embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on SMS, MMS, HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods. Data exchanges preferably are conducted over the Internet, a LAN, WAN, VPN, PAN, or other type of packet-switched network.
  • One should appreciate that the disclosed techniques provide many advantageous technical effects, including providing an augmented reality infrastructure capable of configuring one or more mobile devices to present a mixed reality interactive environment to users. One should also appreciate that the mixed reality environment, and accompanying missions, can be constructed from external data obtained from sensors that are external to the infrastructure. For example, a mission can be populated with information obtained from satellites, Google® StreetView™, third party mapping information, security cameras, kiosks, televisions or television stations, set top boxes, weather stations, radios or radio stations, web sites, cellular towers, or other data sources.
  • As used herein, and unless the context dictates otherwise, the term “coupled to” is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms “coupled to” and “coupled with” are used synonymously.
  • The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
  • FIG. 1 presents an overview of one embodiment of an augmented or mixed reality environment 100 where a user can obtain one or more missions from an AR mission generator 110. In the embodiment shown, each user can utilize a mobile device 102 to obtain sensor data from one or more sensor(s) 104 related to a scene 120 or the user's environment that is separate from a user's location information. Upon proper registration, authentication, or authorization, a user's mobile device 102 can exchange the collected environmental data or a digital representation of the scene 120 with the AR mission generator 110. Data exchanges preferably are conducted over a network 130, which could include, for example, cell networks, mesh networks, Internet, LANs, WANs, VPNs, PANs, or other types of networks or combinations thereof. In some embodiments, the AR mission generator 110 can generate one or more missions for the user, at least in part based on the obtained environment data. The mobile device 102 could also transmit location information such as GPS coordinates and/or cellular triangulation information to the AR mission generator 110.
  • The mobile device 102 is presented as a smart phone, which represents one of many different types of devices that can integrate into the overall AR environment 100. Mobile devices can include, for example, smart phones and other wireless telephones, laptops, netbooks, tablet PCs and other mobile computers, vehicles, sensors (e.g., a camera), media players, personal digital assistants, watches, and gaming platforms. Other types of devices can include electronic picture frames, desktop computers, appliances (e.g., STBs, kitchen appliances, etc.), kiosks, non-mobile sensors, media players, game consoles, televisions, or other types of devices. Preferred devices have a communication link and offer a presentation system (e.g., display, speakers, vibrators, etc.) for presenting AR data to the user.
  • Environmental data or a digital representation of the scene 120 can include data from multiple sources or sensors. In the embodiment shown, a sensor 122 (e.g., a camera) collects data from a lamppost while the mobile device 102 also collects data via at least one sensor 104. Contemplated sensors can include, for example, microphones, magnetometers, accelerometers, biosensors, still and video cameras, weather sensors, optical sensors, or other types of sensors. Furthermore, the types of data used to form a digital representation of the scene can cover a wide range of modalities including image data, audio data, haptic data, or other modalities. Even further, additional data can include weather data, location data, orientation data, movement data, biometrics data, or other types of data.
  • The AR mission generator 110 can include one or more modules or components configured to support the roles or responsibilities of the AR mission generator 110. As shown in FIG. 1, the AR mission generator 110 can include an AR mission template database 112 and an AR mission engine 114. Although the AR mission template database 112 and AR mission engine 114 are shown as local to the AR mission generator 110, it is contemplated that one or both of the AR mission template database 112 and AR mission engine 114 can be separate from, and located locally or remotely with respect to, the AR mission generator 110. The AR mission template database 112 can store a plurality of AR mission template objects where each mission template object comprises attributes or metadata describing characteristics of a mission. In some embodiments, the mission template objects can be stored as an XML file or other serialized format. A mission template object can include a wide spectrum of information including, for example, name/ID of the mission, a type of mission (e.g., dynamic, chain, etc.), goals, supporting objects, rewards, narratives, digital assets (e.g., video, audio, etc.), mission requirements (e.g., required weapons, achievements, user level, number of players, etc.), location requirements (e.g., indoors or outdoors), conditions, programmatic instructions, links to other missions, or other information that can be used to instantiate a mission.
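  • As a purely illustrative sketch of the serialized form contemplated above, the following Python snippet shows one way an AR mission template object might be stored as XML and parsed back into attributes. All tag names, attribute names, and values here are assumptions chosen for illustration; they are not drawn from the disclosure.

```python
# Hypothetical mission template serialization; element and attribute
# names are illustrative assumptions only.
import xml.etree.ElementTree as ET

TEMPLATE_XML = """
<missionTemplate id="collect-001" type="dynamic">
  <goal verb="collect" objectType="ar_object" count="5"/>
  <requirements location="indoors" userLevel="1" players="1"/>
  <reward kind="points" amount="100"/>
  <narrative>Gather the hidden items near you.</narrative>
</missionTemplate>
"""

def load_template(xml_text: str) -> dict:
    """Parse a serialized template into a plain attribute dictionary."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("id"),
        "type": root.get("type"),
        "goal": dict(root.find("goal").attrib),
        "requirements": dict(root.find("requirements").attrib),
        "reward": dict(root.find("reward").attrib),
        "narrative": root.findtext("narrative").strip(),
    }

template = load_template(TEMPLATE_XML)
print(template["goal"])  # {'verb': 'collect', 'objectType': 'ar_object', 'count': '5'}
```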
  • The AR mission generator 110 is illustrated as being remote relative to the scene 120 or mobile device 102. However, it is specifically contemplated that some or all of the features of the mission generator 110, AR mission engine 114 and/or AR mission template database 112, for example, can be integrated into the mobile device 102. In such embodiments, information can be exchanged through an application program interface (API) or other suitable interface. In other embodiments, the AR mission engine 114 or other components can comprise a distal computing server, a distributed computing platform, or even an AR computing platform.
  • The AR mission engine 114 is preferably configured to obtain environmental data from the user's mobile device 102 about the scene 120 proximate to the mobile device 102. Based on the environmental data, the AR mission engine 114 can determine the characteristics of the scene 120 and generate one or more missions (i.e., instantiated missions) from an AR mission template object from the mission template database 112. Scene characteristics can include user identification and capabilities of the mobile device 102 including, for example, available sensors 104, screen size, processor speed, available memory, and the presence of a camera or other imaging sensor. Scene characteristics can also include weather conditions, visual images, location information, orientation, captured audio, presence and type of real-world objects, or other types of characteristics. The AR mission engine 114 can compare the characteristics to the requirements, attributes, or conditions associated with the stored AR mission template objects to select a mission template. Once selected or otherwise obtained, the AR mission engine 114 can instantiate a mission for the user from the selected mission template object. It is contemplated that the AR mission generator 110 can configure the mobile device 102 to present the generated mission.
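  • As a minimal sketch of this selection step, assuming simple key/value requirements and exact matching (neither of which is specified in the disclosure), the comparison of scene characteristics against stored template requirements might look like the following.

```python
# Hypothetical template selection: a template is eligible when every one
# of its stated requirements is met by a scene characteristic.

def satisfies(scene: dict, requirements: dict) -> bool:
    return all(scene.get(key) == value for key, value in requirements.items())

def select_template(scene: dict, templates: list) -> dict | None:
    eligible = [t for t in templates if satisfies(scene, t["requirements"])]
    # Prefer the most specific template, i.e., the one with the most
    # satisfied requirements.
    return max(eligible, key=lambda t: len(t["requirements"]), default=None)

scene = {"location": "indoors", "camera": True, "weather": "clear"}
templates = [
    {"id": "collect-001", "requirements": {"location": "indoors", "camera": True}},
    {"id": "patrol-002", "requirements": {"location": "outdoors"}},
]
print(select_template(scene, templates)["id"])  # collect-001
```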
  • It is also contemplated that missions can be generated through numerous methods. In preferred embodiments, a mission template object includes a defined grammar having verbs that define user actions with respect to one or more AR objects associated with a mission. For example, an AR mission template object might have several verbs that define a mission with respect to the user's actions. Contemplated verbs include, for example, read, view, deliver, fire (e.g., a weapon, etc.), upgrade, collect, converse, travel, or other actions. By defining AR mission templates based at least in part on a grammar, mission template development can be greatly streamlined, and mission complexity can be significantly reduced for users.
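  • A hedged sketch of such a grammar follows: a small registry mapping verb names to handlers that interpret a user action on an AR object. The verb set and handler signatures are illustrative assumptions, not the grammar of any particular embodiment.

```python
# Hypothetical verb grammar for mission steps.
from typing import Callable

VERBS: dict = {}

def verb(name: str):
    """Decorator that registers a handler for a mission verb."""
    def register(handler: Callable):
        VERBS[name] = handler
        return handler
    return register

@verb("collect")
def collect(user: dict, ar_object: dict) -> str:
    user.setdefault("inventory", []).append(ar_object["id"])
    return f"{user['name']} collected {ar_object['id']}"

@verb("travel")
def travel(user: dict, waypoint: dict) -> str:
    user["position"] = waypoint["position"]
    return f"{user['name']} traveled to {waypoint['id']}"

# A mission step is then just a (verb, object) pair that the engine dispatches.
user = {"name": "player1"}
print(VERBS["collect"](user, {"id": "mannequin-03"}))
```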
  • The AR objects associated with a mission template can also be stored as templates, or rather as AR object templates. When a mission is generated, the selected AR mission template object can be populated based on the environmental data. As an example, a user could be in a shopping mall and log in to the AR mission generator 110 via their mobile phone to obtain a mission. The AR mission engine 114 recognizes from the user's location (e.g., based on GPS coordinates) that the user is in a mall, and selects a mission that requires the user to collect objects. With the knowledge that the user is in a mall, the AR mission engine 114 instantiates AR objects as mannequins, and the mission requires that the user travel around the mall photographing mannequins (e.g., collecting the AR objects) to complete the mission. One should note the mobile device 102 could be configured to identify the mannequins, or other objects of interest, by their associated features, such as by using image recognition software.
  • Populating attributes or features of a mission or associated AR objects can also be achieved through object recognition. As a user collects data associated with the scene 120, such as through still images or video from a camera on the mobile device 102, the AR mission engine 114, perhaps in the mobile device 102, can recognize real-world objects in the scene 120 and use the objects' attributes to populate attributes of the one or more AR objects 124. The attributes can be simply observed or looked up from a database based on object recognition algorithms (e.g., SIFT, vSLAM, Viper, etc.).
  • Thus, for example, a user may capture a picture of a scene having a plurality of trees. The trees can be recognized by the AR mission engine, and AR objects can be generated based upon the trees' attributes (e.g., size, leaf color, distance from the mobile device, etc.).
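  • A minimal sketch of this attribute population follows; the recognizer is stubbed out (a real pipeline might use SIFT or vSLAM, as noted above), and every field name is an assumption made for illustration.

```python
# Hypothetical mapping from recognized real-world objects to AR objects.

def recognize_objects(image_bytes: bytes) -> list:
    """Stand-in for a real object recognition pipeline."""
    return [
        {"kind": "tree", "height_m": 6.2, "leaf_color": "green", "distance_m": 14.0},
        {"kind": "tree", "height_m": 3.1, "leaf_color": "red", "distance_m": 9.5},
    ]

def ar_objects_from_scene(image_bytes: bytes) -> list:
    ar_objects = []
    for i, obj in enumerate(recognize_objects(image_bytes)):
        ar_objects.append({
            "id": f"{obj['kind']}-{i}",
            # Attributes of the AR object are populated from the attributes
            # of the recognized real-world object.
            "size": "large" if obj["height_m"] > 5 else "small",
            "color": obj["leaf_color"],
            "distance_m": obj["distance_m"],
        })
    return ar_objects

print(ar_objects_from_scene(b""))
```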
  • The AR objects associated with a mission can range across a full spectrum of objects, from completely real-world objects through completely virtual objects. Exemplary AR objects can include, for example, a mission objective, a reward, an award point, a currency, a relationship, a virtual object, a real-world object, a promotion, a coupon, or other types of objects. The AR objects can be integrated into the real world via the mobile device 102. For example, as the user pans and tilts their mobile device 102, the AR objects associated with the mission could be superimposed (overlaid) on the captured scene 120 while also maintaining their proper location and orientation with respect to real-world objects within the scene 120. Superimposing images of AR objects on a real-world image can be accomplished by many techniques. One suitable technique that could be adapted for use with the inventive subject matter can be found in U.S. pat. no. 6,771,294 to Pulli et al. One should appreciate that superimposing AR objects on a digital representation of a real-world scene is considered to include other modalities beyond visual data, including audio, haptic, kinesthetic, temperature, or other types of modal data.
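  • One simple overlay model, offered here as an assumption rather than as the technique of the Pulli patent, is a pinhole projection that maps an AR object's world position to screen coordinates as the device's camera pose changes, so the object keeps its apparent place in the scene.

```python
# Hypothetical pinhole projection for an AR overlay; the intrinsics
# (fx, fy, cx, cy) are illustrative values for a 1280x720 view.
import numpy as np

def project(point_world, cam_pos, cam_rot, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Return (u, v) pixel coordinates, or None if the point is behind the camera."""
    p_cam = cam_rot @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:  # behind the image plane
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

identity = np.eye(3)  # camera looking down +z with no rotation
print(project([0.5, 0.0, 4.0], [0.0, 0.0, 0.0], identity))  # (740.0, 360.0)
```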
  • Using the inventive subject matter discussed herein, many different types of missions are possible, especially in view of the fact that a mission can be customized for a specific user based on the user's specific environment. Still, the missions can be efficiently based on just a few types of mission templates. One especially interesting type of mission is a dynamic mission that can be fully customizable for the user. A dynamic mission can be a single, one-off mission constructed in real time, if desired, based on the obtained environmental data. While completion of a dynamic mission may not advance a story, users may obtain rewards for completing the mission including, for example, points, levels, currency, weapons, and experience. Examples of dynamic missions include shooting ten boars, collecting five coins, going on a night patrol, finding a treasure, and so forth.
  • Another interesting type of mission is a chain mission that can be linked with preceding or succeeding missions to form a story arc. Chain missions can be constructed with more thought to create a greater level of immersion for the user.
  • Up to this point, missions have been presented as a single player platform. However, one should appreciate that missions can also comprise multi-player missions requiring two or more users. When multiple users are involved, new types of interactions can occur. Some multi-player missions might require cooperative objectives, while other multi-player missions might comprise counter objectives for the players where the players oppose or compete against each other. Because of the AR nature of the missions, it is contemplated that players could be in a variety of disparate locations while interacting with one another. An exemplary mission having counter objectives could be to infiltrate an enemy's base or to defend a fort.
  • In more preferred embodiments, missions are associated with game play. Still, missions can bridge across many markets beyond game play. Other types of missions can be constructed as an exercise program, an advertising campaign, or even following an alternative navigation route home. By constructing various types of missions for a user, the user can be enticed to discover new businesses or opportunities, possibly commercial opportunities.
  • Interestingly, missions constructed around commercial opportunities can target a wide variety of player demographics, psychographics, or other player attributes. Contemplated AR systems can include an analysis engine that correlates player attributes against mission objectives. Collecting and tracking of such information can be advantageous to businesses when targeting promotions or missions to players or other individuals.
  • Methods for generating AR missions are also contemplated. In some contemplated embodiments, ambient environmental data separate from the mobile device's location can be received. An AR mission generator can select an AR mission template from a mission database coupled to the AR mission generator. It is contemplated that the AR mission template can be selected based at least in part upon the ambient environmental data.
  • A mission can be generated using the AR mission generator and the selected AR mission template, where the mission is based on at least a portion of the ambient environmental data. A mobile device can be configured via the AR mission generator to present the generated mission to a user.
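  • Tying the contemplated steps together, a minimal end-to-end sketch under the same assumptions as the snippets above might look like the following; all class and method names are hypothetical.

```python
# Hypothetical end-to-end flow: receive ambient data, select a template,
# instantiate a mission, and configure the device to present it.

class MissionDatabase:
    def __init__(self, templates):
        self.templates = templates

    def select(self, ambient_data):
        # Step 2: pick a template whose requirements the ambient data meets.
        for t in self.templates:
            if all(ambient_data.get(k) == v for k, v in t["requirements"].items()):
                return t
        return None

class ARMissionGenerator:
    def __init__(self, db):
        self.db = db

    def generate(self, ambient_data):
        template = self.db.select(ambient_data)       # step 2: select template
        if template is None:
            return None
        return {"template_id": template["id"],        # step 3: instantiate mission
                "context": ambient_data}

    def present_on(self, device, mission):
        device.show(mission)                          # step 4: configure device

# Step 1: ambient environmental data arrives from the device's sensors
# (deliberately separate from any location fix).
ambient = {"noise_db": 62, "light": "daylight", "weather": "clear"}
generator = ARMissionGenerator(MissionDatabase(
    [{"id": "patrol-002", "requirements": {"light": "daylight"}}]))
print(generator.generate(ambient))
```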
  • It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the scope of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C . . . and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.

Claims (20)

1. An augmented reality mission generator comprising:
a mission database storing augmented reality (AR) mission templates;
an AR mission generator engine coupled with the mission database and with a mobile device capable of transmitting a location of the mobile device and ambient environmental data separate from the mobile device's location, the AR mission generator configured to:
obtain the environmental data from the mobile device,
generate a mission based on at least one mission template and the environmental data, and
configure the mobile device to present the mission.
2. The generator of claim 1, wherein the AR mission templates comprise a mission defined based on a grammar.
3. The generator of claim 2, wherein the grammar comprises verbs relating to AR objects.
4. The generator of claim 1, wherein the AR mission templates comprise AR mission template objects.
5. The generator of claim 4, wherein the AR mission generator engine is further configured to generate an AR mission object by populating an AR mission template object based on the environmental data.
6. The generator of claim 5, wherein the AR mission object comprises a mission objective.
7. The generator of claim 5, wherein the AR mission object comprises a reward object.
8. The generator of claim 7, wherein the reward object comprises at least one of the following: an award point, a currency, a virtual object, a real-world object, a relationship, and a promotion.
9. The generator of claim 1, wherein the AR mission generator engine is further configured to select a mission template based on the environmental data.
10. The generator of claim 1, wherein the AR mission templates comprise a dynamic mission template.
11. The generator of claim 1, wherein the AR mission templates comprise a chain mission template.
12. The generator of claim 1, wherein the mobile device comprises at least one of the following: a vehicle, a phone, a sensor, a gaming platform, a portable computer, and a media player.
13. The generator of claim 1, wherein the AR mission templates comprise a multi-player mission template.
14. The generator of claim 13, wherein the multi-player mission template comprises cooperative objectives.
15. The generator of claim 13, wherein the multi-player mission template comprises counter objectives.
16. The generator of claim 1, wherein the AR mission templates comprise an exercise program.
17. The generator of claim 1, further comprising an analysis engine configured to establish correlations between player demographics and mission objectives.
18. The generator of claim 1, wherein the environmental data comprises a digital representation of a scene.
19. The generator of claim 18, wherein the digital representation comprises data from multiple sensors.
20. The generator of claim 18, wherein the digital representation comprises at least one of the following types of data: image data, audio data, haptic data, weather data, location data, movement data, biometric data, and orientation data.
US13/414,491, filed 2012-03-07 (priority 2011-03-07): Augmented Reality Mission Generators. Abandoned. US20120231887A1 (en)

Priority Applications (1)

Application Number: US13/414,491; Priority date: 2011-03-07; Filing date: 2012-03-07; Title: Augmented Reality Mission Generators; Publication: US20120231887A1 (en)

Applications Claiming Priority (2)

Application Number: US201161450052P; Priority date: 2011-03-07; Filing date: 2011-03-07
Application Number: US13/414,491; Priority date: 2011-03-07; Filing date: 2012-03-07; Title: Augmented Reality Mission Generators; Publication: US20120231887A1 (en)

Publications (1)

Publication Number Publication Date
US20120231887A1 (en), published 2012-09-13

Family

ID=45976510

Family Applications (1)

Application Number: US13/414,491; Status: Abandoned; Priority date: 2011-03-07; Filing date: 2012-03-07; Title: Augmented Reality Mission Generators; Publication: US20120231887A1 (en)

Country Status (2)

Country Link
US (1) US20120231887A1 (en)
WO (1) WO2012122293A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104235B2 (en) 2013-08-22 2015-08-11 International Business Machines Corporation Modifying information presented by an augmented reality device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000350865A (en) 1999-06-11 2000-12-19 MR System Kenkyusho K.K. Game device for mixed-reality space, image processing method therefor, and program storage medium
US6972734B1 (en) 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US6771294B1 (en) 1999-12-29 2004-08-03 Petri Pulli User interface
US8130242B2 (en) 2000-11-06 2012-03-06 Nant Holdings Ip, Llc Interactivity via mobile image recognition
GB2385238A (en) * 2002-02-07 2003-08-13 Hewlett Packard Co Using virtual environments in wireless communication systems
US6950116B2 (en) 2002-08-28 2005-09-27 Lockheed Martin Corporation Interactive virtual portal
US20060223635A1 (en) 2005-04-04 2006-10-05 Outland Research method and apparatus for an on-screen/off-screen first person gaming experience
CN101273368A (en) 2005-08-29 2008-09-24 Evryx Technologies, Inc. Interactivity via mobile image recognition
DE102007035844A1 (en) * 2007-07-31 2009-02-05 Jochen Hummel Method for the computer-assisted generation of an interactive three-dimensional virtual reality
US8231465B2 (en) * 2008-02-21 2012-07-31 Palo Alto Research Center Incorporated Location-aware mixed-reality gaming platform

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040002843A1 (en) * 2002-05-13 2004-01-01 Consolidated Global Fun Unlimited, Llc Method and system for interacting with simulated phenomena
US20050009608A1 (en) * 2002-05-13 2005-01-13 Consolidated Global Fun Unlimited Commerce-enabled environment for interacting with simulated phenomena
US20030232649A1 (en) * 2002-06-18 2003-12-18 Gizis Alexander C.M. Gaming system and method
US20080015024A1 (en) * 2003-09-02 2008-01-17 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
US20060105838A1 (en) * 2004-11-16 2006-05-18 Mullen Jeffrey D Location-based games and augmented reality systems
US20060223637A1 (en) * 2005-03-31 2006-10-05 Outland Research, Llc Video game system combining gaming simulation with remote robot control and remote robot feedback
US20110081973A1 (en) * 2005-11-30 2011-04-07 Hall Robert J Geogame for mobile device
US20080280676A1 (en) * 2007-05-07 2008-11-13 Samsung Electronics Co. Ltd. Wireless gaming method and wireless gaming-enabled mobile terminal
US20100203933A1 (en) * 2007-05-31 2010-08-12 Sony Computer Entertainment Europe Limited Entertainment system and method
US20100279776A1 (en) * 2007-08-17 2010-11-04 Hall Robert J Location-Based Mobile Gaming Application and Method for Implementing the Same Using a Scalable Tiered Geocast Protocol
US20110242134A1 (en) * 2010-03-30 2011-10-06 Sony Computer Entertainment Inc. Method for an augmented reality character to maintain and exhibit awareness of an observer
US20110319148A1 (en) * 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming
US20120015730A1 (en) * 2010-07-19 2012-01-19 XMG Studio Sensor Error Reduction in Mobile Device Based Interactive Multiplayer Augmented Reality Gaming Through Use of One or More Game Conventions
US20120046108A1 (en) * 2010-08-17 2012-02-23 Samsung Electronics Co., Ltd. Multiplatform gaming system
US20120046113A1 (en) * 2010-08-17 2012-02-23 Ballas Paul Angelos System and method for rating intensity of video games
US20120122570A1 (en) * 2010-11-16 2012-05-17 David Michael Baronoff Augmented reality gaming experience

Cited By (157)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9116920B2 (en) 2000-11-06 2015-08-25 Nant Holdings Ip, Llc Image capture and identification system and process
US8824738B2 (en) 2000-11-06 2014-09-02 Nant Holdings Ip, Llc Data capture and identification system and process
US8718410B2 (en) 2000-11-06 2014-05-06 Nant Holdings Ip, Llc Image capture and identification system and process
US8774463B2 (en) 2000-11-06 2014-07-08 Nant Holdings Ip, Llc Image capture and identification system and process
US8792750B2 (en) 2000-11-06 2014-07-29 Nant Holdings Ip, Llc Object information derived from object images
US9808376B2 (en) 2000-11-06 2017-11-07 Nant Holdings Ip, Llc Image capture and identification system and process
US9805063B2 (en) 2000-11-06 2017-10-31 Nant Holdings Ip Llc Object information derived from object images
US9785859B2 (en) 2000-11-06 2017-10-10 Nant Holdings Ip Llc Image capture and identification system and process
US9844468B2 (en) 2000-11-06 2017-12-19 Nant Holdings Ip Llc Image capture and identification system and process
US8798368B2 (en) 2000-11-06 2014-08-05 Nant Holdings Ip, Llc Image capture and identification system and process
US8798322B2 (en) 2000-11-06 2014-08-05 Nant Holdings Ip, Llc Object information derived from object images
US10617568B2 (en) 2000-11-06 2020-04-14 Nant Holdings Ip, Llc Image capture and identification system and process
US8837868B2 (en) 2000-11-06 2014-09-16 Nant Holdings Ip, Llc Image capture and identification system and process
US8842941B2 (en) 2000-11-06 2014-09-23 Nant Holdings Ip, Llc Image capture and identification system and process
US8849069B2 (en) 2000-11-06 2014-09-30 Nant Holdings Ip, Llc Object information derived from object images
US8855423B2 (en) * 2000-11-06 2014-10-07 Nant Holdings Ip, Llc Image capture and identification system and process
US8861859B2 (en) 2000-11-06 2014-10-14 Nant Holdings Ip, Llc Image capture and identification system and process
US8867839B2 (en) 2000-11-06 2014-10-21 Nant Holdings Ip, Llc Image capture and identification system and process
US8873891B2 (en) 2000-11-06 2014-10-28 Nant Holdings Ip, Llc Image capture and identification system and process
US8885983B2 (en) 2000-11-06 2014-11-11 Nant Holdings Ip, Llc Image capture and identification system and process
US8885982B2 (en) 2000-11-06 2014-11-11 Nant Holdings Ip, Llc Object information derived from object images
US9135355B2 (en) 2000-11-06 2015-09-15 Nant Holdings Ip, Llc Image capture and identification system and process
US10639199B2 (en) 2000-11-06 2020-05-05 Nant Holdings Ip, Llc Image capture and identification system and process
US9785651B2 (en) 2000-11-06 2017-10-10 Nant Holdings Ip, Llc Object information derived from object images
US8923563B2 (en) 2000-11-06 2014-12-30 Nant Holdings Ip, Llc Image capture and identification system and process
US8938096B2 (en) 2000-11-06 2015-01-20 Nant Holdings Ip, Llc Image capture and identification system and process
US8948460B2 (en) 2000-11-06 2015-02-03 Nant Holdings Ip, Llc Image capture and identification system and process
US8948544B2 (en) 2000-11-06 2015-02-03 Nant Holdings Ip, Llc Object information derived from object images
US8948459B2 (en) 2000-11-06 2015-02-03 Nant Holdings Ip, Llc Image capture and identification system and process
US10635714B2 (en) 2000-11-06 2020-04-28 Nant Holdings Ip, Llc Object information derived from object images
US9014514B2 (en) 2000-11-06 2015-04-21 Nant Holdings Ip, Llc Image capture and identification system and process
US9014515B2 (en) 2000-11-06 2015-04-21 Nant Holdings Ip, Llc Image capture and identification system and process
US9014516B2 (en) 2000-11-06 2015-04-21 Nant Holdings Ip, Llc Object information derived from object images
US9014513B2 (en) 2000-11-06 2015-04-21 Nant Holdings Ip, Llc Image capture and identification system and process
US9014512B2 (en) 2000-11-06 2015-04-21 Nant Holdings Ip, Llc Object information derived from object images
US9020305B2 (en) 2000-11-06 2015-04-28 Nant Holdings Ip, Llc Image capture and identification system and process
US9025813B2 (en) 2000-11-06 2015-05-05 Nant Holdings Ip, Llc Image capture and identification system and process
US9025814B2 (en) 2000-11-06 2015-05-05 Nant Holdings Ip, Llc Image capture and identification system and process
US9031278B2 (en) 2000-11-06 2015-05-12 Nant Holdings Ip, Llc Image capture and identification system and process
US9824099B2 (en) 2000-11-06 2017-11-21 Nant Holdings Ip, Llc Data capture and identification system and process
US9036948B2 (en) 2000-11-06 2015-05-19 Nant Holdings Ip, Llc Image capture and identification system and process
US9036949B2 (en) 2000-11-06 2015-05-19 Nant Holdings Ip, Llc Object information derived from object images
US9036947B2 (en) 2000-11-06 2015-05-19 Nant Holdings Ip, Llc Image capture and identification system and process
US9036862B2 (en) 2000-11-06 2015-05-19 Nant Holdings Ip, Llc Object information derived from object images
US9046930B2 (en) 2000-11-06 2015-06-02 Nant Holdings Ip, Llc Object information derived from object images
US9087240B2 (en) 2000-11-06 2015-07-21 Nant Holdings Ip, Llc Object information derived from object images
US9104916B2 (en) 2000-11-06 2015-08-11 Nant Holdings Ip, Llc Object information derived from object images
US9110925B2 (en) 2000-11-06 2015-08-18 Nant Holdings Ip, Llc Image capture and identification system and process
US9031290B2 (en) 2000-11-06 2015-05-12 Nant Holdings Ip, Llc Object information derived from object images
US8712193B2 (en) 2000-11-06 2014-04-29 Nant Holdings Ip, Llc Image capture and identification system and process
US10772765B2 (en) 2000-11-06 2020-09-15 Nant Holdings Ip, Llc Image capture and identification system and process
US9141714B2 (en) 2000-11-06 2015-09-22 Nant Holdings Ip, Llc Image capture and identification system and process
US9148562B2 (en) 2000-11-06 2015-09-29 Nant Holdings Ip, Llc Image capture and identification system and process
US9152864B2 (en) 2000-11-06 2015-10-06 Nant Holdings Ip, Llc Object information derived from object images
US9154695B2 (en) 2000-11-06 2015-10-06 Nant Holdings Ip, Llc Image capture and identification system and process
US9154694B2 (en) 2000-11-06 2015-10-06 Nant Holdings Ip, Llc Image capture and identification system and process
US9844467B2 (en) 2000-11-06 2017-12-19 Nant Holdings Ip Llc Image capture and identification system and process
US9182828B2 (en) 2000-11-06 2015-11-10 Nant Holdings Ip, Llc Object information derived from object images
US10509821B2 (en) 2000-11-06 2019-12-17 Nant Holdings Ip, Llc Data capture and identification system and process
US9235600B2 (en) 2000-11-06 2016-01-12 Nant Holdings Ip, Llc Image capture and identification system and process
US9244943B2 (en) 2000-11-06 2016-01-26 Nant Holdings Ip, Llc Image capture and identification system and process
US9262440B2 (en) 2000-11-06 2016-02-16 Nant Holdings Ip, Llc Image capture and identification system and process
US9269015B2 (en) 2000-11-06 2016-02-23 Nant Holdings Ip, Llc Image capture and identification system and process
US9288271B2 (en) 2000-11-06 2016-03-15 Nant Holdings Ip, Llc Data capture and identification system and process
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US9311554B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Image capture and identification system and process
US9311552B2 (en) 2000-11-06 2016-04-12 Nant Holdings IP, LLC. Image capture and identification system and process
US9311553B2 (en) 2000-11-06 2016-04-12 Nant Holdings IP, LLC. Image capture and identification system and process
US9317769B2 (en) 2000-11-06 2016-04-19 Nant Holdings Ip, Llc Image capture and identification system and process
US9324004B2 (en) 2000-11-06 2016-04-26 Nant Holdings Ip, Llc Image capture and identification system and process
US9330328B2 (en) 2000-11-06 2016-05-03 Nant Holdings Ip, Llc Image capture and identification system and process
US9330326B2 (en) 2000-11-06 2016-05-03 Nant Holdings Ip, Llc Image capture and identification system and process
US9330327B2 (en) 2000-11-06 2016-05-03 Nant Holdings Ip, Llc Image capture and identification system and process
US9336453B2 (en) 2000-11-06 2016-05-10 Nant Holdings Ip, Llc Image capture and identification system and process
US9342748B2 (en) 2000-11-06 2016-05-17 Nant Holdings Ip, Llc Image capture and identification system and process
US9360945B2 (en) 2000-11-06 2016-06-07 Nant Holdings Ip Llc Object information derived from object images
US10509820B2 (en) 2000-11-06 2019-12-17 Nant Holdings Ip, Llc Object information derived from object images
US9536168B2 (en) 2000-11-06 2017-01-03 Nant Holdings Ip, Llc Image capture and identification system and process
US20130274013A1 (en) * 2000-11-06 2013-10-17 Nant Holdings Ip, Llc Image Capture and Identification System and Process
US10500097B2 (en) * 2000-11-06 2019-12-10 Nant Holdings Ip, Llc Image capture and identification system and process
US20190167479A1 (en) * 2000-11-06 2019-06-06 Nant Holdings Ip, Llc Image capture and identification system and process
US9844466B2 (en) 2000-11-06 2017-12-19 Nant Holdings Ip Llc Image capture and identification system and process
US9578107B2 (en) 2000-11-06 2017-02-21 Nant Holdings Ip, Llc Data capture and identification system and process
US9844469B2 (en) 2000-11-06 2017-12-19 Nant Holdings Ip Llc Image capture and identification system and process
US10095712B2 (en) 2000-11-06 2018-10-09 Nant Holdings Ip, Llc Data capture and identification system and process
US10089329B2 (en) 2000-11-06 2018-10-02 Nant Holdings Ip, Llc Object information derived from object images
US9613284B2 (en) 2000-11-06 2017-04-04 Nant Holdings Ip, Llc Image capture and identification system and process
US10080686B2 (en) 2000-11-06 2018-09-25 Nant Holdings Ip, Llc Image capture and identification system and process
US10051457B2 (en) 2007-07-27 2018-08-14 Intertrust Technologies Corporation Content publishing systems and methods
US10271197B2 (en) 2007-07-27 2019-04-23 Intertrust Technologies Corporation Content publishing systems and methods
US11218866B2 (en) 2007-07-27 2022-01-04 Intertrust Technologies Corporation Content publishing systems and methods
US8907982B2 (en) * 2008-12-03 2014-12-09 Alcatel Lucent Mobile device for augmented reality applications
US9573064B2 (en) * 2010-06-24 2017-02-21 Microsoft Technology Licensing, Llc Virtual and location-based multiplayer gaming
US20110319148A1 (en) * 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming
US11854036B2 (en) * 2011-11-21 2023-12-26 Nant Holdings Ip, Llc Location-based transaction reconciliation management methods and systems
US20130281202A1 (en) * 2012-04-18 2013-10-24 Zynga, Inc. Method and apparatus for providing game elements in a social gaming environment
US9174128B2 (en) * 2012-04-26 2015-11-03 Zynga Inc. Dynamic quests in game
US9539498B1 (en) 2012-07-31 2017-01-10 Niantic, Inc. Mapping real world actions to a virtual world associated with a location-based game
US11167205B2 (en) 2012-07-31 2021-11-09 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US9782668B1 (en) 2012-07-31 2017-10-10 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US10646783B1 (en) 2012-07-31 2020-05-12 Niantic, Inc. Linking real world activities with a parallel reality game
US9723107B1 (en) 2012-07-31 2017-08-01 Niantic, Inc. Executing cross-cutting concerns for client-server remote procedure calls
US9669293B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Game data validation
US9226106B1 (en) 2012-07-31 2015-12-29 Niantic, Inc. Systems and methods for filtering communication within a location-based game
US10130888B1 (en) 2012-07-31 2018-11-20 Niantic, Inc. Game data validation
US9669296B1 (en) 2012-07-31 2017-06-06 Niantic, Inc. Linking real world activities with a parallel reality game
US10300395B1 (en) 2012-07-31 2019-05-28 Niantic, Inc. Systems and methods for filtering communication within a location-based game
US9128789B1 (en) 2012-07-31 2015-09-08 Google Inc. Executing cross-cutting concerns for client-server remote procedure calls
US9621635B1 (en) 2012-07-31 2017-04-11 Niantic, Inc. Using side channels in remote procedure calls to return information in an interactive environment
US10806998B1 (en) 2012-07-31 2020-10-20 Niantic, Inc. Using side channels in remote procedure calls to return information in an interactive environment
US9604131B1 (en) 2012-07-31 2017-03-28 Niantic, Inc. Systems and methods for verifying player proximity within a location-based game
EP2904565A4 (en) * 2012-10-04 2016-12-14 Bernt Erik Bjontegard Contextually intelligent communication systems and processes
WO2014055376A2 (en) 2012-10-04 2014-04-10 Bjontegard Bernt Erik Contextually intelligent communication systems and processes
US8968099B1 (en) 2012-11-01 2015-03-03 Google Inc. System and method for transporting virtual objects in a parallel reality game
WO2014074465A1 (en) * 2012-11-06 2014-05-15 Stephen Latta Cross-platform augmented reality experience
US20140201205A1 (en) * 2013-01-14 2014-07-17 Disney Enterprises, Inc. Customized Content from User Data
WO2014181892A1 (en) * 2013-05-08 2014-11-13 Square Enix Holdings Co., Ltd. Information processing apparatus, control method and program
US10463953B1 (en) 2013-07-22 2019-11-05 Niantic, Inc. Detecting and preventing cheating in a location-based game
US10912989B2 (en) 2013-07-22 2021-02-09 Niantic, Inc. Detecting and preventing cheating in a location-based game
US10471358B1 (en) 2013-10-31 2019-11-12 Niantic, Inc. Regulating and scoring player interactions within a virtual world associated with a location-based parallel reality game
US9545565B1 (en) 2013-10-31 2017-01-17 Niantic, Inc. Regulating and scoring player interactions within a virtual world associated with a location-based parallel reality game
CN106536004A (en) * 2014-04-30 2017-03-22 图片动态有限公司 An augmented gaming platform
US20170043256A1 (en) * 2014-04-30 2017-02-16 Robert Paul Severn An augmented gaming platform
US20170087469A1 (en) * 2015-09-29 2017-03-30 International Business Machines Corporation Dynamic personalized location and contact-aware games
US9861894B2 (en) * 2015-09-29 2018-01-09 International Business Machines Corporation Dynamic personalized location and contact-aware games
US10115234B2 (en) 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
US10642567B2 (en) 2016-03-21 2020-05-05 Accenture Global Solutions Limited Multiplatform based experience generation
CN107219916A (en) * 2016-03-21 2017-09-29 Accenture Global Solutions Limited Multiplatform based experience generation
AU2017200358B2 (en) * 2016-03-21 2017-11-23 Accenture Global Solutions Limited Multiplatform based experience generation
US20230123933A1 (en) * 2016-06-06 2023-04-20 Warner Bros. Entertainment Inc. Mixed reality system for context-aware virtual object rendering
US10384130B2 (en) * 2016-08-05 2019-08-20 AR Sports LLC Fantasy sport platform with augmented reality player acquisition
US10384131B2 (en) * 2016-08-05 2019-08-20 AR Sports LLC Fantasy sport platform with augmented reality player acquisition
US20190336861A1 (en) * 2016-08-05 2019-11-07 AR Sports LLC Fantasy Sport Platform with Augmented Reality Player Acquisition
US11123640B2 (en) * 2016-08-05 2021-09-21 AR Sports LLC Fantasy sport platform with augmented reality player acquisition
WO2018160081A1 (en) * 2017-03-02 2018-09-07 Motorola Solutions, Inc. Method and apparatus for gathering visual data using an augmented-reality application
GB2573477A (en) * 2017-03-02 2019-11-06 Motorola Solutions Inc Method and apparatus for gathering visual data using an augmented-reality application
US10789830B2 (en) 2017-03-02 2020-09-29 Motorola Solutions, Inc. Method and apparatus for gathering visual data using an augmented-reality application
GB2573478A (en) * 2017-03-02 2019-11-06 Motorola Solutions Inc Method and apparatus for gathering visual data using an augmented-reality application
WO2018160080A1 (en) * 2017-03-02 2018-09-07 Motorola Solutions, Inc. Method and apparatus for gathering visual data using an augmented-reality application
US11638869B2 (en) * 2017-04-04 2023-05-02 Sony Corporation Information processing device and information processing method
US10717005B2 (en) * 2017-07-22 2020-07-21 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US20190022530A1 (en) * 2017-07-22 2019-01-24 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US11541315B2 (en) * 2017-07-22 2023-01-03 Niantic, Inc. Validating a player's real-world location using activity within a parallel-reality game
US11631336B1 (en) 2017-09-29 2023-04-18 DroneUp, LLC Multiplexed communications for coordination of piloted aerial drones enlisted to a common mission
US10872533B1 (en) 2017-09-29 2020-12-22 DroneUp, LLC Multiplexed communications of telemetry data, video stream data and voice data among piloted aerial drones via a common software application
US10741088B1 (en) 2017-09-29 2020-08-11 DroneUp, LLC Multiplexed communications for coordination of piloted aerial drones enlisted to a common mission
US11620914B1 (en) 2017-09-29 2023-04-04 DroneUp, LLC Multiplexed communication of telemetry data, video stream data and voice data among piloted aerial drones via a common software application
US11436931B1 (en) 2017-09-29 2022-09-06 DroneUp, LLC Multiplexed communications for coordination of piloted aerial drones enlisted to a common mission
CN108245881A (en) * 2017-12-29 2018-07-06 Wuhan Mario Network Co., Ltd. Three-dimensional jigsaw-board model building system based on AR
WO2020009935A1 (en) * 2018-07-05 2020-01-09 Themissionzone, Inc. Systems and methods for manipulating the shape and behavior of a physical space
US11410488B2 (en) * 2019-05-03 2022-08-09 Igt Augmented reality virtual object collection based on symbol combinations
US10873951B1 (en) 2019-06-04 2020-12-22 Motorola Solutions, Inc. Method and device to minimize interference in a converged LMR/LTE communication device
US11210857B2 (en) 2019-09-26 2021-12-28 The Toronto-Dominion Bank Systems and methods for providing an augmented-reality virtual treasure hunt
US11436809B2 (en) 2019-09-26 2022-09-06 The Toronto-Dominion Bank Systems and methods for providing an augmented-reality virtual treasure hunt
US11574423B2 (en) 2021-01-29 2023-02-07 Boomanity Corp. A Delaware Corporation Augmented reality (AR) object communication and interaction system and method
US20220327461A1 (en) * 2021-04-08 2022-10-13 Raytheon Company Intelligence preparation of the battlefield (ipb) collaborative time machine with real-time options
US11941558B2 (en) * 2021-04-08 2024-03-26 Raytheon Company Intelligence preparation of the battlefield (IPB) collaborative time machine with real-time options

Also Published As

Publication number Publication date
WO2012122293A1 (en) 2012-09-13

Similar Documents

Publication Publication Date Title
US20120231887A1 (en) Augmented Reality Mission Generators
JP6905154B2 (en) Verification of player's real-world position using in-game activities
US20210374782A1 (en) Spectator and participant system and method for displaying different views of an event
US9573064B2 (en) Virtual and location-based multiplayer gaming
CN109445662B (en) Operation control method and device for virtual object, electronic equipment and storage medium
CA2621191C (en) Interactivity via mobile image recognition
KR101736477B1 (en) Local sensor augmentation of stored content and AR communication
JP7145976B2 (en) Virtual object information display method and its application program, device, terminal and server
JP7239668B2 (en) Verification of the player's real-world location using image data of landmarks corresponding to the verification path
CN109529356B (en) Battle result determining method, device and storage medium
WO2012007764A1 (en) Augmented reality system
JP2021535806A (en) Virtual environment observation methods, devices and storage media
CN113058264A (en) Virtual scene display method, virtual scene processing method, device and equipment
US20230206268A1 (en) Spectator and participant system and method for displaying different views of an event
CN112569596A (en) Video picture display method and device, computer equipment and storage medium
CN112569607A (en) Display method, device, equipment and medium for pre-purchased prop
TW202300201A (en) Repeatability predictions of interest points
CN112169321B (en) Mode determination method, device, equipment and readable storage medium
CN113633970A (en) Action effect display method, device, equipment and medium
CN113144595A (en) Virtual road generation method, device, terminal and storage medium
US20240075380A1 (en) Using Location-Based Game to Generate Language Information
US11748961B2 (en) Interactable augmented and virtual reality experience
US20240075379A1 (en) Dynamically Generated Local Virtual Events
US20240108989A1 (en) Generating additional content items for parallel-reality games based on geo-location and usage characteristics

Legal Events

Date Code Title Description
AS Assignment

Owner name: FOURTH WALL STUDIOS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, BRIAN ELAN;STEWART, MICHAEL SEAN;STEWARTSON, JAMES;SIGNING DATES FROM 20120426 TO 20120430;REEL/FRAME:028132/0158

AS Assignment

Owner name: NANT HOLDINGS IP, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FOURTH WALL STUDIOS, INC.;REEL/FRAME:030414/0660

Effective date: 20130422

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION