US20070087828A1 - Computer system for creating and playing location aware games - Google Patents

Computer system for creating and playing location aware games

Info

Publication number
US20070087828A1
US20070087828A1 (application US11/163,329)
Authority
US
United States
Prior art keywords
gameboard
virtual
location
effects
virtual effect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/163,329
Inventor
Alexander Robertson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/163,329
Publication of US20070087828A1
Status: Abandoned

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 - Game security or game management aspects
    • A63F13/79 - Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/12
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216 - Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • A63F13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/406 - Transmission via wireless network, e.g. pager or GSM
    • A63F2300/50 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 - Details of game data or player data management
    • A63F2300/5546 - Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5573 - Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history player location
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/6009 - Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • A63F2300/6018 - Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content where the game content is authored by the player, e.g. level editor or by game device at runtime, e.g. level is created from music data on CD

Definitions

  • FIG. 8 further breaks down the Initialize Gameboard process.
  • a Gameboard is read from XML (8 a), then the Gameboard Control is created (8 b) and given the Gameboard Map with the largest scale (8 c).
  • the largest scale Gameboard Map is used to give the Gameboard Player a view of the entire game area.
  • Each Virtual Effect is read (8 d), instantiated (8 e), images are sized for each Gameboard Map scale (8 f) and added to the Gameboard (8 g).
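  • The following Python sketch illustrates this Initialize Gameboard flow (FIG. 8). The dataclasses and method names are illustrative assumptions, not the patent's actual implementation:

```python
# Hedged sketch of the Initialize Gameboard flow (FIG. 8). All class and
# method names here are assumptions for illustration, not the patent's code.
from dataclasses import dataclass, field

@dataclass
class GameboardMap:
    image: str
    scale: float                        # ratio of real-world size to map image size

@dataclass
class VirtualEffect:
    kind: str
    sized_images: dict = field(default_factory=dict)

    def size_images(self, scales):      # 8f: prepare one image size per map scale
        self.sized_images = {s: f"{self.kind}_at_{s}.png" for s in scales}

@dataclass
class Gameboard:
    maps: list
    effects: list = field(default_factory=list)

def initialize_gameboard(gameboard):
    # 8a happens upstream: the Gameboard and its effects are read from XML.
    largest = max(gameboard.maps, key=lambda m: m.scale)            # 8c
    control = {"gameboard": gameboard, "overview_map": largest}     # 8b
    for effect in gameboard.effects:                                # 8d, 8e
        effect.size_images([m.scale for m in gameboard.maps])       # 8f
    return control                                                  # 8g: effects remain on the Gameboard

board = Gameboard(maps=[GameboardMap("near.png", 6770), GameboardMap("far.png", 13540)],
                  effects=[VirtualEffect("acorn"), VirtualEffect("dog")])
control = initialize_gameboard(board)
```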
  • the Gameboard map represents where the game is to be played. For example, if a Gameboard Author wanted to play Sidewalk Squirrel© within his or her locale, the Gameboard Map would encompass the neighborhood.
  • the annotation process as well as the individually specified map makes each game experience unique and customizable.
  • MapPoint® is available as a web service over the Internet. Besides map images, MapPoint® provides the northeast and southwest bounding coordinates. The coordinates are represented in latitude and longitude.
  • FIG. 9 illustrates the detailed process of Acquiring an External Map.
  • the process starts by entering a location represented as an address (9 a) and calling the external service (9 b). If the address is found by the external service, a center point represented in latitude and longitude is returned (9 c). To retrieve an actual map (9 e), the center point is used in subsequent calls to the service with the Gameboard's default scale and desired map size (9 d). To change the game area (9 i), the map service is called again with altered scale or center point to respectively alter size or location (9 j).
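  • A hedged sketch of this Acquire External Map flow (FIG. 9) is shown below. The MapService interface is an assumption standing in for an external mapping service such as MapPoint®; it is not that service's real API:

```python
# Sketch of the Acquire External Map flow (FIG. 9) against a generic map
# service interface. MapService is an assumed stand-in for an external
# service such as MapPoint, not that service's actual API.
from typing import Protocol, Tuple

class MapService(Protocol):
    def find_address(self, address: str) -> Tuple[float, float]: ...     # 9a-9c: (lat, lon) center
    def get_map(self, center: Tuple[float, float], scale: float,
                width_px: int, height_px: int) -> bytes: ...             # 9d-9e: map image

def acquire_external_map(service: MapService, address: str,
                         default_scale: float, width_px: int, height_px: int):
    center = service.find_address(address)                               # 9a-9c
    image = service.get_map(center, default_scale, width_px, height_px)  # 9d-9e
    # 9i-9j: to change the game area, call get_map again with an altered
    # scale (changes size) or an altered center point (changes location).
    return {"center": center, "scale": default_scale, "image": image}
```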
  • the intent of this patent's process and method is to author and interpret a location aware game.
  • the game itself will be played or experienced through moving around outdoors in the game area represented within the Gameboard.
  • the map service is not called during the Interpret Gameboard step. Maps are acquired upfront during the Author Gameboard step and managed within the Gameboard Map construct. Many Gameboard Maps can be associated with the Gameboard itself. To represent large game areas, the map service may be called multiple times using different scales (9 g) (9 h).
  • the default scale is 6770. Since the device display is 1.5 inches tall by 1.1 inches wide (10 a), the default game area will be 1.5 × 6770 inches by 1.1 × 6770 inches, or roughly 846 by 621 feet. To double the size of the game area, the scale can either be doubled to 13540, keeping the display size constant, or the size of the map can be doubled, keeping the scale constant. To allow offline scrolling and zooming, both maps are acquired. The resulting first map is 3 by 2.2 inches with a game area of 1692 by 1242 feet at the default scale of 6770 (10 b).
  • the second map is 1.5 by 1.1 inches, representing the same game area at a doubled scale of 13540 (10 c).
  • when zoomed in, the smaller scale map is used (10 d); when zoomed out, the larger scale map is used (10 e).
  • for the map that is larger than the device display (10 b), scrolling is used to view the game area by setting the map display offset (10 f) to the horizontal and vertical scroll values (10 g and 10 h).
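  • The FIG. 10 numbers can be reproduced with a small calculation, assuming the scale is a ratio of real-world inches to display inches:

```python
# Worked version of the FIG. 10 numbers, assuming the scale converts display
# inches to real-world inches (divide by 12 to express the result in feet).
def game_area_feet(display_h_in, display_w_in, scale):
    return (display_h_in * scale / 12, display_w_in * scale / 12)

print(game_area_feet(1.5, 1.1, 6770))    # about (846, 621) feet  -> default map (10 a)
print(game_area_feet(3.0, 2.2, 6770))    # about (1692, 1242) ft  -> doubled map size (10 b)
print(game_area_feet(1.5, 1.1, 13540))   # about (1692, 1242) ft  -> doubled scale (10 c)
```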
  • FIG. 11 further breaks down the Virtual Effect add and edit process of Gameboard Authoring. If the Virtual Effect is new, the desired type will be selected (11 a). If the Virtual Effect exists, it will be selected (11 a). Within this sub process, a Virtual Effect's location is designated (11 b) and attributes are set (11 c). If the Virtual Effect is new, it is added to the Gameboard. Designating the Virtual Effect's location is performed through selection of a point on the Gameboard's map image.
  • FIG. 12 shows the Gameboard Control during game editing.
  • Five of the six Virtual Effect types are represented on top of the Gameboard Control (12 a).
  • within Sidewalk Squirrel©, to add a Virtual Effect to the Gameboard, the Virtual Effect type is designated through a mouse click (12 b).
  • by clicking on the Gameboard Control (12 c), the Virtual Effect's location is designated.
  • the x and y coordinates of the screen are then translated to latitude and longitude coordinates using a screen-to-coordinate translation algorithm and held by the Virtual Effect itself.
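  • A minimal sketch of such a screen-to-coordinate translation follows, assuming a simple linear interpolation between the Gameboard's bounding coordinates (the patent does not spell out the exact algorithm, and the example coordinates are purely illustrative):

```python
# Assumed linear translation between a pixel position on the map image and
# latitude/longitude, using the Gameboard's north-east and south-west
# bounding coordinates. The example coordinates are purely illustrative.
def screen_to_lat_lon(x_px, y_px, width_px, height_px, ne, sw):
    ne_lat, ne_lon = ne
    sw_lat, sw_lon = sw
    lon = sw_lon + (x_px / width_px) * (ne_lon - sw_lon)
    lat = ne_lat - (y_px / height_px) * (ne_lat - sw_lat)   # screen y grows downward
    return lat, lon

# a click in the middle of a 240 x 268 pixel map image
print(screen_to_lat_lon(120, 134, 240, 268,
                        ne=(38.8960, -77.0780), sw=(38.8930, -77.0820)))
```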
  • the Virtual Effect's attributes are set.
  • FIG. 13 illustrates a dialog where Sidewalk Squirrel's© dog attributes are set.
  • the Gameboard Author can designate the dog's speed (13 a).
  • FIG. 14 illustrates the Interpret Gameboard process.
  • a Gameboard is initialized (14 a), a game timer is instantiated (14 b), a location receiver is initialized (14 c), the Gameboard Player is instantiated (14 d) and the game is played (14 e).
  • the Gameboard Player is the Virtual Effect subtype whose movement and location are driven by the location receiver's coordinates.
  • the sample Toolset, Sidewalk Squirrel©, implements a GPS device as the location receiver. Using the GPS device's coordinates, the Player Virtual Effect becomes a proxy within the game representing the actual Gameboard Player.
  • FIG. 15 illustrates a Sidewalk Squirrel© Gameboard Control with Virtual Effects instantiated.
  • the topmost Virtual Effect is the Player (the squirrel) (15 a).
  • the actual Gameboard Player (a human) is standing at the corner of Key Boulevard and North Danville Street in Arlington, Va.
  • FIG. 8 further breaks down the Initialize Gameboard process. This is the same process used during Gameboard Editing. After a Gameboard is instantiated, the Gameboard Control is created and given the Gameboard Map with the largest scale. The largest scale Gameboard Map is used to give the Gameboard Player a view of the entire game area. Virtual Effects are then read, instantiated and added to the Gameboard.
  • FIG. 16 further breaks down the Gameboard Interpreter's Play Game process.
  • Virtual Effects are continuously (16 b through 16 g) evaluated for movement (16 c), interaction (16 d), appearance (16 e) and audio (16 f).
  • a device specific timer is used (16 a). This process continues until a game over condition is reached and the timer is disabled (16 h). While playing, the actual Gameboard Player represented by the Player Virtual Effect is placed on the Gameboard with the other Virtual Effects. Over time, the Player Virtual Effect interacts with the other Virtual Effects by entering their zones.
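  • A minimal sketch of this Play Game loop (FIG. 16) is shown below. threading.Timer stands in for the device specific timer (16 a), and the effect interface names are assumptions:

```python
# Minimal sketch of the Play Game loop (FIG. 16): on each timer tick every
# Virtual Effect is evaluated for movement, interaction, appearance and
# audio until a game over condition disables the timer. threading.Timer is
# only a stand-in for the device specific timer (16 a).
import threading

class GameLoop:
    def __init__(self, effects, player, tick_seconds=0.5):
        self.effects, self.player = effects, player
        self.tick_seconds = tick_seconds
        self.game_over = False

    def tick(self):
        for effect in self.effects:              # 16 b .. 16 g
            effect.move(self.player)             # 16 c
            effect.interact(self.player)         # 16 d
            effect.appear()                      # 16 e
            effect.play_audio()                  # 16 f
        if self.player.lives <= 0:
            self.game_over = True                # 16 h: stop rescheduling the timer
        else:
            threading.Timer(self.tick_seconds, self.tick).start()
```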
  • the Play Game process treats Virtual Effects uniformly. Because the Virtual Effects implement their behavior traits differently, the game has a unique personality.
  • the dog Virtual Effect barks while attacking the Gameboard Player Virtual Effect. While attacking, the dog's image changes over time to give it a running appearance. If the dog enters the Player's zone, the interaction removes one of the three lives granted to the Player. Loss of all three lives evokes a game over condition and the timer stops.
  • acorns and bones are prizes. If the Player enters a prize Virtual Effect zone, the prize disappears (is acquired) and points are awarded.
  • Another Virtual Effect, the finish line, is also immobile. Entering the finish line's zone allows the Player to end the game a winner with remaining lives and points.
  • FIG. 17 illustrates how a compiled Toolset (17 b) interacts with the operating system (17 a) and device interfaces (17 d, 17 e, 17 f) on a small computing device for a single player.
  • the Toolset uses the mapping (17 e) and network interface (17 d) to retrieve external maps from the Internet (17 g).
  • GPS interfacing software (17 f) is used to receive coordinates from an internal or external GPS device (17 c).
  • Another embodiment of this invention allows communication between multiple players. This embodiment is illustrated in FIG. 18 and requires Gameboards and Virtual Effects to be synchronized across multiple executing Toolsets (18 a, 18 b) during Gameboard Interpretation.
  • This alternative embodiment supports an expanded number of Toolsets and Game themes. For example, a Toolset Creator could enhance a game of hide and seek or sharks and minnows. In both these Toolset examples, players could receive hints to where other players are hiding.
  • This invention's process and method as defined use a square zone represented by horizontal and vertical bounds.
  • Another embodiment could use any shape.
  • a zone could be represented as the area within a set of points, or the area could be defined by an equation, for example the elliptical region where x**2 + 2*y**2 <= r**2.
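  • One possible realization of such a non-rectangular zone, treating the zone as the set of points satisfying an inequality (the radius parameter is an illustrative assumption):

```python
# One way to realize a non-rectangular zone: the zone is the set of points
# satisfying an inequality, here the elliptical region x**2 + 2*y**2 <= r**2
# around the zone's center. The radius parameter r is an illustrative choice.
def in_elliptical_zone(x, y, center_x, center_y, r):
    dx, dy = x - center_x, y - center_y
    return dx**2 + 2 * dy**2 <= r**2
```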
  • Size used in this invention's example implementation is represented as pixels. In other embodiments, size could be represented in any unit of measure: feet, inches or meters. It is required though that size correlates to the actual size of the Virtual Effect image to give the Gameboard Player a realistic experience when interacting with the Virtual Effects within a game.
  • Visual display may not be required by some applications. For example, tours might completely rely on audio Virtual Effects.
  • Wizards may aid in the creation of Virtual Effects on a Gameboard. For example, following input from a Game Author, a Wizard could be instructed to place Virtual Effects on all street corners. Likewise, a Wizard could be used to generate dog Virtual Effects in Sidewalk Squirrel© during Gameboard annotation or interpretation.
  • this invention allows for the implementation of new location aware game types for small computing devices using location receivers.
  • the Toolsets created using this process and method will promote active rather than sedentary entertainment ultimately promoting more healthy lifestyles.

Abstract

Computer system for creating and playing location aware games. The creation process allows for authoring of custom location aware games, training aids, tours and scavenger hunts using a mobile computing device and map. The resulting game can be played using a mobile computing device and location receiver (e.g. GPS receiver).

Description

    BACKGROUND OF THE INVENTION—FIELD OF INVENTION
  • The present invention relates to systems, processes and methods that use GPS and small computing devices to create utility or entertainment value.
  • BACKGROUND OF THE INVENTION—PRIOR ART
  • The availability of small computing devices using increasingly accurate location receivers is boosting the popularity of location aware systems. In the context of this invention, a system becomes location aware by obtaining coordinates from a location receiver. Global Positioning System (GPS) receivers, Differential GPS (DGPS) receivers and some wireless networks provide these systems with location coordinates. Small computing devices include personal digital assistants (PDAs), Pocket PCs, Smart Phones and some pagers.
  • A category of location aware systems helps users find their way by relying on GPS, maps and navigation software. For such navigation systems, a user enters an address and is guided to that destination. Being location aware, the system can show the user in the context of a map: current location, where to go and past movements.
  • Navigation systems are sometimes coupled with databases of known municipal sites, landmarks and places of commerce. Said databases allow users to find places of interest near their location. Conversely, places of interest are also used as destinations.
  • Navigation systems provide useful directions from point A to point B and illustrate known places of interest. However, using these systems, one cannot personalize the journey. That is, using these systems, one cannot personally annotate the map with location specific ad-hoc virtual effects that get triggered as one uses the map. In the context of this invention, location specific ad-hoc virtual effects are location specific because they are tied to an actual location and corresponding point on a map. The effects are ad hoc because they are created at the discretion of the annotator or author of the personalized map. The effects are virtual because they are real only in the context of the personalized map. Finally, they are effects because their existence triggers a desired result. For example, a personalized map author may want to create a specialized map giving directions to his house. The specialized map could include video display or audio narration effects that are triggered when passing by places of interest. The places of interest may be trivial to the general public but have meaning to specific audiences. For example, a grandfather, giving his grandchildren directions to his house, might want to point out his first girlfriend's house or the tree that he hit while learning how to drive. The effects for these personal places of interest could include a picture of the girlfriend or wrecked car and an audio narration of the place and picture's significance.
  • U.S. Pat. No. 5,364,093 to Huston and U.S. Pat. No. 6,525,690 to Rudow introduce location aware systems used to enhance a game of golf. Like navigation systems, these systems show users in the context of a map, in this case a golf course. A unique and helpful aspect of these golf systems is the definition of zones and their use driving the player's display. Zones are geographic areas defined by bounds. The areas surrounding a golf tee or areas surrounding a putting green are zones. To define a zone, one needs to know the horizontal borders (latitude) and the vertical borders (longitude). Once entering a zone, the application triggers helpful hints within the context of the player's location. For example, entering a tee zone triggers the system to display a picture of the hole, distance to hazards and strategies for playing.
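  • As a hedged illustration of this prior art zone concept, a zone can be modeled as a latitude/longitude bounding box whose entry triggers a location specific hint (the names and coordinate values below are illustrative only):

```python
# Hedged sketch of the prior art zone concept: a zone is a latitude/longitude
# bounding box, and entering it triggers a location specific hint. Names and
# coordinate values are illustrative only.
from dataclasses import dataclass

@dataclass
class Zone:
    south: float    # horizontal borders (latitude)
    north: float
    west: float     # vertical borders (longitude)
    east: float
    hint: str

    def contains(self, lat, lon):
        return self.south <= lat <= self.north and self.west <= lon <= self.east

tee_zone = Zone(38.8930, 38.8935, -77.0825, -77.0820,
                "Par 4: water hazard at 210 yards, favor the left side")
if tee_zone.contains(38.8932, -77.0822):
    print(tee_zone.hint)    # e.g. show the hole picture, distances and strategy
```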
  • Like the navigation systems, the golf systems do not allow individuals to create personalized location aware experiences. For example, a Friday afternoon golfing league might want to annotate a golf course with entertaining challenges that change each week. A possible challenge for a certain outing could have the golfers watch a video of a famous golfer getting out of a treacherous sand trap. The challenge would be to imitate the shot and try to better the famous golfer's result. This virtual effect, the video and instructions, would be triggered as the golfers entered a zone marking the bounds around the trap area. That is, the challenge would be triggered as they approached the trap.
  • US Patent Application 20050049022 to Mullen introduces the creation of location aware games given a location. This application is limited in that the game takes no account of parks or streets within the game area. The PACS in the PACMAN game referenced in the application should follow a street or a path within a park. This can only be done if the game creator is allowed to personally annotate the Gameboard.
  • U.S. Pat. No. 6,691,032 to Irish introduces a process and method for creating and executing user-definable events triggered through location data describing zones of influence. Using the invention, one could create an annotated golf course and annotated maps as previously described. However, Irish's invention does not allow for the easy creation of these games.
  • Irish's invention requires a game creation sub process before the game loop is executed. The creation sub process is named “Define Global Cartridge Settings” and includes the following steps and sub steps allowing for the detailed specification of game personalization:
    • Define zones of influence
    • Define items
    • Define events
    • Define non-player characters
    • Compile cartridge
  • Within these steps, users define attributes, conditions and events, allowing flexibility in Irish's invention. This flexibility also adds to the complexity involved in creating a personalized experience. Within the patent, Irish's example application is written in a third generation programming language to allow the user to express the conditional logic and events required to create the experience. Within the "Define Global Cartridge Settings" process, this source code is compiled into a cartridge before it can be executed. It is the principal object of this invention to disclose a computer system that hides this complexity, making it easier to create and play personalized location aware experiences.
  • BACKGROUND OF THE INVENTION—OBJECT AND ADVANTAGES
  • Another approach to the same problem addressed in Irish's invention is to split game creation and playing into three sub processes. FIG. 1 illustrates three sub processes comprising this invention. The first two sub processes allow for game creation. The first sub process, Develop Toolset (1 a), involves the creation and compilation of a computer system that contains the complexity of a location independent, functionally specific game type. In this step, Virtual Effects are defined with behavior traits and without location. To implement the Develop Toolset computer system, a programming language is required to express the complexity of conditional logic and events.
  • The second sub process, Author Gameboard (1 b), uses the resulting Toolset and a map to create and annotate a location specific instance of the game. Within the context of this invention, the output of the second process is called a Gameboard. To author a Gameboard, the Toolset's predefined Virtual Effects are selected and placed on the Gameboard. If needed, the Virtual Effects' attributes are changed. The Author Gameboard step is unique and easier than Irish's game creation process. Within this invention, for this sub process, user-definable events and conditional logic are not defined.
  • Because of its simplicity, the Toolset can be used en masse: Gameboards are easily authored and interpreted. That is, once one golf course Toolset is created, an unlimited number of golf courses could be personalized with the Toolset's predefined Virtual Effects. Toolsets can also be created capturing the logic of a game, training aid, tour or scavenger hunt.
  • SUMMARY
  • The present invention allows for the annotation and interpretation of Virtual Effects on a map. Applied as intended in its primary embodiment, the invention facilitates the simple creation of custom location aware games, training aids, tours and scavenger hunts using a mobile computing device and map. The interpretation process allows the same to be played or experienced using a mobile computing device and location receiver (e.g. GPS receiver).
  • This invention allows for the implementation of a new type of game: one where players experience a virtual reality interacting with Virtual Effects within their own neighborhoods. Compared to traditional computer games, these games will promote imagination and active rather than sedentary entertainment ultimately promoting more healthy lifestyles.
  • Glossary (Definition List 1):
    • Gameboard: A virtual playfield corresponding to an actual playfield. The Gameboard has a map and geographical boundaries corresponding to the actual playfield. The Gameboard also contains one or more Virtual Effects.
    • Gameboard Editor: Part of a Toolset allowing a user to author and edit a game consisting of a Gameboard and associated Virtual Effects.
    • Gameboard Interpreter: Part of a Toolset allowing a user to play a user defined game consisting of a previously defined Gameboard and Virtual Effects.
    • Instantiation: Programming term for allocating memory on a computing device for an object used within a computing system or method.
    • Sidewalk Squirrel©: A child's game and example application of this invention.
    • Toolset: A computer system containing a Gameboard Editor and Gameboard Interpreter.
    • Toolset Creator: A software developer creating an application of this invention.
    • Toolset User: An individual who either authors or interprets a Gameboard.
    • Virtual Effect: An effect having appearance, size, audio and interaction rules that is virtual because it is real only in the context of a Gameboard.
  • DRAWINGS
  • FIG. 1—Toolset Creation and Use
  • FIG. 2.1—Toolset Class Diagram
  • FIG. 2.2—Virtual Effect Sub Types Class Diagram
  • FIG. 3—Gameboard XML Document
  • FIG. 4—Virtual Effect Default Images
  • FIG. 5—Create Toolset Process
  • FIG. 6—Toolset User Processes
  • FIG. 7—Author Gameboard Process
  • FIG. 8—Initialize Gameboard Process
  • FIG. 9—Acquire External Map Process
  • FIG. 10—Multiple Map Image Example
  • FIG. 11—Add and Edit Virtual Effect Process
  • FIG. 12—Gameboard Control During Author Gameboard Process
  • FIG. 13—Set Virtual Effect Attributes
  • FIG. 14—Flow of the Gameboard Interpreter
  • FIG. 15—Gameboard Control during Interpret Gameboard Process
  • FIG. 16—Play Game Process
  • FIG. 17—Toolset, GPS receiver, GPS receiver interface, mapping interface, Internet and network interface contained in a small computing device
  • FIG. 18—Additional Embodiment with Multiple Players
  • DETAILED DESCRIPTION
  • This invention defines a computer system for annotating and interpreting predefined, location specific Virtual Effects onto a map. The annotation process allows for the simple creation and editing of custom location aware games, training aids, tours and scavenger hunts using a mobile computing device and map. The interpretation process allows same to be played or experienced using a mobile computing device and location receiver (e.g. GPS receiver).
  • FIG. 1 illustrates this invention's high level process. In the first sub process (1 a) a Toolset consisting of a Gameboard Editor, Gameboard Interpreter, Gameboard Control, Game, Virtual Effects and other supporting constructs is created and compiled. The second sub process (1 b), Author Gameboard, uses the Gameboard Editor of the Toolset to create and annotate a virtual Gameboard. Once created, the Gameboard can be saved and played repeatedly using the Gameboard Interpreter (1 c). In the context of this invention, the toolset is created by a Toolset Creator. The Toolset Creator is typically one or many software developers. The next two steps are executed by Toolset Users. Toolset Users are application users who are not required to have programming skills. There are two types of Toolset Users: a Gameboard Author who creates the Gameboard and a Gameboard Player who plays the resulting game. During the Interpret Gameboard step, the Game becomes location aware using a location receiver to obtain coordinates of the Gameboard Player. The Gameboard Player's coordinates are used by the Gameboard Interpreter to place the Player into the virtual Gameboard and allow interaction with Virtual Effects.
  • To aid the reader of this patent application, nouns representing new concepts within this invention are capitalized. For example, Gameboard Editor, Gameboard and Virtual Effect are capitalized. A map image is used within this invention but it is not a new concept and not capitalized.
  • The process and method defined in this invention allows for many different Toolsets or types of games. An example Toolset, Sidewalk Squirrel© from Sneaker Entertainment©, is used throughout this patent application to demonstrate the invention. To play Sidewalk Squirrel©, the Gameboard Author annotates a Gameboard with predefined Virtual Effects representing acorns, bones, dogs and other items. The Gameboard Player using a GPS device is represented within the game as a squirrel. Playing Sidewalk Squirrel©, the Gameboard Player (the squirrel) collects the Gameboard's acorns and bones for points while avoiding or eliminating attacking dogs.
  • The detailed description section of this patent documents this invention from three perspectives: Toolset, Author Gameboard and Interpret Gameboard. The Toolset section defines the constructs used within the invention while the Author and Interpret Gameboard sections introduce the processes that use the constructs. Other Embodiments is presented as the final section.
  • DETAILED DESCRIPTION—TOOLSET
  • FIG. 2.1 illustrates a simplified implementation of the Toolset represented as a class diagram. The legend to FIG. 2.1 (FIG. 2.1 Legend) defines the three sections of the class description: Class Name, Attributes and Methods. Within the Toolset, the Gameboard Editor (2.1 a) and Gameboard Interpreter (2.1 g) are responsible for the Author Gameboard and Interpret Gameboard processes respectively. To launch these processes, the Gameboard Editor has methods (2.1 c) to create a new Gameboard, open an existing Gameboard and save a Gameboard. The Gameboard Interpreter has a similar open Gameboard method. The Gameboard Interpreter also has methods for playing, pausing and stopping the game (2.1 i). Both of these constructs have references to (2.1 b, 2.1 h) and rely on the Gameboard Control (2.1 d) to visually represent the Gameboard (2.1 m) itself. As such, the Gameboard Control has a reference (2.1 e) to the Gameboard and methods to paint the Gameboard image and to zoom and scroll (2.1 f). A collection of Virtual Effects (2.1 s) is managed by the Gameboard. Similar to the Gameboard Control's function, Virtual Effect Controls (2.1 j) represent the physical properties of Virtual Effects. Like the Gameboard Control, Virtual Effect Controls have references to their respective Virtual Effects (2.1 k) and a method to paint the image (2.1l). Within the Paint Gameboard method (2.1 f) of the Gameboard Control, after the map image is presented on the physical device display, each of the referenced Virtual Effect Controls (2.1 e) is required to do the same.
  • Different hardware platforms require different implementations to drive respective hardware components. In the art, to keep a separation between platform independent components and platform specific components, functionality is sometimes split into two classes. Within FIG. 2.1, Gameboard Control (2.1 d) and Virtual Effect Control (2.1 j) represent platform specific implementations of the Gameboard and Virtual Effect respectively. As such, these classes are optional. Platform specific instructions within the Gameboard and Virtual Effect could be implemented in the Gameboard and Virtual Effect classes.
  • In detail, a Gameboard (2.1 m) consists of (2.1 n) an address, a game area center location represented as longitude and latitude, north-east and south-west bounding coordinates represented as longitude and latitude, a game area size represented as a length and width, a default scale representing a ratio of real world measurements to that of a map image, a collection of Virtual Effects (2.1 s) and a collection of Gameboard Maps (2.1 p) each with a map image and scale (2.1 q).
  • In detail, a Virtual Effect (2.1 s) consists of a list of images, a list of audio files, a size represented as a length and width, a location (2.1 t) and four methods defining the Virtual Effect's behavior: movement, interaction, appearance and audio (2.1 u). To create different effects for different games, the four methods are overridden within Virtual Effect subtypes (2.2 d, 2.2 g, 2.2 j). For example, the audio trait of the dog (2.2 g) Virtual Effect is repetitive barking. The prize Virtual Effect (2.2 d) is silent until it is acquired, when it "dings". Since both the prize and dog are sub classes of Virtual Effect, they are treated exactly the same by the Author Gameboard and Interpret Gameboard processes. In the art, this technique is called polymorphism. Polymorphism allows the Author Gameboard and Interpret Gameboard processes to work regardless of the detailed behavior traits of the Virtual Effects, which in turn allows the creation of many Toolsets or game types under this invention's process and method. It is the responsibility of the Game Creator to ensure that all Virtual Effect subclasses (2.2 a) implement all necessary methods so that the game can be edited and interpreted.
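  • A hedged Python sketch of this polymorphic structure is shown below; the class and attribute names are illustrative assumptions rather than the Toolset's actual code:

```python
# Hedged sketch of the polymorphic Virtual Effect structure: subtypes override
# the four behavior methods while the editor and interpreter treat every
# effect through the same base interface. Names are illustrative assumptions.
class VirtualEffect:
    def __init__(self, images, audio_files, size, location=None):
        self.images, self.audio_files = images, audio_files
        self.size, self.location = size, location    # location is assigned during authoring

    def move(self, player): pass                     # movement behavior, overridden per subtype
    def interact(self, player): pass                 # interaction behavior
    def appear(self): return self.images[0]          # appearance behavior
    def play_audio(self): pass                       # audio behavior

class Prize(VirtualEffect):
    def __init__(self, images, audio_files, size, points, location=None):
        super().__init__(images, audio_files, size, location)
        self.points, self.acquired = points, False

    def interact(self, player):
        if not self.acquired and player.intersects(self):
            self.acquired = True                     # the prize disappears
            player.score += self.points

    def play_audio(self):
        if self.acquired:
            print("ding")                            # silent until acquired

class Dog(VirtualEffect):
    def __init__(self, images, audio_files, size, speed, location=None):
        super().__init__(images, audio_files, size, location)
        self.speed = speed                           # authored attribute (2.2 f)

    def play_audio(self):
        print("bark")                                # repetitive barking
```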
  • Within Sidewalk Squirrel©, the dog Virtual Effect attacks the Player (the squirrel). To characterize the dog's movement, the Virtual Effect images capture different positions of a running dog. Within the movement method, the dog's speed attribute (2.2 f) is used to calculate the Virtual Effect's next location. Over time, using speed and a rotating image, the dog's movement behavior is represented.
  • Similar to the dog's movement method, the audio method uses the list of audio files to represent an excited, barking dog. To make the barking realistic, the audio files are rotated at random. The effect is sporadic barking of different volumes and pitches similar to that of an attacking dog.
  • Virtual Effects also have a size and location used for intersection detection (2.1 t). Within Sidewalk Squirrel©, size is represented as a length and width in pixels corresponding to the size of the Virtual Effect's image. Virtual Effect location is represented in longitude and latitude. Described in prior art terms, each Virtual Effect represents a location independent zone with self contained conditional logic that triggers effects. Virtual Effects stay location independent until used in the Author Gameboard process, where they are assigned actual locations.
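  • The following sketch illustrates the two mechanics just described, under the assumption that movement and intersection are computed in a simple local coordinate frame; it is not the patent's actual implementation:

```python
# Two mechanics in one sketch, assuming a simple local x/y coordinate frame
# derived from latitude/longitude: a dog-style effect steps toward the player
# at its speed, and intersection is a bounding-box test over size and location.
import math

def step_toward(pos, target, speed):
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

def intersects(pos_a, size_a, pos_b, size_b):
    # sizes are (width, height); boxes are centered on their locations
    return (abs(pos_a[0] - pos_b[0]) <= (size_a[0] + size_b[0]) / 2 and
            abs(pos_a[1] - pos_b[1]) <= (size_a[1] + size_b[1]) / 2)

dog, squirrel = (0.0, 0.0), (30.0, 40.0)
for _ in range(3):
    dog = step_toward(dog, squirrel, speed=5.0)       # over time the dog closes in
print(intersects(dog, (16, 16), squirrel, (16, 16)))  # False: zones do not yet overlap
```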
  • FIG. 4 illustrates the appearance of 6 types of Virtual Effects used within Sidewalk Squirrel©: acorn (4 a), bone (4 b), finish line (4 c), stop sign (4 d), dog (4 e) and squirrel (4 f). Within FIG. 3, acorns, bones, the finish line and stop sign are prizes of different types (2.2 b, 2.2 c). The dog (2.2 e) and player (2.2 h) are named accordingly.
  • Within Sidewalk Squirrel©, the Virtual Effect types and behavior traits are as follows (a short sketch of these interaction rules appears after the list):
      • Acorn
        • Immobile
        • Intersection with Player awards points
        • Acorn appearance determined from Prize Type attribute (2.2 c) and doesn't change until intersection, then Acorn disappears
        • A “ding” sound is triggered by an intersection with Player
      • Bone
        • Immobile
        • Intersection gives Player Bone and awards points to Player
        • Bone appearance determined from Prize Type attribute (2.2 c) and doesn't change until intersection, then Bone disappears
        • A “ding” sound is triggered by an intersection with Player
      • Dog
        • Moves towards Player at designated speed (2.2 f)
        • Intersection removes one life from Squirrel unless Bone is given to dog prior to intersection. Loss of all lives ends game.
        • Running appearance simulated by changing stride images
        • Dog barks while chasing Player and gulps when receiving Bone
      • Stop sign
        • Immobile
        • Intersection with Player suspends game until Player leaves stop area
        • Stop sign appearance determined from Prize Type attribute (2.2 c) and never changes
        • No audio
      • Finish line
        • Immobile
        • Intersection with Player awards point and ends game
        • Finish line appearance determined from Prize Type attribute (2.2 c) and never changes
        • Intersection with Player triggers applause
      • Squirrel
        • Moves in accordance with coordinates taken from GPS device
        • Interactions defined within other Virtual Effects
        • Appearance changes when Dog nears and when Dog intersects
        • Dog intersection triggers “ouch” sound and subtracts from Lives attribute (2.2 i).
        • Prize intersection adds to Score attribute (2.2 i).
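  • The sketch below condenses the interaction rules listed above into a single dispatch function over simple state. Field names and the point value are assumptions for illustration:

```python
# Compact sketch of the interaction rules above, written as a single dispatch
# over simple dictionaries. Field names and the point value are illustrative
# assumptions, not values taken from the patent.
def on_intersection(effect_type, player, game):
    if effect_type in ("acorn", "bone"):
        player["score"] += 10                      # points value is illustrative
        if effect_type == "bone":
            player["bones"] += 1                   # a bone can later be given to a dog
        game["sound"] = "ding"
    elif effect_type == "dog":
        if player["bones"] > 0:
            player["bones"] -= 1                   # giving a bone spares a life
            game["sound"] = "gulp"
        else:
            player["lives"] -= 1
            game["sound"] = "ouch"
            game["over"] = player["lives"] <= 0    # loss of all lives ends the game
    elif effect_type == "stop sign":
        game["suspended"] = True                   # resumes when the Player leaves the zone
    elif effect_type == "finish line":
        game["sound"] = "applause"
        game["over"] = True                        # the Player ends the game a winner

player = {"score": 0, "lives": 3, "bones": 0}
game = {"over": False, "suspended": False, "sound": None}
on_intersection("bone", player, game)
on_intersection("dog", player, game)               # the bone is given, no life lost
```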
  • Gameboard (2.1 m), Gameboard Map (2.1 p) and Virtual Effect (2.1 s) each have methods for saving XML and reading from XML (2.1 o, 2.1 r, 2.1 u) allowing the Gameboard to be persisted, edited and interpreted when desired. XML is an acronym for extensible markup language and is frequently used in the art for persistence and process interaction. FIG. 3 represents a Gameboard XML document created by a Sidewalk Squirrel© Toolset. To create the document, the Gameboard's ToXML method (2.1 o) is called from Gameboard Editor's Save Gameboard method (2.1 c) which is called within the Author Gameboard process (7 e). The Gameboard's ToXML method creates sections of XML (3 a through 3 e) corresponding to the attributes managed by the Gameboard (2.1 n). The Gameboard's ToXML process then iterates through the Gameboard Maps and Virtual Effects calling their respective ToXML methods (2.1 r, 2.1 u) to create their corresponding XML sections (3 f and 3 g through 3 j). The FromXML methods are similar to the ToXML methods except that the XML is read rather than written. The FromXML method within the Gameboard (2.1 o) is called from the Initialize Gameboard process (8 a) in both the Author Gameboard (7 a) and Interpret Gameboard (14 a) processes.
  • Following the same polymorphic technique as the behavior trait methods, Virtual Effect subtypes override their parent's persistence methods (2.2 d, 2.2 g, 2.2 j) writing and reading different XML sections for each subtype (3 g through 3 j). This allows the dog XML segment (3 i) to have a speed attribute corresponding to the speed attribute in the dog Virtual Effect subtype (2.2 f).
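  • As an illustration of this override pattern only, the following sketch uses Python's xml.etree to show a parent type writing common XML and a dog subtype appending its own speed element; the element names, method names and sample coordinates here are assumptions for the example and do not reflect the actual Gameboard schema:
     import xml.etree.ElementTree as ET

     class VirtualEffectXml:
         def __init__(self, latitude, longitude):
             self.latitude, self.longitude = latitude, longitude

         def to_xml(self):
             # Common persistence handled by the parent type.
             elem = ET.Element("VirtualEffect", type=type(self).__name__)
             ET.SubElement(elem, "Location",
                           latitude=str(self.latitude), longitude=str(self.longitude))
             return elem

     class DogXml(VirtualEffectXml):
         def __init__(self, latitude, longitude, speed):
             super().__init__(latitude, longitude)
             self.speed = speed

         def to_xml(self):
             # The subtype overrides its parent's persistence method and appends the
             # attribute unique to the dog subtype (speed), as with 2.2 f and 3 i.
             elem = super().to_xml()
             ET.SubElement(elem, "Speed").text = str(self.speed)
             return elem

         @classmethod
         def from_xml(cls, elem):
             loc = elem.find("Location")
             return cls(float(loc.get("latitude")), float(loc.get("longitude")),
                        float(elem.find("Speed").text))

     def gameboard_to_xml(name, default_scale, effects):
         # The Gameboard writes its own attributes, then delegates to each
         # Virtual Effect's persistence method, mirroring the ToXML flow above.
         board = ET.Element("Gameboard", name=name, defaultScale=str(default_scale))
         for effect in effects:
             board.append(effect.to_xml())
         return ET.tostring(board, encoding="unicode")

     print(gameboard_to_xml("Sidewalk Squirrel", 6770, [DogXml(38.894, -77.091, speed=2.0)]))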
  • Just as Virtual Effects define the Toolset's personality through their behavior methods, the Gameboard Control manages the Toolset's game level traits through specialized displays. Sidewalk Squirrel© is a simple game where points are won and lives are lost. To implement this behavior, the Gameboard Control manages a score board that displays the score and remaining lives. Other embodiments may require different game level displays. For example, a scavenger hunt Toolset may require a display of elapsed time or time remaining in the game.
  • During Gameboard Interpretation three additional constructs are used: a Game (2.1 v), a timer and a location receiver. During interpretation, the timer is used to set the cadence of the game. The timer is also the mechanism used to start, stop and pause the game. During interpretation, the location receiver provides coordinates to the Toolset, allowing the game to become location aware. The Game (2.1 v) manages references to the timer and location receiver (2.1 w). As such, the Game construct is optional; the references can instead be managed within the Gameboard or Gameboard Interpreter.
  • Similar to the Game, the Player Virtual Effect (2.2 h) is only instantiated during Gameboard Interpretation. Within Sidewalk Squirrel©, the Player holds score and lives attributes (2.2 i). Similar to other Virtual Effects, the Player has methods that define behavior traits (2.2 j). However, since the Game and Player Virtual Effect are not part of the Gameboard itself, these objects are not required to be persisted or represented as XML. Other embodiments of the Toolset may change this implementation. For example, a multiple-player Toolset requires an XML representation to communicate Game and Player attributes between Players.
  • For any implementation, this invention requires but does not include a process for handling events, interfacing with device drivers or managing time. Sidewalk Squirrel© uses Microsoft's© Compact Framework© to handle these tasks. The Acquire External Map process and the location receiver also require external processes, which are introduced in later Detailed Description sections.
  • FIG. 5 illustrates the Create Toolset process. In this process, Virtual Effects (5 a) are created and the Gameboard Control (5 b) is altered to represent the Toolset theme. In any embodiment, the Toolset, consisting of a Gameboard Editor, Gameboard Interpreter, Gameboard Control, Game, Virtual Effects and supporting external processes, is compiled into an executable (5 c) which can be distributed en masse (5 d). The compiled Toolset executable is used by the Gameboard Author for Gameboard Authoring and by the Gameboard Player for Gameboard Interpretation.
  • DETAILED DESCRIPTION—AUTHOR GAMEBOARD
  • FIG. 6 illustrates the high level processes of a Toolset User. After the Toolset is Acquired (6 a), the Toolset User can Author a Gameboard (6 b). Because Toolsets are location independent, Gameboards can be authored anywhere a map is available. Once the Gameboard is authored it can be interpreted again and again (6 c) (6 d).
  • FIG. 7 illustrates the Author Gameboard process. Using the Gameboard Editor, a Gameboard is either constructed new or initialized from an existing Gameboard (7 a). A map is then acquired externally (7 b), and the Gameboard is annotated by adding Virtual Effects (7 c) until the Gameboard Author is satisfied (7 d). Once the Gameboard Author is satisfied, the Gameboard is saved for later use (7 e).
  • FIG. 8 further breaks down the Initialize Gameboard process. After a Gameboard is read from XML (8 a), the Gameboard Control is created (8 b) and given the Gameboard Map with the largest scale (8 c). The largest scale Gameboard Map is used to give the Gameboard Player a view of the entire game area. Each Virtual Effect is then read (8 d) and instantiated (8 e), its images are sized for each Gameboard Map scale (8 f), and it is added to the Gameboard (8 g).
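  • For illustration, the Initialize Gameboard steps (8 a through 8 g) can be outlined as below; the gameboard and control objects and their members are stand-ins assumed for the sketch rather than the Toolset's actual classes:
     def initialize_gameboard(gameboard, make_control):
         # 8 a: `gameboard` has already been read from XML by the caller.
         control = make_control()                                      # 8 b: create Gameboard Control
         # 8 c: give the control the largest-scale map so the entire game area is visible.
         control.set_map(max(gameboard.maps, key=lambda m: m.scale))
         for element in gameboard.effect_elements:                     # 8 d: read each Virtual Effect
             effect = gameboard.instantiate_effect(element)            # 8 e: instantiate it
             effect.size_images([m.scale for m in gameboard.maps])     # 8 f: size images per map scale
             gameboard.add(effect)                                     # 8 g: add it to the Gameboard
         return control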
  • In all embodiments of the game, the Gameboard map represents where the game is to be played. For example, if a Gameboard Author wanted to play Sidewalk Squirrel© within his or her locale, the Gameboard Map would encompass the neighborhood. The annotation process as well as the individually specified map makes each game experience unique and customizable.
  • The example game, Sidewalk Squirrel© uses Microsoft's© MapPoint© to provide Gameboard Maps within the Acquire External Map method (7 b). MapPoint© is available as a web service over the Internet. Besides map images, MapPoint© provides the northeast and southwest bounding coordinates. The coordinates are represented in latitude and longitude.
  • FIG. 9 illustrates the detailed process of Acquiring an External Map. The process starts by entering a location represented as an address (9 a) and calling the external service (9 b). If the address is found by the external service, a center point represented in latitude and longitude is returned (9 c). To retrieve an actual map (9 e), the center point is used in subsequent calls to the service with the Gameboard's default scale and desired map size (9 d). To change the game area (9 i), the map service is called again with an altered scale or center point to change the size or location of the game area, respectively (9 j).
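  • The map acquisition flow (9 a through 9 j) might be sketched as follows. The geocode and render_map callables are placeholders for calls to an external map service such as MapPoint©; they are assumptions for the example and are not the MapPoint© interface itself:
     def acquire_map(address, geocode, render_map, default_scale=6770,
                     width_px=110, height_px=150):
         center = geocode(address)                  # 9 a-9 c: address -> (latitude, longitude) center
         if center is None:                         # address not found by the external service
             return None, None
         # 9 d-9 e: request a map image around the center at the Gameboard's default scale.
         return center, render_map(center=center, scale=default_scale,
                                   width=width_px, height=height_px)

     def change_game_area(center, scale, render_map, factor=2.0):
         # 9 i-9 j: calling the service again with an altered scale (or center point)
         # changes the size (or location) of the game area.
         return render_map(center=center, scale=scale * factor, width=110, height=150)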
  • The intent of this patent's process and method is to author and interpret a location aware game. The game itself will be played or experienced by moving around outdoors in the game area represented within the Gameboard. Because Internet access is not always available outdoors, the map service is not called during the Interpret Gameboard step. Maps are acquired up front during the Author Gameboard step and managed within the Gameboard Map construct. Many Gameboard Maps can be associated with the Gameboard itself. To represent large game areas, the map service may be called multiple times using different scales (9 g, 9 h).
  • To make it easy to scroll and zoom when larger scales are represented, multiple maps of varying sizes and scales are acquired and managed within the Gameboard. For example, within FIG. 10 the default scale is 6770. Since the device display is 1.5 inches tall by 1.1 inches wide (10 a), the default game area will be 1.5×6770 by 1.1×6770 inches, or approximately 846 by 621 feet. To double the size of the game area, either the scale can be doubled to 13540 keeping the display size constant, or the size of the map can be doubled keeping the scale constant. To allow offline scrolling and zooming, both maps are acquired. The resulting first map is 3 by 2.2 inches with a game area of 1692 by 1242 feet at the default scale of 6770 (10 b). The second map is 1.5 by 1.1 inches representing the same game area at a doubled scale of 13540 (10 c). To zoom in, the smaller scale map is used (10 d), and to zoom out, the larger scale map is used (10 e). When the map image is larger than the device display (10 b), scrolling is used to view the game area by setting the map display offset (10 f) to the horizontal and vertical scroll values (10 g and 10 h).
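  • The arithmetic above can be checked with a short calculation, assuming the scale expresses real-world inches per display inch (so that game-area feet equal display inches times scale divided by 12):
     def game_area_feet(display_width_in, display_height_in, scale):
         # Convert display inches at the given scale into real-world feet.
         return (display_width_in * scale / 12.0, display_height_in * scale / 12.0)

     print(game_area_feet(1.1, 1.5, 6770))    # default map: about (621, 846) feet
     print(game_area_feet(1.1, 1.5, 13540))   # same display, doubled scale: about (1241, 1692) feet
     print(game_area_feet(2.2, 3.0, 6770))    # doubled map size, same scale: about (1241, 1692) feet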
  • To present realistic Virtual Effect images, as the Gameboard is zoomed using different map images, different Virtual Effect images are used corresponding to the map scale. These images are sized during the Size Images for Map Scales step (8 f) of the Initialize Gameboard process (FIG. 8).
  • FIG. 11 further breaks down the Virtual Effect add and edit process of Gameboard Authoring. If the Virtual Effect is new, the desired type is selected (11 a). If the Virtual Effect exists, it is selected (11 a). Within this sub-process, a Virtual Effect's location is designated (11 b) and attributes are set (11 c). If the Virtual Effect is new, it is added to the Gameboard. Designating the Virtual Effect's location is performed through selection of a point on the Gameboard's map image. The selected point, represented in pixels (x and y) within the device specific Gameboard Control, is then translated to a latitude and longitude location using the Gameboard's bounding Northeast (ne) and Southwest (sw) coordinates within the following algorithm:
    Latitude=ne.latitude−y*absolutevalue((ne.latitude−sw.latitude)/displayheight)
    Longitude=sw.longitude+x*absolutevalue((ne.longitude−sw.longitude)/displaywidth)
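  • The translation algorithm above can be expressed directly in code; the bounding coordinates and display dimensions in the example are illustrative only:
     def pixel_to_lat_lon(x, y, ne, sw, display_width, display_height):
         # Translate a selected point in pixels to latitude/longitude using the
         # Gameboard's northeast (ne) and southwest (sw) bounding coordinates.
         # `ne` and `sw` are (latitude, longitude) pairs.
         latitude = ne[0] - y * abs((ne[0] - sw[0]) / display_height)
         longitude = sw[1] + x * abs((ne[1] - sw[1]) / display_width)
         return latitude, longitude

     # Example: a 110 x 150 pixel map with illustrative bounding coordinates.
     ne = (38.8960, -77.0890)   # northeast corner (lat, lon)
     sw = (38.8930, -77.0930)   # southwest corner (lat, lon)
     print(pixel_to_lat_lon(55, 75, ne, sw, display_width=110, display_height=150))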
  • FIG. 12 shows the Gameboard Control during game editing. Five of the six Virtual Effect types are represented on top of the Gameboard Control (12 a). Within Sidewalk Squirrel©, to add a Virtual Effect to the Gameboard, the Virtual Effect type is designated through a mouse click (12 b). The Virtual Effect's location is then designated by clicking on the Gameboard Control (12 c). The x and y coordinates of the screen are then translated to latitude and longitude coordinates using the above algorithm and held by the Virtual Effect itself. After the location is designated, the Virtual Effect's attributes are set. FIG. 13 illustrates a dialog where Sidewalk Squirrel's© dog attributes are set. Within this example, the Gameboard Author can designate the dog's speed (13 a).
  • By restricting the Gameboard Author to simply designating the location and attribute values of the Virtual Effect, the Author Gameboard process is greatly simplified. Conversely, the Gameboard Author is restricted to the creativity of the Toolset Creator for types and behavior of Virtual Effects.
  • DETAILED DESCRIPTION—INTERPRET GAMEBOARD
  • FIG. 14 illustrates the Interpret Gameboard process. Using the Gameboard Interpreter, a Gameboard is initialized (14 a), a game timer is instantiated (14 b), a location receiver is initialized (14 c), the Gameboard Player is instantiated (14 d) and the game is played (14 e). The Player Virtual Effect is the Virtual Effect subtype whose movement and location are driven by the location receiver's coordinates. The sample Toolset, Sidewalk Squirrel©, implements a GPS device as the location receiver. Using the GPS device's coordinates, the Player Virtual Effect becomes a proxy within the game representing the actual Gameboard Player.
  • This invention uses but does not address GPS. The example game, Sidewalk Squirrel© utilizes StormSource Software's© GPS.NET© application to provide connectivity to GPS receivers. During a game, GPS.NET© coordinates and levels of confidence are read into the Game itself. In turn, the Game passes these coordinates to the Player Virtual Effect. FIG. 15 illustrates a Sidewalk Squirrel© Gameboard Control with Virtual Effects instantiated. The topmost Virtual Effect is the Player (the squirrel) (15 a). In this example, the actual Gameboard Player (a human), is standing at the corner of Key Boulevard and North Danville Street in Arlington, Va.
  • FIG. 8 further breaks down the Initialize Gameboard process. This is the same process used during Gameboard Editing. After a Gameboard is instantiated, the Gameboard Control is created and given the Gameboard Map with the largest scale. The largest scale Gameboard Map is used to give the Gameboard Player a view of the entire game area. Virtual Effects are then read, instantiated and added to the Gameboard.
  • FIG. 16 further breaks down the Gameboard Interpreter's Play Game process. Within this process, Virtual Effects are continuously (16 b, 16 g) evaluated for movement (16 c), interaction (16 d), appearance (16 e) and audio (16 f). To implement the cadence of Virtual Effect evaluation, a device specific timer is used (16 a). This process continues until a game over condition is reached and the timer is disabled (16 h). While playing, the actual Gameboard Player, represented by the Player Virtual Effect, is placed on the Gameboard with the other Virtual Effects. Over time, the Player Virtual Effect interacts with the other Virtual Effects by entering their zones.
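  • A minimal sketch of such a timer-driven evaluation loop is given below, assuming duck-typed gameboard, player and Virtual Effect objects with the illustrative members shown (lat_extent, lon_extent, evaluate_movement and so on); it is not the Toolset's actual implementation:
     import threading

     class PlayGame:
         def __init__(self, gameboard, player, tick_seconds=1.0):
             self.gameboard, self.player = gameboard, player
             self.tick_seconds = tick_seconds
             self.timer = None

         def _intersects(self, effect):
             # Simplified zone test: lat_extent/lon_extent are the zone's assumed
             # half-sizes in degrees around the effect's location.
             return (abs(effect.latitude - self.player.latitude) < effect.lat_extent and
                     abs(effect.longitude - self.player.longitude) < effect.lon_extent)

         def _tick(self):
             for effect in self.gameboard.effects:          # 16 b-16 g: evaluate every effect
                 effect.evaluate_movement(self.player)      # 16 c: movement
                 if self._intersects(effect):               # 16 d: interaction
                     effect.on_intersection(self.player)
                 effect.current_image()                     # 16 e: appearance
                 effect.current_audio()                     # 16 f: audio
             if self.player.lives <= 0 or self.gameboard.finished:
                 return                                     # 16 h: game over, timer not rescheduled
             self.start()                                   # schedule the next tick

         def start(self):                                   # 16 a: device-specific timer sets the cadence
             self.timer = threading.Timer(self.tick_seconds, self._tick)
             self.timer.start()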
  • During Sidewalk Squirrel©, the Play Game process handles Virtual Effects uniformly. However, since the Virtual Effects implement their behavior traits differently, the game has a unique personality. For example, the dog Virtual Effect barks while attacking the Player Virtual Effect. While attacking, the dog's image changes over time to give it a running appearance. If the dog enters the Player's zone, the interaction removes one of the three lives granted to the Player. Loss of all three lives evokes a game over condition and the timer stops. Unlike the dog Virtual Effect, acorns and bones are prizes. If the Player enters a prize Virtual Effect's zone, the prize disappears (is acquired) and points are awarded. Another Virtual Effect, the finish line, is also immobile. Entering the finish line's zone allows the Player to end the game as a winner with remaining lives and points.
  • DETAILED DESCRIPTION—OTHER EMBODIMENTS
  • The primary embodiment of this process and method can be implemented as a single player game. FIG. 17 illustrates how a compiled Toolset (17 b) interacts with the operating system (17 a) and device interfaces (17 d, 17 e, 17 f) on a small computing device for a single player. The Toolset uses the mapping (17 e) and network interface (17 d) to retrieve external maps from the Internet (17 g). GPS interfacing software (17 f) is used to receive coordinates from an internal or external GPS device (17 c).
  • Another embodiment of this invention allows communication between multiple players. This embodiment is illustrated in FIG. 18 and requires Gameboards and Virtual Effects to be synchronized across multiple executing Toolsets (18 a, 18 b) during Gameboard Interpretation. This alternative embodiment supports an expanded number of Toolsets and Game themes. For example, a Toolset Creator could enhance a game of hide and seek or sharks and minnows. In both these Toolset examples, players could receive hints to where other players are hiding.
  • This invention's process and method as defined use a square zone represented by horizontal and vertical bounds. Another embodiment could use any shape. For example, a zone could be represented as the area within a set of points, or the area could be defined by an inequality such as x**2+2y**2≤c.
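  • For example, zone membership tests for a rectangular zone and for an elliptical zone of the form x**2+2y**2≤c might look like the following sketch (coordinates are taken relative to the zone's center; names and constants are illustrative):
     def in_rectangle(x, y, half_width, half_height):
         # Rectangular zone defined by horizontal and vertical bounds.
         return abs(x) <= half_width and abs(y) <= half_height

     def in_ellipse(x, y, c):
         # Elliptical zone defined by the inequality x**2 + 2*y**2 <= c.
         return x * x + 2 * y * y <= c

     print(in_rectangle(3, 1, half_width=5, half_height=2))  # True
     print(in_ellipse(3, 1, c=12))                            # True: 9 + 2 = 11 <= 12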
  • Size used in this invention's example implementation (Sidewalk Squirrel©) is represented in pixels. In other embodiments, size could be represented in any unit of measure: feet, inches or meters. It is required, though, that size correlate to the actual size of the Virtual Effect image to give the Gameboard Player a realistic experience when interacting with the Virtual Effects within a game.
  • Visual display may not be required by some applications. For example, tours might completely rely on audio Virtual Effects.
  • Wizards may aid in the creation of Virtual Effects on a Gameboard. For example, following input from a Game Author, a Wizard could be instructed to place Virtual Effects on all street corners. Likewise, a Wizard could be used to generate dog Virtual Effects in Sidewalk Squirrel© during Gameboard annotation or interpretation.
  • An unlimited variety of Toolsets (Games and Virtual Effects) could be created as other embodiments of this invention. The complexity of trigger conditions and effects is left to the imagination of the Toolset Creator. To follow this invention's process and method, the Virtual Effect must implement methods capturing movement, interaction, appearance and audio. Defining these traits and corresponding methods allows the Gameboard Editor and Gameboard Interpreter to process Virtual Effects generically. That is, a newly created Virtual Effect can be processed in the same manner as the defined dog or bone from Sidewalk Squirrel©. Following are a few other example Toolsets and their respective Virtual Effects:
      • Scavenger Hunt
      • Task—Represented as a “check” symbol. Instruct player to perform a location specific task or collect an item for points.
      • Question—Represented as a question mark. Ask player a location specific question. For example, ask the hot dog vendor for his middle name.
      • Golf Course Guide
      • Tee—Represented as a green rectangle. Give course statistics and strategies.
      • Green—Represented as an oval. Provide green undulations and speed tips.
      • Walking Tour
      • Trivia—Provide location specific trivia. For example, “Your father kissed your mother for the first time at this very spot on Sep. 3, 1968”.
      • Overlook—Represented as an eye. Provide details of significance and historical pictures of overlook.
      • Monument—Represented as a column. Provide dates of construction and pictures of historical events.
  • Whether in its primary or alternative embodiments, this invention allows for the implementation of new location aware game types for small computing devices using location receivers. The Toolsets created using this process and method will promote active rather than sedentary entertainment, ultimately promoting healthier lifestyles.

Claims (23)

What is claimed is:
1. Computer system for creating and playing location aware games comprising:
computing device with display;
plurality of virtual effects having size and interaction rules;
a user interface to permit a user to select and view a gameboard based off of an inputted location and assign virtual effects to locations within the gameboard;
a storage means for storing said gameboard and said plurality of virtual effects;
a second user interface to permit a user to view said gameboard and said plurality of virtual effects while playing a game;
a controller to control the location of one or more virtual effects during game play based on input from at least one of the following: a computer mouse, a screen stylus and a location receiver;
a timing device used by the controller to establish cadence of play.
2. The computer system of claim 1 wherein said computing device further comprising audio delivery.
3. The computer system of claim 2 wherein said gameboard further comprising a center represented as longitude and latitude, bounding coordinates represented as longitude and latitude, a default scale representing a ratio of real world measurements to that of a map image and a plurality of map images each with corresponding scale.
4. The computer system of claim 3 wherein said virtual effect further comprising a real world location represented in longitude and latitude, a plurality of images, a plurality of audio files, movement rules, appearance selection rules and audio selection rules.
5. The computer system of claim 4 wherein said user interfaces permit a user to change scale of said gameboard by changing gameboard map image to that of desired scale and changing plurality of virtual effect images to that of desired scale.
6. The computer system of claim 5 wherein said user interfaces permit a user to change the offset of gameboard map relative to user view to give appearance of scrolling when map image is larger than computing device display.
7. Computer system for creating and playing location aware games comprising:
computing device with audio delivery;
plurality of virtual effects having size and interaction rules;
a user interface to permit a user to select and view a gameboard based off of an inputted location and assign virtual effects to locations within the gameboard;
a storage means for storing said gameboard and said plurality of virtual effects;
a controller to control the location of one or more virtual effects during game play based on input from a location receiver;
a timing device used by the controller to establish cadence of play.
8. The computer system of claim 7 wherein said gameboard further comprising a center represented as longitude and latitude, bounding coordinates represented as longitude and latitude, a default scale representing a ratio of real world measurements to that of a map image and a plurality of map images each with corresponding scale.
9. The computer system of claim 8 wherein said virtual effect further comprising a real world location represented in longitude and latitude, a plurality of audio files, movement rules and audio selection rules.
10. Method for creating, editing and playing location aware games comprising:
plurality of virtual effects having size and interaction rules;
a method for creating and editing a gameboard based off of an inputted location and assign virtual effects to locations within the gameboard;
a method for storing said gameboard and said plurality of virtual effects;
a method for playing user interface to permit a user to view said gameboard and said plurality of virtual effects while playing a game;
a controller to control the location of one or more virtual effects during game play based on input from at least one of the following: a computer mouse, a screen stylus and a location receiver;
a timing device used by the controller to establish cadence of play.
11. The method of claim 10 wherein said gameboard further comprising a center represented as longitude and latitude, bounding coordinates represented as longitude and latitude, a default scale representing a ratio of real world measurements to that of a map image and a plurality of map images each with corresponding scale.
12. The method of claim 11 wherein said virtual effect further comprising a real world location represented in longitude and latitude, a plurality of images, a plurality of audio files, movement rules, appearance selection rules and audio selection rules.
13. The method of claim 12 wherein said gameboard is initialized by reading a file containing XML representing gameboard, instantiating gameboard, reading XML representing said plurality of maps while instantiating each map and adding to gameboard, reading XML representing said plurality of virtual effects while instantiating each virtual effect and sizing each virtual effect image to match plurality of scales contained within each of the plurality of gameboard maps and adding said virtual effect to gameboard.
14. The method of claim 13 wherein the said method of creating and editing gameboard further comprising said initialize gameboard method, an acquire external map method, an add virtual effects method, an edit virtual effects method and a save gameboard method comprising representing said gameboard, said plurality of maps and said plurality of virtual effects as XML and writing XML to a file.
15. The method of claim 14 wherein said method of adding new virtual effect comprising selecting virtual effect type, designating virtual effect location on gameboard image and setting virtual effect attribute values and wherein said method of editing existing virtual effects comprising selecting existing virtual effect, designating new location on gameboard image and changing virtual effect attribute values.
16. The method of claim 15 wherein designating virtual effect location is performed through selection of point on the gameboard map control represented in pixels (x and y) and translating to longitude and latitude using the gameboard northeast(ne) and southwest(sw) bounding coordinates, device display height in pixels and device display width in pixels in the algorithm:

Latitude=ne.latitude−y*absolute value ((ne.latitude−sw.latitude)/displayheight)
Longitude=sw.longitude+x*absolute value ((ne.longitude−sw.longitude)/displaywidth)
16. The method of claim 12 wherein said virtual effects have subtypes that have unique attributes, an overriding movement method, an overriding intersection method, an overriding appearance method, a unique plurality of appearance images, an overriding audio method and unique plurality of audio files.
17. The method of claim 16 wherein said movement evaluation comprising calculating next position of said virtual effect depending on conditions as defined in overriding movement method as defined in virtual effect subtype.
18. The method of claim 17 wherein said intersection evaluation comprising an iteration through plurality of virtual effects and comparing bounding coordinates for overlapping points where intersection triggers events depending on conditions as defined in said intersection method as defined in virtual effect subtype.
19. The method of claim 18 wherein said appearance method comprising of designating said virtual effect image depending on conditions as defined in overriding appearance method as defined in virtual effect subtype.
20. The method of claim 19 wherein said audio method comprising of selecting audio file depending on conditions defined within overriding audio method as defined in virtual effect subtype where selected audio file is interpreted on device specific audio driver.
21. The method of claim 20 wherein said method of interpreting gameboard comprising said initialize gameboard method, instantiating said timer, connecting to location receiver, instantiating said player virtual effect and play game method.
22. The method of claim 21 wherein said play game method further comprising starting said timing device, continuously evaluating plurality of virtual effects using timing device to set evaluation cadence until game over condition is reached where evaluation consists of executing said virtual effect subtype overriding method for movement, executing said virtual effect subtype overriding method for interaction, executing said virtual effect subtype overriding method for appearance and executing said virtual effect subtype overriding method for audio.
US11/163,329 2005-10-14 2005-10-14 Computer system for creating and playing location aware games Abandoned US20070087828A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/163,329 US20070087828A1 (en) 2005-10-14 2005-10-14 Computer system for creating and playing location aware games

Publications (1)

Publication Number Publication Date
US20070087828A1 (en) 2007-04-19

Family

ID=37948805

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/163,329 Abandoned US20070087828A1 (en) 2005-10-14 2005-10-14 Computer system for creating and playing location aware games

Country Status (1)

Country Link
US (1) US20070087828A1 (en)

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751228A (en) * 1904-05-02 1998-05-12 Aisin Aw Co., Ltd. Guide system
US5293163A (en) * 1990-06-06 1994-03-08 Mazda Motor Corporation Navigation apparatus for vehicles
US5364093A (en) * 1991-12-10 1994-11-15 Huston Charles D Golf distance measuring system and method
US5473324A (en) * 1992-12-28 1995-12-05 Matsushita Electric Industrial Co., Ltd. Map display apparatus
US5422814A (en) * 1993-10-25 1995-06-06 Trimble Navigation Limited Global position system receiver with map coordinate system outputs
US5596500A (en) * 1993-10-25 1997-01-21 Trimble Navigation Limited Map reading system for indicating a user's position on a published map with a global position system receiver and a database
US5614898A (en) * 1994-03-18 1997-03-25 Aisin Aw Co., Ltd. Guide system
US6321158B1 (en) * 1994-06-24 2001-11-20 Delorme Publishing Company Integrated routing/mapping information
US5819199A (en) * 1995-03-28 1998-10-06 Matsushita Electric Works, Ltd. Portable navigation apparatus for indicating present position on a map with indicated reference latitudinal and longitudinal lines based on surveyed present position data
US5848375A (en) * 1995-04-19 1998-12-08 Nippon Telegraph And Telephone Corporation Method of automatically generating road network information and system for embodying the same
US5721684A (en) * 1995-07-04 1998-02-24 Mitsubishi Denki Kabushiki Kaisha Navigation apparatus having two processors, the first for outputting map data and the second for drawing and scrolling the map
US6525690B2 (en) * 1995-09-08 2003-02-25 Prolink, Inc. Golf course yardage and information system with zone detection
US6061618A (en) * 1996-11-22 2000-05-09 Case Corporation Panning display of GPS field maps
US5938709A (en) * 1996-11-22 1999-08-17 Case Corporation Panning display of GPS field maps
US5902343A (en) * 1996-11-22 1999-05-11 Case Corporation Automatic scaling of GPS field maps
US6157342A (en) * 1997-05-27 2000-12-05 Xanavi Informatics Corporation Navigation device
US6868335B2 (en) * 1997-06-20 2005-03-15 American Calcar, Inc. Personal communication system for communicating voice data positioning information
US6133853A (en) * 1998-07-30 2000-10-17 American Calcar, Inc. Personal communication and positioning system
US6199012B1 (en) * 1998-09-25 2001-03-06 Jatco Corporation Map display unit
US6525768B2 (en) * 1998-10-21 2003-02-25 American Calcar, Inc. Positional camera and GPS data interchange device
US6426719B1 (en) * 1999-04-07 2002-07-30 Casio Computer Co., Ltd. Position measurement apparatus that detects location by receiving external signals
US6647336B1 (en) * 1999-08-11 2003-11-11 Nec Corporation Map display terminal and map display method
US6397144B2 (en) * 1999-11-30 2002-05-28 Mitsubishi Denki Kabushiki Kaisha On-vehicle information processor with map data and map data management
US6735520B2 (en) * 2001-03-13 2004-05-11 Pioneer Corporation Map information providing method, map information providing system, and recording medium on which the method programed is recorded
US6714864B2 (en) * 2001-05-29 2004-03-30 Nec Corporation Method and system for displaying automatically scaled map according to degree of precision of estimated mobile position
US6691032B1 (en) * 2002-09-09 2004-02-10 Groundspeak, Inc. System and method for executing user-definable events triggered through geolocational data describing zones of influence
US20050049022A1 (en) * 2003-09-02 2005-03-03 Mullen Jeffrey D. Systems and methods for location based games and employment of the same on location enabled devices

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8717166B2 (en) 2005-12-23 2014-05-06 Geofence Data Access Controls Llc System and method for conveying location information via a plurality of information-sharing environments
US11064038B2 (en) 2005-12-23 2021-07-13 Perdiemco Llc Method for tracking mobile objects based on event conditions met at mobile object locations
US9871874B2 (en) 2005-12-23 2018-01-16 Perdiemco Llc Multi-level database management system and method for an object tracking service that protects user privacy
US11316937B2 (en) 2005-12-23 2022-04-26 Perdiemco Llc Method for tracking events based on mobile device location and sensor event conditions
US10397789B2 (en) 2005-12-23 2019-08-27 Perdiemco Llc Method for controlling conveyance of event information about carriers of mobile devices based on location information received from location information sources used by the mobile devices
US10819809B2 (en) 2005-12-23 2020-10-27 Perdiemco, Llc Method for controlling conveyance of event notifications in sub-groups defined within groups based on multiple levels of administrative privileges
US10602364B2 (en) 2005-12-23 2020-03-24 Perdiemco Llc Method for conveyance of event information to individuals interested devices having phone numbers
US10382966B2 (en) 2005-12-23 2019-08-13 Perdiemco Llc Computing device carried by a vehicle for tracking driving events in a zone using location and event log files
US8223012B1 (en) 2005-12-23 2012-07-17 Geofence Data Access Controls Llc System and method for conveying object location information
US8493207B2 (en) 2005-12-23 2013-07-23 Geofence Data Access Controls Llc Location information sharing system and method for conveying location information based on user authorization
US9680941B2 (en) 2005-12-23 2017-06-13 Perdiemco Llc Location tracking system conveying event information based on administrator authorizations
US9485314B2 (en) 2005-12-23 2016-11-01 Perdiemco Llc Multi-level privilege notification system operated based on indoor location information received from a location information sources
US20090207015A1 (en) * 2005-12-23 2009-08-20 Robert S. Babayi System and method for defining an event based on a relationship between an object location and a user-defined zone
US10148774B2 (en) 2005-12-23 2018-12-04 Perdiemco Llc Method for controlling conveyance of electronically logged information originated by drivers of vehicles
US8149113B2 (en) 2005-12-23 2012-04-03 Darrell Diem Apparatus and method for conveying location event information based on access codes
US10284662B1 (en) 2005-12-23 2019-05-07 Perdiemco Llc Electronic logging device (ELD) for tracking driver of a vehicle in different tracking modes
US9003499B2 (en) 2005-12-23 2015-04-07 Geofence Data Access Controls Llc System and method for conveying event information based on varying levels of administrative privilege under multiple levels of access controls
US9071931B2 (en) 2005-12-23 2015-06-30 Perdiemco Llc Location tracking system with interfaces for setting group zones, events and alerts based on multiple levels of administrative privileges
US9119033B2 (en) 2005-12-23 2015-08-25 Perdiemco Llc System for sharing information about groups of individuals, drivers, vehicles or objects
US10277689B1 (en) 2005-12-23 2019-04-30 Perdiemco Llc Method for controlling conveyance of events by driver administrator of vehicles equipped with ELDs
US10171950B2 (en) 2005-12-23 2019-01-01 Perdiemco Llc Electronic logging device (ELD)
US9319471B2 (en) 2005-12-23 2016-04-19 Perdiemco Llc Object location tracking system based on relative coordinate systems using proximity location information sources
US20090005140A1 (en) * 2007-06-26 2009-01-01 Qualcomm Incorporated Real world gaming framework
US8675017B2 (en) * 2007-06-26 2014-03-18 Qualcomm Incorporated Real world gaming framework
US20090043626A1 (en) * 2007-08-07 2009-02-12 Samsung Electronics Co., Ltd. System and method for providing product information in lan
US8718924B2 (en) * 2009-01-07 2014-05-06 Samsung Electronics Co., Ltd. Method and apparatus for road guidance using mobile terminal
US20100174483A1 (en) * 2009-01-07 2010-07-08 Samsung Electronics Co., Ltd. Method and apparatus for road guidance using mobile terminal
US20110039622A1 (en) * 2009-08-12 2011-02-17 3 Legged Dog, Inc. Interactive system and method for digital artifact relocation and activation
US20110039623A1 (en) * 2009-08-12 2011-02-17 3 Legged Dog, Inc. Interactive system and method for digital artifact relocation and activation
WO2012026936A1 (en) * 2010-08-26 2012-03-01 Sony Ericsson Mobile Communications Ab A game engine module and method for playing an electronic game using location information
US20180117475A1 (en) * 2011-03-28 2018-05-03 Brian M. Dugan Systems and methods for fitness and video games
US11376510B2 (en) 2011-03-28 2022-07-05 Dugan Health, Llc Systems and methods for fitness and video games
US10434422B2 (en) * 2011-03-28 2019-10-08 Brian M. Dugan Systems and methods for fitness and video games
US10486067B2 (en) 2011-03-28 2019-11-26 Brian M. Dugan Systems and methods for fitness and video games
US11869160B2 (en) 2011-04-08 2024-01-09 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US11854153B2 (en) 2011-04-08 2023-12-26 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US20170167886A1 (en) * 2012-03-28 2017-06-15 Viacom International Inc. Interacting with a User Using a Dynamic Map
US20140194198A1 (en) * 2012-10-25 2014-07-10 Ecology & Environment, Inc. Computer-based system and method for gamifying ride sharing
US9143897B2 (en) * 2012-11-05 2015-09-22 Nokia Technologies Oy Method and apparatus for providing an application engine based on real-time commute activity
US20140129130A1 (en) * 2012-11-05 2014-05-08 Nokia Corporation Method and apparatus for providing an application engine based on real-time commute activity
US9473893B2 (en) 2012-11-05 2016-10-18 Nokia Technologies Oy Method and apparatus for providing an application engine based on real-time commute activity
US20140363795A1 (en) * 2013-06-06 2014-12-11 Mind Gamez LLC Travel and education application and apparatus
US11392636B2 (en) 2013-10-17 2022-07-19 Nant Holdings Ip, Llc Augmented reality position-based service, methods, and systems
EP3117185A4 (en) * 2014-03-14 2017-10-18 Team Action Zone OY Location-based activity
US20150339952A1 (en) * 2014-05-24 2015-11-26 Nirit Glazer Method and system for using location services to teach concepts
US11638869B2 (en) * 2017-04-04 2023-05-02 Sony Corporation Information processing device and information processing method
US11338196B2 (en) * 2018-04-27 2022-05-24 Neowiz Corporation Game control method, game control device, and recording medium therefor

Similar Documents

Publication Publication Date Title
US20070087828A1 (en) Computer system for creating and playing location aware games
US7828655B2 (en) Application programming interface for geographic data in computer games
US7970749B2 (en) Method and system for using geographic data in computer game development
US7967678B2 (en) Computer game development factory system and method
US8562439B2 (en) Geographic area templates for computer games
Fränti et al. O-Mopsi: Mobile orienteering game for sightseeing, exercising, and education
Malaka et al. Stage-based augmented edutainment
Rodrigo et al. Igpaw: intramuros—design of an augmented reality game for philippine history
De Paolis Walking in a virtual town to understand and learning about the life in the middle ages
Chang et al. FIASCO: game interface for location-based play
CN109091872B (en) Teaching specialty popularization system based on game mode
Holm et al. Designing ActionTrack: A state-of-the-art authoring tool for location-based games and other activities
Kerdvibulvech Location-based augmented reality games through immersive experiences
Perez-Valle et al. A novel approach for tourism and education through virtual vitoria-gasteiz in the 16 th century
Predescu et al. A case study of mobile games design with a real-world component based on Google Maps and Unity
Raeburn et al. Creating immersive play anywhere location-based storytelling using mobile AR
JP2021037371A (en) Information processor, terminal device and program
Manuel et al. Simplifying location-based serious game authoring
Champion et al. Game–style interaction
Meister et al. On using state of the art computer game engines to visualize archaeological structures in interactive teaching and research
Huang An innovative proposal for young students to learn computer science and technology through Pokémon Go
Fischöder et al. A storytelling smart-city approach to further cross-regional tourism
Gradinar et al. Designing for the dichotomy of immersion in location based games
Champion What Have We Learnt from Game–Style Interaction?
Losh When Walls Can Talk: Animate Cities and Digital Rhetoric

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION