WO2002036225A1 - Virtual reality game system with pseudo 3d display driver and mission control - Google Patents


Info

Publication number
WO2002036225A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
satellite
mission control
program
display
Application number
PCT/US2001/046939
Other languages
French (fr)
Inventor
Laurent Scallie
Cedric Boutelier
Original Assignee
Atlantis Cyberspace, Inc.
Application filed by Atlantis Cyberspace, Inc. filed Critical Atlantis Cyberspace, Inc.
Priority to AU2002227273A priority Critical patent/AU2002227273A1/en
Publication of WO2002036225A1 publication Critical patent/WO2002036225A1/en

Classifications

    • G02B27/017 Head-up displays; head mounted
    • A63F13/25 Output arrangements for video game devices
    • A63F13/31 Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F13/35 Details of game servers
    • A63F13/52 Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • G06F3/1438 Digital output to display device; controlling a plurality of local displays using more than one graphics controller
    • H04N13/194 Transmission of stereoscopic image signals
    • H04N13/279 Image signal generators from 3D object models, the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H04N13/296 Synchronisation or control of image signal generators
    • H04N13/344 Displays for viewing with the aid of head-mounted left-right displays [HMD]
    • H04N13/356 Image reproducers having separate monoscopic and stereoscopic modes
    • A63F13/49 Saving the game status; pausing or ending the game
    • A63F2300/1006 Input arrangements having additional degrees of freedom
    • A63F2300/208 Game information storage for storing personal settings or data of the player
    • A63F2300/30 Output arrangements for receiving control signals generated by the game device
    • A63F2300/402 Communication between platforms, i.e. physical link to protocol
    • A63F2300/50 Details of game servers
    • A63F2300/554 Game data structure by saving game or status data
    • A63F2300/6661 Rendering three-dimensional images for changing the position of the virtual camera
    • A63F2300/8082 Virtual reality
    • H04N13/189 Recording image signals; reproducing recorded image signals
    • H04N13/289 Switching between monoscopic and stereoscopic modes
    • H04N13/361 Reproducing mixed stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Definitions

  • This invention generally relates to virtual reality game systems which provide a three-dimensional (3D) immersive experience to game players, and more particularly, to methods for creating 3D stereo vision displays from popular video games, and a mission control system for administration of multiple game playing satellites (pods).
  • Commercial virtual reality games are currently played at VR game stations with one or more players.
  • The commonly used VR game station typically provides a VR game that is played by a player wearing stereoscopic goggles or another type of 3D head-mounted display (HMD) and manipulating a weapon or other action equipment while executing physical motions such as turning, aiming, crouching, and jumping on a platform or in a cordoned space.
  • The VR games played on conventional VR game stations are typically written for the specific, often proprietary, hardware and operating systems provided by manufacturers for their VR game stations. As a result, only a limited number of VR games are available for play at current VR game stations.
  • It would therefore be desirable to have a VR game system in which popular 3D video games written for 2D display hardware can be operated to provide a 3D stereoscopic display without re-writing the video game software for the 3D display hardware. It would also be very useful for such a new VR game system to enable other 3D game services based on the popular video games players want to play at VR game stations.
  • Another problem with commercial game systems is that video games and VR games are commonly played at stand-alone game stations.
  • A player typically chooses whatever game he or she wants to play and queues up at, or is assigned, a stand-alone game station loaded with the selected game.
  • Some arcade systems have a computerized station for handling some common administrative functions such as player sign-in, maintaining player accounts, or logging game dates or selections. However, they do not utilize centralized control to keep track of and monitor plays at multiple game stations unless all the games are written in the same proprietary format used by the producer of the arcade system. If a multi-player network game is offered, the game stations are loaded only with the same game and exchange data for only that game via a local network connecting the game stations together.
  • A method (and system) for operating three-dimensional (3D) application software intended to provide output to a two-dimensional (2D) screen display comprises: (a) running the application software in its normal mode to generate 3D application data output which is normally to be sent to an application programming interface (API) driver for the 2D screen display; (b) intercepting the 3D application data output; and (c) redirecting it to a pseudo driver which renders the data for a 3D stereoscopic display instead.
  • In the preferred embodiment, the 3D application is a 3D video game, and the 3D stereoscopic display is a set of head-mounted stereo vision goggles used in a virtual reality (VR) game system.
  • The VR game system employs the pseudo 3D display driver (the "Pseudo Driver") to convert 3D game data from existing 3D video game software intended for 2D screen display into right and left stereoscopic image data for the 3D stereoscopic display. Conversion to stereo vision requires generating distinct right and left image viewpoints, which are combined by human vision to yield an immersive 3D image.
  • The Pseudo Driver converts the 3D game data output of the video game software, in any of the application programming interface (API) formats commonly used for popular video games, to an API format that supports the handling of stereoscopic image outputs, thereby allowing hundreds of existing 3D video games to be played on a commercial VR game system.
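The conversion just described can be pictured as a thin translation layer: draw data arriving in any source API format is normalized into one internal representation that a stereo-capable back end can render twice. The following is an illustrative sketch only; the class and function names, and the call shapes of the source APIs, are invented rather than taken from the patent.

```python
# Illustrative sketch (not the patented implementation): draw calls from
# different source APIs are normalized into one internal format that a
# stereo-capable back end can render twice. All names are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DrawCall:
    vertices: list            # triangle vertices in world space
    texture: Optional[str]    # texture identifier, if any

def from_opengl(gl_vertices, gl_texture=None):
    """Translate a hypothetical OpenGL-style vertex submission."""
    return DrawCall(vertices=list(gl_vertices), texture=gl_texture)

def from_glide(gr_vertex_list):
    """Translate a hypothetical Glide-style vertex list (no texture slot)."""
    return DrawCall(vertices=list(gr_vertex_list), texture=None)

class StereoBackEnd:
    """Collects normalized draw calls for later right/left rendering."""
    def __init__(self):
        self.queue = []

    def submit(self, call):
        self.queue.append(call)

backend = StereoBackEnd()
backend.submit(from_opengl([(0, 0, 0), (1, 0, 0), (0, 1, 0)], "wall"))
backend.submit(from_glide([(0, 0, 1), (1, 0, 1), (0, 1, 1)]))
```

Once every source API funnels into one representation, the stereo rendering stage only has to be written once, which is the economy the patent's single "chosen 3D image data format" aims at.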
  • The invention method can also be used to generate 3D stereoscopic displays for games played on video game consoles or PCs at home.
  • The intercepted 3D game data can be stored by a 3D data recorder for later playback.
  • A game player can replay a game, or a scene of a game, they previously played, or one player can re-enact a game played by another.
  • The 3D game data can also be transmitted or downloaded to a remote player through an online interface. This would allow replay of the player's 3D visuals at home or on other hardware platforms without the original game software, much like replaying previously recorded video.
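The recorder-and-replay idea above can be sketched as follows: the intercepted draw-call stream is logged frame by frame during play and later fed back through any renderer, without the original game software. All names and the frame format here are hypothetical.

```python
# Illustrative 3D game-data recorder (names and frame format are invented):
# each frame's intercepted draw calls are stored during play and can later
# be replayed through any rendering path without the game software.
import json

class Recorder:
    def __init__(self):
        self.frames = []

    def record(self, frame_calls):
        # keep a serializable deep copy of the frame's draw calls
        self.frames.append(json.loads(json.dumps(frame_calls)))

    def replay(self, render):
        # feed the stored stream back into any renderer, frame by frame
        for frame in self.frames:
            render(frame)

rec = Recorder()
rec.record([{"verts": [[0, 0, 0], [1, 0, 0], [0, 1, 0]], "tex": "floor"}])
rec.record([{"verts": [[0, 0, 1], [1, 0, 1], [0, 1, 1]], "tex": None}])

replayed = []
rec.replay(replayed.append)   # here "rendering" just collects the frames
```

Because the recording is plain serializable data rather than rendered video, the same log could also be transmitted to a remote player and replayed from any chosen viewpoint.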
  • The intercepted 3D game data being re-directed to the Pseudo Driver can also be overlaid, edited, or combined with other 2D or 3D images through a mixer for real-time enhancement of the displayed 3D content.
  • Examples include high-score rewards, promotional information, and logo animation before, during, and after a game or mission.
  • The Pseudo Driver for the 3D stereoscopic display can also be operated in tandem with other pseudo drivers, such as drivers for stereo sound and/or directional force feedback.
  • A mission control (administration) system for controlling multiple game playing satellite computers on a network comprises:
  • a mission control computer which operates administrative programs for performing administrative functions for multiple game playing stations connected by the network;
  • a plurality of game playing satellite computers provided at respective game playing stations each maintaining a plurality of game programs;
  • said mission control computer includes a mission control program for controlling the plurality of games available to be played on the game playing satellite computers by issuing generic control commands to the game playing satellite computers, and
  • each of said game playing satellite computers includes a satellite game control program for controlling each of the plurality of game programs available to be played on the satellite computer by receiving a generic control command to start a selected game program issued by said mission control computer and loading in response thereto a game-specific command set corresponding to the selected game program, and by providing said mission control computer with a status report of the status of the selected game program being played on the satellite computer.
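The division of labor described above (generic commands from mission control, game-specific command sets held at each satellite) might be sketched like this; the game names and command strings are invented for illustration.

```python
# Sketch of generic mission-control commands mapped to game-specific
# command sets at a satellite; game names and command strings are invented.
COMMAND_SETS = {
    "space_duel":  {"START": "duel.exe -launch", "STOP": "duel.exe -quit"},
    "maze_raider": {"START": "maze --begin",     "STOP": "maze --end"},
}

class Satellite:
    def __init__(self, command_sets):
        self.command_sets = command_sets
        self.status = "idle"
        self.issued = []           # game-specific commands actually run

    def handle(self, generic_cmd, game_id):
        """Translate one generic command into the selected game's own command."""
        specific = self.command_sets[game_id][generic_cmd]
        self.issued.append(specific)
        self.status = "running" if generic_cmd == "START" else "idle"
        return {"game": game_id, "status": self.status}   # status report

pod = Satellite(COMMAND_SETS)
report = pod.handle("START", "maze_raider")
```

The point of the indirection is that mission control never needs to know any game's launch syntax: adding a game to a pod only means adding an entry to the satellite's command-set database.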
  • The satellite control program scans game log files as games are played and extracts game status information from the log files for its status reports to the mission control computer.
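Log scanning of this kind could look like the following sketch, where the log format and the extracted fields are hypothetical stand-ins for whatever a particular game actually writes.

```python
# Sketch of log scanning for status reports: the satellite control program
# reads a game's log file and extracts status fields with patterns known
# for that game. The log format and field names here are hypothetical.
import re

LOG_PATTERN = re.compile(
    r"^\[(?P<time>[\d:]+)\]\s+(?P<event>\w+)(?:\s+score=(?P<score>\d+))?"
)

def extract_status(log_lines):
    """Reduce a game log to the fields reported to mission control."""
    status = {"last_event": None, "score": 0}
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue   # skip lines the game-specific pattern doesn't know
        status["last_event"] = m.group("event")
        if m.group("score"):
            status["score"] = int(m.group("score"))
    return status

sample_log = [
    "[12:00:01] LEVEL_START",
    "[12:03:47] KILL score=150",
    "[12:05:10] LEVEL_END score=420",
]
status = extract_status(sample_log)
```

In practice each game would contribute its own pattern set, analogous to the game-specific command sets, so the one control program can report on any game it hosts.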
  • The mission control program uses the status report information for a wide range of administrative functions. For example, the mission control computer can generate system-wide gaming reports, membership and player statistics, detailed statistics on specific games played by specific players, reports on the current status of the system, and hardware and software troubleshooting information.
  • The satellite computers each use the same control program and maintain in a database the game-specific command sets for the game programs offered on the satellite computer.
  • The game-specific command sets are initially derived by analyzing each game offered on the system and determining the activation, control, and termination logic for each game.
  • The satellite control program loads the selected game with the corresponding game-specific command set. In this manner, the mission control computer can maintain centralized control of the game playing stations while offering many different games for play.
  • The mission control site may be networked to another mission control site, or to a plurality of mission control sites, through a wide area network or the Internet.
  • The databases of multiple mission control sites can be replicated to a master database of a network server that provides an online interface for players in any location through the Internet.
  • The online interface allows the system to offer a wide range of related entertainment services to players anywhere, such as looking up statistics for games they or their buddies have played at any of the mission control sites, comparing their statistics to players at other sites, downloading statistics, maintaining accounts, joining groups of players, and communicating with other players.
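Replication into a master database, as described above, amounts to merging each site's records under a site key so the online interface can compute cross-site statistics. A minimal sketch, with invented site and player names and an invented stats layout:

```python
# Sketch of replicating per-site databases into a master database queried
# by the online interface; site names, player names, and the stats layout
# are invented for illustration.
def replicate(master, site_db, site_id):
    """Merge one mission-control site's player stats into the master copy."""
    for player, stats in site_db.items():
        master.setdefault(player, {})[site_id] = stats
    return master

master = {}
replicate(master, {"alice": {"games": 12, "wins": 7}}, "site_honolulu")
replicate(master, {"alice": {"games": 3, "wins": 1},
                   "bob": {"games": 5, "wins": 2}}, "site_tokyo")

def total_wins(master_db, player):
    """Example cross-site statistic served through the online interface."""
    return sum(s["wins"] for s in master_db[player].values())
```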
  • FIG. 1A is a block diagram illustrating the overall invention method of intercepting 3D game data and using pseudo 3D display drivers for generating a 3D stereoscopic display.
  • FIG. 1B is a block diagram illustrating a preferred method for operation of the pseudo driver through the use of the "dll wrapper" method.
  • FIG. 2A is a diagram illustrating the conventional API function call for a 2D display from a first type of PC game (OpenGL) software, as compared to FIG. 2B illustrating the pseudo API call for generating a 3D stereoscopic display.
  • FIG. 3A is a diagram illustrating the conventional API function call for a 2D display from a second type of PC game (Glide) software, as compared to FIG. 3B illustrating the pseudo API call for generating a 3D stereoscopic display.
  • FIG. 4A is a diagram illustrating the conventional API call for a 2D display from a third type of PC game (DirectX) software, as compared to FIG. 4B illustrating the pseudo API call for generating a stereoscopic display.
  • FIG. 5 is a diagram of a virtual reality (VR) game system using pseudo 3D display drivers to drive dual graphics cards for generating a 3D stereoscopic display for different types of PC game software.
  • FIG. 6 is a diagram of a VR game system using pseudo 3D display drivers to drive a single dual-head graphics card for generating a 3D stereoscopic display for different types of PC game software.
  • FIG. 7 is a diagram illustrating a mission control system having a mission control (administration) computer connected to multiple game playing satellite computers (stations or pods) for centralized control in accordance with the present invention.
  • FIG. 8 is a diagram illustrating a network server connected to multiple mission control sites and providing an online interface to players anywhere through the Internet to services based on data replicated from the mission control sites.
  • FIG. 9 is a flow chart illustrating the sequence by which the control program at a satellite computer responds to a generic control command from the mission control program to load and operate any one of a plurality of game programs offered on the system.
  • In the invention method, a 3D application software program generates 3D application data intended for rendering to a 2D display, but the 3D application data are intercepted and rendered by pseudo drivers for a 3D display instead.
  • In the preferred implementation, the 3D application is a 3D video game, and the 3D display is a stereoscopic display device.
  • The advantages of this implementation are described in terms of the capability of configuring a commercial virtual reality (VR) game system (with multiple pods) to offer players their choice of many popular video games in an immersive VR mode with stereo vision.
  • In FIG. 1A, the basic method and system of the present invention is illustrated for playing one of many popular 3D video games that a player may want to play in 3D vision.
  • The existing (previously written) 3D video game software 10 is played by a player and generates a stream of 3D visuals through a game engine that outputs 3D game data.
  • Video games are written using one of several common application programming interfaces (APIs) for handling the rendering and display functions of the game.
  • Conventionally, the 3D game data (series of polygons making up the image objects that appear in scenes, along with light, shading, and color data) are output with API function calls to conventional API drivers 12, which render the 3D game data into display image data that are fed to a graphics display card 14 and result in a 2D image displayed on a 2D display monitor 16.
  • In the invention method, the 3D game data output of the video game software 10 are instead intercepted and redirected to pseudo API drivers 20, which generate right (R) and left (L) stereoscopic image outputs to right and left stereoscopic display cards 22, 24 that generate the resulting 3D stereoscopic display on a 3D display device 26.
  • “Stereo vision” refers to immersive visual images which provide depth perception to the viewer. Depth perception is obtained by delivering appropriate right and left offset images to the user's right and left eyes.
  • The API function calls intercepted and re-directed to the Pseudo API Drivers 20 result in the intercepted 3D game data output being processed into R/L image data that can be viewed on a 3D display device, such as VR goggles, a helmet, or a "no glasses required" 3D monitor.
  • The Pseudo Drivers are written to handle the common API formats used for PC games, such as Glide (TM), developed by 3dfx Interactive, Inc., of Alviso, CA; OpenGL (TM), developed by Silicon Graphics, Inc. (SGI), of Mountain View, CA; and DirectX (TM), distributed by Microsoft Corp., of Redmond, WA.
  • As illustrated in FIG. 1B, the invention method intercepts and redirects the API function calls as follows. The Pseudo Drivers 20 consist of a Wrapper 21 which is given the same dynamic link library (dll) name as the original (OEM) drivers.
  • The video game software calls the dll for the API drivers in its usual mode. Because the Wrapper 21 assumes the original dll name, it is called instead of the original dll and drivers and effectively intercepts the API function calls and 3D game data of the video game software.
  • The Wrapper 21 establishes a Stereo Viewpoints module 22 and sets up parallel R and L rendering engines from the renamed original dll and drivers: one rendering engine 23 for rendering right (R) image data, and the other rendering engine 24 for rendering left (L) image data.
  • The Wrapper 21 sends the 3D game data to the Stereo Viewpoints module 22, where right (R) and left (L) viewpoints are calculated or specified for the 3D game data, resulting in R view data and L view data.
  • The API function calls are directed by the Wrapper 21 to the R rendering module with the R view data, resulting in rendered R image data, and to the L rendering module with the L view data, resulting in rendered L image data.
  • The R and L image data are then sent to the R and L display cards for the 3D stereoscopic display (see FIG. 1A).
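As a conceptual (non-dll) analogue of the wrapper flow just described: the stand-in driver receives the game's single camera and scene, derives right and left viewpoints by a small horizontal offset, and hands the same scene to two parallel rendering engines. The eye-separation value and all names are illustrative assumptions, not the patent's code.

```python
# Conceptual Python analogue of the dll-wrapper flow (all names and the
# eye-separation value are illustrative assumptions, not the patent's code).
EYE_SEPARATION = 0.065   # metres, a typical interocular distance

def stereo_viewpoints(camera_pos):
    """Offset the game's single camera into right and left eye positions."""
    x, y, z = camera_pos
    half = EYE_SEPARATION / 2.0
    return (x + half, y, z), (x - half, y, z)   # (right, left)

class RenderEngine:
    """Stands in for one of the two parallel rendering engines."""
    def __init__(self, eye):
        self.eye = eye
        self.frames = []

    def draw(self, viewpoint, scene):
        self.frames.append((viewpoint, scene))

class Wrapper:
    """Receives the call the game would have made to the original driver."""
    def __init__(self):
        self.right = RenderEngine("R")
        self.left = RenderEngine("L")

    def draw_scene(self, camera_pos, scene):   # the intercepted API call
        r_view, l_view = stereo_viewpoints(camera_pos)
        self.right.draw(r_view, scene)
        self.left.draw(l_view, scene)

driver = Wrapper()
driver.draw_scene((0.0, 1.7, 0.0), ["polygon_batch_1"])
```

The game calls `draw_scene` exactly once per frame, unmodified; the doubling into right and left renders happens entirely inside the wrapper, which is the essence of the interception technique.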
  • In essence, the Pseudo Driver intercepts the 3D game data between the game and the API.
  • The 3D game data can thus be rendered into stereo vision for any specified viewpoint.
  • In a conventional system, the data stream from the game goes to the API, which is specific to the video card, and undergoes rendering and transformation into an image fixed as 2D.
  • The Pseudo Drivers of the invention method instead intercept the game data stream and invoke the same (or comparable) rendering functions to render the 3D game data into 3D stereoscopic image data by generating specific right and left image viewpoints.
  • The right and left image data are sent as outputs to the display cards 22 and 24, which then generate the respective bit-mapped image outputs to activate the display elements in the corresponding right and left eyepieces of the stereoscopic display unit 26.
  • Two separate display cards are used for the two stereoscopic image feeds for greater processing speed and throughput.
  • An integrated Pseudo Driver system can also include a 3D game data recorder 30 (3D Recorder) for storing the 3D game data for later playback, and a mixer 40 for enhancing the 3D content, such as by overlaying, editing, or combining it with other 2D or 3D images.
  • The 3D Recorder 30 records the 3D game data stream (vertices, polygons, textures, etc.) for subsequent playback without the need to re-access the game software, such as for providing visuals while debriefing players after a game session or for replay for a player's personal use.
  • the mixer 40 allows other images, 2D or 3D, to be mixed or interspersed with the game images.
  • the mixer 40 takes the form of a dual rendering module which renders the other 3D content and combines it with the game content. It is advantageous to record 3D game data with the 3D Recorder between the Pseudo Driver Wrapper and the mixer (dual rendering module), because all API types will have been converted into the chosen 3D image data format (DirectX 8, as explained below).
  • the data stream can be played back by simply sending it to the dual rendering module. If the data stream is sent to the 3D Recorder between the game and the Pseudo Drivers, then the game data can be played back simply by sending it to the corresponding API.
  • the mixer can always be running. This allows the system total control of the display at all times, and avoids any lapse in the display if, for example, control is switched to another game.
  • the API Wrapper called by the new game re-connects with the dual rendering module.
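The record-then-replay placement described above can be sketched in Python. All class and function names here are illustrative assumptions, not the patent's implementation; the point is that the recorder sits after the API wrapper, so every call is already in one uniform format and can be replayed straight into the dual rendering module with no game process running:

```python
# Hypothetical sketch: recording the converted call stream between the
# API wrapper and the dual rendering module, then replaying it without
# re-running the game. Names are illustrative, not from the patent.

class DualRenderer:
    """Stand-in for the dual rendering module; just logs submitted calls."""
    def __init__(self):
        self.calls = []

    def submit(self, func, *args):
        self.calls.append((func, args))


class Recorder:
    """Sits between the wrapper and the renderer; captures the stream."""
    def __init__(self, renderer):
        self.renderer = renderer
        self.tape = []

    def submit(self, func, *args):
        self.tape.append((func, args))      # record in the uniform format
        self.renderer.submit(func, *args)   # pass through for live display

    def replay(self, renderer):
        """Replay the session; no game software needs to be re-accessed."""
        for func, args in self.tape:
            renderer.submit(func, *args)


live = DualRenderer()
rec = Recorder(live)
# The wrapper would emit calls like these after converting any API:
rec.submit("SetTexture", "wall.png")
rec.submit("RenderPolygons", [(0, 0), (1, 0), (0, 1)])

playback = DualRenderer()
rec.replay(playback)
assert playback.calls == live.calls   # identical visuals, no game needed
```

Recording before the Pseudo Drivers would instead require replaying through the corresponding API, which is why the post-conversion placement is described as advantageous.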
  • State-of-the-art first-person games are composed of a "game engine", object-oriented scriptable logic, and game "levels".
  • the game engine is the essential technology that provides 3D graphics rendering, a sound engine, file management, networking support, and all other aspects of the core application.
  • the content of the game sits on top of the rendering engine in the form of scripts and levels basically setting up the series of scenes and actions ("world map") forming the visual environment and the logic within it.
  • Tools provided by game developers are available for modification of the scripted logic of the various objects in the world map, as well as generation of new environments. For PC games, this allows new content to be created by game developers and significantly extends the life cycle of the game.
  • the current trend in game development is to license a specific game engine and allow game developers to focus on content creation, the concept and implementation of the levels, sounds, models, textures and game logic.
  • customized game environments can be produced, characters created, weapons and game objects designed, and special game logic implemented to create the desired game content.
  • Conventional 3D video games are written to be run on conventional hardware and operating systems for display on a 2D monitor, and thus the conventional experience is basically 2D.
  • the 3D game data executes function calls to conventional API drivers for the game that result in a 2D screen image being generated.
  • the conventional game system renders a 3D scene as a centered 2D image as if the user were viewing it with one eye. It is desirable to use existing 3D games for play in VR systems that engage players with a 3D stereo vision display for a more immersive game experience. Since the existing games output 3D game data, the 3D game data can be converted to a 3D display.
  • 3D display technology includes, but is not limited to, HMDs, no-glasses-required monitors, LCD glasses, and hologram display units. Typically, all of these hardware types require two separate 2D input images, one for each eye. Each new type of 3D display technology comes with its own 3D format. Typically, they conform to one of the following standard formats (from highest quality to lowest quality): separate right and left (R/L) images; frame sequential images; side-by-side (left/right) images; top-and-bottom (over/under) images; or field sequential (row interleaved) image signals.
  • the highest quality stereo vision signal is simply two separate R/L image signals.
  • the remaining methods use some form of compression to pack both left and right signals into a single signal. Because both views are compressed into one overloaded signal, the stereo vision image quality, the frame rate, or both are lowered.
  • the lower quality, "single signal" methods are typically used by lower-priced stereovision hardware, like LCD glasses.
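The quality loss in the "single signal" formats can be made concrete with a small sketch. This is an illustrative example, not the patent's code: packing left and right views side by side into one frame of the original width forces each eye's image down to half horizontal resolution.

```python
# Illustrative sketch of why "single signal" stereo formats lose quality:
# packing left and right views side by side into one frame of the same
# width means each eye keeps only half of its columns.

def pack_side_by_side(left, right):
    """Pack two equal-size images (lists of pixel rows) into one frame
    of the same width by dropping every other column of each view."""
    assert len(left) == len(right)
    packed = []
    for lrow, rrow in zip(left, right):
        packed.append(lrow[::2] + rrow[::2])  # half-width left + half-width right
    return packed

left  = [[1, 2, 3, 4], [5, 6, 7, 8]]
right = [[9, 10, 11, 12], [13, 14, 15, 16]]
frame = pack_side_by_side(left, right)
# Each packed row has the original width, but each eye retains only
# half of its columns -- the compression the passage describes.
assert frame == [[1, 3, 9, 11], [5, 7, 13, 15]]
```

Two separate R/L signals avoid this entirely, which is why they are listed as the highest-quality format.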
  • the nVidia driver effectively converts a 3D game written for DirectX or OpenGL to be viewable in stereo vision using any single-signal 3D device that is connected to the nVidia video card.
  • these card-specific drivers only work if the manufacturer's video card is used.
  • Conventional hardware manufacturers do not support card-independent high-end, separate right and left image signals.
  • Another important aspect of the invention is the interception of the data stream at the game-API level.
  • Conventional stereovision drivers are established between the API and the video card, and the code existing between the API and the video card requires hardware-specific code. Drivers on that level need to be made by the manufacturer of the video card hardware, which is a drawback in a game system that offers many different games using the same video card hardware.
  • Another drawback is that the data has already undergone a 3D game data to 2D image data transformation, and is therefore fixed as 2D. Once the data are converted to 2D, the 2D data can be converted to stereovision only with "less visually accurate" mathematics.
  • two separate video cards 22 and 24 are used for the separate right and left signal inputs of high-end 3D display devices. Doubling the number of video cards allows for the right and left stereo image to be rendered separately and simultaneously. This avoids the typical 2x slowdown required to display stereo rather than mono.
  • the Pseudo Driver thus allows a normal 3D game to power two video cards, which in turn can power high-end 3D display hardware such as the V6 or V8 (TM) Stereovision Head Mounted Displays, distributed by Virtual Research Systems, Inc., of Santa Clara, CA; the Visette (TM) Stereovision Head Mounted Display, distributed by Cyber Mind, Ltd., of Leicestershire, UK; the Datavisor (TM) Stereovision Head Mounted Display, distributed by N-Vision, Inc., of McLean, VA; or the DTI 2015XLS or 2018XLQ (TM) Stereovision Monitor, distributed by Dimension Technologies, Inc.
  • the 3D game data output of existing game software are intercepted and re-directed to Pseudo Drivers for 3D display in place of the conventional API drivers for 2D display.
  • the Pseudo Drivers execute the same or comparable image rendering functions but generate the specific right and left image viewpoints required by 3D display devices.
  • the Pseudo Drivers only convert the 3D game output of the game software and do not affect or manipulate the game software itself.
  • the Pseudo Drivers can produce a 3D display from conventional 3D game software without requiring access to or modification of the game source code.
  • 3D display technology has developed to offer very high resolution and wide field of view. When used with a head mounted display unit (HMD) which allows direct head tracking, VR systems can offer a very immersive virtual reality experience for the player.
  • Other 3D display devices that may be used include 3D monitors, such as the DTI3D (TM) monitor distributed by Dimension Technologies, Inc., of Rochester, NY, which delivers a stereo vision image without requiring the use of stereoscopic glasses.
  • Typical graphics APIs have some 400 functions that the game program can call to render 3D polygonal scenes. Generally speaking, these functions have names like LoadTexture, SetTexture, RenderPolygons, DisplayImage, etc. All of an API's functions are held in a dynamic link library (.dll).
  • the API's .dll is stored in the computer's C:\Windows\System directory. Depending on which API format it is written for, a game will automatically load the appropriate .dll stored in the System directory, and the functions contained within are used to render the game's 3D world map to the 2D screen.
  • the API converts the data internally, and forwards the data to the video card-specific driver. The driver optionally modifies the data further into a format specific to the current video card hardware.
  • the video card renders the data to the screen in the form of textured polygons.
  • the final image appears on the user's monitor as a 2D projection of the 3D world map.
  • the Pseudo Driver of the present invention intercepts the data being sent from the game to the API.
  • the simplest method to do this borrows from a technique called ".dll wrapping".
  • a "Pseudo API” is named and substituted for the usual original API for which the game issues the display function calls. That is, the Pseudo API assumes the identity of the usual API's .dll that the game is looking for. This substitution is done at the installation level for the VR system by storing the Pseudo API in the System directory in place of the original API.
  • When the game executes function calls for the API, the Pseudo API is called and intercepts the 3D game data.
  • the data stream between the game and the rendering API consists of thousands of vertices, polygons, and texture data per frame.
  • the Pseudo API then either executes calls, or issues subcalls to the original APIs which are set up to be running in the background, for the usual rendering functions, then passes the rendered data to the Pseudo Driver matched to the type of 3D display unit used in the VR system.
  • the Pseudo Driver generates the card- independent R/L stereoscopic image signals which are passed as inputs to the 3D display unit.
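The ".dll wrapping" idea above can be sketched in Python terms. This is a hedged analogy only: the patent's wrapper is a Windows .dll exporting the same functions as the original, whereas here a Python object assumes the real API's callable surface, intercepts every call, and forwards it via sub-calls to the real implementation. All names are illustrative.

```python
# Hedged sketch of ".dll wrapping": a pseudo API object exposes the same
# interface the game expects, intercepts every call, and forwards it to
# the real implementation running behind it. Names are illustrative.

class RealAPI:
    """Stand-in for the original rendering API."""
    def LoadTexture(self, name):
        return f"loaded:{name}"

class PseudoAPI:
    """Assumes the identity of the real API: same callable surface."""
    def __init__(self, real):
        self._real = real
        self.intercepted = []

    def __getattr__(self, func_name):
        real_func = getattr(self._real, func_name)
        def interceptor(*args):
            self.intercepted.append((func_name, args))  # capture 3D game data
            return real_func(*args)                     # sub-call original API
        return interceptor

api = PseudoAPI(RealAPI())            # substituted in place of the real API
result = api.LoadTexture("wall.png")  # the game is none the wiser
assert result == "loaded:wall.png"
assert api.intercepted == [("LoadTexture", ("wall.png",))]
```

From the game's perspective the substitution is invisible, which is what lets the system work without access to the game's source code.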
  • the game run in conventional PC-based mode initializes with an API call for the OpenGL dynamic link library, called "opengl32.dll", stored in the C:\Windows\System directory.
  • the game software loads the opengl32.dll, then sends the stream of game data generated by play of the game to opengl32.dll for linkage to the appropriate API drivers for rendering the game's series of scenes to the 2D screen.
  • the API drivers render the game data to image data and send the image data to the graphics card used by the API to drive the 2D display.
  • a wrapper for the Pseudo OpenGL Driver, named "opengl32.dll", is substituted in the System directory in place of the OpenGL .dll formerly of that name.
  • the OpenGL API is never actually initialized; in fact, it is not needed on the machine at all.
  • the Pseudo OpenGL Driver linked to the pseudo OpenGL wrapper pretends to be the OpenGL driver; however, all the data sent to it is converted into a format that can be rendered for stereo vision by a dual rendering system for the dual R/L stereoscopic image outputs.
  • DirectX 8 is used as the rendered data format since it can support the use of multiple outputs to multiple graphics cards. For about 370 functions, some translation and/or redirection is required. Generally speaking, only about 20% of the functions are actually used by games. Each of these functions has a small amount of code that is translated. Translation could be as simple as calling "LoadDirectXTexture" when "LoadOpenGLTexture" is called, for example.
  • the DirectX 8 calls are linked through the real DirectX 8 .dll ("d3d8.dll").
  • Other functions require large amounts of code that converts vertex, index, or texture data. All the game data is handled in this way by the Pseudo Driver.
  • the Pseudo Driver effectively ports Quake3 for OpenGL and 2D display to DirectX 8 for stereoscopic display without touching Quake3 source code.
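The per-function translation described above amounts to a dispatch table from intercepted OpenGL-style calls to DirectX 8-style calls. The sketch below uses invented function names (neither API actually exposes `LoadOpenGLTexture` or `LoadDirectXTexture`; the patent itself uses them only as shorthand), so treat it purely as an illustration of the mechanism:

```python
# Sketch of the translation layer: a table mapping intercepted
# OpenGL-style calls onto DirectX 8-style calls. Function names are
# illustrative shorthand, not real API signatures.

dx8_log = []

def LoadDirectXTexture(name):
    dx8_log.append(("LoadDirectXTexture", name))

def DrawDirectXPrimitives(verts):
    dx8_log.append(("DrawDirectXPrimitives", tuple(verts)))

# Simple one-to-one translations; real conversions of vertex, index,
# or texture data would need far more code, as the text notes.
TRANSLATE = {
    "LoadOpenGLTexture": LoadDirectXTexture,
    "DrawOpenGLPrimitives": DrawDirectXPrimitives,
}

def pseudo_call(opengl_func, *args):
    """Entry point of the pseudo driver: redirect an intercepted
    OpenGL-style call to its DirectX 8 equivalent."""
    TRANSLATE[opengl_func](*args)

pseudo_call("LoadOpenGLTexture", "wall.png")
pseudo_call("DrawOpenGLPrimitives", [(0, 0), (1, 0), (0, 1)])
assert dx8_log[0] == ("LoadDirectXTexture", "wall.png")
```

Since only a minority of the roughly 370 functions are actually used by games, a table of this kind stays manageable in practice.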
  • FIG. 3A shows a Glide game run in conventional mode for a 2D display
  • FIG. 3B shows the Glide game run in pseudo wrapper mode for a 3D display. Source code for a Pseudo Glide2x.dll wrapper was written and stored in the C:\Windows\System directory.
  • the Pseudo Glide wrapper exports the same rendering functions as the real Glide2x.dll. From the outside, the two .dlls are indistinguishable.
  • When a Glide game such as Unreal Tournament runs, it sends game data to the Pseudo Glide wrapper, which manipulates the data, changing it into a format for stereoscopic display to two video cards for the right and left image viewpoints.
  • DirectX uses a linking structure named the Component Object Model (COM), which is a different method of storing functions inside dynamic link libraries. Therefore, the Pseudo DirectX wrapper was written to handle the COM link structure.
  • the code for the DirectX COM wrappers is more complex than the OpenGL or Glide wrappers. For example, in the opengl32.dll structure, all of the rendering functions are accessible to OpenGL programmers. However, the DirectX COM structure has an initial index which only points to 3 categories of functions: ValidatePixelShader, ValidateVertexShader, and Direct3DCreate8. The category index has a link structure which points to the actual rendering functions one layer deeper.
  • the DirectX API named "d3d8.dll" is loaded.
  • the game must first call the Direct3DCreate8 function, which returns a class pointer. This class pointer can then be used to access all DirectX 8 rendering functions.
  • the pseudo DirectX wrapper handling the COM method also requires a wrapper for the class.
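The extra wrapper layer that COM requires can be sketched as follows. This is an analogy in Python, not real COM code: the game reaches the rendering functions only through the class pointer returned by `Direct3DCreate8`, so the pseudo wrapper must substitute its own factory and wrap the object that the factory returns. The device methods shown are illustrative stand-ins.

```python
# Sketch of why the COM link structure needs a wrapper for the class:
# rendering functions sit one layer deeper, behind the object returned
# by the Direct3DCreate8 factory. All names are illustrative stand-ins.

class RealDevice:
    def SetTexture(self, name):
        return f"set:{name}"

def Direct3DCreate8():
    return RealDevice()   # the real factory returns the class pointer

class WrappedDevice:
    """Wrapper for the class returned by the factory; intercepts the
    rendering functions one layer deeper than the initial index."""
    def __init__(self, real):
        self._real = real
        self.seen = []

    def __getattr__(self, name):
        def hook(*args):
            self.seen.append((name, args))            # intercept the call
            return getattr(self._real, name)(*args)   # forward to the real device
        return hook

def pseudo_Direct3DCreate8():
    """What the pseudo d3d8.dll would export in place of the real factory."""
    return WrappedDevice(Direct3DCreate8())

device = pseudo_Direct3DCreate8()  # the game believes it got a real device
assert device.SetTexture("wall.png") == "set:wall.png"
assert device.seen == [("SetTexture", ("wall.png",))]
```

Wrapping the factory alone is not enough; without also wrapping the returned class, every subsequent rendering call would bypass the pseudo driver entirely.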
  • FIG. 4A shows a DirectX game run in conventional mode for a 2D display
  • FIG. 4B shows the DirectX game run in pseudo wrapper mode for a 3D display.
  • Source code for a Pseudo DirectX wrapper was written and stored in the C:\Windows\System directory. There are actually two DirectX wrappers stored: one for the DirectX 7 .dll named "d3dim700.dll" for games written for DirectX 7, and one for the DirectX 8 .dll named "d3d8.dll" for games written for DirectX 8.
  • the pseudo d3dim700.dll converts DirectX 7 function calls and data into DirectX 8 function calls, whereas the pseudo d3d8.dll links directly to DirectX 8 function calls.
  • the Pseudo DirectX wrapper renders the game data into a format for stereoscopic display to two video cards for the right and left image viewpoints.
  • In FIG. 5, an example of the virtual reality game system is shown incorporating pseudo 3D display drivers for existing PC games to generate a stereo vision display.
  • the system can accommodate most of the popular games that are written for OpenGL, DirectX 7, and/or DirectX 8.
  • Pseudo OpenGL, DirectX 7, and DirectX 8 wrappers take the 3D game data output of any of the games and re-direct it to Dual Rendering links to real DirectX 8 rendering functions.
  • the resulting R/L stereo image outputs are fed to dual graphics cards, which are nVidia GeForce2 cards using the card-specific driver nvdisp.drv in this example.
  • the separate R and L image display outputs are fed to the respective R and L eyepieces of a stereo vision head mounted display.
  • a parallel system can be configured for Glide games using a Pseudo Glide wrapper and Glide-specific graphics cards.
  • FIG. 6 shows an alternate configuration in which the R/L stereo image outputs are fed to a single dual-head graphics card, which is an ATI Radeon 8500 Dual Monitor card in this example.
  • the single "dual head" card has 2 VGA monitor-capable outputs. Some of the cards components, like the PCI interface for example, are shared between the two devices. As a result, the single card solution is slower than the same system with dual cards. Thus, the dual-head system offers a tradeoff of somewhat lower performance against a lower cost than the two-card system.
  • Extremely high-end stereo devices take two inputs, one for each eye. Typically, the two inputs are provided from two separate video cameras to achieve stereoscopic vision in the final 3D display.
  • the Pseudo Driver instead provides a high-end synthetic connection to the 3D display through the re-direction of 3D game data to dual rendering functions and dual graphics cards to provide the two stereoscopic images.
  • Each card (or card head) renders a slightly different image, one from the viewpoint of the left eye, and the other from the viewpoint of the right eye. Because both frames are rendered simultaneously, the typical 2x stereo vision slowdown is avoided.
  • This allows regular PC games like Quake3 to be viewable in stereo vision using the latest 3D display technology.
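The "slightly different image" per card comes from offsetting the scene camera by half the interocular distance for each eye. The sketch below is purely illustrative geometry under assumed conventions (camera looking down −z, offset along the x axis, an assumed 64 mm interocular distance); real drivers operate on full 4×4 view matrices:

```python
# Minimal sketch of dual-viewpoint rendering: the same scene camera is
# offset left and right by half the interocular distance, and each
# viewpoint is handed to its own card (or card head). Illustrative only.

IOD = 0.064  # assumed interocular distance in metres

def eye_positions(camera_x, camera_y, camera_z, iod=IOD):
    """Return (left_eye, right_eye) positions for a camera looking
    down -z, offsetting along the x (right) axis."""
    half = iod / 2.0
    left_eye  = (camera_x - half, camera_y, camera_z)
    right_eye = (camera_x + half, camera_y, camera_z)
    return left_eye, right_eye

left, right = eye_positions(0.0, 1.7, 0.0)
# Both frames can now be rendered simultaneously, one per card,
# avoiding the 2x slowdown of rendering them in sequence.
assert right[0] - left[0] == IOD
assert left[1] == right[1] == 1.7
```

Because each card renders one viewpoint in parallel, frame rate stays at mono speed, which is the advantage the passage claims over sequential stereo rendering.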
  • the pseudo driver methodology enables an integrated VR game system to be played for most of the popular PC games known to players, and allows integration of related functions that can take advantage of its game data interception and dual rendering functions.
  • Pseudo Driver Architecture: The pseudo driver software architecture allows interfacing of VR input and output devices to a 3D game application without its source code.
  • Pseudo drivers are drivers or applications that lie between the game application and the actual legitimate video, sound, input or output driver.
  • the VR system wraps existing applications with a series of pseudo drivers.
  • the pseudo driver method generically allows creating a quality depth perception display from any 3D application without needing to access its source code and regardless of the API (Glide, DirectX, or OpenGL).
  • the outputs are two separate high quality VGA signals with no compromise in frame rate or resolution. It is not an interlaced output.
  • The 3D Recorder: Since the 3D Recorder records 3D game data output from any Glide, DirectX, or OpenGL application, it can replay the visuals of a player's game in high-quality 3D vision without needing to run the original application.
  • A further advantage is that the recorded data can be replayed on any DirectX-capable player, making it possible to use an online interface allowing members to download their mission replay to their home hardware platform.
  • the game visuals can be overlaid with other 3D content or animation, text and graphics in real time. Examples include high-scores, promotional information, and logo animation before, during or after a mission.
  • the pseudo driver methodology allows PC games to be played as VR games using head-mounted devices.
  • the HMDs allow for head tracking in real-time inside a game environment with 3 degrees of freedom (looking up/down, left/right and tilting) without access to the game source code.
  • Native tracking, as opposed to mouse-emulation tracking, allows for zero lag and high-resolution positioning, ultimately increasing the quality of immersion and reducing motion sickness.
  • a critical benefit of native tracking is that the user does not experience head recalibration, since the device knows true horizontal in real space in this mode only.
  • Duo Tracking Support: Use of HMDs frees up the player's hands to control a weapon or other type of action device.
  • 3D consumer games only support a single 2-degrees-of-freedom input device (mouse, trackball, or joystick).
  • the VR system can support two 3-degrees-of-freedom tracking devices via a combination of external drivers and game script modification.
  • With duo tracking (head tracking plus weapon tracking), a player will be able to look one way and shoot the other, for example.
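Decoupled head and weapon tracking can be sketched as two independent 3-DOF orientation sources. The class and field names below are hypothetical; the point is only that the view direction and the aim direction are read from separate devices rather than forced to agree through a single mouse-emulated input:

```python
# Sketch of duo tracking: two independent 3-DOF orientation sources,
# one for the head (view) and one for the weapon (aim), so the player
# can look one way and shoot another. Hypothetical structure only.

class Tracker:
    """One 3-DOF device reporting yaw, pitch, roll in degrees."""
    def __init__(self):
        self.yaw = self.pitch = self.roll = 0.0

    def update(self, yaw, pitch, roll):
        self.yaw, self.pitch, self.roll = yaw, pitch, roll

head, weapon = Tracker(), Tracker()
head.update(yaw=90.0, pitch=0.0, roll=0.0)      # looking left
weapon.update(yaw=-45.0, pitch=10.0, roll=0.0)  # aiming right and up

view_dir = (head.yaw, head.pitch)
aim_dir  = (weapon.yaw, weapon.pitch)
# With a single mouse-emulated device both would be forced to agree;
# with duo tracking they are decoupled.
assert view_dir != aim_dir
```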
  • Peripheral Input Engine: This tool enables the system to interface a variety of input devices, like guns, buttons on the POD, pod rail closure sensors and the like, to the 3D game or the mission control software.
  • a pseudo sound and/or force feedback driver can be added in tandem with the pseudo 3D display driver. This would allow real-time filtering of sounds and generating accurate force feedback for custom designed hardware like a force feedback vest.
  • the vest can have a number of recoil devices that are activated based on analysis of the nature and impact locations of ammunition from the virtual opponents. For example, a rocket contact from the back would trigger all recoil devices at once, while a nail gun hitting from the back to the front as the player turns would be felt accurately by the user.
  • Further applications of the pseudo driver method could include an intercom localized in the 3D environment and replacement or addition of game sound with other sounds.
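The vest example above amounts to mapping a virtual hit, by weapon type and impact direction, onto a set of recoil actuators. The sketch below is entirely assumed: the actuator layout, the angle convention, and the weapon classes are illustrative, not from the patent.

```python
# Illustrative sketch of mapping a virtual hit to vest actuators based
# on weapon type and impact direction, as in the vest example above.
# Actuator names, angles, and weapon classes are all assumptions.

ACTUATORS = {"front": 0, "left": 90, "back": 180, "right": 270}

def actuators_for_hit(weapon, impact_angle_deg):
    """Return which recoil devices to fire for a hit arriving from
    impact_angle_deg (0 = front, measured counter-clockwise)."""
    if weapon == "rocket":
        return sorted(ACTUATORS)          # area blast: fire everything at once
    # Point impact: fire the actuator whose direction is nearest,
    # measuring angular distance around the circle.
    nearest = min(
        ACTUATORS,
        key=lambda name: min(abs(ACTUATORS[name] - impact_angle_deg),
                             360 - abs(ACTUATORS[name] - impact_angle_deg)))
    return [nearest]

assert actuators_for_hit("rocket", 180) == ["back", "front", "left", "right"]
assert actuators_for_hit("nailgun", 170) == ["back"]
```

Sweeping nail-gun fire would simply be a sequence of such point impacts at changing angles, felt as the hits walk around the vest.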
  • an administration or "mission control” site employs a mission control computer 10 connected by a network to multiple game playing satellite computers 20 installed at respective game playing stations or “pods".
  • the mission control computer operates a variety of administrative programs for performing administrative functions for the game playing stations on the network.
  • Each satellite computer operates a plurality of game programs (video games, virtual reality games, simulation games, etc.) which a player may select from.
  • the mission control computer has a mission control program for controlling the games played on the satellite computers.
  • the satellite computers have a satellite game control program which responds to generic control commands issued by the mission control computer to start a selected game by loading the selected game along with its game-specific command set and sending status reports to the mission control computer.
  • the mission control computer can thus maintain centralized control of the games played on the game playing stations without having to control each of the many different games offered for play.
  • the Mission Control Program used by the mission control computer is the heart of the mission control system. It determines which game each satellite computer should be playing, how long the games will last, how much the player will be charged, etc.
  • the Mission Control Program connects to each Satellite Control Program over the network to control what that satellite computer will do.
  • the commands it issues to the Satellites can include "Start new network game", "Debrief game”, "Quit current game and start new game", etc.
  • the Mission Control Program can also connect via a wide area network or the Internet to another mission control site or to a central network server which provides an online interface to players anywhere (described in further detail below).
  • the mission control computer maintains in its associated database 12 game statistics, game playing data, and other information for a wide range of administrative functions.
  • the Mission Control Program can provide the operator of the system with reports, targeted marketing and promotional materials, membership details, current status of the system, hardware and software troubleshooting, etc. It can also generate and print detailed player statistics after each mission has finished as well as provide information access through the on-line interface. It can also maintain a history of all player information on the system, including: available funds on account or passcard; mailing and email addresses; current ranking among other players; etc. This information provides the system operator the ability to tailor a marketing campaign to the specific individuals based on the information stored in the database.
  • the mission control system can be extended to multiple mission control sites connected via a wide area network or the Internet.
  • the game data from each mission control site are replicated to the central server's master database that contains all the information for all of the sites on the system.
  • the server provides an online interface that allows players anywhere to access the game data from all (participating) sites remotely.
  • the online interface can allow a player or players to view their current stats on the system from the Internet, including player history, player information, etc. This creates content on the Internet for a system website that supports interaction and communication among players anywhere.
  • the players could also have the ability to download data from the games that they have played for their personal interests.
  • the online interface can also allow players to maintain their handles (user IDs) or accounts in the system, change address information, create or join groups (teams or clans), and chat with other players.
  • Individual site operators can customize their presence on the online interface depending on what player information they choose to share with the entire online community.
  • the Mission Control program has the ability to control multiple Satellites running different games at the same time.
  • a particular game can be played on a stand-alone station, or, for a network game, the same game is played on all designated stations offering that network game.
  • each Satellite can offer many different games, and its Control Program will load the game-specific command set from its database required for a selected one of the many games.
  • Mission Control can issue a generic control command to the Satellite, e.g., to "Start (particular) game", and the Satellite then loads the game-specific command set that will enable control of that game.
  • the Satellite game control program can read the log files, the current dialog boxes or windows opened by the application (game program) running on the system, messages from the Notification API, or some other method used by the game program for external communications.
  • the log files can be parsed for information, e.g., whether the game is still running, when a player dies, when a player kills someone, when the game is over, when the game started, etc. By gathering this information, a status report on the game can be provided by the Satellite to Mission Control.
  • An example of a typical log file for the popular game "Quake" is shown in Table II.
  • the log file is parsed for keywords identified by the system as providing status information, such as "version”, “I am (player)”, “playing demo”, “exited (level)”, “game over”, etc.
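The keyword-driven parsing described above can be sketched as a small scanner over new log lines. The keyword-to-report table below is illustrative only; it is not the patent's actual Table III, and the report names are invented:

```python
# Hedged sketch of log-file parsing: scan each log line for keywords
# and convert matches into status reports for Mission Control. The
# table of keywords and report names is illustrative, not Table III.

MESSAGE_TABLE = {
    "game over":    "GAME_OVER",
    "playing demo": "GAME_STARTED",
    "i am":         "PLAYER_IDENTIFIED",
}

def parse_log_lines(lines):
    """Return Mission Control reports for recognized log keywords."""
    reports = []
    for line in lines:
        lowered = line.lower()
        for keyword, report in MESSAGE_TABLE.items():
            if keyword in lowered:
                reports.append(report)
    return reports

log = [
    "version 1.09",
    "I am Player_7",
    "playing demo demo1.dm2",
    "game over",
]
assert parse_log_lines(log) == ["PLAYER_IDENTIFIED", "GAME_STARTED", "GAME_OVER"]
```

Because the Satellite only forwards the resulting reports, Mission Control never needs to understand any game's native log format.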
  • Table III illustrates examples of conversion of messages parsed from the log files into reports issued by the Satellite to Mission Control.
  • the Mission Control program applies generic logic to communicate with the Satellites; however, there is some minor game-specific logic to control each game better. For example, "Quake" and "Unreal Tournament" behave slightly differently on the game server. There is therefore some special logic for games like "Quake", which provides full logging functionality on its server, as compared to "Unreal Tournament", which provides no logging on the server; its status is provided instead via a "virtual cam" recorder.
  • Mission Control is only notified by the Satellite when there are changes to the system state. For example, if the game were to finish, the Satellite would send Mission Control the command GAME_OVER. To avoid assuming that a machine is still working, Mission Control also expects a GAME_READY command from the Satellite every 10 seconds or so, with a parameter indicating whether it is ready or not. This feature is mainly used for situations beyond the control of the Satellite or Mission Control, where the machine locks up or the operator exits the program unexpectedly.
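The GAME_READY heartbeat amounts to a watchdog on the Mission Control side. In the sketch below, the 10-second interval comes from the text, while the grace factor, class names, and data layout are assumptions:

```python
# Sketch of the GAME_READY heartbeat: Mission Control marks a Satellite
# as down if no heartbeat arrives within the expected interval. The
# 10-second interval is from the text; the rest is an assumption.

HEARTBEAT_INTERVAL = 10.0   # seconds between expected GAME_READY reports
GRACE_FACTOR = 2.0          # tolerate one missed report before alarming

class SatelliteMonitor:
    def __init__(self):
        self.last_seen = {}   # satellite id -> time of last GAME_READY

    def heartbeat(self, sat_id, now):
        self.last_seen[sat_id] = now

    def is_alive(self, sat_id, now):
        last = self.last_seen.get(sat_id)
        return last is not None and (now - last) <= HEARTBEAT_INTERVAL * GRACE_FACTOR

mon = SatelliteMonitor()
mon.heartbeat("pod-3", now=0.0)
assert mon.is_alive("pod-3", now=9.0)       # within one interval
assert mon.is_alive("pod-3", now=19.0)      # one missed report tolerated
assert not mon.is_alive("pod-3", now=25.0)  # machine presumed locked up
```

This is what lets Mission Control detect a locked-up machine or an unexpectedly exited program even though the Satellite itself can no longer report the problem.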
  • the Notification API is a set of tools that allows Mission Control to determine the current status of a game program without doing any detailed research on the program itself. This is achieved by giving the game developer or program developer a set of APIs that they call in their application to notify of the current status of the program. For example, these notifications can include "Game started", "Game over", "Program started", "Player joined", "Player killed", etc.
  • the Notification API is stored as a set of DLLs available for each operating system supported by Mission Control.
  • the Control API allows Mission Control, and more specifically the Satellite, to control the current game or program without knowing any detailed information about the program itself. This is achieved by providing the game developer or program developer a set of APIs that they use to issue commands that "hook into" a particular game, e.g., "Start a new game", "End current game", "Join network game", etc.
  • To port an existing game into Mission Control without having used the Notification API or the Control API, the game or program must be analyzed for details of the logic sequences needed to control it, e.g., how to start and stop the game, how to tell if the program is still running, how to start the program, how to start a game server, how to join a game server, etc.
  • the command architecture for each game offered by the system is analyzed and the appropriate game control signals matched to the activation, termination, and control logic for each game are stored in the Satellite database.
  • the game control signals may be stored in a relational database indexed to each particular game.
  • the relational database retrieves the set of all command signals used by that specific game and loads it with the Satellite operating system so that activation by the player of the hardware buttons and other controls results in the correct signals being delivered to the game.
  • To let the "Unreal Tournament" game know that the player wants to shoot a weapon in the game when the player presses a trigger on the hardware console, the computer system must have the game-specific command set loaded so that a lookup of "Fire" in the command set results in the corresponding Keybd_Event of the <CTRL> signal being sent to the game. Similarly, the input "Move Forward" for this and other games will result in the <UP ARROW> keyboard signal being sent, etc.
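The game-specific command set can be sketched as a per-game lookup table loaded when the game starts. The Unreal Tournament bindings below come from the text; the Quake bindings and all class names are invented for illustration:

```python
# Sketch of a game-specific command set: a generic console action like
# "Fire" is looked up in the loaded key table and resolved to the key
# event that particular game expects. The Quake entries are assumed.

KEY_TABLES = {
    "Unreal Tournament": {"Fire": "<CTRL>", "Move Forward": "<UP ARROW>"},
    "Quake":             {"Fire": "<MOUSE1>", "Move Forward": "<W>"},
}

class Satellite:
    def load_game(self, game):
        # Stands in for loading the command set from the Satellite database.
        self.key_table = KEY_TABLES[game]
        self.sent = []

    def hardware_input(self, action):
        """Translate a generic console action into the game's key event."""
        self.sent.append(self.key_table[action])

sat = Satellite()
sat.load_game("Unreal Tournament")
sat.hardware_input("Fire")          # trigger pressed on the console
sat.hardware_input("Move Forward")
assert sat.sent == ["<CTRL>", "<UP ARROW>"]
```

Because only the table changes per game, the same hardware console and the same generic Mission Control commands work across every game the Satellite offers.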
  • FIG. 9 illustrates a typical "Start game" sequence by which the Satellite Control Program interacts with the Mission Control Program.
  • Mission Control sends a generic "Start Network Game” (Unreal Tournament) command to the Satellite Control Program.
  • the Satellite Program reads from its database the corresponding Key Table at block 31, and the Message Table for parsing the game's log files at block 32.
  • At decision 33, it tests whether Unreal Tournament is already running, or at decision 34 whether another game is running. If another game is running, it is shut down at block 35; then the parser for Unreal Tournament is initialized at block 36, and the Unreal Tournament game is launched at block 37. Once the game is ready (decision 38), the "Ready" status is reported to Mission Control at block 39.
  • the Satellite Control Program checks whether its system is connected to the Network Mission (communicating with the other Satellites having players participating in the same mission) at decision 41, then sends a "Connected" message to Mission Control at block 42. It then checks whether Mission Control has sent the "Everyone Ready" message at decision 43, then starts the game at block 44. The game is then played by the player on the Satellite, in conjunction with the other participating players on other Satellites, at block 45. All signals exchanged between networked players are handled at the game application level. When the "Game Over" message is parsed from the game's log files at block 46, the Satellite then sends a "Game Over" message to Mission Control at block 47.
  • Through the use of generic control commands to the multiple Satellites running many different games at different times, the Mission Control system provides a universal intermediary between the game programs and the hardware interface. Mission Control maintains data on the games played on the system by tracking the status reports of the Satellites. It can also handle other desired management functions, such as cash dispensing, briefing/training, monitoring game hardware, mission control, back-end information management, member management, Internet connectivity for remote management, automated updates, and player interactions.
  • the Mission Control system allows the site operator multi-faceted control and interactivity with participants, thus enhancing player experiences and the value delivered.
  • the Mission Control system leverages a database-driven client-server architecture. Each client fully controls the activity of all applications on the client computer and reports to Mission Control, the central server, which coordinates and monitors every single event in the system.
  • the server provides a comprehensive graphical user interface for monitoring full system activity and for modifying or creating individualized missions based on a variety of player-desired parameters.
  • By using a high-end database engine (e.g., Sybase), the Mission Control system gains the benefit of storing all information and events in a scalable database engine, with the capacity for replication for backup, remote management, site link-up, and on-line interface capabilities.
  • the Mission Control system can also record the experiences of network game players by running a satellite client of the game as an "observer".
  • the "observer” can enter the game space along with the other players and see (record) the game action. The recording can be done from different camera angles or points of view.
  • this "observer” is referenced as Virtual Cam 22.
  • the Virtual Cam is programmed to execute a sequence of views or to record specific actions in the game when certain conditions are detected.
  • the Mission Control system can be combined with other technologies in a totally integrated game entertainment system.
  • the integrated system can include other 3D and VR enhancement tools like the use of player and spectator videocams to record different viewpoints using 3D space analysis, artificial intelligence methods, and a library of pre-defined camera angles and motions to generate high-quality
• the entire VR system can be controlled and monitored through a user-friendly GUI. All user and system activity can be easily monitored. All software components can be protected via a variety of security methods, including prevention of code decompilation, encryption of network packets, etc.
• the online interface provided by linking mission control sites to a network server and master database over the Internet gives players access to enhanced services, the ability to communicate with each other, and network game play.
  • Two or more Mission Controls can combine game controls and player input and output into a format that can be streamed over the network or Internet for playback on any other system.
  • players can remotely view and download their stats, their buddies' stats, clan stats, find out who the competition is, vote for best player, change handles, email or street addresses, choose player skins or faces for games that they wish to play, etc.
• the accumulation of player information at each Mission Control site will allow an operator or group of operators to organize and automate the management and marketing to players for any entertainment site or chain of sites. For example, repeat business can be automatically targeted with email messages to existing members or to new members introduced by existing members; competitions and special events can be organized for players via email; and past members can be automatically contacted for return visits.
  • the system can store player information on selection of games, teams, choice of bots, weapons, etc.
• a statistics engine can analyze and print players' stats as well as store them for the on-line interface. This provides the player with a detailed account of their game.
• Video conferencing capability can be provided between mission control sites to allow players to join a game at another site and to have a briefing and debriefing with other players to exchange tips and strategies, etc. When the game is over, they will be able to talk and see each other during the playback of the debriefing.
  • the Mission Control system can also provide players' "team leaders” with access to a mini version of the Mission Control to allow them to view the progress of games from different camera angles and help with strategy development and execution. This will also increase spectator interest as they can gain insight into the leaders' methods and approaches, similar to being able to hear a coach discuss strategy while watching the game on television.
  • a similar mini version of the Mission Control can be provided for "directors" to mimic the control center for a televised sports event where the director has access to many cameras and directs which angles the cameras focus on and decides which ones to utilize.
• the Mission Control system can greatly facilitate social interaction among players and teams and excitement over the games. All the above-mentioned features can combine to provide the best possible entertainment experience from the standpoint of technology, game play, social interaction, competitive aspects, spectator sports, and length of experience versus cost, and ultimately lead to high repeat business and a viable financial model.
• Example game log output:
Azure Agony
Camera running
Guest313 entered the game
Guest312 entered the game
Debug entered the game
Guest313 left the game with 0 frags
Camera deactivated
Guest312 left the game with 0 frags
Debug left the game with 0 frags
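Status extraction from log lines like these (the scanning the Satellite Control Program performs when building its status reports) might look roughly as follows. The line patterns are inferred from the sample log above and will differ from game to game:

```python
import re

def parse_game_log(lines):
    """Extract player enter/leave events and frag counts from a game log.

    The patterns ("<name> entered the game", "<name> left the game with
    <n> frags") mirror the sample log above; real log formats vary.
    """
    enter = re.compile(r"^(\S+) entered the game$")
    leave = re.compile(r"^(\S+) left the game with (\d+) frags$")
    frags, in_game = {}, set()
    for line in lines:
        line = line.strip()
        if m := enter.match(line):
            in_game.add(m.group(1))
        elif m := leave.match(line):
            in_game.discard(m.group(1))
            frags[m.group(1)] = int(m.group(2))
    return frags, in_game

sample = [
    "Guest313 entered the game",
    "Guest312 entered the game",
    "Debug entered the game",
    "Guest313 left the game with 0 frags",
    "Guest312 left the game with 0 frags",
    "Debug left the game with 0 frags",
]
```

The resulting frag counts and roster can then be folded into the status reports and statistics that Mission Control accumulates per player.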

Abstract

A virtual reality game system and method uses pseudo drivers to generate stereo vision outputs for a 3D stereoscopic display from game software normally intended for output to a 2D display of a conventional game console or PC. A 'mission control' system is also provided for controlling multiple game playing satellite computers (20) on a network running many game programs of different types from different publishers. The mission control program sends generic control commands to satellite computers for controlling any of the game programs, and the satellite game control program loads a game-specific command set from its database (12) for controlling the selected game program, and also provides the mission control program with information on the status of the game program. A plurality of mission control sites (10) can be connected via Internet to a network server which provides an online interface to players anywhere.

Description

VIRTUAL REALITY GAME SYSTEM WITH PSEUDO 3D DISPLAY DRIVER & MISSION CONTROL
SPECIFICATION
TECHNICAL FIELD
This invention generally relates to virtual reality game systems which provide a three-dimensional (3D) immersive experience to game players, and more particularly, to methods for creating 3D stereo vision displays from popular video games, and a mission control system for administration of multiple game playing satellites (pods).
BACKGROUND OF INVENTION
Commercial virtual reality games are currently played at VR game stations with one or more players. To create an immersive environment without the high cost of installing surrounding wall displays in large room environments, the commonly used VR game station typically provides a VR game that is played by a player wearing stereoscopic goggles or another 3D head-mounted display (HMD) and manipulating a weapon or other action equipment while executing physical motions such as turning, aiming, crouching, jumping, etc., on a platform or cordoned space. The VR games played on conventional VR game stations typically are written for the specific, often proprietary, hardware and operating systems provided by manufacturers for their VR game stations. As a result, there are only a limited number of VR games available for play at current VR game stations.
Players of VR games often want to play games that are popular video games they are used to playing on game consoles or PCs. Even though many video games are written to create 3D game effects, the common video game console or PC hardware supports image displays for 2D monitors or TV screens. While 2D displays allow the viewer to view the image in simulated 3D space, they do not provide the immersive depth of vision of a true 3D experience. It is as if the viewer is seeing the 3D image with only one eye. Popular video games therefore are not used at VR game stations employing stereoscopic 3D displays unless the publishers of those video games have chosen to write versions for operation on the hardware and operating systems used at VR game stations of the different manufacturers.
It would therefore be very desirable to have a VR game system in which popular 3D video games written to be displayed on 2D display hardware can be operated to provide a 3D stereoscopic display without having to re-write the video game software for the 3D display hardware. It would also be very useful for a new VR game system to enable other 3D game services for VR game players based upon popular video games they want to play on VR game stations.
Another problem with commercial game systems is that video games and VR games are commonly played at stand-alone game stations. A player typically chooses whatever game he/she wants to play and queues up at or is assigned a stand-alone game station which is loaded with the selected game. Some arcade systems have a computerized station for handling some common administrative functions such as player sign-in, maintaining player accounts, or logging game dates or selections. However, they do not utilize centralized control to keep track of and monitor plays at multiple game stations unless all the games are written in the same proprietary format used by the producer of the arcade system. If a multi-player network game is offered, the game stations are loaded only with the same game and exchange data for only that game via a local network connecting the game stations together. As a result, current arcade systems either do not offer the many popular video game titles written by other game publishers, or offer a given game title only on a stand-alone game station, or a multi-player game title only on dedicated networked game stations, which do not offer other game titles and are not subject to centralized control.
SUMMARY OF INVENTION
In accordance with one aspect of the present invention, a method (and system) for operating three-dimensional (3D) application software intended to provide output to a two-dimensional (2D) screen display comprises: (a) running the application software in its normal mode to generate 3D application data output which is normally to be sent to an application programming interface (API) driver for the 2D screen display;
(b) intercepting the 3D application data output from the application software and redirecting the data to a pseudo driver for generating a 3D stereoscopic display; and
(c) using the pseudo 3D display driver to generate a 3D stereoscopic display.
In a preferred embodiment, the 3D application is a 3D video game, and the 3D stereoscopic display is a set of head-mounted stereo vision goggles used in a virtual reality (VR) game system. The VR game system employs the pseudo 3D display driver to convert 3D game data from existing 3D video game software intended for 2D screen display to right and left stereoscopic image data for the 3D stereoscopic display. Conversion to stereo vision requires the generation of specific right and left image viewpoints which are combined by human vision to yield an immersive 3D image. The Pseudo Driver converts the 3D game data output of the video game software in any of the application programming interface (API) formats commonly used for popular video games to an API format that supports the handling of stereoscopic image outputs, thereby allowing hundreds of existing 3D video games to be played in a commercial VR game system. The invention method can also be used to generate 3D stereoscopic displays for games played on video game consoles or PCs for home use.
As a further aspect of the invention, the intercepted 3D game data can be stored by a 3D data recorder for later play back. In this mode, a game player can replay a game or scene of a game they previously played, or another player can re-enact the game played by another player. The 3D game data can also be transmitted or downloaded to a remote player through an online interface. This would allow the replay of the player's 3D visuals at home or on other hardware platforms without the original game software (like replaying previously recorded video).
The intercepted 3D game data being re-directed to the Pseudo Driver can also be overlaid, edited, or combined with other 2D or 3D images through a mixer for real-time enhancement of the resulting displayed 3D content. Examples include high-score rewards, promotional information, and logo animation before, during, and after a game or mission.
The Pseudo Driver for the 3D stereoscopic display can also be operated in tandem with other pseudo drivers such as drivers for stereo sound and/or directional force feedback.
In another aspect of the present invention, a mission control (administration) system for controlling multiple game playing satellite computers on a network comprises:
(a) a mission control computer which operates administrative programs for performing administrative functions for multiple game playing stations connected by the network; (b) a plurality of game playing satellite computers provided at respective game playing stations each maintaining a plurality of game programs;
(c) a network connecting the mission control computer to the plurality of game playing satellite computers,
(d) wherein said mission control computer includes a mission control program for controlling the plurality of games available to be played on the game playing satellite computers by issuing generic control commands to the game playing satellite computers, and
(e) wherein each of said game playing satellite computers includes a satellite game control program for controlling each of the plurality of game programs available to be played on the satellite computer by receiving a generic control command to start a selected game program issued by said mission control computer and loading in response thereto a game-specific command set corresponding to the selected game program, and by providing said mission control computer with a status report of the status of the selected game program being played on the satellite computer.
In a preferred embodiment of the mission control system, the satellite control program scans game log files as games are played and extracts game status information from the log files for its status reports to the mission control computer. The control program of the mission control computer uses the status report information for a wide range of administrative functions. For example, the mission control computer can generate system-wide gaming reports, membership and player statistics, detailed statistics on specific games played by specific players, the current status of the system, hardware and software troubleshooting, etc.
The satellite computers each use the same control program and maintain in a database the game- specific command sets for the game programs offered on the satellite computer. The game-specific command sets are initially derived by analyzing each game offered on the system and determining the activation, control and termination logic for each game. When a generic control command is issued by the mission control computer to start a particular game, the satellite control program loads the selected game with the corresponding game-specific command set. In this manner, the mission control computer can maintain centralized control of the game playing stations while offering many different games for play.
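The lookup described above, in which one generic command from Mission Control fans out to a per-game command set held in the satellite's database, can be sketched as follows. The game names, command entries, and paths are invented for illustration; in practice each command set is derived by analyzing the actual activation, control, and termination logic of the game:

```python
# Illustrative stand-in for the satellite's database of game-specific
# command sets; all entries below are assumptions, not real game configs.
GAME_COMMAND_SETS = {
    "quake3": {
        "START": ["quake3.exe", "+connect", "server", "+map", "q3dm1"],
        "STOP":  ["taskkill", "/im", "quake3.exe"],
        "LOG":   r"C:\games\quake3\qconsole.log",
    },
    "unreal": {
        "START": ["unreal.exe", "server.local?game=deathmatch"],
        "STOP":  ["taskkill", "/im", "unreal.exe"],
        "LOG":   r"C:\games\unreal\server.log",
    },
}

def handle_generic_command(command, game_id):
    """Translate a generic Mission Control command (e.g. "START") into
    the game-specific command set for the selected game program."""
    try:
        return GAME_COMMAND_SETS[game_id][command]
    except KeyError:
        raise ValueError(f"no {command!r} entry for game {game_id!r}")
```

Because Mission Control only ever issues the generic commands, adding a new game title requires only a new database entry on the satellites, not any change to the mission control program.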
The mission control site may be networked to another mission control site or to a plurality of mission control sites through a wide area network or the Internet. The databases of multiple mission control sites can be replicated to a master database of a network server that provides an online interface for players in any location through the Internet. The online interface allows the system to offer a wide range of related entertainment services to players anywhere, such as looking up statistics for games they or their buddies have played at any of the mission control sites, comparing their statistics to players at other sites, downloading statistics, maintaining accounts, joining groups of players, and communicating with other players.
Other objects, features, and advantages of the present invention will be explained in the following detailed description of the invention having reference to the accompanying drawings.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1A is a block diagram illustrating the overall invention method of intercepting 3D game data and using pseudo 3D display drivers for generating a 3D stereoscopic display, and FIG. 1B is a block diagram illustrating a preferred method for operation of the pseudo driver through the use of the "dll wrapper" method.
FIG. 2A is a diagram illustrating the conventional API function call for a 2D display from a first type of PC game (OpenGL) software, as compared to FIG. 2B illustrating the pseudo API call for generating a 3D stereoscopic display.
FIG. 3A is a diagram illustrating the conventional API function call for a 2D display from a second type of PC game (Glide) software, as compared to FIG. 3B illustrating the pseudo API call for generating a 3D stereoscopic display.
FIG. 4A is a diagram illustrating the conventional API call for a 2D display from a third type of PC game (DirectX) software, as compared to FIG. 4B illustrating the pseudo API call for generating a stereoscopic display.
FIG.5 is a diagram of a virtual reality (VR) game system using pseudo 3D display drivers to drive dual graphics cards for generating a 3D stereoscopic display for different types of PC game software.
FIG. 6 is a diagram of a VR game system using pseudo 3D display drivers to drive a single dual-head graphics card for generating a 3D stereoscopic display for different types of PC game software.
FIG. 7 is a diagram illustrating a mission control system having a mission control (administration) computer connected to multiple game playing satellite computers (stations or pods) for centralized control in accordance with the present invention.
FIG. 8 is a diagram illustrating a network server connected to multiple mission control sites and providing an online interface to players anywhere through the Internet to services based on data replicated from the mission control sites.
FIG. 9 is a flow chart illustrating the sequence by which the control program at a satellite computer responds to a generic control command from the mission control program to load and operate any one of a plurality of game programs offered on the system.
DETAILED DESCRIPTION OF INVENTION
In the following description of the invention, a 3D application software generates 3D application data intended for rendering to a 2D display, but the 3D application data are intercepted and rendered by pseudo drivers for a 3D display instead. In a preferred implementation, the 3D application is a 3D video game, and the 3D display is a stereoscopic display device. The advantages of this implementation are described in terms of the capability of configuring a commercial virtual reality (VR) game system (with multiple pods) to offer players their choice of many popular video games in an immersive VR mode with stereo vision. However, it is to be understood that the principles of the invention disclosed herein apply equally to other types of games, programs, and 3D applications, including, for example, CAD applications, simulation applications, and the like, as well as to other use environments, such as home use, standalone PCs, networked game stations, and online (Internet) gaming.
Referring to FIG. 1A, the basic method and system of the present invention is illustrated for playing one of many popular 3D video games that a player may want to play in 3D vision. The existing (previously written) 3D video game software 10 is played by a Player and generates a stream of 3D visuals through a game engine that outputs 3D game data. Video games are written using one of several common Application Programming Interfaces (API) for handling the rendering and display functions of the game. In a conventional mode (dashed arrows), the 3D game data (series of polygons making up image objects to appear in scenes, and light, shading, and color data) are output with API function calls to conventional API drivers 12, which render the 3D game data into display image data that are fed to a graphics display card 14 and result in a 2D image displayed on a 2D display monitor 16.
In the present invention (solid line arrows), the 3D game data output of the video game software 10 are intercepted and redirected to pseudo API drivers 20 which generate right (R) and left (L) stereoscopic image outputs to right and left stereoscopic display cards 22, 24 that generate the resulting 3D stereoscopic display on a 3D display device 26. "Stereo vision" refers to immersive visual images which provide depth perception to the viewer. Depth perception is obtained by delivering appropriate right and left offset images to the user's right and left eyes.
The API function calls intercepted and re-directed to the Pseudo API Drivers 20 result in the intercepted 3D game data output being processed to R/L image data that can be viewed on a 3D display device, such as VR goggles, a helmet, or a "no glasses required" 3D monitor. In order to use any of the hundreds of existing PC games, the Pseudo Drivers are written to handle the common API formats used for PC games, such as Glide (TM), developed by 3dfx Interactive, Inc., of Alviso, CA; OpenGL (TM), developed by Silicon Graphics, Inc. (SGI), of Mountain View, CA; or DirectX (TM), distributed by Microsoft Corp., of Redmond, WA.
As illustrated in FIG. 1B, the invention method intercepts and redirects the API function calls and
3D game data output from the existing 3D video game software 10 to Pseudo API Drivers 20. In the preferred implementation shown using the so-called "dll wrapper" method (specific examples described in detail below), the Pseudo Drivers 20 consist of a Wrapper 21 which is given the same name in the
System directory as the dynamic link library ("dll") for the original API drivers ("Original Drivers"), while the original dll is renamed and maintained with the Original Drivers. When the video game software is initialized, it calls the dll for the API drivers in its usual mode. Because it has assumed the original dll name, the Wrapper 21 is called instead of the original dll and drivers and effectively intercepts the API function calls and 3D game data of the video game software. The Wrapper 21 establishes a Stereo Viewpoints module 22 and sets up parallel R and L rendering engines from the renamed original dll and drivers, one rendering engine 23 for rendering right (R) image data, and the other rendering engine 24 for rendering left (L) image data. The Wrapper 21 sends the 3D game data to the Stereo Viewpoints module 22 where right (R) and left (L) viewpoints are calculated or specified for the 3D game data, resulting in R View data and L View data. The API function calls are directed by the Wrapper 21 to the R rendering module with the R view data, resulting in rendering the R image data, and to the L rendering module with the L view data, resulting in rendering the L image data. The R and L image data are then sent to the R and L display cards for the 3D stereoscopic display (see FIG. 1A).
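The wrapper idea can be illustrated schematically (in Python rather than as an actual Windows dll) as a proxy that poses as the original rendering library, forwards every intercepted call to parallel right and left engines, and applies a per-eye offset only where the camera is set. All names and the 65 mm eye separation are illustrative assumptions:

```python
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def _norm(v):
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def _add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def _scale(v, s):
    return tuple(c * s for c in v)

class StereoWrapper:
    """Schematic stand-in for the Wrapper 21: it assumes the identity of
    the original rendering library and fans each call out to parallel
    R and L rendering engines."""
    def __init__(self, right_engine, left_engine, eye_sep=0.065):
        self._r, self._l = right_engine, left_engine
        self._eye_sep = eye_sep  # interocular distance in metres (assumed)

    def __getattr__(self, name):
        # Intercept any API function call the game would have made on the
        # original dll and replay it on both rendering engines.
        def forwarded(*args, **kwargs):
            getattr(self._r, name)(*args, **kwargs)
            getattr(self._l, name)(*args, **kwargs)
        return forwarded

    def set_view(self, eye_pos, look_dir, up):
        # The one call that differs per eye: shift each camera half the
        # interocular distance along the view-space "right" vector.
        right = _cross(_norm(look_dir), _norm(up))
        half = self._eye_sep / 2.0
        self._r.set_view(_add(eye_pos, _scale(right, +half)), look_dir, up)
        self._l.set_view(_add(eye_pos, _scale(right, -half)), look_dir, up)
```

A real wrapper dll must, as the text describes, export the same symbols as the original library and forward to the renamed original drivers; the proxy above only conveys the interception-and-fan-out structure.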
In the invention, the Pseudo Driver intercepts the 3D game data between the game and the API. The 3D game data can thus be rendered into stereo vision for any specified viewpoint. In the conventional mode by contrast, the data stream from the game goes to the API which is specific to the video card, and undergoes rendering and transformation to an image fixed as 2D. The Pseudo Drivers of the invention method intercept the game data stream and invoke the same (or comparable) rendering functions to render the 3D game data into 3D stereoscopic image data, by generating specific right and left image viewpoints. The right and left image data are sent as outputs to the display cards 22 and 24, which then generate the respective bit-mapped image outputs to activate the display elements in the corresponding right and left eyepieces of the stereoscopic display unit 26. In the preferred embodiment shown, two separate display cards are used for the two stereoscopic image feeds for greater processing speed and throughput.
Computational methods for generating right and left stereoscopic images from given image data are well known, for example, as described in "3D Stereo Rendering Using OpenGL (and GLUT)," by Paul Bourke, November 1999, available at the Internet page http://astronomy.swin.edu.au/pbourke/opengl/stereogl/. The method of determining the right and left eye offset and computing corresponding left and right eye images is deemed to be conventional and not described in further detail herein.
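For concreteness, the off-axis approach described in Bourke's article offsets the two eye cameras by half the eye separation and gives each an asymmetric viewing frustum so that both projections converge at a zero-parallax (focal) plane. A minimal sketch, with parameter names of our own choosing:

```python
import math

def stereo_frustums(fov_y_deg, aspect, near, focal_length, eye_sep):
    """Per-eye asymmetric frustum bounds (left, right, bottom, top) at the
    near plane, in the style of Bourke's off-axis stereo. `focal_length`
    is the distance to the zero-parallax plane."""
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect
    # Horizontal skew of each frustum so the two projections coincide
    # at the focal (zero-parallax) plane.
    shift = (eye_sep / 2.0) * near / focal_length
    right_eye = (-half_w - shift, half_w - shift, bottom, top)
    left_eye = (-half_w + shift, half_w + shift, bottom, top)
    return left_eye, right_eye
```

Each eye then renders the same scene from its offset camera position through its own frustum; objects at the focal distance land at identical screen positions in both images, while nearer and farther objects acquire the parallax that produces depth perception.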
Referring again to FIG. 1A, an integrated Pseudo Driver system can also include a 3D game data recorder 30 (3D Recorder) for storing the 3D game data for later playback, and a mixer 40 for enhancing the 3D content, such as by overlaying, editing, or combining with other 2D or 3D images. The 3D Recorder 30 records the 3D game data stream (vertices, polygons, textures, etc.) for subsequent playback without the need to re-access the game software, such as for providing visuals while debriefing players after a game session or for replaying for a player's personal use. The mixer 40 allows other images, 2D or 3D, to be mixed or interspersed with the game images. For other 3D content, the mixer 40 takes the form of a dual rendering module which renders the other 3D content and combines it with the game content. It is advantageous to record 3D game data with the 3D Recorder between the Pseudo Driver Wrapper and the mixer (dual rendering module), because all API types will have been converted into the chosen 3D image data format (DirectX 8, as explained below). Using data compression techniques, the large amount of data can be minimized and stored to disk. The data stream can be played back by simply sending it to the dual rendering module. If the data stream is sent to the 3D Recorder between the game and the Pseudo Drivers, then the game data can be played back simply by sending it to the corresponding API.
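A toy sketch of such a recorder, capturing the intercepted call stream per frame with simple compression and replaying it against any rendering module (the call representation and compression scheme are illustrative assumptions):

```python
import pickle
import zlib

class GameDataRecorder:
    """Sketch of the 3D Recorder: capture the intercepted draw-call
    stream (vertices, polygons, textures, ...) compressed per frame,
    for later playback without re-running the game software."""
    def __init__(self):
        self._frames = []

    def record(self, frame_calls):
        # frame_calls: list of (function_name, args) tuples for one frame.
        self._frames.append(zlib.compress(pickle.dumps(frame_calls)))

    def playback(self, renderer):
        # Re-issue the recorded calls against any rendering module,
        # e.g. the dual (R/L) rendering module, in recorded order.
        for blob in self._frames:
            for name, args in pickle.loads(zlib.decompress(blob)):
                getattr(renderer, name)(*args)
```

As the text notes, placing the recorder after the Wrapper means the stream is already in a single normalized format, so playback is just feeding the stream back into the dual rendering module.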
Because of the separation between the Wrapper 21 and the mixer (dual rendering module) 40, the mixer can always be running. This allows the system total control of the display at all times, and avoids any lapse in the display if, for example, control is switched to another game. When the next game is run, the API Wrapper called by the new game re-connects with the dual rendering module.
Use of Existing Game Software in VR Systems
State-of-the-art first person games are composed of a "game engine", an object-oriented scriptable logic, and game "levels". The game engine is the essential technology that allows for 3D graphics rendering, sound engine, file management, networking support and all other aspects of the core application. The content of the game sits on top of the rendering engine in the form of scripts and levels basically setting up the series of scenes and actions ("world map") forming the visual environment and the logic within it.
Tools provided by game developers are available for modification of the scripted logic of the various objects in the world map, as well as generation of new environments. For PC games, this allows for new content to be created by game developers and the life cycle of the game to be significantly greater. The current trend in game development is to license a specific game engine and allow game developers to focus on content creation, the concept and implementation of the levels, sounds, models, textures and game logic. Using those editing tools, customized game environments can be produced, characters created, weapons and game objects designed, and special game logic implemented to create the desired game content.
Conventional 3D video games are written to be run on conventional hardware and operating systems for display on a 2D monitor, and thus the conventional experience is basically 2D. The 3D game data executes function calls to conventional API drivers for the game that result in a 2D screen image being generated. The conventional game system renders a 3D scene as a centered 2D image as if the user were viewing it with one eye. It is desirable to use existing 3D games for play in VR systems that engage players with a 3D stereo vision display for a more immersive game experience. Since the existing games output 3D game data, the 3D game data can be converted to a 3D display. However, mere connection of a 3D monitor to a standard 3D game like Quake3 (TM), distributed by Activision, Inc., CA, would not yield a stereo vision image. Doubling a centered image using 3D display hardware also would not yield a stereo image. Only the generation of specific right and left image viewpoints for stereo vision will yield a correct stereo image on a 3D display unit.
3D display technology includes, but is not limited to, HMDs, no-glasses-required monitors, LCD glasses, and hologram display units. Typically, all of these hardware types require two separate 2D input images, one for each eye. Each new type of 3D display technology comes with its own 3D format. Typically, they conform to one of the following standard formats (from highest quality to lowest quality): separate right and left (R/L) images; frame sequential images; side-by-side (left/right) images; top-and- bottom (over/under) images; or field sequential (row interleaved) image signals.
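The "single signal" packings listed above can be illustrated with frames represented as lists of pixel rows. This is a schematic sketch only; real packings operate on video signals and, as noted below, sacrifice resolution or frame rate by squeezing both eyes into one signal:

```python
def side_by_side(left, right):
    """Left/right packing: each output row is the L row followed by the R row."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def over_under(left, right):
    """Top-and-bottom packing: the L frame stacked above the R frame."""
    return left + right

def row_interleaved(left, right):
    """Field-sequential / row-interleaved packing: even rows taken from
    the L frame, odd rows from the R frame (half the vertical detail
    per eye survives)."""
    return [l_row if i % 2 == 0 else r_row
            for i, (l_row, r_row) in enumerate(zip(left, right))]
```

The highest-quality format, separate R/L signals, needs no packing at all, which is why the preferred embodiment drives two video cards with two independent image streams.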
The highest quality stereo vision signal is simply two separate R/L image signals. The remaining methods use some method of compression to pack both left and right signals into a single signal. Because of this compression, and overloading of a single signal, the stereo vision image quality is lowered, and/or the frame rate is lowered. The lower quality, "single signal" methods are typically used by lower-priced stereovision hardware, like LCD glasses. Some hardware vendors, such as nVidia Corp., of Santa Clara, CA, have recently provided support for single-signal, stereo vision formats. For example, the nVidia stereo vision drivers are contained within the nVidia video card-specific driver, nvdisp.drv. The nVidia driver effectively converts a 3D game written for DirectX or OpenGL to be viewable in stereo vision using any single-signal 3D device that is connected to the nVidia video card. However, these card-specific drivers only work if the manufacturer's video card is used. Conventional hardware manufacturers do not support card-independent high-end, separate right and left image signals.
Another important aspect of the invention is the interception of the data stream at the game-API level. Conventional stereovision drivers are established between the API and the video card, and the code existing between the API and the video card requires hardware-specific code. Drivers on that level need to be made by the manufacturer of the video card hardware, which is a drawback in a game system that offers many different games using the same video card hardware. Another drawback is that the data has already undergone a 3D game data to 2D image data transformation, and is therefore fixed as 2D. Once the data are converted to 2D, the 2D data can be converted to stereovision only with "less visually accurate" mathematics.
In the preferred embodiment of the invention, two separate video cards 22 and 24 are used for the separate right and left signal inputs of high-end 3D display devices. Doubling the number of video cards allows the right and left stereo images to be rendered separately and simultaneously. This avoids the typical 2x slowdown required to display stereo rather than mono. The Pseudo Driver thus allows a normal 3D game to power two video cards, which in turn can power high-end 3D display hardware such as the V6 or V8 (TM) Stereovision Head Mounted Displays, distributed by Virtual Research Systems, Inc., of Santa Clara, CA; the Visette (TM) Stereovision Head Mounted Display, distributed by Cyber Mind, Ltd., of Leicestershire, UK; the Datavisor (TM) Stereovision Head Mounted Display, distributed by N-Vision, Inc., of McLean, VA; or the DTI 2015XLS or 2018XLQ (TM) Stereovision Monitor, distributed by Dimension Technologies, Inc.
Pseudo 3D Display Drivers
In the present invention, the 3D game data output of existing game software are intercepted and re-directed to Pseudo Drivers for 3D display in place of the conventional API drivers for 2D display. The Pseudo Drivers execute the same or comparable image rendering functions but generate the specific right and left image viewpoints required by 3D display devices. The Pseudo Drivers only convert the 3D game output of the game software and do not affect or manipulate the game software itself. Thus, the Pseudo Drivers can produce a 3D display from conventional 3D game software without requiring access to or modification of the game source code.
3D display technology has developed to offer very high resolution and wide field of view. When used with a head mounted display unit (HMD) which allows direct head tracking, VR systems can offer a very immersive virtual reality experience for the player. Other 3D display devices that may be used include 3D monitors, such as the DTI3D (TM) monitor distributed by Dimension Technologies, Inc., of Rochester, NY, which delivers a stereo vision image without requiring the use of stereoscopic glasses. Most new 3D display technology can be hooked up to games running on standard Intel-based PCs with the Microsoft Windows (TM) operating system.
Typical graphics APIs have some 400 functions that the game program can call to render 3D polygonal scenes. These functions, generally speaking, have names like LoadTexture, SetTexture, RenderPolygons, DisplayImage, etc. All of the API's functions are held in a dynamic link library (.dll). The API's .dll is stored in the computer's C:\Windows\System directory. Depending on which API format it is written for, a game will automatically load the appropriate .dll stored in the System directory, and the functions contained within are used to render the game's 3D world map to the 2D screen. The API converts the data internally, and forwards the data to the video card-specific driver. The driver optionally modifies the data further into a format specific to the current video card hardware. The video card renders the data to the screen in the form of textured polygons. The final image appears on the user's monitor as a 2D projection of the 3D world map. The Pseudo Driver of the present invention intercepts the data being sent from the game to the API. The simplest method to do this borrows from a technique called ".dll wrapping". In this method, a "Pseudo API" is named and substituted for the original API for which the game issues the display function calls. That is, the Pseudo API assumes the identity of the usual API's .dll that the game is looking for. This substitution is done at the installation level for the VR system by storing the Pseudo API in the System directory in place of the original API. When the game executes function calls for the API, the Pseudo API is called and intercepts the 3D game data. The data stream between the game and the rendering API consists of thousands of vertices, polygons, and texture data per frame.
The Pseudo API then either executes calls, or issues subcalls to the original APIs which are set up to be running in the background, for the usual rendering functions, then passes the rendered data to the Pseudo Driver matched to the type of 3D display unit used in the VR system. The Pseudo Driver generates the card- independent R/L stereoscopic image signals which are passed as inputs to the 3D display unit.
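The intercept-and-delegate flow just described can be sketched as follows. Python is used here purely to illustrate the call pattern; the actual Pseudo API is a native Windows .dll, and all class and function names below are hypothetical:

```python
# Illustrative sketch of ".dll wrapping": the pseudo API exposes the same
# function names as the real API, captures the 3D game data passing through
# (so it can later be re-used, e.g. for dual stereo rendering), and then
# sub-calls the original API running behind it. All names are hypothetical.

class RealAPI:
    """Stands in for the original rendering API (e.g. opengl32.dll)."""
    def __init__(self):
        self.rendered = []

    def LoadTexture(self, name):
        self.rendered.append(("texture", name))

    def RenderPolygons(self, vertices):
        self.rendered.append(("polygons", len(vertices)))


class PseudoAPI:
    """Wrapper exporting the same function names as RealAPI.

    The game binds to this object believing it is the real API; every call
    is intercepted and then forwarded to the real API in the background.
    """
    def __init__(self, real_api):
        self._real = real_api
        self.intercepted = []          # 3D game data captured per frame

    def LoadTexture(self, name):
        self.intercepted.append(("LoadTexture", name))
        self._real.LoadTexture(name)   # sub-call to the original API

    def RenderPolygons(self, vertices):
        self.intercepted.append(("RenderPolygons", vertices))
        self._real.RenderPolygons(vertices)


# The game issues ordinary API calls, unaware of the substitution:
api = PseudoAPI(RealAPI())
api.LoadTexture("wall.bmp")
api.RenderPolygons([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
print(len(api.intercepted))  # 2 calls intercepted
```

The same data captured in `intercepted` is what the Pseudo Driver hands to the dual rendering path for the right and left viewpoints.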
Example: Pseudo OpenGL Driver
Many popular PC games, such as Quake3, are written for the OpenGL API. As illustrated in FIG. 2A (Prior Art), the game run in conventional PC-based mode initializes with an API call for the OpenGL dynamic link library, called "opengl32.dll", stored in the C:\Windows\System directory. The game software loads the opengl32.dll, then sends the stream of game data generated by play of the game to opengl32.dll for linkage to the appropriate API drivers for rendering the game's series of scenes to the 2D screen. The API drivers render the game data to image data and send the image data to the graphics card used by the API to drive the 2D display.
As shown in FIG. 2B, a Pseudo OpenGL Wrapper named "opengl32.dll" is substituted in the System directory in place of the OpenGL .dll formerly of that name. When Quake3 is run, it calls for "opengl32.dll" and binds with the Pseudo OpenGL Wrapper that was substituted. In this case, the OpenGL API is never actually initialized; in fact, it is not needed on the machine at all. The Pseudo OpenGL Driver linked to the pseudo OpenGL wrapper pretends to be the OpenGL driver; however, all the data sent to it is converted into a format that can be rendered for stereo vision by a dual rendering system for the dual R/L stereoscopic image outputs. DirectX 8 is used as the rendered data format since it can support the use of multiple outputs to multiple graphics cards. For about 370 functions, some translation and/or redirection is required. Generally speaking, only about 20% of the functions are actually used by games. Each of these functions has a small amount of code that is translated. Translation can be as simple as calling "LoadDirectX8Texture" when "LoadOpenGLTexture" is called, for example. The DirectX 8 calls are linked through the real DirectX 8 .dll ("d3d8.dll"). Other functions require larger amounts of code that converts vertex, index, or texture data. All the game data is handled in this way by the Pseudo Driver. The Pseudo Driver effectively ports Quake3 from OpenGL and 2D display to DirectX 8 for stereoscopic display without touching the Quake3 source code.
Example: Pseudo Glide Driver
The Glide API has been used in many popular games, but is no longer being supported. A Glide-only Pseudo Driver was created for use with Glide games. Glide is technically unusual in that it allows access to multiple graphics cards (but only if 3dfx cards are used). This made the creation of the Glide Pseudo Driver easier than for OpenGL, which does not allow access to multiple video cards. As before, FIG. 3A (Prior Art) shows a Glide game run in conventional mode for a 2D display, and FIG. 3B shows the Glide game run in pseudo wrapper mode for a 3D display. Source code for a Pseudo Glide2x.dll wrapper was written and stored in the C:\Windows\System directory. The Pseudo Glide wrapper exports the same rendering functions as the real Glide2x.dll. From the outside, the two .dlls are indistinguishable. As a result, when a Glide game such as Unreal Tournament is run, it loads the Pseudo Glide2x.dll from the C:\Windows\System directory. The game then sends game data to the Pseudo Glide wrapper, which manipulates the data, changing it into a format for stereoscopic display to two video cards for the right and left image viewpoints.
Example: Pseudo DirectX Driver
Many popular games today, such as Unreal Tournament, are written for DirectX 7 or earlier versions, or have the option to render using DirectX 7. The Pseudo Driver system is set up to use DirectX 8, because DirectX 8 can support multiple hardware devices. Therefore, games written for DirectX 7 use a pseudo wrapper which provides for conversion from DirectX 7 to DirectX 8. Games written for DirectX 8 can use a pseudo wrapper which links to the real DirectX 8 functions and the required further links for generation of the R/L stereo vision outputs.
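The per-function translation layer (OpenGL calls mapped onto DirectX 8 equivalents, or DirectX 7 calls onto DirectX 8) can be sketched as a lookup table. The function names below are taken from the simplified examples in this description and are illustrative, not a real API surface:

```python
# Sketch of the translation layer: an intercepted call in the source API
# is looked up and re-issued as the corresponding DirectX 8-style call.
# Simple renames are a table lookup; calls that need vertex, index, or
# texture data conversion would run extra conversion code at this point.
# All function names are illustrative.

TRANSLATION_TABLE = {
    "LoadOpenGLTexture":    "LoadDirectX8Texture",
    "RenderOpenGLPolygons": "RenderDirectX8Polygons",
}

def translate_call(source_name, *args):
    """Map an intercepted source-API call to its DirectX 8 equivalent."""
    dx8_name = TRANSLATION_TABLE.get(source_name)
    if dx8_name is None:
        raise KeyError("no DirectX 8 mapping for " + source_name)
    return (dx8_name, args)

call = translate_call("LoadOpenGLTexture", "wall.bmp")
print(call)  # ('LoadDirectX8Texture', ('wall.bmp',))
```

In the real wrapper the translated call would then be issued through the actual DirectX 8 .dll ("d3d8.dll") rather than returned as a tuple.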
DirectX uses a linking structure named the Component Object Model (COM), which is a different method of storing functions inside dynamic link libraries. Therefore, the Pseudo DirectX wrapper was written to handle the COM link structure. The code for the DirectX COM wrappers is more complex than the OpenGL or Glide wrappers. For example, in the opengl32.dll structure, all of the rendering functions are accessible to OpenGL programmers. However, the DirectX COM structure has an initial index which only points to 3 categories of functions, as follows:
ValidatePixelShader
ValidateVertexShader
Direct3DCreate8
The category index has a link structure which points to the actual rendering functions one layer deeper. When a DirectX 8 game initializes, the DirectX API named "d3d8.dll" is loaded. The game must first call the Direct3DCreate8 function, which returns a class pointer. This class pointer can then be used to access all DirectX 8 rendering functions. Thus, in addition to the standard .dll wrapper, the pseudo DirectX wrapper handling the COM method also requires a wrapper for the class. FIG. 4A (Prior Art) shows a DirectX game run in conventional mode for a 2D display, and FIG. 4B shows the DirectX game run in pseudo wrapper mode for a 3D display. Source code for a Pseudo DirectX wrapper was written and stored in the C:\Windows\System directory. There are actually two DirectX wrappers stored: one for the DirectX 7 .dll named "d3dim700.dll" for games written for DirectX 7, and one for the DirectX 8 .dll named "d3d8.dll" for games written for DirectX 8. The pseudo d3dim700.dll converts DirectX 7 function calls and data into DirectX 8 function calls, whereas the pseudo d3d8.dll links directly to DirectX 8 function calls. The Pseudo DirectX wrapper renders the game data into a format for stereoscopic display to two video cards for the right and left image viewpoints.
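The two-level wrapping required by COM, where the factory function and the class instance it returns must both be wrapped, can be sketched as follows. Python classes are used here only to illustrate the pattern; the real wrapper is a native COM .dll, and the method names below (other than Direct3DCreate8, which follows the text) are hypothetical:

```python
# Sketch of the COM class wrapper: intercepting the exported factory
# function (Direct3DCreate8) is not enough, because all rendering
# functions are reached through the class pointer it returns. The pseudo
# factory therefore returns a wrapped class whose methods are themselves
# intercepted before delegating to the real object.

class RealDevice:
    """Stands in for the class returned by the real Direct3DCreate8."""
    def DrawPrimitive(self, vertices):
        return "drew %d vertices" % len(vertices)

def real_Direct3DCreate8():
    return RealDevice()

class WrappedDevice:
    """Class wrapper: every method of the returned class is intercepted."""
    def __init__(self, real_device):
        self._real = real_device
        self.calls = []

    def DrawPrimitive(self, vertices):
        self.calls.append(("DrawPrimitive", len(vertices)))
        return self._real.DrawPrimitive(vertices)

def pseudo_Direct3DCreate8():
    """Wrapper for the factory: hands back a wrapped class, not the real one."""
    return WrappedDevice(real_Direct3DCreate8())

device = pseudo_Direct3DCreate8()   # the game believes this is DirectX 8
result = device.DrawPrimitive([(0, 0), (1, 0), (0, 1)])
print(result)  # drew 3 vertices, with the call also intercepted
```

This is why the DirectX wrappers are more complex than the flat opengl32.dll wrapper: there is one wrapper for the .dll's exported index and a second for the class behind the returned pointer.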
Integration of Pseudo 3D Display Drivers in VR Game System
Referring to FIG. 5, an example of the virtual reality game system is shown incorporating pseudo 3D display drivers for existing PC games to generate a stereo vision display. The system can accommodate most of the popular games that are written for OpenGL, DirectX 7, and/or DirectX 8. Pseudo OpenGL, DirectX 7, and DirectX 8 wrappers take the 3D game data output of any of the games and re-direct it to Dual Rendering links to real DirectX 8 rendering functions. The resulting R/L stereo image outputs are fed to dual graphics cards, which are nVidia GeForce2 cards using the card-specific driver nvdisp.drv in this example. The separate R and L image display outputs are fed to the respective R and L eyepieces of a stereo vision head mounted display. A parallel system can be configured for Glide games using a Pseudo Glide wrapper and Glide-specific graphics cards.
FIG. 6 shows an alternate configuration in which the R/L stereo image outputs are fed to a single dual-head graphics card, which is an ATI Radeon 8500 Dual Monitor card in this example. The single "dual head" card has 2 VGA monitor-capable outputs. Some of the card's components, like the PCI interface for example, are shared between the two devices. As a result, the single-card solution is slower than the same system with dual cards. Thus, the dual-head system offers a tradeoff of somewhat lower performance against a lower cost than the two-card system.
Extremely high-end stereo devices take two inputs, one for each eye. Typically, the two inputs are provided from two separate video cameras to achieve stereoscopic vision in the final 3D display. In the invention, the Pseudo Driver instead provides a high-end synthetic connection to the 3D display through the re-direction of 3D game data to dual rendering functions and dual graphics cards to provide the two stereoscopic images. Each card (or card head) renders a slightly different image, one from the viewpoint of the left eye, and the other from the viewpoint of the right eye. Because both frames are rendered simultaneously, the typical 2x stereo vision slowdown is avoided. This allows regular PC games like Quake3 to be viewable in stereo vision using the latest 3D display technology. The pseudo driver methodology enables an integrated VR game system to be played for most of the popular PC games known to players, and allows integration of related functions that can take advantage of its game data interception and dual rendering functions. Some of these system integration features and advantages are described below.
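The derivation of the two slightly different viewpoints can be sketched as follows: from the single camera position supplied by the game, the left and right eye positions are offset by half the interocular separation along the camera's right vector, and each is rendered by its own card (or card head). The 6.5 cm separation is a typical illustrative value, not taken from this description:

```python
# Sketch of deriving the dual stereo viewpoints from the game's single
# camera: each eye is offset half the interocular distance along the
# camera's right vector. The separation value is illustrative.

def eye_positions(camera_pos, right_vec, separation=0.065):
    """Return (left_eye, right_eye) positions for stereo rendering."""
    half = separation / 2.0
    left = tuple(c - half * r for c, r in zip(camera_pos, right_vec))
    right = tuple(c + half * r for c, r in zip(camera_pos, right_vec))
    return left, right

# Camera at head height on the Y axis, "right" pointing along +X:
left_eye, right_eye = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
print(left_eye)    # (-0.0325, 1.7, 0.0)
print(right_eye)   # (0.0325, 1.7, 0.0)
```

Because the two renders run in parallel on separate hardware, neither eye waits for the other, which is the source of the avoided 2x slowdown.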
Pseudo Driver Architecture: The pseudo driver software architecture allows interfacing of VR input and output devices to a 3D game application without its source code. Pseudo drivers are drivers or applications that lie between the game application and the actual legitimate video, sound, input or output driver. The VR system wraps existing applications with a series of pseudo drivers.
Generic (Game-Independent) Stereo Vision Display: The pseudo driver method generically allows creating a quality depth perception display from any 3D application without needing to access its source code and regardless of the API (Glide, DirectX, or OpenGL). The outputs are two separate high quality VGA signals with no compromise in frame rate or resolution. It is not an interlaced output.
Generic (Game-Independent) Recording Engine: Since the 3D Recorder records 3D game data output from any Glide, DirectX, or OpenGL applications, it can replay the visuals of a player's game in high quality 3D vision without needing to run the original application. One of the further advantages of this is that the recorded data can be replayed on any DirectX capable player, making it possible to use an online interface allowing members to download their mission replay to their home hardware platform.
Generic (Game-Independent) Video Overlay Graphics: By leveraging the architecture of the pseudo driver, it becomes possible to fully control and even enhance the 3D game output though a mixer.
The game visuals can be overlaid with other 3D content or animation, text and graphics in real time. Examples include high-scores, promotional information, and logo animation before, during or after a mission.
Native Head Tracking Support: The pseudo driver methodology allows PC games to be played as VR games using head-mounted devices. The HMDs allow for head tracking in real-time inside a game environment with 3 degrees of freedom (looking up/down, left/right and tilting) without access to the game source code. Native tracking, versus mouse emulation, allows for zero lag and high-resolution positioning, ultimately increasing the quality of immersion and reducing motion sickness. A critical benefit of native tracking is that the user does not need head recalibration, since only in this mode does the device know which direction is horizontal in real space.
Duo Tracking Support: Use of HMDs frees up the player's hands to control a weapon or other type of action device. Currently, 3D consumer games only support a single 2-degrees-of-freedom input device (mouse, trackball, or joystick). The VR system can support two 3-degrees-of-freedom tracking devices via a combination of external drivers and game script modification. In duo tracking (head tracking and weapon tracking), a player will be able to look one way and shoot the other, for example.
Peripheral Input Engine: This tool enables the system to interface a variety of input devices like guns, buttons on the POD, pod rail closure sensors and the like to the 3D game or the mission control software.
Pseudo Sound & Force Feedback Drivers: A pseudo sound and/or force feedback driver can be added in tandem with the pseudo 3D display driver. This would allow real-time filtering of sounds and generating accurate force feedback for custom-designed hardware like a force feedback vest. The vest can have a number of recoil devices that would be activated based on analysis of the nature and impact locations of ammunition fired by the virtual opponents. For example, a rocket contact from the back would trigger all recoil devices at once, while a nail gun hitting from the back to the front as one is turning would be felt accurately by the user. Further applications of the pseudo driver method could include an intercom localized in the 3D environment and replacement or addition of game sound with other sounds.
In another aspect of the invention, principles for centralized control of multiple game playing stations are explained using the preferred example of a commercial-level, multi-station game system offering many different game programs for play. Details of the system are based on a commercially available product for game arcade systems offered by the assignee of the present invention, Atlantis Cyberspace, Inc., Honolulu, Hawaii, which is referred to herein by its tradename designation Atlantis OS. It is to be understood that other variations and modifications may be made given the principles of the invention described herein.
Referring to FIG.7, an administration or "mission control" site employs a mission control computer 10 connected by a network to multiple game playing satellite computers 20 installed at respective game playing stations or "pods". The mission control computer operates a variety of administrative programs for performing administrative functions for the game playing stations on the network. Each satellite computer operates a plurality of game programs (video games, virtual reality games, simulation games, etc.) which a player may select from. The mission control computer has a mission control program for controlling the games played on the satellite computers. The satellite computers have a satellite game control program which responds to generic control commands issued by the mission control computer to start a selected game by loading the selected game along with its game-specific command set and sending status reports to the mission control computer. The mission control computer can thus maintain centralized control of the games played on the game playing stations without having to control each of the many different games offered for play. The Mission Control Program used by the mission control computer is the heart of the mission control system. It determines what satellite computer should be playing which games, how long the games will last, how much the player will be charged, etc. The Mission Control Program connects to each Satellite Control Program over the network to control what that satellite computer will do. The commands it issues to the Satellites can include "Start new network game", "Debrief game", "Quit current game and start new game", etc. The Mission Control Program can also connect via a wide area network or the Internet to another mission control site or to a central network server which provides an online interface to players anywhere (described in further detail below).
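The division of labor just described, where Mission Control issues only generic commands and the Satellite resolves them against the command set of whichever game is selected, can be sketched as follows. The generic command names follow the text; the command-set contents and report format are illustrative, not the actual Atlantis OS protocol:

```python
# Sketch of the generic command flow: Mission Control sends the same small
# set of generic commands to every Satellite; the Satellite loads the
# game-specific command set for the selected game and translates each
# generic command into that game's native control. All game-native
# strings below are illustrative.

GAME_COMMAND_SETS = {
    "Quake":             {"Start new network game": "quake +connect server",
                          "Quit current game": "quake quit"},
    "Unreal Tournament": {"Start new network game": "ut open server",
                          "Quit current game": "ut exit"},
}

class Satellite:
    def __init__(self):
        self.loaded_game = None
        self.command_set = {}
        self.log = []                  # native commands actually issued

    def handle(self, generic_command, game=None):
        """React to a generic Mission Control command."""
        if generic_command == "Start new network game":
            self.loaded_game = game
            self.command_set = GAME_COMMAND_SETS[game]
        native = self.command_set[generic_command]
        self.log.append(native)
        return "STATUS: %s ok (%s)" % (generic_command, self.loaded_game)

sat = Satellite()
report = sat.handle("Start new network game", game="Quake")
print(report)  # STATUS: Start new network game ok (Quake)
```

Mission Control thus never needs game-specific knowledge; the same `handle` interface works for every game installed on the Satellite.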
The mission control computer maintains in its associated database 12 game statistics, game playing data, and other information for a wide range of administrative functions. For example, the Mission Control Program can provide the operator of the system with reports, targeted marketing and promotional materials, membership details, current status of the system, hardware and software troubleshooting, etc. It can also generate and print detailed player statistics after each mission has finished as well as provide information access through the on-line interface. It can also maintain a history of all player information on the system, including: available funds on account or passcard; mailing and email addresses; current ranking among other players; etc. This information provides the system operator the ability to tailor a marketing campaign to the specific individuals based on the information stored in the database.
Referring to FIG. 8, the mission control system can be extended to multiple mission control sites connected via a wide area network or the Internet. The game data from each mission control site are replicated to the central server's master database that contains all the information for all of the sites on the system. The server provides an online interface that allows players anywhere to access the game data from all (participating) sites remotely. For example, the online interface can allow a player or players to view their current stats on the system from the Internet, including player history, player information, etc. This creates content on the Internet for a system website that supports interaction and communication among players anywhere. The players could also have the ability to download data from the games that they have played for their personal interests. The online interface can also allow players to maintain their handles (user IDs) or accounts in the system, change address information, create or join groups (teams or clans), and chat with other players. Individual site operators can customize their presence on the online interface depending on what player information they choose to share with the entire online community.
The Mission Control program has the ability to control multiple Satellites running different games at the same time. In conventional systems, only a particular game can be played on a stand-alone station, or the same game must be played on all designated stations offering a particular network game. In the invention by contrast, each Satellite can offer many different games, and its Control Program will load the game-specific command set from its database required for a selected one of the many games. In this manner, Mission Control can issue a generic control command to the Satellite, e.g., to "Start (particular) game", and the Satellite then loads the game-specific command set that will enable control of that game.
An example of some of the functions performed in starting and ending a game and the commands issued by Mission Control and responses of the Satellite in the Atlantis OS system offered by Atlantis Cyberspace, Inc., is shown on Table I.
To report the current status of a game, the Satellite game control program can read the log files, the current dialog boxes or windows opened by the application (game program) running on the system, messages from the Notification API, or some other method used by the game program for external communications. For games that maintain log files, the log files can be parsed for information, e.g., whether the game is still running, when a player dies, when a player kills someone, when the game is over, when the game started, etc. By gathering this information, a status report on the game can be provided by the Satellite to Mission Control.
An example of a typical log file for the popular game "Quake" is shown in Table II. The log file is parsed for keywords identified by the system as providing status information, such as "version", "I am (player)", "playing demo", "exited (level)", "game over", etc. In Table III, examples of conversion of messages parsed from the log files into reports issued by the Satellite to Mission Control are illustrated.
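The keyword-based conversion of log lines into Satellite status reports can be sketched as follows. The keywords are those listed above; the report strings are illustrative stand-ins for the actual entries of Table III:

```python
# Sketch of log-file parsing: each line of the game's log is scanned for
# known keywords, and matches are converted into status reports sent from
# the Satellite to Mission Control. Report names are illustrative.

KEYWORD_REPORTS = [
    ("game over",    "GAME_OVER"),
    ("playing demo", "DEMO_RUNNING"),
    ("exited",       "LEVEL_EXITED"),
    ("version",      "GAME_RUNNING"),
]

def parse_log(lines):
    """Convert raw game log lines into Satellite status reports."""
    reports = []
    for line in lines:
        lowered = line.lower()
        for keyword, report in KEYWORD_REPORTS:
            if keyword in lowered:
                reports.append(report)
                break               # first matching keyword wins
    return reports

log = [
    "Version 1.09",
    "Sgt_Rock exited the slipgate complex",
    "game over",
]
print(parse_log(log))  # ['GAME_RUNNING', 'LEVEL_EXITED', 'GAME_OVER']
```

A real implementation would tail the log file continuously rather than parse a finished list, but the keyword-to-report mapping is the essential mechanism.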
The Mission Control program applies generic logic to communicate with the Satellites; however, there is some minor logic based on the different types of games to control them better. For example, "Quake" and "Unreal Tournament" behave slightly differently on the game server. Therefore, there is some special logic for games like "Quake" that provide full logging functionality on their servers, as compared to "Unreal Tournament", which provides no logging on the server; its status information is instead provided via a "virtual cam" recorder.
In the preferred implementation, Mission Control is only notified by the Satellite when there are changes to the system state. For example, if the game were to finish, the Satellite would send Mission Control the command GAME_OVER. To avoid assuming that a machine is still working, Mission Control also expects a GAME_READY command from the Satellite every 10 seconds or so, with a parameter indicating whether it is ready or not. This feature is mainly used for situations beyond the control of the Satellite or Mission Control, where the machine locks up or the operator exits the program unexpectedly.
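The heartbeat scheme can be sketched as a simple watchdog on Mission Control's side: each Satellite's last GAME_READY time is recorded, and any Satellite whose heartbeat is too old is flagged as possibly locked up. The timeout value and data structure are illustrative:

```python
# Sketch of the GAME_READY heartbeat watchdog: Satellites report roughly
# every 10 seconds; Mission Control flags any Satellite whose last
# heartbeat is older than a timeout (value illustrative).

HEARTBEAT_TIMEOUT = 30.0   # seconds without GAME_READY before flagging

class MissionControl:
    def __init__(self):
        self.last_seen = {}

    def on_game_ready(self, satellite_id, timestamp):
        """Record a GAME_READY heartbeat from a Satellite."""
        self.last_seen[satellite_id] = timestamp

    def unresponsive(self, now):
        """Satellites presumed locked up (no recent heartbeat)."""
        return [sid for sid, t in self.last_seen.items()
                if now - t > HEARTBEAT_TIMEOUT]

mc = MissionControl()
mc.on_game_ready("pod-1", timestamp=100.0)
mc.on_game_ready("pod-2", timestamp=125.0)
print(mc.unresponsive(now=135.0))  # ['pod-1']
```

An operator console could then surface flagged pods for manual intervention, since by definition the Satellite itself cannot report its own lock-up.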
The Notification API is a set of tools that allows Mission Control to determine the current status of a game program without doing any detailed research on the program itself. This is achieved by giving the game developer or program developer a set of API calls that they invoke in their application to notify Mission Control of the current status of the program. For example, these notifications can include "Game started", "Game over", "Program started", "Player joined", "Player killed", etc. The Notification API is stored as a set of DLLs available for each operating system supported by Mission Control.
Similarly, the Control API allows Mission Control, and more specifically the Satellite, to control a current game or program without knowing any detailed information about the program itself. This is achieved by providing the game developer or program developer a set of API calls that they use to issue commands that "hook into" a particular game, e.g., "Start a new game", "End current game", "Join network game", etc.
To port an existing game into Mission Control without having used the Notification API or the Control API, the game or program must be analyzed for details of the logic sequences needed to control the program, e.g., how to start and stop the game, how to tell if the program is still running, how to start the program, how to start a game server, how to join a game server, etc.
Popular games offered by different game publishers often employ different command structures and use different command protocols to start, stop, and control games. In the invention system, the command architecture for each game offered by the system is analyzed and the appropriate game control signals matched to the activation, termination, and control logic for each game are stored in the Satellite database. For example, the game control signals may be stored in a relational database indexed to each particular game. When a command is issued by Mission Control to start a particular game, the relational database retrieves the set of all command signals used by that specific game and loads it with the Satellite operating system so that activation by the player of the hardware buttons and other controls results in the correct signals being delivered to the game.
As a specific example, to control a PC-based game like "Quake", distributed by Activision, Inc., CA, the computer system must send keyboard commands to the game window using the SendMessage and PostMessage commands of the Microsoft Windows (TM) operating system API functions. However, a network game like "Unreal Tournament" uses http commands for the game server, but requires Keybd_Event commands for the client station to control the game. Thus, the correct signals required for each of the game programs must be determined based upon the game's command architecture, e.g., keystrokes, http commands, TCP/IP commands, writing files, its control API, or serial communications if there is a modem or a COM port on the computer. The Satellite Control Program can then interpret inputs from the player on the hardware console, or network game commands sent from other stations via the network through an http interface, into the correct signals required by the game to control its actions. An example of typical commands for games maintained in a Command database is illustrated in Table IV. To let the "Unreal Tournament" game know that the player wants to shoot a weapon when the player presses a trigger on the hardware console, the computer system must have the game-specific command set loaded so that a lookup of "Fire" in the command set results in the corresponding Keybd_Event of the <CTRL> signal being sent to the game. Similarly, the input "Move Forward" for this and other games will result in the <UP ARROW> keyboard signal being sent, etc.
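The per-game lookup just described can be sketched as a small command database keyed first by game, then by generic player input. The "Fire" and "Move Forward" entries follow the examples in the text; the pairing of each game with a signal mechanism is illustrative of Table IV, not a reproduction of it:

```python
# Sketch of the Command database lookup: the Satellite loads the selected
# game's command set and translates a hardware-console input like "Fire"
# into the (signal mechanism, key) pair that game expects. Entries are
# illustrative.

COMMAND_DB = {
    "Unreal Tournament": {
        "Fire":         ("Keybd_Event", "<CTRL>"),
        "Move Forward": ("Keybd_Event", "<UP ARROW>"),
    },
    "Quake": {
        "Fire":         ("SendMessage", "<CTRL>"),
        "Move Forward": ("SendMessage", "<UP ARROW>"),
    },
}

def translate_input(game, player_input):
    """Look up the game-specific signal for a hardware-console input."""
    return COMMAND_DB[game][player_input]

print(translate_input("Unreal Tournament", "Fire"))
# ('Keybd_Event', '<CTRL>')
```

In the actual system the retrieved pair would be dispatched through the corresponding Windows mechanism (Keybd_Event, SendMessage, http, etc.) rather than returned to the caller.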
In FIG. 9, a typical "Start game" sequence by which the Satellite Control Program interacts with the Mission Control Program is illustrated. At block 30, Mission Control sends a generic "Start Network Game" (Unreal Tournament) command to the Satellite Control Program. The Satellite Program reads from its database the corresponding Key Table at block 31, and the Message Table for parsing the game's log files at block 32. At decision 33, it tests whether Unreal Tournament is already running, or at decision 34 whether another game is running. If another game is running, it is shut down at block 35; then the parser for Unreal Tournament is initialized at block 36, and the Unreal Tournament game is launched at block 37. Once the game is ready (decision 38), the "Ready" status is reported to Mission Control at block 39. When a command is received from Mission Control to "Join network mission" at block 40, the Satellite Control Program checks whether its system is connected to the Network Mission (communicating with the other Satellites having players participating in the same mission) at decision 41, then sends a "Connected" message to Mission Control at block 42. It then checks whether Mission Control has sent the "Everyone Ready" message at decision 43, then starts the game at block 44. The game is then played by the player on the Satellite, in conjunction with the other participating players on other Satellites, at block 45. All signals exchanged between networked players are handled at the game application level. When the "Game Over" message is parsed from the game's log files at block 46, the Satellite sends a "Game Over" message to Mission Control at block 47.
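The walkthrough above can be sketched as a linear sequence of steps on the Satellite's side. The states and messages follow the FIG. 9 description; the implementation is illustrative:

```python
# Sketch of the FIG. 9 "Start game" sequence: shut down any other game,
# initialize the log parser, launch the requested game, report Ready,
# join the network mission, wait for "Everyone Ready", and start.

def start_game_sequence(requested_game, running_game):
    """Return the ordered steps the Satellite performs, per FIG. 9."""
    steps = []
    if running_game == requested_game:
        steps.append("already running")
    else:
        if running_game is not None:
            steps.append("shut down %s" % running_game)
        steps.append("initialize parser for %s" % requested_game)
        steps.append("launch %s" % requested_game)
    steps.append("report Ready to Mission Control")
    steps.append("join network mission")
    steps.append("await Everyone Ready")
    steps.append("start game")
    return steps

steps = start_game_sequence("Unreal Tournament", running_game="Quake")
print(steps[0])  # shut down Quake
```

A production version would be event-driven, advancing on messages from Mission Control and the log parser rather than returning the whole plan up front.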
Through the use of generic control commands to the multiple Satellites running many different games at different times, the Mission Control system provides a universal intermediary between the game programs and the hardware interface. Mission Control maintains data on the games played on the system by tracking the status reports of the Satellites. It can also handle other desired management functions, such as cash dispenser, briefing/training, monitoring game hardware, mission control, back-end information management, member management, Internet connectivity for remote management, automated updates, and player interactions. The Mission Control system allows the site operator multi-faceted control and interactivity with participants, thus enhancing player experiences and the value delivered.
The Mission Control system leverages a database-driven client-server architecture. Each client fully controls the activity of all applications on the client computer and reports to Mission Control, the central server that coordinates and monitors every single event in the system. The server provides a comprehensive graphical user interface for monitoring full system activity and for modifying or creating individualized missions based on a variety of player-desired parameters. By developing the entire controlling software around a high-end database engine (e.g., Sybase), the Mission Control system gains the benefits of storing all information and events in a scalable database engine, replication for backup, remote management, site link-up, and on-line interface capabilities.
For purposes of playback or debriefing the players, the Mission Control system can also record the experiences of network game players by running a satellite client of the game as an "observer". The "observer" can enter the game space along with the other players and see (record) the game action. The recording can be done from different camera angles or points of view. In FIG. 7, this "observer" is referenced as Virtual Cam 22. The Virtual Cam is programmed to execute a sequence of views or to record specific actions in the game when certain conditions are detected.
The Mission Control system can be combined with other technologies in a totally integrated game entertainment system. The integrated system can include other 3D and VR enhancement tools, like the use of player and spectator videocams to record different viewpoints using 3D space analysis, artificial intelligence methods, and a library of pre-defined camera angles and motions to generate high-quality Hollywood-style coverage of action within a virtual world.
The entire VR system can be controlled and monitored by a user-friendly GUI. All user and system activity can be simply monitored. All software components can be protected via a variety of security methods, including preventing decompilation of code, encryption of network packets, etc.
The online interface on the Internet provided by linking mission control sites to a network server and master database allows players to engage in enhanced services and the ability to communicate with each other and play network games. Two or more Mission Controls can combine game controls and player input and output into a format that can be streamed over the network or Internet for playback on any other system. Through the online interface, players can remotely view and download their stats, their buddies' stats, clan stats, find out who the competition is, vote for best player, change handles, email or street addresses, choose player skins or faces for games that they wish to play, etc.
The accumulation of player information at each Mission Control site allows an operator or group of operators to organize and automate management and marketing to players for any entertainment site or chain of sites. For example, repeat business can be automatically targeted with email messages to existing members or to new members introduced by existing members; competitions and special events can be organized for players via email; and past members can be automatically contacted for return visits. Using briefing and de-briefing tools, the system can store player information on selection of games, teams, choice of bots, weapons, etc. A statistics engine can analyze and print players' stats as well as store them for the online interface. This provides each player with a detailed account of their game.
Video conferencing capability can be provided between mission control sites to allow players to join a game at another site and to have a briefing and debriefing with other players to exchange tips and strategies, etc. When the game is over, they will be able to talk and see each other during the playback of the debriefing.
The Mission Control system can also provide players' "team leaders" with access to a mini version of the Mission Control to allow them to view the progress of games from different camera angles and help with strategy development and execution. This will also increase spectator interest as they can gain insight into the leaders' methods and approaches, similar to being able to hear a coach discuss strategy while watching the game on television. A similar mini version of the Mission Control can be provided for "directors" to mimic the control center for a televised sports event where the director has access to many cameras and directs which angles the cameras focus on and decides which ones to utilize.
Thus, the Mission Control system can greatly facilitate social interaction among players and teams and excitement over the games. All the above-mentioned features can combine to provide the best possible entertainment experience from the standpoint of technology, game play, social interaction, competitive aspects, spectator sports, and length of the experience versus cost, and ultimately lead to high repeat business and a viable financial model.
It is understood that many modifications and variations may be devised given the above description of the principles of the invention. It is intended that all such modifications and variations be considered as within the spirit and scope of this invention, as defined in the following claims.
TABLE I
[Table I is reproduced as an image in the original publication.]
TABLE II
Console initialized.
Winsock TCP/IP Initialized
WIPX_Init: Unable to open control socket
Exe: 09:30:49 Mar 21 1997
16.0 megabyte heap
Sound Initialization
637k surface cache 320x240
CD Audio Initialized
joystick not found -- no valid joysticks (a5)
execing quake.rc
execing default.cfg
Unknown command "volume"
execing config.cfg
execing autoexec.cfg
8 demo(s) in loop
Playing demo from dontdel.dem. ERROR: couldn't open.
execing server.cfg
"skill" is "1"
3 demo(s) in loop
VERSION 1.09 SERVER (21066 CRC)
Azure Agony
execing server.cfg
"skill" is "1.000000"
VERSION 1.09 SERVER (21066 CRC)
Azure Agony
Guest313 entered the game
Guest312 entered the game
Debug entered the game
Camera running
I Am Player DragonMan
Guest313 killed Guest312
Guest312 killed Guest312
NET_GetMessage: disconnected socket
VERSION 1.09 SERVER (21066 CRC)
Azure Agony
Camera running
Guest313 entered the game Guest312 entered the game
Debug entered the game
Guest312 left the game with 0 frags
Guest313 left the game with 0 frags
Camera deactivated
Debug left the game with 0 frags
execing server.cfg
"skill" is "1.000000"
VERSION 1.09 SERVER (21066 CRC)
Azure Agony
Guest313 entered the game
Guest312 entered the game
Debug entered the game
Camera running
NET_GetMessage: disconnected socket
VERSION 1.09 SERVER (21066 CRC)
Azure Agony
Camera running
Guest313 entered the game
Guest312 entered the game
Debug entered the game
Guest313 left the game with 0 frags
Camera deactivated
Guest312 left the game with 0 frags
Debug left the game with 0 frags
TABLE III
[Table III is reproduced as an image in the original publication.]
TABLE IV
[Table IV is reproduced as an image in the original publication.]

Claims

CLAIMS:
1. A method for operating three-dimensional (3D) application software intended to provide a display output to a two-dimensional (2D) screen display, characterized by the steps of:
(a) running the application software in its normal mode to generate 3D application data output which is normally to be sent to an application programming interface (API) driver for the 2D screen display;
(b) intercepting the 3D application data output from the application software and redirecting the data to a pseudo driver for generating a 3D stereoscopic display; and
(c) using the pseudo 3D display driver to generate a 3D stereoscopic display.
2. A method according to Claim 1, wherein the 3D stereoscopic display is selected from the group consisting of head-mounted "stereo vision" goggles, a head-mounted 3D display device, and a stereo vision monitor.
3. A method according to Claim 1, wherein the 3D application software is 3D video game software which provides 3D game data output.
4. A method according to Claim 3, wherein the intercepting and redirecting of the 3D game data is obtained by providing a wrapper for the game software's native API having stereoscopic display function calls linked under the same name as the game software's native API for 2D display.
5. A method according to Claim 1, wherein the pseudo driver generates a 3D stereoscopic display using separate graphics cards or one graphics card with dual heads for rendering right and left image viewpoints for the 3D stereoscopic display.
6. A method according to Claim 3, wherein the intercepted 3D game data is stored in a 3D data recorder for later playback.
7. A method according to Claim 3, wherein the intercepted 3D game data is combined with other 3D content using a mixer and a dual rendering system.
8. A method according to Claim 3, wherein the pseudo 3D display driver is operated with other pseudo drivers such as a stereo sound or a directional force feedback driver.
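The wrapper idea in Claims 1 and 4 can be illustrated with a brief sketch: a "pseudo driver" exposes the same entry points as the native display API, so the unmodified game calls it, and each submitted scene is rendered twice from horizontally offset viewpoints. This is an assumption-laden illustration, not the patented implementation; all class names and the eye-separation value are hypothetical:

```python
# Illustrative sketch of the Claim 1/4 wrapper: a pseudo driver linked
# under the same name as the game's native API, driving one renderer
# per eye (as in Claim 5) to produce a stereoscopic pair.

EYE_SEPARATION = 0.065  # metres; a typical interpupillary distance

class Native2DDriver:
    """Stands in for the game's normal API driver for a 2D display."""
    def __init__(self, label):
        self.label = label
        self.frames = []

    def draw_scene(self, camera_x, scene):
        # Record what would be rendered, and from where.
        self.frames.append((self.label, camera_x, scene))

class Pseudo3DDriver:
    """Exposes the same call name as the native driver, so the game
    calls it unmodified; internally it renders left and right views."""
    def __init__(self):
        self.left = Native2DDriver("left")
        self.right = Native2DDriver("right")

    # Same signature as the native driver's entry point.
    def draw_scene(self, camera_x, scene):
        half = EYE_SEPARATION / 2
        self.left.draw_scene(camera_x - half, scene)   # left-eye view
        self.right.draw_scene(camera_x + half, scene)  # right-eye view

driver = Pseudo3DDriver()
driver.draw_scene(0.0, "frame-1")  # the game thinks this is the 2D API
```

In practice the interception happens at the binary level (e.g. a library exporting the native API's symbols), but the control flow is the same: one incoming draw call, two offset renderings.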
9. A mission control (administration) system for controlling multiple game playing satellite computers on a network, characterized by:
(a) a mission control computer (10) which operates administrative programs for performing administrative functions for multiple game playing stations connected by the network;
(b) a plurality of game playing satellite computers (20) provided at respective game playing stations, each maintaining a plurality of game programs;
(c) a network connecting the mission control computer to the plurality of game playing satellite computers,
(d) wherein said mission control computer includes a mission control program for controlling the plurality of games available to be played on the game playing satellite computers by issuing generic control commands to the game playing satellite computers, and
(e) wherein each of said game playing satellite computers includes a satellite game control program for controlling each of the plurality of game programs available to be played on the satellite computer by receiving a generic control command to start a selected game program issued by said mission control computer and loading in response thereto a game-specific command set corresponding to the selected game program, and by providing said mission control computer with a status report of the status of the selected game program being played on the satellite computer.
10. A system according to Claim 9, wherein a game program on a satellite computer generates one or more of the following sources of information tracking the operation of the game program, and said satellite game control program parses the source of information for desired status information and provides the status information to the mission control program: game log files; dialog boxes or windows opened by the game program; messages from the Notification API; and a method used by the game program for external communications.
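Claim 10's log-file parsing can be pictured with a short sketch modelled on the console transcript in Table II. The patterns below are assumptions inferred from those Quake-style lines, not part of the claimed system:

```python
import re

# Hypothetical satellite-side log parser: scans the game's own console
# output for player join/leave events and extracts status information
# to forward to the mission control program.

PATTERNS = {
    "joined": re.compile(r"^(\S+) entered the game"),
    "left":   re.compile(r"^(\S+) left the game with (\d+) frags"),
}

def parse_log(lines):
    status = {"players": set(), "frags": {}}
    for line in lines:
        if m := PATTERNS["joined"].match(line):
            status["players"].add(m.group(1))
        elif m := PATTERNS["left"].match(line):
            status["players"].discard(m.group(1))
            status["frags"][m.group(1)] = int(m.group(2))
    return status

log = [
    "Guest313 entered the game",
    "Guest312 entered the game",
    "Guest312 left the game with 0 frags",
]
state = parse_log(log)
```

The same approach extends to the claim's other information sources (dialog boxes, Notification API messages): each is reduced to the status fields mission control cares about.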
11. A system according to Claim 9, wherein the satellite game control program maintains a database of game-specific command sets for each of the game programs offered on the satellite computer, and, when a control command is issued by the mission control computer to start a particular game, the satellite control program loads the corresponding game-specific command set from its database.
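The generic-to-specific command translation of Claims 9 and 11 can be sketched as a lookup: mission control sends a generic command, and the satellite's control program loads the command set for the selected game from its database. The game names and command strings below are illustrative assumptions only:

```python
# Hypothetical satellite game control program: translates generic
# mission control commands into game-specific commands via a per-game
# command-set database, and reports status back.

COMMAND_SETS = {
    "quake":   {"START": "quake.exe +map start", "STOP": "quit"},
    "descent": {"START": "descent.exe -launch",  "STOP": "exit"},
}

class SatelliteController:
    def __init__(self, command_sets):
        self.command_sets = command_sets
        self.loaded = None          # game-specific set currently loaded
        self.status = "idle"

    def handle(self, generic_cmd, game):
        # Load the command set corresponding to the selected game.
        self.loaded = self.command_sets[game]
        native_cmd = self.loaded[generic_cmd]
        self.status = f"{game}: running" if generic_cmd == "START" else "idle"
        return native_cmd            # what the satellite would execute

    def report(self):
        # Status report sent back to the mission control program.
        return {"status": self.status}

sat = SatelliteController(COMMAND_SETS)
cmd = sat.handle("START", "quake")
```

Because mission control only ever speaks the generic command vocabulary, new games can be added by extending the database on the satellites without changing the mission control program.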
12. A system according to Claim 9, wherein said mission control program maintains a database of game data based upon information provided by the game playing satellite computers, and generates one or more administrative reports from the group consisting of: system-wide gaming reports; membership and player statistics; detailed statistics on specific games played by specific players; current status of the system, hardware, and software troubleshooting.
13. A system according to Claim 9, wherein a plurality of mission control computers are maintained at respective mission control sites and are connected via a network to a network server that provides an online interface of the mission control system to the Internet.
14. A system according to Claim 13, wherein said online interface allows players to perform one or more activities of the group consisting of: looking up statistics for games they have played; seeing how their buddies are doing; seeing statistics for comparison at other sites; downloading statistics for their own later use; maintaining their accounts; joining or maintaining their status with a group of players; and communicating with other players.
15. A method for controlling multiple game playing satellite computers on a network, characterized by the steps of:
(a) providing a mission control computer for performing administrative functions for multiple game playing stations on the network;
(b) providing satellite game playing computers at respective game playing stations, each of which maintains a plurality of game programs;
(c) providing a mission control program on the mission control computer for issuing generic control commands to the game playing satellite computers, and
(d) providing a satellite game control program on each of the game playing satellite computers for receiving a generic control command to start a selected game program issued by the mission control computer and loading in response thereto a game-specific command set corresponding to the selected game program, and for providing the mission control computer with a status report of the status of the selected game program being played on the satellite computer.
PCT/US2001/046939 2000-11-02 2001-11-02 Virtual reality game system with pseudo 3d display driver and mission control WO2002036225A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002227273A AU2002227273A1 (en) 2000-11-02 2001-11-02 Virtual reality game system with pseudo 3d display driver and mission control

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US24479500P 2000-11-02 2000-11-02
US24479600P 2000-11-02 2000-11-02
US60/244,795 2000-11-02
US60/244,796 2000-11-02

Publications (1)

Publication Number Publication Date
WO2002036225A1 true WO2002036225A1 (en) 2002-05-10

Family

ID=26936790

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/046939 WO2002036225A1 (en) 2000-11-02 2001-11-02 Virtual reality game system with pseudo 3d display driver and mission control

Country Status (2)

Country Link
AU (1) AU2002227273A1 (en)
WO (1) WO2002036225A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796373A (en) * 1996-10-10 1998-08-18 Artificial Parallax Electronics Corp. Computerized stereoscopic image system and method of using two-dimensional image for providing a view having visual depth
US6099408A (en) * 1996-12-31 2000-08-08 Walker Digital, Llc Method and apparatus for securing electronic games
US6295068B1 (en) * 1999-04-06 2001-09-25 Neomagic Corp. Advanced graphics port (AGP) display driver with restricted execute mode for transparently transferring textures to a local texture cache

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004076016A1 (en) * 2003-02-27 2004-09-10 Differend Games, S.A. Method and electronic device for controlling an interactive game
ES2229881A1 (en) * 2003-02-27 2005-04-16 Differend Games, S.A. Method and electronic device for controlling an interactive game
FR2852526A1 (en) * 2003-03-19 2004-09-24 Jean Francois Riviere Simulating machine for use in sport hall, has mechanical platform mounted on pneumatic jack with electric control to react dynamically in order to feel shocks and vibrations according to vision
US7463270B2 (en) 2006-02-10 2008-12-09 Microsoft Corporation Physical-virtual interpolation
US7612786B2 (en) 2006-02-10 2009-11-03 Microsoft Corporation Variable orientation input mode
US8930834B2 (en) 2006-03-20 2015-01-06 Microsoft Corporation Variable orientation user interface
US8139059B2 (en) 2006-03-31 2012-03-20 Microsoft Corporation Object illumination in a virtual environment
US7552402B2 (en) 2006-06-22 2009-06-23 Microsoft Corporation Interface orientation using shadows
GB2470759A (en) * 2009-06-03 2010-12-08 Sony Comp Entertainment Europe Displaying videogame on 3D display by generating stereoscopic version of game without modifying source code
US9779554B2 (en) 2015-04-10 2017-10-03 Sony Interactive Entertainment Inc. Filtering and parental control methods for restricting visual activity on a head mounted display
WO2018002800A1 (en) * 2016-06-28 2018-01-04 Nokia Technologies Oy Method and apparatus for creating sub-content within a virtual reality content and sharing thereof
CN106201396A (en) * 2016-06-29 2016-12-07 乐视控股(北京)有限公司 A kind of method for exhibiting data and device, virtual reality device and playing controller

Also Published As

Publication number Publication date
AU2002227273A1 (en) 2002-05-15

Similar Documents

Publication Publication Date Title
WO2003039698A1 (en) Virtual reality game system with pseudo 3d display driver &amp; mission control
US11660531B2 (en) Scaled VR engagement and views in an e-sports event
US9616338B1 (en) Virtual reality session capture and replay systems and methods
US20020154214A1 (en) Virtual reality game system using pseudo 3D display driver
CN112040264B (en) Interactive system, method, device, computer equipment and storage medium
US6999083B2 (en) System and method to provide a spectator experience for networked gaming
CN112839725A (en) AR system for providing Augmented Reality (AR) in video games
US9138647B2 (en) Game play skill training
US20080079752A1 (en) Virtual entertainment
US6257982B1 (en) Motion picture theater interactive gaming system
US20080082311A1 (en) Transformations for virtual guest representation
US20090221367A1 (en) On-line gaming
JP6576245B2 (en) Information processing apparatus, control method, and program
CN112104594A (en) Immersive interactive remote participation in live entertainment
US20080268961A1 (en) Method of creating video in a virtual world and method of distributing and using same
US8862658B2 (en) Method and apparatus for recording and replaying network game
WO2002036225A1 (en) Virtual reality game system with pseudo 3d display driver and mission control
US7628702B2 (en) Mission control system for game playing satellites on network
WO2018106461A1 (en) Methods and systems for computer video game streaming, highlight, and replay
JP6200062B2 (en) Information processing apparatus, control method, program, and recording medium
Li Computer Games.
KR20200044195A (en) System of Providing Gaming Video Using Cloud Computer
US20230381674A1 (en) Triggering virtual help or hindrance based on audience participation tiers
CN112423855B (en) Scaled VR participation and viewing in electronic competition
WO2023188022A1 (en) Image generation device, image generation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP